Wednesday, June 23, 2010
Why Evolution Doesn't Do Design: Part IV
(Please read Part I, Part II, and Part III first.)
7. Insulin and Low-Carbohydrate Diets
The revival of Brand's doctrine of CoEvolution in recent years was sparked by the increasing popularity of low-carbohydrate diets (Atkins, etc.). These diets have a principled, scientific basis, derived from the study of the role of insulin in the human body.
Insulin is secreted into the bloodstream in response to high levels of glucose in the blood. Insulin promotes the conversion of excess glucose into triglycerides, and eventually into body fat: this is how energy is stored in times of abundance for use in times of famine.
The first function of insulin is to protect the body from excess levels of glucose. Diabetes, a group of diseases marked by the lack of a normal insulin response to an excess of glucose, leaves the body exposed to severe damage from excess glucose.
The second function of insulin is to convert the overly abundant glucose into triglycerides, and eventually into body fat for energy storage. The absence of insulin causes the reverse: conversion of stored fat to usable energy.
The third function is to increase hunger. Until the last half-century, most human populations were plagued by famines, often exacerbated by tyranny and war. A human who had stored enough fat in times of plenty could more often survive a famine; insulin-driven hunger encouraged more eating, and the formation of more stored fat, in those rare times when there was more than enough to eat. More fat also meant more cardiovascular disease and a shorter lifespan, but in most cultures, what happened to the individual after the age of reproduction and child-rearing had little evolutionary impact. In most local environments, human evolution favored fat for surviving a famine, and a lifespan of 30-40 years.
In recent decades, modern agricultural technology changed this picture. With the threat of famine removed, first in the West and then elsewhere, humans started to aim for a longer individual lifespan. In the context of changed goals, such as achieving lifespans of eight decades or longer, fat became an obstacle instead of an advantage. The logical solution, in view of the scientific identification of the three roles of insulin, was to fight fat and its consequences by limiting the intake of carbohydrates, and thus keeping blood glucose levels below the threshold for the release of insulin. Hence the advice, from Atkins (1972) onward, to limit one's intake of bulk carbohydrates to less than one gram per two kilograms of body weight per day.
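To make that rule concrete, here is a minimal arithmetic sketch (illustrative only; the function name and framing are mine, and none of this is dietary advice):

```python
def carb_limit_grams(body_weight_kg: float) -> float:
    """Daily ceiling on bulk carbohydrates implied by the rule quoted
    above: less than one gram per two kilograms of body weight."""
    return body_weight_kg / 2.0

# Example: under this rule, an 80 kg adult would aim to stay below
# 40 grams of bulk carbohydrates per day.
print(carb_limit_grams(80.0))  # 40.0
```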
This advice was not unopposed. Before the three roles of insulin had been identified, diets were based on a well-understood principle: the law of conservation of energy. Every calorie of energy eaten must be excreted, expended, or stored. To store less energy in the form of body fat, one must eat fewer calories. The advice to stop counting calories, even if only to count grams of carbohydrates instead, seemed unprincipled. The advocates of low-carb diets were telling patients to stop acting on principle and to act on a gimmick instead. The trouble was that for many, albeit not for everyone, the gimmick worked.
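The conservation principle amounts to a one-line energy balance. A minimal sketch (the function name and the roughly 7700 kcal-per-kilogram-of-fat conversion are my own illustrative assumptions, not part of the original argument):

```python
def daily_fat_change_kg(intake_kcal: float, expended_kcal: float,
                        excreted_kcal: float,
                        kcal_per_kg_fat: float = 7700.0) -> float:
    """Energy balance: every calorie eaten is excreted, expended, or
    stored. The remainder is stored as fat (or, if negative, drawn
    from fat), converted here at a commonly quoted ~7700 kcal/kg."""
    stored_kcal = intake_kcal - expended_kcal - excreted_kcal
    return stored_kcal / kcal_per_kg_fat

# A 500 kcal daily surplus stores roughly 0.065 kg of fat per day.
print(round(daily_fat_change_kg(2500.0, 1900.0, 100.0), 3))
```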
Of course the low-carb diets did not repeal the law of conservation of energy. They worked because, in the absence of insulin, the low-carb dieter was not hungry and ate fewer calories. (The Atkins diet also encouraged the consumption of high-fiber greens, facilitating the elimination of excess food from the digestive tract.) In contrast, the traditional calorie-counting dieter often consumed most of her closely watched calories in the form of bulk carbohydrates - and her insulin spiked, causing hunger. And so the calorie-counter was discouraged, stressed, tempted to cheat. The difference was, and is, a fascinating illustration of the importance of tracking the context when one is following a principle.
8. CoEvolution Becomes "Paleo."
Stewart Brand renounced the doctrine of CoEvolution when phyletic gradualism, on which the doctrine was based, was disconfirmed and replaced by punctuated equilibrium. But the now-disconfirmed principle of CoEvolution, like the dietitians' principle of calorie-counting, retained adherents who evaded, ignored, or simply did not understand the principle of punctuated equilibrium.
One anthropological observation sometimes cited in favor of low-carbohydrate diets was that pre-European-contact Eskimos lived on what was practically the highest-fat, lowest-carbohydrate diet on the planet, yet were physically fit enough to thrive in the world's most adverse environment. To the remaining believers in CoEvolution this made sense: the Eskimos lived without agriculture or industry, much as Paleolithic men had lived ten thousand years ago. Similarly, pre-agricultural tribes inhabiting tropical islands where there had never been famines did not exhibit the cardiovascular pathologies observed in environments where fat accumulation had evolved as an adaptation to periodic famine. In the context of punctuated equilibrium, these would be viewed as examples of rapid (and possibly recent) evolutionary adaptation to local conditions. Believers in CoEvolution, however, saw in those selected anthropological observations a validation of their belief that a return to a Paleolithic lifestyle was a recipe for the achievement of optimal health - optimal health that humans had been designed for by slow (Co)Evolution over hundreds of thousands of years.
Thus was CoEvolution re-born as "Modern Paleo." Unlike punctuated equilibrium, it was a principle that one did not need measure theory to understand. Thus one could follow a low-carbohydrate diet not as a "gimmick" exploiting the peculiar relation of hunger to insulin, but as part of the application of "The Principle of Evolution." It was no longer the principle of how evolution was understood to work by those who cared to understand how it worked. But it did correspond to how evolution had formerly been thought to work, and how it was still thought to work by nearly everyone else. And for the popular self-help culture that was good enough.
Unfortunately for adherents, the Modern Paleo Principle leads to something quite different from optimal health - unless modified by altogether non-paleolithic, industrial-strength food supplements. The most telling example is iodine. No one really knows why, but a low-carbohydrate diet, when it does not include plenty of seafood and kelp, sometimes causes an iodine deficiency severe enough to end in hypothyroidism. And so "Modern Paleo" adherents are the world's best consumers of un-paleolithic, industrially purified, high-potency Iodine/Potassium Iodide tablets.
It happens that all "contemporary Paleo" cultures that anthropologists have found to enjoy relatively good health live on islands or on seashores, where they get plenty of iodine in their diet from seafood and from kelp and other marine vegetation. Even the most non-Paleolithic islanders, such as the modern Japanese, are also very healthy, as long as they get enough iodine from seafood and kelp - even if, as in the case of the Japanese, the bulk of their diet is rice and other modern grains.
One explanation of the iodine link is that iodine is needed to live without bulk carbohydrates. So how did inland primitives live and reproduce without industrial iodine? It turns out that they ate bulk carbohydrates. Wild sugar cane is a favorite of inland primitives in New Guinea. Wild rice (Zizania) was a staple in the diet of pre-Columbian inland North Americans. Low-carb might not be, for at least some of us, a return to the diet of our stone-age ancestors.
Another potential explanation is that some of us may have had more recent ancestors who lived on islands or by the shore. The very rapid evolution to local optima - an aspect of punctuated equilibrium - would have moved the relevant ancestral genomes in the direction of dependence on abundant iodine in the diet. Some people today may need industrial iodine supplements, because some of their specific ancestors had evolved to depend on high levels of this specific nutrient.
Evolution does not do design. We are not "designed by evolution" to eat the food or live the lives of our Paleolithic ancestors. What evolution is known to have done, is to quickly if imperfectly adapt one's specific ancestors, who lived in hundreds of different environments, to be fit to survive and reproduce in those specific environments - and not in the ancestral environments of other humans. The daily food of one may be pain or death to another. The real universal principle is to use one's mind to create for oneself, by systematic self-knowledge and by the artifice of one's mind, a diet and an environment that will compensate for evolution's lack of design.
Tuesday, June 15, 2010
Why Evolution Doesn't Do Design: Part III
(Please read Part I and Part II before this part.)
5. Why Punctuated Equilibrium Matters

So far, this discussion dealt mainly with an issue that, at first glance, might not seem important to anyone outside a narrow circle of systematic and theoretical biologists. Hardly anyone outside this narrow circle has even heard of "phyletic gradualism" or of "punctuated equilibrium." Yet this first glance is deceiving. Up to about 1990, even knowledgeable biologists often conflated what we now call "phyletic gradualism" with evolution as such (much as the "Copenhagen Interpretation" of quantum physics is often conflated, in the minds of less knowledgeable physicists, with quantum physics itself.) Even today, there are many professors of biology, not to mention biology teachers in middle schools and high schools, who are out of their intellectual comfort zone in measure theory and genetic algorithms - and who therefore still think of phyletic gradualism as simply "Evolution." And therefore, so do almost all non-biologists. Most people today, if they have any ideas having to do with evolution at all, have ideas based on a false picture of how evolution works.
And thus we come to the most important reason why punctuated equilibrium matters. An important aspect of rationality is the use of principles as a guide to everyday life. One of the most useful principles is that the human body is a product of evolution. A false idea of how evolution works can sabotage the application of this principle to one's life. At worst, a false idea of how evolution works can trap the user into activities and habits that worsen, rather than improve, one's health and one's performance at life.
Unfortunately, there are many diet, exercise, and other self-help regimens that claim to be principled applications of the principle of evolution - and are actually applications of the (disconfirmed) phyletic gradualism model. Most of these regimens are based on a specific embodiment of the phyletic gradualism model: Stewart Brand's once popular doctrine of "CoEvolution."
6. "Co-Evolution" and Stewart Brand
Stewart Brand's work has affected more people who don't know his name than the work of any other near-anonymous intellectual on the planet. An early associate of Ken Kesey and his "performance art" collective, the "Merry Pranksters," he had unblemished Hippie credentials, combined with (unusual for a Hippie) a genuine love of science and technology. The mainspring of the hippies' movement was opposition to anything that might be a part of "the system": not merely the political system, but everything from systematic thought to industrial production to intelligible art. This took many of the early hippies into lives of applied nihilism that often led to the mental hospital or to the grave. Brand, with his love of history and technology, was a rebel among rebels.
By 1968, many of Brand's hippie friends were dead from rejection of science and, especially, from rejection of technology. That year, Brand began publishing The Whole Earth Catalog, a compendium of technologies (books, maps, garden tools, specialized clothing, carpenters' and masons' tools, forestry gear, tents, welding equipment, professional journals, electrical gear and so on) that might be useful for survival in various degrees of isolation from "the System." It became an instant best-seller, reaching a million and a half copies in 1972. The Whole Earth Catalog was not only a survival manual for hippies, but also for a wide range of "survivalists" preparing for eventualities that ranged from nuclear war and a Communist invasion, to an immediate collapse of civilization (as in "Atlas Shrugged" read by a literalist.) The Whole Earth Catalog was also used by millions who were neither hippies nor survivalists, but who found many of the technologies in the book simply useful for living better lives.
In the meantime, Brand developed something that the early hippies had disdained: an ideology that grounded their anti-industrial attitude and lifestyle in the evolutionary science of his time - that is, in what today is called the phyletic gradualism model. Brand's "ideology of the Hippies" came to be known as the doctrine of CoEvolution, after one of its key ideas. Brand propagated his ideas in a periodical, founded in 1974, that he called "CoEvolution Quarterly."
According to Stewart Brand's doctrine of CoEvolution, for the first several million years of hominid evolution our ancestors, and the life-forms in their environment, had co-evolved into a state of optimal human health in an environment optimally suited for human life. This optimal co-existence came to an end at the breakpoint between the Paleolithic and the Neolithic periods of human prehistory. It ended when humans stopped waiting for the life in their environment to co-evolve with them, slowly, into a harmonious state optimal for the well-being of all life. Instead, humans started to change their environment by means of agriculture, engineering, and eventually the building of cities and industries. These changes resulted in an environment that was no longer suited to optimal human health and life.
The other prong of the doctrine was that, given how slow evolution appeared to be under the phyletic gradualism model, human biology today was essentially unchanged since paleolithic times. Therefore the existential counsel of Brand's doctrine of CoEvolution was that, to achieve optimal well-being, one should try to live today as closely as possible to how our paleolithic ancestors lived ten thousand years ago, in an environment approximating theirs as closely as possible - and work to bring the global physical environment back to what it was then. It was not a coincidence that the result also resembled how many of the Hippies already lived, as a result of trying to live apart from "the System."
The specific recommendations were, first, to avoid foods created or processed by industrial or artificial methods such as milling, canning, chemical processing, or the addition of artificial preservatives or flavors; then, to avoid the products of industrial agriculture, and to grow one's own, or to barter or trade with small-scale, home-based farmers - and then only with those who did not use any artificial chemicals or other artificial methods; to eat, if possible, only those breeds of animals and plants that were closest to what existed before the beginning of agriculture; and to avoid foods that could not have been hunted or gathered by pre-agricultural humans.
The CoEvolution lifestyle went beyond diet. Footwear was to be avoided in favor of going barefoot; at most, it was to be limited to protecting the bottom of the sole. Footwear with lift, lateral support, or arch support was out. Exercise was to be limited to the natural motions of running, climbing, and hefting - no artificial positions or exercise machines. No shaving; no artificial cosmetics, shampoo, soap, or deodorant. No furniture for sleeping or sitting off the floor (but mats and pillows, in place of paleolithic animal pelts on the floor, were OK.) Any work that could not be done on the floor was to be done standing.
Elements of the CoEvolution lifestyle caught on with many people who would never have thought of themselves as hippies, often for good reasons. Many men stopped shaving, both to save time and to avoid the inevitable nicks and cuts. Some found it easier to work standing than sitting; many slept better on futons than on beds. Most of all, Americans began to re-learn how much better authentic, unprocessed food tasted than the processed, industrial kind. Stewart Brand changed how we lived, and how we thought about living.
Yet within a few years, the phyletic gradualism model at the foundation of the CoEvolution doctrine began to be disconfirmed by increasing evidence for punctuated equilibrium. Brand, who had been a biology major at Stanford, was among the first to abandon the doctrine to which he had given life. In 1985, "CoEvolution Quarterly" became "Whole Earth Review," and CoEvolution was not heard from again. Brand re-invented himself as a corporate futurist, in 1988 co-founding the Global Business Network and working for, among others, Royal Dutch/Shell, Volvo, and AT&T. But Brand's doctrine of CoEvolution would not die with its founder's change of mind. Few non-specialists understood the evidence that had disconfirmed phyletic gradualism and put punctuated equilibrium in its place. And two decades later, CoEvolution re-entered the marketplace of self-help ideas under other names.
(Continued in Part IV.)
Saturday, May 29, 2010
Why Evolution Doesn't Do Design: Part II
(please read Part I first.)
3. The Evidence
The first factor in the overthrow of phyletic gradualism was the identification of the actual mechanism by which the information carried by the DNA is expressed. A sequence of DNA codons is nothing like a blueprint for some specific trait. Instead, each DNA codon identifies a specific amino acid in one of thousands of strings of amino acids. These strings are called peptides. Each peptide, in turn, may have one or more functions that it performs in the organism: as part of the structure of a protein, or as an enzyme, or a hormone, or a neurotransmitter, or as a part of the molecular "skeleton" that determines the structure of tissues and organs. These, in turn, participate in biochemical pathways and physiological and anatomical structures responsible for the observed traits of the organism. Thus, the correspondences between the codons and the traits are manifold, multivariate, non-linear, and often discontinuous. They result in complicated probability-of-reproductive-success surfaces that have many small local optima, most of them low hills whose peaks are far below the high peak of a global optimum.
The second factor was discovered by mathematicians and computer scientists working on the problem of finding the optimal values of the parameters of a quantitative model to fit a body of data (see Reed, "MODPAC: A modular package of programs for fitting model parameters to data and plotting fitted curves," Behavior Research Methods & Instrumentation, 1976). All the methods - including not only algebraic approximations but also "genetic programming" methods that simulate the mechanisms of genetic evolution (Koza, Genetic Programming, MIT Press, 1992) - when invoked on a problem with multiple local optima, converge rapidly on some happenstance local optimum near the starting point. Once at this happenstance local optimum, the parameter-fitting mechanisms are at equilibrium. The values of the parameters stay permanently frozen, with a fit often far below the global optimum, unless dislodged by additional computational techniques (exploration, explosion, simulated annealing) that have no equivalent in natural evolution.
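A toy version of this convergence-and-freezing behavior can be shown in a few lines. This is a sketch of my own, not Reed's or Koza's code: the fitness surface below has a low peak near the starting point and a much higher peak farther away, and a simple mutate-and-select loop freezes on the nearby one.

```python
import random

def fitness(x: float) -> float:
    # Two peaks: a low local optimum near x = 1 (height ~1.0) and a
    # higher global optimum near x = 4 (height ~2.0), separated by a
    # flat valley of zero fitness.
    return max(0.0, 1.0 - (x - 1.0) ** 2) + max(0.0, 2.0 - (x - 4.0) ** 2)

random.seed(0)
x = 0.5                                       # start near the low peak
for _ in range(10_000):
    candidate = x + random.gauss(0.0, 0.05)   # small random mutation
    if fitness(candidate) > fitness(x):       # selection keeps improvements
        x = candidate

# The search climbs the nearby hill and stays there: x ends near 1.0
# with fitness near 1.0; the global peak at x = 4 is never reached.
print(round(x, 2), round(fitness(x), 2))
```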
The third came from paleontology. In the fossil record, new traits and species appear in the course of only a few dozen generations, only to continue practically unchanged for tens of thousands, and sometimes hundreds of thousands, or millions of generations thereafter. It was this observation that first led to the label "punctuated equilibrium."
The fourth came from engineering. Until the 1970s, it was generally assumed that evolved organs, particularly those that remained unchanged over many millions of generations, and passed unchanged from very ancient classes of organisms to new ones, had evolved to a structure that was optimal for their biological function. Even when the evolved structures were not what an engineer would have designed, it was assumed that the result of evolution was optimal under some set of as yet unidentified constraints. In the 1970s, mechanical and electrical engineers began to look at evolved systems in the hope of identifying designs that might work better than those they already knew. They found only a few rare cases where the results of evolution were anything close to objectively optimal. They were confronted, instead, with all manner of clumsy contraptions just barely good enough for organisms to survive. The vertebrate eye, for example, has not changed in its basic structure from fishes to humans. Yet if an engineer were to design an array of light sensors - as for a digital camera - she would attach the outputs to cabling on the back or the side of the sensors, so that nothing would disperse, or block the path of, the light coming into the front of the sensors from the lens. In the vertebrate eye, on the other hand, the optic nerve, which carries the output of retinal sensors from the eye to the brain, comes from the brain into the inside of each eyeball through a hole in the retina. This hole in the array of retinal sensors is why we have a blind spot in each eye (which digital cameras don't have.) The neurons of the optic nerve then pass in front of the light sensors, in the path of incoming light, and connect to the light-sensing rods and cones from the front (where the light comes in.) This is just one of thousands of examples of globally sub-optimal, clumsy structures: just-good-enough-to-survive local optima frozen by evolution.
The fifth was the discovery of recent, ongoing evolutionary changes in human traits whose relevance to reproductive success was affected by recent changes in the human cultural environment. One such change was the introduction of military conscription in Europe in the late 18th and early 19th centuries, together with the introduction, at around the same time, of footwear with stiff lateral support. Before the mass production of boots and shoes, most humans went barefoot. When most travel was by foot, and most work was done walking or standing, anatomical abnormalities of the feet were severely disabling. The effect of flat feet and other inherited abnormalities was so adverse, and the abnormalities so rare, that men with these abnormalities were (and, in countries with conscription, still are) exempt from conscription. This exemption had two effects. Men with anatomical abnormalities of the foot were much less likely to be killed or maimed in war. More importantly, they stayed and reproduced, while conscripts were away from their neighborhoods and families for much of their prime reproductive years.
Switch to the 1990s. The Achilles Project (Burzykowski et al. 2003) measured the prevalence of foot disease, including inherited anatomical abnormalities of the feet, in a sample of 1085 randomly selected subjects in 16 European countries. The prevalence of anatomical abnormalities of the foot varied between 20.4% (one in five subjects) and 24.8% (one in four.) This, a mere eight generations after a changed cultural environment moved the local optimum for reproductive success to a different place.
4. Punctuated Equilibrium
More than two thousand years ago, Archimedes' formulation and derivation of Archimedes' Principle demonstrated that the laws of nature can be not merely observed and measured, but grounded and understood through the application of reason - of logic and mathematics - to more fundamental and evident laws and facts. The principle of derivation set what is still the highest standard in scientific understanding of how nature works. Punctuated Equilibrium is the fact that when a change in the environment changes the locations of local optima for reproductive success, the traits and species affected by this change are efficiently and quickly moved by natural selection to new local optima - where they may stay, without further modification, until the locations of the local optima change again. The local optima of evolutionary equilibrium do not correspond to "design," in the sense of some global optimum of fitness or health. They are, rather, the product of a random process, which converges on some local optimum without regard to its optimality in any global sense. And this fact can be mathematically derived, in the best tradition of Archimedes, from the application of mathematical measure theory to genetic programming.
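The punctuation-then-stasis pattern can be sketched with the same kind of toy mutate-and-select loop used above (again my own illustration, not a derivation): hold the optimum fixed, shift it once, and watch the population sit still, move rapidly, and sit still again.

```python
import random

def fitness(x: float, optimum: float) -> float:
    # A single local peak whose location is set by the environment.
    return -(x - optimum) ** 2

random.seed(1)
x, history = 0.0, []
for generation in range(2000):
    optimum = 0.0 if generation < 1000 else 3.0   # one environmental shift
    candidate = x + random.gauss(0.0, 0.05)
    if fitness(candidate, optimum) > fitness(x, optimum):
        x = candidate
    history.append(x)

# Stasis near 0 before the shift, a rapid burst of change in the few
# hundred generations after it, then stasis again near the new
# optimum at 3 - equilibrium, punctuation, equilibrium.
print(round(history[999], 2), round(history[1200], 2), round(history[1999], 2))
```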
(continued in Part III.)
Sunday, May 23, 2010
Why Evolution Doesn't Do Design: Part I
1. Introduction
This blog post is about how the Phyletic Gradualism model of evolution is disconfirmed and false, and why the Punctuated Equilibrium model is right. Who cares? If you use (or try to use) the conclusions of evolutionary science to improve your health and your life, you ought to care.
Most people (even scientists) who believe in a God or Gods, and also "believe in evolution," still think of evolution as design by other means, even if they deny the explicitly interventionist versions of "intelligent design." In other words, they believe that God's creation is perfect; that God set up and used the laws of nature to result in organisms capable of optimal life and optimal health. The reason for this blog post is to make it easier for my fellow Atheists to separate the science from the beliefs of scientists.
2. The Intellectual Origin of Phyletic Gradualism
It is not the custom of scientists to challenge cultural preconceptions without first having confronted and assembled overwhelming evidence. The religious belief in the perfection of God's creation was not seen by Darwin and his contemporaries as challenged by the theory of evolution. In their day, to contradict, from evidence, the scriptural account of creation, in favor of the operation of natural laws that might or might not have been created by a God, was challenge enough. As late as 1970, the phyletic gradualism model was generally accepted by evolutionary biologists, in part because it did not contradict the notion of evolution leading to, or at least moving in the direction of, organs and organisms optimally suited to an optimally healthy existence in their natural environment.
Since the course of evolution is set by a random process of mutations followed by natural selection through differences in reproductive success, its Archimedean derivation is necessarily based on measure theory, probability theory, and mathematical and computational statistics. (Reader, do not be intimidated. You do not need to be a mathematician to understand the essence of the derivation; I will give pictorial hints so that you can let your visual imagination do most of the work.) In measure-theoretic representations of evolution, the probability of reproductive success can be visualized as the height of a variable surface, above the multi-dimensional space representing the state of the genome. A peak at which this probability is higher than it is at all points around it, is called a local optimum. The global optimum, corresponding to the highest possible likelihood of reproductive success, is the highest peak.
Up until the identification of the structure of DNA in 1953, biologists thought of "genes" as direct blueprints for all the tissues, organs and structures of the organism. Assuming this correspondence between the genes and the traits of the organism led the scientists of the time to think of the probability-of-reproductive-success surface as having a single optimum only: the location at which the genetic "blueprint" corresponds to the optimal, rational design for the given structure, organ or tissue. Natural selection then favors those mutations that move up the upward slope, rather than down the downward slope, from the present spot on this surface. Gradual evolution to the single, global optimum: this is the mathematical expression of the "Phyletic Gradualism" model.
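On a single-peak surface - the picture just described - a toy mutate-and-select loop like the sketches above (my own illustration, not a derivation) reaches the one global optimum from any starting point, which is exactly what phyletic gradualism assumed evolution always does:

```python
import random

def fitness(x: float) -> float:
    # A single peak at x = 2: the one-global-optimum surface implied
    # by the "gene as direct blueprint" picture.
    return -(x - 2.0) ** 2

random.seed(2)
for start in (-10.0, 0.0, 15.0):
    x = start
    for _ in range(20_000):
        candidate = x + random.gauss(0.0, 0.05)   # small random mutation
        if fitness(candidate) > fitness(x):       # keep uphill moves only
            x = candidate
    print(start, "->", round(x, 2))   # every start converges to 2.0
```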
It took two decades, roughly from 1971 to 1992, for this model to be overturned.
(continued in Part II)
Sunday, May 16, 2010
No, I do not "publish in JARS."
I have heard from a friend that someone is circulating, in media to which I don't have access, the rumor that I publish - note the use of the present tense - in Chris Sciabarra's "Journal of Ayn Rand Studies." In reference to peer-reviewed media, "publish" would mean that I'm still submitting original articles for publication in JARS. (I understand, from comments, that some may be tempted to replace this meaning of "publish" by other meanings that this word has in other contexts; and then twist the result into a contradiction - and accuse me of dishonesty or incoherence, on the basis of equivocations thus manufactured. In this note, I am using "publish" in the one specific sense stated above, where "He publishes in Journal X" means "He submits his original articles for publication in Journal X.") I have not submitted an original article to JARS for years, and I have no intention of doing so, ever. The rumor is false.
In my early, pre-tenure years at my university, beginning in the 2000-2001 academic year, I did some research on the origin of the parallels between the schemata of knowledge representation in Ayn Rand's Objectivist epistemology and in object-oriented programming languages. JARS was a new journal that had just published its first volume, and its charter - to document Ayn Rand's influence on the history of ideas and culture - fit my research. I submitted my article on the origin of the parallels, and it was published. I noticed the poor quality of Sciabarra's editorial process, but I ascribed this to the "teething pains" of a new publication. I communicated my concerns about editorial laxness to Sciabarra, and I expected the quality of his editorial policy to improve.
Sciabarra's editorial policy did not improve. By 2006 he had published several articles of such low quality that they were clearly counterproductive to his stated goal, of getting Ayn Rand's intellectual and artistic influence to be taken seriously in academia. I communicated with Sciabarra at length, and I suggested changes that, had they been made, would have turned JARS into what, according to its published charter, it should have been. Sciabarra discussed the changes that I had suggested to him with his editorial board, but no changes were made. JARS continued to publish articles that were, in my judgment, unscholarly, intellectually disreputable rubbish. It was at that point that I decided never again to submit an original article for publication in JARS.
About a year later, JARS published a couple of articles on Objectivism and religion. My notes on those articles evolved into commentary that, in my judgment, needed to be aired. When I publish an article that may invite commentary, I expect that commentary to appear in the same journal, where I will see it and where I can reply. This is standard academic practice, with which I agree. While I would not submit an original article to JARS, it was and remains my judgment that my commentary was productive and useful. Therefore I followed normal practice, and sent my commentary to the journal that had published the articles that I was commenting on.
There was also an article that I submitted to JARS back in May 2005, and which was accepted for publication after being reviewed by a peer reviewer whose work with me was unusually productive and well-informed (especially for JARS!) and continued well after the article was accepted. Peer review work is unpaid and anonymous; the reviewer's only payment is in the quality of work published in the journal to which the reviewer contributes her otherwise unremunerated work. I participate in the peer-review process of a broad range of meetings, journals, and granting agencies. If a paper I had worked on were withdrawn after acceptance, for any reason short of its author repudiating the content, I would judge this a breach of trust, the work I had done having been wasted and unpaid-for. Therefore, I would not consider withdrawing an already accepted article, whose content I still stand by, to be an ethically justifiable option. This last article was recently printed, bringing all association that I have ever had with JARS to a final close. (I have been told that the person who started the rumor - that I still submit articles to JARS - had prior access to the full text of the article, and should have read in the top footnote that it was submitted in May 2005, but mentioned only that the article was printed recently - not that it was originally submitted five years ago.)
I agree, after long scrutiny, with everything Ayn Rand and Leonard Peikoff have written about the principle of moral sanction. I find nothing in this principle, or in Ayn Rand's own actions on this principle, that would mandate more than I have already decided and done. I have no proof that the failure to disclose the May 2005 submission date of the article, which I deduce is what started the rumor, was deliberate; and therefore I am not ready to judge its moral import.
The quality of JARS has continued to fall, so I am not likely to find another of its articles worthy of comment in the future. I let my subscription expire years ago (although, as is standard for refereed journals, I did receive an author's copy of the recent issue.) In the present, the rumor that I publish in JARS is false.
In my early, pre-tenure years at my university, beginning in the 2000-2001 academic year, I did some research on the origin of the parallels between the schemata of knowledge representation in Ayn Rand's Objectivist epistemology and in object-oriented programming languages. JARS was a new journal that had just published its first volume, and its charter - to document Ayn Rand's influence on the history of ideas and culture - fit my research. I submitted my article on the origin of the parallels, and it was published. I noticed the poor quality of Sciabarra's editorial process, but I ascribed this to the "teething pains" of a new publication. I communicated my concerns about editorial laxness to Sciabarra, and I expected the quality of his editorial policy to improve.
Sciabarra's editorial policy did not improve. By 2006 he had published several articles of such low quality that they were clearly counterproductive to his stated goal, of getting Ayn Rand's intellectual and artistic influence to be taken seriously in academia. I communicated with Sciabarra at length, and I suggested changes that, had they been made, would have turned JARS into what, according to its published charter, it should have been. Sciabarra discussed the changes that I had suggested to him with his editorial board, but no changes were made. JARS continued to publish articles that were, in my judgment, unscholarly, intellectually disreputable rubbish. It was at that point that I decided never again to submit an original article for publication in JARS.
About a year later, JARS published a couple of articles on Objectivism and religion. My notes on those articles evolved into commentary that, in my judgment, needed to be aired. When I publish an article that may invite commentary, I expect that commentary to appear in the same journal, where I will see it and where I can reply. This is standard academic practice, with which I agree. While I would not submit an original article to JARS, it was and remains my judgment that my commentary was productive and useful. Therefore I followed normal practice, and sent my commentary to the journal that had published the articles that I was commenting on.
There was also an article that I submitted to JARS back in May 2005, and which was accepted for publication after being reviewed, by a peer reviewer whose work with me was unusually productive and well-informed (especially for JARS!) and continued well after the article was accepted. Peer review work is unpaid and anonymous; the reviewer's only payment is in the quality of work published in the journal to which the reviewer contributes her otherwise un-renumerated work. I participate in the peer-review process of a broad range of meetings, journals, and granting agencies. If a paper I had worked on were withdrawn after acceptance, for any reason short of its author repudiating the content, I would judge this as a breach of trust, the work I had worked having been wasted and unpaid-for. Therefore, I would not consider withdrawing an already accepted article, whose content I still stand by, as an ethically justifiable option. This last article was recently printed, bringing all association that I've ever had with JARS to a final close. (I have been told that the person who started the rumor - that I still submit articles to JARS - had prior access to the full text of the article, and should have read in the top footnote that it was submitted in May 2005, but only mentioned that the article was printed recently - not that it was originally submitted 5 years ago.)
I agree, after long scrutiny, with everything Ayn Rand and Leonard Peikoff have written about the principle of moral sanction. I find nothing in this principle, or in Ayn Rand's own actions on this principle, that would mandate more than I have already decided and done. I have no proof that the failure to disclose the May 2005 submission date of the article, which I deduce is what started the rumor, was deliberate; and therefore I am not ready to judge its moral import.
The quality of JARS has continued to fall, so I'm not likely to find another of its articles worthy of comment in the future. I let my subscription expire years ago (although, as is standard for refereed journals, I did receive an author's copy of the recent issue). In the present, the rumor that I publish in JARS is false.
Kenyan, Nigerian, all the same...
From the morning's e-mail:
Dear Friend,
This letter is not intended to cause any embarrassment in whatever form, rather is compelled to contact your esteemed self, following the knowledge of your high repute and trustworthiness.
I am David Garfield, Chief Campaign Officer of the PRINCIPAL CAMPAIGN COMMITTEE OF DEMOCRATS: OBAMA 2014 INC ID: C00411934.
I write to seek your sincere assistance in transferring the sum of 10M GBP 10 million Pounds sterling.
I discovered my office has some excess funds amounting too 10 million Pounds recovered from donations and grants from democrats around the world during our election campaign and pleas for support for our incumbent president Barack
Hussein Obama, According to plans, The excess funds was to used in clearing debts owed by Mrs Hillary Clinton during her campaign programs,I taught there is a better way of expending this funds.I want this money to be used to alleviate the poverty and sufferings of children in Iraq and Africa and donate to Charity organizations around the world.
My plea to you is that you assist me get this funds out of the United Kingdom where it is presently lodged safe and for your assistance ,you will have a fair percentage of the total money and all investments shall be under your supervision.
This simple transfer process could be arranged in less than 3 working days.
I await your sincere response,
David .A. Garfield.
Chief Campaign Officer,
Barack Obama Campaign Office.
Phone: +447035969385.
E-MAIL:davidgarfieldsr@gmail.com
garfield.david@krovatka.su
".SU" is the country code of the former Soviet Union. "Krovatka" is Belorussian for "where we make cows."
Tuesday, May 11, 2010
My Premise-Checking Habit
What is the difference between a philosopher and a scientist? When the philosopher comes to a contradiction, she checks her premises. The scientist does not wait for a contradiction.
Maybe. My own premise-checking habit predates my career choice. When I first realized that some grownups believed, and told me, things that were false, I decided that I would rather doubt a hundred truths than believe one falsehood. Later, on encountering the derivation of Archimedes' Principle, I was so taken with the realization that the facts of reality not only could be observed, but could be understood by reason, that I decided to make this my future job. It helped me to know that as a scientist, I would never need to pretend that I knew, when I doubted.
Later, when I began to read the work of Ayn Rand, I was struck by the similarity between her approach to knowledge and that of the scientists I had met. The scientists knew that certain assumptions had to be made for scientific investigation of nature to be possible; Ayn Rand pointed out that these "assumptions" were really axioms that could not be contradicted without self-exclusion, which made them certain. I already knew that in science the results of replicated measurements comparing an observed value with an external standard were "practically certain;" from Ayn Rand I learned the principles that make them contextually certain. Laws that exactly describe some set of contextually certain measurements are also contextually certain, in the context of the precision and range of the measurements that such laws describe. Logically necessary deductions from already certain premises are contextually certain in the intersection of the contexts of their premises. As long as one tracks context in one's deductions and derivations, one can be certain about what one knows with contextual certainty; and one can know in what contexts that which one knows is certain. Everything outside those contexts is rightly open to doubt, regardless of how many people think it true or wish it were true.
Later, as a student of cognitive psychology, I learned about confirmation bias: the universal human tendency to notice and think about evidence that confirms one's prior beliefs and hypotheses, and to ignore and evade evidence to the contrary. I trained myself, as rigorously as I could, in the habit of going against my own confirmation bias; of looking for experiments and observations that would produce, if such evidence existed, evidence against the hypotheses that I myself advanced and wanted to be true. And, like many in the human sciences, I worked on methods for guarding the process of science against confirmation bias and other biases common to all men, including scientists such as myself.
One of my PhD mentors was Ray Hyman. Ray studied physical scientists who had become interested in "psychical" (later called "paranormal") phenomena. Physical scientists, like Objectivists, pride themselves on thinking conceptually while grounding even their most abstract ideas in observable and measurable fact. Yet physical scientists, ignorant of their own confirmation bias, were always the first to be fooled by "evidence" that invariably disappeared under the lens of bias-proof methods worked out by cognitive psychologists. More recently, those of us who look at the work of physical scientists through the lens of cognitive psychology were treated to "climategate": the ultimate spectacle of physical scientists intoxicated with confirmation bias, keeping their data secret lest their hypotheses be debunked, as the alleged "paranormal phenomena" were, when subjected to the methods of bias-proof analysis that have become standard in the human sciences.
In Objectivist circles, it is customary to give to ideas held by fellow Objectivists the benefit of the doubt. Surely, the reasoning goes, one's fellow Objectivists have sound epistemology, and therefore are less likely to be mistaken than non-Objectivists. Unfortunately, there is no evidence that Ayn Rand knew what we now know about confirmation bias. My guess is that she didn't, because she would have advised Objectivists to guard against it, if only she had known. And so some Objectivists, as I recently found out, regard my own attitude - that I would rather doubt a hundred truths than believe one falsehood - as a flaw of character. As one put it in a letter, 'it indicates a juvenile "iconoclastic" mentality rather than a strive (maybe "a striving?") for knowledge.'
The iconoclasts were early Christian fanatics who defaced artwork, lest statues and paintings receive admiration that the iconoclasts reserved for God. How one gets from an observation of habitual premise-checking to a diagnosis of "iconoclasm" I don't know. What I do know is that no idea should be exempt from doubt because of who holds it. Even if that person is an Objectivist. Even if it is an idea held by many Objectivists. If this be "iconoclasm," make the most of it.
Monday, March 15, 2010
Healthy Weight
If one were to ask someone who is not familiar with the history of "Public Health" about the meaning of "ideal" ("acceptable," "normal," "healthy") human weight, she would probably guess that it is the weight range at which the risk of death is, other things being equal, at a minimum. What, then, is one to make of these statements in the abstract of a recently published study (Orpana HM, Berthelot JM, Kaplan MS, Feeny DH, McFarland B, Ross NA, 2010: BMI and mortality: results from a national longitudinal study of Canadian adults):
"Public health," like "public education," was imported to America from Prussia. The Prussian state was founded by a military order of armed monks, who imposed on the people they conquered an order of Christian discipline similar to their own. Their ideal subject was a man optimally suited for military service. Their ideal soldier was a dragoon, that is, a mounted infantryman: Dragons could be used either as highly mobile infantry or as light cavalry. This meant that the ideal soldier, and therefore the ideal Prussian subject, had to be light enough to ride all day without exhausting the horse. The acceptable weight for conscripting a Prussian dragoon is still with us as the range of "acceptable weight" used in public health studies. Adapted to America's greater variation of human height by substituting height-adjusted BMI for weight, the old Prussian standard of "acceptable weight" remains in world-wide "public health" use to this day.
An objective science of human health would set ideal weight to the weight at which the likelihoods of disease and death from disease are minimized. The corresponding measurement is the relative risk of death: the ideal weight is the weight at which the long term (say 12 year) risk of death is at its local minimum. In other words, the real, objective ideal weight has nothing to do with the desiderata of the Prussian General Staff. It ought to be set by measuring the facts of reality. And, from the facts measured to date, it is clear that the objectively optimal weight is nothing like the "acceptable weight" found in "public health" directives. It is almost certainly somewhere in the range that "public health" professionals call "overweight:" BMI between 25.1 and 29.9.
From the perspective of objective scientific methodology there is much wrong with BMI as the independent variable in health research. Optimal weight should be measured by plotting long-term (e.g. 12-year) mortality versus actual weight in the context of sex/gender, age and height. Unfortunately, I do not have access to the raw data that I would need to set an objective target range for my own weight. In the absence of such data, I use a target of BMI 27.5, the midpoint of the BMI range with the lowest observed mortality risk in nearly all quantitative studies to date.
The continuing use of the Prussian "acceptable weight" ranges, objectively known to be sub-optimal for human life and health, should be an epistemic scandal. It is a public folly with political uses. It permits "public health" authoritarians to claim that individual choice must be restricted to save us from the supposed epidemic of fat. Because if one accepts the Prussian pseudo-standard, 68% of Americans are overweight or obese. And this Prussian pseudo-standard is seldom challenged, because Americans "educated" in Prussian-standard public schools are so concept-deprived that they will believe anything, as long as it comes with a number and a percent sign somewhere - and will submit to the authority of the hoax.
"A significant increased risk of mortality over the 12 years of follow-up was observed for underweight (BMI < 18.5; RR = 1.36, P < 0.05) and obesity class II+ (BMI ≥ 35; RR = 1.36, P < 0.05). Overweight (BMI 25 to 30) was associated with a significantly decreased risk of death (RR = 0.83, P < 0.05). The RR was close to one for obesity class I (BMI 30-35; RR = 0.95, P < 0.05). Our results are similar to those from other recent studies, confirming that underweight and obesity class II+ are clear risk factors for mortality, and showing that when compared to the acceptable BMI category, overweight appears to be protective against mortality."
"Overweight appears to be protective against mortality." Then why is it called "overweight?"
"Public health," like "public education," was imported to America from Prussia. The Prussian state was founded by a military order of armed monks, who imposed on the people they conquered an order of Christian discipline similar to their own. Their ideal subject was a man optimally suited for military service. Their ideal soldier was a dragoon, that is, a mounted infantryman: Dragons could be used either as highly mobile infantry or as light cavalry. This meant that the ideal soldier, and therefore the ideal Prussian subject, had to be light enough to ride all day without exhausting the horse. The acceptable weight for conscripting a Prussian dragoon is still with us as the range of "acceptable weight" used in public health studies. Adapted to America's greater variation of human height by substituting height-adjusted BMI for weight, the old Prussian standard of "acceptable weight" remains in world-wide "public health" use to this day.
An objective science of human health would set ideal weight to the weight at which the likelihoods of disease and death from disease are minimized. The corresponding measurement is the relative risk of death: the ideal weight is the weight at which the long-term (say, 12-year) risk of death is at its local minimum. In other words, the real, objective ideal weight has nothing to do with the desiderata of the Prussian General Staff. It ought to be set by measuring the facts of reality. And, from the facts measured to date, it is clear that the objectively optimal weight is nothing like the "acceptable weight" found in "public health" directives. It is almost certainly somewhere in the range that "public health" professionals call "overweight": BMI between 25.1 and 29.9.
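A minimal sketch of that measurement, assuming hypothetical arrays of cohort weights and 12-year relative risks (the function name and the smoothing window are mine, illustrative only, not from any published study):

import numpy as np

def ideal_weight(weights, relative_risk, window=5):
    # Sort the cohort by weight, smooth the risk curve, and return the
    # weight at the local minimum of long-term relative risk of death.
    order = np.argsort(weights)
    w, rr = weights[order], relative_risk[order]
    kernel = np.ones(window) / window
    smooth = np.convolve(rr, kernel, mode="valid")  # avoids zero-padding at the edges
    return w[np.argmin(smooth) + window // 2]       # re-center the smoothed index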
From the perspective of objective scientific methodology there is much wrong with BMI as the independent variable in health research. Optimal weight should be measured by plotting long-term (e.g. 12-year) mortality versus actual weight in the context of sex/gender, age and height. Unfortunately, I do not have access to the raw data that I would need to set an objective target range for my own weight. In the absence of such data, I use a target of BMI 27.5, the midpoint of the BMI range with the lowest observed mortality risk in nearly all quantitative studies to date.
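Turning that target BMI into a personal target weight is simple arithmetic, since BMI is weight in kilograms divided by the square of height in meters. A minimal sketch (the helper names are mine; 27.5 is the target defended above):

def bmi(weight_kg, height_m):
    # Body mass index: weight in kilograms over height in meters, squared.
    return weight_kg / height_m ** 2

def target_weight(height_m, target_bmi=27.5):
    # Invert the BMI formula to get the weight corresponding to a target BMI.
    return target_bmi * height_m ** 2

print(round(target_weight(1.75), 1))  # a 1.75 m adult: about 84.2 kg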
The continuing use of the Prussian "acceptable weight" ranges, objectively known to be sub-optimal for human life and health, should be an epistemic scandal. It is a public folly with political uses. It permits "public health" authoritarians to claim that individual choice must be restricted to save us from the supposed epidemic of fat. Because if one accepts the Prussian pseudo-standard, 68% of Americans are overweight or obese. And this Prussian pseudo-standard is seldom challenged, because Americans "educated" in Prussian-standard public schools are so concept-deprived that they will believe anything, as long as it comes with a number and a percent sign somewhere - and will submit to the authority of the hoax.
Monday, January 25, 2010
Leonard Peikoff recommends Ira Levin's "This Perfect Day"
(I am posting this from Blogger because it was blocked from being posted directly on Facebook - someone reported the link as "abusive." Blast the coward's veto!)
Leonard Peikoff writes: ... it is rare to find anyone who understands the basic identity of Christianity and Communism. In this context, therefore, I want to plug an old dystopian novel, This Perfect Day, by a good writer, Ira Levin, whose works and ideas are mixed — at different points of his life he was an admirer and then an enemy of Objectivism. Despite its philosophic inconsistencies, however, I found the book compelling for a number of reasons, but the relevant one here is indicated by this example: the children in the book’s future totalitarian state learned to skip rope while reciting a paean to four heroes on whom their way of life is based. Two of the four are fictional. The other two are Marx and Christ.
(Full text: Impact, newsletter of the Ayn Rand Institute - PDF)
Wednesday, January 20, 2010
A practical guideline for Objectivist activism on political issues
This practical guideline is a follow-up on "A Radical Strategy for Objectivists."
The only proper function of government, and therefore the only principled foundation for Objectivist activism on any political issue, is the protection of individual human rights. Rights merit my activism because they are the prerequisite conditions for living a life appropriate to a human qua human. Because the species-specific evolved means of human survival is the judgment of one's mind, all individual rights are sub-categories of just one fundamental right: the right to live by the judgment of one's own mind. Therefore the essential guideline for Objectivist political activism is to relate the target issue to living by the judgment of one's own mind.
Corollaries:
1. If I don't understand how the political issue at hand relates to living by the judgment of one's own mind, then I don't understand it well enough to engage in principled activism on the issue.
2. If, in the context of writing for the purpose of Objectivist political activism, I fail to link the political issue that I'm writing about to living by the judgment of one's own mind, then what I have written is not a contribution to principled Objectivist activism.
Tuesday, January 19, 2010
Under the Influence of Confirmation Bias
Quantitative and computer-based methods and models can help - or hinder - the transformation of data into knowledge. Whether they help or hinder depends on how well their designers and users know the strengths and weaknesses not only of computers, but of their own minds. The "Climategate" files, the working documents and correspondence of the East Anglia climate modelers, show what happens when the perils of confirmation bias, and of other defects of intuition, are ignored by those whose job it is to build knowledge from data.
Confirmation bias should not be new to scientists. In the 1970s and 1980s, many reputable physical scientists processed experimental data in ways that suggested, and claimed to demonstrate, the reality of telepathy and telekinesis. When the measurements used to support paranormal claims were examined by human and social scientists familiar with the operation of cognitive bias, those claims turned out to be undemonstrable. The East Anglia climate files suggest that some of the physical scientists involved in climate research are still ignorant of the hazards of cognitive bias, and are mired in methodological errors that replicate the errors of historical parapsychologists.
Confirmation bias is likely to be involved when some data points in a dataset are suspected of mismeasurement and are "corrected." The correction procedure, when selected or written under the influence of confirmation bias, will tend to confirm the model favored by those doing the "correction." The only defense against confirmation bias in such cases is to compare "corrected" data with the original raw data, to check that the corrections are neutral with respect to departures from the favored model. Not only was this not done by the East Anglia climate team, but the original record may have been destroyed.
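One way to run that comparison, sketched minimally below with entirely hypothetical data (none of the names or numbers come from the East Anglia files): if the adjustments applied during "correction" correlate strongly with the favored model's residuals on the raw data, the corrections are pulling the data toward the model rather than being neutral.

import numpy as np

def correction_bias(raw, corrected, model_prediction):
    # Correlate what the "correction" changed with how far the raw data
    # departed from the favored model; near 0 is neutral, near 1 is a red flag.
    adjustments = corrected - raw
    departures = model_prediction - raw
    return np.corrcoef(adjustments, departures)[0, 1]

# Hypothetical illustration: "corrections" that nudge raw data toward the model.
rng = np.random.default_rng(0)
raw = rng.normal(size=200)
model = np.full_like(raw, 0.5)
corrected = raw + 0.3 * (model - raw)
print(correction_bias(raw, corrected, model))  # ~1.0: maximally biased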
Confirmation bias may enter when numbers not available from records are extrapolated from other data. Such extrapolation is valid - when the extrapolated relationship is uniform across all contexts in which it has been measured. Many long-term climate studies (not only those of the East Anglia team) rely on extrapolations of temperature from tree rings. Those extrapolations depend on a relation between tree rings and temperatures that held from the mid-1700s to around 1960, but that relation has not been observed since about 1961. No one has given any physical reason for supposing that the relation between tree rings and temperature hundreds or thousands of years ago was more like the relation observed in the nineteenth century, than like that measured between 1961 and today. Experience with the extrapolations of parapsychologists in the 1970s and 1980s suggests that an extrapolation may be chosen because its result fits the model that the extrapolator wishes to confirm.
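The uniformity requirement can itself be tested wherever proxy and instrument overlap. A minimal sketch, assuming hypothetical arrays of years, ring widths, and measured temperatures: calibrate the proxy relation on one sub-period and check it on a held-out sub-period; a relation that fails out of sample, as the tree-ring relation fails after 1961, cannot honestly be extrapolated to distant centuries.

import numpy as np

def split_calibration(years, ring_width, temperature, split_year):
    # Fit the linear proxy relation on the calibration period only...
    cal = years < split_year
    slope, intercept = np.polyfit(ring_width[cal], temperature[cal], 1)
    # ...then measure its error on the held-out validation period.
    predicted = slope * ring_width[~cal] + intercept
    rmse = np.sqrt(np.mean((predicted - temperature[~cal]) ** 2))
    return slope, intercept, rmse  # a large validation RMSE disconfirms uniformity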
The final playground of confirmation bias is in the causes considered to explain the data. The spectacular rise in global temperatures near the end of the 1990s corresponded to, among other things, highs of the Atlantic multidecadal oscillation and of the solar irradiance and solar flux cycles. Models that test the anthropogenic warming hypothesis only against the null hypothesis are biased in their implicit assumption that nothing else could have contributed to the observed warming. When there may be several causes, the analyses should include a multivariate assessment of their relative contribution to the observed results.
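A minimal sketch of such an assessment, with hypothetical predictor series (numpy arrays) standing in for the candidate causes; ordinary least squares is one simple way to apportion an observed trend among several predictors at once.

import numpy as np

def relative_contributions(temperature, predictors):
    # predictors: dict mapping a cause's name (e.g. an ocean oscillation index,
    # solar flux, a greenhouse forcing series) to its time series.
    names = list(predictors)
    X = np.column_stack([predictors[n] for n in names] + [np.ones_like(temperature)])
    coefs, *_ = np.linalg.lstsq(X, temperature, rcond=None)
    # Each cause's contribution: its coefficient times its swing over the record.
    return {n: c * (predictors[n].max() - predictors[n].min())
            for n, c in zip(names, coefs[:-1])}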
The study of climate change has become as complicated, and as fraught with social implications, as any issue in the social and human sciences. Fortunately, the methods developed in the social and human sciences to produce valid knowledge about complicated issues, even in the presence of inevitable human bias, stand ready to be used in climate science as well. It may well be that the current conclusions of climate scientists will be confirmed when re-evaluated by methods that are equal to the task. We cannot know, unless all their data and all their methods are held open to inquiry.
(This was a draft for some OpEd submissions I sent to popular science media. It was not published - they had their hands full with submissions from scientists much more directly involved with the climate sciences.)
Tuesday, January 05, 2010
A Radical Strategy for Objectivists
Most people's New Year resolutions have to do with a new commitment to act in accordance with their values. Little thought is given to identifying the course of action that will lead to the realization of those values. Most non-Objectivists, especially those of the Pragmatist kind, consider the relation between values and actions intuitively obvious. Given what we now know, from advances in cognitive science, about the fallibility of intuition, an Objectivist will start the New Year with a principled, conceptual analysis of the actions needed to actualize his or her values. The most general principles relating action to values constitute a strategy for the achievement of those values.
Why do Objectivists need a radical new strategy? Since Objectivism is the system of philosophical principles identified by Ayn Rand, is it not enough to follow Ayn Rand's own strategy from, say, 1964, as many Objectivists have been doing for much of the year 2009? Well, no, it isn't. Ayn Rand's 1964 strategy, in the Goldwater campaign, was so counterproductive to the achievement of Objectivist values that she never again collaborated with a Conservative (or "Libertarian") campaign or organization. The main result of 1964 was that by the end of the year, Ayn Rand's name was on the lips of multitudes of Libertarians and Conservatives, millions of whom would have called themselves Objectivists if Ayn Rand had not been alive to stop them. The subsequent drift of Conservatives in the direction of advocating a Christian Theocracy for America, and the drift of Libertarians toward advocating Anarchism - and the political empowerment derived by both from ripping off Ayn Rand's sound bites for use as slogans, in causes fundamentally opposed to her values of individual rights and of individuals using their minds in the service of their own lives and their happiness on Earth - have demonstrated Ayn Rand's wisdom in dissociating her philosophy from such followers. Rand herself followed a new strategy after 1964. By July 1966, she was ready to start publishing, in The Objectivist, her Introduction to Objectivist Epistemology.
In 2009, the same mindless multitudes of Libertarians and Conservatives found themselves again in need of slogans, and once again drafted Ayn Rand (who, being dead, could no longer object) into their service. In the intervening years, American Conservatives had become outright Christianists. As I documented in October (Three Democides by False Morality: Part III, The Ban On Cloning), the Conservative/Christianist movement eventually allied itself with the worst elements of the anti-technology Left to produce a de-facto (and increasingly de-jure) ban on medical research into cloning-based technologies to reverse organ failure. (And NO, this is not about stem cells: read my essay.) Man's natural lifespan is the lifespan that humans would enjoy by the natural use of Man's natural organ of survival: our minds. For every year of delay in the development of cloning-based cures for organ failure, around 3.8 million individual humans will die (more accurately, will have been murdered by the ban on cloning) short of their natural lifespan. We are now in the 12th year of the de-facto ban: a rough estimate of the number of individuals already murdered by our Christianists ("Conservatives") is 45 million and counting. Other things being equal, a human living today is ten million times more likely to die of Christianist democide than of Islamist terrorism. Yet while in 2009 many Objectivists spoke and wrote about the Islamist threat, the ongoing Christianist democide remains largely unmentioned, even among Objectivists, possibly from fear of alienating potential "allies" (of Objectivism?) in the Conservative movement.
Exposure of Ayn Rand and Objectivism in Conservative media has the direct negative consequence of energizing Conservative activism and bringing more Conservative politicians to power. This can only perpetuate the ongoing democidal restraints on medical cloning research, as well as conservative strangling of individual rights in the areas of freedom of speech, abortion, immigration, sexuality, medical relief of pain; and promote the ongoing subjectivization of "criminal justice," and, more generally, government-enforced adherence to Christian "moral standards." The supposed benefit is greater exposure of Objectivism. In the case of the more naive sections of the public, especially those indoctrinated into an anti-conceptual mentality by the Pragmatist comprachicos who run America's schools, this will mean greater use among the public of Ayn Rand quotes, not as principles but as slogans. More will self-identify as "Objectivists" and thus associate Objectivism, in the minds of their friends, contacts and neighbors, with whatever nonsense those self-identified "Objectivists" happen to favor. At the top, eventually we will find self-identified "Objectivists" in positions of political power. Given the enormous harm done to the reputation of Ayn Rand and Objectivism by just one Objectivism-plated Pragmatist, Alan Greenspan, the harm that could be done by future herds of Objectivism-plated Libertarians and Objectivism-plated Conservatives is best left to the imagination.
In "It Is Earlier Than You Think," published in December 1964, Ayn Rand demonstrates a method for formulating a new strategy. But to use her method in 2010, one must first account for what has changed.
In 1964, Marxism was the only significant ideology of academics around the world. Its only secular competitor in America was Pragmatism, an anti-intellectual anti-ideology relegated mainly to Schools of Education. Supernaturalism was on its last legs, leaving even theologians in a desperate quest for religion without God. Rand's strategy, in "It Is Earlier Than You Think," was to prepare Objectivists to do battle with the Marxists - and to fill the vacuum when Marxism collapsed.
Marxism collapsed much earlier than anticipated: it disconfirmed itself with the implosion of Communism in the late 1980s, long before there were enough Objectivist academics to step into its place. The vacuum was filled by a resurgence (more by bloating from gaseous putrefaction than from intellectual revival) of supernaturalism and Pragmatism. Both supernaturalism and Pragmatism interpreted the disconfirmation of Marxism as showing that it was futile for the human mind to attempt a principled, and applicable, understanding of human existence on Earth. With Marxism deflated, and Objectivists still waiting for tenure, supernaturalism and Pragmatism - each complementing the other, with the effect of a Hegelian "synthesis in praxis" - took over the academy and the culture.
One effect of the supernaturalist-Pragmatist takeover of American education and culture is that the typical American of 2010 lives in a state not merely of value-deprivation, as was already the case in 1964, but of concept-deprivation. Americans no longer hold what had been, from the re-discovery of Aristotle and Archimedes in the Renaissance to the collapse of Marxism in the 1980s, the central idea of Western Civilization: that reality can be made sense of by the human mind. The function of Pragmatist schooling is to keep the student's mind from ever reaching what Piaget calls the stage of "formal operations." This means that exposure to Ayn Rand and Objectivism in the public arena does not function as exposure to Objectivist ideas, which would undermine and displace the results of supernaturalist and Pragmatist indoctrination in the mind of the listener. All that happens is that statements of Objectivist ideas are added (as slogans, not as ideas) to the existing inchoate slurry of supernaturalist-Pragmatist notions in the listener's head. Working for mere exposure of Objectivist ideas in the public arena today is futility in action.
Before it again becomes possible for the bulk of Americans to understand Objectivism, one must restore their ability to think in concepts and principles, and give them confidence that reality can be made sense of by the human mind.
How can this be most effectively done?
I expect every Objectivist to defend his or her values against existential threats, including threats from the realm of politics and culture. It is right for health care professionals to fight their prospective enslavement, for businessmen to fight against non-objective laws and arbitrary regulations, for teachers to fight for the rights of their students, and for everyone to fight for the right to speak and act according to the judgment of his or her individual mind. It is possible, and desirable, to use every argument not only to defend the specific values at stake, but to demonstrate, implicitly or explicitly, the power of Objectivist epistemology. To break a culture that associates principles with un-Earthly supernaturalism, and facts with anti-intellectual Pragmatism, the Objectivist's arguments should insist on, and exploit, the Objectivist linkage between ideas and the facts of reality. (I offer this Op-Ed of mine as an example of how the two can be and ought to be linked.) For many of us, including this linkage in our everyday activism can be the easiest way to infiltrate Objectivist epistemology into the minds of our fellows.
For Objectivist academics and teachers, the deliverable is to replace Pragmatist curricula, Pragmatist textbooks and Pragmatist assessments of knowledge with conceptual, principled curricula, books and tests in the fields, disciplines and schools in which we teach. (As a kind of "demonstration project," I am now in the process of writing, together with John Drake, a radically new, conceptual, principled introductory text in Information Systems.) Nearly any field of study, at just about any level, can be used to introduce students to the art of conceptual thought. I plan to write more about this in the near future.
And, for just about every Objectivist in America, there is the option of running for, and serving on, the local school board, where even one Objectivist may be able to replace at least some Pragmatist syllabi with principled texts that teach the application of abstract conceptual thinking to the solution of real-world problems. In some fields such schoolbooks already exist, in English - but only abroad, from Ireland or India or South Africa or Singapore. As a former elected member of a local school board, I plan to write about this topic, too, at some length.
As Ayn Rand often reminds us, the advocacy of Objectivism is primarily - before anything else - the advocacy of reason. We now live in a culture in which hardly anyone knows what reason is. This will make effective advocacy of Objectivism in the coming decades a demanding - and rewarding - project for every Objectivist.
Tuesday, November 24, 2009
Reading Justice at the Thanksgiving Table
It could be my Jewish heritage, but I think that it is better to start a celebration with a reading than with a mere saying. I plan to read the following:
Let us read justice to the men and women whom we thank this evening. In the words of Ayn Rand: "Thousands of years ago, the first man discovered how to make fire. He was probably burned at the stake he had taught his brothers to light. ... Centuries later, the first man invented the wheel. He was probably torn on the rack he had taught his brothers to build. ... Throughout centuries there were men who took first steps down new roads armed with nothing but their own vision. Their goals differed, but they all had this in common: that the step was first, the road was new, the vision unborrowed. ... The creators - the thinkers, the artists, the scientists, the inventors - stood alone against the men of their time. Every great new thought was opposed. Every great new invention was denounced. The first motor was considered foolish. The airplane was considered impossible. The power loom was considered vicious. Anesthesia was considered sinful. But the men of unborrowed vision went ahead. They fought, they suffered, and they paid. But they won." We celebrate their victories, and our own.
Wednesday, November 04, 2009
The Prosecutors and the Astrologer
I have not had much time to post, but this is so outlandish that I'll just make do with less sleep later.
The Associated Press reports,
The Supreme Court on Wednesday seemed worried that allowing people to sue prosecutors who fabricate evidence to win convictions might chill other prosecutions... The case in front of the high court involves two former Pottawattamie County, Iowa, prosecutors, Attorney Dave Richter and his assistant Joseph Hrvol. They are being sued by Curtis W. McGhee Jr. and Terry Harrington, who were convicted of first-degree murder and sentenced to life in prison in 1978 for the death of retired police officer John Schweer. The men were released from prison after 25 years.
Evidence showed the prosecutors had failed to share evidence that pointed to another man, Charles Gates, as a possible suspect in Schweer's slaying.
They later on denied that Gates was even a suspect, even though witnesses placed him near the scene of the crime and his name appeared in several police reports. He also was administered and failed a polygraph test and the prosecutors themselves even consulted an astrologer about their suspicions of Gates.
McGhee and Harrington filed lawsuits against the former prosecutors, saying as prosecutors Richter and Hrvol had them arrested without probable cause, coerced and coached witnesses, fabricated evidence against them and concealed evidence that could have cleared them.
Or, in short: All the evidence pointed to Charles Gates as the murderer. The prosecutors consulted an astrologer, who told them that Gates was innocent. The prosecutors believed that what the astrologer told them was supernaturally true, trumping any actual evidence. So they hid the real evidence, and used fabricated, fake "evidence" to deprive two innocent men of nearly the entire span of their adult lives. But the two prosecutors have a fireproof defense from any criminal charge: they acted "in good faith," sincerely believing in Astrology and its truth. And now the victims of those two publicly employed swindlers may be deprived of even the right to sue those malefactors for civil justice - out of fear that holding future prosecutors accountable will hold them back from doing "their job" in judicial combat against future defendants.
And this is the payback of Kantian philosophy: reality is not knowable; the best that Justice can do is trial by combat, and in combat nothing counts except the result. Other countries, such as Switzerland, do have justice systems based on the Enlightenment notion of objective fact. We Americans have trial by combat, as was done in the Dark Ages, guided by supernatural forces, only hacking at each other with lawyers instead of halberds. And when the stars or the Gods have spoken, innocent men who have had the better part of their lives taken from them may have no recourse at all.
Life on the Edge of Implosion of Democracy
Back when I left Bell Labs, and decided to switch coasts to live with Yoon, I made a risky choice. Tenure-track jobs at universities where I would be able to teach advanced courses were few, and fewer within a comfortable commuting distance from Yoon's home. I took the job at Cal State LA with full knowledge of its moral and existential hazards. But damn it, I didn't expect the implosion of California Democracy to hit just 9 years after I took the job.
I'm posting this because the sudden silence from my end of the wire may have made some readers of this blog uncomfortable, and I don't want anyone to think that I have a problem beyond serious overwork. With a 12-unit per quarter teaching load, overwork was a given from the start. That would have been true even in classics, or in medieval history, where the content of courses is unlikely to change much from decade to decade. Teaching 12 units of advanced technical courses in Information Systems, with a 3-year technology half-life and 20% of everything in the typical course becoming obsolete each year, was always Serious Overwork. With research, and with enough hands-on experimentation with new technologies to keep ahead of the graduate students (some of them already CIOs) in my evening classes, the better part of my waking hours was accounted for. And then, this year, came the (financial) crisis of California Democracy.
How does a busy urban school deal with a 16-million-dollar hole in its budget? First, it does not renew the contracts of part-time adjunct faculty. Simultaneously, there is a flow of incoming students whose 529s shrank enough that they can no longer afford private universities, or even the UC. The remaining faculty's advanced courses are cut, and we are assigned to teach the Business School's required Intro to IS and the like. Since there are fewer courses and fewer sections and more students, class sizes nearly tripled, from an average of 12 to an average of 33. I spend most of my class time dealing with e-mailed questions from students; just reading and organizing and preparing to answer those questions, without which I can have no assurance that I'm doing a responsible job, takes three times as long as it used to.
Two of my three 4-unit courses this term (4 units because they cover the content of a 3-unit semester course in one academic quarter) are Intros. And there are NO adequate textbooks for Intros out there. So, following John Drake, I'm teaching my Intro sections with books that were never meant to be textbooks. I have nothing that otherwise would have come from the textbook's Instructor Site: no prepared homework assignments, no presentation PowerPoints, no test question pools (I had no idea how much time such conveniences saved.) And this on top of getting the Intro students (two-thirds of them coming from Pragmatist schools where they never had to do this before) to think in concepts instead of shopping lists.
My one remaining combined Senior-Graduate advanced technical course is up to the same numbers, because so many other courses were canceled. The share of students taking this difficult technical course because they are burning with enthusiasm for its content is down from 100% to 33%; the rest are there because they had to take something, and some of them signed up without the lower-division prerequisites. And the old textbook was 4 years old and obsolete; I switched to a brand new one for which I'm receiving the still-rough supporting materials by e-mail, sometimes in the morning before the evening's class.
The budget for graduate assistants and graders also is gone. I'm typing this as an otherwise-I-would-go-insane break from grading 100 midterm exams.
And we were just advised of an even larger hole next year. So I am lucky, in that I still have a job...
Sunday, October 18, 2009
Is Christianity More Benign Than Islam?
We really need a site about Christianity to parallel Little Green Footballs. Just think what LGF would say if Moslems did what the LA Times has documented the Christian Churches doing ("Churches involved in torture, murder of thousands of African children denounced as witches") - click the title for details.
Friday, October 16, 2009
In other news
Faf (Fafner?) of fafblog.blogspot.com seems a bit of a Libertarian flako, but this is brilliant:
".... In other news, the Nobel Prize for Literature was awarded to a man who set fire to a library and then promised to write a book about it."
(H.T.: Natailya Petrova on Facebook.)
".... In other news, the Nobel Prize for Literature was awarded to a man who set fire to a library and then promised to write a book about it."
(H.T.: Natailya Petrova on Facebook.)
Sunday, October 04, 2009
Three Democides by False Morality: Part III, The Ban On Cloning
(This is the third part of a three-part article. Part I is here; Part II is here.)
The third democide by false morality differs from the Stalin and Carson democides in that, unlike its precursors, it was not a simple consequence of a false morality held by millions. Stalin headed a totalitarian regime whose main claim to popular legitimacy was enforcement of the traditional, originally Christian altruist false morality of Russia and Europe. Carson spawned a new, equally anti-human ideology and false morality that did not begin its toll of democide until after it had gained millions of adherents.
The third of the modern democides by false morality started out without a constituency and without anything resembling ideological conviction. It was - and is - mass murder of tens of millions of individuals, originating not from enforcement of false principles but from a false embrace of pseudo-principles, driven not by conviction, but by simple (and simple-minded) opportunism in the service of political power-seeking.
Dick Armey's world-wide de-facto prohibition against medical research into cloning-based organ replacement technology is not a case of political power in the service of false morality, but of false morality in the service of one politician's otherwise unprincipled pursuit of political power.
The first successful organ transplant, a kidney transplant between identical twins, was performed in 1954. It was successful because there is no immune rejection between genetically identical twins. Transplants between individuals who are not genetically identical always relied, and still rely, on chemical suppression of the recipient's immune system, leaving the patient at severe risk of premature death from diseases that someone with a healthy immune system would have been protected against.
Genetically compatible replacement organs can be grown artificially, in a decorticated fetus created by replacing the nucleus of a newly fertilized human egg with the nucleus of a somatic cell from the patient. Ever since the first mammal, a mouse, was cloned in 1986, the combination of somatic cell nuclear transfer with subsequent organ transplantation has been the obvious least-effort technology whose development would essentially end the threat of organ failure as a cause of death in the developed world. Fetal transplantation technology has been routinely used, since at least 2004, in replacement of small organs such as the retina of the eye. All that remains for the complete organ replacement technology to become practical is experience with growing an actual decorticated fetus cloned from a prospective patient. There are no objective ethical or scientific obstacles to the development of this technology - only political ones.
The moral aspects of cloning have been sufficiently addressed in several articles by Alex Epstein of the Ayn Rand Institute, especially his "Cloning is Moral," which specifically addressed the characterization of this technology as "growing human beings for spare body parts." To supplement the moral perspective, here is a rough estimate of the number of avoidable deaths that result from each year of delay in the development of cloning-based organ replacement technology:
In 2002 - the most recent year for which government statistics were available when I first wrote on this topic - 696,400 Americans died of heart failure, 124,770 of chronic lung failure, 73,247 of diabetes, 40,801 of kidney failure and 27,247 of liver failure. The total for these five is 762,465 deaths per year in the United States, out of a population of about 300 million. Half of the world's population, about 3 billion -- ten times the population of the United States -- live in countries advanced enough to use therapeutic cloning and fetal organ transplant technology if it were legal. The proportional estimate of death from failure of one of the above 5 major organs -- in advanced countries only -- is about 7.6 million. If only half of those deaths could be eventually prevented by application of cloning and fetal organ transplant technologies, then every year of delay in the development of those technologies results in 3.8 million preventable deaths.
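To make the arithmetic behind that estimate explicit, here is a minimal sketch in Python. It takes the aggregate US figure as given above; the tenfold population scaling and the 50% preventability fraction are the assumptions just stated, not established data.

    # Back-of-the-envelope version of the estimate above, using the text's figures.
    us_organ_failure_deaths_2002 = 762_465  # stated US total for the five major organ failures, 2002
    population_scale = 10                   # ~3 billion people in advanced countries vs ~300 million in the US
    preventable_fraction = 0.5              # assumption: half of these deaths eventually preventable

    advanced_world_deaths = us_organ_failure_deaths_2002 * population_scale  # ~7.6 million per year
    preventable_per_year = advanced_world_deaths * preventable_fraction      # ~3.8 million per year
    print(f"~{preventable_per_year / 1e6:.1f} million preventable deaths per year of delay")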
Given its obvious usefulness for saving millions of lives, the prospect of cloning-based organ replacement technology was something that Americans who understood its potential, including American Christians, generally favored, from the first mouse cloning of 1986 onward. Cloning is an important - and generally benevolent - part of the projected technological context of the future society envisioned by J. Neil Schulman, a recent convert to Christianity, in "The Rainbow Cadenza," his 1986 futuristic novel on the theme of Original Sin. Toward the end of the novel, the protagonist is trying to have her mother, in stasis as a result of organ failure, revived by cloning. It is the protagonist's sister, Judge Vera, the most cruel and generally evil character in the book, who then voices the book's only objection to cloning: "I was supposed to cut out a baby's brain to bring her back?" Tellingly, the perversely anti-technology Judge Vera is a Wiccan. The book's Christians, the author's proxies, have no problem with restoring failed and amputated organs with cloning-based technology. Indeed, until Dick Armey's anti-cloning campaign in the late 1990s, no American would have associated opposition to cloning-based technologies with anyone other than the American Left's marginal anti-technology, anti-Western-Civilization fringe.
Dick Armey, an economics professor at North Texas State University, was elected to Congress in 1984, eventually becoming the leader of so-called "Economic Conservatives" in the Republican Party. In 1994 he collaborated with Newt Gingrich, the leader of the "Social Conservative" faction, in drafting the "Contract with America," which was credited with bringing about the Republican victory in that year's elections. In 1995 Gingrich became Speaker and Armey Majority Leader in the House of Representatives.
After the election, the Social Conservative faction expected the Republican majority in Congress to "deliver" on its key issues: immigration, abortion, and homosexuals. Until that time, Armey, himself a Bible Christian and a congregant of a Bible Church, had counted on, and had received, the support of Social Conservatives in his district. But that district also had many voters with friends and relatives among legal and illegal immigrants, and a university town with predictably libertarian attitudes on abortion and on the rights of Gays and Lesbians. Moreover, as an empirical social scientist of some competence, Armey understood that the three top issues of the Social Conservatives had no traction with the electorate. A genuine effort on those issues would cost him his seat, and could well lead to the loss of a Republican majority in Congress.
This left Armey in search of issues on which he could "deliver" to the Social Conservatives and the Religious Right, without alienating his district's voters from his own candidacy, or the national electorate from the Republican Party. Back in 1989, Armey thought that he had found one such issue in National Endowment for the Arts grants to Andres Serrano and Robert Mapplethorpe, but the NEA backed down without a fight. In 1995 Armey was at a loss. And then came 1996, the year of Dolly the sheep.
Dolly was the first large mammal - not a mere mouse - cloned by somatic cell nuclear transfer. The path to a cloned human fetus was clear. The anti-technology left, including some among Armey's university town constituents, was on fire. Interestingly, now that medical cloning had come closer to imminent reality, its compatibility with Christian morals started to be debated. A part of that debate was an editorial by one Gino Concetti in L'Osservatore Romano, the Vatican newspaper, calling for a ban on human cloning: "A person has the right to be born in a human way and not in the laboratory." Concetti was a working journalist and an ordinary priest, not a philosopher or a theologian or a member of the Curia, and a newspaper editorial was far from an authoritative statement of Catholic Doctrine. But the Washington Post, noting the "semiofficial" reputation of the paper that Concetti's editorial appeared in, headlined a story in its February 27, 1997 edition "Vatican Calls For Ban on Human Cloning."
Dick Armey had his issue.
Contemporary Religious Right Protestants in America are mostly Pragmatist and anti-intellectual. When they need doctrine, they turn to the Catholic Church; the Religious Right's men on the Supreme Court are, to a man, Catholics. Concetti's editorial gave Armey a cause on which, through collaboration with the anti-technology Left in Congress and in both district and national electorates, he had a pragmatic chance to win ("deliver") on an issue that, he may have thought, the Religious Right would care about.
From his position as House Majority Leader, Armey led the formation of a formidable anti-cloning lobby. It was the first lasting political coalition between the theocratic "Right" of James Dobson and the anti-technology Left of Jeremy Rifkin, whose followers and allies came to dominate Clinton's National Bioethics Advisory Commission. By March 1997, political pressure from this unprecedented coalition led President Clinton to sign an executive order banning research on medical cloning in any institution receiving US federal funds, or any organization or enterprise working under contracts with the US government. Clinton's ban effectively terminated any possibility of cloning research at any formal institution, from colleges that enroll students with government-guaranteed education loans, to medical practices treating Medicaid or Medicare patients, to medical drug and technology companies with Medicare contracts. This effectively outflanked Armey, who was left to legislate a more formal legal ban against something that in practice could not take place in America any more.
Armey went ahead, and in January 1998 submitted to the House a permanent ban on cloning humans in the United States. Armey's bill was announced at a news conference with representatives of the Christian Coalition, Dobson's Family Research Council and the National Conference of Catholic Bishops. Jeremy Rifkin simultaneously announced a symmetrical anti-cloning initiative from the Left, at first informal but eventually, when Armey's initiative stalled, producing a statement signed by "64 of the nation's leading progressive policy leaders, academics and activists" in support of Armey's legislation. Under the Senate version of Armey's bill, introduced by Senator Bill Frist, a scientist convicted of human cloning would face up to 10 years in prison.
And then things began to fall apart for Armey. Congressional Democrats, seeing Armey's legislation as a blatant attempt to wrest credit for a cloning ban away from President Clinton, whose executive order had already produced an effective ban, did not go along. Armey, on the strength of Jeremy Rifkin's support, had counted on the support of Congressional "progressives." He didn't get it. And in the Senate, Senator McCain, who saw in biotechnology, including medical cloning, a hope for reversing the disabilities he had suffered from North Vietnamese torture, organized enough resistance to stop Frist's bill. And thus Armey's hope of "delivering" a legislative result to Dobson and the Theocratic Right, a hope for which he was willing to kill millions - some 3.8 million per year of delay in the development of medical cloning - came to naught.
Armey continued to re-introduce his legislation banning all human cloning after each congressional election. After 2001, when Jeremy Rifkin's Foundation on Economic Trends published a statement from 64 prominent anti-technology Leftists in support of Armey's legislation, anti-technology congressional leftists rallied to Armey's bill, which passed twice in the House of Representatives, only to be blocked by McCain's efforts in the Senate. Given the threat that such legislation could pass at any time, thus wiping out all previous investment in cloning-based technologies, private investment predictably stopped. Some work on cloning was included in State-level stem cell initiatives, but the State-level legislation authorizing these initiatives mandated that any cloned embryos be used only to extract stem cells, and in any case destroyed within 10 days, thus eliminating the possibility of developing organ-replacement procedures.
With the election of a Republican president in 2000, Armey's effort acquired a world-wide dimension. George W. Bush aligned his presidency with the theocratic wing of the Republican Party, and while the Constitution limited how far his theocratic agenda could be taken inside the country, as President he felt entitled to conduct the foreign policy of the United States pretty much as he pleased. The legislatures of countries striving to maintain friendly relations with the United States found themselves under pressure to enact their versions of Armey's bill domestically, and to join Bush's push for an international treaty to ban cloning worldwide, even while in the United States a formal ban of this kind was replaced with comprehensive funding restrictions, regulatory directives, and, to back it up and intimidate potential private investors, the threat of Armey's legislation.
The reader is invited to refer to a lengthy scholarly article by Thomas Banchoff for a detailed study of the great theocratic power grab for a global ban on medical cloning. In brief, there was no consensus among the various Christian and Islamic sects about the morality of cloning; Jewish religious authorities were unanimously, even among the most Orthodox, supportive of cloning, declaring it to be no less than a religious obligation when done to save a fully developed human life (while also mandating early decortication of fetuses cloned for medical applications, "ensuring that the embryos used in this research are not brought to a point which constitutes human-hood.")
Countries with tax-supported, politically influential Catholic and Evangelical churches (such as Costa Rica and Germany) were, as would be expected, among the first to ban all human cloning in their national legislation, and to advocate a global ban through a UN-sponsored international treaty. Such countries, however, represented only a small fraction of the population of the world, and they would not have stepped forward to urge such a global treaty without the initiative of a trio of unusual allies: the United States (actually the administration of President George W. Bush), the Vatican, and Saudi Arabia.
For the Vatican, cloning was always a minor issue, minuscule in comparison with abortion, or with equal marriage for same-sex couples. But it was also, as it was for Dick Armey, an issue with which the Vatican hoped to score deliverables. If a global treaty to ban cloning were successful, it would also establish a global precedent for an international regime based on religious rules rather than purported concern for the national interests of participating countries. Such a precedent would open the door to global bans on other supposedly "immoral" human action; abortion or equal marriage could then be next.
As for Saudi Arabia, its advocacy for a cloning ban was expected to be particularly effective in the Islamic world, as Saudi Arabia was both the site of Islam's two most holy pilgrimage destinations, and the model of strict enforcement of Islamic religious law. Saudi Arabia was an absolute monarchy, its royal family having close business, political and personal ties with President Bush, and more-or-less completely dependent on the United States commercially, politically and militarily. Kuwait, politically and militarily dependent on the United States for defense against Iraq, joined Saudi Arabia on Bush's side.
The Bush administration deployed every instrument of pressure it could to create an anti-cloning majority at the UN. That majority was largely composed of small countries that depended for their existence on military, political or economic support of the United States. This majority also included those Islamic countries that depended on Saudi Arabia or Kuwait for cheap oil and handouts. It also included Israel. Israelis, whether religious or secular, held (and still hold) an unusually positive view of science, technology, and especially of medical technologies, such as cloning, that promise to be useful in the defense of human life. In medical research and invention Israel was already a world leader, on par with the United States and Switzerland. But Israel's political leaders were (and still are) in the grasp of an expensive national-collectivist ideology that made them abjectly dependent on American appropriations, which could only originate in the US House of Representatives, which was firmly under the control of Dick Armey. And so Israel passed domestic anti-cloning legislation, and joined the US-Vatican-Saudi-led anti-cloning side at the United Nations.
On the opposite side was an equally ad hoc alliance of independent countries with secular majorities or secular constitutions, such as Great Britain, Turkey, and South Korea; the more secular countries of Europe; and countries determined to spite President Bush: China, Russia, and of course Iran. It was the ultimate inversion of sense: the United States and Israel on the side of theocratic mass murder; Iran on the side of technology and of the freedom of science.
Ultimately, the world was saved from the prospect of a global ban on cloning by the fact that even the most abject of diplomats is not without some concern for the continuation of his own life. And so the ban was changed into a non-binding resolution that called on member states "to prohibit all forms of human cloning inasmuch as they are incompatible with human dignity and the protection of human life." They agreed to disagree, of course, on the exact meaning of "inasmuch" in that declaration. But the chilling effects of Bush's and Armey's efforts on investment in cloning technologies continue, and so do the regulatory barriers that stand in the way of research on medical cloning in the United States, and the legislated barriers abroad.
As of 2009, medical research into cloning-based organ-replacement technologies has been at a standstill since 1998. With 3.8 million avoidable deaths for every year of delay in the development of these technologies, the death toll to date - eleven years at 3.8 million per year - is close to 42 million, rivaling the number of victims of Rachel Carson, and close to the number murdered by Stalin and Hitler together. And what good did this exercise in mass murder do for Dick Armey?
James Dobson, who sponsored the press conference that announced Armey's legislation to the world, wanted a visible triumph of Faith. With both a legislative ban in the United States having no hope of becoming law, and a global anti-cloning treaty demoted to a non-binding declaration, a mere chilling effect was not the triumph of Faith that Dobson wanted, even if it was still killing 3.8 million people a year. And with only peripheral action in Congress on Dobson's big issues - on abortion and on equal marriage for Gays - the public perception of Dobson having enough Washington pull to be worth paying off was vanishing. As Armey was to write later, "As Majority Leader, I remember vividly a meeting with the House leadership where Dobson scolded us for having failed to 'deliver' for Christian conservatives, that we owed our majority to him, and that he had the power to take our jobs back. This offended me, and I told him so."
Offended or not, Armey practically conceded that Dobson did have the power to "take Armey's job back": he resigned from Congress in 2002. Having sat on the fence between Republican Theocrats and Republican Pragmatists throughout his tenure in Congress, in retirement Armey began to identify explicitly with the Pragmatists. The name of Armey's political organization, "FreedomWorks," is an explicit riff on the Pragmatist anti-principle, "whatever works." Armed with the anti-principle of having no principles of his own, Armey has been known to talk about "separation of Church and State" as though he had not been theocracy's standard bearer when he advocated his cloning ban, and murdered some 40 million people by the threat of this ban, only a few years before.
A Personal Postscript
As recently as 1997, I had a reasonable hope of living long enough for cloning-based organ-replacement technology to become available - and then of going on to live practically forever. After 11 years of delay, and the prospect of more delay to come, that hope is no longer reasonable. Like the millions of Ukrainians who lost their lives because Stalin's false morality prohibited trade in food, and like the millions of Africans who lost their lives because Rachel Carson's false morality prohibited spraying mosquito swamps with DDT, I am one of millions who are losing our lives because Dick Armey's false morality barred the imminent development of cloning-based organ replacement technologies.
Of course Dick Armey, like Joe Stalin and Rachel Carson, didn't do it alone. Dick Armey's unique contribution was to yoke together an unprecedented (and unlikely) coalition of anti-science, anti-reason, and anti-technology activists spanning the spectrum from James Dobson to Jeremy Rifkin. Armey eventually lost the support of some of his former collaborators, but he is still in the coalition business. Armey's new coalition - the Tea Party movement, sponsored and organized by Armey's FreedomWorks - embraces everyone who despises the Obama program. It is of course preposterous to think that Objectivists, who oppose ObamaCare because it would enslave the providers of health care, and Theocrats, who oppose it because insurance companies that provide coverage for abortion would not be excluded from selling policies under the proposed Federal mandate, have something (or anything) in common. Dick Armey is counting on his new coalition to take him to the White House in 2012. The good news is that by 2012 Armey will be older than any first-time presidential candidate in history. And by then, he may well be dead of organ failure. Or, more accurately, of suicide by false morality - and by lack of principle.