
The childhood shows the man, as morning shows the day.

John Milton, Paradise Regained

Nurture is reversible; nature is not. That is the reason responsible intellectuals have spent a century preferring the cheerful meliorism of environment to the bleak Calvinism of genes. But what if there were a planet where it was the other way around? Suppose some scientist discovered a world in which lived intelligent creatures whose nurture was something they could do nothing about, whereas their genes were exquisitely sensitive to the world in which they lived.

Suppose no more. In this chapter I intend to start convincing you that you live on precisely such a planet. To the extent that people are products of nurture, in the narrowly parental sense of the word, they are largely the products of early and irreversible events. To the extent that they are the product of genes, they are expressing new effects right into adulthood, and often those effects are at the mercy of the way they live. This is one of those contrarian surprises that science delights in delivering, and it is one of the least recognized and most significant discoveries of recent years. Even its discoverers, steeped as they are in the issue of nature versus nurture, are only dimly aware of how revolutionary their discoveries are.

In 1909, in the Danube marshes near Altenberg in eastern Austria, a six-year-old boy named Konrad and his friend Gretl were given two new-hatched ducklings by a neighbor. The ducklings became imprinted on the children and followed them everywhere, mistaking them for parents. “What we didn’t notice,” said Konrad 64 years later, “is that I got imprinted on the ducks in the process…. A lifelong endeavour is fixed by one decisive experience in early youth.” In 1935 Konrad Lorenz, by then married to Gretl, described rather more scientifically how a gosling, soon after hatching, will fixate on, and follow, the first moving thing it encounters. That moving thing is usually its mother, but occasionally it turns out to be a goateed professor. Lorenz realized that there was a narrow window of time during which this imprinting could occur. If the gosling was less than 15 hours or more than three days old, it would not imprint. Once imprinted, it was stuck and could not learn to follow a different foster parent.

Lorenz was not actually the first to describe imprinting. More than 60 years before, the English naturalist Douglas Alexander Spalding spoke of early experience being “stamped in” to a young animal’s mind—virtually the same metaphor. Little is known about Spalding, but that little is refreshingly exotic. John Stuart Mill, having met Spalding in Avignon, got him the job of tutor to the elder brother of Bertrand Russell. Russell’s parents, Viscount and Viscountess Amberley, thought it would be wrong for Spalding, a consumptive, to reproduce. But they thought it equally wrong that a man’s natural sexual urges should be denied, so they decided that the dilemma should be solved in the obvious way: by Lady Amberley personally. Dutifully she did so, but in 1874, she died, followed in 1876 by her husband, who had named Spalding as one of Bertrand Russell’s guardians. The revelation of the affair appalled the aged grandfather, Earl Russell, who promptly took over the guardianship of young Bertrand before himself dying in 1878. Spalding, meanwhile, had died in 1877 of his tuberculosis.

The obscure hero of this Greek tragedy seems in his few writings to have anticipated many of the great themes of twentieth-century psychology, including behaviorism. He also described how a newborn chick “will follow any moving object. And, when guided by sight alone, they seem to have no more disposition to follow a hen than to follow a duck or a human being…. There is the instinct to follow; and the ear, prior to experience, attaches them to the right object.” Spalding even remarked on how a chick kept hooded for the first four days of life immediately fled from him when unhooded, whereas if it had been unhooded the day before, it would have run to him.

But Spalding went unnoticed, and it was Lorenz who put imprinting (in German, Prägung) on the scientific map. It was Lorenz who formed the concept of the critical period—the window during which environment acts irreversibly upon the development of behavior. For Lorenz the importance of imprinting was that it was itself an instinct. The tendency to imprint on a parent is innate in the new-hatched gosling. It cannot possibly be learned, for it is the bird’s first experience. At a time when the study of behavior was dominated by conditioned reflexes and associations, Lorenz saw his role as rehabilitating innateness. In 1937 Niko Tinbergen spent the spring with Lorenz at Altenberg, and between them they invented the science of ethology—the study of animal instincts. Concepts like displacement (doing something else when prevented from doing what is desired), releasers (the environmental triggers of instinct), and fixed action patterns (subprograms of an instinct) were born. Tinbergen and Lorenz were awarded the Nobel Prize in 1973 for the work which had begun in 1937.

But there is another way to view imprinting: as a product of the environment. After all, the gosling will not follow unless there is something to follow. Once it has followed one kind of “mother” it will prefer to follow one which looks like that. But before then, it is open-minded about what “mother” looks like. From a different perspective, Lorenz had discovered how the external environment shapes behavior just as much as the internal drive does. Imprinting could be recruited to the nurture camp as surely as it was recruited to the nature camp: a gosling can be taught to follow anything that moves.

A duckling, however, is different. Despite his boyhood success with ducklings, the adult Lorenz could not easily get mallard ducklings to imprint upon him until he tried making mallard-like noises. Then they followed him with enthusiasm. The ducklings need both to see and to hear their mother. In the early 1960s, Gilbert Gottlieb did a series of experiments to explore how this works. He found that naive newborn ducklings of either mallard or wood ducks had a preference for the calls of their own species. That is, despite never having heard their own species’ call, they knew the right sound when they heard it. But Gottlieb then tried to complicate things and got a surprising result. He muted the ducklings themselves by operating on their vocal cords while they were still in the egg. Now the ducklings, on hatching, had no preference for their own species of mother. Gottlieb concluded that the ducklings knew the right call only because they had heard their own voices before hatching. This, he felt, undermined the whole notion of instinct by bringing an environmental trigger in before birth.

THE SCARS OF GESTATION

If the influence of the environment is partly prenatal, then the environment begins to sound a lot less like a malleable force and more like fate. Is this a peculiarity of ducks and geese, or are people also imprinted by the early environment with certain unvarying characteristics? Start with the medical clues. In 1989, a medical scientist named David Barker analyzed the fate of more than 5,600 men born between 1911 and 1930 in six districts of Hertfordshire in southern England. Those who had weighed the least at birth and at one year old went on to have the highest death rates from ischemic heart disease. The risk of death was nearly three times as great in the light babies as in the heavy babies.

Barker’s result attracted much attention. It was no surprise that heavier babies should be healthier, but it was a great surprise that they should be less vulnerable to a disease of old age, and one, moreover, for which the causes were supposedly well known. Here was evidence that heart disease is influenced less by how much cream you eat as an adult than by how thin you were at one year old. Barker has gone on to confirm the same result in data from other parts of the world for heart disease, stroke, and diabetes. For instance, among 4,600 men born in Helsinki University Hospital between 1934 and 1944, those who were thin or light at birth and at one year old were far more likely to die of coronary heart disease. Barker puts it this way: had none of these people been thin as babies, there would have been half as much coronary heart disease later—a huge potential improvement in public health.
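Barker’s “half as much coronary heart disease” can be checked on the back of an envelope with the standard epidemiological formula for the population attributable fraction. Here is a minimal sketch in Python; the threefold relative risk comes from the Hertfordshire figures above, while the assumed prevalence of thinness among babies is an invented illustrative number, not one of Barker’s.

    # Back-of-envelope check on Barker's claim. The relative risk of 3 is
    # from the Hertfordshire study above; the prevalence of thinness
    # (one-third of babies) is an invented illustrative figure.

    def attributable_fraction(prevalence: float, relative_risk: float) -> float:
        """Fraction of all cases that would vanish if the exposure were removed."""
        excess = prevalence * (relative_risk - 1)
        return excess / (1 + excess)

    print(f"{attributable_fraction(prevalence=1/3, relative_risk=3):.0%}")
    # -> 40%, in the neighborhood of Barker's "half"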

Barker argues that heart disease cannot be understood as an accumulation of environmental effects during life. “Rather, the consequences of some influences, including a high body mass in childhood, depend on events at early critical stages of development. This embodies the concept of developmental ‘switches’ triggered by the environment.” According to the “thrifty phenotype” hypothesis, which has grown out of this work, Barker has found an adaptation to famine. The body of a poorly nourished baby, imprinted with prenatal experience, is born “expecting” a state of food deprivation throughout life. The baby’s metabolism is geared to being small, hoarding calories, and avoiding excessive exercise. When, instead, the baby finds itself in a time of plenty, it compensates by growing fast but in such a way as to put a strain on its heart.

The famine hypothesis may have even more bizarre implications, as revealed by an “accidental experiment” conducted on a vast scale during the Second World War. It began in September 1944, at a time when Konrad Lorenz and Niko Tinbergen, who had formerly worked together, were both in captivity. Lorenz was in a Russian prisoner-of-war camp, having just been captured; Tinbergen was about to be released after two years in a German internment camp, where he had been held hostage under threat of execution in reprisal for the activities of the Dutch resistance. On 17 September 1944, British paratroopers occupied the Dutch city of Arnhem to capture a strategic bridge over the Rhine. Eight days later, the Germans forced them to surrender, having fought off the ground forces sent to their relief. The Allies then abandoned attempts to liberate Holland until after the winter.

The Dutch railroad workers had called a strike to try to prevent German reinforcements from reaching Arnhem. In retaliation, Reichskommissar Arthur Seyss-Inquart ordered an embargo on all civilian transport in the country. The result was a devastating famine, which lasted for seven months: it was called the hunger winter. More than 10,000 people starved to death. What later caught the attention of medical researchers was the effect that this abrupt famine had on unborn babies. Some 40,000 people were fetuses during the famine, and their birthweight and later health are on record. In the 1960s a team from Columbia University studied the data. They found all the expected effects of malnourished mothers: malformed babies, high infant mortality, and high rates of stillbirth. But they also found that only those babies who were in their last trimester of gestation during the famine were of low birth weight. These babies grew up normal, but they later suffered from diabetes, probably brought on by the mismatch between their thrifty phenotype and the abundant rich food of the postwar world.

Babies who were in the first six months of gestation during the famine were of normal birth weight, but when they reached adulthood they themselves gave birth to unusually small babies. This second-generation effect is hard to explain with the thrifty-phenotype hypothesis, though Pat Bateson notes that locusts take several generations to switch from a shy, solitary form with a specialized diet to the swarming, gregarious form with a generalized diet and back again. If it takes several generations for humans to switch between thrifty and affluent phenotypes, this may explain why the death rate from heart disease is nearly four times as high in Finland as in France. The government of France began supplementing the rations of pregnant mothers after the Franco-Prussian war of the 1870s. The people of Finland lived in comparative poverty until 50 years ago. Perhaps it is the first two generations to experience abundance who suffer from heart disease. Perhaps that is why the United States is now seeing rapidly falling death rates from heart disease, but in Britain, well fed for a shorter time, the rates remain high.

THE LONG FINGER OF LIFE

A prenatal event may have far-reaching effects that are all but impossible to counteract in later life. Even subtle differences between healthy individuals can be traced to prenatal imprinting. Finger length is a case in point. In most men the ring finger is longer than the index finger. In women the two fingers are usually the same length. John Manning realized that this was an indication of the level of testosterone to which people had been exposed in the womb: the more testosterone, the longer the ring finger. There is a good biological reason for the link. The Hox genes that control the growth of the genitalia also control the growth of digits, and a subtle difference in the timing of events in the womb probably leads to subtly different finger lengths.

Manning’s measurements of the ring finger give a crude measure of testosterone exposure before birth. What does that imply? Forget palmistry; this is a real prediction. Men with unusually long ring fingers (indicating high testosterone) are at greater risk of autism, dyslexia, stammering, and immune dysfunction; they also father relatively more sons. Men with unusually short ring fingers are at higher risk of heart disease and infertility. And because male musculature is also partly built by testosterone, Manning was prepared to predict, rather rashly, on television that among a group of athletes about to run a race, the one with the longest ring finger would win—a prediction that promptly came true.

The length of the ring finger and indeed the fingerprint on it are imprinted in the womb. They are products of nurture—for surely the womb is the very embodiment of the word “nurture.” But that does not make these traits malleable. The comforting belief that nurture is more malleable than nature relies partly on the mistaken notion that nurture is what happens after birth and nature is what happens before birth.

Perhaps you can now glimpse an explanation of the paradox met in an earlier chapter: that behavior genetics reveals a role for genes and a role for unshared environmental influences, but hardly any role for shared environmental influences. The prenatal environment is not shared with siblings (except twins); the experience of gestation is unique to each baby; the insults suffered therein, such as malnutrition or influenza or testosterone, depend on what is happening to the mother at that time, not on what is happening within the whole family. The more prenatal nurture matters, the less postnatal nurture can matter.

SEX AND THE WOMB

There is something rather Freudian about this imprinting. Freud believed that the human mind carries the marks of its early experience, and that many of these marks lie buried in the subconscious, but they are still there. Rediscovering them is one of the purposes of psychoanalysis. Freud went on to suggest that by this process of rediscovery, people could cure themselves of various neuroses. A century later there is an unambiguous verdict on this proposal: good diagnosis, terrible therapy. Psychoanalysis is notoriously bad at changing people. That is what makes it so profitable—“See you next week.” But it is right in its premise that there are such things as “formative experiences,” that they come very early, and that they are still powerfully present in the adult subconscious. By the same token, if they are still there, and still influential, then they must be hard to reverse. Formative experiences must be unchangeable, if they persist.

Freud may not have been the first person to consider infantile sexual desires, but he was certainly the most influential. In this he was being contrarian. To the detached observer nothing could be more obvious than that sex starts at adolescence. Until the age of about 12, human beings are indifferent to nudity, bored by romance, and mildly incredulous about the facts of life. By 20, they are fascinated by sex to an obsessive degree. Something has surely changed. But Freud was convinced that there was something sexual occurring in the mind of the child, even the baby, long before that.

Back to goslings. Lorenz noticed that imprinted goslings (and other birds) not only treated him as a parent but later became sexually fixated on him as well. They would ignore members of their own species and court human beings. (My sister and I found the same thing when as children we reared a collared dove from hatchling to adult: it fell fanatically in love with my sister’s fingers and toes, probably because it had been fed with fingers from the moment it opened its eyes. It treated my fingers and toes like sexual rivals.) This was rather intriguing because it implied that, at least in birds, the object of a sexual attraction could be fixed from soon after birth and yet simultaneously could consist of almost any living thing. A whole series of experiments both in captivity and in the wild has since shown that in many kinds of bird a male chick reared by a foster mother of a different species does indeed sexually imprint on that other species, and that there exists a critical period during which it picks up this sexual preference.

Might the same be disturbingly true of people? The reassuring answer that most people gave themselves in the twentieth century was that people did not have instincts, so this question need not arise. But see what a fine mess this leads you into! If instinct is something so flexible that a goose can become infatuated with a man, then do human beings have a less flexible instinct? Or do they laboriously have to learn what to love? Either way, the human boast that our lack of instinct is what makes us flexible begins to sound a bit hollow.

In any case, it has long been clear from the experiences of homosexual people that human sexual preferences are not only difficult to change but also fixed from a very early age. Nobody in science now believes that sexual orientation is caused by events in adolescence. Adolescence merely develops a negative that was exposed much earlier. To understand why most men are attracted to women while some men are attracted to men you must go much further back into childhood, perhaps even into the womb.

The 1990s saw a series of studies that revived the idea of homosexuality as a “biological” rather than a psychological condition, as a destiny rather than a choice. There were studies showing that future homosexuals had different personalities in childhood, studies showing that homosexual men had differences in brain anatomy from heterosexual men, several twin studies showing that homosexuality was highly heritable in western society, and anecdotal reports from homosexual men to the effect that they had felt “different” early in life. On its own none of these studies was overwhelming. But together, and set against decades of proof that aversion therapy, “treatment,” and prejudice entirely failed to “cure” people of gay instincts, the studies were emphatically clear. Homosexuality is an early, probably prenatal, and irreversible preference. Adolescence simply throws fuel on the fire.

What exactly is homosexuality? It is plainly a whole range of behavioral characteristics. In some ways gay men seem to be more like women: they are attracted to men, they may pay more attention to clothes, they are often more interested in people than in, say, football. In other ways, however, they are more like heterosexual men: they buy pornography and seek casual sex, for example. (Playgirl’s nude centerfolds of men turned out to appeal mainly to gays, not the intended women.)

People, like all mammals, are naturally female unless masculinized. Female is the “default sex” (it is the other way around in birds). A single gene, called SRY, on the Y chromosome starts a cascade of events in the developing fetus leading to the development of masculine appearance and behavior. If that gene is absent, a female body results. It is therefore reasonable to hypothesize that homosexuality in men results from the partial failure of this prenatal masculinization process in the brain, though not in the body.

By far the most reliable discovery about the causes of homosexuality in recent years is Ray Blanchard’s theory of fraternal birth order. In the mid-1990s Blanchard compared the number of elder brothers and sisters of gay men with the population average. He found that gay men are more likely to have elder brothers (but not elder sisters) than either gay women or heterosexual men. He has since confirmed this in 14 different samples from many different places. For each extra older brother, a man’s probability of being gay rises by one-third. (This does not mean that men with many elder brothers are bound to be gay: an increase from, say, 3 percent of the population to 4 percent is an increase of one-third.)
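Blanchard’s parenthetical caveat is easier to see with numbers. A minimal sketch, taking the illustrative 3 percent baseline from the text and compounding the one-third rise with each elder brother; both figures are for illustration, and multiplying probabilities this way is a fair approximation only while they stay small.

    # Compound the one-third rise in probability per elder brother,
    # starting from the text's illustrative 3 percent baseline.

    def p_gay(elder_brothers: int, baseline: float = 0.03) -> float:
        """Illustrative probability of being gay, given elder brothers."""
        return baseline * (1 + 1 / 3) ** elder_brothers

    for n in range(5):
        print(f"{n} elder brothers: {p_gay(n):.1%}")
    # 0 elder brothers: 3.0%
    # 1 elder brothers: 4.0%
    # 2 elder brothers: 5.3%
    # 3 elder brothers: 7.1%
    # 4 elder brothers: 9.5%

Even four elder brothers leave the illustrative probability below 10 percent, which is precisely the point of the caveat.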

Blanchard calculates that at least one gay man in seven, probably more, can attribute his sexual orientation to this effect of fraternal birth order. It is not simply birth order, because having elder sisters has no such effect. Something about elder brothers must actually be causing homosexuality in men. Blanchard believes the mechanism is in the womb rather than the family. One clue lies in the birth weight of baby boys who will later become homosexual. Normally, a second baby is heavier than a first baby of the same sex. Boys especially are heavier if they are born after one or more sisters. But boys born after one brother are only slightly heavier than firstborn boys, and boys born after two or more brothers are usually smaller than first- and second-born boys at birth. By analyzing questionnaires given to gay and straight men and their parents, Blanchard was able to show that younger brothers who went on to become homosexual were 170 grams lighter at birth than younger brothers who went on to become heterosexual. He confirmed the same result—high birth order, low birth weight compared with controls—in a sample of 250 boys (with an average age of seven) who were showing sufficient “cross-gender” wishes to have been referred to psychiatrists; cross-gender behavior in childhood is known to predict later homosexuality.

Like Barker, Blanchard believes that conditions in the womb are marking the baby for life. In this case, he argues, something about occupying a womb that has already held other boys occasionally results in reduced birth weight, a larger placenta (presumably in compensation for the difficulty the baby experiences in growing), and a greater probability of homosexuality. That something, he suspects, is a maternal immune reaction. The immune reaction of the mother, primed by the first male fetuses, grows stronger with each male pregnancy. If it is mild, it causes only a slight reduction in birth weight; if strong, it causes a marked reduction in birth weight and an increased probability of homosexuality.

What could the mother be reacting to? There are several genes expressed only in males, and some are already known to raise an immune reaction in mothers. Some are expressed prenatally in the brain. One intriguing new possibility is a gene called PCDH22, which is on the Y chromosome, is therefore specific to males, and is probably involved in building the brain. It is the recipe for a protocadherin. Could this be the gene that wires the bit of the brain that is peculiar to males? A maternal immune reaction may be sufficient to prevent the wiring of the part of the brain that would eventually encourage a fascination with female bodies.

Clearly not all homosexuality is caused this way. Some of it may be caused directly by genes in the homosexual person without the mediation of the mother’s immune reaction. Blanchard’s theory may explain why it has proved so hard to pin down the “gay gene.” The main method for finding such a gene is to compare markers on the chromosomes of homosexual men with those of their heterosexual brothers. But if many gay men have straight elder brothers, this method would work poorly. Besides, the key genetic difference might be on the mother’s chromosomes, where it causes the immune reaction. This might explain why homosexuality looks as though it is inherited through the female line: genes for a stronger maternal immune reaction could appear to be “gay genes,” even though they may not be expressed in the gay man himself but only in the mother.

But notice what this does to nature versus nurture. If nurture, in this case birth order, causes some homosexuality, it does so by causing an immune reaction, which is a process directly mediated by genes. So is this environmental or genetic? It hardly matters, because the absurd distinction between reversible nurture and inevitable nature has now been well and truly buried. Nurture in this case looks just as irreversible as nature, perhaps more so.

Politically, the confusion is even greater. Most homosexuals welcomed the news in the mid-1990s that their sexual orientation looked “biological.” They wanted it to be a destiny, not a choice, because that would undermine the argument of homophobes that it was a choice and therefore morally questionable. How could it be wrong if it was innate? Their reaction is understandable but dangerous. A greater tendency to violence is also innate in the human male. That does not make it right. Reasoning that “ought” can be derived from “is” is called the naturalistic fallacy. To base any moral position on a natural fact, whether that fact is derived from nature or from nurture, is asking for trouble. In my morality, and I hope in yours, some things are bad but natural, like dishonesty and violence; others are good but less natural, like generosity and fidelity.

THROWING SWITCHES IN THE BRAIN

It is easy to infer the existence of critical periods during which the wet cement of character can be set. It is less easy to conceive of how they work. What can possibly occur inside a brain to imprint a gosling on to a professor soon after hatching? Even to ask such a question reveals me to be a reductionist, and reductionists are bad. We are supposed to glory in the holistic experience and not try to take it apart. To which I could reply that there is often more beauty, poetry, and mystery in the circuit design of a microchip or the workings of a well-made vacuum cleaner than there is in a roomful of conceptual art, but I would not want to be called a philistine, so I will merely claim that reductionism takes nothing from the whole; it adds new layers of wonder to the experience. That applies whether the designer of the parts was a human being or the GOD.

How does a gosling’s brain imprint on a professor? Until very recently this was a complete mystery. Within the past few years, though, the veils of mystery have begun to lift, revealing new veils beneath. The first veil concerns which part of the brain is involved. When a chick imprints on its parents, experiments reveal that memories are laid down first and most rapidly in a part of the brain called the left intermediate and medial hyperstriatum ventrale (IMHV). In this part of the brain, and only on the left side, many changes accompany imprinting: neurons change shape, synapses form, and genes are switched on. If the left IMHV is damaged, the chick fails to imprint on its mother.

The second veil to lift reveals which chemical is necessary for “filial” imprinting of this kind. By examining the brains of chicks after they had or had not imprinted on an object, Brian McCabe found that a neurotransmitter called GABA is released from brain cells in the left IMHV during imprinting. He had previously noticed that a gene for a GABA receptor is switched off about 10 hours after the chick has been trained to imprint on an object.

So something happens in one part of the left side of the chick’s brain during imprinting, first to release GABA and then to reduce sensitivity to GABA at the end of the critical period. To take the story further, it is time to leave baby birds for a different kind of critical period, one that is a little easier to study: the development of binocular vision. Babies are occasionally born with cataracts in both eyes that render them blind. Until the 1930s, surgeons thought it wise to wait until a child reached the age of 10 before operating to remove such cataracts, because of the risks of surgery on small children. But it became apparent that such children never managed to perceive depth or shape properly even after the removal of the cataracts. It was simply too late for the visual system to learn how to see. Likewise, monkeys reared in darkness for the first six months of their lives took months to learn to distinguish circles from squares, something normal monkeys could learn in days. Without visual experience in the first months of life, the brain cannot interpret what the eye sees. A critical period has passed.

There is one layer of primary visual cortex, called layer 4C, that receives inputs from both eyes and separates them into streams from each eye. To begin with, the inputs are randomly distributed, but before birth they become roughly sorted into stripes, each stripe responding mainly to one eye. During the first few months after birth, this segregation becomes increasingly marked, so that all the cells responding to the right eye become clustered into right-eye stripes while all those responding to the left eye become clustered into left-eye stripes. These stripes are called ocular dominance columns. Amazingly, the columns do not segregate in the brains of animals deprived of sight during the early months of life.

David Hubel and Torsten Wiesel discovered how to stain these columns different colors by injecting dyed amino acids into one eye. They were then able to see what happens when one eye is sewn shut. In an adult animal, this has virtually no effect on the stripes. But if one eye is sewn shut for as little as a week during the first six months of a monkey’s life, then the stripes from the deprived eye almost disappear and that eye becomes effectively blind, because it has nowhere in the brain to which to report. The effect is irreversible. It is as if the neurons from the two eyes compete for space in layer 4C and those that are active win the battle.

These experiments in the 1960s were the first demonstrations of “plasticity” in the development of the brain during a critical period after birth. That is to say, the brain is open to calibration by experience in the early weeks of life, after which it sets. Only by experiencing the world through its eyes can an animal sort the input into separate stripes. Experience seems actually to switch on certain genes, which in turn switch on others.

By the late 1990s, a number of people were searching for the molecular key to this critical period of plasticity in vision. Their method of choice was genetic engineering: the creation of mice with extra genes or missing genes. Mice, like cats and monkeys, have a critical period during which the inputs from the two eyes compete for space in the brain, though they do not sort into neat columns. In Boston, in the laboratory of Susumu Tonegawa, Josh Huang thought he had an idea of what they were competing for: brain-derived neurotrophic factor, or BDNF, the product of a gene, one version of which also seems to predict neurotic personalities. BDNF is a sort of brain food: it encourages the growth of neurons. Perhaps, Huang reasoned, the cells carrying the most signals from the eye got more BDNF than the silent cells, so the input from the open eye displaced the input from the closed eye. In a world where there was not enough BDNF to go around, it was survival of the hungriest neuron.
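Huang’s hypothesis is, at bottom, a competition algorithm, and a toy simulation makes its logic concrete. The sketch below is not a model of any real circuit; every number in it is invented for illustration.

    # Toy model of "survival of the hungriest neuron": inputs from the two
    # eyes compete for a fixed supply of BDNF, shared out in proportion to
    # how actively each input is firing. All numbers are invented.

    def compete(act_open: float, act_closed: float,
                steps: int = 50, supply: float = 1.0, decay: float = 0.4):
        w_open, w_closed = 0.5, 0.5  # initial synaptic strengths
        for _ in range(steps):
            demand_open = act_open * w_open
            demand_closed = act_closed * w_closed
            total = demand_open + demand_closed
            # each input is fed BDNF in proportion to its share of demand,
            # and every synapse decays a little unless it is fed
            w_open = max(0.0, w_open + supply * demand_open / total - decay * w_open)
            w_closed = max(0.0, w_closed + supply * demand_closed / total - decay * w_closed)
        return round(w_open, 2), round(w_closed, 2)

    print(compete(1.0, 1.0))  # both eyes open: (1.25, 1.25) -- the inputs share
    print(compete(1.0, 0.1))  # one eye shut:   (2.5, 0.0)  -- the idle input withers

The starved input does not merely grow more slowly; it dwindles to nothing, which is just what Hubel and Wiesel saw in the stripes of layer 4C.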

Huang did the obvious experiment: he made a mouse that produced extra BDNF from its genes, expecting that this BDNF would now provide ample food for all neurons, enabling the input from both eyes to survive. He was surprised to see a different and dramatic effect. The mice with extra BDNF went through the critical period faster. Their brains set two weeks after eye opening instead of three. This was the first demonstration that a critical period could be adjusted artificially.

A year later, in 2000, came another breakthrough, in the laboratory of a Japanese scientist, Takao Hensch. Hensch discovered that a mouse lacking a gene called GAD65 failed to sort its eye inputs in response to visual stimuli. But these same mice did sort their inputs if injected with the drug diazepam. Indeed, diazepam, like BDNF, seemed to bring on a precocious imprinting. In normal mice, injecting diazepam after the critical period could not restore plasticity to the brain. But in the mice lacking GAD65, the scientists could bring on plasticity with diazepam at any time, even during adulthood. But only once. After the reorganization caused by diazepam, the system entirely lost its sensitivity. It is as if there is a dormant program for rewiring the brain, which can be triggered once—but only once.

Back in Boston Huang had surprised himself again. Together with Lamberto Maffei in Pisa, he had simply reared his transgenic mice—the ones with the extra BDNF—in the dark. Normal mice raised in the dark for three weeks after their eyes open are effectively blind for life; they need the experience of light so that their visual system can mature. To put it bluntly, their brains need nurture as well as nature. But remarkably, the extra-BDNF mice reared in the dark responded normally to visual stimuli, suggesting that they could see well despite having had no exposure to light during the critical period. Huang and Maffei had stumbled on an extraordinary fact: a gene that could substitute for aspects of experience. One of the roles of experience is apparently not to fine-tune the brain but merely to switch on the BDNF gene, which in turn fine-tunes the brain. If you shut the eye of a mouse, BDNF production in its visual cortex drops within half an hour.

Despite this result, Huang does not really believe that experience is dispensable. He notes that the system seems to be designed to delay maturation of the brain until experience is available. What do BDNF, GAD65, and diazepam—the three things that can affect critical periods—have in common? The answer is the neurotransmitter GABA: GAD65 makes it, diazepam mimics it, and BDNF regulates it. Since GABA was implicated in filial imprinting in the chick, it looks plausible that the GABA system will prove to be central to critical periods of all kinds. GABA is a sort of neuronal spoilsport: it inhibits the firing of neighboring neurons. Feeling unloved, the inhibited neurons die. Because the maturation of the GABA system is itself dependent on visual experience and is driven by BDNF, the link between them has the ring of truth.

Though the story is still far from complete, GABA is a beautiful example of how it is now possible as never before to begin to understand the molecular mechanisms behind such things as imprinting. It shows just how unfair is the charge that reductionism takes the poetry out of life. Who would have conceived of a mechanism so exquisitely designed if they had refused to look under the lid of the brain? Only by equipping the brain with BDNF and GAD65 genes can the GOD make a brain capable of absorbing the experience of seeing. These are, if you like, the genes for nurture.

YOUNG TONGUES

Critical-period imprinting is everywhere. There are a thousand ways in which human beings are malleable in their youth, but fixed once adult. Just as a gosling is imprinted with an image of its mother during the hours after birth, so a child is imprinted with everything from the number of sweat glands on its body and a preference for certain foods to an appreciation of the rituals and patterns of its own culture. Neither the gosling’s mother-image nor the child’s culture is in any sense innate. But the ability to absorb each is.

An obvious example is accent. People change their accents easily during youth, generally adopting the accent of people of their own age in the surrounding society. But sometime between about 15 and 25, this flexibility simply vanishes. From then on, even if a person emigrates to a different country and lives there for many years, his or her accent will change very little. People may pick up a few inflections and habits from their new linguistic surroundings, but not many. This is true of regional as well as national accents: adults retain the accent of their youth; youngsters adopt the accent of the surrounding society. Take Henry Kissinger and his younger brother Walter. Henry was born on 27 May 1923; Walter was born just over a year later, on 21 June 1924. They both moved to the United States as refugees from Germany in 1938. Today Walter sounds like an American, whereas Henry has a characteristic European accent. A reporter once asked Walter why Henry had a German accent but he did not. Walter’s facetious reply was, “Because Henry doesn’t listen,” but it seems more likely that when they arrived in America Henry was just old enough to be losing the flexibility of imprinting his accent on his surroundings; he was leaving the critical period.

In 1967 a psychologist at Harvard, Eric Lenneberg, published a book in which he argued that the ability to learn language is itself subject to a critical period that ends abruptly at puberty. Evidence for Lenneberg’s theory now abounds, not least in the phenomenon of creole and pidgin languages. Pidgin languages are used by adults of several different linguistic backgrounds to communicate with each other. These languages lack consistent or sophisticated grammar. But once they have been learned by a generation of children still in the critical period, they change into creoles—new languages with full grammar. In one case in Nicaragua, deaf children sent to new schools for the deaf together for the first time in 1979 simply invented a new sign-language creole of remarkable sophistication.

The most direct test of the critical period in learning language would be to deprive a child of all language until the age of 13 and then try to teach the poor creature to speak. Deliberate experiments of this kind are thankfully rare, though at least three monarchs—King Psamtik of Egypt in the seventh century B.C., the Holy Roman Emperor Frederick II in the thirteenth century, and King James IV of Scotland in the fifteenth century—are said to have tried depriving newborn children of all human contact except a silent foster mother to see whether they grew up speaking Hebrew, Arabic, Latin, or Greek. In Frederick’s case, the children all died. The Moghul emperor Akbar is said to have done the same experiment to find out whether people were innately Hindu, Muslim, or Christian. All he got was deaf-mutes. Genetic determinists were made of stern stuff in those days.

By the nineteenth century, attention had shifted to natural deprivation experiments in the form of “feral children.” Two seem to have been genuine. The first was Victor, the wild boy of Aveyron, who appeared in 1800 in the Languedoc, having apparently lived wild for many of his 12 years. Despite years of effort, his teacher failed to teach him to speak and “abandoned my pupil to incurable dumbness.” The second was Kaspar Hauser, a young man discovered in Nuremberg in 1828 who had apparently been kept in a single room with almost no human contact for all of his 16 years. Even after years of careful coaching, Kaspar’s syntax was still “in a state of miserable confusion.”

These two cases are suggestive but hardly constitute proof. Then suddenly, four years after Lenneberg’s book, there was a third case of a wild child found after puberty: a 13-year-old girl named Genie was discovered in Los Angeles after a childhood of almost inconceivable horror. The daughter of a blind, abused mother and a paranoid and increasingly reclusive father, she had been kept in silence in a single room, mostly either harnessed to a potty chair or confined in a caged crib. She was incontinent, deformed, and almost completely mute: her vocabulary consisted of two words: “stopit” and “nomore.”

The story of Genie’s rehabilitation is almost as tragic as that of her childhood. As she was passed between scientists, foster parents, state officials, and her mother (the father committed suicide after her discovery), the initial optimism of those who set out to care for her was gradually lost in lawsuits and bitterness. Today Genie is in a home for retarded adults. She learned much, her intelligence was high, her nonverbal communication was extraordinary, and her ability to solve spatial puzzles was ahead of her age.

But she never learned to speak. She developed a good vocabulary, but elementary grammar was beyond her, and syntax or word order was a foreign land. She could not grasp how to phrase a question by altering word order or how to change “you” to “I” in an answer. (Kaspar Hauser had the same problem.) Though the psychologists who studied her at first believed she would disprove Lenneberg’s critical-period theory, they eventually admitted that she was a confirmation of it. Untrained by conversation, the brain’s language module had simply not developed, and it was now too late.

Victor, Kaspar, and Genie (and there have been other cases, including a woman not diagnosed as deaf until she was 30) suggest that language does not just develop according to a genetic program. Nor is it just absorbed from the outside world. Instead, it is imprinted. It is a temporary innate ability to learn by experience from the environment, a natural instinct for acquiring nurture. Polarize that into either nature or nurture, if you can.

Though language was the most severe of Genie’s problems in adjusting to the world, it was not the only one. After her release she became an obsessive collector of colored plastic objects. She was also for many years terrified of dogs. Both of these characteristics could be tentatively traced to “formative experiences” in her childhood. Just about the only toys she had were two plastic raincoats. As for dogs, her father would bark and growl outside her door to frighten her if she made a noise. How many of a person’s own preferences, fears, and habits are imprinted during youth? Most of us can recall in astonishing detail the places and people of our early years, whereas we forget much more recent adult experiences. Memory is plainly not all a matter of a critical period—it does not switch off at a certain age. But there is an element of truth in the old notion that the child is father to the man. Freud was right to emphasize the importance of formative years, even if he sometimes generalized too freely about them.

FAMILIARITY BREEDS INDIFFERENCE

One of the more controversial theories of human imprinting concerns incest. The critical period in the development of sexual orientation plainly leaves a young person committed to being attracted to members of the opposite sex (except when it makes them committed to being attracted to members of the same sex). Probably it also determines one’s “type” of partner in some much more specific way. But does it also determine whom one will be positively averse to wooing?

The law forbids marriage between brothers and sisters, and for good reason. Inbreeding causes horrific genetic diseases by bringing together rare recessive genes. But suppose some country were to repeal its law and proclaim that from now on brother–sister marriages would be considered not only legal but rather a good thing. What would happen? Nothing. Despite being the best of friends and highly compatible, most women are simply not sexually attracted to their brothers. In 1891, a Finnish pioneer of sociology, Edward Westermarck, published a book—The History of Human Marriage—in which he suggested that human beings avoid incest by instinct rather than by obedience to the law. They are naturally averse to sex with close kin. Cleverly, he saw that this did not require people to have an innate ability to recognize real brothers and sisters. Instead, there was a rough-and-ready way of knowing: those people whom one has known well as children are probably close kin. He predicted that people who have a shared childhood will be instinctively averse to sleeping with one another as adults.
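The genetics behind that “good reason” is simple enough to work out. A minimal sketch using Wright’s inbreeding coefficient; the allele frequency is an invented illustrative value, not a figure from the text.

    # Why sibling matings are genetically dangerous: for a rare recessive
    # allele at frequency q, an affected (homozygous) child occurs with
    # probability q**2 under random mating, but roughly q**2 + F*q*(1-q)
    # under inbreeding, where F is Wright's inbreeding coefficient
    # (F = 1/4 for the children of full siblings).

    def affected_risk(q: float, F: float = 0.0) -> float:
        return q * q + F * q * (1 - q)

    q = 0.005  # an invented rare recessive disease allele
    random_risk = affected_risk(q)  # about 1 in 40,000
    sibling_risk = affected_risk(q, F=0.25)  # about 1 in 800
    print(f"{sibling_risk / random_risk:.0f}x riskier")  # roughly 50x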

Within 20 years Westermarck’s idea was all but forgotten. Freud criticized his theory and suggested instead that human beings were attracted to incest and were prevented from practicing it only by cultural prohibitions in the form of taboos. Oedipus without incestuous desire is like Hamlet without madness. But if people are averse to incest, they cannot have incestuous desires. And if they need taboos, they must have desires. Westermarck protested in vain that social learning theories “imply that the home is kept free from incestuous intercourse by law, custom, or education. But even if social prohibitions might prevent unions between the nearest relatives, they could not prevent the desire for such unions. The sexual instinct is hardly changed by proscriptions.”

Westermarck died in 1939, as Freud’s star was still rising and “biological” explanations were falling out of fashion. It took another 40 years before somebody looked again at the facts. That somebody was a sinologist, Arthur Wolf, who analyzed the meticulous demographic records kept by the occupying Japanese in nineteenth-century Taiwan. Wolf noticed that the Taiwanese had practiced two forms of arranged marriage. In one, the bride and groom met on their wedding day, though the match was arranged many years before. In the other, the bride was adopted by the groom’s family as an infant and reared by her future in-laws. Wolf realized that this was a perfect test of Westermarck’s hypothesis, for these “sim-puahs,” or “little daughters-in-law,” would in effect be marrying men whom their instincts had registered as brothers. If, as Westermarck argued, shared childhood led to sexual aversion, then these marriages should not work very well.

Wolf collected information on 14,000 Chinese women and compared those who had been sim-puahs with those who met their arranged husbands only on their wedding day. Astonishingly, marriage to a childhood associate was 2.65 times as likely to end in divorce as an arranged marriage to an unfamiliar partner—people who had known each other all their lives were much less likely to stay married than people who had never met. The sim-puah marriages also produced fewer children and involved more adultery. Wolf ruled out other obvious explanations—that the process of adoption led to ill health and infertility, for example. Far from bringing spouses together, the habit of co-rearing them seemed to inhibit the later development of sexual attraction. But this was true only of sim-puahs adopted at the age of three or younger; those adopted at four or older had just as successful marriages as those who met as adults.

Since then many studies have confirmed these findings. Israelis reared communally in a kibbutz rarely marry each other. Moroccans who have slept in the same room as children are averse to accepting an arranged marriage. The aversion seems to be stronger among women than men. Even in fiction, the aversion reverberates: Victor Frankenstein, in Mary Shelley’s novel, finds himself expected to marry a cousin reared with him since childhood—but (symbolically) his monster intervenes to kill his prospective bride before the marriage is consummated.

It is true that incest taboos exist, but on closer inspection they are little concerned with marriage between close kin. They are about regulating marriages between cousins. It is true, also, that people seem to be fascinated by incest, and that it plays a large part in medieval fiction, Victorian scandal, and modern urban legends. But then things—such as snakes—that horrify people also often fascinate them. It also seems to be true that siblings separated at birth who later find each other as adults are often strongly attracted to each other, but this, if anything, supports the Westermarck effect.

The Westermarck effect is plainly not universal. Exceptions do exist both at the cultural and at the individual level. Many sim-puah brides were able to overcome their sexual aversion and have successful marriages: the system had set their incest-avoidance instinct against an even stronger instinct for procreation. Also, there is some evidence that “fooling around” between brothers and sisters who were reared together does occur, whereas those who are separated for more than a year during early childhood are much more likely to indulge in actual intercourse. In other words, childhood association may produce an aversion not so much to attraction as to actual intercourse.

Nonetheless, aversion to incest between those reared in the same family, like language, seems to be a clear case of a habit imprinted on the mind during a critical period of youth. In one sense it is pure nurture—the mind has no preconceptions about whom it will become averse to, so long as they are childhood companions. And yet it is nature in the sense of an inevitable development set in train presumably by some genetic program at a particular age. You need nature to be able to absorb nurture.

Just like Lorenz’s goslings, we are imprinted—but in our case we are imprinted with an aversion rather than an attachment. However, here’s an odd thing: Konrad Lorenz married his childhood friend Gretl, the girl with whom he had imprinted his first ducklings at the age of six. She was the daughter of a market gardener in the next village. Why were they not averse to each other? Perhaps a clue lies in the fact that she was three years older than he. This means that she was probably already out of the critical period for the Westermarck effect by the time they came to know each other. Or perhaps Konrad Lorenz was just an exception to his own rule. Biology, somebody once said, is the science of exceptions, not rules.

NAZITOPIA

Lorenz’s notion of imprinting was a great insight that has stood the test of time. It is a crucial part of the jigsaw I call nature via nurture, and an exquisite marriage of the two. The invention of imprinting as a way of ensuring the flexible calibration of instinct was a masterstroke of natural selection. Without it, either we would all be born with a fixed and inflexible language unchanged since the Stone Age, or we would have to learn each grammatical construction laboriously from scratch. But one of Lorenz’s other ideas will not be judged so kindly by history. Though the story has little to do with imprinting, it is worthwhile to recount how Lorenz, like so many others in the twentieth century, fell into a trap by flirting with a sort of utopia.

In 1937 Lorenz was unemployed. His studies of animal instinct were prohibited in the Catholic-dominated university of Vienna on theological grounds, and he had retired to Altenberg to continue his work with birds at his own expense. He applied for a grant to work in Germany. Commenting on the application, a Nazi official wrote: “All reviews from Austria agree that the political attitude of Dr. Lorenz is impeccable in every respect. He is not politically active, but in Austria he never made a secret of the fact that he approved of National Socialism…. Everything is also in order with his Aryan descent.” In June 1938, shortly after the Anschluss, Lorenz joined the Nazi Party and became a member of its Office of Race Policy. He immediately began speaking and writing about how his work on animal behavior could fit in with Nazi ideology; in 1940 he was appointed a professor at the University of Königsberg. Over the next few years, until his capture on the Russian front in 1944, he argued consistently in favor of the utopian ideals of “a scientifically underpinned race policy,” “the racial improvement of Volk and race,” and the “elimination of the ethically inferior.”

After four years in a Russian prisoner-of-war camp, Lorenz returned to Austria well after the end of the war. He managed to gloss over his Nazism as gullibility and stupidity, saying that he had not been politically active. It was more that he had tried to bend his science to suit the new political powers than that he genuinely believed in Nazism, he said. While he lived, this was accepted. But after he died it gradually emerged how deeply he had imbibed Nazism. In 1942, while serving as a military psychologist in Poland, Lorenz took part in research led by the psychologist Rudolf Hippius and sponsored by the SS, the aim of which was to develop criteria for distinguishing “German” from “Polish” features of “half-breeds” in order to help the SS decide whom to choose for their “re-Germanization” effort. There is no evidence that Lorenz was involved in war crimes himself, but he probably knew that they were being committed.

Central to his argument, during this Nazi period, was the issue of domestication. Lorenz had developed a rather quaint contempt for domesticated animals, which he regarded as greedy, stupid, and oversexed compared with their wild relatives. “Great ugly beast,” he once cried while rejecting the sexual advances of an imprinted muscovy duck. Pejoratives aside, he had a point. Almost by definition, selective breeding for domesticity produces animals that fatten well, breed well, and are docile and dull. Cows and pigs have brains that are one-third smaller than those of their wild relatives. Female dogs are fertile twice as often as wolves. And pigs notoriously can gain far more weight than wild boars.

Lorenz began to apply these notions to humanity. In a notorious paper, “Disorders Caused by the Domestication of Species-Specific Behavior” (1940), he argued that human beings are self-domesticated and that this has led them into physical, moral, and genetic deterioration. “Our species-specific sensitivity to the beauty and ugliness of members of our species is intimately connected with the symptoms of degeneration caused by domestication, which threatens our race…. The racial idea as the basis of our state has already accomplished much in this respect.” In effect, Lorenz’s argument about domestication opened a new front in eugenics, giving another reason to nationalize reproduction and eliminate both unfit individuals and unfit races. Lorenz seems not to have spotted a large flaw in his own argument: the muscovy duck is inbred after generations of selection to narrow its gene pool, whereas civilization has the opposite effect on people, relaxing selection and allowing more mutations to survive in the gene pool.

There is no evidence that this had any influence on Nazism, which already had plenty of reasons, some more “scientific” than others, for its policies of racism and genocide. Lorenz’s argument was ignored, perhaps even distrusted, by the party. What is more remarkable, perhaps, is that Lorenz’s argument survived the war, to be reiterated in less emotive terms in his book Civilized Man’s Eight Deadly Sins, first published in 1973. This book combined Lorenz’s earlier concerns about human degeneration caused by the relaxation of natural selection with newer and more fashionable concerns about the state of the environment. As well as genetic deterioration, the eight deadly sins were overpopulation, destruction of the environment, overcompetition, the seeking of instant gratification, indoctrination by behaviorist techniques, the generation gap, and nuclear annihilation.

Genocide was not on Lorenz’s list.
