CHAPTER THREE

BLIND CATS AND BABY EINSTEINS: THE BIOLOGY OF NURTURE

READING THIS BOOK WILL LITERALLY CHANGE YOUR brain! That may sound like marketing hype, but as we’ll see in this chapter, it’s a pretty safe bet. The fact is that almost any experience we have can “change” our brains. The connections between our neurons (synapses) are continually changing and adapting as we perceive and respond to the world around us.

In the last chapter, I told you how variations in our genes may shape our temperament, emotional responses, and personalities. But no self-respecting scientist believes that genes alone are destiny. There is no dichotomy between nature and nurture because they are two sides of a single coin. The effect of any gene depends on the environment it’s expressed in. It’s not even a sensible question to ask what part of our behavior is genetic and what part environmental. The ability to speak a language is a universal human ability that is made possible by our genetic endowment. But in the absence of exposure to people who speak, our capacity for language wouldn’t be expressed.

And early in life, certain experiences can be particularly powerful in shaping the development of our minds and behavior. How much of who we are depends on what happened to us in the first few years of life? The two major strands of psychology in the twentieth century—psychoanalytic theory and behaviorism—gave opposite answers to this question. Sigmund Freud, the father of psychoanalysis, claimed that our emotional lives and the way we approach relationships are forever shaped by what happens during our first five years. His contemporary John Watson, the father of behaviorism, claimed that we are blank slates on which experience can write and rewrite learned behavior throughout life.

Over the last twenty years, developmental neuroscience has provided a more nuanced view of how early experience affects us. As we will see in this chapter, it turns out that both Freud and Watson were on to something. Freud was right in claiming that early experience has a formative and enduring influence on our relationships and how we interact with the world around us. But fortunately, as Watson emphasized, we are lifelong learners. In fact, our brains are continually being remodeled as they encounter new information. And this ongoing plasticity is the root of the remarkable resilience of the human spirit. In this chapter, we’ll explore how experience interacts with our genes to shape the trajectory of our lives.

GETTING AHEAD

DESPITE THE MEDIA HYPE SURROUNDING GENE DISCOVERIES, THE notion that early experience can shape brain development is alive and well. In fact, it spawned a multimillion-dollar industry that began in the 1990s with the debut of “Baby Einstein” videos. The marketing of educational videos for infants and toddlers later expanded to evoke the whole pantheon of genius in Western civilization: there’s Baby Mozart, Baby Da Vinci, Baby Van Gogh, Baby Beethoven, Baby Shakespeare, and Baby Wordsworth. (Even those who want their infant to revere a great football team needn’t waste a moment since the advent of “Baby Bama,” a video that “uses officially licensed footage of Crimson Tide sports, mascot, marching band, and campus attractions to expose children to The University of Alabama in an exciting . . . and educational manner.”) As we’ll see, the growth of the market for baby brain enrichment products provides a fascinating example of how neuroscience can be hyped beyond the laboratory and co-opted to fuel commercial interests and even public policy agendas.

One of the seminal events in this story was the 1993 publication of a brief paper entitled “Music and Spatial Task Performance” in the prestigious scientific journal Nature.1 Frances Rauscher and her colleagues at the University of California–Irvine reported a study in which thirty-six college students underwent three procedures in which they listened to ten minutes of Mozart’s sonata for two pianos in D major (K. 448), a relaxation tape, or silence. Immediately after each procedure, the students were given three tests of spatial reasoning. Compared to the other two conditions, listening to Mozart was associated with a cognitive boost that corresponded to an 8- to 9-point increase in spatial reasoning IQ. The results quickly captured the media’s attention. In a front-page Los Angeles Times article entitled “Study Finds That Mozart Music Makes You Smarter,”2 Rauscher expressed unease about the potential for exploitation of the original findings: “You can never control what the marketers will do. It is a very scary thought.”

Indeed, although the cognitive boost lasted only ten to fifteen minutes, this didn’t deter others from seizing upon the results and dubbing them “the Mozart Effect.” While other scientists had difficulty replicating Rauscher and her colleagues’ results, the notion that classical music could enhance brain function seemed too appealing to ignore. Classical recordings for babies and toddlers began to be marketed to parents who were eager to give their kids an edge.

In 1996 Julie Aigner-Clark, the mother of a one-year-old girl, began creating homemade videos to entertain and educate her child. Within a year, she had begun selling the videos under the name “Baby Einstein,” and the product quickly took off. Sales topped $100,000 in the first year and reached $1 million by the second year. Just around this time, music critic and author Don Campbell trademarked the phrase “the Mozart Effect” and published a best-selling book with that title. That was followed by The Mozart Effect for Children, which didn’t claim that music could make your kid a genius, but “certainly it can increase the number of neuronal connections in her brain, thereby stimulating her verbal skills” (p. 4).3

The timing couldn’t have been better for the “Baby Einstein” franchise. As Aigner-Clark told CBS’s The Early Show in 2005,4 “All kinds of research was done that said ‘Listen to Mozart; Mozart is great for you.’ There are wonderful studies showing that listening to Mozart will stimulate your mind. And I had a video called Baby Mozart. So I was really lucky.” By 2004, three years after Disney purchased Baby Einstein, annual sales had reached $170 million, a success story that earned Aigner-Clark a special mention in the president’s 2007 State of the Union address.

Meanwhile, child advocates were energized by what they saw as a scientific consensus that early environmental enrichment was essential to wiring a healthy brain.5 And even politicians jumped on the bandwagon. Florida passed a law mandating that classical music be piped into all state-funded day care centers, and in 1998, Georgia governor Zell Miller allocated funding to ensure that every newborn would leave the hospital with a classical music CD. As he put it, “No one questions that listening to music at a very early age affects the spatial, temporal reasoning that underlies math and engineering and even chess. Having that infant listen to soothing music helps those trillions of brain connections to develop.”6

Unfortunately, the story was getting ahead of the science. Remember, the original study had shown a fleeting effect of a Mozart piano sonata on a limited domain of reasoning in a small group of undergraduates. A spate of studies that followed had decidedly mixed results. A combined analysis of sixteen studies found no significant evidence that listening to Mozart improved IQ, even in the limited realm of spatial reasoning.7 In fact, several studies suggested that any transient effect on cognitive function was probably due to the arousal and positive mood induced by listening to pleasurable material.8, 9 That might explain why similar increases in cognitive functions were reported for adults listening to the Greek composer Yanni,10 and ten- and eleven-year-olds listening to the rock band Blur (dubbed the “Blur effect”).8, 9 Kenneth Steele, one of the scientists who was unable to replicate the results of the original study, concluded that “The Mozart effect is pretty much on the wallet of the parents who are buying the CDs.”11 For her part, the original researcher, Frances Rauscher, claimed that the failures to replicate were due to methodological problems with the later studies. She wrote, “Because some people cannot get bread to rise does not negate the existence of a ‘yeast effect.’ ”12 Regardless of the scientific debate, the Mozart effect became fixed in the public consciousness.*

Fueled in part by this widespread belief that early stimulation is important for cognitive development, the baby video and DVD market exploded. Although the Baby Einstein company maintained that its products “are not designed to make babies smarter,” surveys suggested this is exactly the hope that motivated parents to buy children’s videos and DVDs (to the tune of nearly $5 billion in 2004).13 An analysis of top-selling DVDs for babies in 2005 found that more than 75 percent made educational claims. For instance, according to the packaging of the Brainy Baby Left Brain video, the video series was the first “that can help stimulate cognitive development.” Parents of children under two years say that the most important reason for having their babies watch TV, videos, and DVDs is that it is educational or good for their child’s brain.14 Although the American Academy of Pediatrics issued recommendations in 1999 and again in 2011 discouraging media use for children under age two,15, 16 children age six months to three years spend nearly two hours per day watching TV and other video media.13

Is there evidence that watching videos early on affects cognitive development? The answer is yes, but not necessarily in ways parents would hope for. In one study, greater media exposure at six months of age predicted lower language and cognitive development at fourteen months.17 In another influential study, Frederick Zimmerman and his colleagues surveyed more than one thousand parents of children who were between two months and two years of age.18 For infants eight to sixteen months of age, every hour spent viewing baby videos and DVDs correlated with a substantial decline in scores on a standard measure of language development. The more they watched, the fewer words they had learned. And it didn’t matter whether or not parents watched the videos with their infants.*

In contrast, reading or telling stories to the children was associated with an improvement in vocabulary learning. Though the study doesn’t prove that baby videos hurt infant language development, it’s clear that they don’t seem to help. At best, other studies have found no correlation between TV or video viewing in infancy and later cognitive skills.22, 23 In one study of a best-selling baby DVD designed to boost vocabulary, twelve- to eighteen-month-old children were randomly assigned to one of four conditions. The first group watched the DVD with a parent at least five times per week over four weeks; the second group watched the same amount, but on their own. The third group didn’t watch the DVD but their parents were given a list of the twenty-five words featured on the video and asked to “try to teach your child as many of these words as you can in whatever way seems natural to you.” And finally, the fourth group had no intervention. When the children were tested on how many of the twenty-five words they had learned over the four weeks, only the children who were taught by their parents without watching the video did better than chance. The other groups did no better than the control group who had no intervention.24

In 2009 the Walt Disney Company offered a refund to parents who purchased the Baby Einstein videos, but emphasized that this was an expression of their confidence in the product.25 Meanwhile, cofounders Julie Aigner-Clark and her husband went to court to defend the legacy of their product and to challenge the studies that suggested adverse effects of early video watching.26 As Aigner-Clark noted, baby videos are hardly the worst thing for a child: “Welcome to the twenty-first century. Most people have televisions in their houses, and most babies are exposed to it. And most people would agree that a child is better off listening to Beethoven while watching images of a puppet than seeing any reality show that I can think of.” And so the controversy continues.

NOW OR NEVER?

ANOTHER INTERESTING FINDING EMERGED FROM THE ZIMMERMAN study—baby videos were associated with language decrements in the eight- to sixteen-month-olds but not the seventeen- to twenty-four-month-olds. Timing mattered. So while this study doesn’t support the benefits of baby videos, it ironically supports the logic of why some parents buy these videos in the first place. The primary rationale people give for plunking their babies in front of a “brainy” video is the belief that there may be an early window for boosting brain power: Expose your infant to the right stimuli and you may affect the wiring of the brain in ways that last a lifetime. The problem is that before age two, children may not be able to learn from media, and time spent watching TV or videos is time away from playing or interacting with parents and siblings—activities that can promote cognitive growth.

No one doubts that development works on a schedule. If you don’t develop a right arm in the womb, you are destined to live a one-armed life. No amount of nurturing, good diet, or physical therapy will get you a second arm later in life. Does the same thing apply to the organs of the mind?

The idea of “windows of opportunity” for brain development has a long and controversial history, but, in some domains, they clearly exist. Scientists refer to these windows as sensitive periods or critical periods—when the brain is especially sensitive to some kind of input from the environment, and may need to get that input in order to develop normally. In the case of a critical period, it’s “now or never.” If the developmental event doesn’t happen in the critical time frame, it may be lost forever. Sensitive periods are less absolute—they represent a time of maximum sensitivity to an environmental stimulus, but the developmental changes may occur later to a lesser degree. The difference between critical periods and sensitive periods can be illustrated by two familiar examples.

The first—imprinting—is one you probably remember from ninth-grade biology. The Austrian ethologist Konrad Lorenz is well-known for describing how newly hatched greylag goslings instinctively begin to follow their mothers around within about twenty-four hours of hatching. Lorenz observed that goslings hatched in an incubator will follow the first conspicuous object they are exposed to—whether it’s a human being, a pair of boots, or even a wooden block. After a brief period of exposure to the object, they treat it as though it were their mom. This process, known as filial imprinting, has been considered a classic example of a critical period because there is a limited period (typically the first two days of life) when the gosling’s brain is prepared to make the association between an object and the concept of mother goose. Filial imprinting seems to release a behavior (in this case, finding mom) that is waiting for its environmental trigger.

The second example is the sensitive period for learning a second language. If you’ve ever tried to learn a second language as an adult, you know it’s harder than it is for a typical child. Immigrants who learn the language of their new country may retain an accent of their native language depending on how old they were when they emigrated. My grandmother and mother were both born in Poland and came to this country as Holocaust refugees—my mother when she was nine and my grandmother when she was thirty-nine. Both became fluent speakers of English, but my grandmother retained a thick accent of the old country.

If Nobel Prizes are any measure of the importance of a scientific question, the study of critical periods is clearly up there. Lorenz received the Nobel Prize in 1973 for his work on imprinting. And then, in 1981, Torsten Wiesel and David Hubel shared the Nobel Prize for groundbreaking insights into how critical periods actually work in the brain.*

It was known that children who are born with congenital cataracts covering the lenses of the eye continue to have problems seeing even after the cataracts are removed. This is not the case for adults: when cataracts develop later in life, surgical removal restores sight.

Hubel and Wiesel wanted to know how this difference in timing can have such a profound effect. To tackle the question, they developed a model to study the visual system in cats.27 They found that if you raise a kitten with one of its eyes sewn shut and then remove the sutures after several months, the animal never develops the ability to see in that eye. Somehow, depriving the eye of visual input early in life had an irrevocable effect.

After a series of studies, they showed that the problem wasn’t with the eye itself or even with the eye’s connections to the visual cortex of the brain. Instead, they discovered, the brain actually changes in response to a lack of visual stimulation. Areas of the cortex that would have been committed to processing information from the occluded eye are taken over by neurons from areas that received input from the nonoccluded eye. In other words, the brain has changed to allow the “good eye” to do the work of both.

Hubel and Wiesel had uncovered a striking example of brain plasticity—changes in the brain’s architecture that result from an animal’s experience. But they also found that this plasticity had a critical window—if the kitten’s eye was kept closed beyond about three months of age, the loss of vision in that eye would be irreversible.

Hubel and Wiesel’s work provided the first detailed description of two key mechanisms by which the environment affects the brain: critical periods and neuroplasticity. These two phenomena, which have motivated a vast amount of brain research, are actually quite closely connected. In essence, critical periods could be thought of as temporally constrained periods of environmentally dependent brain plasticity. Or, put more simply, critical periods are developmental windows during which experience can powerfully shape how the brain is wired.


EARLY EXPERIENCE: HISTORY AND MYSTERIES

THE NOTION THAT EARLY EXPERIENCE CAN HAVE PROFOUND and perhaps irrevocable effects on the mind has a long and storied history. It was, of course, a key question that framed the nature vs. nurture debate. And the answers that have been offered have ranged from superstition and myth to widely influential scientific theories.

FIRST IMPRESSIONS

Until the twentieth century, there was a widespread belief, endorsed by prominent physicians, that children could be permanently harmed by emotional frights, longings, and traumas that befell their mothers during pregnancy. The idea was that the circumstances of the emotional situation would leave an impression or mark on the fetus. Maternal impressions they were called, and they were blamed for all manner of deformity and intellectual weakness in the offspring. In an 1870 article from the medical literature, we find the following typical case:

A lady in the third month of her pregnancy was very much horrified by her husband being brought home one evening with a severe wound of the face, from which the blood was streaming. The shock to her was so great that she fainted, and subsequently had an hysterical attack, during which she was under my care. Soon after her recovery she told me she was afraid her child would be affected in some way, and that even then she could not get rid of the impression the sight of her husband’s bloody face had made upon her. In due time the child, a girl, was born. She had a dark red mark upon the face, corresponding in situation and extent with that which had been upon her father’s face. She proved also to be idiotic. (pp. 251–52)28

 

In other cases, the emotional distress was more modest:

A woman, between four and five months advanced in pregnancy, had an irresistible desire for a fine salmon which she saw in a market; this she purchased, despite her poverty, and as a result, at the end of the full term of normal gestation she was delivered of a child “the head and body of which presented a peculiar and strange conformation, in truth it was salmon-shaped, whilst the fingers and toes were webbed, representing the fins or tail of the salmon.” (pp. 247–48)28

 

Fortunately, salmon envy is no longer invoked as a threat to the well-being of young children, but one of the most dramatic and ancient narratives about the effects of early experience has continued to capture the popular imagination.

BABIES GONE WILD

What would happen if a child were utterly deprived of normal human nurturing? Is there a minimal set of experiences that are crucial for normal development—and if they don’t occur, can the damage be undone? These questions have made tales of so-called feral children eternally fascinating.

Stories of children raised by animals date back at least to the myth of Remus and Romulus, the twin sons of Mars who were left to drown in the river Tiber. Rescued and suckled by a she-wolf, they went on to found Rome. In more modern times, wild children have been a recurrent theme in literature and the arts, from Rudyard Kipling’s Mowgli to Edgar Rice Burroughs’s Tarzan.

The legend of feral children took on new significance after the eighteenth century as revolutionary ideas about human nature permeated Western thought. Jean-Jacques Rousseau’s notion of the “noble savage” romanticized mankind’s natural state, before exposure to the polluting effects of civilization. It was against this backdrop that eighteenth-century France was captivated by the discovery of Victor of Aveyron (the subject of François Truffaut’s 1970 film L’Enfant Sauvage), who was said to have spent most of his life alone in the woods until he emerged at age twelve. Unable to speak and apparently without any capacity for emotional engagement, he was judged to be a hopeless case by the medical establishment.29

Victor lacked all social skills; he defecated in public, ate like a wild animal, and showed no interest in human attachments. He was taken in by a young medical student, Jean Marc Gaspard Itard, who devoted years to rehabilitating the boy. Victor made little progress. He never acquired language or developed social connections. Through the lens of twentieth-century medicine, some have wondered whether the boy suffered from autism or another developmental disorder. But at the time, Victor’s sad life seemed to be a tale of the immovable force of early experience.

Tales of feral children have continued to grab headlines over the past two hundred years. And yet, for all the romance and sensationalism that these stories have generated, the underlying story is anything but exotic. In most of these cases, including Victor’s, there is a history of abuse and neglect.29, 30 In the end, these stories illustrate the profound and lasting effects of early adversity and trauma—a theme we’ll return to later in this chapter.


TIMING IS EVERYTHING

OVER THE PAST SEVERAL YEARS, NEUROSCIENTISTS HAVE BEGUN to sketch a fascinating picture of how sensitive periods shape the development of our mental and emotional lives. As Hubel and Wiesel discovered in the case of vision, a lot depends on getting the right instructions from the environment at the right time. As we’ll see, the same idea turns out to apply to many of the brain’s other fundamental functions—our sense of taste, our capacity to learn language, our ability to process emotions and to form attachments.

In the 1980s William Greenough and his colleagues at the University of Illinois provided new insights into how the environment shapes brain development. They suggested that there are two different developmental phases of plasticity during which the environment impacts the wiring of the brain.31

The first they called the experience-expectant phase, in which the brain makes use of “environmental information that is ubiquitous and has been so throughout much of the evolutionary history of the species” (p. 540).31 Our brains have been prepared to await signals from the environment that have reliably been there over millennia—like the visual contours of objects in the world or the presence of a mother. Sensitive periods of development are essentially windows of experience-expectant learning. If you deprive an animal of an expectable environment during these periods—as Hubel and Wiesel did when they patched one eye of a newborn kitten and Lorenz did when he substituted himself for mother goose—you interfere with the development of fundamental brain functions. In other words, if you mess with the expectable environment, you mess with the brain in ways that can last a lifetime.

Early in brain development, experience can shape how neurons hook up. Humans, for example, are born with billions of neurons packed into their little heads. In the first twelve to eighteen months of life, these neurons undergo an explosion of connectivity, sprouting branches and forming trillions of junctions known as synapses. Over the next several years, this neuronal thicket is scaled back. During the experience-expectant phases that underlie sensitive periods, these connections are refined by a use-it-or-lose-it strategy. The expectable environment reinforces useful connections and eliminates irrelevant ones. As a result of this synaptic pruning, brain circuits are refined in ways that enable the animal to adapt to key features of the environment. Some brain regions, notably the prefrontal cortex, which is involved in higher cognitive functions and self-control, continue to be wired well into adolescence. So experience is literally remodeling the brain, and experiences that occur during times of active pruning might have long-lasting effects.
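The use-it-or-lose-it logic can be caricatured in a few lines of code. This is a toy sketch, not a neuroscience model: the synapse counts, usage probabilities, and pruning threshold below are all invented for illustration. The point is simply that starting with an overabundance of connections and keeping only the ones the environment exercises leaves behind a smaller, environment-shaped set.

```python
import random

random.seed(0)

# Toy "use it or lose it" model: begin with an overabundance of synapses,
# let the (simulated) environment exercise a subset, then prune the rest.
NUM_SYNAPSES = 1000
USE_PROB = 0.3         # fraction of synapses the environment tends to exercise
PRUNE_THRESHOLD = 3    # synapses used fewer times than this are eliminated
ROUNDS = 20

use_counts = [0] * NUM_SYNAPSES
# The "expectable environment" reliably activates some synapses and not others.
exercised = {i for i in range(NUM_SYNAPSES) if random.random() < USE_PROB}

for _ in range(ROUNDS):
    for i in range(NUM_SYNAPSES):
        if i in exercised:
            if random.random() < 0.8:    # exercised synapses fire often
                use_counts[i] += 1
        elif random.random() < 0.05:     # others fire only sporadically
            use_counts[i] += 1

# Pruning: connections that were rarely used are eliminated.
surviving = [i for i in range(NUM_SYNAPSES) if use_counts[i] >= PRUNE_THRESHOLD]
print(f"Started with {NUM_SYNAPSES} synapses; {len(surviving)} survive pruning.")
```

Running the sketch, most of the surviving synapses are the exercised ones: the environment hasn’t added anything, it has selected among connections that were already there.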

As Greenough and colleagues put it: “If the normal pattern of experience occurs, a normal pattern of neural organization results. If an abnormal pattern of experience occurs, an abnormal neural organization will occur” (p. 544).31 There is a kind of irreversibility that accompanies this process “because a set of synapses has become committed to a particular pattern of organization, while synapses that could have subserved alternative patterns have been lost” (p. 546).31

The experience-expectant or sensitive period of development allows the brain to develop the fundamental, species-typical skills it needs to navigate the world. But some of the most important circumstances an animal faces as its life unfolds are not expectable. They are unique to a particular geographic location, family, or social system. And this is where the second phase of plasticity—experience-dependent learning—comes in. This is how the brain responds to the fine-grained information that is unique to the world around it: where the nearest food sources are, the politics of the social hierarchy it lives in, and so on.

As the specific circumstances of our lives play out, the brain can adapt. Some of this adaptation involves the formation of new synapses. For example, dendrites, the projections from neurons that form the receiving end of a synapse, can sprout extra branches that create new synaptic connections to other neurons. As these synapses form and interconnect, they create new wiring of brain circuits that respond to the demands of the environment.32 This lifelong process is one of the crucial ways that our brains shape the twists and turns along the unique path that each of us follows in life.

TURN ON, TUNE IN, OR DROP OUT

THE SYNAPTIC PRUNING THAT OCCURS WITH EXPERIENCE-EXPECTANT learning has a remarkable implication. The normal development of many of our key brain functions—including language, emotional responses, and social cognition—involves a loss of abilities. In other words, we actually begin life with capacities that we must lose over time so our brains can develop normally.

Take the case of language. If you really think about the task facing an infant who has to learn to speak and understand language, you begin to doubt that it could ever occur. The cognitive neuroscientist Patricia Kuhl has called it “cracking the speech code.”33 Out of the hundreds of available consonants and vowels, each language has derived a distinct set of about forty phonemes that change the meaning of a word (for example, from take to lake or from cream to creep). With a brain that’s only a few weeks old, infants must begin to recognize these acoustic differences. They also have to learn that sounds are grouped into the distinct units that make up words. That’s no mean feat. Acoustical analyses show that there are actually no silences between spoken words in a sentence.33

Imaginehowhardreadingthisbookwouldbeiftherewerenospacesbetweenthewords. Or think about what you hear when you listen to someone speaking a foreign language: you don’t know the words or even where they begin and end. To your ear, it’s just a stream of sounds. Even so, you at least understand that there are words there, and you have experience with how sounds are grouped into words in your own language. The infant has none of this experience. What’s more, the same word almost never sounds the same. Every occurrence of the word play may sound different depending on who is saying it and in what context. An infant may hear the same word spoken by a man, woman, or child, each of whom says it at a different rate, pitch, inflection, and tone. The word may sound quite different in one context (“Do you want to play with that?”) than in another (“Don’t play with your food!”). And yet infants are instinctively able to categorize words and pick them out of a string of sounds. The genetic program that allows them to do this is far better than any program software engineers have been able to write, as anyone who has used speech recognition software can tell you.

What’s even more amazing is that infants are actually better than adults at distinguishing speech sounds. We are born with a universal capacity to distinguish phonemes and sounds that are found in any of the world’s languages.34 An infant can tell the difference between word sounds used in any language and can hear the boundaries between words in any language. You cannot.

Japanese infants, like English infants, can tell the difference between “r” and “l” sounds, but Japanese adults, unlike English adults, don’t hear that distinction.33 That’s because the Japanese language combines those sounds into a single phonetic unit while English treats them separately. Before you were about eight months old, you were prepared to understand any human language. And then you lost it. That’s because the environment encouraged your brain to make a commitment to your native language. How does this happen?

As Kuhl explains,33 the infant brain seems to use a kind of “statistical learning” for language acquisition. As infants are exposed to more and more examples of their native language, they start to register the statistical frequency of certain sounds and lexical patterns. The brain’s language circuits use this statistical analysis to progressively tune into the language around it.* By committing to one language, the brain becomes better and better at learning that language. The faster and more fully the brain makes this transition, the faster and better the child is at learning its native language.35 But like any commitment, this one comes at a cost. As the infant brain tunes into the ambient speech of native language speakers, it tunes out nonnative (foreign) language sounds. After the first year of life, the brain has an increasingly difficult time understanding nonnative speech sounds. This explains in large part why learning a second language is so much harder for adults than for children (whose language circuits are not yet fully committed).
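One flavor of this statistical learning can be sketched concretely. The toy below is an illustration, not Kuhl’s model: it assumes the infant tracks how often one syllable follows another (a “transitional probability”), using made-up three-syllable words of the sort used in classic infant experiments. Within a word, one syllable reliably predicts the next; across a word boundary, it does not—so dips in transitional probability mark where words begin and end.

```python
import random
from collections import Counter

# Made-up three-syllable "words" (illustrative, not real stimuli).
WORDS = ["bidaku", "padoti", "golabu"]

def syllables(word):
    """Split a word into its two-letter syllables."""
    return [word[i:i + 2] for i in range(0, len(word), 2)]

# Build an unsegmented "speech stream" by concatenating random words,
# mimicking the fact that spoken sentences contain no silences between words.
random.seed(1)
stream = []
for _ in range(300):
    stream.extend(syllables(random.choice(WORDS)))

pair_counts = Counter(zip(stream, stream[1:]))
first_counts = Counter(stream[:-1])

def transitional_probability(a, b):
    """Estimate P(next syllable is b | current syllable is a)."""
    return pair_counts[(a, b)] / first_counts[a]

# Within the word "bidaku", "bi" is always followed by "da" ...
print(transitional_probability("bi", "da"))   # 1.0
# ... but "ku" ends a word, so what follows it is unpredictable.
print(transitional_probability("ku", "pa"))   # roughly 1/3
```

A listener (or learner) that posits a word boundary wherever the transitional probability dips would recover the three hidden words from the unbroken stream, with no silences or labels to help.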

The developmental program involved in language acquisition is marvelously efficient. Within three years, a newborn morphs into a toddler who is fluent in her native language. But this story also reveals key elements that our brains use over and over to acquire the species-typical (human nature) skills needed to make sense of our environment. We begin with an open mind, primed to detect information that evolution has designed our brains to expect. During this phase, the brain casts a wide net. It is sensitive to a broad variety of relevant information because the neonate may be born into any of a range of environments. So at birth, we can distinguish phonetic information in any language because our brains have to be prepared to be born in Peoria, Kabul, or Tokyo. After a time, cues from the environment clue the brain into where it has landed and the salient details of that world. This triggers an experience-dependent phase, in which brain development depends on specific inputs from the environment. A commitment is made and other roads are not taken.

THE FIRST TIME EVER I SAW YOUR FACE

BABIES USE THIS SAME GENERAL MECHANISM TO LEARN THE LANGUAGE of social and emotional communication. Just as words are the units of spoken language, facial expressions are the units of emotional language. Before they can speak, babies use facial expressions to recognize Mommy and Daddy and figure out whether they’re in trouble or safe. If you’ve only recently emigrated from the womb, these abilities are about as important as you can get. Not surprisingly, then, evolution has prepared the baby brain to read faces.


Like so much else, we owe this insight to Darwin himself. In 1872 he devoted an entire volume to cataloguing the expression of emotions in humans and animals36 and postulated that facial expressions of emotion are innate, universal, and evolved tools of communication. As Darwin concluded, “That these and other gestures are inherited, we may infer from their being performed by very young children, by those born blind, and by the most widely distinct races of man” (p. 1468).

In a series of fascinating studies, the developmental neuroscientist Charles Nelson and his colleagues have shown that babies have a facial recognition system that starts out, like the infant’s language system, “broadly tuned.” In the first of these studies, they showed infants pairs of pictures of monkey and human faces to see if they could distinguish individual faces of either species.37 At six months of age, infants were equally good at recognizing both human and monkey faces—that is, they could tell the difference between one person and another, but also one monkey and another. You might say they were bilingual for faces.

How can you tell if a six-month-old recognizes a face? Nelson and his colleagues used a well-established test. Babies will look longer at an object they’ve never seen before than they will at a familiar object. So they showed babies pairs of faces, one they’d seen before along with one new one. By nine months, babies could still tell the new human face from the familiar one, but they’d lost the ability to discriminate monkey faces. What does this mean? It appears that, like learning a native language, face recognition passes through an experience-expectant window—a sensitive period—when it’s powerfully shaped by experience. As their environment provided more exposure to human faces and no exposure to monkeys, the babies’ brains became committed to the human variety.

Once again, the brain, initially open to many possible worlds, makes an irrevocable commitment to the one it’s given. In many ways, brain development, like life itself, is about making choices.

OKAY, NOW YOU’RE SCARING ME

EXPERIENCE DOES MORE THAN SHAPE OUR ABILITY TO RECOGNIZE the people around us—it’s also crucial for tuning into their emotions. Between six and nine months of age, babies enter an experience-expectant phase in which their brains are looking for information about the important features of faces, including expressions of emotions. The first emotion we learn to recognize is fear. More than any other expression, a fearful face is a signal that our very survival may depend on. Seven-month-old infants fixate on fearful expressions (but not happy or neutral faces) even though such faces don’t yet trigger fear responses in the infants themselves.38, 39 It’s as though babies recognize that a frightened expression is telling them something important, but they don’t yet understand what it is. Not coincidentally, the brain’s fear recognition circuitry comes on line when the baby is beginning to explore its environment and thus needs feedback about what’s safe and what’s dangerous. Babies who crawl toward a visual cliff, where the ground appears to fall away, will continue crawling if their mother’s face looks happy but will stop in their tracks if their mom looks scared.40

The network of brain structures that allows babies to decode facial expressions includes two regions of the cortex that appear to be specialized for recognizing faces—the fusiform gyrus and the superior temporal sulcus (STS)—and two regions that are key players in our experience of emotion—the amygdala and the orbitofrontal cortex (OFC), a ridge of brain cells above the eyes (fig. 3.1). When a baby sees a face, the visual cortex relays information to the amygdala, which quickly uses low-level information to get a rapid read on the novelty and emotional tone of the face. The amygdala communicates with the OFC, as more detailed information on the salience of the face is processed. Meanwhile, the STS and fusiform gyrus are recognizing and decoding features of the face and communicating back and forth with the emotion areas to process the emotional expression in finer grain.

[Figure 3.1]

Brain circuits that develop to process emotional signals from faces include emotion-processing regions (amygdala and OFC) and face recognition areas (STS and fusiform gyrus).

As babies gain experience with faces, they begin to notice how expressions link with sounds and events—smiles are paired with cooing sounds and soothing sensations, fearful faces are paired with worried tones and perhaps followed by a painful fall. This tuning of the mind has a biological counterpart in the circuits of the brain. The wiring of the face recognition network is refined and strengthened while neurons and synapses that are superfluous are pruned back. In the end, we emerge from this sensitive period with a neural system that we will rely on for the rest of our lives to judge whether other people are angry or happy with us, whether they are threatening or welcoming. Over the course of our years, the personal adversities and fortunes we encounter will continue to calibrate this system through a process of experience-dependent learning. But as we’ll see in the next section, adversity and trauma can distort the developing brain in ways that cast a long, dark shadow.

FACIAL PROFILING

IN THE LAST CHAPTER, WE SAW THAT EACH OF US HAS A SET OF genetic variations that make us more or less prone to perceive and feel certain emotions. These genes help create subtle biases in how we sense and respond to stress, unfamiliarity, and reward. But experience itself plays on those biases in crucial ways.

The most striking demonstration of this comes from research on the effects of childhood adversity and maltreatment. Child abuse and neglect are known to have long-lasting effects on behavior and emotional reactivity, and each year in the United States, nearly 2 percent of children under age three are victims of maltreatment. Maltreated children are more likely to have insecure attachments to other people; they have difficulty understanding the feelings and thoughts of others and form fewer close friendships; and they are prone to anxiety, depression, aggression, academic failure, and antisocial behavior later in life.41

How does early adversity cast such a long shadow over its victims? The answer emerging from biological studies is that the environment actually biases the brain’s emotional circuitry in ways that refocus the lens through which we experience the world. When bad things happen, they can literally change the way we see other people.

Developmental psychologist Seth Pollak and his colleagues at the University of Wisconsin have shown that abused children have a perceptual sensitivity for anger.42 In one study, they showed children pictures of emotional faces (fearful, angry, sad, and happy faces) that morphed from fuzzy to focused over the course of the experiment. The clarity of each picture was gradually increased at three-second intervals and after each interval, the children were asked to say what emotion, if any, they saw. Children who had been abused were able to pick out an angry face much sooner than children who hadn’t been maltreated.

In other studies, Pollak’s group has shown that the effect was specific to angry faces.43, 44 They showed abused and nonabused children (the controls) faces that gradually morphed between two expressions: happy → fearful, happy → sad, angry → fearful or angry → sad. The researchers wanted to see whether abused children and controls perceive the transition from one category of emotion to the other at the same point in the morphing continuum. In fact, the two groups were identical in their ability to categorize the transition from happy to sad or fearful faces. However, there was a dramatic difference in how the two groups perceived angry faces. The controls stopped seeing the angry face when only about 30 percent of the expression was sad or fearful (and 70 percent was angry), but the results were opposite for abused children. They kept seeing anger until about 70 percent of the facial expression was sad or fearful (and only 30 percent angry). Sadly, their childhoods had made them experts in anger. They needed much less information to see an angry face than other kids. Their brains had adapted to their environments to give them an edge—a little extra time to see danger coming. That brief head start might mean the difference between getting hit and avoiding abuse by appeasing a parent, shielding their bodies, or running away.
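The shift Pollak’s group measured can be summarized with a little arithmetic. Think of each child as carrying a category boundary along the morph continuum: a face is seen as angry whenever its anger content exceeds that boundary. The numbers below are the approximate values from the studies just described; the code is a cartoon of categorical perception, not of the experiments themselves:

```python
def perceived(percent_angry, boundary):
    """Label a morphed face 'angry' once its anger content
    crosses the observer's category boundary."""
    return "angry" if percent_angry >= boundary else "other"

# Controls stopped seeing anger when the face dropped to ~70% angry;
# abused children kept seeing anger all the way down to ~30% angry.
CONTROL_BOUNDARY, ABUSED_BOUNDARY = 70, 30

half_morph = 50  # a face halfway between angry and sad
print(perceived(half_morph, CONTROL_BOUNDARY))  # -> other
print(perceived(half_morph, ABUSED_BOUNDARY))   # -> angry
```

The same ambiguous face lands on opposite sides of the two boundaries, which is why an abused child reads anger into expressions that other children do not.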

Researchers who study the effect of adversity on brain development face a problem. If children raised in emotionally or socially impoverished environments see the world differently from other children, it is still possible that the difference depends more on their genes than on their experience. After all, parents are not only the most powerful environmental influence on their infants, they also pass on their genes. Maybe the same genes that predispose a parent to be less nurturing also predispose their children to have emotional and behavioral problems. If you really wanted to test the effect of the environment, you’d need to do some kind of controlled experiment where you randomize infants to grow up in either an environment that is nurturing or one that is not. But surely that would be impossible.

FOSTER CARE FOR THE MIND

AS IT TURNED OUT, A TRAGIC SERIES OF EVENTS COMBINED WITH the passion of a few scientists to set the stage for just such an experiment. In 1966 Nicolae Ceausescu, head of Romania’s Communist Party, issued a decree banning abortion for women under forty-five. His motivation was not religious but political. Ceausescu was determined to expand Romanian communism by expanding the number of Romanian Communist workers. Government agents, nicknamed “the Menstrual Police,” rounded up women under forty-five years of age for compulsory pregnancy examinations.45 Restrictions were also placed on contraception and divorce.46 As Ceausescu told his countrymen, “the fetus is the socialist property of the whole society.”47 He issued a law requiring women under forty to have five children. Financial incentives were provided for those who complied and severe penalties for those who did not, including tax penalties of up to 20 percent, a so-called celibacy tax.46, 48

Ceausescu’s economic policies also decimated the Romanian economy, and the consequences were tragic. The prohibition on abortion spurred a huge increase in illegal abortions that were often primitive and lethal. By the time Ceausescu was overthrown in 1989, Romania had achieved the distinction of having Europe’s highest maternal and infant mortality rates.49 Because of widespread economic hardship, those who did not obtain abortions were often unable to support additional children and had to abandon them to state-run institutions. At the end of Ceausescu’s reign, more than 150,000 children were housed in orphanages that were notorious for their appalling conditions.50, 51

In the late 1990s the Romanian minister for child protection was seeking alternatives to institutionalization for the thousands of children who remained abandoned in Romanian orphanages. At the time, few alternatives existed; there was almost no government-sponsored foster care. Around the same time, the developmental neuroscientist Charles Nelson and his colleagues Charles Zeanah and Nathan Fox were trying to study how experiences in infancy affect the development of the brain and the consequences for cognitive, emotional, and social development. The Romanian government invited them to visit Bucharest to discuss possibilities for collaborative research. Soon thereafter, the Bucharest Early Intervention Project (BEIP) was launched.

The research conducted through the BEIP stands as one of the boldest and most important studies of the impact of early environment on the development of the mind. While previous studies had shown that children reared in institutions had developmental delays and altered activity in specific brain regions when compared to children who were adopted into families, they all suffered from a possible selection bias. It’s likely that children who are adopted away from institutions are psychologically and physically healthier than those who remain behind. As a result, any differences between these two groups could be due to differences in the children, not the effect of their caregiving environment.

The BEIP avoided this problem by constructing a unique experiment. After extensive ethical review and collaboration with governmental and nongovernmental organizations, the BEIP researchers randomized 136 babies (age six to thirty-one months) who had been abandoned to an orphanage at birth to one of two groups. Half of the children were randomized to remain in institutional care while the other half were assigned to foster care at an average age of twenty-one months.* For comparison, they also included a third group of eighty children who were born in the same hospitals as the institutionalized children but who were living with their biological parents and had never been institutionalized.

In 2007 the BEIP group published the first major results of their study in Science magazine.52 They compared cognitive development among the three groups before and after the intervention and tracked them up to age four and a half. The findings were unequivocal. The institutionalized group was significantly worse off than both the foster care group and the never-institutionalized control group on a whole range of developmental tests: IQ, sensorimotor abilities, and language development. Indeed, the institutionalized group’s test scores placed them in the range of borderline mental retardation while the foster care group had caught up to the controls by the time they were three and a half years old.

The researchers also found that the earlier a child was placed in foster care, the greater the gain in cognitive abilities. Those placed before they were two years old performed as well as the never-institutionalized group. When the researchers reexamined the children at age eight, the IQ-boosting effect of foster care placement was still detectable and those who entered foster care before age two seemed to do best.53 The results supported the idea that there is a sensitive period (before the age of two) when a change in nurture can have dramatic effects on the brain.

In other analyses, the foster care group was found to express more positive emotions and had better attention than the institutionalized group.54 These differences emerged quickly after children in the foster care group were removed from the orphanages. It was as though the children were stuck in an experience-expectant phase with their brains on hold, waiting for social interaction. Once they were provided with social stimulation, they rapidly responded, unleashing their capacity for joy.

The effects of the foster care intervention could even be seen in how the children’s brains functioned. Before and after the intervention, the children were shown emotional faces while their brain activity was measured with EEG electrodes. Compared to the children reared in their own families, children who remained institutionalized had diminished brain responses to seeing the faces, and these deficits persisted through the last measurement, when they were four and a half years old. On the other hand, the brain responses of children placed in foster care normalized, although by four and a half years they had still not caught up to the never-institutionalized group.

The deprivation that came with being raised in an institution had significant effects on the children’s risk for psychiatric disorders. By age four and a half, they were more likely to have both “internalizing disorders” (anxiety and depression) and “externalizing disorders” (behavioral and impulsive disorders like ADHD and conduct disorder). Those who were placed in foster care by age two had lower levels of internalizing symptoms, but their risk for ADHD and other behavioral disorders was unchanged. Sadly, the window for reversing these symptoms had apparently closed before they left the orphanages.55

The results of the BEIP project were so compelling that they achieved something that few experiments do: they changed national policy. Several years after the study began, the Romanian government passed a law prohibiting the institutionalization of children under two years of age unless they were severely handicapped.

STUMBLING ON SADNESS

THE BEIP PROVIDES DRAMATIC AND CONVINCING PROOF THAT EXTREME deprivation in early childhood can have lasting effects on the development of intelligence and mental health. By the time they were four and a half years old, the children in the institutionalized group were nearly three times more likely to have depression or anxiety disorders compared to the foster care group.56 Adversity changes the brain and in the process bends the trajectory of human development. The enduring impact of childhood adversity is well known to every psychiatrist.

Deidre Ward came for help in a moment of crisis. Two months earlier, her boyfriend of three years had ended their relationship, saying she was too “needy and stressed-out all the time.” At thirty-five, she had found herself alone again and sank into a bout of depression, much like those she had struggled with since she was a teenager. She was spending most of her days lying awake in her bed and had been unable to go to her job as a paralegal for three weeks. Crying jags, panic attacks, and thoughts of death had come to dominate her life. She was increasingly convinced that her worst fear was coming true: she would always be alone and would never have the chance to have her own family. Deidre said she’d always suffered from low self-esteem and an abiding sense of insecurity about her appearance, her intelligence, and her ability to sustain an intimate relationship. Her boyfriend was right about her, she said with a sadness that was heartbreaking. Whenever she allowed herself to become close to anyone, her anxiety would overtake her. Her desperation to hold on to a relationship would ultimately sabotage it and drive her boyfriend away. The reality was that Deidre was an attractive and accomplished young woman who had endured substantial adversity in her life.

Deidre was born in a lower-middle-class neighborhood just outside Baltimore. When she was three, her father abandoned the family, leaving her mother to care for Deidre, her six-year-old sister, and eleven-year-old brother. Her mother was overwhelmed, both emotionally and financially. She tried to find work, but with three children to raise and limited job experience, it proved too difficult. Over the next several years, Deidre’s mother seemed to be increasingly stressed and erratic. It was never clear what would set her off, but when her mother got upset, she would blame and berate the kids for her troubles. When Deidre was a little older, her mother would often go out, leaving Deidre’s brother in charge of the girls, or sometimes leaving them with a neighbor, but they were never sure where she was or how long she’d be gone.

When Deidre was about ten, her mother remarried a man who was gruff, irritable, and seemingly resentful of the kids. When Deidre was distressed, her stepfather would tell her mother not to “baby her” and to “let her cry—she needs to learn to be tough.” Her stepfather had a nasty temper and though he never hit the kids, Deidre often lived in fear that he would. In high school, she had few friends and was teased for being socially awkward, but she yearned for some kind of connection. She turned to books for escape and ended up doing well in high school, but she always carried with her a vague sense of dread that periodically bloomed into a full-blown depression. Minor setbacks often seemed catastrophic, and she found herself constantly worried about her school performance, her weight, and her social life. She began dating in college, but her relationships were brief and she felt tense in romantic situations. She seemed to be always on guard for any sign that her boyfriend was angry with her or wanted out. This latest relationship was the longest she’d had and she was beginning to feel hopeful that it would last. Now, left alone again, she felt ashamed for believing they had a connection—like the daydreaming child in a store who tugs at her mother’s dress only to find out that it isn’t her mother.

Deidre couldn’t recall a time when she’d been carefree or really happy. Looking back at her childhood, she felt as though she’d never quite gotten a solid footing and had been stumbling ever since. “I feel haunted,” she said.

Deidre’s childhood seemed to have set her on a collision course with suffering. As she passed through sensitive periods of emotional development, her brain tuned into a world that was threatening, chaotic, and unreliable. The results seemed to reverberate in her troubled relationships and bruised self-image ever after. Not surprisingly, researchers have found that childhood adversity and trauma are among the strongest risk factors for depression.

EXPRESS YOURSELF

WHEN WE SAY THE ENVIRONMENT AND EXPERIENCE AFFECT BRAIN development, what does this mean exactly? How does the environment get into the brain?

The answer emerging from recent research is that experience actually changes how our genes behave.

To explain the nuts and bolts of all of this, we have to go deep down to the molecular level and into the world of gene expression—the process of turning a gene’s instructions into a usable product (RNA and proteins). Our genes carry the set of instructions for making the proteins that run the cells of our bodies. But how those instructions play out depends on the details and timing of gene expression. In case you’ve ever wondered, that explains why your brain doesn’t have teeth and your kidneys don’t salivate. Every cell contains the same genome* and yet some cells become neurons while others end up making enzymes in the pancreas or making the heart contract. This process of specialization occurs because only certain genes are actively expressed at specific times and in specific places. And that’s how experience plays its hand: it can shape how we develop, including the wiring and rewiring of the brain, by affecting where and when specific genes are expressed.

Several factors control how and when genes are expressed. For example, DNA sequences called promoters, typically located on the front end of a gene, are docking stations for proteins called transcription factors that are made by other genes. When transcription factors bind to the gene promoter, they can turn on the gene. You and I may have different promoter sequences on a given gene that make it easier or harder for transcription factors to turn it on. So, your gene might be more active—more likely to be expressed—than mine. We saw an example of this in Chapter 2: some people have the “long” version of the serotonin transporter gene promoter while others have the “short” version. The “short” version makes the serotonin transporter gene less likely to be expressed, so people carrying that version make less of the serotonin transporter protein. And, as I discussed in Chapter 2, that difference may contribute to anxiety by making the amygdala more sensitive to threats in the environment.

But that difference is fixed—you either have the short version or you don’t, and the environment isn’t going to change that. So, if we’re talking about experience changing gene activity, we need a mechanism that allows the circumstances of our lives to activate or deactivate genes. One solution nature has arrived at has to do with what scientists call epigenetics. Epigenetics refers to the study of gene expression changes that are not due to variation in the DNA sequence itself.57

Some epigenetic effects involve chemical modifications of the chromosomes—you can think of them as chemical “dimmer” switches that get attached or removed from our chromosomes. These modifications make it more or less difficult for transcription factors to either turn on or silence genes. And this is one key way, at a molecular level, by which the environment directly regulates how our genes function. In essence, the environment can “mark up” the genome, annotating the basic text with instructions on where and when it can be read.

Two of the best-understood epigenetic mechanisms are DNA methylation and chromatin remodeling. DNA methylation involves the addition of a methyl group—a simple molecule that consists of one carbon atom bound to three hydrogen atoms—to a gene. When methyl groups attach to specific DNA sequences, they act like a lock or an off switch, preventing transcription factors from binding to the gene and turning on expression. With chromatin remodeling, chemical groups (including methyl or acetyl groups) are added to or removed from proteins that our DNA is wrapped around. These changes affect how DNA interacts with the cell’s gene expression machinery so that specific genes are turned on or off. In order to understand how chromatin remodeling affects gene expression, we need to understand how chromosomes are packaged.

Our DNA doesn’t just lie naked in the nucleus of our cells. Rather, the long strands of DNA that make up our chromosomes are tightly spooled around proteins called histones. That packaging is essential because if the chromosomes weren’t tightly wrapped, they simply wouldn’t fit. The nucleus of a cell is about six millionths of a meter across—about four thousand times smaller than the size of a single uncoiled chromosome. And of course each nucleus has to accommodate the twenty-three pairs of chromosomes that make up our genome.

[Figure]

The “epigenome” plays a major role in when and where our genes are expressed.

So how do you fit twenty-three pairs of chromosomes into a space 1/4000 their size? You pack them really, really tight. The chromosomal DNA wrapped around histones and related proteins forms what scientists call chromatin. The activity of genes depends in part on how tightly wound, or condensed, the chromatin is. Transcription factors and other components of the cell’s transcription machinery have a harder time getting through regions of condensed chromatin to turn on the genes that lie underneath. Conversely, genes in regions where the chromatin is relaxed are open to transcription factors and more likely to be expressed. These shifts in chromatin can occur when the histone proteins are chemically marked up (by methyl, acetyl, or other chemical groups), with the result that gene expression gets dialed up or down.
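At the risk of caricature, the two "dimmer switches" described above can be sketched as a toy model. In the sketch, a methylated promoter locks a gene off outright, while condensed chromatin merely dials its expression down; the specific numbers are invented for illustration and stand in for no measured values:

```python
def expression_level(base_rate, promoter_methylated, chromatin_open):
    """Toy model: methylation blocks transcription factors from
    binding, so the gene is silenced; condensed chromatin leaves the
    gene reachable but much harder to transcribe."""
    if promoter_methylated:
        return 0.0  # locked off entirely
    return base_rate if chromatin_open else base_rate * 0.1  # dialed down

# The same gene in three cellular contexts:
print(expression_level(1.0, promoter_methylated=False, chromatin_open=True))   # -> 1.0 (fully on)
print(expression_level(1.0, promoter_methylated=False, chromatin_open=False))  # -> 0.1 (dimmed)
print(expression_level(1.0, promoter_methylated=True,  chromatin_open=True))   # -> 0.0 (off)
```

The point of the cartoon is simply that one genome can yield very different levels of output depending on how it is marked up, which is how the environment gets a vote without ever changing the DNA sequence.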

So the environment and our life experiences can fine-tune the expression of genes in our brains by at least two routes: by triggering chemical changes that mark up the DNA itself or by modifying proteins around the DNA. While the genetic code was cracked decades ago, this “epigenetic code” is only now being deciphered. And the results of epigenetic research are already offering some surprising clues about the biology of behavior. For example, epigenetic differences help explain why identical twins often turn out to be quite different. Over time—due to chance and experience—the DNA and chromatin of identical twins acquire different epigenetic switches, which can alter the expression of their genes, and drive them apart.58 That, for example, may explain part of the reason one monozygotic twin develops a psychiatric disorder like schizophrenia while his “identical” co-twin doesn’t.59

A variety of environmental factors are known to alter the epigenetic state (methylation or acetylation) of DNA and chromatin, including diet, low-dose radiation, and various drugs and chemicals like cigarette smoke and alcohol. Thus, the epigenome serves as a gateway by which the world around us can change how our genes express themselves. Recent research is showing that early life experiences can also affect the brain’s epigenome. And as we’ll see in the next section, that may hold a key to how nurture shapes the workings of the mind.

“I’M BECOMING MY MOTHER”

MICHAEL MEANEY AND HIS COLLEAGUES AT MCGILL UNIVERSITY pioneered this research by studying the effect of maternal care on rat mothers and their pups. Some rat moms are very nurturing: they lick and groom their newborn pups a lot and they arch their backs and crouch over the pups when they nurse, making it easier for their babies to feed.60 Other rat moms are colder and more distant: they don’t lick and groom much or make it easy for the pups to nurse. Meaney and colleagues found that this difference in maternal care had a profound and lasting effect. What’s more, the die is cast within the first week of life—a critical period that corresponds roughly to human infancy. Nurturing mothers set the development of their pups’ brains and stress hormone systems on a lifelong path that helps them cope well with stress. However, offspring of mothers who are absent or less nurturing grow up with hyperreactive stress systems and a lifelong tendency to be fearful.

All of this is due to how the offspring are raised and not the genes they inherit: when rat pups of low-licking-grooming (low-LG) mothers are taken at birth and raised by high-LG moms, their behavior and biology match those of the biological offspring of high-LG moms (and vice versa).

Meaney’s group discovered that maternal care programs the infant’s stress systems by changing the chemistry of its chromosomes. Recall that methylated DNA is closed off to transcription factors that regulate gene expression. On the first day of life, part of a key stress response gene in the newborn’s brain, the glucocorticoid receptor (Nr3c1) gene, is locked up by methyl groups. This gene, which makes the receptor for the stress hormone cortisol, will go on to play a crucial role in the development and regulation of the stress response system by determining how quickly and effectively the brain copes with adversity.

By the end of that first week, the Nr3c1 gene will either stay locked or it will be unlocked, and the key is a mother’s touch (or, more specifically, how much the mother licks and grooms her pups and how she nurses them). Meaney’s group found that animals born to nurturing mothers produce higher levels of an enzyme that unlocks (demethylates) the gene, setting the course for the development of healthy stress responses. But those born to distant mothers end the week with the lock intact, beginning their lives with a brain less equipped to manage stress and set on a lifelong course of fearfulness and hormonal dysregulation.

This mothering effect reaches across the generations: by dampening the expression of stress response genes in their daughters, less-nurturing mothers produce offspring who have emotional and behavioral problems. As a result, the daughters themselves become less-nurturing mothers and go on to raise fearful offspring who go on to be less-nurturing mothers, and the cycle continues. These behavior patterns are transmitted without any change in DNA sequence. Going one step beyond Freud, the implication is that a mother's behavior can influence the emotional development not only of her children but also of her grandchildren.

There is emerging evidence that early maternal care can shape the developing brain of human infants as well. In one study,61 a team from the University of British Columbia found evidence at a molecular level that the same kind of epigenetic effects found in rat studies of maternal care can be seen in human infants. They compared infants born to depressed mothers to infants whose mothers weren’t depressed. Babies whose mothers were depressed during the third trimester of pregnancy had increased DNA methylation of the human NR3C1 gene at birth, the same gene that was methylated in the offspring of low-LG rat mothers. What’s more, these babies had exaggerated stress hormone responses when they were tested at three months old.

The long arm of epigenetics was strikingly demonstrated in another study from Meaney's group.62 Child maltreatment is a potent risk factor for depression and suicide. It's also known that people with severe depression, like the offspring of low-LG rats, tend to have high levels of the stress hormone cortisol and lower than normal brain expression of the glucocorticoid receptor (NR3C1) gene. Could the same epigenetic switch that derails the rat stress hormone system be the link between child abuse and suicide?

To answer this question, Meaney and his colleagues62 looked at the human version of the glucocorticoid receptor gene (NR3C1). They obtained brain tissue from adult suicide victims who either had or had not suffered child abuse as well as from controls who had not died by suicide. When they looked at the NR3C1 gene, they found no differences between the groups in terms of DNA sequence. But at the level of epigenetics, the results closely matched what they had found in rats: the promoter of the gene was much more highly methylated in those with a history of abuse. And, as in rats, this methylation blocked expression of the gene by making it less responsive to transcription factors that normally activate it. Basically, the gene was switched off.

The story of how early adversity influences the epigenome is more complicated than the NR3C1 gene story implies. For one thing, it's becoming clear that early stress and deprivation can cause epigenetic changes across a much broader range of brain genes, and the effects on behavior and stress responses likely depend on much more than the NR3C1 gene.63–65 But the important point is that researchers are beginning to unravel, at a molecular level, how it is that life experiences shape the trajectories of behavior and stress responses. During a sensitive period early in life, subtle and not-so-subtle differences in how parents treat their infants can change the chemistry of the chromosomes in ways that alter how stress response genes are expressed. This sets off a cascade of cellular events that may govern how a child's brain and stress hormone system respond to challenges and threats for the rest of her life. We saw that brain development involves a set of neural commitments—the selection of one path or another—that shape how an animal (or person) approaches life. And here, at a fundamental biochemical level, is one way that the brain makes a "commitment" to a particular life trajectory. Early experience programs the stress response system, predisposing the child to a temperamental or personality style that may last a lifetime.

This is perhaps the clearest demonstration of why the age-old dichotomy between nature and nurture is a false one. It’s hard to imagine anything more fundamental to nurture than how parents treat their children. And it’s hard to get closer to nature than the molecular biology of gene expression. But now we see that these two pieces of the puzzle are inextricable. Parental care (nurture) can affect child development by regulating gene expression (nature) and altering how the brain and stress response system function. Somehow I suspect even Freud would find this satisfying.

There’s an important point to be made about normal here. It’s tempting to see this research in simple terms: a nurturing environment promotes normal development whereas adversity and deprivation produce children with abnormal or broken stress response systems. But we have to remember that development is a process of adaptation to the world. What’s normal depends on context. The developing brain makes an educated guess about what life will be like based on what it’s been like so far: Is the world likely to be nurturing and predictable or threatening and chaotic or somewhere in between? During sensitive periods of development, epigenetic changes and patterns of gene expression start calibrating the brain and the mind to the expectable world. If you’re born into a world where your caregivers are stressed, absent, or unpredictable, being vigilant and having hair-trigger stress responses may be the best way to go. In this sense, the fearfulness of Meaney’s rats and the anger-sensitivity of Seth Pollak’s maltreated children may be “normal” adaptations. But as Deidre Ward found, these adaptations can come at a cost: a predisposition to distress, anxiety, and even depression later in life.

THE PARENT TRAP

THE FINDINGS ON THE BIOLOGICAL IMPACT OF ADVERSITY WOULD seem to encourage the obsessive worry that many parents already have about making sure that every experience in their baby’s life is optimal. After her son got into trouble at school, a colleague of mine joked that his behavior problems now made sense: “I didn’t lick and groom him enough when he was a baby!”

But her comment was only partly facetious. Over the past decade, everyone from child advocates to marketing executives has used the results of developmental neuroscience to warn parents that their children may be permanently damaged if they don't provide the perfect environment for brain development.

On a spring weekend, my wife and I were surfing the web, looking at strollers for the baby we were expecting in several months. We came upon the website for “Orbit Baby,” “the world’s first rotating stroller.”66 Most strollers have the baby facing either outward or toward Mom (or Dad). The Orbit Baby allows you to rotate the baby’s seat while she is still in the stroller. The product’s website claimed that “parent-facing strollers are better for child development” and cited research to argue for the importance of their product. The research in question was a report entitled “What Is Life in a Baby Buggy Like?” and its author, Dr. M. Suzanne Zeedyk of the University of Dundee, noted that no published studies have examined the impact of stroller design on parent-child interactions and infant stress. She explained the motivation for this research by alluding to the science of early emotional development:

Infants are born with brains that are already tuned into, and dependent upon, social responses from other people. Thus, on every occasion that a baby has a need for a communicative response from his or her parent, but is unable to obtain it, this creates a low-level stress response in the infant. When such instances of stress occur repeatedly and frequently, they become damaging to infants’ neural, physiological, and psychological development. The present research project arises out of recent suggestions that baby buggies may inadvertently be generating such stressful circumstances for infants. (p. 4)

That sounds alarming—could such "low-level stress" really damage neural development? To support the claim, Zeedyk reported a study in which she had volunteers observe interactions between adults and their children as they pushed them in strollers of various designs on the streets of fifty-four UK cities. She found that parents and children spoke less when the children were facing away. Children facing their parents were also more likely to be sleeping, which Zeedyk took to be an indication that they were more relaxed and less stressed. The study also found that

there were a small minority of children who were attempting to get their parents’ attention, but failing to do so. These children will have even higher stress levels, as they seek out parents either through crying or through turning around, yet fail to obtain a response. For these children, a buggy ride may go from being stressful to being traumatic. This is not too strong a statement, because young children’s coping systems are immature. To be left on their own, coping unassisted with discomfort for too long, constitutes trauma for a young child. If parents cannot easily see their infants’ faces, they may not realize in just how much distress their children are.

To examine the question in more depth, Zeedyk's team conducted a second small study in which they assigned twenty mothers to walk with their infants (age nine to twenty-four months) in either toward-facing or away-facing strollers.

Zeedyk reported that the toward-facing strollers won out: infant heart rates were lower (perhaps because they were less stressed), mothers and infants laughed and talked more, and mothers preferred the experience. While noting that definitive claims would be premature, she concluded that “infant development may be negatively affected by buggy design” and that life in a baby buggy “may be more emotionally impoverished than is good for children’s development . . . If there is even the possibility that baby buggy design is aggravating children’s stress levels, then this is a cause for concern” (pp. 26, 27).

We may never know how many children have been damaged by buggy-induced stress, but the scourge of strollers seems more like a bugaboo. There’s a larger point here, and one that should reassure parents who fear that they must cleanse their children’s lives of any distress. Research has certainly established that major adversity early in life—trauma, abuse, neglect, and the major deprivations that come with poverty—can have enduring and problematic effects on brain development. But those insults are a far cry from the minor vicissitudes that are inevitable in any child’s life.

The notion of distress-free development is not only an unattainable ideal; there's reason to believe it's not even a worthy one. While extreme or prolonged adversity is clearly not good for development, a considerable body of research suggests that moderate stress can promote resilience and that being sheltered from all adversity may not be such a good thing. Psychologist Mark Seery and his colleagues suggest that "without adversity, individuals are not challenged to manage stress, so that the toughness and mastery they might otherwise generate remains undeveloped" (p. 1096).67 Friedrich Nietzsche may have been right when he said, "That which does not kill me makes me stronger."

“NEVER SAY THIS IS YOUR LAST JOURNEY”

THERE ARE STILL MANY MYSTERIES THAT NEUROSCIENCE AND PSYCHOLOGY have not fully explained. Why are some people sensitized by adversity while others are immunized?

Many studies of abused and neglected children have documented how adversity sensitizes their brains to fear and anger. Where others see a challenge, they see a threat. But there are others whom hardship doesn't simply sensitize to fear and anger. Surviving adversity has given them perspective and fostered a kind of resilience that allows them to not sweat the small stuff.

In 1943 Mike Bornstein was three years old, living with his parents, older brother, and grandmother in the town of Zarki, Poland. One day, his family was rounded up and sent to a labor camp, where his older family members were forced to work in a munitions factory. After several months, they were shipped in cattle cars across the Polish countryside to a cold and frightening place whose name has become a symbol of brutal inhumanity and mass murder: Auschwitz. His father and brother were murdered shortly after they arrived. Mike, his mother, and his grandmother passed day after day in a state of hopelessness, surrounded by filth and starvation. They lived in barracks that were converted stables, with nothing in them but rows of cramped wooden bunks, stacked like shelves—each three-meter bunk packed with four people. Yards away from their cells, continuous plumes of smoke rose from the crematoria where the bodies of thousands of men, women, and children were incinerated.

During Mike’s stay at Auschwitz/Birkenau, more than ten thousand were killed each and every day. The moans of the sick and dying were unrelenting. The latrine, an open bunker with a row of holes in a bench, was thick with the stench of urine, feces, and the diarrhea that came with epidemic typhus. The inmates were lucky to receive 100 grams of bread per day, and Mike became emaciated from starvation. Whenever she could, his mother would find him and give him some of her bread, but many times she was discovered by the guards and beaten for doing it. One day she disappeared. He later learned she had been shipped to a work camp in Austria.

As she watched men, women, and children being slaughtered around her, Mike’s grandmother feared he would be next. One day, she sneaked Mike into her bunk and hid him in a mattress, where he stayed hidden and survived on the rations she shared with him.

When the camp was finally liberated in 1945, Mike and his grandmother walked out, barely alive, and made their way to Czestochowa, a town near Zarki. Of the 230,000 children deported to Auschwitz, Mike was one of only 700 who were liberated. His grandmother, with little education, struggled to find work so they would have food to eat. Having nowhere for him to stay during the daytime, she would leave Mike in a chicken coop by himself. After several months Mike’s mother, Sophie, who had been liberated from the Austrian work camp, made her way back to Poland and searched for him. She finally arrived in Czestochowa, and they were reunited. But the horror was far from over.

"I was very sick, and my mother took me to Germany, where there was better medical help," Mike told me. "She didn't have any vocation, so we lived in Munich in one room that my mother rented." They had no kitchen privileges. They lived for six years in that one room, wary of the landlady, who wore a swastika charm around her neck. His mother made a little bit of money teaching Hebrew, but it wasn't enough, so she smuggled food and sold it on the black market in Munich. "She would buy flour and chocolate and nylons from American soldiers and sell it on the black market. It was a very scary time for me." Sometimes after school Mike helped his mother smuggle food. He lived in constant fear that they would be arrested.

Life in Munich was lonely and frightening for Mike, as it was for many of the survivors who had relocated there. Several of his friends at school committed suicide, jumping to their deaths. He felt like an eternal outsider. Perhaps frightened by the emaciated state she had found him in, his mother overfed him, and he was now severely obese. "I looked different from other kids; I acted different. It was easy for them to make fun of me." During the day, he would hitch a ride between home and school. One day, a man picked him up in a truck and tried to sexually molest him, but he escaped.

When Mike was eleven, he and his mother applied to emigrate to the United States. They arrived penniless and spoke no English. Through charitable services, they were put up in temporary housing in the Bowery, in Lower Manhattan. They eventually found a one-room apartment, and Mike took odd jobs to help support himself and his mother, who made $30 a week as a seamstress. At some point, Mike took a job in a pharmacy on the Upper East Side, delivering medicines, sweeping the floor, and doing whatever was asked of him. The pharmacist, a severe taskmaster, berated him for any mistakes he made but also trained him to be meticulous about his work.

The pharmacy was an exciting place for Mike, and his experience there would prove pivotal to his future in the United States. He learned English and managed to do well in high school, thanks in large part to his mother’s dedication. “My mother would sit up with me late into the night—we only had one room—but she’d make sure I did my homework. She didn’t know how to check it, but at least she’d make sure I’d stay up till ten o’clock after getting home from the pharmacy. She basically did everything in my interest. She didn’t have much, but whatever she had, she gave to me.”

When he was eighteen, Mike was accepted to Fordham University. He studied pharmacy there and then received a PhD from the University of Iowa in pharmaceutics and analytical chemistry. One day in Iowa, while at the local synagogue, he ran into a girl named Judy, whom he knew through a mutual friend. Within two years, they married. Mike had a number of jobs working as a scientist for large chemical and pharmaceutical companies in the Midwest before becoming an executive at Johnson & Johnson in New Jersey. Mike and Judy had four children—all of whom became successful professionals—and nine grandchildren. If you were to look at them, you would see the prototypical Midwestern family: close-knit, happy, and successful.

That’s not to say that Mike is in denial. He is able to talk about his experiences, has shared them with his children and even lectured about them in the Indianapolis schools. And there is one constant reminder of them that he sees every day: a number engraved on his forearm.

One day in 1981 Mike and Judy went to the movies to see The Chosen, a film adapted from the book by Chaim Potok about two boys, one Hasidic, the other a Reform Jew, who forge a friendship in 1940s Brooklyn. In one scene, a newsreel of the liberation of Auschwitz is shown—there are heaps of dead bodies and piles of shoes and eyeglasses. Finally, the camera shows a group of children crowding together with blank faces. One little boy rolls up his sleeve to bare the number on his forearm. Sitting in the theater, Mike was startled—the number was his. The child was Mike.


Mike and Judy Bornstein with one of their daughters. The background photo, taken in 1945, shows Mike (right), age five, at the moment of his liberation from Auschwitz.

I asked Mike whether he thought of himself as resilient. He thought for a moment, as though he'd never considered the question, and said that he supposed he was. He wasn't sure how he had managed not only to survive but to thrive after such a traumatic childhood. "There are two things that I keep in mind when things don't go the way I want them to go. One of them is a watch that my mother gave me when I was eighteen. She brought a couple of things from Germany and one was an 18-karat gold Schaffhausen watch that she gave me when I turned eighteen. On the back of the watch, she had inscribed the Hebrew letters gimel and zayin, which stand for gam zeh ya'avor, meaning 'this too shall pass,' and I try to remember that if things go real bad. And the other thing—if things go really bad—I like to sing a song. In Yiddish it's called 'Zog nit keyn mol az du geyst dem letzten veg,' which means 'never say this is your last journey.' And I like to sing that and it helps me overcome things that look pretty bleak. I just let things fly off me and start anew."

RISING TO THE CHALLENGE

HOW IS IT THAT A CHILD WHO SPENT HIS EARLY YEARS IN A NAZI death camp and then endured poverty and social isolation ends up happy and well-adjusted?

The question of why some people are particularly resilient in the face of stress is just as important as understanding why some people are particularly vulnerable. We need to know what genetic and experiential factors are protective rather than simply identifying risk factors. And yet we know much less about resilience than about vulnerability.

One clear resilience factor is the support and nurturing we get from other people. In the rodent studies of Michael Meaney and others and similar studies in monkeys,68 the buffering effects of maternal care are clear. Maternal nurturing seems to help program stress response systems that are flexible and efficient—turning on when they’re needed, and, importantly, turning off when they’re not. And in humans, close relationships with parents and social support can buffer the effects of adversity even among children who have genetic risk factors for depression.69, 70 Secure attachments to our caregivers can sustain us when the going gets tough, and have lasting effects on our development (about which I’ll say more in Chapter 5). One of the ingredients in Mike Bornstein’s story of resilience was the bond he had with his mother that had been developing even before he arrived at Auschwitz as a three-year-old—a mother who endured beatings to bring her son a crust of bread.

With the tools of molecular biology, genetics, and neuroimaging, researchers are just beginning to dissect the biological origins of resilience.71 And, again, the evidence points to remarkably subtle effects on how the brain responds to experience.

At one level, resilience may be related to the brain's ability to renew itself. Around the time you're born, the process of generating the neurons that build the brain (called neurogenesis) is largely over. While synapses between neurons continue to be remodeled throughout life, the neurons themselves never regenerate. Or so scientists thought until recently. It's now known that neurogenesis continues throughout adult life from neural stem cells in just two locations. The first is in the walls of the brain's lateral ventricles—part of the system through which cerebrospinal fluid flows between the brain and spinal cord. New neurons born here migrate to the olfactory bulb, where our sense of smell is processed. The second is located in a part of the hippocampus called the dentate gyrus. The hippocampus is well known to be crucial for learning, memory, and regulating stress responses, and neurogenesis here is part of the brain's response to new experiences.

Shortly after these new neurons are born, they are especially plastic—that is, responsive to stimulation and able to form new synapses with other neurons.72 As they integrate into brain circuits in the hippocampus, that extra plasticity may help them build connections that allow us to adapt to new and stressful situations. Animal studies have found that neurogenesis is crucial for the normal ability of the hippocampus to buffer the effects of stress by keeping stress hormone levels from going out of control. When neurogenesis is blocked, levels of the stress hormone cortisol stay abnormally high and animals exhibit behavioral signs of depression.73 In other words, resilience may depend in part on the brain’s ability to generate new neurons in the hippocampus.

At the same time, animal studies have shown that stress and early life adversity themselves inhibit neurogenesis. That means that stress itself can overwhelm the brain’s own resilience and coping mechanisms, taking a bad situation and making it worse. But when these normal coping mechanisms fail, there may be ways to restore them. For example, SSRI antidepressants like fluoxetine (Prozac) work in part by stimulating neurogenesis in the hippocampus.74 And even physical exercise, which also has antidepressant effects, has been shown to promote neurogenesis in animal studies.75

Neuroscientist Eric Nestler and his colleagues have discovered other key elements of resilience pathways using mice to create a model of chronic stress. In their "social defeat" model, mice are exposed repeatedly to the stress of an aggressive intruder mouse. Most mice end up falling into the mouse equivalent of despair—avoiding social contact, losing weight, losing interest in reinforcing stimuli, and showing more anxiety-related behavior. But some mice seem to be immune to the stress. Through a series of detailed experiments, Nestler's team was able to identify molecular signatures of resilience.76–78

Nestler and his team found that, in the face of stress, resilient mice were able to turn on genes in a key reward circuit, blocking a cascade of chemical events that produced anxiety- and depression-like behaviors in vulnerable mice. One of the genes encodes a transcription factor known as ΔFosB that, in turn, sets in motion synaptic changes that appear to protect brain circuits from encoding adversity. It turns out that people with depression have lower levels of ΔFosB in these reward centers, further suggesting that these chemical cascades play a key role in shifting the brain between vulnerability and resilience. Intriguingly, treatment with the antidepressant fluoxetine (Prozac) was able to turn vulnerable mice into resilient mice by switching on ΔFosB.

These and other studies have helped us develop an understanding of how genetic and epigenetic variations can have either “buffeting” or “buffering” effects on how we cope with adversity. The accumulation and interaction of these effects calibrate set points for brain circuits that influence how we appraise the challenges we face.71 For those fortunate enough to have more buffering than buffeting, the world becomes less threatening and more manageable.

Each of us lives out a unique configuration of human possibility. And it's true that some important influences on our individual trajectories can be subtle, especially early in life. They can have a "butterfly effect," in which small perturbations cascade and amplify over time. How much any given perturbation or experience will matter for an individual can be hard to predict. Clearly, beginning life in a Romanian orphanage is not the same thing as riding around in a suboptimal stroller. Regardless, Mike Bornstein's story underscores the broad sweep of normal, encompassing the vast scope of human vulnerability and resilience. One of the useful implications of exploring the biology of normal is that it broadens our perspective beyond a focus on pathology, even if, as it stands, science has yet to fully account for the remarkable resilience of the human spirit.

AN EMBARRASSMENT OF RICHES

LET’S RETURN TO WHERE WE STARTED. WE’VE SEEN THAT EARLY deprivation and adversity can have lasting negative effects on brain development. But what about children who aren’t deprived or maltreated? Could we boost their cognitive and emotional development and even give them an edge by providing their brains with the right kind of stimulation and experiences? This is the logic that fueled the appeal of Baby Einstein and Governor Miller’s push to send newborns home with classical music CDs. But there’s a problem with this logic. The research on sensitive periods suggests that more of a good thing won’t necessarily get you a better outcome. Let me explain with an analogy.

In the 1990s Americans were introduced to a new kind of bar scene that had its roots in Japan. Instead of serving alcohol, these bars offered something for the health-conscious set. They served oxygen. Patrons who bellied up to the bar were given oxygen through a nasal cannula (prongs that fit in the nose) for which they paid about $1 per minute. Proponents claimed that inhaling extra oxygen has a wide range of benefits, from detoxifying the body and reducing stress to boosting the immune system and improving mental abilities. In 1997 actor Woody Harrelson and a business partner opened "O2" on Sunset Boulevard, charging patrons $13 for twenty minutes of oxygenated air. For an extra couple of bucks, you could get your oxygen spiked with aromas like the rose-scented "Joy" or the eucalyptus-inspired "Clarity."79 By the decade's end, the oxygen bar trend was in full swing, with outlets spanning the country.

As the New York Times reported, oxygen bars traded on a simple idea: “if oxygen is good for life, more oxygen must be better.”78 After all, we know that when people are deprived of oxygen, their bodies and minds suffer and, if the deprivation is severe enough, they die. If you are deprived of oxygen or have a lung disease or heart disease that interferes with your body’s ability to get oxygen, extra oxygen could make a big difference.

But any physician could tell you that as long as you don’t have respiratory problems, you get plenty of oxygen from simply breathing the air around you. Oxygen is carried throughout the body by hemoglobin, and hemoglobin has a certain capacity or limit to the amount of oxygen it can carry. Under normal circumstances, that capacity is nearly saturated. If I measured the oxygen-saturation level of your blood right now, it would likely be somewhere in the range of 97 to 99 percent. And that’s all you need to deliver adequate oxygen to your brain and other tissues. Inhaling extra oxygen might push you from 97 percent to 100 percent, but that difference wouldn’t matter. The human body has evolved to be efficient at extracting oxygen from ambient air and giving it supernormal shots of oxygen won’t make your body supernormal. In fact, too much oxygen can be harmful. The fallacious “more is better” premise of the oxygen bar resembles the claims made by those promoting supersized stimulation for young children. Some advocates of early enrichment claim that providing extra cognitive, emotional, or social stimulation during sensitive periods can give children the edge they need to outpace their peers later in life.

But remember, the sensitive periods for cognitive and social development occur because children are passing through an experience-expectant phase of development. Evolution has prepared their brains to be open to an expectable environment. If the basic elements of that expectable environment are present, the brain gets what it needs. There is little evidence that going beyond the expectable environment (and exactly what that would entail is not clear) will make the brain excel. The Romanian orphanage studies have shown that social deprivation, like oxygen deprivation, can be harmful and that providing a normal social environment can result in dramatic benefits. But that doesn’t necessarily mean that starting with a normal environment and trying to make it supernormal would have any detectable effect. The impact of enrichment depends on where you’re starting from.

Advocates of cognitive enrichment for everyone—including marketers trying to sell baby products designed to make your baby smarter or happier—often point to animal studies suggesting that increasing the complexity of the environment can promote brain connections and enhance cognitive performance. I'm not disputing that, but let's dig a little deeper than the headlines. Part of the anxiety parents have about optimizing their children's first few years of life comes from a misunderstanding about brain development. As John T. Bruer describes in his book The Myth of the First Three Years,81 journalists, policymakers, and activists have abetted this misunderstanding by reading too much into the science.

First, there is the misconception that children have a fixed window of opportunity (the first three to five years of life) during which they must develop key cognitive and social skills. In a 1997 address to the National Governors Association, Rob Reiner, the actor/director who became a child development activist, claimed that "By age ten, your brain is cooked."82 But that isn't true. We know that brain development is not restricted to an early critical period. Through the mechanisms of experience-dependent learning, we continue to learn and adapt to our environment throughout our lives.

It’s true that many studies of rats have reported that environmental enrichment boosts their cognitive skills and that this is accompanied by synaptic plasticity and changes in the wiring of their brains.83 It may be true that these rats’ experiences have been enriched, but the question is, enriched compared to what?

Typically, enrichment in these studies refers to adding complexity to a rat’s environment beyond what they get from standard caged housing. Even William Greenough and his colleagues, who reported some of the influential studies of this phenomenon, acknowledged that “these conditions represent an incomplete attempt to mimic some aspects of the wild environment and should be considered ‘enriched’ only in comparison to the humdrum life of the typical laboratory animal” (p. 546).32 In this sense, going from a standard cage to an enriched one might be more analogous to taking an institutionalized infant and placing her in foster care. That is, it’s really more like going from a deprived environment to a normal, expectable environment. We already know that alleviating deprivation is good for the brain. But that’s a far cry from saying we can make brain development better than normal by manipulating the environment. This isn’t to say that intervening during sensitive periods can’t enhance brain function; but, again, it matters where you’re starting from. There is clear evidence that early educational interventions can have lasting cognitive and behavioral benefits for economically disadvantaged children.84 Enrichment can certainly be powerful when the environment is not good enough; but trying to go beyond that may have diminishing returns.

On the other hand, for children who are disadvantaged or raised in highly stressful environments, the biology of sensitive periods offers an important policy insight for optimizing intervention programs. Educational and social programs that aim to give these children a head start should be informed by what we are learning about the timing of brain development. Rather than simply enriching the environment as early as possible, these programs could tailor interventions to specific sensitive periods when they are likely to have the most potent effects. Key mental functions—language, attachment, social cognition, executive functions, and so on—have their own developmental periods during which the brain is exquisitely sensitive to the environment. It stands to reason that efforts to foster and protect these functions will have the biggest bang for the buck if we work with the brain’s own timetable of plasticity.

A GPS IN THE BRAIN

AND, FINALLY, IT’S IMPORTANT TO REMEMBER THAT PLASTICITY doesn’t end when sensitive periods pass—the brain is not “cooked” by the time we bid childhood good-bye. In fact, any episode of learning from experience involves changes in the brain. That’s what the idea of experience-dependent plasticity is all about. Even if our past experiences deprived us of a “good enough” environment, we can change and learn new ways of coping. In essence, that’s the premise and promise of psychotherapy.

Some of the most dramatic demonstrations of how learning can change the brain involve the development of specific skills and expertise. Take the case of spatial memory—that is, our ability to remember where things are. Since the mid-nineteenth century, a select group of people in London has acquired what may be the greatest talent for spatial memory ever achieved. These people are not supergeeks with outsized IQs. They are cabbies. And the body of knowledge they have mastered is known, appropriately, as “The Knowledge.”

If you want to get a license to operate a black cab in London, you have to memorize the layout of twenty-five thousand city streets and thousands of destinations.85 In other words, aspiring cabbies have to develop a GPS system in their brains for every route in the London area. The daunting process of acquiring “The Knowledge” typically takes two to four years of training. Those who get through it and pass written and oral exams are granted a license by the Public Carriage Office. By that point, the cabbies have an encyclopedic knowledge of London’s streets. But they also have something even more remarkable: remodeled brains. Brain-imaging studies have shown that cabbies with the Knowledge have thicker gray matter in the posterior hippocampus, a brain region involved in processing spatial memory, compared to controls. The longer they had been cabbies, the thicker their gray matter was.

They have accommodated the demands of storing the details of London’s roadways by devoting more brain territory to it. That visible change in brain structure seems to be a result of the vast body of spatial information they have to acquire. In contrast, bus drivers—who have to memorize a few specific routes—don’t show such changes, nor do people who memorize nonspatial information like doctors or even contestants in the World Memory Championships.85

Similar findings have been reported for a wide range of skills and talents. Structural and functional changes are visible in brain scans of ballet dancers, golfers, basketball players, and people who learn new languages or musical instruments.86–90

WRITTEN IN PLASTIC NOT STONE

IN THIS CHAPTER, WE’VE SEEN THE POWER OF EXPERIENCE TO shape the function of the brain and the architecture of the mind. Ultimately, our sensitivity to experience is an evolutionary solution to two daunting problems: a finite genome and a changing world. Our genetic endowment provides a general plan for wiring brain circuits. But the genome can’t prepare us for the full range of contingencies we might encounter. If the wiring of the nervous system were fixed by a genetic program, we’d be up a creek—totally unprepared for the challenges the environment can throw at us. And so natural selection has solved this dilemma in an ingenious way. Instead of trying to cover all the possible “if-then” instructions we might need, the genome encodes the ingredients for plasticity.

Plasticity has a sequence of its own. For certain key capacities—learning to see, learning a language, forming emotional attachments, understanding social cues—the brain uses the environment to wire itself. Because these functions are so important, little can be left to chance. And so, early in life, we have sensitive periods when the brain is exquisitely attuned to the environmental inputs it needs to set up these capacities. Like the most sophisticated computer imaginable, the brain mines the data that it receives to build representations of the world around it. It’s a remarkably efficient plan for getting a lot done quickly while adjusting to the facts on the ground. Of course, it also creates windows of vulnerability.

If the inputs are corrupted or the wiring goes awry, trajectories can be distorted in ways that have cascading effects. Caught early on, while the brain is still capable of large-scale adaptation, these distortions may be remediable—as the Romanian orphan studies have shown.

But ongoing experience-dependent plasticity allows us to fine-tune the wiring of circuits as we adapt to the circumstances of our lives. And so, as I suggested at the beginning of this chapter, both Freud and Watson had a point. As Freud would have it, infancy and childhood are privileged periods. The existence of sensitive periods means that fundamental aspects of how our lives turn out depend on what the world was like as we passed through these windows. But as Watson emphasized, we can learn and change in important ways well beyond those early years.

In this chapter and the last, I’ve introduced the major players that drive the biology of normal: natural selection, genetic variation, and experience. In the next several chapters, we’ll explore what happens as we bring these influences to bear on some of the universal challenges we face, how they shape the trajectory of our lives, and what happens when that trajectory goes awry.


* While the evidence for an early effect of music listening on brain development and IQ has been mixed, there is a more consistent set of findings suggesting that early musical training can enhance musical aptitude, appreciation, and performance.

* Other analyses suggested that the content of TV shows that infants and toddlers watch is relevant. For example, heavy TV viewing before age three has been associated with attention problems in later childhood,19 but the effect seems to hold only for noneducational TV.20 Research on toddlers has shown that watching certain TV shows—Sesame Street, Blue’s Clues, Dora the Explorer—is associated with improved literacy and language skills. These shows tend to elicit the child’s participation, offer a clear story line, and avoid overstimulation.21

* The work for which they received the Nobel Prize also involved the first detailed description of how and where visual information is processed in the visual cortex of the brain.

* In addition to statistical learning, social interaction plays a key role in language acquisition. Mothers in cultures around the world use a particular speech pattern when talking to their infants. This “motherese” involves exaggerating phonemes of the native language, making it easier for the baby to pick them out. In other words, motherese facilitates the baby’s neural commitment to its native language.35

* Because there was very little foster care available in Romania at the time, the investigators actually created their own foster care program. They established a child care network, including fifty-six foster families, and provided financial support, training, and close supervision in collaboration with trained Romanian social workers.

* Actually, there are exceptions: mature red blood cells and platelets have no nucleus and thus lack a genome.