DOGS, POKER, AND AUTISM: THE BIOLOGY OF MIND READING
ON AUGUST 11, 2006, A NEW YORK TIMES ARTICLE about the finals of the World Series of Poker began:
When Jamie Gold bluffed, his opponents folded. When he had the best hand, they threw in all their chips. With a run of cards, a huge chip stack, and an uncanny knack for reading other players, Gold, a talkative former Hollywood talent agent, cajoled his way to victory Friday at the World Series of Poker for the $12 million grand prize.
Between 1903 and 1910, Cassius Marcellus Coolidge painted a series of pictures that are still among the most widely reproduced and copied American oil paintings. The image of dogs playing poker may be a symbol of kitsch for most, but it could also serve as an emblem for the fascinating science of social cognition. It turns out that dogs and champion poker players have something in common: they’re both skilled at reading people’s minds. Far from being a dubious power of those who claim to have ESP, mind reading—that is, deciphering the thoughts and feelings in other people’s minds—is in fact a universal, even essential, skill of the human brain. Think about it—how could you function if you were unaware that your spouse had her own thoughts and feelings? (When I posed that question to my wife, she said, “You seem to function just fine.”)
At some point in your childhood, you came to understand that others have their own thoughts, intentions, and beliefs. That understanding, which psychologists refer to as a theory of mind, is essential to almost every social encounter we have from early childhood onward—from appeasing a bully on the playground, to getting a date in high school, to negotiating with the boss at work.
As we’ll see, the human brain has been shaped by natural selection to be able to monitor and respond to the activity of other human brains. Evolution has given us a social sense, specialized for navigating the world of human interaction. Our theory of mind and capacity for empathy are mental tools that allow us to compete and cooperate in a social world. They are so essential to how we operate that they have to be effortless; they come standard as part of the brain’s basic package. This chapter is about how this social sense develops normally and what happens when it doesn’t.
“FACE” FACTS
OUR JOURNEY INTO OTHER PEOPLE’S HEADS BEGINS WITH THEIR faces. Facial expressions are the outer windows into other people’s minds. We are able to form emotional first impressions based on viewing a face for only thirty-nine thousandths of a second.1 But faces also give us vital information about the social environment at any moment. The ability to recognize faces quickly allows us to instantly judge whether someone is kin, friend, or stranger. We look at their eyes to figure out what they are attending to: Is she watching me? Is he looking at something that I need to know about—an approaching threat? A source of food? Not surprisingly, we are expert at recognizing and decoding faces. We have to be. But how do we get there?
One view is that we’re particularly adept at recognizing and reading faces because we have to do it all the time. In other words, the brain has the capacity to process all kinds of things, but it acquires face expertise because it’s called on to process faces every day. If we lived in a world where we had to recognize luggage every day, we’d become equally expert at that (if you think back to the last time you tried retrieving your black suitcase at an airport baggage claim, you’ll realize we lack this skill). The other view is that our expertise for faces is an innate skill that develops early in life.
Either way, a large body of research now shows that our brains have a biological system for processing faces. For one thing, brain abnormalities can selectively knock out face-recognition skills. People who have this condition, known as prosopagnosia, have problems recognizing faces even though they can recognize other objects.2 The problem can be acquired—for example, as a result of brain injury or stroke—but the most common form, developmental prosopagnosia, is present from birth. People with developmental prosopagnosia have no apparent brain damage—they grow up with this face-blindness and may not even realize they have it until their deficit collides with social norms. Bradley Duchaine, an expert in developmental prosopagnosia, has heard plenty of these stories—many of them offered by visitors to his website. As one woman put it, “This week I went to the wrong baby at my son’s daycare and only realized he was not my son when the daycare staff looked at me in horrified disbelief” (p. 166).3
After Duchaine and his colleagues’ work received media coverage, a strange thing happened. They began to hear from people who said they had the opposite of prosopagnosia. Instead of not being able to recognize faces, these people claimed to have supernormal powers of face recognition. Intrigued, Duchaine, along with Harvard colleagues Richard Russell and Ken Nakayama, decided to put these claims to the test. They brought four of these people into the lab to assess their face-recognition skills. The subjects told stories of how their superskills were a decidedly mixed blessing. As one said: “I’ve learned to stop surprising people with bizarre comments like, ‘Hey, weren’t you at that so-and-so concert last fall . . . I recognize you.’ Before that, I’d occasionally make people uncomfortable with my recognition.” Another said, “I do have to pretend that I don’t remember [people], however, because it seems like I stalk them, or that they mean more to me than they do when I recall that we saw each other once walking on campus four years ago in front of the quad!”4
To see if these people were really better than normal at recognizing faces, the researchers had to develop special tests. One of the tests was straight out of the pages of People magazine. They showed subjects pictures of famous people “before they were famous” and asked them to identify each celebrity. Some of the pictures were photos from childhood and were cropped to make them extra hard to identify (the figure below shows four examples from the test set—see if you can identify them).
Examples from the “Before They Were Famous” test. See end of chapter for answers.*
The tests confirmed that these subjects were extraordinarily good at face recognition: they far outperformed normal control subjects, and the researchers dubbed them “super-recognizers.” In fact, they seemed to be as far from normal on the superior side as prosopagnosics were on the impaired side.
The existence of prosopagnosics and super-recognizers may be more than just a biological curiosity. Perhaps these individuals define the extremes of a basic mental function that we all use to establish social connections. In fact, there does seem to be a spectrum of face-recognition ability—some of us are better at it than others, and a study of twins found that where people lie on this spectrum is almost entirely due to genetic differences.5
In the late 1990s, MIT neuroscientist Nancy Kanwisher began using fMRI to search for the brain’s face recognition center. She found that an area within the fusiform gyrus of the temporal lobe responded selectively to pictures of faces.6 This fusiform face area (FFA) is the hub of a cortical brain network that activates when we look at faces.7 These areas communicate with subcortical regions, including the amygdala, that provide a fast read of the structure and emotional salience of faces.
As it turns out, learning how to process faces seems to involve both innate, face-specific brain mechanisms and learned expertise—both nature and nurture. In children, the development of what has been called “the social brain” follows a path from simple attention to faces and emotion perception to sophisticated mind reading and empathy within a span of just a few years.
Within minutes of being born, neonates are drawn to face patterns. We’re born with the basic circuitry to process social information, but experience tunes it to the social world around us. In fact, the face processing network continues to develop after early childhood and doesn’t become fully specialized for reading faces until we’re about ten years old.8
As we encounter the social world, our neural networks sharpen their responses and become highly efficient and specialized. In essence, the social environment trains the brain using an innate network that is loosely wired from birth. As we saw in Chapter 3, for key functions of the mind and brain like reading faces or learning a language, we begin with a slate that is not blank but broadly tuned—a brain that is biased, as a result of natural selection, to attend to certain expectable cues (like faces or speech) from the environment. During experience-expectant phases, these environmental cues help strengthen some synaptic connections and let others get pruned away. Experience guides brain circuits to make commitments by focusing on some kinds of information at the expense of others.
Around six months of age, with some face time under their belts, babies typically begin to show several milestones of social cognition. They are able to recognize the face they’ve had the most experience with—Mom’s—and they begin to distinguish positive and negative emotional expressions on other people’s faces.9 They also develop an ability to read eye gaze—that is, to follow the gaze of an adult who has made eye contact with them.10
Soon after gaining these rudimentary abilities, infants begin to acquire a set of cognitive skills that are characteristically (and, perhaps, uniquely) human. One of these is the capacity for joint attention—a mental breakthrough that transports the child beyond the world of just “you” and “me” to the realm of “you,” “me,” and “that.” “That” is something we are both looking at or paying attention to. By twelve to fifteen months, typically developing infants understand that adults are not only looking or pointing somewhere, but directing their attention at something interesting.11 Not coincidentally, it’s at this age that infants around the world acquire one of their own tools for joint attention—they begin to point.12 Joint, or shared, attention is a deceptively simple concept that represents some pretty sophisticated mental abilities. It implies that you and I are distinct beings and that there is a world outside us. Joint attention also requires tracking your attention relative to mine, recognizing the significance of your eye movements or pointing, and coordinating our attention to focus on something else.13 It also precedes and perhaps enables a whole suite of activities that only humans do. Without an ability to share attention and information, human societies as we know them would never have happened.
There are a few features of human life that are qualitatively different from the rest of the animal kingdom, and the creation and persistence of human cultures is perhaps the most dramatic and far-reaching of these. Other groups of mammals develop sporadic local traditions, and wild chimpanzees even exhibit complex customs that spread throughout a community—particular styles of tool use, foraging strategies, and even social customs like grooming behaviors. Some even call these traditions “cultures.”14, 15
But the richness and variety of human activities, the breadth of their reach throughout human populations, and their propagation across the centuries are unparalleled. Human societies have a vast array of social and behavioral customs and values—eating habits, bathroom habits, food preferences, religious beliefs, aesthetic ideals, dating and mating customs, child-rearing practices, and moral proscriptions. Some of these cut across geography (the use of utensils for eating or the use of symbols for communication) and others may be unique to a particular society (how to hold a fork “properly” or how to address an elder). Our success or failure as members of a culture and even our survival may depend on mastering these things.
But life is short, and it would be simply impossible to assimilate everything we need to know by trial and error or even simple observation and imitation. How do we do it? How does a child climb this impossibly steep learning curve and still have time to sleep? The answer, in large part, is that our brains are structured to make use of a uniquely human shortcut: pedagogy.16
SEE ONE, DO ONE, TEACH ONE
ALL MAMMALS LEARN, BUT ONLY HUMANS TEACH. ONE CHIMP CAN learn to use a stick to pull ants out of an anthill by imitating another chimp. But chimps don’t have a way of communicating, “Here’s what you need to know . . .” or “Show me that again” or even “Watch this!” And it’s not just because they lack the words.
Only humans seem to have the conscious and deliberate motivation to share information.11, 12, 17 And a fundamental building block for this uniquely human brain adaptation is joint attention—it allows you and me to exchange information about the rest of the world. Couple that with another uniquely human capacity—spoken language, which, like joint attention, begins to develop at twelve to fifteen months—and we can share our knowledge, transmit it across generations, and build the complex structures of human culture. Babies expect this kind of information because their brains are tuned to recognize teachable moments.
Sharing attention is also the germ for sharing intention, the basis of cooperation and collaboration. Humans share goals and make plans in ways that even our closest primate cousins—chimpanzees and bonobos—don’t seem to. Despite the fact that their genomes are roughly 99 percent identical to ours, chimps lack the key mental capacities that we use effortlessly to collaborate: they don’t speak, point, or even smile—behaviors that human infants universally exhibit by about fourteen months of age.
So, from very early on, our brains are wired to process faces, to engage with others, and to communicate with one another about the world. But developing a theory of mind—an understanding of the mental states of others—requires something more than that. We need to think about thinking.
TALK TO THE BANANA
IMAGINE THE FOLLOWING SCENE: YOU’RE SITTING IN A RESTAURANT, about to have brunch with your mother and one of her friends. Things are going well until her friend picks up a banana and presses it against the side of her head. Next thing you know, she’s staring at you with an exaggerated grin and talking into one end of the banana. Instead of being alarmed, your mother picks up another banana, puts it to her ear, and seems to be getting messages from the banana. Then she presses it to your ear and says, “It’s for you.”
What the hell is going on? Are these people psychotic? As a psychiatrist, if I were passing by the table and witnessed this, the thought might cross my mind. Except for one thing: Did I mention that you are two years old? That little detail transforms the scene from being rather bizarre to utterly unremarkable. Chances are a scene like this happened thousands of times in your own childhood. It’s called pretending and it’s a universal part of childhood. There’s nothing strange or puzzling about it. Or is there?
If you think for a moment, pretending could be a disaster for children. As a two-year-old, you’re immersed in a crash course on reality. Your job is to learn how the world works, understand what things mean and what they do, and learn how to predict the behavior of those around you. If half the time your parents, siblings, and peers are talking into bananas, staging faux tea parties, and mooing like cows, how in God’s name are you supposed to make sense of the world?
The truth is, of course, that children don’t get cognitively derailed by pretense. In fact, pretending may be an essential step along the way to developing a social brain. Babies around the world begin pretending between eighteen and twenty-four months of age, whether or not their parents encourage it.18 Alan Leslie, a psychologist and expert on the development of pretend play, has pointed out that pretending is an inherently social activity that virtually every child begins to engage in around the same age.19
Leslie suggests that when a child is engaged in pretending, she mentally puts quotes around some behavior or thing, a capacity that involves creating a “metarepresentation.”20 Our minds must conceive of a mental world that can be different from the physical world. When we perceive things or people, we form a mental representation of them (“that’s a banana”). This kind of direct representation allows us to learn about the world. But when we think about the contents of other people’s minds, we need another layer of representation—we need to put quotes around something: “she is pretending that the ‘banana is a phone.’ ”
This capacity to form metarepresentations is what links pretending to the more general human capacity for theory of mind. There is only a small step between “she is pretending that” and “she believes that.” That’s why they call it make believe. In Alan Leslie’s account, pretending is the playground where we learn to think about thinking.
THAT’S WHAT YOU THINK!
HOW DO YOU KNOW IF A CHILD UNDERSTANDS THAT OTHER PEOPLE have their own minds? To many researchers, the strongest test is whether the child understands that someone else can have a false belief. For a child to have this capacity, he must recognize that someone else can see the world differently, and he must also imagine what’s in someone else’s mind (a metarepresentation). The classic demonstration of this developmental milestone involves a kind of pretend game called the “Sally-Anne task.” In this game, an adult shows a child two dolls—Sally and Anne—and has the Sally doll place a marble in a basket. The adult then takes Sally away and has Anne move the marble from the basket to a box. Then the adult asks the child, “When Sally returns to the room, where will she look for the marble?” A typical three-year-old will say “In the box,” which is where the marble really is. But by age four or five, most children understand that Sally will have the false belief that the marble is where she left it—in the basket. The child understands that Sally’s mind contains beliefs that can be different from reality (and different from his own).
More recent research indicates that children can attribute false beliefs to others quite a bit earlier than age four or five. By designing experiments that don’t require the child to manage lots of information at once, studies have now shown that false belief detection can be present as early as thirteen months of age, although theory of mind skills clearly become more sophisticated as the child develops.21
That simple transition to understanding that other people have their own thoughts and beliefs opens the door to the whole world of social relationships. The operation of our theory of mind has been called mentalizing or mind reading to emphasize that it’s about getting inside someone else’s head, reading or inferring their mental states. A theory of mind allows us to cooperate and compete, to recognize other people’s motivations and beliefs, to predict how they will behave, to empathize and trust, to deceive and avoid being exploited.
Without the capacity to infer motivations, beliefs, feelings, and other mental states, we wouldn’t be able to create or appreciate literature, theater, or art. In fact, a theory of mind is so fundamental to how we function that it was only identified as a subject for research about thirty years ago. Like the purloined letter of Poe’s story that I mentioned in the prologue, it was so self-evident that it was almost invisible. We do it effortlessly. We can’t help but mentalize—that is, infer mental states—when we see behavior. In 1944 psychologists Fritz Heider and Marianne Simmel provided the classic demonstration of this phenomenon when they showed people a two-minute film of two triangles and a circle moving around a rectangle and asked them to describe what they saw.22 Almost all of them described the action in terms of animate beings that had feelings and intentions. If you want to see how automatic the impulse to ascribe mental states is, put the words Heider and Simmel into YouTube and watch the film. Even if you try not to, I’ll bet you can’t resist seeing the large triangle as menacing.
THE EVOLUTION OF MIND READING
WHERE DOES OUR THEORY OF MIND COME FROM? AT LEAST PART of the answer seems to be that natural selection created the genetic blueprint for brains that can peer into other brains.
The phrase “theory of mind” first appeared in a 1978 paper entitled “Does the Chimpanzee Have a Theory of Mind?” by David Premack and Guy Woodruff at the University of Pennsylvania.23 They called the capacity to impute mental states a theory of mind “first, because such states are not directly observable, and second, because such systems can be used to make predictions, specifically about the behavior of other organisms” (p. 515).23
In their original study Premack and Woodruff tested a laboratory chimp named Sarah by showing her videos of a man in a problematic situation—for example, standing under a bunch of bananas that were suspended from the ceiling, out of his reach. Then Sarah was shown two sets of pictures, one of which showed the man solving the problem (for example, by standing on a box), and the other of which did not. Her task was to choose the “correct” picture. Admittedly, Sarah was no ordinary chimp—for one thing, we’re told, “she had had extensive prior experience with commercial television.” Sarah chose the correct solution almost every time, implying that she understood that the human wanted the food and was trying to solve the problem. Their study certainly didn’t settle the issue of whether chimps have a theory of mind, but it launched a whole field of research.
Summarizing the thirty years of studies that followed,24 psychologists Josep Call and Michael Tomasello concluded that the answer to the question of whether chimpanzees have a theory of mind is “yes and no.” Chimps seem to be able to tell when a human is choosing to do something as opposed to doing it unintentionally, and they seem to be able to distinguish positive and negative emotional expressions.25, 26 Like human infants, they are also capable of appreciating someone else’s perspective. For example, given a choice between reaching for food that a rival can see and food that is hidden from a rival’s view, chimps will reach for the hidden food. In other words, they can track what someone else sees, hears, or knows and use that information to avoid a fight.24 Some chimps will even use deception by going out of their way to hide their efforts to get at the food if a rival is watching.27
But can chimps pass the classic test for a full-blown theory of mind? Can they understand the concept of false beliefs? Here, at last, may be the Rubicon separating our minds from our hairy cousins’. When chimpanzees are given tests analogous to the Sally-Anne task, they don’t get it—they can’t conceive that someone else believes something that isn’t true.24 So the evidence supports David Premack’s rule of thumb: “Concepts acquired by children after three years of age are never acquired by chimpanzees.”28
Why do we care if apes are able to think about thinking? Well, for one thing, it tells us something about the evolutionary history of our own minds. The chimp research suggests that theory of mind is a relatively recent evolutionary development. Tomasello and his colleagues speculate that about 150,000 years ago, when humans lived in small groups, the fitness advantages of cooperation created a selection pressure to collaborate.11 Groups that hunted and gathered together beat out those that lived by the creed of “every man for himself.” This shift toward cooperation required not simply predicting what another member of the group would do, but understanding their goals and intentions and aligning yours with theirs. And with that, the modern human mind was born—a mind that could read other minds and that was motivated to share information (the building blocks of human culture). But there is also fascinating evidence that some elements of mentalizing developed independently—by a process biologists call convergent evolution—in animals that are far more distant from us than the apes. For example, the Western scrub-jay, a bird in the crow family, is able to hide and guard its food by keeping track of what rival birds have seen and know about where the food’s been stashed.29 But there’s another species whose social cognition skills may surpass even those of chimpanzees, and chances are, at some point, you’ve had one of these creatures in your home.
“TIMMY’S IN THE WELL?”
JON PROVOST, WHO PLAYED TIMMY ON TV’S LASSIE, TITLED HIS autobiography Timmy’s in the Well as a nod to the iconic scene in which Lassie saves Timmy by getting help:
LASSIE: Bark! Bark-Bark! Bark!
ADULT: What, girl? Timmy’s in the well? Go get a rope!
LASSIE (returning with rope in mouth): Bark-bark!
The irony is that, of the many scrapes Timmy got into, falling in a well was not one of them. But Lassie’s uncanny ability to read people’s thoughts, empathize, and engage in interspecies communication was the essence of the show. In a sense, Lassie was a TV show about doggie social cognition. Lassie had it all: joint attention, shared intentionality, and a sophisticated theory of mind. And recent scientific evidence suggests that there was a kernel of truth to Lassie’s mental sophistication.
It turns out that the domestic dog has some humanlike social skills that even apes can’t match. The clearest demonstration of this involves an experiment called “the object choice task.” The experimenter places two opaque containers on the floor and puts a piece of food under one of them. The test subject, say a chimpanzee, is brought into the room and the experimenter cues him about the food’s location by looking at the correct container or pointing to it. Despite their mental talents, chimpanzees simply don’t get it—they can’t make use of human communication signals. But most dogs have no problem picking the right container.30, 31
“But wait a minute,” you might say. “Of course dogs do better than chimps—most dogs spend lots of time around humans, so they learn how to read human signals.” Makes sense, but that doesn’t seem to explain their social skills. In 2002 Brian Hare, along with Michael Tomasello and others, reported a series of experiments in the leading journal Science that tested whether dogs’ ability to read human social cues is unique and innate.30 First, they showed that on the object choice task, dogs outperformed chimps and even wolves, their closest evolutionary ancestors. But are dogs just better than wolves or chimps at reading human social signals because they have more exposure to humans?
To answer that, the researchers went one step further. If reading human minds were all a matter of experience and training, dogs with more experience should do better than dogs with little or none. To test this, they ran puppies through the object choice task. Like adult dogs, the puppies were able to understand the human signals, and how much human contact they’d had made no difference. So there is something special about the social skills of dogs. They are better than chimps and even wolves at reading human behavior and, as the puppy experiments showed, their human-reading skills seem to be innate.
What accounts for this ability? Call it a kind of “unnatural selection.” It’s no accident that the domestic dog is “man’s best friend”: we made him that way. Modern genetic analyses suggest that dogs originated about fifteen thousand years ago when humans began domesticating wolves, their evolutionary ancestors.32, 33
At least two ingredients combined to provide the raw material for transforming wild wolves into domesticated dogs. First, wolves were exposed to human social groups, which collect and often discard food, providing a ripe opportunity for animals that were inclined to seek out and scavenge human leftovers. Migrating humans, in turn, could have used help in carrying, hunting, and guarding their resources. And so a niche was born.
The second ingredient—genetic variation—allowed some enterprising wolves to enter that niche. Presumably, some wolves carried genetic variants that allowed them to approach rather than avoid or attack the humans they encountered. They were rewarded with a replenishing supply of scavengeable food. The advantages bestowed on these “protodogs” helped them to flourish and favored the selection of those who could accommodate to the human environment. Once the process of domestication got under way, the theory goes, humans selected those dogs that were least aggressive and most cooperative. Somewhere between one thousand and five thousand years ago, the human-dog partnership took a leap forward when people began selectively breeding dogs based on their appearance, behavior, and ability to do work.* As the human-dog partnership strengthened, dogs developed brains with the specialized social cognition skills they needed to herd, work, and just know when we need a friend. In a sense, dog breeding became a kind of tool making in which the tool was another animal’s mind.
Apart from providing a fascinating story, the mental skills of dogs are important because they support the conclusion that genetic selection is a key to understanding the biology behind mind reading. Domesticated dogs are better at reading social cues than their wild counterparts, suggesting that the genetic selection that occurred during domestication shaped the social brain. But, as compelling as it is, this conclusion is still largely based on circumstantial evidence, and recent work suggests that both dogs and wolves vary in their ability to read human cues depending on their own experience with people.35 To really study whether genetic selection can shape social cognition, you’d want to measure social skills before and after a species undergoes domestication.
TWENTIETH-CENTURY FOX
ONE OF THE MOST DRAMATIC EXPERIMENTS IN THE EVOLUTION OF social behavior came from a most unlikely place—the fox farms of Estonia. In the 1950s a Russian geneticist named Dmitry K. Belyaev was rebuilding his career in the aftermath of a dark chapter for Soviet biology. Josef Stalin had placed the nation’s biological sciences under the direction of Trofim Lysenko, an authoritarian anti-intellectual who rejected classical Mendelian genetics in favor of pseudoscientific theories about the inheritance of acquired characteristics. Challenging Lysenko’s brand of genetics became a criminal offense, and dissenters were imprisoned . . . or worse. Belyaev’s research on classical genetics led to his removal as director of a fur-breeding laboratory in Moscow. He moved to Siberia, where he studied how to enhance fur breeding.36, 37 Fortunately, his interests in genetics had a practical application. The silver fox was prized for its fur but the animals were aggressive and difficult to manage. Taming the silver fox would be a boon for breeders and farmers.
In 1959 Belyaev launched an experiment that continues to this day. He was intrigued by the fact that domesticated animals look quite different from their wild cousins. Belyaev hypothesized that the process of selection for tame behavior acts on genes that affect the development of both emotional and physical traits.
To test this theory, Belyaev obtained silver foxes from a fox farm and began breeding them for their behavior. At the start, most of the foxes were pretty nasty creatures—aggressive and fearful of humans.38 From each generation, Belyaev selected and bred the tamest (least aggressive) foxes. The goal, he wrote, “was, by means of selection for tame behavior, to obtain animals similar in their behavior to the domestic dog” (p. 302).38 After forty generations of selection, a remarkable thing happened: the silver foxes had become . . . dogs. They became playful, they cuddled and licked their human handlers, they wagged their tails to express pleasure. But even more startlingly, they took on the physical characteristics of dogs: their pointy, upright ears became floppy, their long bushy tails shortened and curled up like a dog’s, they developed patches of light fur, wider faces, and shorter, doglike snouts.37
What’s more, the domesticated foxes seemed to acquire the dog’s ability to read human signals. In a head-to-head comparison on the object choice task, domesticated fox kits performed better than undomesticated fox kits and just as well as puppies at understanding human pointing gestures.39 So the process of breeding for tameness—a form of temperament—seemed to have some dramatic side effects, including the emergence of social cognition related to theory of mind.
Putting the evidence together, Brian Hare and Michael Tomasello proposed that social cognition in dogs initially evolved as a by-product of selection pressure on temperament and its underlying emotional brain circuits (which I described in Chapter 2).31 The main goal of domestication is to reduce emotional reactivity (aggression and fear behavior). But the side effect of this recalibration of emotional brain circuits and stress hormone systems was the development of a kind of social intelligence—the capacity to recognize and respond to the intentions and desires of other animals. If, as Hare and Tomasello claim, something like this also happened in humans, then the foundations of our theory of mind may have been a side effect of natural selection for anger management.*
As our primate and hominin ancestors faced the challenges of social group living, the ability to impute mental states to others would have provided a compelling, even transformative, fitness advantage. Animals able to mind read would be able to predict behavior, to cooperate, to deceive, and to teach. Once the rudiments of these skills took hold, there would be powerful selective pressure to enhance them into a full-blown theory of mind.
BRAIN ON BRAIN ACTION
WHERE IN OUR BRAINS DO WE THINK ABOUT WHAT’S GOING ON IN other people’s brains? Rebecca Saxe, a neuroscientist at MIT, has been studying the neural basis of social cognition for most of her career. As a graduate student, Saxe began searching for a region of the brain that is uniquely active when people think about the mental states of others. Because typical theory-of-mind tasks involve a whole host of features that stimulate a wide range of brain circuits—people behaving, responding to visual and social cues, conducting causal reasoning, and forming mental representations—her challenge was to separate out brain activity that reflects thinking about mental states per se. In an elegant study, Saxe and Nancy Kanwisher42 presented subjects with a range of stories that systematically isolated each of these features and measured the subjects’ brain function using fMRI. They discovered that a region of the brain at the intersection of the parietal and temporal cortex called the temporoparietal junction (TPJ) is specifically engaged when we think about another person’s mind.
The key elements of the brain’s social cognition network. The bolded areas are crucial for theory of mind.
Though the TPJ is essential to mentalizing, it is only one hub in a larger theory of mind network that includes the medial prefrontal cortex and the posterior cingulate cortex.43, 44 And the more active these regions are in preschool children, the better they perform on theory-of-mind tests; by age four, this network has matured enough for most children to understand the possibility of false beliefs.45
NATURE, NURTURE, AND MENTALIZING
SO MIND READING APPEARS TO BE A MENTAL CAPACITY THAT IS SO important that our brains have circuitry dedicated to the task. It is a part of our universal human nature. But not all of us have the same level of skill when it comes to thinking about other people’s minds.
Twin research has shown that as much as 67 percent of the variation in theory-of-mind abilities among three-to-four-year-olds is due to genetic differences, although after age five, life experiences play a larger role.46, 47
One of the key factors in our ability to read minds may be who was around when we were kids. One day, my wife came home to find me enjoying the last bites of an ice cream bar and said, “Ooh—can I have one?” I had to confess that I had only bought one. Annoyed, she huffed, “You’re such an only child!” She may have been on to something. Studies show that only children don’t perform as well on theory-of-mind tasks as same-age children who have siblings, because they’re not as good at taking someone else’s perspective into account. And, I hasten to add, that’s not because they’re just not as bright. In fact, only children tend to do better than kids with siblings on measures of verbal abilities and achievement.48
But having siblings does provide lots of opportunities to practice (or rail against) accommodating someone else’s thoughts and desires. For one thing, brothers and sisters engage in pretend play, which involves creating a shared mental representation that differs from reality.49, 50 Sibling rivalry is filled with episodes of persuading, cajoling, and arguing—each of which requires an effort to work with someone else’s thoughts and beliefs. Also, siblings have to learn to protect their “stuff” from an envious rival. When siblings fight, mom may try to settle the dispute by getting one child to understand what the other was trying to do or say. In doing so, she’s likely to refer to their mental states—desires, goals, and feelings. Being exposed to another mind early on seems to help a child develop his mentalizing skills.
And, it seems, the more dissimilar that mind is, the better (up to a point). That was the conclusion of an intriguing study that compared theory-of-mind skills in three groups of four-year-olds: only children, twins, and children with siblings of different ages.51 The groups were tested on a series of false belief tasks analogous to the Sally-Anne story. The sibling group did significantly better than the twin pairs, who did about the same as only children. In other words, the results showed that it’s not enough to have a sibling—you want to have a sibling whose mental perspective is substantially different. If you think about it, growing up with a twin is like growing up with someone whose mind is pretty similar to yours. Your brains are at the same developmental stage, you experience things at about the same time, and, if you’re identical twins, you are genetically the same. This study and others51 suggest that the best combination for developing a child’s theory-of-mind skills is to have older or younger siblings of the opposite sex.
But as we’ll see in the next section, for some people these subtle variations in mentalizing skills are painfully beside the point. They suffer from a form of mindblindness that can overwhelm the trajectory of their lives.
MIND BLIND?
THE SCIENTIFIC STUDY OF MIND READING PROVIDES ONE OF THE best examples of how understanding normal mental function has helped us make sense of dysfunction. Imagine what life would be like for someone whose theory of mind never fully developed. Without the ability to imagine that other people have their own thoughts and beliefs, the simplest social interactions would be bewildering. You walk into a store and on the way in, bump up against another customer. She frowns at you and stares expectantly and finally says, “Thanks a lot!” Without a theory of mind, you’d miss the sarcasm in her response. What would you say? “You’re welcome”? You’d be liable to make all kinds of inadvertent faux pas. Your sister smiles and asks, “Do you think these pants make me look fat?” Well, they do . . .
Without a sense that people have their own agendas, you would be vulnerable to all kinds of exploitation and deceit. You’d also have a hard time sharing a laugh since most humor depends on things like irony, which in turn require recognizing the difference between what is said and what is meant.52 And, without access to other people’s thoughts or intentions, you’d have a hard time predicting their behavior.
In fact, there is a condition that involves an impairment in theory of mind: it’s called autism.
Although autism was given its name by the child psychiatrist Leo Kanner in 1943,*53 it remained a mysterious and misunderstood condition for decades. Kanner considered the condition to be quite peculiar and rare—his original paper described eleven cases and noted that after completion of the paper, he had seen only two more. But from the start, he identified core features of the condition that remain central to its modern definition. The fundamental problem of autism, he wrote, “is the children’s inability to relate themselves in the ordinary way to people and situations from the beginning of life. . . . There is from the start an extreme aloneness” (p. 242).
The current diagnostic criteria for autistic disorder involve three clusters of symptoms: (1) a substantial impairment in social interaction (e.g., eye contact, pointing, emotional connectedness, and failure to develop peer relationships); (2) impairments in communication (e.g., speech, language, and pretend play); and (3) repetitive and stereotyped patterns of behavior. These problems have to begin before a child’s third birthday.
In the past decade, autism has burst into public awareness, in large part because of highly publicized reports that the disorder has become dramatically more common in recent years. In the 1980s, autism was estimated to affect one child in 2,500. But estimates have been climbing ever since, and by 2006, estimates from the U.S. Centers for Disease Control (CDC) had reached 1 in 110 children.54 And those high rates don’t seem to be just the product of some American propensity to pathologize. A large 2011 study of Korean children55 reported that more than 1 in 40 were affected with an autism spectrum disorder—nearly three times more than the latest U.S. estimates! While some have claimed that we are in the throes of an autism epidemic, the cause of the apparent increase remains controversial. A number of factors seem to be contributing to the increase, including increased awareness of the disorder, which in turn leads to increased surveillance and diagnosis.
One reason for the expansion of autism diagnoses has been the expansion of the syndrome itself. In fact, within the broader class of autism spectrum disorders, classic autism represents only about 20 to 30 percent of cases.56, 57 Several conditions are now grouped under the rubric “autism spectrum disorders.” In addition to classic autistic disorder, these include Asperger syndrome and pervasive developmental disorder, not otherwise specified (PDD-NOS). Asperger syndrome (sometimes called “high-functioning autism”) is a milder form of autism in which there is no major delay in language or general intellectual development. PDD-NOS is diagnosed when there is an autismlike disorder that doesn’t exactly fit the criteria for autistic disorder or Asperger syndrome.*
In 1985, Simon Baron-Cohen, along with coauthors Alan Leslie and Uta Frith, published an article whose title—“Does the Autistic Child Have a ‘Theory of Mind’?”58—was a nod to Premack and Woodruff’s original paper on chimps. The authors used the Sally-Anne task to test theory-of-mind skills in a group of children with autism, a group with Down syndrome, and a group of typically developing children. They found that 80 percent of the autistic children failed the false belief test, while 85 percent of both the Down syndrome and healthy control groups passed. The fact that the Down syndrome group performed as well as healthy children helped rule out the possibility that the autistic children’s poor performance could simply be due to overall cognitive impairment. Baron-Cohen and his colleagues suggested that a theory-of-mind deficit is a core dysfunction in autism and could explain much of the social impairment that is a defining feature of the disorder.
Baron-Cohen coined the term mindblindness to describe the theory of mind deficit at the core of autism.59 Children with autism also show impairments at each stage on the typical road to developing a theory of mind. As infants, they are less attentive to faces and facial cues and less likely to smile or follow another’s eye gaze. As toddlers, they show less evidence of pointing and joint attention, and by the age of two, when other children are engaging in pretend play, children with autism spectrum disorders are much less likely to.
FROM REFRIGERATORS TO GENES AND SYNAPSES
WHAT CAUSES THE ABNORMALITIES IN SOCIAL COGNITION THAT WE see in autism? In his 1943 paper, Kanner conjectured about the causes of “infantile autism” based on what he had observed in eleven cases:
One other fact stands out prominently. In the whole group, there are very few really warmhearted fathers and mothers. For the most part, the parents, grandparents, and collaterals are persons strongly preoccupied with abstractions of a scientific, literary, or artistic nature, and limited in genuine interest in people. Even some of the happiest marriages are rather cold and formal affairs. Three of the marriages were dismal failures. The question arises whether or to what extent this fact has contributed to the condition of the children. The children’s aloneness from the beginning of life makes it difficult to attribute the whole picture exclusively to the type of the early parental relations with our patients. We must, then, assume that these children have come into the world with innate inability to form the usual, biologically provided affective contact with people, just as other children come into the world with innate physical or intellectual handicaps. (p. 250)53
Although Kanner recognized the likelihood that the disorder had “innate” biological roots, the suggestion that parents were at fault gave rise to the influential “refrigerator mother” account of autism. In the 1950s and 1960s, this notion that cold and distant parents were to blame found influential proponents. The child psychologist Bruno Bettelheim wrote in The Empty Fortress, “[T]he precipitating factor in infantile autism is the parent’s wish that his child should not exist.”
It’s almost painful to read a statement like that, and from a twenty-first-century standpoint, it seems bizarre. It would also, by the way, require us to believe that parents were particularly averse to male children since 80 percent of children with autism are boys. Unfortunately, this kind of thinking was not a fringe view and caused untold anguish and guilt for the parents of autistic children.
And these ideas persisted until the work of geneticists and neuroscientists began to discredit them, using methods that only became available in the last twenty years. We now know that autism is a disorder of brain development and that genes play a major role. Studies of twins have shown that the heritability of autism (the contribution of genetic variations to risk in the population) may be as high as 80 to 90 percent, making it one of the most highly heritable medical conditions. It had been known for some time that a number of rare classic genetic disorders—like Fragile X syndrome and tuberous sclerosis—can cause autism in children who inherit the mutations that cause these diseases. But the genes contributing to the vast majority of autism cases remained unknown. In the past several years, new tools of genome biology have begun to change that. Now researchers can look at common and rare variations across the whole genome and ask: What variations are more likely to be found among people with autism?
Finding the genes responsible for autism has become one of the hottest areas of research in the field of genetics. Most of the gene variations that have been found are relatively rare mutations and involve genes that make proteins crucial to holding synapses together. But some of the most dramatic findings have pinpointed a kind of genetic difference between people that was relatively obscure before 2004. Known as copy number variations or CNVs, these are segments of DNA sequence spanning a few hundred to a few million bases that are either missing or duplicated. The deleted or duplicated stretches may contain several or even dozens of genes. In 2006 an international team of geneticists reported the first map of CNVs across the whole genome.60 The results startled many in the genetics community because they suggested that more than 10 percent of the human genome contains these deletions and duplications. If our genomes are the book of life, that means that we differ not just in a letter here or there but in how many paragraphs we have. Many of these genetic differences are common and seem to be benign. But there are rare CNVs that can have profound effects on the brain.
It is now clear that these rare duplications and deletions of DNA can be a cause of autism.* 61–64 As the evidence has accumulated, the emerging picture is one in which genes involved in the development of the brain are deleted, disrupted, or duplicated. These include genes involved in how neurons find their place in the brain, the formation of synapses, and the balance of excitatory and inhibitory connections64—all fundamental players in how brain circuits get wired up.
Given that, you might expect that disrupting these genes would have widespread and diverse effects on the brain. And that seems to be the case. One of the most striking findings to emerge from this genetic research is that many of the same CNVs that can cause autism can also cause other conditions where brain development has gone awry—including schizophrenia.65, 66 That may also explain the finding that the risk of both autism and schizophrenia is higher in children born to older parents.67, 68 It turns out that these CNVs are largely due to copying errors that occur when our cells replicate their genomes before dividing into new cells.69 The older our germ cells are (the cells that give rise to sperm and eggs), the more likely they are to make copying mistakes that lead to these mutations. In fact, part of the recent increase in autism spectrum disorders may be due to people having children later in life.70
One consequence of disrupting genes needed to wire the brain may be dysfunction in the circuits that support the development of theory of mind. Brain-imaging studies of individuals with autism have found altered function in theory-of-mind areas.71 Michael Lombardo, Simon Baron-Cohen, and other scientists at the University of Cambridge compared brain activity in adults with Asperger syndrome and in healthy controls as they made judgments about people’s mental states or physical attributes, and they found a specific difference in the right TPJ—the same region that Rebecca Saxe and others have pinpointed as the crucial area for thinking about other people’s minds. Whereas the controls had greater activation of the TPJ when thinking about others’ mental states as opposed to physical traits, those with Asperger syndrome did not. In other words, the special mentalizing function of the TPJ was reduced in the Asperger subjects. And the less TPJ activity they had, the more social impairment they had.72
On the other hand, it’s clear that alterations in theory-of-mind and social cognition networks don’t capture all of the brain basis of autism. There is growing evidence of widespread disruption of connections in several brain systems. The emerging consensus is that autism involves a basic problem with the wiring of the brain and the formation of synapses in early development.
MIND READING AND THE SPECTRUM OF NORMAL
AUTISM MAY BE AN EXTREME VERSION OF AN IMPAIRMENT OF MIND reading, but the more we learn about theory of mind, the more we appreciate that there is a spectrum of individual differences with no bright lines between normal and abnormal. As we saw, the autism spectrum is now considered to be pretty broad, encompassing Asperger syndrome and even traits of social awkwardness that are seen in the “normal” population.73 For example, relatives of individuals with autism spectrum disorders tend to score higher than other people on traits that make up what’s called “the broad autism phenotype,” which includes rigid and aloof personality traits, a lack of tact, and socially awkward speech.74–77 And, when Simon Baron-Cohen and his colleagues gave a test of autism spectrum traits to students at Cambridge University, they found that science majors scored significantly higher than students majoring in the humanities or social sciences.78 Math majors and winners of the UK Mathematics Olympiad scored highest.
Popular culture has recently embraced the stereotype of the somewhat odd person who has a narrow focus on numbers and technology; “nerds,” “geeks,” and “trekkies” used to be pejorative labels, but now they are worn with pride. The American Heritage Dictionary defines a “geek” as “a person who is single-minded or accomplished in scientific or technical pursuits but is felt to be socially inept.” You can now find “geek chic” fashions, online geek-to-geek dating sites, and Geek Pride Day, which is held annually on May 25, to coincide with the anniversary of the first Star Wars movie premiere.
Of course, being a computer whiz is not evidence of a disorder. On the contrary, my point is that variation in the same brain systems that underlie normal social intelligence and information processing gives us insights into how a syndrome like autism works. Whatever genes and environmental exposures contribute to the development of autism spectrum disorders likely act on the same brain systems that govern cognitive and social skills for all of us.79 Some of the traits that are more common among people with autism spectrum conditions, especially high-functioning autism, can clearly be beneficial: an ability to think systematically, a facility with mathematical and technical concepts, an ability to recognize patterns, and an exquisite attention to detail.80 These are the kinds of skills that are essential to many of the most complex occupations in our modern world—computer technology, finance, and engineering, to name a few.
Temple Grandin, a professor at Colorado State University who was diagnosed with autism in early childhood, has become internationally known for her work in animal science and behavior. Named one of Time magazine’s “100 Most Influential People” in 2010, she has attributed much of her success to being able to think in pictures and attend to details. And she has emphasized that “the world needs all kinds of minds,”81 highlighting the importance of valuing and cultivating the skills that may be part of the broader autism spectrum.
It’s clear that many people with autism are profoundly disabled. But there is a broad spectrum of severity, and at the milder ends of the spectrum, there are no sharp borders between “normal” and “abnormal” social and cognitive functioning.*
Steve Parris seemed ill at ease and anxious during our first appointment. As I welcomed him into the office, he seemed unsure which chair to use, and when he chose one, he sat silently, glancing around the room. After a few moments, I suggested, “Maybe you could tell me what’s brought you in today and how I might be of help.” He spoke in a formal style, frequently interrupting his sentences to address me as “Doctor.” He said he thought he might be depressed and that someone at work had suggested he “get it checked out.”
He described a pattern of painful interactions with his coworkers at a publishing job he’d started eighteen months earlier. He felt excluded and suspected that other people were taking advantage of him. “I was supposed to go on a trip for work to New York, Doctor, but the day before the trip, my boss tells me that he’s decided that another guy in my department is going. And he wouldn’t tell me why.” It was an experience he was familiar with. As a boy, he’d had few friends and was teased by his peers for being “weird.”
But as we talked further, it became clear that the source of much of his pain was a relationship that ended four years earlier, when he was thirty-one. He had dated rarely in his twenties, but he always felt that he should be in a relationship. While living in California, he had met a woman through work and they went out a few times over a period of months. “I finally had a real girlfriend, Doctor.” But then, after about six months, she told him she was interested in somebody else and broke off the relationship. He was wounded and bewildered, and for some months after that he continued to call her, visit her at work, and arrange things for them to do together, but she rebuffed him. She became increasingly blunt in her rejection, and this only made him more persistent. He began to have anxious ruminations and difficulty sleeping and concentrating at work, and now he didn’t know what to do. After meeting with him several times, I referred him for a neuropsychological evaluation and the results were consistent with a diagnosis of Asperger syndrome. But that still left us with the challenge of how to help him tune in to the social cues that he was missing.
People with Asperger syndrome can learn to read social cues, but it requires effort. They need to learn the signals by studying people’s outward behavior and decoding it. Reading social cues for someone with high-functioning autism is a bit like reading a book in a language that you’re not fluent in. I studied French in high school, and I can understand it, but unlike a native speaker, I can’t think in French. It’s not intuitive. If I’m reading something in French, I need to translate the words into English in my mind. Learning the language of other people’s minds is one of the elements of “social skills training” in which individuals with autism spectrum disorders are taught to decode social signals and respond effectively in social situations—how to start a conversation, make small talk, make eye contact, interpret facial expressions, imagine how other people might feel in a given situation, and so on. Studies of these approaches have had mixed results,84–86 but it does appear that mind reading is a language that can be learned, even if it’s not your native tongue.
BLUFFERS AND LIE DETECTORS: THE OTHER END OF THE SPECTRUM?
IF MIND READING IS REALLY A SPECIALIZED SKILL OF THE HUMAN mind and autism illustrates one end of that spectrum, what about the other end? Is there anything analogous to the super-recognizers whose face recognition skills are beyond normal? Are there people whose mind-reading skills are better than good? These would be people who have a superability to detect and reason about the mental states of others.
Trying to answer that question points up a fascinating asymmetry in what scientists have chosen to study about the human mind. We know a good deal about the mental and brain functioning of “healthy” volunteers (or “typically developing individuals,” as they’re sometimes called) and people suffering from disorders or impairments of those functions. But we know much less about whether the mind can do its job better than average.
In a sense, that’s not surprising, because that asymmetry is in our DNA. One of the points I’ve made several times is that our brains are prepared by natural selection to acquire capacities that were tuned by the environment of our evolutionary past. But for most things that have really mattered to human evolution, our neural equipment is designed to make us good enough, plus or minus a little. On the other hand, not everyone is the same. There may not be many baby Einsteins, but there has been at least one. Science has given the study of talent short shrift.
So is there any evidence that some people are super–mind readers? Perhaps the closest researchers have come to asking this question can be found in studies of lie detection. Obviously, at some level, recognizing that someone is lying requires theory-of-mind skills. You have to understand that another person holds beliefs that are different from what they are communicating. And liars are essentially trying to create false beliefs in the minds of other people.
Typically developing children begin to lie at age three, around the time that they pass tests of false beliefs, but it’s not until age seven or eight that they are able to sustain a lie with a consistent story.87 In school-age children, the development of proficient lying is a sign that a child’s theory-of-mind skills are developing normally. Not surprisingly, children with autism are not adept at telling or detecting lies, or even conceiving of the possibility that others are lying to them.
Actually, most of us are not very good at recognizing when someone is telling a lie. When people are shown videotapes of someone who is either lying or telling the truth and they’re asked to catch the liars, they do no better than chance.88 Paul Ekman, a psychologist who has made a life’s work of studying the communication of emotion, wanted to see if some people are unusually good lie detectors. Would people who are trained to detect deceit—law enforcement agents, for example—do better than the rest of us? In one study, Ekman and his colleague Maureen O’Sullivan showed groups of people videotapes of ten young women and told them that about half of them would be telling the truth and the rest would be lying. Ekman and O’Sullivan told their subjects that all of the women would be describing their positive feelings about a nature film they were watching.89 But in reality, only some of them were watching nature films. The rest were watching horribly gruesome films and lying about what they were feeling.
Some of the subjects in this lie detection test were average college students, but others were professionals whose jobs involved lie detection: members of the Forensic Services Division of the Secret Service, federal agents from the CIA and FBI, police investigators, a group of judges, and a group of psychiatrists. Only one group performed better than chance: the Secret Service agents. The best lie detectors were more likely to use nonverbal cues—like subtle features of facial expression—when they judged the truthfulness of the videotaped performances. In a subsequent study, Ekman used films of people lying or being truthful about their beliefs rather than their feelings and again found that expertise matters in being a good lie detector. The only groups that showed special skills were trained interrogators and forensic psychologists.90
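To see why it takes so many correct calls to count as “better than chance,” consider a quick back-of-the-envelope calculation. The sketch below is purely illustrative (the judge’s score is hypothetical, not data from Ekman’s studies) and assumes Python with the SciPy library available:

```python
from scipy.stats import binomtest

# Hypothetical setup echoing the Ekman-O'Sullivan design: ten tapes,
# each speaker either lying or truthful, so random guessing averages 50%.
n_tapes = 10
n_correct = 8  # made-up score for one seemingly gifted judge

# One-sided binomial test: is 8 out of 10 really better than coin-flipping?
result = binomtest(n_correct, n_tapes, p=0.5, alternative="greater")
print(f"accuracy = {n_correct / n_tapes:.0%}, p = {result.pvalue:.3f}")
# p comes out around 0.055: with so few tapes, even 8 out of 10 is weak
# evidence, one reason certified "truth wizards" are so hard to come by.
```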
While many studies have shown that ordinary people (mainly college students) don’t do better at spotting liars than they would if they flipped a coin,88 O’Sullivan and Ekman claim that there are rare individuals—mainly forensic professionals—who are “truth wizards.”91 The idea that some people are human polygraphs has an undeniable “cool” factor that hasn’t escaped the entertainment industry. There was Jack Byrnes, the intimidating dad played by Robert De Niro in Meet the Parents who spends much of the film suspiciously sizing up his daughter’s hapless suitor Greg Focker (Ben Stiller). Turns out Jack is a former CIA profiler, and, as his daughter warns Greg, “a human lie detector.” And more recently, there was the crime series Lie to Me, whose main character, Dr. Cal Lightman, is “the world’s leading deception expert,” and was actually based on Paul Ekman himself.
But there’s another forum for mind-reading expertise that has lately become a national obsession: poker. In fact, poker is in many ways the apotheosis of theory of mind in action. Beyond knowing the hierarchy of winning hands and having a familiarity with probability and odds, the talent that separates great poker players from the rest of us is an ability to detect and manipulate mental states. And who better to teach you these techniques than a former FBI counterintelligence agent?
In his book Read ’Em and Reap, retired FBI special agent Joe Navarro tells us “that 70 percent of poker success comes from reading the people and 30 percent from reading the cards. . . .”92 And in The Theory of Poker,93 David Sklansky captures the complex theory-of-mind skills that a great poker player needs: “getting into your opponents’ heads, analyzing how they think, figuring out what they think you think, and even determining what they think you think they think” (p. 236). To play at this level, you need to be attuned to the thoughts, emotions, and intentions of the other players despite their efforts to keep them hidden. You need to hide your emotions and suppress any tells about your own mental states. Mastering the poker face means overcoming a biological system for broadcasting our emotions that’s been shaped by millions of years of primate evolution. And great poker players are also expert at creating and exploiting false beliefs in other people—the art of the bluff.*
Poker legend Jack Straus executed one of the most famous bluffs in poker history. Nicknamed “Treetop” because of his 6’6” frame, Straus was playing high-stakes Texas Hold ’Em poker and he was on a winning streak. In Texas Hold ’Em, players make their first bet after being dealt two cards facedown (the “hole cards”). The next bet comes after three more cards are laid out face up (“the flop”), then another bet after the fourth card (“the turn card”), and a final bet after a fifth face-up card (“the river”) is dealt. The winner is the one with the best five-card hand or the last one standing.
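Incidentally, for readers who think in code, the dealing sequence just described can be sketched in a few lines of Python. Everything beyond the order of the deal (the function name, the two-player default, the omission of betting and hand ranking) is my own simplification, not part of any real poker library:

```python
import random

RANKS = "23456789TJQKA"
SUITS = "cdhs"  # clubs, diamonds, hearts, spades

def deal_hold_em(n_players=2, seed=None):
    """Deal one hand of Texas Hold 'Em (betting and hand ranking omitted)."""
    rng = random.Random(seed)
    deck = [rank + suit for rank in RANKS for suit in SUITS]
    rng.shuffle(deck)
    # Each player gets two facedown "hole cards"; the first bet follows.
    holes = [[deck.pop(), deck.pop()] for _ in range(n_players)]
    board = [deck.pop() for _ in range(3)]  # "the flop" -- second bet
    board.append(deck.pop())                # "the turn card" -- third bet
    board.append(deck.pop())                # "the river" -- final bet
    return holes, board

holes, board = deal_hold_em(seed=7)
print("hole cards:", holes, "| board:", board)
```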
An aggressive gambler, Straus decided he’d bet big no matter what two hole cards he was dealt. When he picked up his hand, he saw a 7 card and a 2 card of different suits—known to poker players as the worst possible hand. But he followed through with a big bet. Only one player at the table called his bet. The “flop” cards were a 7, a 3, and another 3. Straus bet again, and his opponent made a huge raise, suggesting that he had at least a high pair. Things were not looking good for Straus. The turn card was dealt: it was a 2. So Straus had two low pairs (7–7 and 3–3), but his opponent almost certainly had better. He made a large bet anyway. In theory-of-mind terms, Straus had decided his only hope was to create the false belief that he had three of a kind. As his opponent deliberated, Straus grew uneasy. He knew he would lose if his opponent called the bet, so, in a move that made poker history, he made a generous offer: for $25, his opponent could choose one of Straus’s down cards and turn it face up. His opponent took the offer, threw him a $25 chip, and turned up the 2. After another moment of deliberation, the opponent folded and Straus took the massive pot. What happened? As Straus must have surmised, his opponent thought that to have made such an offer, Straus must have had a pair of 2’s down and thus a full house. Straus was operating on the level of “what they think you think they think.”
NOTHING MORE THAN FEELINGS?
I’VE SAID A LOT ABOUT HOW OUR MINDS READ OTHER PEOPLE’S thoughts. But what about reading their feelings? There’s an important difference between thoughts and feelings. Thoughts and intentions are invisible. We can infer someone else’s thoughts by observing their behavior and by listening to what they say. But emotions hang on our faces and lurk in the sound of our voices. Usually, our inner feelings and outward expressions are consistent: you’re feeling elated at winning an Academy Award and you’re all smiles as you thank God and your agent. But they can also be out of sync: you feel angry as you realize you’ve lost another Oscar to Meryl Streep, but you’re all smiles as the camera zooms in on your face. We humans have exquisitely sophisticated systems for expressing and detecting emotions in other people. And once again, the face is where the action is.
As I noted in the last chapter, Darwin was the first to claim that facial expressions of emotion are innate, universal, and evolved tools of communication. Nearly a century later, his conjecture was taken up by Silvan Tomkins, a psychologist who claimed that there are nine primary categories of emotion, or affects, and that they are universal, innate, and biologically based. They included the ones you’d probably guess—enjoyment/joy, surprise/startle, fear/terror, shame/humiliation, anger/rage, interest/excitement, distress/anguish, disgust—and one he made up—“dismell.” Disgust literally means “bad taste,” so you can guess what dismell is.
Tomkins’s students Carroll Izard and Paul Ekman later provided crucial evidence for the universality of emotional expressions, showing that they are the same across cultures. People around the world use the same facial expressions when they experience basic emotions like fear, anger, disgust, sadness, surprise, and happiness. Congenitally blind individuals from diverse cultures produce the same emotional expressions as those who can see,95, 96 supporting the idea that these expressions are innate. Emotional expressions make up a kind of universal vocabulary. People in industrialized and preliterate cultures see the same emotion when they are shown what we would call an angry face or a sad face or any of the other basic emotions.97 When it comes to signaling our emotional states, facial expressions are the Esperanto of human cultures.
WHY THE LONG FACE?
SO WE USE EMOTIONAL EXPRESSIONS LIKE FEAR, ANGER, AND DISGUST to communicate our internal states to other people. But why did natural selection favor the particular shapes our faces take on when we’re feeling fear or other emotions? For example, why do we open our eyes wide and flare our nostrils when we are afraid? If facial expressions are like the words of a social language, are they just arbitrary forms like the words of a spoken language, or do they have some inherent meaning?
Darwin speculated that particular emotional expressions evolved for a reason. The curled lip of an angry face, for example, has the intimidating effect of baring one’s teeth. But in 2008, researchers at the University of Toronto applied twenty-first-century technology to provide a more scientific test of Darwin’s speculation.98 They used sophisticated computer modeling to map the detailed facial structure involved in a facial expression of fear. Then they ran the computer mapping in reverse—creating a face that had the opposite muscle movements of a fear face. And the result was instantly recognizable—it was the expression of disgust. That seems surprising—why would these two different emotions use the same muscle patterns in reverse?
The answer seems to be that they are not just arbitrary forms—they serve a purpose. Fear spreads the face in ways that enhance our ability to take in sensory information. In contrast, disgust compresses the eyes, nose, and mouth to keep sensory information out. The fear face is about vigilance, taking the environment in; the disgust face is about shutting out or even expelling the environment. This gives us another clue to why natural selection paid so much attention to recognizing emotions in other people. Emotional expressions serve not only as a language of social communication—allowing us to predict behavior and see the effect of our own behavior on other people—they can also save our lives. They can help us take in or keep out danger, and we can use other people’s emotional signals to warn us of danger that lurks nearby. By empathizing with someone who is showing signs of fear or disgust, for example, we can prepare ourselves for the worst.
EMPATHY: THE SINCEREST FORM OF FLATTERY?
ADAM SMITH, THE SCOTTISH PHILOSOPHER BEST KNOWN FOR HIS book The Wealth of Nations, identified sympathy as the basis of human moral behavior and proposed that we feel another person’s feelings by a kind of imitation of his or her mind: “ . . . we enter as it were into his body, and become in some measure the same person with him, and thence form some idea of his sensations, and even feel something which, though weaker in degree, is not altogether unlike them” (p. 4).99
In a sense, empathy may literally be a form of “inner imitation.” When we interact with other people, we tend to unconsciously mimic their behavior, a phenomenon known as the chameleon effect.100 Empathy and its cousins, sympathy and compassion, all involve recognizing the emotional states of another person, but only when we empathize do we actually experience the same emotion as that other person.101
The notion that empathy and imitation are biologically two sides of a single coin was bolstered by the discovery of a system of neurons specialized for imitating others. In the early 1990s, a group of Italian neurophysiologists discovered a set of nerve cells in the brains of macaque monkeys that had a very special property. It had already been shown that neurons in a part of the premotor cortex called F5 fire when a monkey engages in goal-directed activity, like using its hands to grasp food. The Italian scientists were trying to study the properties of these neurons in detail, but serendipity revealed something startling. The F5 neurons also fired when the monkeys were observing an experimenter pick up the food to put it in front of a monkey.102 The scientists discovered that the same set of neurons that activate when a monkey makes an intentional action also activate when it sees someone else make the same action. In fact, these neurons are most active when the monkey imitates the action that it sees. These “mirror neurons” seemed to follow the rule “monkey see, monkey do.” It wasn’t long before scientists claimed to have found a human mirror neuron system (MNS), analogous to the monkey system, distributed in regions of the brain’s frontal and parietal cortex.103, 104
Here, it seemed, was a neural basis for the chameleon effect: when you watch someone perform an action, brain regions activate as if you yourself were performing the action. Some have claimed that this mirroring may extend beyond the actions of others to their emotions. The MNS is connected to the limbic system (the neural circuits of emotion) through the small strip of cortex called the insula, which you may recall from Chapter 1. So, the theory goes, one biological system for empathy may play out like this: observing the emotional responses of others activates the MNS, which relays information through the insula to the limbic system, triggering our own emotional experience that mirrors the one we observed.
In an fMRI study, a team of American and Italian scientists found that the same network of brain areas, centered on the insula, activates when people are either observing or imitating emotional facial expressions105 and when they experience pain directly or watch a loved one experience pain.101 And, as you may recall from Chapter 1, the insula engages when people are exposed to disgusting smells and when they simply watch others express disgust.106 Rare individuals who have lesions of the insula are unable to recognize facial expressions of disgust and are also unable to experience disgust themselves. Adam Smith wasn’t far off: empathy seems to involve a process of enacting the emotional experiences of others in our own brains. We connect with someone else’s feelings by simulating them.
But even if the MNS provides a biological mechanism for empathy (and there’s plenty of controversy about that), it is likely to be only one part of the puzzle. Neuroscientists have drawn a distinction between “emotional empathy” and “cognitive empathy.” Emotional empathy involves the kind of immediate emotional resonance and imitation that mirror neurons might provide—“I feel what you feel”—whereas cognitive empathy involves mind reading or theory-of-mind skills—“I recognize and understand what you feel.”107
Emotional empathy seems to be the more primitive system. Even newborns display rudimentary types of emotional imitation—for example, smiling in response to their mother’s smile. The capacity for this kind of imitation makes infants susceptible to emotional contagion. If you’ve ever walked into a day care for infants, you’ve probably witnessed babies doing the emotional equivalent of the wave: one baby’s cries will trigger crying in her neighbors, creating a spreading front of wails across the room.* But understanding the mental states of others (cognitive empathy) seems to come later.
Emotional empathy and cognitive empathy seem to rely on different brain circuits. In a study of patients with brain lesions,107 those with damage to the inferior frontal gyrus, a key node of the mirror neuron network, had severe deficits in emotional empathy but normal cognitive empathy (theory-of-mind abilities). In contrast, those with lesions in the ventromedial prefrontal cortex, a key node in the theory-of-mind network, had no difficulty with emotional empathy but performed poorly on false-belief (Sally-Anne-type) tasks. These studies suggest that there are two independent brain systems involved in what people refer to as empathy. One—the MNS—allows us to simulate and mirror other people’s emotional states, while the other—the theory-of-mind network—enables us to appreciate and anticipate what will make someone feel good or bad.
Both systems may be necessary for us to accurately read other people’s feelings.108 The very fact that our brains have multiple systems for detecting and responding to other people’s feelings says something about how important empathy is to human nature. Mirroring and emotional contagion are crucial for forming our earliest attachments. Later, as the cognitive empathy system comes on line, we are able to take another’s perspective—to see and feel the world through their eyes and to sympathize. Together, these systems help shape our moral behavior. We avoid hurting other people and want to help those in need in large part because we can feel their pain.
LIFE WITHOUT EMPATHY
IF EMPATHY IS A BASIC FUNCTION OF THE NORMAL BRAIN, CAN there be disorders of empathy itself? What would someone look like if he had an intact cognitive theory of mind (an ability to read other people’s thoughts, beliefs, and intentions) but an impairment of emotional empathy? This would be a person who would understand what people are thinking and feeling but wouldn’t care. That might not be enough to cause problems. But when a lack of empathy is combined with callous and aggressive personality traits, the results can be destructive indeed. There’s a name for people like that: they’re called psychopaths.
The serial killer Ted Bundy was an exemplar of the psychopathic mind—charming, confident, manipulative, and utterly without empathy or remorse. “I don’t feel guilty for anything . . . I feel sorry for people who feel guilt,” he said while awaiting execution for more than thirty murders.109 Recent research suggests that psychopathic individuals have a neurobiologic impairment in the ability to recognize and process fear and sadness in the facial expressions or voices of other people. It’s as though they’re blind and deaf to the pain of those around them.
That combination of being able to read people but not connect with their fear and pain creates a dangerous mix, especially when someone is motivated—as we all are to some extent—by self-interest. The terms psychopath and sociopath are essentially synonymous, but contrary to popular belief, neither is a psychiatric diagnosis. There is no category of psychopathy per se in the DSM, psychiatry’s official diagnostic manual. The term was coined in the nineteenth century, but its modern usage derives from an influential book, The Mask of Sanity (1941), by the American psychiatrist Hervey Cleckley,110 who is also known as the coauthor of The Three Faces of Eve, which put multiple-personality disorder on the map. In The Mask of Sanity, Cleckley identified a group of patients whose veneer of sanity, rationality, and even charm covered up a deep disturbance of emotional and social functioning. The psychopath, Cleckley observed, is often outwardly friendly and agreeable, but below the surface, he is insincere, callous, emotionally vacant, pathologically egocentric, given to exploiting others without remorse, “and almost incapable of anxiety.” In the realm of psychiatry, psychopathy is most closely tied to the diagnoses of conduct disorder and antisocial personality disorder.
Conduct disorder is a diagnosis made in childhood or adolescence when someone has a persistent pattern of violating rules and victimizing other people. To warrant the diagnosis, a person has to exhibit three or more of the following kinds of behavior: (1) aggression to people or animals, such as physical cruelty, assaults, forced sex, mugging; (2) destruction of property or fire-setting; (3) deceitfulness or theft; and (4) serious violations of rules, such as repeated truancy or running away from home. Antisocial personality disorder is essentially the adult form of conduct disorder.
But you’ll notice that these diagnoses are all about behavior and they are not quite the same as psychopathy. It’s been estimated that 80 to 90 percent of inmates in maximum-security prisons meet criteria for a diagnosis of antisocial personality disorder, but only 15 to 20 percent qualify as psychopaths. Not all people with psychopathic tendencies have antisocial personality disorder; some are quite high-functioning and successful.
Because their cognitive mind-reading skills are normal, psychopaths are often quite able to keep their true nature well hidden. They can learn to “talk the talk” of normal social relationships. When they do cross the line into criminal behavior, those who know them may be surprised. “He seemed like a regular guy” is the familiar refrain when a reporter interviews a neighbor of the latest serial killer to make the headlines.
The problem with psychopaths, according to James Blair at the National Institute of Mental Health, seems to be a specific impairment in emotional empathy.111 Most of us are emotionally aroused when we see someone gripped by fear. You can measure this arousal by hooking someone up to a machine that measures skin conductance—how easily an electric current is conducted between electrodes on the skin. Sweating is a sign your emotions have been aroused, and when you sweat, the moisture makes your skin a better electrical conductor. But Blair and his colleagues have shown that when psychopaths are shown faces expressing distress—fear or sadness—they are unmoved. Their skin conductance responses show little or no sign of arousal. In fact, they have trouble even recognizing fear in the faces of others. This is a deficit that’s also seen in rare individuals who have damage to the amygdala, that key region of the limbic system that is essential for processing emotional stimuli.
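As a rough illustration of how such an arousal score might be computed, here is a minimal sketch. The sampling rate, time windows, and simulated signals are my own assumptions, not Blair’s actual protocol:

```python
import numpy as np

def scr_amplitude(signal_uS, fs_hz, stim_time_s):
    """Post-stimulus peak minus pre-stimulus baseline, in microsiemens.
    The windows (1 s baseline; response 1-5 s after stimulus) are
    illustrative choices, not a published scoring standard."""
    t = np.arange(len(signal_uS)) / fs_hz
    baseline = signal_uS[(t >= stim_time_s - 1.0) & (t < stim_time_s)].mean()
    response = signal_uS[(t >= stim_time_s + 1.0) & (t <= stim_time_s + 5.0)].max()
    return response - baseline

fs = 10.0                     # samples per second
t = np.arange(0, 20, 1 / fs)  # 20 seconds of recording
# Simulated signals: a fearful face appears at t = 5 s. A typical viewer
# shows a conductance bump; a blunted, psychopathic-like response stays flat.
typical = 2.0 + 0.5 * np.exp(-((t - 7.0) ** 2) / 2.0)
blunted = np.full_like(t, 2.0)
print(scr_amplitude(typical, fs, 5.0))  # clearly positive
print(scr_amplitude(blunted, fs, 5.0))  # essentially zero
```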
In fact, neuroimaging studies suggest that psychopathy involves a distortion of the brain’s fear-processing machinery. Psychopathic individuals have been found to have small amygdalae that have a blunted response when they look at fearful faces or listen to fearful voices.112–115 And the deficit in appreciating fear seems to develop early: Blair and his team found that children and adolescents who score highly on measures of callous-unemotional traits (a core feature of psychopathy) showed reduced activity of the amygdala in response to seeing fearful faces when compared to healthy children. The more callous they were, the smaller the amygdala response. They also found that callousness was associated with a weaker connection between the amygdala and the ventromedial prefrontal cortex (vmPFC), a brain region involved in moral decision making.116
These findings fit with experimental results showing that psychopathic traits are related to problems with recognizing fear and responding empathetically to fear and distress cues from other people. Putting this evidence together, Blair suggests that psychopathy results from a neurobiological disconnect that short-circuits empathy-based learning.117
Most of us develop a moral compass in part by learning that exploiting or harming others causes them to feel fear and pain. Because of our capacity for empathy, we find other people’s suffering aversive and we learn to avoid doing things that cause it. This emotional learning depends on a circuit involving the amygdala—which detects the other person’s distress—and regions of the prefrontal cortex—including the orbitofrontal cortex (OFC)—which registers the connection between their distress and your behavior along the lines of “you just did something that was not good—don’t do it again.” In tests of empathy and accuracy at judging other people’s emotional states, criminal psychopaths show deficits that are similar to those of people with brain damage in the OFC.118
Blair’s work suggests that psychopathic tendencies result from a dysfunction of this brain circuit: without a capacity to feel other people’s fear and distress, the brakes on callousness and antisocial behavior fail to develop normally. When this callousness combines with a predisposition to be impulsive and aggressive—which other research suggests is related to hypersensitive reward circuitry119—the seeds of a criminal mind are sown.
The psychopath’s disconnection between the amygdala and the OFC may explain why psychopaths are so resistant to change or treatment. We rely on this circuit to learn that exploiting other people, causing them pain, is something to avoid. As Blair told me: “It would be very difficult to give them newer value judgments and to really make them care about other people. In order to really feel that hitting another person is wrong, you need to have this basic neural architecture—the amygdala and OFC circuit—intact. It allows you to be able to learn the badness of harming an individual.”
In a series of interviews he gave while on death row, Ted Bundy spoke at length about the mental world of someone who could, as he did, brutalize and kill young women. The transcripts are chilling, in part because he tells his story in a detached third-person narrative, which the interviewers allowed because Bundy refused to take responsibility for his actions. Here’s Bundy on “Bundy”:
I think we’d expect a person not to feel much remorse or regret for the actual crime—or guilt in the conventional sense for the harm done to another individual. Because the propriety or impropriety of that kind of act could not be questioned. If it was, then, of course, there would be all sorts of internal turmoil.
The guilt or remorse were most prevalent, if they were prevalent at any time, during the period when the individual was uncertain about the results of the police investigation. Once [it] became clear that there was going to be no link made or that he would not become under suspicion, the only thing which appeared relevant was not exposing himself to that kind of risk of harm again.
Not thinking about the nature of the act, of the death of the individual herself. The approach is, say, “Don’t ever do that again.” But as time passes, the emphasis is on “Don’t get caught.” (p. 96)109
Bundy was capable of feeling the fear of being caught but incapable of feeling his victims’ terror as he tortured them. After describing a series of assaults and murders, he said, with evident pride:
. . . [I]t’s not that I’ve forgotten anything, or else closed down part of my mind, or compartmentalized. . . . I guess I’m in the enviable position of not having to deal with guilt. (pp. 280–81)109
What causes the brain circuitry differences that nudge some people toward the dark side? The answer, familiar by now, is that it seems to be a combination of genes and life experience. Twin studies suggest that variations in genes account for up to two-thirds of individual differences in psychopathy.120, 121
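How do twin studies get from pair correlations to a figure like “two-thirds”? A back-of-the-envelope version uses Falconer’s classic approximation; the correlations below are invented to illustrate the arithmetic, not taken from the cited studies:

```python
def falconer_h2(r_mz, r_dz):
    """Falconer's approximation: h^2 ~= 2 * (r_MZ - r_DZ), where r_MZ and
    r_DZ are trait correlations in identical and fraternal twin pairs."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical correlations: identical twins resemble each other much more
# than fraternal twins on psychopathy measures.
print(falconer_h2(r_mz=0.66, r_dz=0.33))  # -> 0.66, i.e., about two-thirds
```

The intuition behind the formula: identical twins share all their genes and fraternal twins share about half, so the gap between the two correlations is, to a first approximation, a measure of how much genes matter.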
Like other personality traits, psychopathy seems to be a dimension rather than a category. For the most part, people aren’t entirely psychopathic or not at all psychopathic. Certainly, criminals score high on measures of psychopathy, and Bundy’s case is about as extreme as they get, but studies have shown varying degrees of callousness and psychopathic traits in polite society as well. Up to 30 percent of people in a general population study in Britain exhibited some psychopathic traits (men more than women), though less than 2 percent reached clinical thresholds for psychopathy.122 At a brain level, many of the regions implicated in psychopathic tendencies overlap with those implicated in studies of empathy, leading some scientists to conclude that callousness and empathic concern are two ends of a spectrum of normal brain function.
COMPASSION FATIGUE
WHAT ABOUT THE OTHER END OF THE EMPATHY SPECTRUM? IS there such a thing as too much sensitivity to other people’s feelings? We all know people who are annoyingly touchy-feely, but I mean something else. Imagine how painful or frightening it would be to be constantly tuned into other people’s emotional states. In an old episode of Star Trek, the crew land on the planet Minara II and are captured by a subterranean race, the Vians, who lack all emotion. Captain Kirk is saved by an “empath,” a woman who has a special ability to take on and process the painful feelings of others—but at the cost of having to bear the pain herself.
People with Williams syndrome, the genetic disorder we first encountered in Chapter 2, may experience something close to the pain of an empath. You’ll recall that Williams syndrome results from a missing sequence of DNA on chromosome 7 and that this specific change produces an extreme interest in other people. Individuals with Williams syndrome are often described as highly empathic and emotionally sensitive.123, 124 They are happy when you are happy, but feel terrible when you are sad or angry and want to soothe your distress. But their empathy seems to be primarily the automatic, mirroring kind. Like people with autism, they do poorly on theory-of-mind tests.125 So their attunement to other people is not the result of some keen ability to mentalize, but rather an intuitive sensitivity to distress.
That sensitivity can come at a devastating cost. Despite their lack of social anxiety, people with Williams syndrome often suffer from debilitating generalized anxiety and worry. Compared to the general population, they are about five times more likely to suffer from generalized anxiety disorder.126 And part of that may be the price of tuning into other people’s distress.
The case of Williams syndrome is certainly an extreme. But all of us can experience empathic overarousal at times.127 You’re curled up on the couch, comfortably watching the evening news when they cut to a commercial. You see the sad face of a young girl and a voice says,
This is ten-year-old Maria. She lost her parents when she was only two. She lives in Mozambique with her blind grandmother and her severely disabled sister. As soon as the sun rises, Maria is hard at work—gathering firewood, scavenging for food, caring for her sister, and working in the fields. Every day is the same. She’s tired. Hungry. And sick. There are millions of children just like Maria who are hurting, barely surviving . . . One person—just like you—can help make a difference for one child . . . And all it takes is one phone call. . . .*
How could you resist such a plea? In his book The Life You Can Save,128 the philosopher Peter Singer lays out a simple but compelling argument that our failure to donate more to aid agencies is not only sad but ethically indefensible. There are many reasons for this moral failure, including our habits of self-interest, the numbing effect of abstract statistics about human suffering, and an insufficient capacity for empathy.129 But in some cases, it might be our capacity for empathy itself that makes us turn away.128 When sympathy crosses into empathy, the effect can be paradoxical. Empathizing means taking on the feelings of someone else. And sometimes that can be overwhelming, even for the best of us. In these small moments, we are all susceptible to a kind of psychopathic apathy.
What if empathy were your job? Doctors, therapists, aid workers, and child protective services workers must face the pain and suffering of strangers day after day. If they were to internalize that pain every time they were called on to help someone, they would be incapacitated. And some are. They suffer from compassion fatigue, a kind of emotional exhaustion that can lead to burnout and even posttraumatic stress disorder. For example, studies have found that 30 to 60 percent of oncologists experience emotional exhaustion and burnout.130 In a study of New York City social workers who were involved in counseling victims of the 9/11 World Trade Center attack, more than a third of those who had been extensively involved in counseling had symptoms of compassion fatigue and posttraumatic stress.131
But even when it’s not so severe, the costs of caring for the sick, suffering, and dying can have a numbing effect. And there’s the irony. We need professional helpers to be empathic but also to maintain emotional distance. I remember well the twin anxieties that gripped me and my fellow medical students as we began our medical training: Will I be able to comfort the anguished without crumpling into a heap of quivering despair or will I lose my humanity and turn into one of those heartless doctors who chirps “How are we feeling today?” when my patients need me the most? Beginning with the emotional trial by fire of dissecting a cadaver on the first day, the process of medical education is carefully calibrated to produce doctors who can both feel and heal. To be effective, those of us whose job it is to care must live in the middle ground between the empath and the psychopath.
* Most of the several hundred breeds recognized today were created by selective breeding that occurred within the last five hundred years.34
* Not all experts agree with this “emotional reactivity” account, and there is vigorous debate about whether animals really have anything resembling a theory of mind.40, 41 No one is claiming that canines, scrub-jays, or nonhuman primates have a theory of mind to match our own. But the evidence from comparative studies of animals does illustrate how the rudiments of mentalizing could have arisen from evolution’s influence on brain function.
* In the same year, Hans Asperger, another Austrian-born physician, independently described the same syndrome, which he called “autistic psychopathy.” Although Kanner’s is often recognized as the pioneering description, Asperger is probably more widely known for the syndrome of high functioning autism that now bears his name.
* In 2013 the new edition of the DSM (DSM-5) will appear. As of this writing, the proposed DSM-5 criteria include some major changes to the criteria. Most important, classic autism, Asperger syndrome, and PDD-NOS would all be collapsed into a single category called “autism spectrum disorder.”
* So far, CNVs that have been associated with autism account for only a small proportion of cases—less than 10 percent. Other genetic and epigenetic variations have also been associated with autism, but in most cases, the causes and risk factors for autism are still unknown.
* Advocates for acceptance of “neurodiversity” have argued that the concept of “normal” or “neurotypical” is inherently stigmatizing and ignores the ubiquitous and natural diversity of mental functioning.82, 83
* Here’s a tip: the best poker face is not what you’d expect. In one study, subjects were brought into the lab to play Texas Hold ’Em poker.94 The only information they had was their own cards and the face of their opponent. The subjects made more betting mistakes when their opponents looked trustworthy and approachable than when the opponents had a neutral poker face. So next time you’re in a poker game and you want to annihilate your opponent, smile and try a little tenderness.
* Of course, emotional contagion works in adults, too—that’s why God made laugh tracks.
* From a video entitled “One” on the Save the Children website http://www.savethechildren.org/, accessed December 23, 2009.
* Answers: Malcolm X, Bill Clinton, Scarlett Johansson, and John Wayne