9

Storytelling and the Theory of Mind

If you don’t know the trees you may be lost in the forest, but if you don’t know the stories you may be lost in life.

—Siberian Elder

Storytelling involves an array of mental activity—the aural construction and reception of sound symbols, rich with connotation and association, as metaphors and schemas rise, blend, fall, while momentary elements of narration are held in working memory and then reinterpreted by the new information that follows—since stories are necessarily linked to time. While all this is going on, our brains are continually reassessing the entire narrative past, while anticipating the future course of narrative events. And on top of all this, the story, as we listen, is making us feel something—fearful, happy, sad, shocked, sexually awakened, jealous, worried, or embarrassed—as the memories of our own lived experiences come flooding through. For each of us visualizes the details and interprets the meaning of stories through his or her own subjective past. If the protagonist is scared we’re a little scared ourselves. If the main character is sitting on top of the world, we are also a little bit inflated. We feel stories because of our lived past and our emotional memories.

In the first breath of the story we hear that the wolf is stealing from the forest. In the second utterance, we learn he is hungry. In the third, we learn the wolf is hungry for us. Our connective mind snaps the pieces together and works them into a present whole. We listen logically, putting two and two together, but we listen emotionally as well, our hearts racing and our palms sweating, our attention focused on the teller, from whom we want to find out what will happen next. Emotion and feeling are at the heart of stories and are what give stories value. We care what happens next. Story is an activity we never tire of, for it is the way the human mind works. As P. C. Hogan says, stories are universal:

Literature—or, more properly, verbal art—is not produced by nations, periods, and so on. It is produced by people. And these people are incomparably more alike than not. They share ideas, perceptions, desires, aspirations, and—what is most important for our purposes—emotions. Verbal art certainly has national, historical, and other inflections. The study of such particularity is tremendously important. However, literature is by all people at all times. As Paul Kiparsky put it, “literature is neither recent nor a historical invention. In fact no human community lacks a literature”; no group is “so wretched that it does not express its memories and desires in stories and poems.”1

We put the pieces of experience together every day—trying to figure out the world and our place in it—through both the large stories and the small, both subconsciously (95 percent or more) and with consciousness. Language allows both the creation of new worlds and the ability to move into that territory through imagination, taking us out of numbing drudgery and giving us freedom. Gossip was one such freedom that language allowed—further connecting our web of human relationships and infusing those connections with theater (drama and comedy). There were many more freedoms language allowed us—mythic worlds, time, calendars, rituals, religion, even hope itself. Language carved up the world into more explicit divisions than pre-language categories and thought schemas could manage: me and you, you and him, them and us. Language also divided the world into taxonomies: the tusked, the hooved, the winged. It was no accident that the first task God gave man in the Garden of Eden was to name the animals, for with naming comes differentiation and power through defined networks and an explication of relationship: kinship, lineages, family trees (not only between humans and other humans, but between humans and animals, as well as animals and animals, plants and plants). Patricia S. Churchland has called the brain’s ability to parse the world “cognitive compression,” which “helps to categorize the world and reduce the complexity of conceptual structures to a manageable scale.”2 Language allows the natural categorizing process of brain activity (which creates a mapped version of the world) to be cast into an exterior communicative web. A portion of an individual’s mapped mind can be shared through the oral transference of story.

Unconsciousness and Consciousness

The mapping mind is directly related to the greatest mystery of all: that of consciousness. As John Horgan says in “Can Science Explain Consciousness?”

Investigators are probing its [the brain’s] deepest recesses with increasingly powerful tools, ranging from microelectrodes, which can discern the squeaks of individual neurons, to magnetic resonance imaging and positron emission tomography, which can amplify the cortical symphony . . . [which has allowed] “a growing number of scientists” to look into questions regarding “the most elusive and inescapable of all phenomena: consciousness, our immediate, subjective awareness of the world and ourselves.”3

The problem of consciousness is being examined from many angles, and there are numerous findings, many of which are converging. David J. Chalmers says in “The Puzzle of Conscious Experience,” “consciousness is subjective, so there is no way to monitor it in others. But this difficulty is an obstacle, not a dead end. For a start, each one of us has access to our own experiences, a rich trove that can be used to formulate theories.”4 Those experiences are turned into stories, as our minds naturally create narratives, the most important perhaps being the narrative each of us has about ourselves. We naturally create a mythic “me,” which is nearly always imagined as a traveler on a journey, as we live in the arrow of time. As we progress on our path, we both collect anecdotes and attempt to make sense of them. While the right side of the brain primarily deals with collecting information, the left side handles making sense of the information that has been collected, creating the stories that form our lives.5 Kotre describes these two aspects of the brain, which Sidonie A. Smith in turn calls the “librarian element” and the “mythmaking element.”6 The mythmaking element is the creation of the “self,” that autobiographical sense born from bits and pieces of memory and turned into the perception of “this is who I am,” projecting the feeling of a single being who, though constantly changing through time, remains somehow the same.

But memory is not what most people believe—a videotape in the brain that plays back the past. Instead, memory is something that is reconstituted continually in the present. The brain forms connections with each moment that is memorized, which involves emotion, images, sounds, smells, and context. At the moment of remembering, it reassembles these elements, igniting the same sensory elements throughout the brain as in the moment the memory first occurred. Antonio Damasio’s latest hypothesis is that memory is retained through a combination of the disposition and the mapping systems of the brain. Neurons control movement, something that plants never evolved. Neurons can also create maps of the body and of the outside world that the body lives in. Mapmaking is the primary function of an advanced brain. A “slice” of neurons operates the way a pixelated screen does: certain molecules in the brain are turned off or on to replicate the world the eye takes in, just as certain pixels are activated or inactive on the screen. A computer screen can be made to look like anything—an old black-and-white movie, a luscious tropical plant, an African lion, even the world itself spinning blue and white in outer space. The pixels that constitute the screen can activate instantly, revealing an unlimited number of patterns. Neurons likewise mimic the world on a grid formed from what the senses see, hear, feel, taste, and touch, all of which Damasio calls images. By mapping, the brain literally replicates what it takes in from the world, later translating these images into meaning. An apple, for instance, would be portrayed on a visual grid as oval-shaped, red, and shiny. But the brain/mind translates that in different ways, depending upon the larger context. This brings us to another essential part of mapping: the creation of categories. The brain allows for a system of value—a hierarchy of levels. Food is an elementary category, but an apple carries more complex meanings as well. It exists in relation to all the other “apples” that are part of the category it “lives” in, such as the apple falling from a tree and hitting Newton on the head, or the apple under a tree in which two naked humans stand. Immediately, we grasp a higher order of meaning, for the apple will remind us of sin and sexuality—the Garden of Eden—and categories of religion, ethics, morality, and many more will be opened up. The wiring we are born with and the wiring we acquire from experience both work to interpret “apple” and all of its various meanings and manifestations.

The brain takes in data from the senses and arranges it into categories to create order out of chaos. It does this for reasons of survival. Imagine a Homo erectus looking out on an African landscape. She would have to make sense of colors and shapes: a herd of elephants, a volcano, three lionesses, two thousand zebra, a river of crocodiles, and a fierce wind. The brain evolved as a way of assessing values in order to make predictions about the future and about how a particular story might end depending upon the actions one might take. Any of the things in that landscape could be a threat. The lions are large, dangerous animals, but they also have respectable brains and are probably in fear of the volcano that is spouting ash and not much in the mood for a meal. The crocs, less brainy, probably have little awareness of the volcano and would gulp you right up, so you have to worry about them. The elephants are smart but very aware and even more dangerous than lions (more people in Africa are killed by elephants than by lions), for they could stampede, as could the zebras, which are less intelligent but far more numerous. One might be able to get out of the way of these large creatures by observing them closely and climbing into the trees at the right time. But then again the volcanic spew, which the wind will carry while also fanning fires, could be very dangerous, full of chemicals that could damage lung tissue or even cause death if inhaled. So the best path would probably be not to go the way the wind is blowing but to move at right angles to it, or to get behind it on the other side of the volcano. The brain would have instantly put things in categories to make such an assessment of these threats: animals and non-animals; predators and stampeding animals; wind and volcano smoke.

But mapping and creating categories are not the only things neurons allow. Their older job was creating dispositions (the ability to learn to go this way instead of that way when a particular stimulus occurred). The dispositional system is ancient, the one primitive creatures use to make decisions that initiate action. A stimulus occurs and a muscle reacts in a particular way. Dispositions are simple and effective. The mapping system is much more sophisticated and recent, allowing for greater precision and greater predictive power. But when the mapping system evolved, the disposition system did not go away. According to Damasio, the two united as a way of saving space. Maps are costly in terms of space, as they take up a lot of “memory.” Damasio believes that the disposition system encodes mapmaking, making more mental space available. When a memory is being recalled, the dispositional system (like a computer program that compresses a large file and reduces it for storage or transmission) contains a formula to reconstitute the map. The map recharges the molecules in the sensory cortices—as they were excited originally—recreating the senses, context, mood, and ambience of the original moment and merging these with what is going on at the moment a memory is being recalled. According to Damasio:

the disposition was commanding the process of reactivating and putting together aspects of past perception, wherever they had been processed and then locally recorded. Specifically, the dispositions would act on a host of early sensory cortices originally engaged by perception. The dispositions would do so by dint of connections diverging from the disposition site back to early sensory cortices. In the end, the locus where memory records would actually be played back would not be that different from the locus of original perception.7

You sit in your chair and think of a candy store you went to one day as a kid. The dispositional part of the brain connects with the mapmaking part, and the old map comes back, igniting the original sensory areas of the brain as they were in their original time zone. It’s like you can taste your favorite chocolate bar all over again, and be dazzled by all the candy packages and wrappers everywhere; you see the woman working the counter with her bright red lipstick and smell her perfume, and you are reminded of your mom telling you to quit going into the candy store every day or else you’ll get fat. You remember all that and much more tied to that era, while at the same time your memory is tied to the present, to the fact that you are sitting in a chair now, twenty years later, twenty pounds fatter than you were in middle school. So you feel guilty about that and tell yourself that you’ll go on a diet, but first you need one more chocolate bar, so you get up and go out the door and drive to the nearest place that sells candy bars, while all the time you can still hear your mom’s voice in your head telling you not to do it.

Of course, memories are imperfect, not exact representations, and our interpretations of events from the past are subject to contemporary biases. And all of this activity is being processed in the subconscious. What “the brain computes,” Michael Gazzaniga states in The Mind’s Past, the mind is the “last to know.” The conscious mind is at the very end of the line in terms of events. The “interpreter” part of the brain “reconstructs the brain events and in doing so makes telling errors of perception, memory, and judgment.” For this reason, Gazzaniga states flatly that “biography is fiction,” and that “autobiography is hopelessly inventive.”8 The literature regarding false memory is voluminous, from episodes of “memory” induced in psychotherapy, “revealing” mythical childhood sexual molestations, to the well-known unreliability of first-hand witnesses in court. Studies of those who have had their corpus callosum severed (the bundle of nerve fibers that connects the left and right hemispheres of the brain) show repeatedly that in such patients the left brain invents stories to explain what the right brain is doing.

From an evolutionary point of view, it makes sense that higher-brained animals would have developed a fairly reliable memory, for it is an essential tool for survival: the ability to recall environments that offer food, shelter, and water; the ability to remember one’s friends and enemies within a social environment in which alliances play a crucial role; the ability to remember the lessons of behavior that might engender survival, such as the right food choices, the right way to avoid predators, and the most successful mating strategies. Evidence has shown that memories with greater image detail are more reliable, so the memory of a watering hole by two palms next to a mountain that looks like a pyramid is probably more factual than the piece of gossip someone told you at the bar six weeks ago about their sister. Memory about people and events can be very unreliable, which is why first-person testimony in criminal cases is notoriously bad (and why many innocent people are locked up in jail). Memories are creations of the “self.”

That “self,” created by the brain, the protagonist “me,” the “I,” is neither learned nor taught but is universal to human experience. Its centrality to our being must also have served some evolutionary purpose, for unlike other higher animals—who hold an immediate sense of core consciousness with memory of past events, locations, and relationships—there is no evidence that even our closest cousins, chimps and gorillas, contain within their minds an autobiographical self, at least not to the degree we do (chimps, dolphins, elephants, magpies, and some other animals can recognize themselves in experiments involving self-identification in a mirror, but that is not the same).

Damasio makes clear that the “self” is constituted from all the cells and organs of the body, including the organs and primitive brain areas that the brain maps. The self is not something derived just from higher consciousness. But what advantages might the constructed self have had for our ancestors, when there’s plenty of evidence that it is not always reliable? Is it just an accidental by-product of having a large brain and greater intelligence, just an auxiliary to consciousness? Or is it an accident of language development: once we began to talk, we automatically began to practice self-talk, that incessant chattering of the mind we all experience daily, which bolstered an already dormant mythic storytelling talent (tied to the basic cause-and-effect processing of schema manufacturing), creating the story of self? Bickle points out that the wiring of self-talk (known from scanning studies of brain activity) has very little to do with cognitive ability:

Given the limited access enjoyed by the language regions to neural networks that subserve specific cognitive and behavioral tasks, these narratives are actually outright fabrications, as is the self-in-control they create and express. The linguistic contents of these inner narratives about the self’s causal control are false; the causal effects they attribute to deliberations and exhortations are pieces of fiction that don’t square with the known anatomical and biological facts. Cognitive processing is occurring throughout the cortex (and subcortically). But activity in the brain’s language regions, and hence the neurally realized narrative self, neither accurately reflects nor causally affects very much of what is going on [emphasis his].9

Yet, regardless of the unreliability of self-talk and the lack of empirical validity of self-narrative, we find the stories we make up essential to the concept of “me.” While every living thing (from the gene on) has an innate and tenacious desire to live, for large-brained Homo sapiens the story of oneself (and one’s tribe) is probably—outside of the individual drive for hierarchical dominance and reproduction—the greatest human obsession. I think the two are intricately linked. Having a self gives purpose and drive to the individual, which could very well have an adaptive quality for survival, spurring an individual to keep going in even the most desperate circumstances. To make sense from an evolutionary perspective, it would have to have had some survival value for the individual, even though the sense of self is also tied to the tribe’s mythic system of belief and to most elements of culture. Possibly, a sense of self arises from mirror neurons and social awareness. However it came into being, the self becomes a driving force. As the genes seek immortality through reproduction, so our embodied minds seek immortality through stories that live on after our bodies rot. The inscribed gravestone in the local cemetery is such a testament, as are the Pyramid of Giza and the terracotta warriors of China. Having dominion over the earth essentially means imprinting one’s particular story upon the environment. As a species, we have gone to war over and over again to make sure that our stories, and not someone else’s, are the ones that live on. In a sense, story is the cultural equivalent of the gene, and to have a story (for an individual or a group) has often meant fighting tooth and nail to ensure it is not forgotten. Story is the encapsulation of the entire imagined identity—past, present, and future—and the drive to preserve the “self” of the individual or the tribe is a powerful force.

W. J. Freeman calls the human ability to create and interpret stories of self “the examined life,” and he makes the point that the “truth” of events can often be ascertained only beyond the moment in which they occur—just as we finish reading a story and look back upon the earlier episodes with a new clarity about what the preceding events mean.10 Freeman says that this ability is valuable to the human condition, giving us greater insight into the meaning of our lives. But it still would have to be the consequence of natural or sexual selection.

Theory of Mind

Throughout the animal kingdom, it is social animals that test highest on intelligence tests (of course, we humans are the ones defining “intelligence”). A general consensus is emerging that high intelligence stems from individuals within a species interacting with their peers, where knowledge of others within the group is required for survival. In primates, for example, one’s relationship to others must constantly be calculated to establish and secure one’s place in society. A primate must know its rank in relation to others—including a history of interactions—and be cognizant of numerous individuals. For humans, Theory of Mind, the knowing of other minds, goes to the heart of our humanity, and we have this ability in spades. We must be able to inhabit the world of another through imagination. We see people around us and believe we know how they think and feel. We can even imagine what another person knows and feels about someone else’s knowing and feeling of yet another person far away. We can conjure up, and store, multiple scenarios in our minds, mapping all these varied perspectives at once. And in our species, children are able to understand what other people think at about the age of four. They begin to anticipate another person’s consciousness and perspective and include it in their own.

A 1978 article by psychologists David Premack and Guy Woodruff, “Does the Chimpanzee Have a ‘Theory of Mind’?,” began the scientific look at how we think about others. While scientists still argue about the degree to which non-human animals possess a Theory of Mind, there is no doubt that human beings are the masters of the skill. As Jesse Bering says in The Belief Instinct, having a Theory of Mind “was so useful for our ancestors in explaining and predicting other people’s behaviors that it has completely flooded our evolved social brains.”11 In other words, we continually construe the intent of others (while we also imagine intent in forces of nature, inanimate objects, and animals, even attributing natural acts to the gods). Even if we don’t get it right all the time, the fact that we can accurately predict the behavior of other humans most of the time gave our ancestors enough of an advantage that the Theory of Mind trait is now part of our genetic endowment.

Storytelling is really Theory of Mind fictionalized, for each and every story deals with the psychology of intent: what drives a character, and how those forces will influence a character to behave. In story, we try to coax out the elements of motivation, and by doing so we test our own Theory of Mind. Whereas real living people are extremely complicated and contradictory, our storied selves are necessarily less so (as Art is inevitably a reduction of real life with its vast and infinite interactions), enabling us to more easily grasp and essentially fine-tune our observations and calculations of others. Stories, and their interpretation, allow us to flex our Theory of Mind and to improve our assessments and predictions.

As we will see, elements within the trickster stories reveal much of what has just been discussed: the struggle between the dispositional system (which causes behavior to occur in more automatic and primitive ways) and the mapmaking system (which is more “deliberative” and necessary for higher levels of “conscious” thought). Trickster often acts in a dispositional manner (on instinct) with little deliberation and consciousness. Yet his actions are always played out against other members of society who are highly conscious of social codes and taboos. Trickster’s Theory of Mind is usually primitive, though he has to have some Theory of Mind to outwit his enemies, which he frequently does. Trickster can be a kind of transitional fossil, showing us whence we’ve come while not quite having arrived at where we are.