1

Finding a Voice

Memory as Word

When did hominids become human?

When we used our memories to do more than remember.

When did that occur? That’s not an easy question to answer—certainly not as easy as it might seem from those charts of the “Ascent of Man” we first encountered as children and have seen parodied ever since.

There, the answer was obvious: in the long parade from the knuckle-dragging Ardipithecus ramidus to the upright Australopithecus to Homo habilis carrying his stone scraper, to Homo erectus with his flint knife. Next, depending upon the complexity of the chart, came Neanderthal Man, carrying a spear on his sturdy shoulders.

Finally, and you can tell this was the culmination of human evolution because the figure was standing upright, his hair combed, looking European, and in some posters even clothed to protect our same-species modesty, stood Homo sapiens in all of his mid-twentieth-century glory. He always looked about to shake your hand and introduce himself as the Southeastern regional sales manager.

Implicit in the chart was the notion that man became Man somewhere in midstride between heavy-browed Neanderthal and shiny new Brussels bureaucrat. Older charts made things a little more complicated, because, unexpectedly, there was a new guy (also covered in skins because he looked a little too much like one of your neighbors) slipped in between Mr. Forehead and ourselves. This was Cro-Magnon Man, who—until he was judged to be just modern man in mufti—presented the depressing paradox of apparently being a more impressive specimen than modern man.

Still, missing links (real or imagined) or not, the Ascent of Man seemed a neat and straight parade across the course of a couple million years. And the threshold between early man (simple tools, hunting parties, small family groups) and modern man (computers, cities, nation-states) lay somewhere in that blank space just in front of Mr. Neanderthal’s mighty forehead.

Part of the appeal of limited knowledge is that it often makes organizational schemes and taxonomies really easy. But eventually you dig up enough bones and fossils that you have to put a new head—and a new name (Apatosaurus)—on everyone’s beloved Brontosaurus. And so, in the four decades since the discovery of “Lucy,” the little female Australopithecus afarensis who lived more than three million years ago, the Ascent of Man has changed from a single-file march through history into something closer to the crowd at the end of a hockey game milling around in the plaza and slowly making its way to a single exit turnstile.

Every year, archaeologists, armed with ever more powerful investigative tools, find new bits of bone and other artifacts that alter—sometimes radically—our understanding of hominid history. And it is only going to get worse: Thanks to phenomena such as “genetic drift” (in which rare genes can come to dominate isolated populations), the closer we look, the more alternative evolutionary pathways, breakthroughs, and dead ends we are likely to find.

A case in point was the controversial discovery in 2004 on the Indonesian island of Flores of the tiny bones of “Hobbit people” (Homo floresiensis), most of them less than three feet tall.1 Remarkably, some of these skeletons are only about 13,000 years old, making them contemporaneous with modern man. Whether the Hobbit people were a distinct species or merely the bones of Homo sapiens with genetic diseases (such as lack of a thyroid gland) is still being debated.

But perhaps the most interesting recent discovery comes from the spot just behind Neanderthal Man’s huge head on our old chart. His name is Homo heidelbergensis (named after the university) and though first identified at the beginning of the twentieth century, his importance wasn’t really understood until the 1990s—which is why he is all but unknown to the general public. H. heidelbergensis both answers and complicates an important question in human evolution: What happened during that apparent transformation from Neanderthal to modern man 50,000 years ago in Asia (30,000 years ago in Europe)?

The answer, scientists now believe, begins with H. heidelbergensis, who appeared about 600,000 years ago. H. heidelbergensis was an impressive figure: heavily muscled, six feet tall (there may even have been some seven-footers), and with a brain about the size of modern man’s. He knew how to use simple tools. And, most remarkably, he also appears to be the ancestor of both Neanderthal and modern man.

That helps to explain why, at least toward the end of the former’s existence, Neanderthal and modern man appear to have existed side by side. Neanderthal Man got most of Heidelberg Man’s looks—the heavy bones, beetle brow, and the comparatively large brain. But modern man got the height and added a uniquely flat face and a lanky frame. And if modern man didn’t inherit quite the cranial size of his Neanderthal cousin, he instead got something even more important: language, and a brain to process and store it.

As anyone who watches science documentaries knows, chimpanzees and apes, for all of their brainpower (our appreciation of which also seems to grow by the day), have limited ability to speak because of a weak larynx due to a narrow cervical vertebra. What most people don’t know is that, as with deaf-mutes, this inability to verbalize is further constrained by an inability to hear—in the ape’s case, a lack of capacity to differentiate between certain vocal sounds.

Of all of the points of divergence between man and apes, this is a big one—a fact made clear when researchers first taught apes to use sign language and were stunned by their facility. Historically, this divergence appears to have taken place with Heidelberg Man’s immediate predecessor, Homo ergaster, a southern African hominid who should also stand on the chart between H. habilis and Neanderthal Man.

Like Heidelberg Man, Homo ergaster is a very interesting character. Huge—he may have averaged a couple of inches over six feet, and the females were nearly as tall—Homo ergaster appears to have used his comparatively larger brain not only to gain mastery over fire but to take hominids out of Africa for the first time. As earthshaking as those achievements were, H. ergaster’s greatest achievement was to evolve not only a wider cervical vertebra—which gave him the first “human voice”—but also a new middle- and outer-ear configuration that enabled him to hear the voices of others.

This was hardly a coincidence. The competitive advantage of a more facile voice was amplified by the improved hearing of that voice—and vice versa. H. ergaster probably never had a true speaking voice, much less developed a spoken language; his brain was still too small for that. But as with fire and tools, H. ergaster was, if not the ultimate owner of spoken language, certainly its pioneer.

The task of turning this capability into a defining human characteristic fell to H. heidelbergensis, with his larger and nimbler brain. Certainly he had the physical tools to do so. And there is a wealth of circumstantial evidence—for example, he was the first to honor and bury his dead, he developed relatively precise tools, and he may have collected red ochre for painting and body adornment—that suggests a level of cultural sophistication that seems impossible without some complex form of communication.

But was it a spoken language? We may never know the answer to that. There are no known Heidelberg Man drawings or carvings to suggest pictogram-based communication. And as anyone who has ever been in a hunting party or a reconnaissance team knows, it is possible to convey a considerable amount of information—even without a formal grammar—via a very small repertoire of hand signals. Indeed, as prodigious hunters operating in small family units, Heidelberg men may well have developed a kind of sign language like that found in later hunter-gatherer societies (such as American Indians), a lingua franca for those rare intertribal encounters.

But that larynx and inner ear weren’t evolving without a competitive advantage. So, we can assume that Heidelberg men and women were communicating with one another with an increasingly sophisticated vocabulary of sounds, if not yet a true language. Further, we can also assume that these sounds were taught to thousands of generations of progeny, who slowly but surely added to the common repertoire. And, given the flexibility with which these sounds could be made by the evolving voice box—and the fact that verbal communication didn’t have to be line-of-sight—it seems pretty likely that, had you been walking in the Bavarian woods a half-million years ago, you would have heard proto-humans calling out to one another across the valleys and through the forests.

Spoken language might not have conferred much of an additional advantage during the hunt, but it certainly did beforehand, in planning strategy, and afterward, in dividing the spoils. And in a world of terrors and dangers, shouted warnings would be especially useful—particularly to warn people looking the wrong way, or to assemble the tribe quickly in an emergency.

But this is as far as Heidelberg Man got, even in the most optimistic analysis. He could convey information (look out!) in the present—and perhaps reference physical objects in the past (the wooly mammoth with the crooked tusk we killed)—but not much more; and even the latter was probably better expressed with sign language.

More sophisticated spoken language would have to wait for Heidelberg’s descendant, Neanderthal Man. This may come as a surprise to many readers, who were taught by school, movies, and television that Mr. Neanderthal, the stereotypical “caveman,” only managed to communicate with an array of grunts and gutturals. But anthropologists have challenged that notion ever since 1983, when an Israeli dig of Neanderthal skeletons uncovered a hyoid bone.2 The hyoid bone is a C-shaped structure that acts like a roof truss, tying together the tongue and the larynx and enabling them to brace off each other to produce a wider spectrum of sounds.

Hyoid bones have been around a very long time—they evolved from the second gill arch of early fishes—but the particular shape of these Neanderthal hyoids was thought to be unique to humans, creating a “descended” larynx that enables Homo sapiens not only to wrap their voices around an endless array of sounds but also to sing notes across multiple octaves. Now, it was found, Neanderthal had the same hyoid bone. Genetic research also found that Neanderthal shared with modern man the form of the FOXP2 (Forkhead Box Protein P2) gene associated with language capability.3 These discoveries have led some researchers to speculate that Neanderthal men and women, like their modern counterparts, may have verbally communicated in two forms: speech and music.

A SIMIAN SONG

A singing Neanderthal is a long way from the classic image of the slope-headed caveman in a bearskin dragging a club. But for all of his heavy features and brute strength, Neanderthal Man was a very sophisticated, even artistic, creature, with a brain larger than (if not as advanced as) modern man’s. He built shelters and fabricated fine bone and antler tools, buried his dead, lived in extended communities, and organized some of the bravest and most sophisticated hunts in hominid history. Given all of that, it would not be surprising if Neanderthal Man, in some limited way, actually talked.

One archaeologist, Steven Mithen of the University of Reading, has argued that Neanderthal Man used a “protolinguistic” mode that reflected the fact that spoken language and music had not yet split and taken their separate paths.4 This simple, singing language he calls hmmmm—meaning that it was holistic, manipulative, multimodal, musical, and mimetic. But at the moment, Mithen’s a lonely voice: The current scientific consensus is that even a language of this limited sophistication was probably still too much for Neanderthal Man.

Temporarily leaving the disturbing image of heavy-browed Neanderthal hunters serenading one another with song as they maneuver through the snow around a trapped wooly mammoth, let’s pause to consider what was going on inside the brain of one of those hunters.

First of all, put aside any old stereotype about our Neanderthal Man being stupid. He had a very powerful brain; in some ways—pattern recognition, multisensory processing, and comprehensive visual field analysis—it was likely more powerful than our own. But a modern human being would find inhabiting such a mind an enthralling—and terrifying—experience indeed.

What would be exciting about sharing Neanderthal Man’s mind is that it would exist in the present with an intensity few of us have ever known. The world around us would explode with so much information—sounds, colors, movements, shapes, textures, and smells—that it would almost be overwhelming. We modern humans consume drugs, watch movies, seek out adventures, and take dangerous risks just to feel for a few moments an eternal here-and-now that Neanderthal Man likely felt almost every second of the day.

That’s the good news. The bad news is that the cost of so completely owning the present is to lose all of the future and much of the past. Finding yourself inside a Neanderthal brain would be, in those intervals when you weren’t absorbed in the moment, a desperately lonely place. For one thing, that voice—of consciousness and conscience—that we hear in our heads would be gone. So would all of the stories and anecdotes that we’ve ever heard, and every memorable conversation we’ve ever had. What things we did learn—almost always by observing someone else—would be learned by rote and, because we lacked any ability to analogize, would be difficult to adapt to changed circumstances. Instead we would just ritually do things over and over, not understanding why they no longer worked.

Without a real language, our ability to interact with others would be severely limited. Certainly a lot can be accomplished nonverbally: hunting, child-rearing, sex, food preparation, and so on. But the inability to share fading memories of the past, or dreams of the future, of one’s hopes and fears, of new ideas and useful experiences … that would be devastating to anyone who had ever known such things.

But ultimately, what might be the most frustrating thing about finding ourselves in a Neanderthal brain would be that despite our new hyperacute sensory experience of the world around us, something profound and vital would be missing from our ability to enjoy those experiences. Not only would it be difficult to share those experiences with others but, just as important, we would have almost no capacity to contextualize them to ourselves. Without language, we would lack the capacity for analogy and metaphor—perhaps the most important traits distinguishing human beings from all other living things on Earth. Without those two, our capacity to learn and grow intellectually would be profoundly impeded.

Neanderthal memory was likely a powerful engine, capable of remembering an almost infinite number of warnings of impending weather change, animal patterns and behaviors, precise geological and biological features along long migratory paths, and the tracks, spoor, and calls of hundreds of species and varieties. But without the ability to extrapolate from limited data to larger explanations, or to tie together disparate subjects in order to increase understanding—and, most of all, to tie the trends of the present into scenarios for the future—Neanderthal Man was doomed to be forever trapped in the “now,” in his short and brutal lifetime having to learn almost everything from experience.

Neanderthal Man had a human brain, but without language—or even with only a protolinguistic mode like hmmmm—he couldn’t operate it efficiently. Most of all, without language, he couldn’t properly fill, organize, expand, or access his memory beyond a direct correlation between past experience and current stimulation. And without such a memory, Neanderthal Man couldn’t be fully human.

But 200,000 years ago, a second descendant of H. ergaster emerged in Africa. This hominid would take another 150,000 years to reach its ultimate form and on at least one occasion (the Toba Catastrophe 70,000 years ago) would come so close to extinction that the entire population might have fit in a couple of Boeing 747s.5 But this creature, Homo sapiens—modern man—survived. We typically, and admiringly, credit this to mankind’s vaunted tenacity and will to live. True enough, but in the end what may have saved us (and enabled us to quickly recover) was our ability to work together in common purpose, to preserve our acquired knowledge, and to build one innovation on another to create a kind of cultural momentum.

Archaic Homo sapiens, as this earliest form of modern man is called (the Cro-Magnon Man of the old charts), seems, with his skinny frame, light muscles, and delicate skull, awfully frail compared to his more robust predecessors. Indeed, it’s hard to see how he could have survived in the late Ice Age world.

But against all odds, he did. And the reason goes beyond anatomy. In fact, maybe for the first time in the 2 billion years of life on Earth, Homo sapiens men and women succeeded because they had found a way to make their physical attributes only secondary in importance. And they did so precisely by evolving those specific features that had appeared in their ancestors—a wide cervical vertebra, an advanced hyoid bone, an improved inner ear, and the FOXP2 gene—and then adding one more new physical feature: a prominent and pointed chin that made possible a number of new verbal sounds, including clicking.

But the real advances were made inside the skull of these first modern men. All of this improved communications equipment meant that early modern man could not only speak and hear increasingly complex utterances but also attach a vast array of different sounds—first phonemes, then syllables, then words, then sentences—to things and events in the natural world.

In other words, early modern man could talk. And in the construction of those words—that is, in the attachment of multiple sounds to increasingly complex phenomena—human language slowly evolved from the concrete to the abstract, from direct correspondence between object and name into the messy, inexact, and infinitely powerful world of analogy and metaphor.

Needless to say, accomplishing this required a whole different kind of memory—one dedicated as much to organization and filing as to mere storage—that was as much language-based as stimulus-based, and was as much optimized for synthesizing new concepts from disparate pieces as it was for speed of access.

By turning memory from mere storage into a language-based scheme of organization, early modern man solved a puzzle that had challenged his ancestors for more than a million years: How do you use that powerful brain to transfer complex information to others? And in finding that solution, early modern man had discovered a remarkable side effect: That same facility with language also created a much richer internal life of the mind, filling it with stunning relationships, stories, and astonishing thoughts about life and death; about the Earth and the cosmos; and even about God and man’s relationship to the Divine—thoughts that no living thing had ever thought before.

And with such thoughts, these early Homo sapiens (which translates as “knowing men”) became human.

TOTAL RECALL

Memory is almost as old as life itself.

All animals have some form of memory, even if it is only a simple biochemical encoding of aversion to negative stimuli. Even schoolchildren know that a paramecium, once it has been adversely stimulated as a result of a certain type of behavior—for example, being electrically shocked for moving in a certain direction—will quickly “learn” not to act that way again.

What this means is that while you may need some kind of brain to think, you don’t necessarily need one to “remember.” All that’s really needed are some organic chemicals, which work similarly to an electronic resistor—linked to some physical action or behavior that can register environmental change. Thus, the “memory” of the unpleasant event is encoded in the chemical and results in a change in future response.6

Of course, it becomes a whole lot easier when you have many cells, and some of them—neurons—are dedicated to the job of sensing these changes, converting them into fast-reacting chemicals or electric charges, and transferring that information to other nerve cells in the body to create both a rapid response and an enduring record of what just happened. That’s essentially how the first animals functioned a billion years ago—and how the simplest ones do today.

The next step is to bundle those neurons into cords—nerves—that are organized like the roads and highways of a modern state. That is, you use small roads, streets, and byways to reach almost every point on the landscape. Then you merge them into higher-traffic arterials that eventually become on-ramps of highways and then a great interstate superhighway. This process took another half-billion years and resulted in increasingly sophisticated animals with organized nerve cords—a range that runs from tapeworms to human beings and takes in almost every crawling, swimming, walking, or flying creature in between, including the chordates, the group to which we belong. These animals share a “peripheral” nervous system with their more primitive cousins, but also feature a distinguishing “central” nervous system that, in the simplest of them, such as flatworms and flukes, is little more than a proto–spinal cord.

As animals continued to evolve, the more advanced chordates not only began to protect their central nervous systems with bone and cartilage—spinal columns or backbones—but also to feature two distinct types of neurons: sensory, which send signals to the spinal cord; and motor, which send the response commands back to muscles and organs. This division of labor is basically the nervous-system setup you see in insects, lobsters, and worms—proving that you don’t need a particularly advanced nervous system to be fast, efficient, and, in the case of a species like the cockroach, so perfectly evolved as to be almost immortal.

Once again, bugs and other creepy-crawlies have rudimentary memories: enough to equip them from birth with a complement of instinctual behaviors that makes them formidable hunters, consumers, and adversaries … but not enough to keep them from returning to that same hot lightbulb until they cook themselves. But the components are now there; all that’s needed is to perfect their configuration. One of the best solutions turned out to be putting a few knots in that central nervous system. With these bundles—“brains”—animals could begin to divide up tasks and store ever greater amounts of information. Tie a few of these knots along the course of a spinal cord—as in, say, a spider—and suddenly you’ve conferred some real competitive advantages in terms of the stalking skills and lightning response needed to be a successful predator.

It turns out, however, that evolutionarily speaking, the most efficient layout of the animal nervous system—especially if you want to tackle more complex challenges like migration, nest building, and baby rearing—is to combine all of those small bundles into one big brain. The milliseconds that you may lose in response time as the messages now have to travel the length of the spinal cord are more than made up for by the network effects of having all of those neurons in close proximity.

This need becomes especially acute when you start developing really efficient vision and a sense of smell, both of which consume a whole lot of brain processing power. In amphibians, and especially reptiles, these senses take up most of the brain. And indeed, they are so important that these animals began the process of producing a new kind of brain structure, the cerebrum—what we colloquially call “gray matter”—that contains neurons specifically organized to manage voluntary movements of the body, the senses, language, learning, and memory.

You can see where this is going. As animal life evolved—adding greater intelligence to existing orders and phyla, as well as producing whole new kinds of creatures, such as birds and mammals, with even greater intelligence—the new brains featured ever more gray matter. These growing cerebra likely were the result of the competitive advantage of having better visual perception and an improved sense of smell. But they carried in their train a correspondingly greater facility with language, an ability to use that language to gain knowledge, and, not least, an ability to store that knowledge using a language- and image-based filing system to improve access and recovery.

Exactly how all this worked has been a puzzle. Scientists long ago recognized that there are two distinct memory operations: short-term, which (as every cramming college student knows) seems to hold memories for about thirty seconds before they fade; and long-term, which seems to be a function of either a strong experience or repetition, and can last an entire lifetime. But how they work and what makes them different has long been a mystery. Only in the twenty-first century have researchers begun, using genetic manipulation, to get a glimmering of how memories are formed, erased, or stored.

As long suspected, memory creation is the result of a biochemical reaction that takes place in nerve cells, especially those related to the senses. Recent research suggests that short-term, or “working,” memory operates at a number of different locations around the brain, with spatial tasks tending to be handled in the right hemisphere of the brain and verbal and object-oriented tasks in the left. Beyond that, the nature of this distribution, retrieval, and management is the subject of considerable speculation.

One popular theory holds that short-term memory consists of four “slave” systems. The first is a phonological store for sound and language, which (when its contents begin to fade) buys extra time through a second slave system. This second operation is a continuous rehearsal system—like when you repeat a phone number you’ve just heard as you run to the other room for your phone. The third system is a visuo-spatial sketch pad that, as the name suggests, stores visual information and mental maps. Finally, the fourth (and most recently discovered) slave is an episodic buffer that gathers the diverse information from the other slaves, and perhaps information from elsewhere, and integrates it all into what might be described as a multimedia memory.7

Other theories hold that short-term memory is, in fundamental ways, just a variant of long-term memory. But almost all brain scientists agree that the defining characteristic of short-term memory is its limited functionality—both in duration and capacity. Simply put, short-term memory fills up fast—scientists speak of four to seven “chunks” of information, such as words or numbers, that short-term memory can hold at any one time—after which its contents either fade or are purged.8 Rehearsal can temporarily keep important short-term memories alive, but ultimately the information must either be transferred to long-term memory or lost.

Long-term memory, though it uses the same neurons as short-term memory, is, as one might imagine, quite different in how it uses them. Whereas short-term memory is limited in scope and capacity, long-term memory takes up much of the landscape of the upper brain and is designed to maintain a permanent record. Only in the last few years have researchers determined that memories are often stored in the same neurons that first received the stimulus. That they discovered this by tracking the storage of fear-induced memories in mice suggests that evolution found this emotion to be a very valuable attribute in a scary world.

Chemically, we have a pretty good idea how memories are encoded and retained in brain neurons. As with short-term memory, the storage of information is made possible by the synthesis of certain proteins in the cell. What differentiates long-term memory in neurons is that frequent repetition of signals causes magnesium to be released—which opens the door for the attachment of calcium, which in turn makes the record stable and permanent. But as we all know from experience, memory can still fade over time. For that, the brain has a chemical process called long-term potentiation that regularly enhances the strength of the connections (synapses) between the neurons and creates an enzyme protein that also strengthens the signal—in other words, the memory—inside the neuron.9

Architecturally, the organization of memory in the brain is a lot more slippery to get one’s hands around (so to speak); different perspectives all seem to deliver useful insights. For example, one popular way to look at brain memory is to see it as taking two forms: explicit and implicit. Explicit, or “declarative,” memory is all the information in our brains that we can consciously bring to the surface. Curiously, despite its huge importance in making us human, we don’t really know where this memory is located. Scientists have, however, divided explicit memory into two forms: episodic, or memories that occurred at a specific point in time; and semantic, or understandings (via science, technology, experience, and so on) of how the world works.10

Implicit, or “procedural” memory, on the other hand, stores skills and memories of how to physically function in the natural world. Holding a fork, driving a car, getting dressed—and, most famously, riding a bicycle—are all nuanced activities that modern humans do without really giving them much thought; and they are skills, in all their complexity, that we can call up and perform decades after last using them.

But that’s only one way of looking at long-term memory. There is also emotional memory, which seems to catalog memories based upon the power of the emotions they evoke. Is this a special memory search function of the brain? Is it a characteristic of both explicit and implicit memory? Or, rather, does it encompass both? And what of prospective memory—that ability human beings have to “remember to remember” some future act? Just a few years ago, researchers further discovered that some neurons can act like a clock, serving as a metronome that orchestrates the pace of operations for the brain’s billions of nerve cells. Why? These are but a few of the conundrums on the long list of questions about the human brain and memory. What we do know is that—a quarter-million years after mankind inherited this remarkable organ called the brain—even with all of the tools available to modern science, human memory remains a stunning enigma.

A THIN LAYER OF THOUGHT

With the rise of the hominids three million years ago, nature found two valuable tricks to aid brain development. The first was to divide the brain’s activities into different regions, with the autonomic nervous system (that is, the parts of the body that run on their own, such as the heart, lungs, and the many glands) managed by the lower, older parts of the brain, and the somatic (or voluntary) nervous system handled by the gray matter of the cerebrum. The higher functions of the latter, such as speaking and fine-detail motor skills, found their home in the ever-growing frontal lobes of the hominid brain.

But Nature saved her biggest trick for last: the cerebral cortex. This was a final layer of neurons—what might be called “very gray” matter because when it is preserved it is darker than the whiter neurons beneath—covering the entire surface of the upper brain. Untucked from the many brain folds that are designed to maximize its size in the confined space of the skull, the cerebral cortex is basically a four-millimeter-thick sheet of 10 billion neurons covering about 1.3 square feet.

Wadding a sheet of neurons that size into the comparatively small space of the Homo sapiens skull proved to be a huge challenge to evolution—one fraught with numerous vulnerabilities, but one with enough adaptive advantages to make those risks worthwhile.

And those risks were considerable. To carry that giant-sized brain, modern man lived with a thinner skull and less cranial padding than his Neanderthal counterpart, increasing the likelihood of concussion and fatal head trauma. It also led to newborn human babies with skulls sized at the very limit of the adult human female pelvis … making childbirth, until only the last hundred years in the developed world, the leading cause of death for young women.

But in terms of adaptation and the survival of the species, the cerebral cortex was worth the cost. That’s because within those billions of neurons was a capacity for integrating sensory input, abstract thought, language, and a prodigious memory that had no equal—indeed, almost no precedent—in the history of life on Earth. And incredibly, that was the least of it, because somehow, in a process that remains inexplicable, from this sheet of nerve cells emerged that most extraordinary and singular of traits: consciousness.

Someday, the odds (and the Drake equation) suggest, we will find life elsewhere in the universe.11 And if that happens, far smaller odds suggest that we will also eventually meet another conscious life form. But for now, and maybe for thousands of years to come, we Homo sapiens alone are both conscious and self-reflexive. For now, it seems that only we can appreciate that loneliness, and with our imaginations cast our minds to the edge of the cosmos in search of answers. Only we know that we are we.
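
For readers who have never met it, the Drake equation in its standard textbook form is simply a chain of multiplied estimates:

\[
N \;=\; R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L
\]

where \(R_{*}\) is the rate of star formation in our galaxy, \(f_{p}\) the fraction of stars with planets, \(n_{e}\) the number of potentially habitable planets in each such system, \(f_{l}\), \(f_{i}\), and \(f_{c}\) the fractions of those on which life, intelligence, and detectable communication arise, and \(L\) the average lifetime of a communicating civilization. The later terms remain, at present, little more than educated guesses, which is why the equation can only suggest, never predict.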

Whether you believe in a divine spark, a network effect emerging from those billions of neurons, or some kind of quantum phenomenon taking place in the microtubules inside those neurons, the fact that consciousness arose at the same time and resides in the same realm as language suggests something more than a casual relationship. In fact, the best explanation for the rise of human consciousness may come from the opening line of the Gospel of John: In the beginning was the Word.

A COMPUTATIONAL CONNECTION

Every era has its dominant metaphors, lenses through which we look at the world. The Enlightenment used Newtonian physics, the late Victorians evolution. In the early twentieth century, Einstein’s theory of relativity and Freud’s theory of the unconscious colored everything from art and literature to everyday morality. In the late twentieth century, it was the Heisenberg uncertainty principle.

In our time, the dominant metaphors come from the world of computers and networks. If, as we will see, men and women of the Enlightenment saw human thought as a ghost inside of an automaton, we imagine it as a very powerful and sophisticated computer motherboard, or as logic etched into the silicon of a powerful microprocessor.

Seen through early twenty-first-century eyes, the brain’s three main functions are logic, memory, and input/output. To us, the human mind seems less a supercharged analytic engine and more a multiprocessing computer that balances these three functions: logic (the processing of data to produce new ideas and understandings), memory (with which we define ourselves and organize vast realms of experience), and input/output (input combining sensory data with abstract sources such as the printed word; output emerging as language and art, ideas, and action).

As metaphors go, the image of the human brain as a modern multiprocessing computer isn’t a bad one, and is better than any that came before. That helps explain why many people are convinced that this path will eventually lead to real “thinking machines.” But for now, at least, the brain-as-computer remains an analogy, not a final explanation. A lot of features in real human brains are still missing from computers, including the ability to “heal” after a crippling physical injury, the enhancement of regularly used pathways to speed access, and an inherent sense of purpose and survival that can be found in even the simplest animals.

But most of all, there is the lack of consciousness in computers—indeed, fully expressed, in anything but us. Without consciousness, modern man becomes nothing more than a super-capable, and shockingly vulnerable, Homo ergaster. In other words, he likely wouldn’t have been born; and if he had, he wouldn’t have lasted for long, much less come to rule the natural world. It is consciousness that empowers those higher brain functions to reach the fulfillment of their potential.

To be conscious and self-reflexive—to know that you exist as an independent thinking being—is to also recognize that you are just a tiny, and very fragile, speck in a very large and dangerous world. It is to also know that you are mortal, and that one day you will die and all that is in your head will die with you. And in that collision between a knowledge of death and a sense of a purposeful, valuable self, there also emerges a sense that the universe has a larger purpose in which you may be a tiny part, but to which you can still make a contribution even after your death. You want to convey what you’ve learned, the wisdom you’ve acquired the hard way, to those who will follow and honor your memory.

With his larger cerebral cortex, archaic modern man had the capacity to develop complex languages. But it was the increasingly complex demands of daily life, combined with consciousness and this irresistible call to meaning and purpose, that drove language forward toward realization. We learned to talk because we now had things to learn … and stories to tell.

Still, at the beginning, those stories were by necessity pretty simple. Research into animal language over the last few decades has shown that the verbal communications of many animals—from birds to dolphins and whales to monkeys—can be remarkably sophisticated. So what used to look like a singular achievement by mankind now appears more like just the latest advance in a long continuum stretching back to the chirping of crickets. And there is some truth to that: Neanderthal Man, and certainly H. ergaster, probably spoke in a manner that wasn’t much more sophisticated in design than, say, a humpback whale’s, and in content was little more than what modern gorillas trained in sign language can produce.

But at the same time, it would be a mistake not to recognize that a profound discontinuity with the past takes place the moment modern man opens his mouth. That crumpled neural sheet of the cerebral cortex, with its vast memory capabilities, powerful facility with language, and, most of all, consciousness with its will to power and immortality, could do things that no other brain ever could. It could organize thought using logic learned from causal relationships observed in the natural world; it could synthesize new ideas through the metaphorical linking of two diverse notions; and it could imagine, perfect, and test scenarios about the world in order to more accurately understand reality.

No other creature had ever accomplished this; no creature, including earlier hominids (save perhaps Neanderthal Man), had ever known the concept of “I” or speculated into the future or used logic (deductive to understand the present, inductive to predict the future) as a path to truth. Indeed, no other animal had ever formulated the very notion of Truth. And as the allegory of the Garden of Eden suggests, to know truth is to also know falsehood. For good (the ability to create fictional realities) and evil (the destructiveness of an alienation from one’s self and others), modern man now knew how to lie.

With a real language (also for the first time in the history of life on Earth) complex memories both old and new (“new” memories being ideas) could be transmitted from one member of a species to another—a process made ever more facile as language itself evolved, and still evolves today, to perfect this function. Shared memories not only had a longer life span, but they could actually grow in size and utility as each sharer added to their content and then shared them again. This “common memory”—shared wisdom, as opposed to rote, shared skills—was almost always an improvement over what any individual had in his or her head. It meant that there was nearly always an intrinsic and obvious advantage to talking with one another, to sharing experiences.

Sharing, in turn, rewarded ever-larger groups of people, not only because language was easy to scale but because it also exhibited what we know today as Metcalfe’s Law: The value of a network grows far faster than the number of participants in that network, roughly as the square of their number. Add another member to the tribe and you don’t just gain that person’s memories but also the value of their interaction with every other member of the tribe. Obviously, the infrastructure, technology, and social harmony of a society set an upper limit to the size of the population—eventually adding new members will create unrest or a drain on resources that will give their addition a negative value—but that only creates an incentive for the tribe, village, city, or, eventually, nation to put its collective head together and come up with solutions such as better food production, law enforcement, housing, and education.
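
To put a rough number on that intuition (a back-of-the-envelope illustration, not anything the archaeological record supplies), the usual statement of Metcalfe’s Law counts the possible pairwise connections among the n members of a network:

\[
C(n) \;=\; \binom{n}{2} \;=\; \frac{n(n-1)}{2},
\qquad
C(n+1) - C(n) \;=\; n .
\]

So a band of twenty speakers already contains 190 possible sharing relationships, and the twenty-first member adds twenty more at a stroke; each newcomer enriches every member who came before.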

So, let’s follow the string: The rise of a cerebral cortex, which enabled the genetic disposition in hominids toward language and a large memory to be fully expressed, led to the creation (somehow) of a phenomenon unique to modern man: consciousness. Conscious men and women, anxious to serve their selfhood in the face of their new knowledge of the natural world and of death, found a competitive advantage in expanding language to become more encompassing, adaptive, and abstract. Then, in the act of using language to share memories (skills, experiences, ideas, stories) with others, modern man discovered the value of increasingly large social structures—and the advantage and power of social stratification and specialization. Finally, this specialization freed certain individuals in that society—shamans, priests, and eventually academics—to focus upon expanding both those common stores of memory and the language needed to manage them.

Whew. What is miraculous about this process is not just that it happened, but how quickly it occurred. As noted, Homo sapiens first appeared 200,000 years ago. Just 130,000 years later—a blink in evolutionary time—archaic modern man had already begun his migration out of Africa … and despite facing near extinction, still managed to explore and inhabit Eurasia and Oceania in just the next 30,000 years. Just 25,000 years after that, early modern men and women had found their way across the Bering Strait and had inhabited the Americas. Just 4,000 years after that, H. sapiens had begun to settle down from a hunter-gatherer existence into cities and was embarking on the first Agricultural Revolution. And 10,000 years after that, he left the Earth and walked on the moon. In the same historic interval, most mammals, birds, reptiles, amphibians, fish, and insects changed nary a bit.

MODERN MUSINGS

Fifty thousand years ago, about the time ancient modern man left Africa, the intellectual traits that had distinguished him from his predecessors had reached their full development. These human beings of the Upper Paleolithic were, by this point, truly “modern” humans, as their new name, Homo sapiens sapiens (essentially, “wisest wise man”), suggests. They were now almost indistinguishable from the people you meet today. These new men and women were capable of abstract thought—not least of which was an ability to look inward to their own consciences and to look outward to place themselves both temporally and physically in the larger natural world. They had a language, and could laugh and tell jokes, or weep over a long-lost love or a sad story. They were increasingly capable of separating truth from falsehood, or rationality from passion. Indeed, they were (and we are) the only rational animal to have ever lived. And over the course of a lifetime, they lived up to the “sapience”—the wisdom—in their name. Like us, they were tireless in their pursuit of explanations of how things worked, and if they couldn’t find an explanation of the infinite and the ineffable, their imaginations could at least encompass those concepts.

And as these modern men and women sat by the fire and looked up at the vault of the night sky with its moon and stars, they would have remembered the story told by a grandmother or the village shaman about how those distant lights found their way into the heavens. And they would wonder if they could tell that same story as well to their grandchildren … and they would fear that the story itself might one day be lost, because memories were such fragile and unreliable things, and words were just breaths lost on the breeze, like the smoke drifting up from the fire.

And what of their own life and all that they had experienced? Who would remember their story after they were buried in the ground with their sewing needles and spears?

If only there were some way to make those stories, and their own memories, as strong and enduring as the rocks themselves.…