Chapter 10

THE INTELLECTUAL CHESS GAME

Were I (who to my cost already am

One of those strange prodigious Creatures Man)

A Spirit free, to choose for my own share,

What Case of Flesh, and Blood, I pleased to weare,

I’d be a Dog, a Monkey, or a Bear.

Or anything but that vain Animal,

Who is so proud of being rational.

The senses are too gross, and he’ll contrive

A Sixth, to contradict the other Five;

And before certain instinct, will preferr

Reason, which Fifty times for one does err.

—John Wilmot, Earl of Rochester

The time: 300,000 years ago. The place: the middle of the Pacific Ocean. The occasion: a conference of bottle-nosed dolphins to discuss the evolution of their own intelligence. The conference was being held over an area of about twelve square miles of ocean so that the participants could fish in between meetings; it was during the squid season. The sessions consisted of long soliloquies by invited speakers followed by a series of commentaries in Squeak, the language of Pacific bottle-noses. Squawk-speakers from the Atlantic were able to hear memorized translations at night.

The matter at issue was simple: Why did bottle-nosed dolphins have brains that were so much bigger than those of other animals? The bottle-nose brain was twice as large as that of many other dolphins.

The first speaker argued that it was all a matter of language. Dolphins needed big brains to enable them to hold in their heads the concepts and the grammar with which to express themselves. The ensuing commentaries were merciless. The language theory solved nothing, said the commentators. Whales had complex language, and every dolphin knew how stupid whales were. Only the year before a group of bottle-noses had fooled an old humpback whale into attacking his best friend by sending out soliloquies about infidelity in humpback language.

The second squeaker, a male, was more favorably received, for he argued that this was indeed the purpose of dolphin intelligence: to deceive. Are we not, he squeaked, the global masters of deception and manipulation? Do we not spend all our time scheming to outwit one another in the pursuit of female dolphins? Are we not the only species in which “triadic” interactions among alliances of individuals are known?

The third speaker replied that this was all very laudable, but why us? Why bottle-nosed dolphins? Why not sharks or porpoises? There was a dolphin in the River Ganges whose brain weighed only five hundred grams. A bottle-nose brain weighed fifteen hundred grams. No, he said, the answer lay plainly in the fact that of all the creatures on earth, bottle-nosed dolphins were the ones that had the most varied and flexible diet. They could eat squid or fish or…well, all sorts of different kinds of fish. That variety demanded flexibility, and flexibility demanded a big brain that could learn.

The final speaker of the day was scornful of all his predecessors. If social complexity was what required intelligence, why were none of the social animals on land intelligent? The speaker had heard stories of an ape species that was almost as big-brained as dolphins; indeed, for its body size its brain was even bigger. It lived in bands on the African savanna and used tools and hunted meat as well as gathered plants for food. It even had language of a sort, though with none of the richness of Squeak. It did not, he squeaked drolly, eat fish.1

THE APE THAT MADE IT

Around 18 million years ago there were tens of species of ape living in Africa and many others in Asia. Over the next 15 million years most of them became extinct. A Martian zoologist who arrived in Africa about 3 million years ago would probably have concluded that the apes were bound for the trash heap of history, an outdated model of animal made obsolescent by competition with the monkeys. Even if he noticed that there was one ape, a close relative of the chimpanzee, that walked on two feet, entirely upright, he would not have predicted much of a future for it.

For its size, midway between a chimpanzee and an orangutan, the upright ape, known to science now as Australopithecus afarensis and to the world as “Lucy,”2 had a “normal” brain size: about four hundred cubic centimeters—bigger than the modern chimp, smaller than the modern orangutan. Its posture was peculiarly humanlike, undoubtedly, but its head was not. Apart from its uncannily human legs and feet, we would not have had any trouble thinking of it as an ape. Yet over the next 3 million years the heads of its descendants exploded in size. Brain capacity doubled in the first 2 million years and almost doubled again in the next million, to reach the fourteen hundred cubic centimeters of modern people. The heads of chimps, gorillas, and orangutans stayed roughly the same. So did those of Lucy’s other descendants, the so-called robust australopithecines, or nutcracker people, who became specialist plant eaters.

What caused the sudden and spectacular expansion of that one ape’s head, from which so much else flowed? Why did it happen to one ape and not another? What can account for the astonishing, and accelerating, speed of the change? These questions may seem to have nothing to do with the subject of this book, but the answer may lie with sex. If new theories are right, the evolution of man’s big head was the result of a Red Queen sexual contest between individuals of the same gender.

On one level the evolution of big-headedness in man’s ancestors is easily explained. Those that had big heads had more young than those that did not. The young, inheriting the big heads, therefore had bigger heads than their parents’ generation. This process, moving in fits and starts, faster in some places than in others, eventually caused the trebling of the brain capacity of man. It could have happened no other way. But the intriguing thing is what made the big-brained people likely to have more children than the small-brained ones. After all, as a diverse array of observers from Charles Darwin to Lee Kuan Yew, the former prime minister of Singapore, have noted with regret, clever people are not noticeably more prolific breeders than stupid people.

A time-traveling Martian could go back and examine the three consecutive descendants of Australopithecus: Homo habilis, Homo erectus, and so-called archaic Homo sapiens. He would find a steady progression in brain size—that much we know from the fossils—and he would be able to tell us what the clever ones were using their bigger brains for. We can do something similar today simply by looking at what modern human beings use their brains for. The trouble is that every aspect of human intelligence you might consider uniquely human turns out to apply to the other apes as well. A vast chunk of our brains is used for visual perception; but it is hardly plausible that Lucy suddenly needed better visual perception than her distant cousins. Memory, hearing, smell, face recognition, self-awareness, manual dexterity—they all have more space in the human than in the chimp brain, yet it is hard to understand why any of them was more likely to cause Lucy to have more children than it was to cause a chimp to have more. We need some qualitative leap from ape to man, some difference of kind rather than degree that transformed the human mind in ways that for the first time made the biggest possible brain the best possible brain.

There was a time when it was easy to define what made humans different from other animals. Humans had learning; animals had instincts. Humans used tools; animals did not. Humans had language; animals did not. Humans had consciousness; animals did not. Humans had culture; animals did not. Humans had self-awareness; animals did not. Gradually these differences have been blurred or shown to be differences in degree rather than in kind. Snails learn. Finches use tools. Dolphins use language. Dogs are conscious. Orangutans recognize themselves in mirrors. Japanese macaques pass on cultural tricks. Elephants mourn their dead.

This is not to say that all animals are as good as humans at each of these tasks, but remember that humans were once no better than other animals at them, and yet humans came under sudden pressure to get better and better, while animals did not. A well-trained humanist is already scoffing at such sophistry. Only people can make tools as well as use them. Only people can use grammar as well as vocabulary. Only people can empathize as well as feel emotion. But this sounds uncannily like special pleading. I find the instinctive arrogance of the human sciences thoroughly unconvincing because so many of their bastions have already fallen to the champions of animals. Beaten back from position after position, the humanists simply pretend they never intended to hold them in the first place and redefine the retreat as tactical. Almost all discussions of consciousness assume a priori that it is a uniquely human feature, when it is patently obvious to anybody who has ever kept a dog that the average dog can dream, feel sad or glad, and recognize individual people; to call it an unconscious automaton is perverse.

THE MYTH OF LEARNING

At this point the humanist usually retreats to his strongest bastion: learning. The human, he says, is uniquely flexible in his behavior, adapting to skyscrapers, deserts, coal mines, and tundra with equal ease. That is because he learns far more than animals and relies on instincts far less. Learning how the world is rather than simply arriving in it with a fully formed program for survival is a superior strategy, but it demands a bigger brain. Therefore, the bigger brain of the human reflects a shift away from instinct and toward learning.

Like just about everybody else who has ever thought about these things, I found such logic impeccable until I read a chapter in a book called The Adapted Mind by Leda Cosmides and John Tooby of the University of California at Santa Barbara.3 They set out to challenge the conventional wisdom, which has dominated psychology and most other social sciences for many decades, that instinct and learning are opposite ends of a spectrum, that an animal that relies on instincts does not rely on learning and vice versa. This simply is not so. Learning implies plasticity, whereas instinct implies preparedness. So, for example, in learning the vocabulary of her native language, a child is almost infinitely plastic. She can learn that the word for a cow is vache or cow or any other word. And likewise, in knowing that she must blink or duck when a ball approaches her face at speed, a child needs no plasticity at all. To have to learn such a reflex would be painful. So the blink reflex is prepared, and the vocabulary store in her brain is plastic.

But she did not learn that she needed a vocabulary store. She was born with it and with an acute curiosity to learn the names of things. More than that, when she learned the word cup, she knew without being told that it was a general name for any whole cup, not its contents or its handle and not the specific cup she saw first, but the whole class of objects called cups. Without these two innate instincts, the “whole object assumption” and the “taxonomic assumption,” language would be a lot harder to learn. Children would often find themselves in the position of the apocryphal explorer who points at a never-before-seen animal and says to his local guide, “What’s that?” The guide replies, “Kangaroo,” which means in his language “I don’t know.”

In other words, it is hard to conceive how people can learn (be plastic) without sharing assumptions (being prepared). The old idea that plasticity and preparedness were opposites is plainly wrong. The psychologist William James argued a century ago that man had both more learning capacity and more instincts, rather than more learning and fewer instincts. He was ridiculed for this, but he was right.

Return to the example of language. The more scientists study language, the more they realize that hugely important aspects of it, such as grammar and the desire to speak in the first place, are not learned by imitation at all. Children simply develop language. Now this might seem crazy because a child reared in isolation would not, as James IV of Scotland reputedly hoped he might, simply grow up to speak Hebrew. How could he? Children must learn the vocabulary and the particular rules of inflection and syntax specific to their language. True, but almost all linguists now agree with Noam Chomsky that there is a “deep structure” that is universal to all languages and that is programmed into the brain rather than learned. Thus, the reason all grammars conform to a similar deep structure (for example, they use either word order or inflection to signify whether a noun is object or subject) is that all brains have the same “language organ.”

Children plainly have a language organ in their brains ready and waiting to apply the rules. They infer the basic rules of grammar without instruction, a task that has been shown to be beyond the power of a computer unless the computer has been endowed with some prior knowledge.

From about the age of one and a half until soon after puberty children have a fascination with learning a language and are capable of learning several languages far more easily than adults can. They learn to talk irrespective of how much encouragement they are given. Children do not have to be taught grammar, at least not of living languages that they hear spoken; they divine it. They are constantly generalizing the rules they have learned in defiance of the examples they hear (such as “persons gived” rather than “people gave”). They are learning to talk in the same way that they are learning to see, by adding the plasticity of vocabulary to the preparedness of a brain that insists on applying rules. The brain has to be taught that large animals with udders are called cows. But to see a cow standing in a field, the visual part of the brain applies a series of sophisticated mathematical filters to the image that it receives from the eye—all unconscious, innate, and unteachable. In the same way, the language part of the brain knows without being taught that the word for a large animal with an udder is likely to behave grammatically like other nouns and not like verbs.4

The point is that nothing could be more “instinctive” than the predisposition to learn a language. It is virtually unteachable. It is hard-wired. It is not learned. It is—horrid thought—genetically determined. And yet nothing could be more plastic than the vocabulary and syntax to which that predisposition applies itself. The ability to learn a language, like almost all the other human brain functions, is an instinct for learning.

If I am right and people are just animals with more than usually trainable instincts, then it might seem that I am excusing instinctive behavior. When a man kills another man or tries to seduce a woman, he is just being true to his nature. What a bleak, amoral message. Surely there is a more natural basis for morality in the human psyche than that? The centuries-old debate between the followers of Rousseau and Hobbes—whether we are corrupted noble savages or civilized brutes—has missed the point. We are instinctive brutes, and some of our instincts are unsavory. Of course some instincts are very much more moral, and the vast human capacity for altruism and generosity—the glue that has always held society together—is just as natural as any selfishness. Yet selfish instincts are there, too. Men are much more instinctively capable of murder and of sexual promiscuity than women, for example. But Hobbes’s vindication means nothing because instincts combine with learning. None of our instincts is inevitable; none is insuperable. Morality is never based upon nature. It never assumes that people are angels or that the things it asks human beings to do come naturally. “Thou shalt not kill” is not a gentle reminder but a fierce injunction to men to overcome any instincts they may have or face punishment.

NURTURE IS NOT NECESSARILY THE OPPOSITE OF NATURE

The Jamesian notion that man has instincts to learn things at a stroke demolishes the whole dichotomy of learning versus instinct, nature versus nurture, genes versus environment, human nature versus human culture, innate versus acquired, and all the dualisms that have plagued the study of the mind ever since René Descartes. For if the brain consists of evolved mechanisms highly specific and intricately designed but flexible in content, then it is impossible to use the fact that a behavior is flexible as an indication that it is “cultural.” The ability to use language is “genetic” in the sense that it is inherent in the genes’ instructions for putting together a human body to include a detailed language-acquisition device. It is also “cultural” in the sense that the vocabulary and syntax of the language are arbitrary and learned. It is also developmental in the sense that the language-acquisition device grows after birth and feeds off the examples it sees around it. Just because language is acquired after birth does not mean that it is cultural. Teeth are also acquired after birth.

“There is no more a gene for aggression than there is for wisdom teeth,” wrote Stephen Jay Gould, implying that behavior must be cultural and not “biological.”5 His facts are right, of course, but that is exactly why his implication is wrong. Wisdom teeth are not cultural artifacts; they are genetically determined even though they develop in late adolescence and even though there is not a single gene that says “grow wisdom teeth.” By the term “a gene for aggression,” Gould means that the difference between the aggressiveness of person A and person B would be due to a difference in gene X. But just as all sorts of environmental differences (such as nutrition and dentists) can cause A to have bigger wisdom teeth than B, so all sorts of genetic differences (affecting how the face grows, how the body absorbs calcium, how the sequence of teeth is ordered) can cause person A to have bigger wisdom teeth than person B. Exactly the same applies to aggression.

Somewhere in our education we unthinkingly absorb the idea that nature (genes) and nurture (environment) are opposites and that we must make a choice between them. If we choose environmentalism, then we are espousing a universal human nature that is as blank as a sheet of paper awaiting culture’s pen, that humans are therefore perfectible and born equal. If we choose genes, then we espouse irreversible genetic differences between races and between individuals. We are fatalists and elitists. Who would not hope with all her heart that the geneticists were wrong?

Robin Fox, an anthropologist who has called this dilemma a quarrel between original sin and the perfectibility of man, portrayed the dogma of environmentalism thus:

This Rousseauist tradition has a remarkably strong grip on the post-Renaissance occidental imagination. It is feared that without it we shall be prey to reactionary persuasion by assorted villains, from social Darwinists to eugenicists, fascists and new-right conservatives. To fend off this villainy, the argument goes, we must assert that man is either innately neutral (tabula rasa) or innately good and that bad circumstances are what make him behave wickedly.6

Although the notion of a tabula rasa goes back to John Locke, it was in this century that it reached the zenith of its intellectual hegemony. Reacting to the idiocies of social Darwinists and eugenicists, a series of thinkers first in sociology, then anthropology, and finally psychology shifted the burden of proof firmly away from nurture and onto nature. Until proved otherwise, man must be considered a creature of his culture, rather than culture a product of man’s nature.

Émile Durkheim, the founder of sociology, set out in 1895 his assertion that social science must assume people are blank slates on which culture writes. Since then, if anything, this idea has hardened into three cast-iron assumptions: First, anything that varies between cultures must be culturally rather than biologically acquired; second, anything that develops rather than appears fully formed at birth must also be learned; third, anything genetically determined must be inflexible. No wonder social science is irredeemably wedded to the notion that nothing in human behavior is “innate,” for things do vary greatly between cultures, do develop after birth, and are plainly flexible. Therefore, the mechanisms of the human mind cannot be innate. Everything must be cultural. The reason men find young women more sexually attractive than old women must be that their culture teaches them subtly to favor youth, not because their ancestors left more descendants if they had an innate preference for youth.7

Anthropology’s turn was next. With the publication of Margaret Mead’s Coming of Age in Samoa in 1928, the discipline was transformed. Mead asserted that sexual and cultural variety was effectively infinite and was therefore the product of nurture. She did little to prove nurture’s predominance—indeed, what empirical evidence Mead did adduce was largely, it now seems, wishful thinking8—but she shifted the burden of proof. Mainstream anthropology remains to this day committed to the view that there is only a blank human nature.9

Psychology’s conversion was more gradual. Freud believed in universal human mental attributes—such as the Oedipus complex. But his followers became obsessed with trying to explain everything according to individual early childhood influences, and Freudianism came to mean blaming one’s early nurture for one’s nature. Soon psychologists came to believe that even the mind of an adult was a general-purpose learning device. This approach reached its apogee in the behaviorism of B. F. Skinner. He argued that brains are simply devices for associating any cause with any effect.

By the 1950s, looking back at what Nazism had done in the name of nature, few biologists felt inclined to challenge what their human-science colleagues asserted. Yet uncomfortable facts were already appearing. Anthropologists had failed to find the diversity Mead had promised. Freudians had explained very little and altered even less by their appeals to early influences. Behaviorism could not account for the innate preferences of different species of animal to learn different things: Rats are better at running mazes than pigeons. Sociology’s inability to explain or rectify the causes of delinquency was an embarrassment. In the 1970s a few brave “sociobiologists” began to ask why, if other animals had evolved natures, humans would be exempt. They were vilified by the social science establishment and told to go back to ant-watching. Yet the question they had asked has not gone away.10

The principal reason for the hostility to sociobiology was that it seemed to justify prejudice. Yet this was simply a confusion. Genetic theories of racism, or classism or any kind of ism, have nothing in common with the notion that there is a universal, instinctive human nature. Indeed, they are fundamentally opposed because one believes in universals and the other in racial or class particulars. Genetic differences have been assumed just because genes are involved. Why should that be the case? Is it not possible that the genes of two individuals are identical? The logos painted on the tails of two Boeing 747s depend on the airlines that own them, but the tails beneath are essentially the same: They were made in the same factory of the same metal. You do not assume because they are owned by different airlines that they were made in different factories. Why, then, must we assume because there are differences between the speech of the French and the English that they must have brains that are not influenced by genes at all? Their brains are the products of genes—not different genes, the same genes. There is a universal human language-acquisition device, just as there is a universal human kidney and a universal 747 tail structure.

Think, too, of the totalitarian implications of pure environmentalism. Stephen Jay Gould once caricatured the views of genetic determinists in this way: “If we are programmed to be what we are, then these traits are ineluctable. We may, at best, channel them, but we cannot change them.”11 He meant genetically programmed, but the same logic applies with even more force to environmental programming. Some years later Gould wrote: “Cultural determinism can be just as cruel in attributing severe congenital diseases—autism, for example—to psychobabble about too much parental love, or too little.”12

If, indeed, we are the product of our nurture (and who can deny that many childhood influences are ineluctable—witness accent?), then we have been programmed by our various upbringings to be what we are and we cannot change it—rich man, poor man, beggar-man, thief. Environmental determinism of the sort most sociologists espouse is as cruel and horrific a creed as the biological determinism they attack. The truth is, fortunately, that we are an inextricable and flexible mixture of the two. To the extent that we are the product of the genes, they are all and always will be genes that develop and are calibrated by experience, as the eye learns to find edges or the mind learns its vocabulary. To the extent that we are products of the environment, it is an environment that our designed brains choose to learn from. We do not respond to the “royal jelly” that worker bees feed to certain grubs to turn them into queens. Nor does a bee learn that a mother’s smile is a cause for happiness.

THE MENTAL PROGRAM

When, in the 1980s, artificial intelligence researchers joined the ranks of those searching for the mechanism of mind, they, too, began with behaviorist assumptions: that the human brain, like a computer, was an association device. They quickly discovered that a computer was only as good as its programs. You would not dream of trying to use a computer as a word processor unless you had a word-processing program. In the same way, to make a computer capable of object recognition or motion perception or medical diagnosis or chess, you had to program it with “knowledge.” Even the “neural network” enthusiasts of the late 1980s quickly admitted that their claim to have found a general learning-by-association device was false: Neural networks depend crucially on being told what answer to reach or what pattern to find, or on being designed for a particular task, or on being given straightforward examples to learn from. The “connectionists,” who placed such high hopes in neural networks, had stumbled straight into the traps that had caught the behaviorists a generation earlier. Untrained connectionist networks proved incapable even of learning the past tense in English.13

The alternative to connectionism, and to behaviorism before it, was the “cognitive” approach, which set out to discover the mind’s internal mechanisms. This first flowered with Noam Chomsky’s assertions in Syntactic Structures, a book published in 1957, that general-purpose association-learning devices simply could not solve the problem of inferring the rules of grammar from speech.14 It needed a mechanism equipped with knowledge about what to look for. Linguists gradually came to accept Chomsky’s argument. Those studying human vision, meanwhile, found it fruitful to pursue the “computational” approach advocated by David Marr, a young British scientist at MIT. Marr and Tomaso Poggio systematically laid bare the mathematical tricks that the brain was using to recognize solid objects in the image formed in the eye. For example, the retina of the eye is wired in such a way as to be especially sensitive to edges between contrasting dark and light parts of an image; optical illusions prove that people use such edges to delineate the boundaries of objects. These and other mechanisms in the brain are “innate” and highly specific to their task, but they are probably perfected by exposure to examples. No general-purpose induction here.15

Almost every scientist who studies language or perception now admits that the brain is equipped with mechanisms, which it did not “learn” from the culture but developed with exposure to the world; these mechanisms specialize in interpreting the signals that are perceived. Tooby and Cosmides argue that “higher” mental mechanisms are the same. There are specialized mechanisms in the mind that are “designed” by evolution to recognize faces, read emotions, be generous to one’s children, fear snakes, be attracted to certain members of the opposite sex, infer mood, infer semantic meaning, acquire grammar, interpret social situations, perceive a suitable design of tool for a certain job, calculate social obligations, and so on. Each of these “modules” is equipped with some knowledge of the world necessary for doing such tasks, just as the human kidney is designed to filter the blood.

We have modules for learning to interpret facial expressions—parts of our brain learn that and nothing else. At ten weeks we assume that objects are solid, and therefore two objects cannot occupy the same space at the same time—an assumption that no amount of exposure to cartoon films will later undo. Babies express surprise when shown tricks that imply two objects can occupy the same place. At eighteen months babies assume there is no such thing as action at a distance—that object A cannot be moved by object B unless they touch. At the same age we show more interest in sorting tools according to their function than according to their color. And experiments show that, like cats, we assume any object capable of self-generated motion is an animal, which is something we only partially unlearn in our machine-infested world.16

That last is an example of how many of the instincts in our heads develop on the assumption that the world is that of the Pleistocene period, before cars. Infant New Yorkers find it far easier to acquire a fear of snakes than of cars, despite the far greater danger posed by the latter. Their brains are simply predisposed to fear snakes.

Fearing snakes and assuming that self-propelled motion is a sign of an animal are instincts that are probably as well developed in monkeys as in people. Nor is the unwillingness of adults to have sex with people with whom they have lived as children—the incest-avoidance instinct—peculiarly human. Lucy did not need a bigger brain for these things any more than a dog did.

The one thing Lucy did not need was to have to start from scratch and learn the world afresh every generation. Culture could not teach her to detect edges in the visual field; it did not teach her the rules of grammar. It could have taught her to fear snakes, but why bother? Why not let her be born with a fear of snakes? It is not obvious to somebody with an evolutionary perspective quite why we must consider learning so valuable. If learning really did replace instincts rather than enhance and train them, then we would spend half our lives relearning things that monkeys know automatically, such as the fact that unfaithful mates can cuckold you. Why bother to learn them? Why not allow the Baldwin effect to turn them into instincts and spend slightly less time going through the laborious business of adolescence? If a bat had to learn to use its sonar navigation from its parents, rather than simply developing the ability as it grew, or a cuckoo had to learn the way to Africa in winter, rather than “knowing” before setting off, then there would be a lot more dead bats and lost cuckoos every generation. Nature chooses to equip bats with echo-location instincts and cuckoos with migration instincts because it is more efficient than making them learn. True, we learn a lot more than bats and cuckoos do. We learn mathematics and a vocabulary of tens of thousands of words and what people’s characters are like. But this is because we have instincts to learn these things (with the possible exception of mathematics), not because we have fewer instincts than bats or cuckoos.

THE TOOLMAKER MYTH

Until the mid 1970s the question of why people needed big brains when other animals did not had only really been posed by the anthropologists and archaeologists who study the bones and tools of ancient human beings. Their answer, persuasively summarized by Kenneth Oakley in 1949 in a book called Man the Toolmaker, was that man was a tool user and toolmaker par excellence and that he developed a big brain for that purpose. Given the increasing sophistication of man’s tools throughout his history, and the sudden leaps of technical skill that seemed to accompany each change in skull size—from habilis to erectus, from erectus to sapiens, from Neanderthal to modern—this made some sense. But there were two problems with it. First, during the 1960s the ability of animals, especially chimps, to make and use tools was discovered, which rather took the shine off Homo habilis’s somewhat basic tool kit. Second, there was a suspicious bias about the argument. Archaeologists study stone tools because that is what they find preserved. An archaeologist of a million years in the future would call ours the concrete age, with some justice, but he might never even know about books, newspapers, television broadcasts, the clothes industry, the oil business, even the car industry—all traces of which would have rusted away. He might assume that our civilization was characterized by hand-to-hand combat by naked people over concrete citadels. Perhaps, in like fashion, the Neolithic age was distinguished from the Paleolithic not by its tool kit but by the invention of language or marriage or nepotism or some such unfossilizable signature. Wood probably loomed larger than stone in people’s lives, yet no wooden tools survive.17

Besides, the evidence from the tools, far from suggesting continuous human ingenuity, speaks of monumental and tedious conservatism. The first stone tools, the Oldowan technology of Homo habilis, which appeared about 2.5 million years ago in Ethiopia, were very simple indeed: roughly chipped rocks. They barely improved at all over the next million years, and far from experimenting, they became gradually more standardized. They were then replaced by the Acheulian technology of Homo erectus, which consisted of hand axes and teardrop-shaped stone devices. Again, nothing happened for a million years and more, until about 200,000 years ago when there was a sudden and dramatic expansion in the variety and virtuosity of tools at about the time that Homo sapiens appeared. From then on there was no looking back: Tools grew ever more varied and accomplished until the invention of metal. But that burst of invention comes far too late to explain big heads, which had already been swelling for nearly 3 million years.18

Making the tools that erectus used is not especially hard. Everybody could do it, presumably, which is why it was done all over Africa. There was no inventiveness or creativity going on. For a million years these people made the same dull hand axes, yet their brains were already grossly large by ape standards. Plainly, the instincts of manual dexterity, perception of shape, and reverse engineering from function to form were useful to these people, but it is highly implausible to account for the enlargement of the brain as driven entirely by an enlargement of these instincts.

The first rival to the toolmaking theory was “man the hunter.” In the 1960s, starting with the work of Raymond Dart, there was much interest in the notion that man was the only ape to have taken up a meat diet and hunting as a way of life. Hunting, went the logic, required forethought, cunning, coordination, and the ability to learn skills such as where to find game and how to get close to it. All true, all utterly banal. Anybody who has ever seen a film of lions hunting zebra on the Serengeti will know how skillful lions are at each of the tasks mentioned above. They stalk, ambush, cooperate, and deceive their prey as carefully as any group of humans ever could. Lions do not need vast brains, so why should we? The fashion for man the hunter gave way to woman the gatherer, but similar arguments applied. It is simply unnecessary to be capable of philosophy and language to be able to dig tubers from the ground. Baboons do it just as well as women.19

Nonetheless, one of the most startling things to come out of the great studies of the !Kung San people of the Kalahari Desert in the 1960s was the enormous accumulation of local lore that hunter-gatherer people possess—when and where to hunt for each kind of animal, how to read a spoor, where to find each kind of plant food, which kind of food is available after rain, which things are poisonous and which medicinal. Of the !Kung, Melvin Konner wrote, “Their knowledge of wild plants and animals is deep and thorough enough to astonish and inform professional botanists and zoologists.”20 Without this accumulated knowledge it would not have been possible for mankind to develop so rich and varied a diet, for the results of trial-and-error experiments would not have been cumulative but would have had to be relearned every generation. We would have been limited to fruit and antelope meat, not daring to try tubers, mushrooms, and the like. The astonishing symbiotic relationship between the African honey guide bird and people, in which the bird leads a man to a bees’ nest and then eats what remains of the honey when he leaves, depends on the fact that people know because they have been told that honey guides lead them to honey. To accumulate and pass on this store of knowledge required a large memory and a large capacity for language. Hence the need for a large brain.

The argument is sound enough, but once more it applies with equal force to every omnivore on the African plains. Baboons must know where to forage at what time and whether to eat centipedes and snakes. Chimpanzees actually seek out a special plant whose leaves can cure them of worm infections, and they have cultural traditions about how to crack nuts. Any animal whose generations overlap and which lives in groups can accumulate a store of knowledge of natural history that is passed on merely by imitation. The explanation fails the test of applying only to humans.21

THE BABY APE

The humanist might be feeling a little frustrated by this line of argument. After all, we have big brains and we use them. The fact that lions and baboons have small ones and get by does not mean that we are not helped by our brains. We get by rather better than lions and baboons. We have built cities, and they have not. We invented agriculture, and they did not. We colonized ice-age Europe, and they did not. We can live in the desert and the rain forest; they are stuck on the savanna. Yet the argument still has considerable force because big brains do not come free. In human beings, 18 percent of the energy that we consume every day is spent in running the brain. That is a mighty costly ornament to stick on top of the body just in case it helps you invent agriculture, just as sex was a mighty costly habit to indulge in merely in case it led to innovation (chapter 2). The human brain is almost as costly an invention as sex, which implies that its advantage must be as immediate and as large as sex’s was.

For this reason it is easy to reject the so-called neutral theory of the evolution of intelligence, which has been popularized in recent years mainly by Stephen Jay Gould.22 The key to his argument is the concept of “neoteny”—the retention of juvenile features into adult life. It is a commonplace of human evolution that the transition from Australopithecus to Homo and from Homo habilis to Homo erectus and thence to Homo sapiens all involved prolonging and slowing the development of the body so that it still looked like a baby when it was already mature. The relatively large brain case and small jaw, the slender limbs, the hairless skin, the unrotated big toe, the thin bones, even the external female genitalia—we look like baby apes.23

The skull of a baby chimpanzee looks much more like the skull of an adult human being than either the skull of an adult chimpanzee or the skull of a baby human being. Turning an ape-man into a man was a simple matter of changing the genes that affect the rate of development of adult characters, so that by the time we stop growing and start breeding, we still look rather like a baby. “Man is born and remains more immature and for a longer period than any other animal,” wrote Ashley Montagu in 1961.24

The evidence for neoteny is extensive. Human teeth erupt through the jaw in a set order: the first molar at the age of six, compared with three for a chimp. This pattern is a good indication of all sorts of other things because the teeth must come at just the right moment relative to the growth of the jaw. Holly Smith, an anthropologist at the University of Michigan, found in twenty-one species of primate a close correlation between the age at which the first molar erupted and body weight, length of gestation, age at weaning, birth interval, sexual maturity, life span, and especially brain size. Because she knew the brain size of fossil hominids, she was able to predict that Lucy would have erupted her first molar at three and lived to forty, much like chimpanzees, whereas the average Homo erectus would have erupted his at nearly five and lived to fifty-two.25

Neoteny is not confined to man. It is also a characteristic of several kinds of domestic animals, especially dogs. Some dogs are sexually mature when they are still stuck in an early phase of wolf development: They have short snouts, floppy ears, and the sort of behavior that wolf pups show—retrieving, for example. Others, such as sheepdogs, are stuck at a different phase: longer snouts, half-cocked ears, and chasing. Still others, such as German shepherds, have the full range of wolf hunting and attacking behaviors plus long snouts and cocked ears.26

But whereas dogs are truly neotenic, breeding at a young age and looking like wolf puppies, humans are peculiar. They look like infant apes, true, but they breed at an advanced age. The combination of a slow change in the shape of their head and a long period of youthfulness means that as adults they have astonishingly large brains for an ape. Indeed, the mechanism by which ape-men turned into men was clearly a genetic switch that simply slowed the developmental clock. Stephen Jay Gould argues that rather than seek an adaptive explanation of features like language, perhaps we should simply regard them as “accidental,” though useful, by-products of neoteny’s achievement of large brain size. If something as spectacular as language can be the product of simply a large brain plus culture, then there need be no specific explanation of why larger brains are required because their advantages are obvious.27

The argument is based on a false premise. As Chomsky and others have amply demonstrated, language is one of the most highly designed capabilities imaginable, and far from being a by-product of a big brain, it is a mechanism with a very specific pattern that develops in children without instruction. It also has obvious evolutionary advantages, as a moment’s reflection will reveal. Without, for example, the trick of recursion (subordinate phrases) it becomes impossible to tell even the simplest story. In the words of Steven Pinker and Paul Bloom, “It makes a big difference whether a far-off region is reached by taking the trail that is in front of the large tree or the trail that the large tree is in front of. It makes a difference whether that region has animals that you can eat or animals that can eat you.” Recursion could easily have helped a Pleistocene man survive or breed. Language, conclude Pinker and Bloom, “is a design imposed on neural circuitry as a response to evolutionary pressure.”28 It is not the whirring by-product of the mental machine.

The neoteny argument does have one advantage: It shows a possible reason why apes and baboons did not follow man down the path to ever bigger brains. It is possible that the neoteny mutation simply never arose in our primate cousins. Or, more intriguingly, as I shall explain later, the mutation may have arisen but never had a reason to spread.

GOSSIP’S GRIP

Those outside anthropology had never paid much obeisance to man the toolmaker or any other explanation for intelligence. For most people, the advantages of intelligence were obvious. It led to more learning and less instinct, which meant that behavior could be more flexible, which was rewarded by evolution. We have already seen how shot full of holes this argument is. Learning is a burden on the individual, in place of flexible instincts, and the two are not opposites in any case. Mankind is not the learning ape; he is the clever ape, with more instincts and more openness to experience. Not having seen this flaw in the logic, the disciplines that considered such matters, especially philosophy, always showed a strange lack of curiosity about the whole question of intelligence. Philosophers assume that intelligence and consciousness have obvious advantages and get on with the serious debate about what consciousness is. Before the 1970s there was very little evidence that any of them had even posed the obvious evolutionary question: Why is intelligence a good thing?

So the force with which the question was suddenly put in 1975 by two zoologists working independently had an enormous impact. Richard Alexander of the University of Michigan was one. In the tradition of the Red Queen, he expressed skepticism about whether what Charles Darwin had called “the hostile forces of nature” were a sufficiently challenging adversary for an intelligent mind. The point is that the challenges presented by stone tools or tubers are mostly predictable ones. Generation after generation of chipping a tool off a block of stone or knowing where to look for tubers calls for the same level of skill each time. With experience each gets easier. It is rather like learning to ride a bicycle; once you know how to do it, it comes naturally. Indeed, it becomes “unconscious,” as if conscious effort were simply not needed every time. Likewise, Homo erectus did not need consciousness to know that you should stalk zebras upwind every time lest they scent you or that tubers grow beneath certain trees. It came as naturally to him as riding a bike does to us. Imagine playing chess against a computer that has only one opening gambit. It might be a good opening gambit, but once you know how to beat it, you can play the same response yourself, game after game. Of course, the whole point of chess is that your opponent can select one of many different ways to respond to each move you make.

It was logic like this that led Alexander to propose that the key feature of the human environment that rewarded intelligence was the presence of other human beings. Generation after generation, if your lineage is getting more intelligent, so is theirs. However fast you run, you stay in the same place relative to them. Humans became ecologically dominant by virtue of their technical skills, and that made humans the only enemy of humans (apart from parasites). “Only humans themselves could provide the necessary challenge to explain their own evolution,” wrote Alexander.29

True enough, but Scottish midges and African elephants are “ecologically dominant” in the sense that they outnumber or outrank all potential enemies, yet neither has seen the need to develop the ability to understand the theory of relativity. In any case, where is the evidence that Lucy was ecologically dominant? By all accounts her species was an insignificant part of the fauna of the dry, wooded savanna where she lived.30

Independently, Nicholas Humphrey, a young Cambridge zoologist, came to a conclusion similar to Alexander’s. Humphrey began an essay on the topic with the story of how Henry Ford once asked his representatives to find out which parts of the Model T never went wrong. They came back with the answer that the kingpin had never gone wrong; so Ford ordered it made to an inferior specification to save money. “Nature,” wrote Humphrey, “is surely at least as careful an economist as Henry Ford.”31

Intelligence must therefore have a purpose; it cannot be an expensive luxury. Defining intelligence as the ability to “modify behavior on the basis of valid inference from evidence,” Humphrey argued that the use of intelligence for practical invention was an easily demolished straw man. “Paradoxically, subsistence technology, rather than requiring intelligence, may actually become a substitute for it.” The gorilla, Humphrey noted, is intelligent as animals go, yet it leads the most technically undemanding life imaginable. It eats the leaves that grow abundantly all around it. But the gorilla’s life is dominated by social problems. The vast majority of its intellectual effort is expended on dominating, submitting to, reading the mood of, and affecting the lives of other gorillas.

Likewise, Robinson Crusoe’s life on the desert island was technically fairly straightforward, says Humphrey. “It was the arrival of Man Friday on the scene that really made things difficult for Crusoe.” Humphrey suggested that mankind uses his intellect mainly in social situations. “The game of social plot and counterplot cannot be played merely on the basis of accumulated knowledge, any more than a game of chess can.” A person must calculate the consequences of his own behavior and calculate the likely behavior of others. For that he needs at least a glimpse of his own motives in order to guess the things that are going through others’ minds in similar situations, and it was this need for self-knowledge that drove the increase in conscious awareness.32

As Horace Barlow of Cambridge University has pointed out, the things of which we are conscious are mostly the mental events that concern social actions: We remain unconscious of how we see, walk, hit a tennis ball, or write a word. Like a military hierarchy, consciousness operates on a “need to know” policy. “I can think of no exception to the rule that one is conscious of what it is possible to report to others and not conscious of what it is not possible to report.”33 John Crook, a psychologist with a special interest in Eastern philosophy, has made much the same point: “Attention therefore moves cognition into awareness, where it becomes subject to verbal formulation and reporting to others.”34

What Humphrey and Alexander described was essentially a Red Queen chess game. The faster mankind ran—the more intelligent he became—the more he stayed in the same place because the people over whom he sought psychological dominion were his own relatives, the descendants of the more intelligent people from previous generations. As Pinker and Bloom put it, “Interacting with an organism of approximately equal mental abilities whose motives are at times outright [sic] malevolent makes formidable and ever-escalating demands on cognition.”35 If Tooby and Cosmides are right about mental modules, among the modules that were selected to increase in size by this intellectual chess tournament was the “theory of mind” module, the one that enables us to form an opinion about one another’s thoughts, together with the means to express our own thoughts through the language modules.36 There is plenty of good evidence for this idea when you look about you. Gossip is one of the most universal of human habits. No conversation between people who know each other well—fellow employees, fellow family members, old friends—ever lingers for long on any topic other than the behavior, ambitions, motives, frailties, and affairs of other absent—or present—members of the group. That is the reason the soap opera is the quintessentially effective way to entertain people.37 Nor is this a Western habit. Konner wrote of his experience with !Kung San tribesmen:

After two years with the San, I came to think of the Pleistocene epoch of human history (the 3 million years during which we evolved) as one interminable marathon encounter group. When we slept in a grass hut in one of their villages, there were many nights when its flimsy walls leaked charged exchanges from the circle around the fire, frank expressions of feeling and contention beginning when the dusk fires were lit and running on until dawn.38

Virtually all novels and plays are about the same subject, even when disguised as history or adventure. If you want to understand human motives, read Proust or Trollope or Tom Wolfe, not Freud or Piaget or Skinner. We are obsessed with one another’s minds. “Our intuitive commonsense psychology far surpasses any scientific psychology in scope and accuracy,” wrote Don Symons.39 Horace Barlow points out that great literary minds are, almost by definition, great mind-reading minds. Shakespeare was a far better psychologist than Freud, and Jane Austen a far better sociologist than Durkheim. We are clever because we are—and to the extent that we are—natural psychologists.40

Indeed, novelists themselves saw this first. In Felix Holt, the Radical, George Eliot gives a concise summary of the Alexander-Humphrey theory:

Fancy what a game of chess would be if all the chessmen had passions and intellects, more or less small and cunning; if you were not only uncertain about your adversary’s men, but a little uncertain also about your own…. You would be especially likely to be beaten, if you depended arrogantly on your mathematical imagination, and regarded your passionate pieces with contempt. Yet this imaginary chess is easy compared with a game a man has to play against his fellowmen with other fellowmen for instruments.

The Alexander-Humphrey theory, which is widely known as the Machiavellian hypothesis,41 sounds rather obvious, but it could never have been proposed in the 1960s before the “selfish” revolution in the study of behavior or by anybody steeped in the ways of social science, for it requires a cynical view of animal communication. Until the mid 1970s zoologists thought of communication in terms of information transfer: It was in the interests of both the communicator and the recipient that the message be clear, honest, and informative. But as Lord Macaulay put it,42 “The object of oratory alone is not truth but persuasion.” In 1978, Richard Dawkins and John Krebs pointed out that animals use communication principally to manipulate one another rather than to transfer information. A bird sings long and eloquently to persuade a female to mate with him or a rival to keep clear of his territory. If he were merely passing on information, he need not make the song so elaborate. Animal communication, said Dawkins and Krebs, is more like human advertising than like airline timetables. Even the most mutually beneficial communication, like that between a mother and a baby, is pure manipulation, as every mother who has been woken in the night by a desperate-sounding infant who merely wants company knows. Once scientists had begun thinking in this way, they looked at animal social life in an entirely new light.43

One of the most striking pieces of evidence for deception’s role in communication comes from experiments that Leda Cosmides did when at Stanford University and that Gerd Gigerenzer and his colleagues did at Salzburg University. There is a simple logical puzzle called the Wason test, which people are bafflingly bad at. It consists of four cards placed on the table. Each card has a letter on one side and a number on the other. At present the cards read as follows: D, F, 3, 7. Your task is to turn over only those cards that you need to in order to prove the following rule to be true or false: If a card has a D on one side, then it has a 3 on the other.

When presented with this test, less than one-quarter of Stanford students got it right, an average performance. (The right answer, by the way, is D and 7.) But it has been known for years that people are much better at the Wason test if it is presented differently. For example, the problem can be set as follows: “You are a bouncer in a Boston bar, and you will lose your job unless you enforce the following law: If a person is drinking beer, then he must be over twenty years old.” The cards now read: “drinking beer, drinking Coke, twenty-five years old, sixteen years old.” Now three-quarters of the students get the right answer: Turn over the cards marked “drinking beer” and “sixteen years old.” But the problem is logically identical to the first one. Perhaps the more familiar context of the Boston bar is what helps people do better, but other equally familiar examples elicit poor performance. The secret of why some Wason tests are easier than others has proved to be one of psychology’s enduring enigmas.
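
For readers who want the logic spelled out, here is a minimal sketch in Python (mine, not the book’s) that brute-forces the original version of the test: for each visible face it asks whether any possible hidden face could break the rule “if a card has a D on one side, then it has a 3 on the other.” Only such cards are worth turning.

    # A brute-force check of the Wason test described above (illustrative only).
    LETTERS = {"D", "F"}   # possible letter faces
    NUMBERS = {"3", "7"}   # possible number faces

    def must_turn(visible):
        """A card needs turning only if some hidden face could break the rule."""
        for hidden in (NUMBERS if visible in LETTERS else LETTERS):
            letter = visible if visible in LETTERS else hidden
            number = visible if visible in NUMBERS else hidden
            if letter == "D" and number != "3":  # the only way the rule can fail
                return True
        return False

    for card in ["D", "F", "3", "7"]:
        print(card, "must be turned" if must_turn(card) else "is irrelevant")
    # Prints that D and 7 must be turned; F and 3 are irrelevant.

Exactly the same check, relabeled with drinkers and ages, picks out “drinking beer” and “sixteen years old” in the bar version; the two problems are one and the same, which is what makes the difference in human performance so striking.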

Cosmides and Gigerenzer have solved the enigma. If the law to be enforced is not a social contract, the problem is difficult—however simple its logic; but if it is a social contract, like the beer-drinking example, then it is easy. In one of Gigerenzer’s experiments, people were good at enforcing the rule “If you take a pension, then you must have worked here ten years” by wanting to know what was on the back of the cards “worked here eight years” and “got a pension”—so long as they were told they were the employer. But if told they were an employee and given the same rule, they turned over the cards “worked here for twelve years” and “did not get a pension,” as if looking for cheating employers—even though the logic clearly implies that cheating employers are not infringing the rule.

Through a long series of experiments Cosmides and Gigerenzer proved that people are simply not treating the puzzles as pieces of logic at all. They are treating them as social contracts and looking for cheats. The human mind may not be much suited to logic at all, they conclude, but is well suited to judging the fairness of social bargains and the sincerity of social offers. It is a mistrustful Machiavellian world.44

Richard Byrne and Andrew Whiten of the University of St. Andrews studied baboons in East Africa and witnessed an incident in which Paul, a young baboon, saw an adult female, Mel, find a large root. He looked around and then gave a sharp cry. The call summoned Paul’s mother, who “assumed” that Mel had just stolen the food from her young or threatened him in some way, and chased Mel away. Paul ate the root. This piece of social manipulation by the young baboon required some intelligence: a knowledge that his call would bring his mother, a guess at what the mother would “assume” had happened, and a prediction that it would lead to Paul’s getting the food. He was also using intelligence to deceive. Byrne and Whiten went on to suggest that the habit of calculated deception is common in humans, occasional in chimpanzees, rare in baboons, and virtually unknown in other animals. Deceiving and detecting deception would then be the primary reason for intelligence. They suggest that the great apes acquired a unique ability to imagine alternative possible worlds as a means to deception.45

Robert Trivers has argued that to deceive others well, an animal must deceive itself, and that self-deception’s hallmark is a biased system of transfer from the conscious to the unconscious mind. Deception is therefore the reason for the invention of the subconscious.46

Yet Byrne and Whiten’s account of the baboon incident goes right to the heart of what is wrong with the Machiavellian theory: the theory applies to every social species. For example, if you read any stories of life in a chimpanzee troop, the “plot” has a painful predictability about it to human ears. In Jane Goodall’s account of the career of the successful male Goblin, we watch Goblin’s precocious and confident rise in the hierarchy as he challenges and defeats first each of the females in the troop and then, one by one, the males: Humphrey, Jomeo, Sherry, Satan, and Evered:

Only Figan [the alpha male] was exempt. Indeed, it was his relationship with Figan that enabled him to challenge these older and more experienced males: He almost never did so unless Figan was nearby.

[To the human reader what comes next is startlingly obvious.]

For some time we had been expecting Goblin to turn on Figan. Indeed, I am still puzzled as to why Figan, so socially adroit in all other ways, had not been able to predict the inevitable outcome of his sponsorship of Goblin.47

The plot has a few twists, but we are not surprised; Figan is soon toppled. Machiavelli at least warned his Prince to watch his back. Brutus and Cassius took great care to conceal their plot from Julius Caesar; they could never have pulled off the assassination if their ambition had been as obvious as Goblin’s. Not even the most power-blinded human dictator is taken by surprise as Figan was. Of course, that only proves that people are cleverer than chimpanzees, which is no great surprise, but it starkly poses the question: Why? If Figan had had a bigger brain, he might have seen what was coming. So the evolutionary pressure that Nick Humphrey identified—to get better and better at solving social puzzles, reading minds, and predicting reactions—is all there in the chimps and baboons, too. As Geoffrey Miller, a psychologist at Stanford University, has put it, “All apes and monkeys show complex behavior replete with communication, manipulation, deception, and long-term relationships; selection for Machiavellian intelligence based on such social complexities should again predict much larger brains in other apes and monkeys than we observe.”48

There have been several answers to this puzzle, none of which is entirely convincing. The first is Humphrey’s own answer: that human society is more complex than ape society because it needs a “polytechnic school” in which young people can learn the practical skills of their species. This seems to me merely a retreat to the toolmaker theory. The second is the suggestion that alliance building among unrelated individuals is a key to success in human beings and that this complication vastly increases the rewards of intellect. To which comes the response: What about dolphins? There is growing evidence that dolphin society is based on shifting alliances of males and of females. Richard Connor, for example, observed a pair of males that came across a small group of other males that had kidnapped a fertile female from her group. Instead of fighting them for the female, the pair went away, found some allies, came back, and with superior numbers stole the female from the first group.49 Even in chimps the rise of a male to the alpha position and his tenure there are determined by his ability to command the loyalty of allies.50 So the alliance theory once more seems too general to explain the sudden increase in human intelligence. Moreover, like most of these theories, it explains language, tactical thinking, social exchange, and the like, but it does not explain some of the things to which human beings devote much of their mental energy: music and humor, for example.

WITTINESS AND SEXINESS

At least the Machiavelli theory proposes an adversary for the human brain that is its equal, however clever it gets. Few of my readers will need reminding of the ruthlessness that human beings can show when in pursuit of self-interest. There is no such thing as being clever enough, just as there is no such thing as being good enough at chess. Either you win or you do not. If winning pits you against a better opponent, as it does in the evolutionary tournament generation after generation, then the pressure to get better and better never lets up. The way the brains of human beings have gotten bigger at an accelerating pace implies that some such within-species arms race is at work.

So argues Geoffrey Miller. After laying bare the inadequacies of the conventional theories about intelligence, he takes a surprising turn.

I suggest that the neocortex is not primarily or exclusively a device for toolmaking, bipedal walking, fire-using, warfare, hunting, gathering, or avoiding savanna predators. None of these postulated functions alone can explain its explosive development in our lineage and not in other closely related species…. The neocortex is largely a courtship device to attract and retain sexual mates: Its specific evolutionary function is to stimulate and entertain other people, and to assess the stimulation attempts of others.51

The only way, he suggests, that sufficient evolutionary pressure could suddenly and capriciously be sustained in one species to enlarge an organ far beyond its normal size is sexual selection. “Just as the peahen is satisfied with nothing less than a visually brilliant display of peacock plumage, I postulate that hominid males and females became satisfied with nothing less than psychologically brilliant, fascinating, articulate, entertaining companions.” Miller’s use of the peacock is deliberate. Wherever else in the animal kingdom we find greatly exaggerated and enlarged ornaments, we have been able to explain them by the runaway, sexy-son, Fisher effect of intense sexual selection (or the equally powerful Good-genes effect, as described in chapter 5). Sexual selection, as we have seen, is very different from natural selection in its effects, for it does not solve survival problems, it makes them worse. Female choice causes peacocks’ tails to grow longer until they become a burden—then demands that they grow longer still. Miller used the wrong word: Peahens are never satisfied. And so, having found a force that produces exponential change in ornaments, it seems perverse not to consider it when trying to explain the exponential expansion of the brain.
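Fisher’s runaway can be caricatured in a toy simulation. The sketch below is my own, with every parameter invented; it is meant only to display the mechanism, not to model any real species. Each individual carries a gene for an ornament, expressed only in males and costly to survive with, and a gene for a preference, expressed only in females; both genes pass to offspring of both sexes, so choosy mating yokes them together statistically, and, given a small initial bias in preference, the ornament is dragged upward despite its cost.

```python
# A toy simulation (my own sketch; all parameters invented) of Fisher's
# runaway process. Each individual is a pair of genes: an ornament,
# expressed only in males and costly to survival, and a preference,
# expressed only in females. Both genes are inherited by both sexes.

import math
import random

POP, GENS, COST = 600, 100, 0.02

def offspring(mum, dad):
    """Midparent inheritance of (ornament, preference), plus mutation."""
    return tuple((a + b) / 2 + random.gauss(0, 0.05) for a, b in zip(mum, dad))

# Ornaments start near zero; preference starts with a small positive bias,
# the spark that Fisher's process needs to ignite.
males = [(random.gauss(0, 0.1), random.gauss(0.3, 0.1)) for _ in range(POP // 2)]
females = [(random.gauss(0, 0.1), random.gauss(0.3, 0.1)) for _ in range(POP // 2)]

for gen in range(GENS):
    # Viability selection: a large ornament is a survival handicap.
    survivors = [m for m in males if random.random() < math.exp(-COST * m[0] ** 2)]
    if not survivors:
        break
    next_m, next_f = [], []
    for mum in females:
        # Mate choice: weight each surviving male by exp(preference * ornament).
        weights = [math.exp(max(min(mum[1] * m[0], 50), -50)) for m in survivors]
        dad = random.choices(survivors, weights=weights)[0]
        for _ in range(2):  # two offspring keep the population roughly stable
            (next_m if random.random() < 0.5 else next_f).append(offspring(mum, dad))
    if not next_m or not next_f:
        break
    males, females = next_m, next_f
    if gen % 20 == 0:
        orn = sum(m[0] for m in males) / len(males)
        pref = sum(f[1] for f in females) / len(females)
        print(f"gen {gen:3d}: mean ornament {orn:5.2f}, mean preference {pref:5.2f}")
```

Whether the escalation runs on indefinitely or settles at an equilibrium depends delicately on the cost and mutation parameters, which is true of Fisher’s process in general; the point of the sketch is only that female whim, once yoked to male display, generates its own momentum.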

Miller adduces some circumstantial evidence for his view. Surveys consistently place intelligence, sense of humor, creativity, and interesting personality above even such things as wealth and beauty in lists of desirable characteristics in both sexes.52 Yet these characteristics fail entirely to predict youth, status, fertility, or parental ability, so evolutionists tend to ignore them—but there they are, right at the top of the list. A peacock’s tail is no guide to his ability as a father, yet despotic fashion punishes peahens that cease to respect it; in the same way, Miller suggests, men and women dare not step off the treadmill of selecting the wittiest, most creative, and most articulate person available with whom to mate. (Note that conventional “intelligence” as measured by examinations is not what he is talking about.)

Likewise, the manner in which sexual selection capriciously seizes upon preexisting perceptual biases fits with the fact that apes are by nature “curious, playful, easily bored, and appreciative of stimulation.” Miller suggests that to keep a husband around long enough to help in raising children, women would have needed to be as varied and creative in their behavior as possible, which he calls the Scheherazade effect after the Arabian storyteller who entranced the Sultan with 1,001 tales so that he did not abandon her (and execute her) for another courtesan. The same would have applied to males who wanted to attract females, which Miller calls the Dionysus effect after the Greek god of dance, music, intoxication, and seduction. He might also have called it the Mick Jagger effect; Miller admitted to me one day that he could not understand what made strutting, middle-aged rock stars so attractive to women. In this respect Don Symons noted that tribal chiefs are both gifted orators and highly polygamous men.53

Miller notes that the bigger the brain became, the more necessary long-term pair bonds were. A human infant is born helpless and premature. If it were as advanced at birth as an ape, it would spend twenty-one months in the womb.54 But the human pelvis is simply incapable of bearing a child with a head that big, so the child is born at nine months and treated like a helpless, external fetus for the next year, not even beginning to walk until it reaches the age at which, by ape standards, it should have entered the world. This helplessness further increases the pressure on women to keep men around to help feed them while they are encumbered with a child—the Scheherazade effect.

Miller finds that the most commonly voiced objection to the Scheherazade effect is that most people are not witty and creative but are dull and predictable. True enough, but compared to what? Our standards for what counts as entertaining have, if Miller is right, evolved as fast as our wit. “I think male readers may find it hard to imagine some four-foot-tall, half-hairy, flat-chested hominid females being sexier than similar hominids,” wrote Miller in a letter to me (referring to “Lucy”). “We’re spoiled because sexual selection has already driven us so far that it’s hard to appreciate how any point we’ve passed could have been considered an improvement. We are positively turned off by traits that half a million years ago would have been considered irresistibly sexy.”55

Miller’s theory draws attention to several facts that have remained unexplained in other theories, namely that dance, music, humor, and sexual foreplay are all features unique to human beings. Following the Tooby-Cosmides logic, we cannot argue that these are mere cultural habits foisted on us by “society.” Plainly a desire to hear rhythmic tunes or to be made to laugh by wit develops innately. Following Miller we note that they are characterized by obsessions with novelty and virtuosity and much practiced by the young. From Beatlemania to Madonna (and back again to Orpheus), the sexual fascination of youth with musical creativity has been obvious. It is a human universal.

It is crucial for Miller’s theory that human beings are especially selective about their mates. Indeed, among apes, people are unique in that both sexes are extremely choosy. A gorilla female is happy to be mated by whichever male “owns” her harem. A gorilla male will mate with any estrous female he can find. A chimp female is keen to mate with many different males in the troop. A chimp male will mate with any female in season. But women are highly selective about the men with whom they mate. So indeed are men. True, they are easily persuaded to go to bed with beautiful young women—but that is exactly the point. Most women are neither young nor beautiful, nor are they trying to seduce strange men. It is hard to overemphasize how unusual humans are in this respect. Males in some monogamous bird species such as pigeons and doves56 do take care to select a female carefully, but in many other birds the males are happy to have a fling with any passing female, as the evidence of sperm competition theory has demonstrated (chapter 7). Although he may crave variety more than women do, man is, as males go, highly sexually selective.

Selectivity by one or the other sex is the prerequisite of sexual selection. And as I have argued in previous chapters, it is more than that. It is the almost invariant predictor of sexual selection. Fisher’s runaway process for sexy sons and Zahavi-Hamilton’s Good-genes effect simply cannot be avoided once one or the other sex is being selective. So we should actually expect some exaggeration of some feature or other in man as a simple consequence of sexual selection.57

Incidentally, Miller’s argument draws attention to a little-appreciated aspect of sexual selection: It can affect both the selected sex and the selector. For example, among American blackbirds, those species in which the female is large are also the species in which the male is much larger. The same is true of many mammals and birds. Among grouse, pheasants, seals, and deer, a greater ratio between male and female size occurs in the larger species. A recent analysis of this effect concludes that it is caused by sexual selection: The more polygamous the species, the greater the premium on large size in males; and the more males are selected for large size, the more they inevitably leave large-size genes to their daughters as well as their sons. Genes can be “sex-linked,” but usually only imperfectly or when there is a strong disadvantage to a daughter’s inheriting the effect—as in the case of female birds and gaudy colors. Thus, sexual selection by males of females for large brains would result in larger brains for both sexes.58
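The arithmetic behind this last point is worth a glance. Under the standard breeder’s equation of quantitative genetics, the per-generation response to selection is the heritability times the selection differential, halved when selection acts through one sex only; and if the gene is autosomal and not sex-limited, daughters shift just as far as sons. The figures in this sketch are invented purely for illustration.

```python
# A back-of-envelope sketch (numbers invented) of selection on one sex
# dragging the other along. The breeder's equation gives the response to
# selection as R = h^2 * S; when selection acts through fathers only,
# the response is halved. If the gene is autosomal and not sex-limited,
# daughters' means shift exactly as far as sons'.

h2 = 0.4            # assumed heritability of brain size
S = 10.0            # assumed selection differential among mating males (grams)
male_mean = female_mean = 500.0   # hypothetical starting means (grams)

for gen in range(1, 11):
    response = 0.5 * h2 * S       # selection through one sex only
    male_mean += response
    female_mean += response       # daughters inherit the selected genes too
    print(f"generation {gen:2d}: males {male_mean:.0f} g, females {female_mean:.0f} g")
```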

OBSESSED WITH YOUTH

I believe that Miller’s tale deserves a special twist from the neoteny theory (although he is not convinced). The neoteny theory is well established among anthropologists. And the notion of human monogamous child rearing is well established among sociobiologists. Nobody has yet put the two together. If men began selecting mates who appeared youthful, then any gene that slowed the rate of development of adult characteristics in a woman would make her more attractive at a given age than a rival. Consequently, she would leave more descendants, who would inherit the same gene. Any neoteny gene would give the appearance of youthfulness. Neoteny, in other words, could be a consequence of sexual selection, and since neoteny is credited with increasing our intelligence (by enlarging the brain size at adulthood), it is to sexual selection that we should attribute our great intelligence.

The idea is hard to grasp at first, so a thought experiment may help. Imagine two primeval women: One develops at the normal rate, and the other has an extra neoteny gene so that she is hairless of body, large-brained, small-jawed, late maturing, and long-lived. At the age of twenty-five, both are widowed; each has had one child by her first husband. The men in the tribe have a preference for young women and twenty-five is not young, so neither stands much chance of getting a second husband. But there is one man who cannot find a wife. Given the alternatives, he chooses the younger-looking woman. She goes on to have three more children while her rival barely manages to rear the one she already had.

The details of the story do not matter. The point is that once males prefer youth, a gene for delaying the signs of aging would generally prosper at the expense of a normal gene, and a neoteny gene does exactly that. The gene would probably make the woman’s sons appear neotenized as well as her daughters, for there is no reason that it should be specific to the female sex in its effects. The whole species would be driven into neoteny.
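A back-of-envelope calculation shows how little advantage the gene would need. The sketch below uses the standard one-locus selection recursion, in a haploid caricature to keep the bookkeeping simple; the 5 percent reproductive edge is invented, and a much smaller edge would carry the gene to fixation just as surely, only more slowly.

```python
# A one-locus caricature (haploid for simplicity; the selection
# coefficient is invented) of the thought experiment: if a neoteny
# allele buys its carriers even a small extra chance of late-life
# remarriage and further children, it grinds its way from rarity to
# near fixation in a couple of hundred generations.

s = 0.05    # assumed 5 percent reproductive advantage for carriers
p = 0.01    # the allele starts rare

gen = 0
while p < 0.99:
    # standard selection recursion: carriers out-reproduce others by (1 + s)
    p = p * (1 + s) / (p * (1 + s) + (1 - p))
    gen += 1
    if gen % 50 == 0:
        print(f"generation {gen:3d}: allele frequency {p:.3f}")
print(f"near fixation after {gen} generations")
```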

Christopher Badcock, a sociologist at the London School of Economics who unusually combines an interest in evolution and an interest in Freud, has proposed a similar idea. He suggested that neotenic (or, as he calls it, “paedomorphic”) traits were favored by female choice rather than male choice. Younger males, he suggests, made more cooperative hunters, and therefore females who wanted meat picked younger-looking men. The principle is the same: Neotenic development is a consequence of a preference for it in one sex.59

This is not to deny that bigger brains themselves brought advantages in Machiavellian intelligence or language or seductiveness. Indeed, once these advantages became clear, men who were especially fussy about picking youthful-looking women would be most successful because they sometimes picked neotenic, big-brained women and therefore had more intelligent children. But it does suggest an escape from the question Why did it not happen to baboons?

However, Miller’s sexual selection idea suffers from a near-fatal flaw. Remember that it presupposes sexual choosiness by one or the other sex. But what caused that choosiness? Presumably the cause was the fact that men took part in parental care, which gave women an incentive to confine probable paternity to one man and gave men an incentive to enter into a long-term relationship as long as they could be certain of paternity. Why, then, did men take part in parental care? Because by doing so they could increase the chances of rearing a child more than by seeking new partners. The reason for this was that children, unusually among ape infants, took a long time to mature, and men could help their wives during child rearing by hunting meat for them. Why did children take a long time to mature? Because they had big heads! The argument is circular.

That may not be fatal to it. Some of the best arguments, such as Fisher’s theory of runaway sexual selection, are circular. The relationship between chickens and eggs is circular. Miller is actually rather proud of the theory’s circularity because he believes we have learned from computer simulation that evolution is a process which pulls itself up by its bootstraps. There is no single cause and effect because effects can reinforce causes. If a bird finds itself to be good at cracking seeds, then it specializes in cracking seeds, which puts further pressure on its seed-cracking ability to evolve. Evolution is circular.

STALE MATE

It is a disquieting thought that our heads contain a neurological version of a peacock’s tail—an ornament designed for sexual display whose virtuosity at everything from calculus to sculpture is perhaps just a side effect of the ability to charm. Disquieting and yet not altogether convincing. The sexual selection of the human mind is the most speculative and fragile of the many evolutionary theories discussed in this book, but it is also very much in the same vein as the others. I began this book by asking why all human beings were so similar and yet so different, suggesting that the answer lay in the unique alchemy of sex. An individual is unique because of the genetic variety that sexual reproduction generates in its perpetual chess tournament with disease. An individual is a member of a homogeneous species because of the incessant mixing of that variety in the pool of fellow human beings’ genes. And I end with one of the strangest of the consequences of sex: that the choosiness of human beings in picking their mates has driven the human mind into a history of frenzied expansion for no reason except that wit, virtuosity, inventiveness, and individuality turn other people on. It is a somewhat less uplifting perspective on the purpose of humanity than the religious one, but it is also rather liberating. Be different.