CHAPTER 13

DID MAN EAT MAN? WORLDWIDE, 800,000 B.C.E.

The pursuit of the prion disease mystery has been in many ways a study in frustrations, in long experiments that went nowhere and high hopes that were not borne out. Nature is sometimes too original to be understood through paradigms, and presents puzzles too complicated to be solved on a human time scale. But for all their frustrations, prion researchers also solved an old anthropological puzzle.

Anthropologists have long debated whether humans are cannibalistic. The question is not whether humans ever eat human flesh—clearly we sometimes do when the alternative is starvation—but whether cannibalism was ever central to a human culture. Over the centuries, that assertion was made many times and in regard to many cultures. The Fore, the Anga, and the Mundugumor of New Guinea; the Fijians of the South Seas; the Aztecs, the Arawaks, and the Caribs of the New World (“cannibal” derives from the word “Carib”) are but a few of the peoples who, according to neighbors, travelers, and, later, Western anthropologists, regularly ate human flesh, whether to honor their dead, to avenge their enemies, or simply for the nutrition. Yet on closer examination it turned out that there were never firsthand accounts to confirm these stories. Most travelers’ accounts were derived from previous travelers’ accounts, themselves secondhand and probably distorted. Anthropologists, in turn, got much of their information from local hearsay. As one skeptical anthropologist wrote in 1979, “The human failing which emerges most clearly from the morass of data is…European plagiarism.”

Even the kuru epidemic can be explained without cannibalism. Gajdusek has long maintained that the handling of infected bodies in preparation for burial would have been sufficient to spread the disease. And though the early Highland patrol reports are full of comments that the Fore considered themselves cannibals, eyewitness accounts are not to be found. “If opportunity presents itself I propose to avail myself of an invitation extended by a Kagu leader to be present at some future ceremonial eating of human flesh to observe the various rites practiced,” promises the first patrol officer to enter Fore territory, R. I. Skinner, in the late 1940s. But the opportunity never does present itself.

So a generation of anticannibalism anthropologists was able to gain traction, arguing that to call a tribe “cannibalistic” is not descriptive but pejorative.

That is no longer true. Without deliberately setting out to settle the point, an effort in England to understand why some Britons got sick from mad cow disease while others didn’t has proven that cannibalism was part of early human history. In fact, not only were we all cannibals at one point, but it cost us dearly: our anthropophagy led to an outbreak of a prion disease with a high death toll (and it probably caused us to evolve a revulsion at the act, a revulsion that in more recent history made cannibalism taboo).


The work that led to this important discovery began in the 1980s, soon after the prion gene was found. Prion investigators were interested in why some mammals seemed particularly susceptible to prion disease—say, certain breeds of sheep—while others were resistant. They wondered whether there was something in a sheep’s genetic makeup that affected its chances of coming down with scrapie if it was exposed to the infectious agent.

Over time it became clear that susceptibility to prion diseases often depends on small alterations in an animal’s prion gene. Genes contain the blueprints, or recipes, for all the proteins in the body. Most mammals have the same genes for the same proteins as other members of their species, but fairly often a gene comes in two or more slightly different DNA spellings that result in essentially the same protein. These variations are called polymorphisms. The human genome is full of them, and they often have little or no effect on their possessors.
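To make the idea concrete, here is a minimal sketch in Python of how two DNA spellings at a single spot can translate to different amino acids, and how the two inherited copies of the gene combine into a genotype. The codon-to-amino-acid assignments are the standard genetic code; the specific position (conventionally numbered codon 129 of the human prion gene) is supplied here only for illustration and is not a detail the chapter relies on.

```python
# Minimal sketch: two DNA "spellings" (codons) at the same spot in the
# prion gene yield different amino acids. The codon assignments below are
# the standard genetic code; the position (codon 129) is illustrative.
GENETIC_CODE = {
    "ATG": "methionine",  # abbreviated Met, or M
    "GTG": "valine",      # abbreviated Val, or V
}

def genotype_at_site(copy_from_mother: str, copy_from_father: str) -> str:
    """Classify a person's two gene copies at the polymorphic site."""
    aa1 = GENETIC_CODE[copy_from_mother]
    aa2 = GENETIC_CODE[copy_from_father]
    if aa1 == aa2:
        return f"homozygous ({aa1}/{aa2})"
    return f"heterozygous ({aa1}/{aa2})"

print(genotype_at_site("ATG", "ATG"))  # homozygous (methionine/methionine)
print(genotype_at_site("ATG", "GTG"))  # heterozygous (methionine/valine)
print(genotype_at_site("GTG", "GTG"))  # homozygous (valine/valine)
```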

In the early 1990s, the English prion researcher John Collinge undertook a prion gene analysis of Britons to see if a similar relationship existed between a polymorphism and susceptibility to prion disease in the human population. This was before there were any human mad cow victims, but there existed a group of young CJD sufferers whom I have not mentioned yet: teenagers who had gotten CJD from growth hormone derived from human pituitary glands. From the 1950s to the 1980s, British doctors gave the hormone to eighteen hundred children; a few of them developed CJD, because some donor cadavers had the disease and the preparation of the injections did not remove the infectious prions. The first such case was reported in the mid-1980s; by the time mad cow was first suspected among beef eaters, there were six human growth hormone CJD sufferers in the United Kingdom.

Collinge found an interesting pattern in the group infected by pituitary growth hormone: four had the same polymorphism on their prion genes. He then looked at sufferers of sporadic CJD and found an even higher proportion: out of forty-five sporadic CJD sufferers, forty showed the same pattern. These forty, like the four human growth hormone recipients, each had the genetic code for an amino acid called valine in a key spot on each of their two prion genes (every human has two copies of every gene, one from each parent). They were what is called homozygous for valine.

Collinge needed to know the overall British genetic makeup to see whether the percentage of homozygous growth hormone CJD victims and sporadic CJD victims was unusual. It was, after all, possible that there were just a lot of British people walking around with that particular genotype. But surveying British genetic data, Collinge found the reverse—Britons with a valine on each copy of their prion gene were scarcer in the population than chance would dictate. Collinge’s lab thus concluded that homozygosity somehow left the carrier more susceptible to prion disease.
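The comparison Collinge needed is, at bottom, simple arithmetic once allele frequencies are in hand: under random mating (the Hardy-Weinberg rule), a gene version carried at frequency q produces homozygotes at frequency q squared. The Python sketch below uses invented round numbers rather than Collinge’s actual measurements, purely to show how an “expected by chance” figure is derived and then set beside the proportion observed among patients.

```python
# Illustrative arithmetic only: the allele frequencies below are made-up
# round numbers, not Collinge's measured values. Under random mating
# (Hardy-Weinberg equilibrium), a gene version at frequency q produces
# homozygotes at frequency q*q, which gives the "expected by chance"
# figure the comparison requires.
p_met = 0.65  # assumed frequency of the methionine version (illustrative)
p_val = 0.35  # assumed frequency of the valine version (illustrative)

expected = {
    "Met/Met homozygotes": p_met ** 2,
    "Met/Val heterozygotes": 2 * p_met * p_val,
    "Val/Val homozygotes": p_val ** 2,
}
for genotype, freq in expected.items():
    print(f"{genotype}: expected {freq:.1%} of the population by chance")

# Collinge's comparison in miniature: 40 of 45 sporadic CJD patients were
# homozygotes, far above the roughly 55% chance expectation these
# illustrative frequencies would predict (0.65**2 + 0.35**2).
observed_fraction = 40 / 45
chance_fraction = p_met ** 2 + p_val ** 2
print(f"observed among CJD patients: {observed_fraction:.1%}; "
      f"expected by chance: {chance_fraction:.1%}")
```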

It was important for Collinge to make sure that this pattern also held for susceptibility to infection under natural conditions, because CJD acquired from an injection often behaves differently from CJD acquired naturally. There was a place to see a natural human prion infection in action: Papua New Guinea. Skeptics could argue over whether Papua New Guinea had been the site of cannibalism, but no one could argue about its being the site of a prion disease epidemic. In the years after mad cow disease was discovered, the country had become a de facto laboratory for seeing how this rare epidemiological event played out. Elderly Papua New Guinean women were still dying from prion infections contracted at burial feasts they had participated in forty years before.

Collinge’s prion unit was in touch with Michael Alpers, Gajdusek’s old researcher, who still ran the kuru research program in Okapa. The British researchers asked Alpers to take blood samples from some survivors of the feasts. Of the thirty women Alpers tested, only seven were homozygotes; the other twenty-three had one valine and one methionine (a different amino acid that can occupy the same spot) at the site of the polymorphism. They were heterozygous. Their heterozygosity likely had helped them survive: the number was too high to be coincidental. In fact, the blood tests of living Fore showed that they had the highest percentage of heterozygotes of any population in the world. The inescapable conclusion was that heterozygosity had given them some sort of protection from kuru.
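How strong is “too high to be coincidental”? A back-of-the-envelope check is sketched below in Python. The baseline figure is my assumption, not a number from the kuru studies: suppose heterozygotes had made up about half of the pre-kuru Fore population, and ask how often chance alone would deliver twenty-three or more heterozygotes in a sample of thirty survivors.

```python
# Rough sketch of the "too high to be coincidental" reasoning.
# Assumption (illustrative, not from the kuru data): heterozygotes made up
# about half of the pre-kuru Fore population.
from math import comb

def binomial_tail(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_het_baseline = 0.5          # assumed baseline heterozygote frequency
survivors, heterozygotes = 30, 23
p_value = binomial_tail(heterozygotes, survivors, p_het_baseline)
print(f"chance of >= {heterozygotes}/{survivors} heterozygotes: {p_value:.4f}")
# With this assumed baseline, the probability comes out well under 1%,
# which is the sense in which the excess is "too high to be coincidental."
```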

Because of Collinge’s work, many prion researchers suspected that if mad cow crossed over to humans it too would attack homozygotes disproportionately. And when it did cross over, in the mid-1990s, they were proven right. By 2003 the plague was almost a decade old and the number of people dead from variant CJD (the human form of mad cow) was around 150. If, as has been estimated, the English were exposed to 640 billion doses of the mad cow agent, they had done a remarkable job of surviving the onslaught. This was a tragedy, certainly, but why weren’t more people dying?

The main answer was that the prion crossed species only with difficulty. Human prions infect humans quite well—viz. kuru—and cow prions infect cows quite well—viz. mad cow disease. That’s because prion infection is a physical event: for the disease to spread, the two proteins must fit together well. The more different the proteins of the infector and the host, the less likely transmission is; the more similar, the more likely. That’s why the most alarming prion outbreaks have begun with same-species infection.

But that didn’t answer the question of why BSE seemed to cross to some humans more easily than others. Collinge found that the striking pattern he had seen at the beginning of the epidemic was even more remarkable now: all but one of the victims of mad cow in Britain to date were homozygotes, each with two methionines on their prion genes. This pattern suggested, happily, that a good many Britons had some degree of resistance to mad cow disease, because the majority of them were heterozygotes.

In 2004, Collinge’s group obtained laboratory data to back up their theory: they created mice with homozygous human prion genes, challenged them with mad cow prions, and found that they were more likely to become infected with BSE than mice with heterozygous human prion genes. Something about heterozygosity altered the shape of the prion to make it less effective at spreading within the body. (Elio Lugaresi and Pierluigi Gambetti found a parallel pattern in the late 1990s with the fast and slow forms of FFI: Assunta and Silvano were homozygotes and died quickly; Luigia and Teresa were heterozygotes, and their disease course was much longer.) In addition, heterozygosity may play a role in the recently discovered phenomenon of healthy prion mutation carriers: for years, researchers believed that everyone who had a prion gene mutation ultimately got the disease, but more recent research shows that there are people who carry a prion mutation yet never get sick, or at least die of old age–related diseases before they do. Possibly heterozygosity is giving them some degree of protection.

         

Collinge and his lab had now identified an important epidemiological pattern: homozygotes were at higher risk of prion disease than heterozygotes, and there were more heterozygotes in the British population than chance would dictate. The researchers speculated that the two facts might be connected—some evolutionary force might have favored heterozygotes over homozygotes.

That made what the lab learned next particularly intriguing: heterozygotes were overrepresented worldwide, in every race and every ethnicity. According to population genetics, then, their common ancestors must have faced a situation that winnowed out the homozygotes. To have altered so many genomes, it must have been a severe threat to human survival, and it must either have occurred all across the earth or have arisen very early in human history, when humans still lived only in Africa.

Collinge and his lab set out to estimate a date for this mysterious event. With the help of a population geneticist, they were able to establish the history of this portion of the prion gene; they found that methionine was the original amino acid coded for at that spot and that valines started to appear in the same position around 500,000 years ago. New amino acids can appear and hang around for hundreds of millennia without purpose, so long as they do no harm, so Collinge’s dating can be taken only as the earliest possible date for whatever event thinned out the homozygotes in favor of the heterozygotes.

Most paleoanthropologists believe modern humanity descends from a small group of Homo sapiens who lived in Africa around seventy thousand years ago. These humans may have numbered no more than two thousand. If so, only a small number of humans would have needed to engage in some behavior that changed the balance of methionine and valine in their prion genes. But what was the behavior? To figure it out, Collinge and his team had to think about the lives our distant ancestors lived.

Humans of this era were quite healthy (average life span actually declined when humans first began to form agricultural settlements around ten thousand years ago). They died mostly from accidents and, during periods of famine, from malnutrition. They had little experience with infectious diseases, and as a result, very little resistance to them. So if you wanted to kill a lot of early humans, you couldn’t choose a better way than a novel infectious disease.

But what kind of disease? The most successful contagions among humans at this time were slow-acting ones. When land is thickly populated, virulent infections prosper. There are sufficient hosts: burn one out, move on to the next (the measles virus, for instance, needs a population of about 300,000 people and some 3,000 infections a year to keep going; it needs cities to prosper). But in preagricultural times, with humans so spread out, an infection had to be only mildly aggressive to hitch a ride from one host to the next. Otherwise it killed its victims faster than it could find new ones.

What would be the optimal vehicle for this infection? Perhaps meat. Food, in general, is an ideal way to get a pathogen into people’s bodies, because they seek it out and put it in their own mouths. And we know people ate meat in prehistory; they had grown too large to subsist entirely on fruit, berries, and wild grains, and needed concentrated protein. And meat is an excellent source of pathogens. Given that ancient peoples had no drugs to treat infection, it would not be surprising to find a high mortality rate from a food-borne illness at the time.

Arguing against meat as the source of this ancient plague are two factors: one, stomach acids do a good job of removing infectivity; and, two, early on humans learned to cook what they ate. Meat is far less dangerous when it is cooked, because heating kills bacteria. The first confirmed evidence for fire dates back only to around 150,000 B.C.E., but there are several hints that hominids mastered it before they switched from the vegetarianism of their ancestors. For one thing, the climate of northern Europe required fire for survival, given the extreme cold of periodic glaciations. Embers don’t fossilize well, so the absence of hearths at earlier sites does not mean there were none. The argument is a bit circular—we know there was fire because hominids had grown so large they needed meat—but it has some validity.

So, in this prehuman population posited by Collinge and his colleagues—spread around the world, able to use fire and to hunt, beset by no known conventional disease—a successful contagion would be one that infected its victims very slowly. If it was in meat it would have to survive aggressive stomach acids and extensive cooking. And it would have to come from a source of meat that was readily available to humans everywhere. The meat that best meets these criteria would have been human flesh, and the act that would best spread any disease-causing agent in flesh would be cannibalism.

How much cannibalism would have had to occur to spread a prehistoric prion plague across the world? If the experience of the Fore is an indication, not much. Kuru likely started with a single sporadic CJD case, probably early in the twentieth century. That person’s relatives and friends ate him or her at a mortuary feast and, after they fell ill and died, they were in turn eaten by their relatives and friends. Within fifty years, an epidemic severe enough to kill half the residents of some villages was underway.

But what about the doubt I mentioned earlier, over whether cannibalism caused kuru at all? What about Gajdusek’s theory that handling the tissues of the dead was enough for a prion disease to spread? This objection wouldn’t apply to an ancient prion plague. The first human burial ceremonies date from around fifty thousand years ago, hundreds of thousands of years after the protective polymorphism likely began spreading. There is no evidence that early hominids buried their dead, but there is a lot of evidence that they ate them.

         

Atapuerca is an important archaeological site in northern Spain. A series of hillside caves overlooking a river valley, it has attracted animals and humans looking for shelter since ancient times. Many of these animals and humans died there too, so Atapuerca presents a kind of smorgasbord of remains. Its five or six dig sites each correspond to a different epoch of prehistory. One is Gran Dolina, a site uncovered at the end of the nineteenth century that contains prehuman remains. Archaeologists have been able to date the site back to roughly 800,000 B.C.E., when the earth’s magnetic field last switched. This makes Gran Dolina the oldest repository of hominid remains in Europe.

Among the hominid remains discovered at Gran Dolina were the bodies of a fourteen-year-old and a ten-year-old, found at the mouth of the cave. That’s a noteworthy location, because the mouth of a cave is where animals normally eat their prey. It occurred to the archaeologists involved that some prehistoric animal might have seized these children and consumed them in the cave’s shady entrance—carnivores, among them bears and hyenas, were abundant in the area. But further research showed that the children’s bones, as well as some animal bones found nearby, had been dissected with a precision that exceeds the skills of nonhuman carnivores: for instance, two segments of finger or toe bone and a cranium had the meat scraped out of them. Other bones had been snapped so the marrow could be sucked out. Stone flints were found lying near the remains. Two scenarios might explain what went on at Gran Dolina. Either hominids celebrated some sort of ritual that involved pulling the meat off their dead, or, more probably (because hominids of 800,000 years ago were unlikely to be capable of such symbolic behavior), they were eating other hominids for food.

The official Atapuerca website speculates that “for these primitive humans the difference between the body of a deer and a human cadaver didn’t exist yet.” That’s an evasive statement—even chimpanzees recognize their own kind, so let’s assume at least a similar level of cultural sophistication among the Atapuerca hominids. The bones of the ten-year-old show evidence of malnutrition. Maybe there was a series of bad hunts in the area, or a drought, and the lack of vegetation drove the large herbivores away. It is easy to imagine how urgently the hominids needed some meat. Maybe they attacked and killed their own children, but again, chimpanzees don’t do that, so we can assume hominids didn’t. It’s more likely that the children died of starvation and the parents thought: Why waste this? Especially with other mouths to feed? In the words of the Fore man recorded by anthropologists more than three quarters of a million years later: “What is the matter with us, are we mad? Here is good food and we have neglected to eat it.”

So, using a blade made by sharpening one stone against another, the parents sliced off one of the child’s legs. They cracked the long part of the femur and gave the nutritious marrow to their surviving offspring. They worked their way through the large muscles of the body, then struck open the skull, removed what was inside, and ate that. When they were done, they threw the human bones at their feet, among the deer bones and the pony bones left from more flush times. They were not capable of reflecting on whether what they had done was right or wrong, but emotional pain precedes morality: they may have cried; they may have agonized. At the same time they would have known that what they were doing was what had to be done for their family to survive.

This is all possible. But of course it also may be wrong. Maybe one clan ambushed another. Maybe they ate the enemy’s dead to intimidate the survivors. The ritual ingestion of one’s defeated rivals is well known in anthropology; chimpanzees even do it. And the hominids at Gran Dolina are our ancestors, as well as those of the Neanderthals. Of those two groups, we know that one (our own, which shares nearly all its genes with these hominids) shows little compunction about war and murder. We don’t know whether intentional killing led to the feast in the mouth of the cave at Gran Dolina. Probably we never will. The one thing that never survives in the fossil record is motive. But whatever the cause, this behavior, which cost them so dearly and began to turn the human race against cannibalism, would, amazingly, save our lives 800,000 years later.

         

Skeptical anthropologists will have to update their ideas as Collinge and his colleagues’ proof sinks in. But they will have some consolation. Their main objection to the charge of cannibalism was that Europeans used it to justify their racism. The Spanish, for instance, in destroying Aztec civilization, made clear that the savages they wiped out were eaters of human flesh and thus not worthy of compassion. They were not alone in this assumption. The Aztecs, too, believed in cannibals. In fact, the figure of the cannibal played a central role in their mythic life. They thought the Indians to the north and the south were cannibals (with cannibalism it is always the other guy who does it). And when Cortés and his soldiers arrived, they thought a new cannibal tribe had come to attack them. That was why they fought so hard when they saw the Spanish. It turns out they were right after all: at Gran Dolina lie the remains of the first Europeans, and already they were eating each other.