In this and later chapters I’ll be looking for predictable kinds of anti-free-rider social control found very widely today so that I can confidently project them into the more recent hunter-gatherer evolutionary past, and thereby assess their impact on evolving human gene pools. This treatment actually began in the previous chapters, but here I’ll try to justify making such projections. I’ll be particularly interested in dire forms of punishment that could have affected individual reproductive success drastically and therefore could have strongly shaped gene frequencies and, ultimately, human nature.
To make a reliable case that this punitive type of social selection acted on our genetic makeup, it will be useful to project the group behaviors of today’s foragers into the more recent Pleistocene past as conservatively and accurately as possible. This means we must limit our reconstruction to human predecessors who had brains equal to our own and whose cultures had become flexible and advanced like our own. Archaeologists call these people culturally modern humans, and it’s widely agreed that in Africa they had arrived by 45,000 BP.1
In the African archaeological record, cultural modernity is assessed in terms of a rather abrupt appearance of more complex and regionally changeable stone tool technologies, along with objects of self-adornment and “art,” often in the form of engravings. However, interesting as these developments may be, they tell us precious little about what was happening with these people socially. For this reason, it will be necessary to use today’s foragers to reconstruct the group life of their predecessors.
Past attempts to do so have met with major objections from scientists who deal in human prehistory, so we’ll have to get technical here. For doubters like the influential late political anthropologist Elman Service or more recently archaeologist and hunter-gatherer expert Robert Kelly,2 a main problem is that most of today’s foragers have been “marginalized” by aggressive tribal agriculturalists and, eventually, by civilizations and then empires that took over our planet’s more desirable areas. In contrast, Pleistocene foragers had their pick of world environments, and, so the theory goes, they didn’t have to cope with the not-so-productive semideserts, arctic wastes, and other marginal habitats that often limit subsistence possibilities today. Thus, there’s no telling what they were up to.
Service made his persuasive marginalization argument more than three decades ago, and it made sense at the time. Unfortunately, it has become almost a truism in archaeological and evolutionary circles that Pleistocene foragers must have been living in a fat city situation because unmarginalized small populations could pick their rich environments at will. However, the available prehistoric information has changed since then, and changed dramatically. What’s new is our understanding of Late Pleistocene climates and their all but unbelievable instability.3 Frequently, and cyclically, rapidly changeable weather patterns could have led to two kinds of prehistoric “marginalization” that, roughly speaking, would have been comparable to what we see today.
One was purely ecological marginalization. This was likely when areas with adequate patterns of rainfall became drier and only smaller populations could be supported in widely scattered bands. Such climatic downturns could have created localized-drought challenges directly comparable to those arising in the capricious Kalahari Desert area that people like today’s !Kung and !Ko Bushmen have to cope with, or challenges faced by desert Aborigines in Australia or Great Basin foragers in North America.4 The second type of marginalization would have been political. As cyclically better conditions allowed Pleistocene groups to multiply, competition could have intensified as more aggressive hunter-gatherer groups began to monopolize the better resources, marginalizing other foragers just as today’s foragers have been marginalized by territorially aggressive farmers.
Not only that, but when there were shifts toward ecological good times that permitted gradual but eventually substantial population growth, and then a sudden downturn arrived, it’s likely that foragers of one language or ethnic group would have been prone to aggressively push aside foragers of another. This would have been especially the case if the resources they were competing over were rich enough, and concentrated enough, to be readily defensible.5 Such marginalization could have resulted in outright warfare, and even though direct prehistoric evidence before 15,000 BP is lacking, to judge from certain foragers today6 such conflict could have become quite intensive under the right conditions.7
Some of the Holocene microclimates that today’s foraging nomads deal with are not just thin in resources but quite unpredictable in the shorter term, and at least a few cases of famine have been recorded by ethnographers.8 In the recent Holocene these perilous junctures have occurred so rarely that only occasionally does an anthropologist even visit a field site at which a true famine is well remembered. A striking exception is found among certain Inuit speakers, like the Netsilik in central Canada or Inuit groups in Greenland,9 while Bushmen and many other foragers living on semideserts are at least able to recall episodes of serious privation.10 The social, emotional, and genetic effects of such dire scarcity will be weighed in Chapter 10.
Using today’s nomadic hunter-gatherers as models for their nomadic predecessors must be further justified here because Service’s “marginalization taboo” still enjoys such wide adherence. Of course, when archaeologists show their unreadiness to reconstruct the social life of “prehistoric foragers,”11 often this involves a legitimate fear of projecting modern human behaviors onto much earlier types of humans who had smaller brains and, in all likelihood, a significantly different behavioral potential. With respect to smaller-brained humans who had not yet developed culturally modern tool kits, such conservatism has been and still is quite appropriate. Here, however, I am considering only the more recent prehistoric humans who matched us in brains and cultural capacity.
My theory is that the main outlines of their social and ecological life can be reconstructed quite straightforwardly simply by identifying behavior patterns that similarly nomadic foragers share very strongly today. However, such reconstructions must be carefully strategized, and I will be reconstructing only what might be called core behavior patterns,12 that is, behaviors involved with gaining a living, along with the social behaviors that are basic to such an enterprise. In addition, in recreating Late Pleistocene socioecology, I will be focusing only on those carefully selected contemporary foragers whose ecological lifestyles would have been likely 45,000 years ago.
This analysis has involved ten years of research effort.13 My first task was to evaluate the great majority of the world’s ethnographically described hunter-gatherer societies, 339 of them,14 to weed out those that obviously would have been atypical in the Late Pleistocene. I eliminated, for instance, the many North American mounted hunters like the Apache or the Comanche because horses were domesticated only recently.15 I also eliminated a few bands that lived dependently at missions, like the well-known South American Aché, and ones that symbiotically traded food with horticulturalists like the Pygmies or the Agta of the Philippines or foragers who had begun to cultivate a few plants themselves. And then I had to set aside dozens of societies that had been heavily involved for centuries with the European fur trade, such as the North American Ojibwa and the Cree, and of course I had to eliminate several dozen sedentary foraging societies that began to intensively store food and eventually lost their egalitarian ways to become markedly hierarchical, like Japan’s aboriginal Ainu or the Kwakiutl of British Columbia—who actually had slaves. After this triage was finished, only about half of the world’s foraging societies were left. They were uniformly independent, nomadic, and egalitarian, and they were suitable—if used in quantity with some statistical sophistication—as models for humans in the latter part of the Late Pleistocene Epoch, which overall lasted from about 125,000 BP until our present Holocene Epoch began to kick in.
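The triage just described amounts to a filtering procedure: every ethnographically described foraging society is checked against a set of exclusion criteria, and only those passing all of them are kept as Late-Pleistocene-appropriate models. The sketch below is a hypothetical illustration of that logic, not the author’s actual coding scheme; the society names are drawn from the examples in the text, and the criteria flags are my own labels for his stated exclusion grounds.

```python
# Hypothetical sketch of the LPA triage: exclude societies whose recent
# history makes them atypical of the Late Pleistocene. Flags and names
# are illustrative only.
from dataclasses import dataclass

@dataclass
class Society:
    name: str
    mounted: bool = False                 # horse-mounted hunters (e.g., Comanche)
    mission_dependent: bool = False       # lived dependently at missions (e.g., Aché)
    trades_with_farmers: bool = False     # symbiotic food trade with horticulturalists
    cultivates: bool = False              # had begun cultivating a few plants
    fur_trade: bool = False               # centuries of European fur-trade involvement
    sedentary_hierarchical: bool = False  # food-storing, hierarchical (e.g., Kwakiutl)

def is_lpa(s: Society) -> bool:
    """A society passes triage only if none of the exclusion criteria apply."""
    return not (s.mounted or s.mission_dependent or s.trades_with_farmers
                or s.cultivates or s.fur_trade or s.sedentary_hierarchical)

catalog = [
    Society("Comanche", mounted=True),
    Society("Kwakiutl", sedentary_hierarchical=True),
    Society("Ojibwa", fur_trade=True),
    Society("!Kung"),      # nomadic, independent, egalitarian: passes
    Society("Netsilik"),   # likewise passes
]

lpa = [s.name for s in catalog if is_lpa(s)]
```

Run over the full catalog of 339 described societies, a filter of this shape is what leaves the roughly 150 nomadic, egalitarian candidates the text goes on to use.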
The contemporary models I’ll be using, then, are taken from the perhaps 150 groups that I’ll be referring to as “Late Pleistocene appropriate”16 foraging societies, or, in a more streamlined fashion, as “LPA foragers.” My assumption is that they are very similar to the culturally modern people who were evolving in Africa around 45,000 BP and were spreading to most parts of the world.17 (Keep in mind that the people who painted those beautiful cave paintings in Spain and France first evolved their artistic potential in Africa, where cultural modernity had its beginnings.)18
With a third of these worldwide LPA societies now coded in fine detail with respect to their social life, this is what I’ve found so far. To start with, these fifty societies are definitely all mobile, and as nomads, instead of trying to store their large-game meat as individual families, they share it widely. It doesn’t matter whether these people live on Arctic tundras or in tropical forests—they never dwell in permanent, year-round villages, and they always combine hunting and gathering to make a living according to what is environmentally available, with an emphasis on eating the relatively fatty meat of large mammals. Normally, their camps or “bands” average around twenty to thirty persons, and each family cooks at its own hearth.19
In the case of camp size and the butchering of large game, today’s ethnography coincides with what we know of yesterday’s archaeology.20 Just from the ethnography, we know that invariably these people believe in sharing their large game with everyone in the band, and that they all face problems of social deviance like bullying and theft and employ similar basic means of social control to combat them. These foragers very predictably share a core of moral beliefs with an egalitarian emphasis on every hunter’s being a political equal, while the political positions of women as nonhunters are much more subject to diversity. We also know that their bands involve highly flexible camping arrangements, with families moving in and out as needed, and that at any given time a band will be composed of a mixture of some related and many more unrelated families.21
If these bands were just big extended families, the cooperation and altruism they engage in would be much easier to explain, for kin selection theory would do the trick. But they aren’t, and we may readily assume that the same was true 45,000 years ago. That’s why, as our evolutionary story unfolds, we’ll be so interested in seeing by proxy how prehistoric forager lifestyles could have generated distinctive types of social selection, as agencies that could have supported generosity outside of the family at the level of genes.
Today, the social patterns I’ve discussed hold all but uniformly across an almost incredibly wide variety of environmental niches that these LPA foragers manage to cope with successfully. These range from arctic tundras to boreal forests in the far north, to productive temperate or tropical forests, to resource-stingy jungles, and to fertile plains or game-rich savannas and arid semideserts.22 These environments include coastal areas as well, which prehistorically were likely to have served as refuges from glacial cold snaps or droughts. Today these sites would often be under water, and it’s conceivable that people could have become sedentary for a time while exploiting them. It’s even possible that sometimes they did so for long enough to begin to lose their egalitarian, meat-sharing lifestyle, if a long-term habitat was rich enough to permit food storage. However, while families’ economic standards of living may have begun to differ, political egalitarianism would likely have been more resistant to change, and in any event such outliers would not have negated the social central tendencies I’ve been describing, which would have held very widely.
Climates today range from hot to frigid and from stable to sometimes fairly unpredictable, but before the Holocene phased in, Late-Pleistocene-type climates could change with a rapidity we seldom see today. It’s no accident that during the lengthy Pleistocene Epoch, human brains just kept on getting bigger, for we’ve had a lot of challenges to cope with,23 and surely some of them involved situations of desperation and famine. In Chapter 10 we’ll learn, from today’s foragers, exactly how desperate these situations were likely to have been and what could have happened to the usual food-sharing practices when people were facing actual starvation.
It’s remarkable that a single main “type” of band composition and group life can work so successfully when such a startling array of environmental challenges is faced, but this in fact is the case. Scholars agree that socioecological flexibility is what makes this possible, and although the band is an obvious focus, to get the total picture we have to think in terms of many culturally similar bands dispersed over sizable regions, with families changing bands on a rather frequent basis. In the Late Pleistocene with its dangerously capricious environments, very likely this highly flexible approach to group living and subsistence was not merely convenient but often absolutely necessary to getting by—with the sheer survival of entire bands or regional populations surely being on the line much more frequently than is the case today.
In that epoch, as today, there would have been at least a few exceptions to the basic overall patterns—that is, the strong central tendencies I’ve been describing. I’ve just suggested that temporary sedentary adaptations were likely, and that food storage could have reduced the sharing of large game. Another readily understood contemporary exception that was likely prehistorically can be seen in the few foraging societies that cope with environments so spare that most of the time they are able to forage only as families, without forming bands—as with some desert Australian Aborigines, who subsist partly on insects, or as with certain of the Shoshonean Indians living in America’s semidesertic Great Basin area, whose fat and protein come mainly from fluctuating harvests of piñon nuts rather than from wild game.24 In the unstable and periodically dangerous Late Pleistocene, occasional divergences from the central tendencies I described previously were likely to have been more frequent. However, the great majority of these prehistoric foragers would still have followed today’s main pattern, meaning that they lived in mobile, flexible, egalitarian multifamily bands of twenty to thirty and they invariably shared their beloved large game with its exceptional fat content. That, I propose, was the central tendency for those large-game hunters, and surely it was a strong one, then as now, even though these and possibly some other outliers were likely.
I’ve taken 45,000 years before the present as the time when Homo sapiens populations in Africa had become culturally modern; this means that they had a full capacity to flexibly invent and maintain the remarkably variable material and social patterns that LPA foragers exhibit today. However, this date may be somewhat conservative,25 for humans had already become anatomically modern by 200,000 BP,26 which means that they were then at least physically indistinguishable from us. Increasingly, it’s looking as though cultural modernity, as deduced from the making of increasingly complex and variable artifacts, some of them symbolic, was phasing in earlier than 45,000 BP. The problem is that cultural modernity evolved in Africa and African archaeology is just getting up a real head of steam. Thus, even though I shall use the 45,000 BP figure to keep the analysis conservative, we might put in its place a date of 50,000 BP or even 75,000 BP or earlier. Only time, and more excavations, will tell.
Much of this book is about how punitive social control, a harsh form of social selection indeed, has acted on human gene pools. I shall be proposing that aggressive (and originally nonmoral) social sanctioning shaped the earlier human genome to give us an evolutionary conscience, and that extensive curbing of free riders was another important effect. In turn, free-rider suppression opened the way for the evolution of altruism; this development will be explained in detail in chapters to come. These three developments, taken together, may be seen as the scientific story of moral origins.
Both punitive and positive social selection were closely involved with group political dynamics, and when band members started to form consensual moral opinions, and were systematically punishing deviant behaviors and rewarding prosocial behaviors, a novel and powerful new element was added to human evolutionary process. The ultimate result was the human nature we carry around with us today, which of course combines selfish egoism with nepotism but also includes enough sympathetic altruism to make a major difference socially.
The more Draconian the sanctioning actions of angry groups were, the stronger the force of punitive social selection would have been as it acted on prehistoric gene pools. Lethal attacks on disliked individuals by sizable coalitions can be projected back into the Late Pleistocene Epoch with great confidence, for as we’ll see in the next chapter, such killings had at least a significant precursor in the shared ancestor of humans and two of the African great apes. However, among today’s LPA hunter-gatherers it’s difficult to say, for at least two reasons, exactly how often they inflict capital punishment on one of their own. First, when an ethnographer visits a foraging society for a year or two, it’s extremely unlikely that a deviant’s being killed by the band will be witnessed or even be talked about—unless the right questions about remembered history are asked. Second, band executions that have taken place say a hundred years before or earlier may simply be lost to memory.
Here we’ll be surveying my entire coded sample of fifty LPA societies to get at least a partial idea of how often capital punishment has been taking place over the past century or so. But keep in mind that we’ll be seeing only the tip of an iceberg, not only for the two reasons I just gave but also because most hunter-gatherer ethnographies, precious as they are, are likely to be seriously incomplete with respect to capital punishment: indigenous people quickly learn that missionaries and colonial administrators view their executions of dangerous deviants as “murder.” As a result, they clam up, and often they abandon the practice altogether.
TABLE I CAPITAL PUNISHMENT IN FIFTY LPA FORAGING SOCIETIES*

Type of Deviance      | Specific Deviances                     | Societies Reporting
----------------------|----------------------------------------|--------------------
Intimidation of group | Intimidation through malicious sorcery | 11
                      | Repeated murder                        | 5
                      | Action otherwise as tyrant             | 3
                      | Psychotic aggression                   | 2
Cunning deviance      | Theft                                  | 1
                      | Cheating (meat-sharing context)        | 1
Sexual transgression  | Incest                                 | 3
                      | Adultery                               | 2
                      | Premarital sex                         | 1
Miscellaneous         | Violation of taboo (endangering group) | 5
                      | Betrayal of group to outsiders         | 2
                      | “Serious” or “shocking” transgression  | 2
                      | Deviance unspecified                   | 7
Total societies reporting capital punishment                   | 24

*The above figures are derived from the author’s hunter-gatherer database.
Table I shows the basic patterns of active capital punishment (almost all of males, by males) that did emerge in spite of such seriously incomplete data. I should mention here that just a few bands had a formal system for deciding to eliminate a deviant, with a council of elders meeting to come up with a death sentence. More often the process was far less “structured,” which means that the entire band (including the females) would simply agree, informally, that a close relative of the deviant should do him in. The deviant was almost always a “he,” and there was a very good reason for his executioner, a well-armed hunter, being a close relative. This will be explained in Chapter 7.
This intensive survey of fifty LPA hunter-gatherers involved coding over two hundred ethnographic sources, and half of the time, surely for reasons that have just been explained, capital punishment wasn’t even mentioned. For almost half the societies, however, some instance of capital punishment was in fact reported, as seen in Table I, and often the ethnographers were in a position to specify the deviance patterns that brought it on.
Eventually, I will have coded at least three times as many of these societies, which will make such statistics less subject to the vagaries of sampling error. But these numbers do enable us to pick out one very salient pattern: half the people killed, almost all of them males, were intimidating their groups. This was done by greedily or maliciously using supernatural power to seriously threaten the welfare or lives of others; by being far too ready to kill, repeatedly, out of greed or anger; by otherwise managing to seriously dominate others; or (much more rarely) by being aggressively insane.
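The “half” claim can be checked roughly against Table I itself. The caveat is that the table counts societies reporting each deviance type (a society may report several), not individuals killed, so the sketch below, which uses the table’s figures, yields only an approximate proportion consistent with the text’s statement.

```python
# Tally of Table I: do the intimidation-related categories account for
# roughly half of all capital-punishment reports? Counts are copied from
# the table; this is reports per category, not a body count.
table_i = {
    "Intimidation through malicious sorcery": 11,
    "Repeated murder": 5,
    "Action otherwise as tyrant": 3,
    "Psychotic aggression": 2,
    "Theft": 1,
    "Cheating (meat-sharing context)": 1,
    "Incest": 3,
    "Adultery": 2,
    "Premarital sex": 1,
    "Violation of taboo (endangering group)": 5,
    "Betrayal of group to outsiders": 2,
    "Serious or shocking transgression": 2,
    "Deviance unspecified": 7,
}

intimidation = ["Intimidation through malicious sorcery", "Repeated murder",
                "Action otherwise as tyrant", "Psychotic aggression"]

intim_total = sum(table_i[k] for k in intimidation)  # reports of intimidators
grand_total = sum(table_i.values())                  # all reports in the table
share = intim_total / grand_total                    # proportion, roughly half
```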
This main pattern fits with the previously emphasized fact that all LPA bands today are highly egalitarian in their social worldview, which means that groups quickly become angered when individuals given to domination (ones like the Pygmy Cephu) try to self-aggrandize. Except for the killing of psychotically aggressive men, all of these aggressive intimidators were considered to be morally deviant, and other research I have done, including that set forth in my book Hierarchy in the Forest,27 suggests that if small nonliterate groups are to keep in place the egalitarian political orders they so strongly prefer, they sometimes will be obliged to use capital punishment when a menacingly aggressive personality appears in their midst.
The choice is between suffering a determined tyrant and taking him out, and it’s precisely because people inherit varying dominance tendencies from their parents that in any small egalitarian society, eventually someone with proclivities to behave far too dominantly will show up, act on these propensities, and manage to get himself in serious trouble. This is either because he doesn’t have the ability to properly read the situation or because he lacks the capacity to restrain himself. The gender pattern is as follows: it is almost always a well-armed male hunter who tries to dominate his peers, but when a band coalesces to bring down such a tyrant, the females may be as active as the males in the political dynamics involved, and in one rare execution that was actively communal, they became physically involved.28
If we look now at the several low-frequency effects in Table I, we see that stealthily breaking rules to take advantage of others, by cheating or theft, is at least mentioned. With respect to free riding, this means that both intimidating and deceptive free riders can receive a death sentence, but keep in mind, here, that we are considering only reasons for capital punishment. Lesser sanctions, like ostracism, shaming, and expulsion from the group, are more likely for lesser crimes that do not necessarily threaten an entire group’s safety, autonomy, or welfare. In Chapter 7 I will present data on these lesser sanctions, which do not suffer so much from the incomplete data problems I have described with respect to capital punishment.
Being punished capitally will have varying but always deleterious effects on the reproductive success of the serious malefactors who are killed. Execution obviously precludes having further offspring, so if a man is killed while in his twenties or thirties, the fitness consequences are enormous. Furthermore, after his loss the offspring he’s already created will receive less parental support, and their fitness (half of which is shared with the father) will suffer accordingly. Finally, he will not be around to help coresident closer kinsmen like siblings if they need it, or to cooperate with them.
As human gene pools evolved over evolutionary time, the effects of active group punishment on individual reproductive success could have been quite significant, and surely these widespread effects became increasingly influential as people gained stronger consciences and their group punishment increasingly became driven by moral outrage. Here we’re speaking not only about capital punishment but also about banishment and serious ostracism, which can interfere significantly with an individual’s gaining the benefits of cooperation. In addition, just acquiring a bad moral reputation surely would have made some people avoid such deviants in important choice situations such as marriage or in looking for subsistence partners.
All of these mechanisms entailed social selection in the sense that preferences shared by groups were affecting gene pools.29 More specifically, all involved negative preferences, and all disadvantaged the reproductive prospects of individuals prone to social deviance—or at least those who could not control their sexuality, greed, or inappropriate hunger for power. For such moralistic social selection to have been a significant factor in shaping human gene pools, probably it had to be operating for at least a thousand generations,30 which in long-lived humans comes to about 25,000 years. Thus, the period just before cultural modernity arrived could have figured importantly in moral origins—if this development took place at all quickly.
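The timescale claim in that paragraph is simple arithmetic, shown below under the assumptions the text implies: a human generation time of roughly 25 years, and the conservative 45,000 BP date for cultural modernity adopted earlier in the chapter.

```python
# The arithmetic behind "a thousand generations ... about 25,000 years,"
# assuming a roughly 25-year human generation time (implied by the text).
generation_time_years = 25
generations_needed = 1000
years_needed = generations_needed * generation_time_years  # 25,000 years

# If selection had to run 25,000 years before the (conservative) 45,000 BP
# date for cultural modernity, it would have begun by about 70,000 BP --
# the "period just before cultural modernity" the text points to.
cultural_modernity_bp = 45_000
selection_onset_bp = cultural_modernity_bp + years_needed
```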
In the next chapter we will start at the beginning to consider in more detail the head starts that were provided by two distant but highly relevant ape ancestors, the Common Ancestor and Ancestral Pan, so as to discern how some special developments in the human line led us, but not chimpanzees or bonobos, to evolve a capacity for virtue—and shame.