1

HUMAN NATURE

Our Inner Tribesman

Human nature is real. Few statements are less controversial among the people who study the subject and more controversial among people who don’t.

It is fair to say that no reputable psychologist, neuroscientist, linguist (including Noam Chomsky), or economist disputes the fact that human beings come preloaded with a great deal of software. Indeed, the fashionable metaphor today is not software but “apps”—as in the applications we have on our smartphones. Different situations trigger different apps, and sometimes these apps can be in conflict.

All of the serious debates about nature versus nurture start with the premise that there is already a lot built into our nature. The only question is what we can add on top of nature or what apps we can override. Think of a car. We all generally agree that a car comes with an engine, four wheels, and a steering wheel. These things come standard. That’s nature. Nurture provides the options, and there are a great many options. But no matter how many add-ons you buy, a car is not going to be a helicopter.

In his enlightening book Just Babies: The Origins of Good and Evil, psychologist Paul Bloom chronicles a remarkable number of experiments conducted on infants and toddlers. (Rest assured: No babies were harmed in the process.) He demonstrates that babies as young as six months already come preloaded with a number of psychological traits that suggest an innate moral sense. For instance, infants between six and ten months old were shown puppet plays. One puppet would be trying to get up a hill. Another puppet would either come to the hill-climbing puppet’s aid or it would get in the way, stymieing the climber’s efforts. Afterward, the babies were given a choice between the mean puppet and the nice puppet. The babies almost uniformly preferred the nice puppet over the jackass puppet. When a similar study was performed with twenty-month-old toddlers, the kids would reward the nice puppet with candy and punish the bad puppet by taking its candy away.1 Other studies confirm that we are all born with some very basic programming about empathy, altruism, cooperation, and other moral intuitions.

Bloom takes great care in pointing out that, just because we are born with a kind of moral sense, that doesn’t mean we are therefore moral. Rather, we are born with moral taste buds. How we use them depends on the environment we grow up in and, crucially, how we define “morality.”

One of the most important findings of not just Bloom but thousands of researchers across numerous disciplines is that we are all born with a natural distrust of strangers. Very young babies can identify language; their cries even have regional accents. “Young babies can recognize the language that they have been exposed to, and they prefer it to other languages even if it is spoken by a stranger,” Bloom reports. “Experiments that use methodologies in which babies suck on a pacifier to indicate their preferences find that Russian babies prefer to hear Russian, French babies prefer French, American babies prefer English, and so on. This effect shows up mere minutes after birth, suggesting that babies were becoming familiar with those muffled sounds that they heard in the womb” [emphasis mine].2

Interestingly, our brains dedicate an enormous amount of resources to facial recognition. We are born with an intense interest in human faces. No doubt there are many reasons for this. For instance, much early human communication was done nonverbally, and that’s still true for humans today, particularly before we learn to speak. One can debate how important reading faces remains today, but the ability was clearly more vital in the past. Being able to instantly recognize kin or friends from strangers could mean the difference between life and death. (It’s telling that our ability to identify faces is actually much more sophisticated than our ability to verbally articulate the differences between faces. Most of us can instantly distinguish between, say, Matt Damon’s face and Matthew McConaughey’s. But can you instantly explain what makes their faces so different?)

The desire for unity and distrust of strangers are universal human tendencies—but just tendencies. While I don’t think they can be wholly taught out of us, they certainly can be tempered and channeled in productive ways. It is a common cliché among certain tribes of humanists to say something like “There is no race but the human race,” which of course is just a more secular version of “We are all children of God” and similar endearing platitudes. All things being equal, I think this is a benign cliché and worth incorporating into our civilizational dogma. But I should point out that, of all the systems ever created that actually put this belief into practice, none has been more successful on the ground than the market. The market lowers the risk—or “price”—of distrust by letting very different peoples and cultures find common interest.

The distrust of strangers and the craving for unity are important themes in this book, because they illuminate a much broader fact: Ideology is downstream of human nature. Children and adults are constantly told that one needs to be taught to hate. This is laudable nonsense. We are, in a very real sense, born to hate every bit as much as we are born to love. The task of parents, schools, society, and civilization isn’t to teach us not to hate any more than it is to teach us not to love. The role of all of these institutions is to teach what we should or should not hate.

Bloom writes that “just about all the readers of this book believe that it’s wrong to hate someone solely because of the color of his or her skin. But this is a modern insight; for most of human history, nobody saw anything wrong with racism.”3 All good people are supposed to hate evil, but the definition of what constitutes evil is rather expansive across time, and refining the definition of evil is the very essence of what civilizations do.

Every culture ever known has things it hates and things it loves. And every political ideology ever known has some group it considers the Other. The pro-Nazi philosopher Carl Schmitt famously said, “Tell me who your enemy is, and I will tell you who you are.”4 Fascism is supposedly defined by its demonization of “the other.” Obviously, in Nazi Germany, the Other was best represented by the Jew. But communism had its Others too. They went by such names as the bourgeois, or the ruling class, or the kulaks. Contemporary liberalism has a host of Others it hates. We’ve all probably met avowed lovers of tolerance who talk about how much they hate intolerant people—but only certain kinds of intolerant people. I’ve lost count of the number of times I’ve heard people insist that the slightest prejudice against Muslims is evil and then proceed to explain how awful evangelical Christians are.

The anthropologist Richard Shweder compiled a useful list of things that different societies have considered praiseworthy, neutral, or appalling:

masturbation, homosexuality, sexual abstinence, polygamy, abortion, circumcision, corporal punishment, capital punishment, Islam, Christianity, Judaism, capitalism, democracy, flag burning, miniskirts, long hair, no hair, alcohol consumption, meat eating, medical inoculations, atheism, idol worship, divorce, widow marriage, arranged marriage, romantic love marriage, parents and children sleeping in the same bed, parents and children not sleeping in the same bed, women being allowed to work, women not being allowed to work.5

In other words, the capacity for humans to think certain things are “naturally” good or bad is remarkably elastic. But there’s a difference between elastic and infinite. For example, incest has been a taboo everywhere. Obviously the strength of that taboo has varied, but no society has celebrated it. (Alas, that taboo has been steadily weakening in American popular culture.) Similarly, there’s no society in the world—now or known to have existed—where people didn’t give preference to relatives and friends over strangers, a point I’ll be coming back to quite a bit in later chapters.

Anthropologist Donald E. Brown compiled a list of attributes that describe “the Universal People”—i.e., everybody, everywhere. “Human universals—of which hundreds have been identified—consist of those features of culture, society, language, behavior, and mind that, so far as the record has been examined, are found among all peoples known to ethnography and history.”6 The list is too long to reprint here. But some of the most important, for our purposes at least, include coalitions; conflict; cooperation and cooperative behavior; corporate statuses; collective decision making; divination; ethnocentrism; entification (treating patterns and relations as things); envy; etiquette; fear; feasting; folklore; food sharing; gift giving; gossip; government; group living; (collective) identity; in-groups (as distinguished from out-groups); in-group biases in favor of close kin (as distinguished from distant kin groups); kin terms translatable by basic relations of procreation; kinship statuses; judging others; law (rights and obligations); law (rules of membership); leaders; magic; magic to increase life; magic to sustain life; male and female and adult and child seen as having different natures; males dominating the public/political realm; males more aggressive; males more prone to lethal violence; males more prone to theft; moral sentiments; myths; narrative; overestimating objectivity of thought; planning; planning for the future; preference for own children and close kin (nepotism); prestige inequalities; private inner life; promise; property; psychological defense mechanisms; rape; rape proscribed; reciprocal exchanges (of labor, goods, or services); reciprocity, negative (revenge, retaliation); reciprocity, positive; recognition of individuals by face; redress of wrongs; rites of passage; rituals; role and personality seen in dynamic interrelationship (i.e., departures from role can be explained in terms of individual personality); sanctions; sanctions for crimes against the collectivity; sanctions including removal from the social unit; self distinguished from other; self as neither wholly passive nor wholly autonomous; self as subject and object; self as responsible; self-image, awareness of (concern for) what others think; self-image, manipulation of; self-image, wanted to be positive; social structure; socialization; socialization expected from senior kin; socialization includes toilet training; spear; special speech for special occasions; statuses and roles; statuses ascribed and achieved; statuses distinguished from individuals; statuses based on something other than sex, age, or kinship; succession; sweets preferred; symbolism; symbolic speech; taboos; tabooed foods; tabooed utterances; taxonomy; territoriality; trade; and turn taking.

Again, this is only a partial list.

One of the most interesting taboos in American life is the taboo against discussing human nature. This is an entirely modern prohibition. The ancient Greeks and Romans, not to mention every major world religion, considered human nature not only real but an essential subject for study and contemplation. I think there are multiple overlapping reasons—many of them laudable—for our aversion to the subject. Our civilization has struggled to live up to the ideals of universal equality enshrined in the Declaration of Independence, the U.S. Constitution, and similar canons. Discussion of human nature inevitably bleeds into debates about genetic differences between groups or claims that certain behaviors or choices are “unnatural.” Discussion of human nature also grinds against the idea that the individual is unconstrained by external—or internal!—restraints, a nearly unique dogma of the West. Another reason why “human nature” sounds like fighting words is that it is at loggerheads with the French Enlightenment tradition that believes in the “perfectibility of man.”

But while some of these concerns are valid, the fact is the human universals identified by Brown apply to blacks and whites, Asians and aborigines. I am agnostic about the issue of racial differences, in part because I’m not clear on why they should matter even if they exist. Most of the good work on the subject—there’s a great deal of awful work as well—focuses on large aggregate and statistical differences between populations. Whatever may or may not explain these differences has no bearing whatsoever on how we should treat individuals as a matter of law, manners, or morality.

But one of the sources of the taboo against discussions of human nature does need addressing: the idea of the noble savage.

Jean-Jacques Rousseau is often credited with coining the phrase “noble savage,” though that honor belongs to John Dryden, who wrote in The Conquest of Granada (1670):

I am as free as nature first made man,

Ere the base laws of servitude began,

When wild in woods the noble savage ran.7

“The concept of the noble savage was inspired by European colonists’ discovery of indigenous peoples in the Americas, Africa, and (later) Oceania,” Steven Pinker writes. “It captures the belief that humans in their natural state are selfless, peaceable, and untroubled, and that blights such as greed, anxiety, and violence are the products of civilization.”8

Again, Rousseau didn’t coin the term, but he was the great popularizer of this myth. He wrote in 1755:

So many writers have hastily concluded that man is naturally cruel, and requires civil institutions to make him more mild; whereas nothing is more gentle than man in his primitive state, as he is placed by nature at an equal distance from the stupidity of brutes, and the fatal ingenuity of civilised man.9

“Rousseau reversed the poles of civilization and barbarism,” writes Arthur Herman. “His paeans of praise for primitive man, the ‘noble savage’…who lives in effortless harmony with nature and his fellow human beings, were meant as a reproach against his refined Parisian contemporaries. But they were also a reproach against the idea of history as progress.”10 For Rousseau, the advent of private property, the development of the arts, and the general advancement of human health and prosperity were actually giant steps backward.

Rousseau is considered by many to be the father of romanticism. And for a seminar on intellectual history, that is a fine way to describe him. But it is my argument that romanticism shouldn’t be understood as a school of art, literature, or philosophy but as a school of rebellion against the unnatural nature of the Enlightenment and all of the Enlightenment’s offspring: capitalism, democracy, natural rights, and science. The romantic spirit rebels against the iron cage of modernity, demanding a return to an imagined authenticity in harmony with nature. Romantic rebellion is less an argument and more of a primal yawp. It is a feeling that the world around us is dehumanizing, fake, artificial, and oppressive. “Romanticism is precisely situated neither in choice of subject nor in exact truth, but in a mode of feeling,” explained the French poet Charles Baudelaire (the man who coined the term “modernity,” as it happens).11

I will be returning to this point throughout the chapters that follow, but for now the important point is that this idea, this feeling, that modern man is corrupt and unnatural—or, more specifically, has been corrupted by modern society—suffuses vast swaths of our culture. It fuels a host of ideological and religious assumptions about past “golden ages” and nostalgic nostrums about how things used to be better in previous generations.

Romanticism is neither right nor left, because it is a pre-rational passion written into the human heart. It manifests itself in different ways and at different times across the ideological spectrum. It has been the fuel behind nationalism, populism, radicalism, and various forms of “reactionary” politics. It is also the wellspring of most of the great art of the last three hundred years, speaking to, and for, the parts of the soul that cannot speak through reason and science alone.

In short, it is a rebellion against the unnatural constraints of modern civilization. It shouts “I am not a number!” or “I am not a machine!” or “The man can’t keep me down!”

A common thread between various forms of left-wing romanticism, including assorted flavors of Marxism, and right-wing libertarianism and anarchism is that they make much of the fact that the state—or even civilization itself—is a form of institutionalized violence. As we will see, this is largely true. Where this insight goes off the rails is when it is assumed the past was less violent, that humans lived in peace and harmony in some golden age before the enslaving force of the state imposed itself.

“The idea that violence is rooted in human nature is difficult for many people to accept,” writes Francis Fukuyama in The Origins of Political Order: From Prehuman Times to the French Revolution. “Many anthropologists, in particular, are committed, like Rousseau, to the view that violence is an invention of later civilizations, just as many people would like to believe that early societies understood how to live in balance with their local environments. Unfortunately, there is little evidence to support either view.”12 Deirdre McCloskey rightly observes that “conquest, enslavement, robbery, murder—briefly, force—has characterized the sad annals of humankind since Cain and Abel.”13

According to Steven Pinker, author of The Better Angels of Our Nature: Why Violence Has Declined, if similar proportions of people died from violence in the twentieth century as did in most prehistoric societies, the death count of the twentieth century—allegedly the “bloodiest century”—would not be 100 million but two billion, or twenty times greater.14 This is because roughly one-third of primitive humans in small-scale societies died from raids and fights alone (though this is somewhat misleading, since the death rate for males is twice that of females).15

“To minimize risk, primitive societies chose tactics like the ambush and the dawn raid,” writes Nicholas Wade in Before the Dawn: Recovering the Lost History of Our Ancestors. “Even so, their casualty rates were enormous, not least because they did not take prisoners. That policy was compatible with their usual strategic goal: to exterminate the opponent’s society. Captured warriors were killed on the spot, except in the case of the Iroquois, who took captives home to torture them before death, and certain tribes in Colombia, who liked to fatten prisoners before eating them.”16 For generation after generation, day in and day out, warfare was normal.

This is no longer a debated point among most serious scholars. People who think we once lived in glorious harmony with each other—and the environment—aren’t scientists, they’re poets and propagandists. The evidence for mankind’s blood-soaked past can be found in the archaeological record, DNA analysis, the writings of ancient commentators and historians, and the firsthand reports of those remaining societies that have so far resisted modernity.

Napoleon Chagnon, the famous—and famously controversial—anthropologist, lived among the Yanomamö people in the Amazon for long stints starting in the 1960s and ending in the 1990s. He found that killing was a central institution of life.17 Roughly 44 percent of men over the age of twenty-five had participated in killing someone. One-third of adult male deaths were from violence and more than two-thirds of men over the age of forty had lost at least one close relative to violence.18

Chagnon found that the Yanomamö lived in a state of “chronic warfare.” The most common motives for raids and battles revolved around efforts to steal women, recover stolen women, or seek revenge for past abductions of women. Of course, men went to war for other reasons: blood feuds were particularly popular. But Chagnon did not find much evidence to confirm the prevailing orthodoxy of the day that warfare was “modern” and that, to the extent primitive societies resorted to war, it was because of scarce resources, specifically “protein scarcity.” This is a version of a very common assumption: that scarcity of resources is the chief cause of war. Obviously, this is not outlandish. But it is exaggerated. Wars are very often the by-product of pride, honor, and a desire for status.19

The barbarity of the past is hardly defined solely by the prevalence of war. Consider just two of the most obvious examples of what we today consider barbaric behavior: torture and slavery.

Torture, the deliberate infliction of pain or agony for punishment, fun, or profit, was the international pastime of premodern man (and it hardly died out suddenly in the 1700s either). In ancient societies many forms of torture were the preliminary rituals of human sacrifice. The Aztecs routinely burned victims alive, removed them from the flames, and then cut out their still-beating hearts.20 The Mayans skipped the burning for the most part and simply pinned their live victims to an altar and cut out their hearts.21

The Assyrians deserve their status in the torture hall of fame. Flaying—by which the skin is removed while the victim is still alive—was particularly popular. Staking was even more revered. The best torturers were able to do it in a way that left the victim alive and suffering for days.22 The Persians were inventive as well. One method involved simply forcing a person to stand in a room full of very fine ash for as long as he could. When he collapsed from fatigue, he would inhale the ashes and slowly suffocate.

That seems preferable, however, to “sitting in the tub.” In this practice, the victim was placed in a wooden tub with only his head sticking out. The executioner would then paint the victim’s face with milk and honey. Flies would begin to swarm around the victim’s nose and eyelids. The victim was also fed regularly, and fairly soon he would virtually be swimming in his own excrement, at which stage maggots and worms would devour his body. One victim apparently survived for seventeen days. He decayed alive.23 (Scaphism, a variant of this technique, involved more or less the same thing, but with the victim tied to boats or logs.)

If you’re the sort of person who enjoys this sort of thing, the Internet is a smorgasbord of lists of torture methods, from sewing animals into living victims so they would have to eat their way out, to using fire to force rats to eat their way in. The ancient Greeks would not even consider confessions unless elicited by torture. The Romans had the same practice.24 They also perfected crucifixion, from which we get the word “excruciating.” The Chinese had lingchi, or death by a thousand cuts.

The centrality of torture as a tool of statecraft around the world cannot be exaggerated. But few societies put more time, energy, and ingenuity into the practice than medieval Europeans.25

Diehard members of the cult of the Noble Savage may want to say all of these cultures and civilizations were subsequent to man’s fall from grace. But there is simply nothing in the archaeological record to support that. “We need to recognize and accept the idea of a nonpeaceful past for the entire time of human existence,” writes Stephen A. LeBlanc, co-author with Katherine E. Register of Constant Battles: Why We Fight. “Though there were certainly times and places during which peace prevailed, overall, such interludes seem to have been short-lived and infrequent….To understand much of today’s war, we must see it as a common and almost universal human behavior that has been with us as we went from ape to human.”26

Then there’s slavery.


It is surely true that slavery was less common among primitive man than among the societies that arose after the agricultural revolution. That is not because primitive man was more moral; it is because primitive man was so much poorer. Slaves are a very large expense for nomadic bands. Guarding an enemy who doesn’t want to be part of the group is costly and dangerous. Children can be taken in as assets—a common practice in many primitive societies, notably among American Indians—and women can be forced into marriage, very often a kind of slavery. But captured warriors from another tribe are a liability. Better to kill them, often theatrically, for the amusement of the victors.

After the agricultural revolution roughly 11,000 years ago, slavery emerges almost everywhere. The most ancient texts make reference to it. The Bible takes it as a given in human affairs. The Code of Hammurabi makes helping a slave escape a crime punishable by death.27 There are records of slavery in China going back to 1800 B.C.28

For understandable reasons, America’s shameful experience with slavery informs the way we talk about the institution. That’s right and proper. But it also distorts our understanding of it. As Thomas Sowell has chronicled, Americans tend to believe—because it is what they are taught—that slavery is an inherently racist institution.29 Some even seem to believe that slavery is a uniquely American sin. America certainly must take ownership of its use of slaves and the central role racism played in it. But the conventional understanding gets the causality backward. American racism stems from slavery, not the other way around.

Historically speaking, there are two remarkable aspects of American slavery. The first is the hypocrisy. Other societies relied on slavery more than we did, and some were arguably crueler to their slaves (though American slavery was plenty cruel). But none of those societies were founded on principles of universal human rights and dignity. The Romans, Greeks, Chinese, and Egyptians were not hypocrites for keeping humans in bondage; they sincerely believed that it was natural (even Aristotle said so).30 But America was born with the Declaration of Independence and the words “All men are created equal.” That is irreconcilable with slavery, no matter the rationalization.31

Which brings us to the second remarkable thing about American slavery. Against the backdrop of the last 10,000 years, the amazing thing about American slavery is not that it existed but that we put an end to it. Over the last thousand years, there were many efforts to abolish slavery. Many failed, and many more were only half measures, establishing various forms of de facto slavery, such as serfdom. But beginning in the late eighteenth century, slavery was outlawed across much of Europe and in most northern colonies and states in America. England abolished the slave trade in 1807. The Dutch followed in 1814.32 The Congress of Vienna, which determined the fate of post-Napoleonic Europe, condemned slavery.33 Britain would abolish slavery in all of its colonies in 1834, though the Dutch would not follow suit until 1863.34

America, meanwhile, though it banned the slave trade in 1808,35 was otherwise tardy, and the effort was bloody and painful. But we officially ended the practice in 1865, with the passage of the Thirteenth Amendment.

The timing was not coincidental. “The fact is that slavery disappeared only as industrial capitalism emerged,” writes economist Don Boudreaux. “And it disappeared first where industrial capitalism appeared first: Great Britain. This was no coincidence. Slavery was destroyed by capitalism.”36 Adam Smith not only opposed slavery on moral grounds* but also considered it incompatible with the free market. “It appears, accordingly, from the experience of all ages and nations, I believe, that the work done by free men comes cheaper in the end than the work performed by slaves.”37 He also wrote that “whatever work he does beyond what is sufficient to purchase his own maintenance, can be squeezed out of him by violence only, and not by any interest of his own.”38

The fact that we needed a war to end the institution demonstrates that not everybody saw the light all at once. Nor is it altogether accurate to say that the war was launched to end slavery, though the war would never have started absent slavery. But what is true is that a liberal democratic order—and by extension a modern economy—cannot last while tolerating slavery. An array of internal contradictions led to the Civil War. As Abraham Lincoln put it, “I believe this government cannot endure, permanently half slave and half free.” And, famously invoking Jesus’s admonition, “A house divided against itself cannot stand.”39

Take the Declaration of Independence out of it, and American slavery was normal. MIT economists Daron Acemoglu and Alexander Wolitzky write:

Standard economic models of the labor market, regardless of whether they incorporate imperfections, assume that transactions in the labor market are “free.” For most of human history, however, the bulk of labor transactions have been “coercive,” meaning that the threat of force was essential in convincing workers to take part in the employment relationship, and thus in determining compensation. Slavery and forced labor were the most common forms of labor transactions in most ancient civilizations, including Greece, Egypt, Rome, several Islamic and Asian Empires, and most known pre-Columbian civilizations…40

In other words, the very notion that humans can sell their services or labor in a free market is a remarkably recent idea. Conversely, while almost all socialist and communist doctrine claims to oppose slavery—including so-called wage slavery in the case of the Marxists—the reality of socialism taken to its logical conclusion has often led to slavery in the form of forced labor. Command economies are just that: command economies. The Soviet Union, Nazi Germany, communist China, and North Korea have all widely used forced labor. China’s laogai system was set up in the 1950s and modeled on the Soviet Union’s gulag. “Laogai” means reform through labor, and the ostensible idea was to forge committed Communists through indoctrination reinforced by backbreaking labor. The system became a profit center for party leaders and exists to this day, though the government has ditched the name “laogai” in favor of jianyu, or “prison.” But the practice endures. In the 2000s, it was revealed that administrators continued to profit from prisoners even after working them to death, by selling their organs.41

That China continued to rely on slave labor even after it embraced “capitalism” isn’t an indictment of capitalism, properly understood, but of authoritarianism. Authoritarian regimes can make profits, but that doesn’t make them free-market systems. The slaveholding rulers of the South got very rich, but everyone else stayed poor—or enslaved—because they were denied the full scope of liberty and rights necessary for capitalism to work. It’s a good thing China has embraced some market principles, because history shows that the development of a strong middle class creates a demand for responsive and accountable government. But China will not be a free country until the Communist Party is laid on the ash heap of history, where it belongs. For now, it should be understood as a de facto authoritarian aristocracy, as we will see.

Disciples of the noble savage and radical egalitarians aren’t the only constituencies vexed by the reality of human nature. Partisans for the free market often run up against the inconvenient fact that Homo sapiens and Homo economicus are not synonyms.

The phrase Homo economicus, economic man, emerges as a criticism of John Stuart Mill and other thinkers who were seen—usually unfairly, particularly in the case of Adam Smith—as reducing humans to purely rational, profit-maximizing, economic beings. It’s not at all clear to me that Mill actually believed that man was purely a profit maximizer and it is clear to me that Smith did not. In other words, Homo economicus is one of those terms, like “social Darwinism,” that has few if any adherents but is especially useful as an intellectual epithet. Mill was quite clear that his definition of man as a profit maximizer was bound to the study of economics:

Geometry presupposes an arbitrary definition of a line, “that which has length but not breadth.” Just in the same manner does Political Economy presuppose an arbitrary definition of man, as a being who invariably does that by which he may obtain the greatest amount of necessaries, conveniences, and luxuries, with the smallest quantity of labour and physical self-denial with which they can be obtained in the existing state of knowledge [emphasis mine].42

An expert on football would have a definition of Homo footballis that would ignore what the players did off the field. Is it any scandal that an economist would have a definition of people that was contingent on economic activity?

Still, it is true that many economists and free-market ideologues have all too often looked at human behavior through a narrow economic lens. And, to be fair, Marxism has a tendency to reduce all questions to economic concerns as well. The old adage “If all you have is a hammer, every problem looks like a nail” seems apposite. Some free-market ideologues do sound as if they believe in Homo economicus.

Regardless, the fact is that humans are not defined by the pursuit of profit, even if they often pursue profit. Many critics of capitalism find the idea of “economic man” a useful straw man in their indictment of capitalism, because bound up in the idea of economic man is the notion that “greed is good,” in the words of Oliver Stone’s caricature Gordon Gekko in 1987’s Wall Street.

Wherever you come down on these issues, it’s fair to say that we tend to think humans are motivated by financial greed far more than they actually are. To be sure, greed is a staple of human nature, but greed for money has only been around for as long as money has been around—and, in evolutionary terms, that hasn’t been a very long time. Surely reasonable people can agree that greed predated money. One can easily see why greed for food and other basic resources would evolve. We can imagine countless circumstances in human history where the altruistic man starved to death, while the greedy one lived another day and hence passed on his genes.

But just as greed—or covetousness—is natural, so is altruism. Without altruism, it is unlikely the human race would have made it this long. Altruism can be driven by compassion, another universal human tendency. But it also is closely linked to gift exchange, reciprocity, and cooperation. Long before there were coins, the economy of primitive man was governed by gift exchange and reciprocity: I do this thing for you; you do something for me. I give you a hunk of meat; you help me fend off a bully eager to rob me. (The social economy of prisons—arguably the closest approximation to a state of nature in modern society—works according to these principles.) Richard Leakey and Roger Lewin have ascribed the essence of the survival of the human species to the concept of reciprocity. Small bands of early humans could only survive if they learned to share resources “in an honored network of obligation.”43 It is a well-established finding in anthropology, psychology, and sociology that people who violate the norms of reciprocity are shunned by the larger group. Even among criminal organizations—prison gangs, the Mafia, etc.—the rules of reciprocity must be honored within the group. People who go beyond these norms—the generous, the philanthropic, etc.—are admired and often endowed with authority, political or moral, over others. The “big men” who lead many primitive societies often earn their status by the perceived justness with which they distribute resources to the group.44

Which brings us to admiration. Of course, all other things being equal, people want to be rich rather than poor. But what they want even more is to be admired, respected, and valued. Adam Smith understood this—and so much else about human nature—in his Theory of Moral Sentiments:

Man naturally desires, not only to be loved, but to be lovely; or to be that thing which is the natural and proper object of love. He naturally dreads, not only to be hated, but to be hateful; or to be that thing which is the natural and proper object of hatred. He desires not only praise, but praiseworthiness; or to be that thing which, though it should be praised by nobody, is, however, the natural and proper object of praise. He dreads, not only blame, but blame-worthiness; or to be that thing which, though it should be blamed by nobody, is, however, the natural and proper object of blame.45

A desire to be admired is hardwired into us, just as it is in chimpanzees. There’s no debate about this, as far as I can tell, among the diverse range of disciplines that look at such things. The researchers, however, focus less on the concept of admiration and more on the idea of “status.” Status is the essence of chimpanzee politics, as Frans de Waal persuasively argued at book length in his Chimpanzee Politics: Power and Sex Among Apes.46 It would be a strange believer in evolution who thought that something central to the life of our nearest genetic cousins is irrelevant to us.

Sociologists distinguish between two kinds of status in all human societies: ascribed status and achieved status. “Ascribed status” refers to what you are born with. Royalty is the quintessential example of ascribed status: the belief that some people are just born better or worse thanks to their lineage or parents. In numerous societies, India arguably the most famous, the whole of the population was divided up into different categories of ascribed status called castes. These castes set the acceptable parameters of virtually every meaningful pursuit in life, including the places one could live, whom one could marry, and even what kind of occupations one could hold. Europe’s caste system was perhaps less austere but no less binding, with categories of serfs, peasants, nobility, and other rankings of humans’ innate worth.

One of the greatest yet among the least appreciated achievements of the American Revolution was the decision to abolish such things. One very good reason it’s so unappreciated is that we maintained another version of ascribed status: slavery. In the Roman tradition of slavery, slaves were not born, they were made. The child of a slave did not inherit that status. In the American South, defenders of slavery realized that this common tradition of slavery was incompatible with their system, so they adopted the Aristotelian notion that some people are simply slaves by nature, making slavery an ascribed status.47

Though we have abandoned formal, legal ascribed status in America, the desire for status is still a fact of our lives. If you had a typical grade school or high school experience, you know that status seeking is the very heart of adolescent social life. Adolescents talk not of status but of popularity, though this is not that meaningful a distinction. Cliques are all about status. So are the petty and often cruel contests that define the politics of the locker room and the playground. The same goes for prisons.

More broadly, one need only look at the enduring success of political dynasties to see that inherited status still plays a big role in our culture. Kennedys, Bushes, Clintons, Roosevelts, Romneys—we ascribe status to the progeny of famous people whose worth and honor are unearned by their own actions. In marketing terms, certain last names and bloodlines have in effect become a kind of inherited title, though today we would call it a “brand.”

This is even more the case outside of politics. We cavalierly talk about “Hollywood royalty” without really understanding what we mean by that. Out of a concern for my eternal soul, I’ve vowed never to write about the Kardashians, but I think I am not putting it in too much peril to simply note that we treat that ridiculous band of airheads and slatterns as some sort of celebrity gentry.48

This is natural. Every family has within it a spark of dynastic ambition. I’m reminded of Tywin Lannister’s sermon to Jaime Lannister on the importance of family in Game of Thrones: “Your mother’s dead. Before long I’ll be dead, and you and your brother and your sister and all of her children, all of us dead, all of us rotting underground. It’s the family name that lives on. It’s all that lives on. Not your personal glory, not your honor, but family. You understand?”49

Status is closely linked to our natural instinct for authority and hierarchy—an instinct that is found in just about every species of animal that lives in groups. Dogs, chickens, and apes have hierarchies and pecking orders. Jonathan Haidt notes that these impulses are so baked into us they manifest themselves in language. “The urge to respect hierarchical relationships is so deep that many languages encode it directly,” he writes. “In French, as in other Romance languages, speakers are forced to choose whether they’ll address someone using the respectful form (vous) or the familiar form (tu). Even English, which doesn’t embed status into verb conjugations, embeds it elsewhere. Until recently, Americans addressed strangers and superiors using title plus last name (Mrs. Smith, Dr. Jones), whereas intimates and subordinates were called by first name. If you’ve ever felt a flash of distaste when a salesperson called you by first name without being invited to do so, or if you felt a pang of awkwardness when an older person you have long revered asked you to call him by first name, then you have experienced the activation of some of the modules that comprise the Authority/subversion foundation.”50

Haidt’s use of the word “foundation” refers to part of the Moral Foundations Theory he and other researchers formulated. He lays it all out in his path-breaking book The Righteous Mind: Why Good People Are Divided by Politics and Religion. If you can read that book and not come away confident that there is something called “human nature,” you might as well put down this book too.

Moral Foundations Theory holds that there are six components to moral sentiments that form the basis of all forms of moral reasoning. They are: care, fairness, liberty, loyalty, authority, and sanctity. How these foundations are applied, and how they interact, explains all of the variations in human cultures and societies when it comes to how we define right and wrong.

Indeed, the need for norms of behavior is another universal facet of human nature. As we’ve seen, there is a lot of variability in the kinds of norms we establish, but the need for norms is uniform across all societies. And certain basic rules of conduct or moral behavior seem to be universal—and pre-rational—as well. When someone cuts in line in front of us at the grocery store, there is a chemical reaction in our brains that fills us with anger. Our reaction is often quite disproportionate to the harm actually done to us. That’s because we evolved to see the stakes in norm violations as much greater than they are at the local Walmart or Kroger. A norm violation on the African savanna could be a matter of life and death.

Paul Bloom recounts how the innate and universal tendency of children to tattle on their siblings and classmates seems to be an early form of norm enforcement. One key indicator of this is that kids rarely make up stuff when they rat out each other. Psychologists Gordon Ingram and Jesse Bering studied tattling by children in an inner-city school in Belfast and found that “the great majority of children’s talk about their peers’ behavior took the form of descriptions of norm violations.”51

Norms matter evolutionarily because they are the sinew of all cooperation. If primitive man lived a solitary life, as Thomas Hobbes believed, norms wouldn’t matter. But we evolved in groups, specifically tribes. Without cooperation, we would still be a mid-tier species, not the planet’s apex predator. And cooperation is impossible without norms, or rules. Think about it for just a moment and this becomes obvious. A hunting party cannot work as a group unless it has agreed upon rules, including lines of authority. What separates an army from a rabble is that the soldiers know their place and their duties, even when they’re not being supervised.

Charles Darwin himself speculated about how cooperativeness—altruism, reciprocity, consensus around norms, and, most of all, unity—was the key to human survival. The tribe that works together survives to pass its genes on. “If…the one tribe included…courageous, sympathetic and faithful members, who were always ready to warn each other of danger, to aid and defend each other,” Darwin observed, “this tribe would without doubt succeed best and conquer the other.”52 Cooperation explains the evolution of language, religion, warfare, and almost every uniquely human endeavor. But the drive and desire to cooperate isn’t just a society-wide phenomenon. Politics, long before we had the word “politics,” has been about forming coalitions: within the band, the tribe, or any other social unit. Chimpanzees and humans alike form coalitions around all manner of interests. These coalitions are like subtribes, and they are every bit as prone to the logic of us-versus-them as the tribe as a whole is to foreign enemies. As we will see, the dire shape of our politics today is a function of this tendency, as Americans break up into “tribal” coalitions against other Americans they see only as “the other.”

Adherence to norms is impossible without some collective understanding that norms must be enforced. Paul H. Robinson and Sarah M. Robinson’s book Pirates, Prisoners, and Lepers: Lessons from Life Outside the Law brilliantly and, to my mind, incontrovertibly demonstrates how notions of punishment and retribution are not merely universal (a relatively uncontroversial point) but absolutely necessary for cooperation. This is much more controversial. A growing number of criminologists and ethicists see punishment itself as illegitimate and dangerous. David Garland, a professor of sociology and law at New York University, insists: “It is only the mainstream processes of socialization (internalized morality and the sense of duty, the informal inducements and rewards of conformity, the practical and cultural networks of mutual expectation and interdependence, etc.) which are able to promote proper conduct on a consistent and regular basis.”53 Another scholar claims “the institution of criminal punishment is ethically, politically, and legally unjustifiable…[A] society concerned about protecting all of its members from violations of their claims of right should rely on institutions other than criminal punishment.”54

But, as the Robinsons show, with exhaustive citations from both psychological research and the historical record, cooperation is unsustainable without some kind of sanction against those who do not cooperate. If you and nine of your friends are tasked with digging a ditch in the hot sun on the promise that you will all be rewarded at the end of the day with a great meal, how long will you tolerate an able-bodied free rider who sits in the shade under a tree as you toil in discomfort? How likely is it that the group will include the slacker in the meal at the end of the day? Countless laboratory and real-life experiments show that the free rider will be punished, certainly with scorn, and usually with exclusion from the reward.

This instinct to enforce group norms manifests itself everywhere from primitive tribes to sports teams and even, as the Robinsons demonstrate, to utopian hippie communes where everyone is supposed to be free to let their freak flags fly. And whenever norms are not enforced by a central authority or the collective itself—or both—the unit disintegrates. Even a young Bernie Sanders was kicked off his commune because he was less interested in doing the work of socialism than he was in talking about the need for socialism.55

Let us move on to a final, crucial facet of human nature: meaning. We are creators of meaning. What do I mean by “meaning”? Simply that we have a natural tendency to imbue things, practices, people, events, ideas, and everything else around us with significance beyond the rational and material. Just consider the meaning we invest in eating food. The family dinner, breaking bread with old friends, Thanksgiving—we invest these things with meaning far beyond the need for sustenance. There’s a vast anthropological and sociological literature dedicated to the role eating plays in every culture. The primary form of what social scientists call “gift exchange” has always been, at least until very recently, the sharing of food.56

For millennia food—its preparation, its blessing, its sharing—has been the sinew of society. Many major religious holy days in Judaism, Christianity, and Islam involve food—either the communal eating of it or the communal abstention from it. Indeed, when some Christian denominations take Communion—i.e., not just joining the community of fellow Christians but entering into the body of the church itself—they do so by eating the transubstantiated flesh of Jesus. Depending on the denomination, this is a solemn moment or a celebratory one, but in all cases the significance is greater than merely snacking on “my little cracker,” in the words of Donald Trump.57

In subsistence societies, a great feast combined every aspect of life. It was a celebration, a cause for thanksgiving as well as entertainment. The feast was also the keystone of politics: the Big Man’s authority derived in very large part from how he distributed food at such gatherings to members of the clan or tribe. The feast was also a vital tool of diplomacy. Peace with an enemy tribe was often brokered and usually celebrated at a feast. Marriages, one of the central tools of alliance making, were both sealed and solemnized with a great feast.58

The Hebrew Bible is crammed with countless rules about what kind of animals we can eat, how they must be prepared, etc. The modern mind looks at such rules and says, Ah, that was about hygiene, or some such. But that misses the forest for the trees. No doubt, hygiene played a role in such norms, but to say Kashrut (the kosher dietary laws) is just the ancient version of a sign saying “Employees must wash hands” is absurd. Prior to the scientific revolution, we did not compartmentalize meaning as we do today.

Think of it this way: For some primitive tribes a tree was many things—a source of fuel, a resource for shelter and tools, a plaything for children, perhaps a source of food, and a manifestation of some divine purpose or entity. Separating the practical ways of seeing a tree from the transcendent ones is a modern invention. To the extent anyone does believe we are simply Homo economicus and nothing more, they blind themselves to the vast scope of meaning that cannot be reduced to economic inputs.

Ernest Gellner argues that, in the transition to modernity, we lost something that defined how humans saw the world until three hundred years ago.59 We used to layer meaning atop meaning, horizontally, like one sheet of tinted film atop another. Utility and sacredness, habit and ritual, convenience and tradition, metaphor and fact—each lay atop one another, producing a single lens through which we viewed the world. The scientific revolution changed that. We now hold each sheet of film separately and look at the world through it. We have one sheet for religion. Another for business. Yet another for family. We pick up each one like a special and separate magnifying glass for each specimen.

This division of mental labor has helped to produce enormous prosperity, cure diseases, reduce violence, and liberate humanity from millennia of superstitions that kept individual humans from realizing their potential.

But it has also produced enormous challenges, because this way of seeing the world is unnatural. Our modern tendency to put different aspects of our lives into distinct silos—religion over here, entertainment here, food here, politics over there—is wholly alien to how man evolved to live. In our evolutionary natural habitat, we stack meaning atop meaning. Family is a sacred bond, but it is also a survival mechanism. Food is sustenance, but it is also an opportunity for community and sacralization. Politics isn’t an artificial mechanism or a separate sphere of life, blocked off from religion, survival, or community; it is how every important question of daily life is answered, and it is how God, or the gods, or our ancestors, wanted us to live—all at the same time.

When we separate out the different meanings of our lives, each one seems more diluted, less all-encompassing and fulfilling. We miss the unity of the pre-Enlightenment mind. And so we yearn to restore meaning where it has been lost. We yearn to end the division of labor and find ways of life that are “authentic” and “holistic.” We attach ourselves to ideologies that promise unity, where we are all part of the same family or tribe. We flock to leaders on the left and the right who promise to tear down walls and end division—but always on our terms. So much of the rhetoric about the evils of money and the “capitalist system” isn’t really about money but about how so much of our lives has been chopped up by this division of psychic and spiritual labor, banishing the ecstasy of the transcendent from our daily lives. This, in short, is the romantic temper. The romantic wants to pull down the walls of compartmentalized lives and restore a sense of sacred or patriotic unity of meaning and purpose.

Still, just because the modern mind compartmentalizes more than the ancient one did doesn’t mean the walls are as high and as sturdy as we think. They are being broken down every day, in our own minds and in the larger community. That is because our inner tribesman doesn’t like this world, and he is desperate to get back to where he came from. The problem is that seeking such unity in all things is the first step in leaving the Miracle of modernity. The desire to decompartmentalize every facet of life—work, family, politics, economics, art, etc.—is a reactionary one. It is the totalitarian temptation, and a corruption of the civilization we are blessed to live in. And it is utterly natural.

* Smith writes in The Theory of Moral Sentiments, “There is not a negro from the coast of Africa who does not…possess a degree of magnanimity which the soul of his sordid master is too often scarce capable of conceiving. Fortune never exerted more cruelly her empire over mankind, than when she subjected those nations of heroes to the refuse of the jails of Europe, to wretches who possess the virtues neither of the countries which they come from, nor of those which they go to, and whose levity, brutality, and baseness, so justly expose them to the contempt of the vanquished.” Adam Smith. “V.1.19.: Of the Influence of Custom and Fashion upon our Notions of Beauty and Deformity.” The Theory of Moral Sentiments. Library of Economics and Liberty. http://www.econlib.org/library/Smith/smMS5.html