Dominic D. P. Johnson
It was not the legions which crossed the Rubicon, but Caesar.
—Napoleon Bonaparte
In 401 B.C., Cyrus the Younger launched an expedition into Persia against his rival and brother, Artaxerxes II. At the battle of Cunaxa, Cyrus was killed, and in the ensuing peace negotiations, his second in command, along with all the other generals and many lower-level commanders, were tricked and murdered. Famously, the remaining army of Greek soldiers—“the Ten Thousand”—was left stranded in the middle of hostile territory, hundreds of miles from home, and, most notably, leaderless. Their remarkable march to freedom, fighting as they went, was recounted for history by a soldier named Xenophon (4th century B.C./1989). According to legend, the key to their success was the leadership that emerged organically out of a desperate situation. Leaders, including Xenophon, were “elected” by the troops, and many decisions were made democratically. Against all the odds, the Ten Thousand made it home after a year of long-distance travel and war.
Leadership is a subject that has fascinated people for millennia (Hogan & Kaiser, 2005; King, Johnson, & van Vugt, 2009; Tecza & Johnson, in press; van Vugt, 2006). This should be no surprise, considering that we are great apes and, more broadly, social mammals, groups characterized by strong dominance hierarchies (Mazur, 2005; van Vugt & Tybur, Chapter 32, this volume). When examining leadership in our own societies, we often look to leaders of the past for inspiration. Xenophon's story is remarkable because it bucks the trend of top-down hierarchical leadership that is so familiar to us from contemporary and historical societies, especially in military contexts. Indeed, it hints that democratic leadership was a critical ingredient of their survival against the odds. Most striking of all, perhaps, the idea of the “marching republic” is appealing because small-scale hunter-gatherer societies, like those in which modern humans evolved during the Pleistocene epoch, were also supposedly egalitarian and did not have dominant leaders. But they did have lethal intergroup conflict.1 Xenophon may therefore symbolize important aspects of the origins of human leadership in war.
To begin, what is the evolutionary context for leadership and war? What are the broader patterns of coordination and conflict among other animals, among our primate relatives, and among early human societies? And in what respects are human leadership and war unique?
Many of our physiological and psychological mechanisms go much further back than the Pleistocene epoch, and are found in all primates, all mammals, and many other vertebrates. These fundamental phenomena can have relevance to war and leadership, even if they evolved for other reasons. Examples include our fight-or-flight response, dominance hierarchies, and herding behavior. Such ancient traits undoubtedly affect leadership and followership in war, but they are not adaptations for war. A different question is whether there are unique leadership and followership traits that evolved specifically to deal with war—or at least with intergroup conflict more generally.
Collective movement is evident in a wide range of species, from ants, locusts, and fish to birds and antelope (Couzin, Krause, Franks, & Levin, 2005; King et al., 2009). But there is little evidence that such traits represent leadership in any strict sense (much of it appears to be self-organizing, individual behavior), nor that they are specialized for fighting. Most animals fight, and many fights occur between groups, but these are not really organized or led. Some animals do engage in something approaching war, in terms of synchronous lethal violence against other groups—notably ants, lions, and wolves (Wrangham, 1999a). Such conflicts can be initiated by the individuals with the most at stake. Others follow. For example, in group-on-group encounters, female lions with cubs—those with the most to lose—tend to lead approaches towards intruders (McComb, Packer, & Pusey, 1994). But while both leadership and war have precedents in the animal kingdom, they are rudimentary, and the combination—war leadership—seems absent.
One of the most prominent characteristics of all primate groups is a strong social hierarchy, and these show signs of nascent leadership—even or especially in conflict. For example, male chimpanzees may form coalitions to oust the alpha male, while alpha males themselves tend to side with the weak when fights break out, which preserves the status quo (de Waal, 1998). However, these are within-group contexts. Is there any evidence of leadership in between-group conflict? Many primates have noisy and vigorous intergroup fights, but they are not usually led or lethal (Manson & Wrangham, 1991). Chimpanzees are unique among primates for deadly intergroup conflict, in which small parties carry out raids on members of neighboring groups (Wilson et al., 2014; Wrangham, 1999a). Although these attacks appear to be deliberate (moving stealthily into bordering territory), it is not clear from the limited data which individuals, if any, “lead” or initiate lethal raids. There is some indication that higher-ranking males and/or those with more at stake (in status or offspring) are more likely to initiate approaches to intruders, territorial border patrols, and raids (Boehm, 2001, pp. 27–29; Gilby, Wilson, & Pusey, 2013; Wilson, Hauser, & Wrangham, 2001). Still, even if this is confirmed, it does not differ from other contexts, where high-ranking individuals may make the first move in foraging or hunting, but cannot compel others to follow. While dominance hierarchies and intergroup aggression clearly have deep roots among our mammalian ancestors, if we want to examine leadership in war we have to turn to humans themselves.
The nonhuman animal examples are interesting because they suggest (a) you do not need leadership for collective action and (b) lethal intergroup aggression is not unique to humans. However, while commonalities are important, the animal analogues also highlight what is different about human leadership—especially in war (Table 29.1). Two overarching attributes in particular set us apart. First, human leadership is unique because of sophisticated cognition: For example, theory of mind, language, forward planning, and strategizing can be brought to bear by both leaders and followers. Second, human leadership is unique because of sophisticated social organization: For example, large groups, divisions of labor, chains of command, and intergroup alliances all broaden the scope of leadership and followership. Both sets of characteristics enable and extend the practice of war, while also increasing the necessity and utility of leadership. Indeed, Ferguson finds that “war by tribal peoples displays a gradient of more sophisticated organization and practice linked to increasing social complexity and political hierarchy” (Ferguson, 2012, p. 2232). Leadership—both individual and structural—is therefore heavily implicated in the evolution of war, even if it came late in the game in human evolution more generally.
Table 29.1 Key Similarities and Differences in War Leadership Across Contexts

| | Animals | Primates | Small-Scale Societies | Chiefdoms | History | Today |
| Dedicated war leaders | No | No | No | Yes | Yes | Yes |
| Voluntary participation | Yes | Yes | Yes | Variable | Variable | Variable |
| Benefits to leaders | Yes | Yes | Yes | Yes | Yes | Variable |
| Benefits to warriors | Yes | Yes | Yes | Variable | Variable | Variable |
| Costs for leaders | Yes | Yes | Yes | Variable | Variable | Variable |
| Costs for warriors | Yes | Yes | Yes | Yes | Yes | Yes |
| Sexes involved | Both | Mainly males | Males | Males | Males | Mainly males |
| Size of armies | Fewᵃ | Tens | Tens to hundreds | Hundreds to thousands | Tens of thousands | Millions |
| Military institutions | No | No | No | Yes | Yes | Yes |

ᵃ Although ant battles can involve hundreds of individuals.
Let's look first at the context of leadership in general. Our best model for human evolution during the Pleistocene epoch is represented by hunter-gatherer societies—seminomadic kinship bands of a few dozen people. A considerable literature concurs that they are remarkably egalitarian, with equal rights, little or no private property, and no clear leaders (Boehm, 2001; Lee & Daly, 2004). Although there is variation, hunter-gatherers are at least much more egalitarian than our primate forebears or the larger chiefdoms, kingdoms, and civilizations that came later. Among the common traits of small-scale hunter-gatherers are that “Leadership is less formal and more subject to constraints of popular opinion than in village societies governed by headmen and chiefs. Leadership in band societies tends to be by example, not by fiat. The leader can persuade but not command” (Lee & Daly, 2004, p. 4). For example, the !Kung of the Kalahari and the Hadza of Tanzania had “either no leaders at all, or temporary leaders whose authority was severely constrained” (Gowdy, 2004, p. 391).
Of course, dominance relationships and power struggles do exist. Indeed, leadership of some form or other may be a human universal (Brown, 1991; van Vugt, 2006). But small-scale societies have social mechanisms for keeping overly domineering individuals in check, and restricting authority to specific domains of expertise (Boehm, 2001). As Tim Ingold put it, “To eliminate distinctions of power…is not the same as eliminating power itself. Despite their egalitarianism, hunter-gatherers generally attribute great importance to power and its effects. For them, power is not power over, nor are its effects coercive in nature. Rather, power takes the form of the physical strength, skill, or wisdom that draws people into relations clustered around individuals renowned for one or more of these qualities” (Ingold, 2004, p. 404). In the evolutionary psychology literature, this phenomenon has focused attention on the role of status, prestige, and coordination, rather than dominance (Henrich & Gil-White, 2001; Price & van Vugt, 2014; van Vugt & Tybur, Chapter 32, this volume). Ingold has concerns about the concept of “prestige,” because it “suggests a competitiveness and ostentation which are wholly foreign to the tenor of hunter-gatherer life,” but nevertheless recognizes that it serves to “bring out the point that power works by attraction rather than coercion. Bands do have leaders [or, at least, instances of leadership], but the relationship between leader and follower is based not on domination but on trust” (Ingold, 2004, p. 404).
Such leadership, however informal or weak, may translate into the realm of conflict and war as well. Just as good hunters may be consulted on hunting matters or lead foraging expeditions, so skillful fighters are often consulted on intergroup conflicts or lead raids (Boehm, 2001). Raiding is generally voluntary, and often widely discussed, even though individuals with greater experience or motivation may make the case for offensive action, suggest a strategy, and take the lead. But in hunter-gatherers there are still no real war leaders.
Even among larger tribal societies, Boehm notes that some “have panels of elders who attempt to resolve feuds, but any such resolution is totally voluntary for the parties concerned. There is no centralized coercive power to stop internecine conflict, just as there is no centralized power to make decisions of war and peace” (Boehm, 2001, p. 97). Instead, tribes tend to make decisions by consensus. Ferguson's review of the literature also found that “with some exceptions, tribal warfare relies on consensus and voluntary participation” (Ferguson, 2012, p. 2233). For example, among the Mae Enga of Highland New Guinea, long and inclusive meetings are held to debate whether or not to launch any major attack on a rival group, with everyone permitted to speak. “Big Men” do not interfere, except to summarize and confirm the consensus decision (Meggitt, 1977).
Planning is one thing. Fighting is another. While one can take time and consult others at length in deciding what to do, where, when, how, and so on, “in the thick of combat it is difficult for the entire group to talk over its next move” (Boehm, 2001, p. 97). Of course, this is a familiar problem of combat leadership throughout military history (van Creveld, 1985). Though difficult, leadership in battle is nevertheless attempted in small-scale societies just as it is in modern war. The Meru of Kenya would conduct “carefully planned” raids on cattle, “moving and attacking in specialized formations. The raid organizer was in command, though if courses of action were disputed, men could switch loyalties to other leaders” (Ferguson, 2012, p. 2232). Among the Mae Enga, during battles themselves, experienced “fight leaders” played an important role, switching between leading attacks at the front line and directing the action from the sidelines (Meggitt, 1977, p. 68). When multiple groups ally to fight together, strategies are planned in advance, and a “supreme chieftain” may be assigned to coordinate them in battle (though not always effectively).
Ferguson (2012, p. 2237) notes that because warriors' participation is voluntary and they can vote with their feet, in “the absence of the power to punish for behavior in battle,” leaders are constrained in what they can expect and achieve. Others also emphasize the problem of enforcement in coalitionary conflict—which, given the risk of injury or death, represents the mother of all collective action problems (Tooby & Cosmides, 1988). However, punishment or other social consequences may be present even if not obvious or direct. For example, Mathew and Boyd (2011) found that among the Turkana of East Africa, cowardice and desertion in warfare could result in sanctions, physical punishment, or fines by the wider group. Indeed, they argue that without this system of punishment, collective action for war would not be possible. Still, if and when the risks are sufficiently low and the benefits are sufficiently high, punishment may not be necessary for individuals to be motivated to participate (Chagnon, 1988; Johnson & MacKay, 2015; Manson & Wrangham, 1991; Tooby & Cosmides, 1988).
There is a considerable literature on the ethnography of small-scale societies, but while war is a common topic of discussion and analysis, the role of war leadership is much harder to find. Where it does arise, insights are often combined for hunter-gatherers, horticulturalists, and pastoralists (even though the socio-ecological context can be quite different). In the absence of any established literature or theory on war leadership in small-scale societies, below I list a set of common characteristics that tend to recur among reviews of indigenous warfare (e.g., Ferguson, 2012; Gat, 2006; Keeley, 1996; LeBlanc & Register, 2003; Otterbein, 1989; Turney-High, 1949; Wrangham & Peterson, 1996):
Although this may appear a somewhat ad hoc collection of features of war leadership in small-scale societies, the common denominators are that (a) war leadership is present but limited, (b) war leaders usually (but not always) participate in fighting, (c) warriors are not easy to control, and (d) war leaders depend on, and benefit from, prestige and status—like Xenophon.
While limited in small-scale subsistence societies, war leadership quickly became important and specialized as societies increased in complexity. How leadership and war covary across different types of society can be explored using the Standard Cross-Cultural Sample of 186 indigenous societies around the world (Murdock & White, 1969), which has variables indexing war, political organization, and leadership (Ross, 1983a; Tuden & Marshall, 1972). Key findings are that (a) 53% of societies had no political organization beyond the immediate community (so-called stateless societies), while 29% had a single authoritative leader (Tuden & Marshall, 1972); (b) increasing political complexity and hierarchy (indications of the role of leadership in general) are correlated with social and economic complexity, larger societies, and higher levels of “external” (outgroup) warfare (Ember, 1962; Roes & Raymond, 2003; Ross, 1983b); but (c) external warfare was not correlated with the concentration, specialization, or centralization of political power—that is, more war is associated with more elaborate leadership structures, but not with more authoritarian leaders (Ross, 1983a).
While these findings reveal broad patterns across all types of societies, the dynamics of how war leadership changes as human societies develop may be more important. The situation changed considerably as soon as we moved out of small, mostly subsistence groups into larger, hierarchical and ranked chiefdoms. In fact, war and war leadership may have played a direct role in this very transformation. Scholars of early warfare suggest that the transition from egalitarian to hierarchical societies was driven in large part by war leaders gaining prominence and holding on to their power after, or between, wars (Ferguson, 2012; LeBlanc & Register, 2003). We saw that among small-scale societies, leadership is expressly limited by domain of activity, and “people cannot extend such situational authority into generalized control over others” (Endicott, 2004, p. 416). However, war leaders may have become a special case because, unlike leaders in other domains, they had an opportunity to make use of their power, resources, loyal warriors, and alliances (as well as, often, physical strength and a record of victories) to consolidate their position and pass on the benefits to kin (Boehm, 2001; Gat, 2006; LeBlanc & Register, 2003). The process may have taken many generations and hinged on additional conditions, such as sedentism and divisions of labor, but it is war leaders, rather than other types of leaders, that seem to have emerged as kings of the mountain. However important or unimportant war was among Pleistocene hunter-gatherers, therefore, war leaders may have played a disproportionate role in the subsequent development of human social and cultural organization.
While debates persist about the extent of war or leadership in small-scale societies, by the emergence of chiefdoms no one doubts the importance of either. Leadership is not absolute and even in “structural” leadership positions (e.g., inherited ones), power can still be limited. For example, Ferguson concludes that “such leaders have considerable say in war decisions. But most chiefs exercise influence, rather than power” (Ferguson, 2012, p. 2237). Nevertheless, by the time of chiefdoms we were in the era of intensive warfare, dedicated warriors, and strong, sometimes coercive, war leaders—features that would only reach new heights as societies developed into kingdoms, civilizations, and empires. This, of course, sounds familiar in the context of much of subsequent history.
Interestingly, therefore, egalitarianism is an anomaly in the broad sweep of evolutionary history. Most social mammals, including primates, are despotic: an individual or coalition dominates all others. Humans shifted away from this ancestral pattern into egalitarianism, but have since fallen back into despotism. Our evolutionary foray into egalitarianism may have been very important, because if it lasted for most of the Pleistocene epoch, then our cognition, behavior, and social organization may have adapted to it. However, as soon as agriculture was invented, strong hierarchies returned and humans were despotic for virtually all of recorded history (Betzig, 1986; Diamond, 1998). Only in the past few decades have democracies spread and monarchies and dictatorships fallen, and even now this process is neither complete nor universally successful. Corruption is rampant around the world, including in many Western democracies, and even among the least corrupt, individuals still vie for power and status despite democratic oversight and institutions that resist it (Ludwig, 2002; Robertson, 2012; Shenkman, 1999).
Thomas Carlyle proclaimed that “The history of the world is but the biography of great men.” One could approximately paraphrase that as the biography of great war leaders. Biblical, preclassical, Greek, Roman, medieval, and modern history is largely about the men who led, fought, and conquered empires. This is no doubt an exaggeration. However, to the extent that it is true, war leadership in history becomes an important subject of study for us—as evolutionary scientists—because of the role of evolutionary legacy in human behavior. While historians have for centuries recounted and revised our understanding of war, there are many features that, despite the political, economic, and social complexities, suggest the timeless workings of certain traits of human nature and, therefore, an explanatory role for evolutionary psychology—not least the struggles for power, nepotism among kin, intergroup conflict, territorial aggression, and a range of cognitive biases (Betzig, 1986; Gat, 2009; Johnson & Toft, 2014; Tetlock, 1998; Thayer, 2004). An area ripe for new research is therefore evolutionary perspectives on historical war leadership (Johnson, 2004; Mazur, 2005; McDermott, 2007; Rosen, 2004), which can shed light on (a) universal features of war leadership across the ages and (b) problems of mismatch in which evolved traits are counterproductive for leaders in modern war (van Vugt, Johnson, Kaiser, & O'Gorman, 2008). But to explore the influence of evolutionary legacy on contemporary war, we need to pay attention to commonalities and differences with the wars of our past.
Modern war is usefully contrasted with war in our past by what Turney-High (1949) called the “military horizon.” He outlined various features that distinguished “primitive” from modern warfare, notably its low levels of manpower, resources, training, command and control, weapons, specialization, and tactics. Although one can debate the details, there was a line crossed at some point in human history in which war became a militarized endeavor with large numbers of professional soldiers under rigid command structures. On all of these dimensions, modern combat differs markedly from combat among small-scale societies. The scale, complexity, hierarchy, technology, communications, and objectives make military leadership, as well as war, a very different type of activity (Gat, 2006; van Creveld, 1985). However, not everything is different.
For one thing, although modern armies are vast, a recurrent feature across time is the role of small units “at the sharp end” of any fighting, which are preserved today at the level of the platoon—a couple of dozen men. These are closely bonded teams who live, train, and fight together, and who must rely on each other to kill and avoid being killed (Rielly, 2000). And, of course, these units have a leader who faces the same essential challenge: leading a small group of men by example, earning their respect, and keeping them motivated in the face of lethal aggression. At low levels, therefore, the social context of war may be almost identical to how it always has been.
Second, great attention is paid to advances in military technology and weapons, but such innovations by and large become an advantage for both sides. There may be a lag in who gets weapons first, but in general, opponents catch up with each other in an arms race, which means they stay in the same relative position, just as in evolutionary arms races (Cohen, 2007; Dawkins & Krebs, 1979; Rosen, 1991). Consequently, the real competitive edge often comes not from technology but from age-old human factors: strategy, morale, discipline, and, not least, leadership. Even nuclear weapons turned out to represent a largely psychological challenge in the high-stakes game of deterrence and bluff (Freedman, 2003; Schelling, 1960). Despite the remarkable advances in technology, human factors in general and military leadership in particular remain crucial elements of war, and can be decisive factors in victory and defeat (Cohen, 2002; Rosen, 1991).
So neither scale nor technology—two of modern war's most distinctive features—undermines the importance of leadership in war across the ages. But what makes a good leader? Sun Tzu and Carl von Clausewitz, widely regarded as the two greatest strategic thinkers of all time, agreed on many things, but they disagreed about the possibilities and prerequisites of leadership (Handel, 2001; Sun Tzu, 2009; von Clausewitz, 1832/1976). An evolutionary perspective generates some surprising new insight into these differences (Table 29.2).
Table 29.2 Differences Between Sun Tzu's and Clausewitz's Views on War and Leadership

| | Importance to Sun Tzu | Importance to Clausewitz |
| Intelligence | Vital | Overrated |
| Deception | Critical | Unimportant |
| Surprise | Critical | Unimportant |
| Control | High | Low |
| Outcomes | Predictable | Unpredictable |
| | ⇓ | ⇓ |
| Ideal military leader | Rational, calculating | Intuitive genius |
| Evolutionary analog | Raids | Battles |
| Application | Primitive/ancient warfare | Modern/recent warfare |
What is striking from an evolutionary perspective is that the key features important to Sun Tzu—deception, surprise, and predictability of outcomes—are closely aligned with raids in primitive warfare (e.g., see Wrangham 1999a, 1999b). By contrast, Clausewitz's emphasis on unpredictability and confusion is much more closely aligned with battles and modern warfare. This is interesting in itself, because it suggests fundamental differences between ancient and modern war, and Eastern and Western strategy (Sun Tzu was writing in China around 500 B.C., Clausewitz in Prussia in the 1800s). But most remarkable of all are the implications for leadership. Not coincidentally, Sun Tzu envisages the ideal leader as calculating, rational, and able to weigh decisions based on prior intelligence and force strengths. Clausewitz, by contrast, is skeptical of the effectiveness of surprise and stresses the problems of unpredictability and “friction” (when interacting parts do not perform as expected), leading him to suggest that intuitive “geniuses” are required to make good judgments in the fog of war—Napoleon being the archetype, with his remarkable mental and multitasking abilities (van Creveld, 1985). The point here is that ancestral war may have favored Sun Tzu-style leaders, who were effective in mounting the asymmetric surprise raids of the Pleistocene era, but came ill-equipped for the problems of modern war captured by Clausewitz—large, complex, slow-moving armies that clashed in chaotic open battles of annihilation.
A major research area in political science and international relations is the role of psychology in decision making—especially in crises and wars (Levy, 1983; McDermott, 2004a; Post & George, 2004; Sears, Huddy, & Jervis, 2003; Tetlock, 1998; Vertzberger, 1990). Robert Jervis's (1976) landmark book Perception and Misperception in International Politics drew on the “cognitive revolution” in psychology to offer new accounts of a range of puzzles in diplomacy, deterrence, and conflict. While this literature has mainly relied on social psychology and behavioral economics, there is a gathering interest in the evolutionary origins of judgment and decision-making biases, which often leads to novel predictions (Johnson & Toft, 2014; Lopez, McDermott, & Petersen, 2011; McDermott, Fowler, & Smirnov, 2008; Rosen, 2004; Thayer, 2004). While there are many cognitive and motivational biases that are of relevance to leadership and war (to be found, for example, in Kagel & Roth, 1995; Kahneman, 2011; Sears et al., 2003; van Vugt & Ahuja, 2011), here I have summarized some key examples in Table 29.3, and expand in the text on just three “big ones” that (a) have been implicated as influencing leaders' decisions about war and (b) are argued to have evolutionary foundations.
Table 29.3 Some Key Psychological Biases Affecting Leadership in War

| Bias | Effect | Example | References |
| Prospect theory | Risk-proneness when facing losses | Cuban Missile Crisis | Haas, 2001; Levi & Whyte, 1997 |
| In-group/out-group bias | Devaluation and dehumanization of out-groups | Rwandan genocide | Fiske, 2002; Staub & Bar-Tal, 2003 |
| Overconfidence | Overestimation of benefits or probability of victory | World War I | Blainey, 1973; Johnson & Tierney, 2011 |
| Cognitive dissonance | Forcing data to match beliefs | 2003 Iraq War | Cooper, 2007; Festinger, 1957 |
| Fundamental attribution error | Assuming others' actions are malicious | Cold War | Gilbert & Malone, 1995; Larson, 1997 |
| Analogizing | Tendency to fit new problems to past events | Vietnam | Khong, 1992; May, 1973 |
Note. (1) All such biases can affect leaders up and down the hierarchy, including political leaders (deciding whether or not to go to war), military leaders (deciding how to fight a war), and bureaucratic leaders (deciding how to resource and run a war), and (2) these biases may have been adaptive in the past, but they are likely to be maladaptive today, due to a mismatch between their original triggers and function and the modern social and physical contexts in which they arise (leading to failure rather than success). |
A key psychological phenomenon affecting decision making about conflict is prospect theory. In decisions involving uncertain outcomes, people are risk-averse when choosing among potential positive outcomes (the “domain of gains”), but risk-prone when choosing among potential negative outcomes (the “domain of losses”). In essence, people tend to gamble more when facing the prospect of losses (Kahneman & Tversky, 1979; McDermott, 1998).
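The gain/loss asymmetry can be made concrete with the value function estimated by Tversky and Kahneman (1992), v(x) = x^α for gains and −λ(−x)^β for losses. The sketch below uses their published parameter estimates (α = β = 0.88, λ = 2.25); the dollar amounts are arbitrary illustrations, not from the text.

```python
def v(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function (Tversky & Kahneman, 1992)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

# Domain of gains: a sure $500 versus a 50% chance of $1,000.
sure_gain = v(500)
gamble_gain = 0.5 * v(1000) + 0.5 * v(0)
assert sure_gain > gamble_gain  # risk-averse: prefer the sure gain

# Domain of losses: a sure -$500 versus a 50% chance of -$1,000.
sure_loss = v(-500)
gamble_loss = 0.5 * v(-1000) + 0.5 * v(0)
assert gamble_loss > sure_loss  # risk-prone: prefer the gamble
```

The reversal falls out of the concavity of the value function (α, β < 1): halving a loss removes more subjective pain than halving a gain removes subjective pleasure, so the same curvature that makes people cautious over gains makes them gamble over losses.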
Prospect theory has been used to explain key historical events such as Japan's decision for war in 1941, the Cuban Missile Crisis of 1962, and the tendency to escalate wars rather than accept defeat, such as in Vietnam (Haas, 2001; Levi & Whyte, 1997; Levy, 2000, 2003; McDermott, 1998, 2004b; Taliaferro, 2004).
Of particular interest for us is that the preferences underlying prospect theory may have an evolutionary origin (McDermott et al., 2008). When resources are plentiful and dangers scarce, organisms should avoid risky decisions, just as standard economic models of expected utility would predict. However, when starvation or other dangers threaten survival, selection can favor whatever risk-taking is necessary to give the animal a chance of life rather than certain death. This does not necessarily maximize expected payoffs (e.g., food). But it maximizes Darwinian fitness. An evolutionary perspective therefore suggests novel predictions for when and why we may expect to see risky decision-making among leaders about, or during, war.
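A toy numeric illustration of this logic (the numbers are my own assumptions, not from the cited sources): an animal that needs 10 units of food to survive the night should prefer a risky foraging patch over a safe one even when the safe patch has the higher expected yield.

```python
# Risk-sensitive foraging sketch: a survival threshold flips the optimal choice.
# All quantities are illustrative assumptions, not from the cited literature.
THRESHOLD = 10  # food units needed to survive the night

safe  = [(1.0, 8)]             # certain 8 units
risky = [(0.5, 0), (0.5, 12)]  # gamble: 0 or 12 units, equal odds

def expected_food(option):
    return sum(p * x for p, x in option)

def survival_prob(option, threshold=THRESHOLD):
    return sum(p for p, x in option if x >= threshold)

# Expected payoff favors the safe patch (8.0 vs 6.0)...
assert expected_food(safe) > expected_food(risky)
# ...but Darwinian fitness (any chance of surviving) favors the risky one.
assert survival_prob(risky) > survival_prob(safe)
```

The safe patch maximizes expected food but guarantees death; the risky patch sacrifices expected payoff for a 50% chance of crossing the survival threshold, which is what selection rewards.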
Of the long list of psychological biases in human judgment and decision making, one of the most pervasive and powerful is the “ingroup/outgroup” bias. A mass of empirical evidence demonstrates that people (a) rapidly identify with their ingroups (even when they are strangers assigned into arbitrary groups), (b) systematically overvalue their own group's performance and qualities, and (c) systematically devalue the performance and qualities of other groups (Fiske, 2002; Fiske & Taylor, 2007; Hewstone, Rubin, & Willis, 2002; Tajfel, 1974).
Ingroup/outgroup bias has been implicated in a range of aspects of war, including genocides such as in Bosnia and Rwanda (Staub & Bar-Tal, 2003), perceptions of enemies such as the United States and the USSR during the Cold War (Larson, 1997; Silverstein, 1989), and influential theories about why states are intrinsically hostile to each other (Jervis, 1976; Wendt, 1999).
Again, this bias appears to have an evolutionary origin (Haselton & Nettle, 2006; Sidanius & Kurzban, 2003; see also Kurzban & Neuberg, 2005). In human evolution, familiar and kin-based ingroups provided security, resources, and social exchange, while contact with outgroups risked exploitation, injury, or death. Attachment to the ingroup and avoidance of outgroups was therefore likely to be strongly favored by natural selection. Again, an evolutionary perspective suggests novel predictions about when and why we may expect to see intergroup biases among leaders encouraging or affecting war.
All mentally healthy people, especially men, show a systematic bias towards overconfidence in a wide range of domains. In particular, people tend to (1) overestimate their capabilities, (2) overestimate their control over events, and (3) underestimate their vulnerability to risk—three widely replicated phenomena collectively known as “positive illusions” (Sharot, 2011; Taylor & Brown, 1994).
Overconfidence has long been identified as a cause of war by both historians and political scientists, encouraging overambition, reckless diplomacy, overestimation of one's strength, and underestimation of the enemy and the costs of war (Ganguly, 2001; Howard, 1983; Johnson, 2004; Johnson et al., 2006; Johnson, McDermott, Cowden, & Tingley, 2012; Lebow, 1981; Stoessinger, 1998; White, 1968). Two landmark books on the causes of war—separated by 25 years of work on the subject—both highlighted overconfidence (or "false optimism") as a recurrent and powerful phenomenon on the eve of war throughout history (Blainey, 1973; Van Evera, 1999). For example, overconfidence is argued to have contributed to European states' expectations of a quick victory in 1914 (Johnson & Tierney, 2011), U.S. expectations in Vietnam (Tuchman, 1984), and the Bush administration's discounting of the challenges of postwar reconstruction in Iraq (Woodward, 2005). Jack Levy concluded that "Of all forms of misperceptions, the one most likely to play a critical role in the processes leading to war is the underestimation of the adversary's capabilities" (Levy, 1983, p. 83).
Once again, recent work suggests an evolutionary origin for overconfidence (Johnson, 2004; Johnson & Fowler, 2011; Nettle, 2004). Overconfidence can be adaptive because it increases ambition, resolve, persistence, deterrence, and the credibility of bluffing, generating a self-fulfilling prophecy in which exaggerated confidence actually increases the probability of success (Nettle, 2004; Taylor & Brown, 1994; Trivers, 2011). Some authors have specifically suggested that overconfidence is adaptive in war because of the importance of resolve, bluffing, and exploiting opportunities (Johnson, Weidmann, & Cederman, 2011; Wrangham, 1999b). Intriguingly, van Vugt (2006) highlights the empirical association of leadership with boldness, risk-taking, and seizing the initiative to solve problems of coordination, especially when there are large potential gains and high levels of uncertainty. In our evolutionary model, overconfidence is more likely to evolve precisely when the stakes and uncertainty are high (Johnson & Fowler, 2011). Some level of emboldened confidence may, therefore, be an essential ingredient of successful leadership, as both psychologists and military commentators have noted (Baumeister, 1989; von Clausewitz, 1832/1976). Once again, an evolutionary perspective generates a range of new and testable predictions about when and why we may expect to see overconfidence among leaders before or during war.
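As a hedged illustration of this logic, the following toy simulation (my own simplification, not the published Johnson & Fowler model; the parameters are arbitrary assumptions) pits an agent whose self-assessment is inflated against unbiased opponents in claim-or-yield contests. With noisy mutual perception, inflated confidence tends to pay when the prize is large relative to the cost of fighting, and to hurt when it is small.

```python
import random

def mean_payoff(bias, r, c, trials=20000, seed=1):
    """Mean payoff of a focal agent whose self-assessment is inflated by
    `bias`, in claim-or-yield contests against unbiased opponents.
    Both sides perceive the *other's* capability with Gaussian noise."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        me, opp = rng.random(), rng.random()           # true capabilities
        n1, n2 = rng.gauss(0, 0.3), rng.gauss(0, 0.3)  # perception errors
        i_claim = (me + bias) > (opp + n1)             # I overrate myself by `bias`
        opp_claims = opp > (me + n2)                   # opponent is unbiased
        if i_claim and opp_claims:                     # both claim -> costly fight
            total += (r if me > opp else 0.0) - c
        elif i_claim:                                  # unopposed claim: free resource
            total += r
    return total / trials

# When the prize is large relative to the cost of fighting, inflated
# self-assessment outperforms accuracy; when the prize is small, it
# underperforms (all parameter values are illustrative).
gain_high_stakes = mean_payoff(0.2, r=10, c=2) - mean_payoff(0.0, r=10, c=2)
gain_low_stakes = mean_payoff(0.2, r=1, c=2) - mean_payoff(0.0, r=1, c=2)
print(gain_high_stakes > 0, gain_low_stakes < 0)
```

Using the same seed for both calls compares the biased and unbiased strategies over identical contests, so the difference isolates the effect of the bias itself.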
Above we explored various general traits of evolved psychology that can affect modern war and leadership. This leaves the more speculative but 6-million-dollar question of whether we also have traits that are, in fact, specifically evolved adaptations for war leadership (Table 29.4). I say “speculative” because (a) even the idea that we have evolved traits for leadership of any kind is still a new area of investigation (Price & van Vugt, 2014; van Vugt, 2006; van Vugt & Ahuja, 2011); (b) as we saw earlier, war leadership is sporadic and limited in small-scale human societies, so it is not obvious whether we should expect specific evolved traits for war leadership per se; and (c) there is very little experimental work that has tested this possibility.
Table 29.4 Evolutionary Hypotheses on War Leadership
Hypothesis | Implications | Evidence | References
Humans have evolved traits for war leadership | Evolved traits for military leadership | Nascent | The Military Intelligence Hypothesis (this chapter)
Humans have evolved traits for leadership (that may carry over into war) | Evolved traits for leadership | Growing | van Vugt and Ahuja, 2011 |
Humans have evolved traits for coalitions (that may carry over into leadership) | Evolved traits for warriors | Strong | Tooby and Cosmides, 1988 |
Humans have evolved traits that influence contemporary war leadership | Evolved traits for life in general (including psychological biases); with liabilities for people in leadership positions | Strong | Rosen, 2004 |
Although there has been considerable research on evolutionary adaptations for dominance, status, coalitions, aggression, and fighting (Buss, 1996; Buss & Shackelford, 1997; Daly & Wilson, 1988; Duntley & Buss, 2011; Henrich & Gil-White, 2001; Kurzban, Tooby, & Cosmides, 2001; Lopez et al., 2011; Mazur, 2005; Tooby & Cosmides, 1988, 2010; van Vugt & Tybur, Chapter 32, this volume; Wrangham, 1999a), there is hardly anything on evolved traits associated explicitly with war leadership. One problem is that much of evolutionary psychology is about universal traits that were adaptive for, and hence manifest themselves in, all people. Since, by definition, only some people can be leaders (in fact, only a tiny minority), leadership traits may be constrained to evolve by some form of frequency-dependent selection—traits that do well as long as not too many people have or express them (van Vugt, 2006). And indeed, some authors have noted that people with certain personality types, or even personality disorders, are overrepresented among leaders (Ghaemi, 2011; Ludwig, 2002; Nettle, 2001). An alternative is that we all have leadership (and followership) traits, but they are differentially expressed according to the situation or environment (Price & van Vugt, 2014; Spisak, O'Brien, Nicholson, & van Vugt, 2015). These two possibilities—the trait-versus-state debate in the leadership literature—are both plausible but in need of further investigation from an evolutionary perspective (van Vugt, 2006; van Vugt & Ahuja, 2011), especially in the context of leadership in war.
We do have some intriguing studies to build on, however. One of the most striking results to emerge from the evolutionary psychology literature in recent years is how brute physical features can predict preferences and behavior. For example, experiments and empirical studies have shown that, in general, people favor leaders who are male, older, trustworthy, taller, and from one's own group (Todorov, Mandisodza, Goren, & Hall, 2005; van Vugt & Ahuja, 2011; van Vugt & Spisak, 2008). However, this can depend on context. In a manipulation study using morphed faces, people preferred certain leadership traits (such as greater age and being male) more in circumstances of intergroup threat and war than in other types of scenarios (Spisak, 2012; Spisak, Dekker, Krüger, & van Vugt, 2012). This concurs with real-world observations that in times of crisis, people may prefer or accept more aggressive and authoritarian leaders (Boehm, 2001; McCann, 1992). However, these studies ultimately speak to followership rather than leadership, or at least to "cognitive models" of the kind of leaders people want in a given setting. We are not yet sure how such leaders would decide or act, and whether or not they would be successful as a result.
A few studies get at traits more directly relevant to leadership, or at least to more dominant individuals. For example, Aaron Sell and colleagues found that men's muscle mass can predict their beliefs in the utility of force—both in the context of everyday life and in foreign policy (Sell, Hone, & Pound, 2012; Sell, Tooby, & Cosmides, 2009). Stronger men are more likely to resort to and endorse fighting, which makes evolutionary sense given their greater ability to extract resources and attract, coerce, or deter others. In studies of leaders themselves, facial features associated with dominance have been found to predict the later attainment of military rank among U.S. military cadets (Mueller & Mazur, 1996) and achievement drive among U.S. presidents (Lewis, Lefevre, & Bates, 2012).
Clearly, there are intriguing findings suggesting a role for dominance in both leadership and conflict, as well as strong hints that prestige and social coordination are at least as important (Cheng, Tracy, Foulsham, Kingstone, & Henrich, 2013; Price & van Vugt, 2014; van Vugt, 2006; van Vugt & Tybur, Chapter 32, this volume). But much work needs to be done to verify whether any such traits are adaptations for leadership in war. The final section, however, proposes that there is at least one evolved trait that is likely to be associated explicitly with war leadership, and it lies not in brawn, but in brainpower.
War is a complex, lethal activity, and all else equal, the side that is better prepared, organized, and coordinated—that is, better led—is more likely to win. Here I propose the "Military Intelligence Hypothesis" (MIH), which is that (a) intergroup conflict poses cognitively demanding adaptive problems, (b) solving these problems was important for fitness, and (c) this contributed to the evolution of human intelligence. The brain, in other words, has been honed in part by the myriad ways to kill and avoid being killed (see also Duntley & Buss, 2011; Thayer, 2004). The Military Intelligence Hypothesis is thus a kind of "anti-social brain hypothesis": in contrast to Humphrey and Dunbar's "social brain hypothesis" (Dunbar, 2003; Humphrey, 1986), it proposes that human cognition was shaped by the adaptive challenge not (only) of intragroup competition and cooperation, but of intergroup conflict. The evolution and metabolic expense of our disproportionately large brain has proven a significant puzzle for science, but an important, if unfashionable, piece of this puzzle may be the unforgiving problem of surviving and thriving in an environment of lethal intergroup warfare.
Cognitive advances would help any individual, whether leader or soldier, but they apply most strongly to war leadership because the real purchase of this military intelligence is in organizing multiple individuals to act cleverly together (via coordination by a leader), not individuals acting cleverly themselves (via uncoordinated independent actions). In combat, a disciplined whole is strikingly more powerful than the sum of its parts (Johnson & MacKay, 2015).
The MIH is consistent with cross-species comparisons. Lethal intergroup conflict is rare in the animal world, but where it occurs, it tends to be restricted to social mammals of high intelligence—most notably chimpanzees among primates, and wolves among carnivores (Manson & Wrangham, 1991; Wrangham, 1999a), though ants are an interesting exception. The mere formation of coalitions is restricted to higher-intelligence animals, such as primates, canids, and dolphins (Harcourt & de Waal, 1992). From a broad comparative point of view, therefore, it may be no coincidence that humans have both remarkable levels of intelligence and remarkable levels of war. But the argument is not that the luxuries of intelligence begat war. Rather, the demands of war begat (or boosted) intelligence.
The MIH also accords with archeological evidence. A study of 175 hominid skulls from across the Pleistocene epoch found that variance in cranial capacity was best predicted by measures of population density, suggesting that while several factors may have contributed, brain evolution was primarily driven by competition with other humans (Bailey & Geary, 2009). Other studies have found that population pressure (population density controlled for available resources) correlates with the level of warfare (Kelly, 2013). It may also be no coincidence that the cognitively sophisticated Homo sapiens rapidly replaced long-established Neanderthals in both the Levant and Europe (Gat, 1999).
The primary focus of the MIH is the cognitive demands of strategy—the complex challenge of planning what to do in interaction with an unpredictable and deadly opponent. However, the demands of war leadership are much more far-reaching than this, and include a range of cognitively demanding tasks (Table 29.5). Of course, many of the traits listed are adaptive in interpersonal and within-group interactions (not just war), so they are also consistent with the social brain hypothesis. However, there are three reasons why the application of even these traits may be of special importance in war.
Table 29.5 Domains of Intergroup Conflict That Demand Sophisticated Cognition (in alphabetical order)
Domain | Significance | Cognitive Demands
Alliances | Gaining and maintaining third-party supporters | Perspective taking, theory of mind |
Cooperation | Mobilizing and maintaining warriors | Cheater detection, enforcement |
Coordination | Aligning interests and goals of warriors and supporters | Initiative, problem-solving |
Deception | Achieving surprise, masking intentions | Bluffing, acting, concealment |
Diplomacy | Extracting gains while averting costs | Bargaining, perspective-taking, patience |
Intelligence | Anticipating enemy strengths, weaknesses, and intentions | Collecting, understanding, and integrating information |
Persuasion | Mustering support | Reasoning, moralizing, rhetoric, oratory |
Strategy | Planning, deploying, and utilizing forces | Dealing with uncertainty, interactions, rapid decision making, cunning, prediction |
Weapons | Staying ahead of the arms race | Designing, making, and using tools |
First, the problems of war are harder. Since war tends to be against outgroups rather than the ingroup, it poses special adaptive challenges including, for example, predicting the behavior of people you do not know, which is harder than predicting the behavior of people you do. One must also contend with limited information about an enemy's strengths, resources, reserves, or alliance arrangements. "Knowing the enemy" is a classic challenge of war.
Second, the problems of war have higher stakes. Not only does war threaten unusual levels of costs and lethality, it also offers the possibility of bountiful gains (booty, land, resources, elimination of rivals, status, and women). For all participants—victors and vanquished alike—fitness consequences are significant. Therefore, even if war was infrequent in our evolutionary history, it may have exerted a strong selection pressure on ways to exploit or avoid it.
Third, the problems of war are pervasive, even in times of peace. Because of the ever-present threat of intergroup conflict, even when (or precisely because) war is not actually occurring, there are numerous tasks and challenges that require cognitive sophistication and have significant implications for Darwinian fitness. These include building fortifications, social organization, forming alliances, signaling, deterrence, strategizing, allocating resources, preparations for war, stockpiling, training, designing and making weapons, gathering intelligence, and contingency planning. Groups (and individuals within them) that were poorly organized, prepared, or trained for war would have been more likely to suffer at the hands of rivals in a better state of readiness. We may therefore expect additional selection pressures on military intelligence arising from a host of peacetime activities that nevertheless stem from war—and indeed, influence its outcome.
The MIH might seem to overemphasize the role of war in the evolution of human intelligence. However, (a) it is not mutually exclusive with other factors driving human intelligence; (b) the high death rates from war in ethnographic and archaeological populations (c. 15%; Bowles, 2009) suggest that adaptations affecting success in war would be under strong selection pressure; and (c) it is, in fact, a logical extension of a previous argument for the role of human intelligence in war made by John Tooby and Leda Cosmides. Tooby and Cosmides (1988) noted, in a widely cited but never published paper, that coalitional aggression is remarkable not only for its importance among humans, but for its rarity among other animals. Numerous species, such as elephant seals, deer, or gorillas, have a single male that dominates all reproduction in the group. If lesser males ganged together, they could easily depose the alpha and split the spoils. But they never do. Tooby and Cosmides suggest the reason is that forming a coalition demands sophisticated cognitive mechanisms to achieve and sustain the necessary levels of cooperation. Since most other animals do not have such mental sophistication, the great opportunities of coalitions and alliances are foreclosed to them (as we saw above, coalitions are found only among a select few other species, all of which have higher intelligence—such as chimpanzees, wolves, and dolphins). This led Tooby and Cosmides to suggest that humans have evolved distinctive psychological traits for forming coalitions (Kurzban et al., 2001; Tooby & Cosmides, 2010; van der Dennen, 1995; Wrangham, 1999a).
What were these traits? For coalitionary aggression to make sense given the inherent risks to life and limb, two features must be in place: (1) some reasonable probability of net gains and (2) the detection and sanction of free riders. Tooby and Cosmides (1988) argued that in the Pleistocene setting of asymmetric raids, the large gains and low costs should easily tilt the balance in favor of war (as do Johnson & MacKay, 2015; Manson & Wrangham, 1991; Wrangham, 1999a). The bigger problem lies in identifying shirkers and enforcing cooperation: Who takes on these policing costs? They suggest this may have been solved by certain individuals having higher stakes in war, or enforcement being delegated to others. Although it is hard to see exactly how this might play out among a coalitionary group of equal individuals (as they envisioned the problem), it is easy to see how leadership can plug the gaps here. Leaders are likely to have higher stakes in the outcome, as well as lower costs of enforcement (given physical power, authority, status, kin ties, or allies).
Tooby and Cosmides identified a crucial problem in the great benefits of coalitionary aggression and yet the significant evolutionary obstacles of achieving it. But leadership may have helped to cross the canyon of this big collective action problem to reach the fertile fields of war. Taken together, the multiple advantages of intelligence for effective war leadership, the high death rates due to intergroup conflict, and the cognitive challenges of coalitionary warfare suggest that war itself may have contributed to the enlargement and sophistication of the human brain.
The importance of war in human evolutionary history remains controversial, but it seems likely that it exerted significant selection pressure on human social organization, behavior, and cognition. Although there are variations in the form and frequency of warfare among small-scale societies, there are also remarkably consistent patterns, which suggest a common adaptive problem and common solutions to solve it. One important solution is likely to have been coordination and leadership—without these, victory comes hard and death comes easily. But even if war leaders were only transitory or weak in our evolutionary past, evolutionary psychology still has much to say about leadership in modern war. This chapter has addressed two very different strands of insight: (1) humans have a range of evolved dispositions and biases (many of which are described elsewhere in this volume) that can have large and important effects on leaders in their decisions for war and in how they fight (just as they can affect any other kind of decision), and (2) humans may have evolved leadership and followership traits, some of which could be explicit adaptations to intergroup conflict and war. A significant one is hypothesized to be intelligence itself.
What are the lessons for contemporary war? Wars in recent centuries tended to involve clashes between large, institutionalized armies of states and empires. In the 21st century, war more commonly takes the form of asymmetric conflict against ad-hoc, loosely organized, often nonstate actors (Kilcullen, 2010; Strachan & Scheipers, 2011). These smaller sides are not without leaders, but they are much more decentralized. New research—from an evolutionary perspective—argues that this gives them an edge in terms of greater flexibility and faster adaptation than the slow, lumbering machinery of Western military organizations (Johnson, 2009; Sagarin et al., 2010). In Iraq and Afghanistan, where Western troops faced a novel military challenge against indistinct foes wearing civilian clothes and using unconventional methods, established doctrines quickly failed. Moreover, within a large and complex organization such as the U.S. Army, change was not easy to accomplish and even harder to institutionalize. Instead, a premium emerged on another solution—individual leaders on the ground who showed themselves to be flexible within the constraints of the military machine. General Petraeus, among other senior officers, called for a new generation of "adaptive leaders" (Wong, 2004). As Petraeus explained: "There is no substitute for flexible, adaptable leaders. The key to all that we did in Iraq was leaders—especially young leaders—who repeatedly rose to the occasion and took on tasks for which they'd had little or no training" (Broadwell, 2009). Human factors and leadership remain as important to contemporary warfare as they were for Xenophon and the Ten Thousand, and perhaps for warriors of all human societies since time immemorial.
Strategic theorists since Sun Tzu and Clausewitz have consistently emphasized the difficulty of leadership in war—it is a domain of unrivaled contingency, uncertainty, and confusion. As a result, Lawrence Freedman cautions that "it must never be forgotten that strategy is an art and not a science" (Freedman, 2007, p. 369). However, if war was important in human evolutionary history, natural selection is likely to have favored cognitive and behavioral strategies that helped to coordinate and kill—and avoid being killed—whatever the difficulties. One of the most important tools of all, both then and now, is effective leadership. Combined with the many insights on evolved psychological biases that affect war leaders just as they affect everyone else, evolutionary psychology offers a scientific framework to help us understand the role of leadership even in the "art" of war.