DESPITE A DECADE OF military operations across Afghanistan, by the winter of 2010 it had become clear that the United States was not succeeding. Hoping to induce the Afghan insurgents into peace talks, U.S. and NATO officials tried to bribe the Taliban to the conference table. They paid an undisclosed and hefty sum to Mullah Akhtar Muhammad Mansour for his participation, at one point flying the Taliban’s second-in-command to meet with President Hamid Karzai in Kabul. The talks seemed to be proceeding well. Mansour’s demands were remarkably reasonable. Yet one thing did trouble some officials. Mansour was several inches shorter than he should have been.
Unfortunately, the Taliban commander was a fake, a shopkeeper from Quetta, Pakistan.1 Following the third round of negotiations, the clever merchant made off with a fortune, no doubt laughing as he spirited his wealth away. The episode exposed how poorly the United States knew its enemy in this ongoing war. On a superficial level, American and NATO officials could not even identify the number-two man in their opponent’s organization. On the more strategic level, they did not notice that throughout three separate meetings, the impostor never once requested that foreign troops withdraw from Afghan soil—a staple of Taliban demands. Without concrete descriptions of Mansour’s appearance, the U.S. and NATO had to focus on his behavior. Did he think the way a Taliban commander would? In a sense, they needed to read Mansour’s mind.
What NATO and U.S. officials lacked was strategic empathy: the ability to think like their opponent. Strategic empathy is the skill of stepping out of our own heads and into the minds of others. It is what allows us to pinpoint what truly drives and constrains the other side. Unlike stereotypes, which lump people into simplistic categories, strategic empathy distinguishes what is unique about individuals and their situation. To achieve strategic empathy, you must first identify the information that matters most.
Knowing how another thinks depends initially on gathering and analyzing information. Most leaders use the “great mass” approach. Drawing on intelligence networks, they gather up as much data as they can. The problem is that it is too easy to drown in an ocean of information. Determining which data matter and connecting the dots then grows even harder. In contrast to the great mass approach, others believe that a “thin slice” of information is more effective at revealing someone’s true nature. The danger is that we often choose the wrong slice, leading us painfully astray.2 The conclusion here is inescapable. The quantity of information is irrelevant; it’s the relevance of any quantity that matters. The key is not to collect a great mass or a thin slice but the right chunk.
The challenge that has long bedeviled leaders is to find heuristics—decision-making shortcuts—to help them locate those right chunks. Such shortcuts would not generate omniscience, but they would equip us with a sense for what makes our enemies tick. And that sense would greatly improve our odds of anticipating the enemy’s actions. This is what strategic empathy enables, and you can imagine how valuable this skill would be.
This is a book about prediction, though not of the ordinary kind. It is not about predicting sports matches, stock markets, elections, or any of the typical things people bet on. Instead, it’s about predicting other people’s behavior when the stakes are the highest they can be—over matters of war and peace. It’s a book about how we get out of our own minds and into someone else’s head, and it focuses on how national leaders in modern times have struggled to do it well.
This is specifically a history of how leaders within governments have tried to think like their enemies. It explores the zigzag stories of conflicts in which each side sought to outmaneuver the other. It is a walk through one of the twentieth century’s most challenging yet crucial quests: reading the enemy mind.
A Sense of the Enemy addresses two questions. First, what produces strategic empathy? Second, how has strategic empathy, or the lack of it, shaped pivotal periods in twentieth-century international conflict?
More than 2,000 years ago, the Chinese military philosopher Sun Tzu advised generals to “know thy enemy.” The question has always been how to do it. Though millennia have passed, we are still searching for the answer. Writing in 1996, the philosopher Isaiah Berlin argued that political genius, the ability to synthesize “the fleeting, broken, infinitely various wisps and fragments that make up life at any level,” is simply a sense—you either have it or you don’t.3 But what if Berlin was wrong? What if that sense could actually be learned and improved?
Using the masterful nineteenth-century statesman Otto von Bismarck as a prime example of one who was exceedingly gifted at divining an opponent’s reactions, Berlin observed that the German Chancellor managed to integrate vast amounts of disparate data over a breathtakingly expansive range. Then, rather disappointingly, Berlin asserted that any careful study of this sense could never lead to any meaningful guides. As he saw it, the gift of political judgment came from seeing the unique in any situation, and any generalizations would be useless in future contexts. Berlin is so uncommonly sensible, so thoroughly compelling, that I am almost tempted to agree.
As a historian of international relations, I find myself deeply planted in Berlin’s camp. I am dubious about the value of international relations theories, and I seek no predictive models of behavior. Yet I question his conviction that a rigorous investigation of the ability to know one’s enemy would yield nothing of value. Leaders who possess this skill are adept at identifying, as well as synthesizing, the data relevant to a given problem. A careful look at such leaders would bring us closer to comprehending how they thought and in that way further illuminate why events unfolded as they did. It might also help us understand how they knew which information to scrutinize and which to ignore. As the psychologists Christopher Chabris and Daniel Simons observe: “For the human brain, attention is necessarily a zero-sum game. If we pay more attention to one place, object, or event, we necessarily pay less attention to others.”4
We often assume that the experts in any field have absorbed and retained vast amounts of data on their given subject. Though it has been more than a century and a half since he first assumed the German Chancellorship, Bismarck is still viewed in this light. Christoph Tiedemann served as the Chancellor’s personal assistant from 1875 to 1880. Though steadfast in his work ethic, even he struggled to keep pace with the indefatigable statesman. Sessions with the Chancellor typically lasted all day. Once Bismarck dictated a single letter to the Emperor for five hours straight, without interruption. At one point in the dictation, Tiedemann’s arms began to cramp, so he swiftly removed his jacket. Bismarck gazed at him with amazement, astonished that Tiedemann should require a break in the action. Bismarck’s ability to dictate for such long stretches stemmed from his total mastery of the relevant information. His aim, as the Chancellor’s most recent biographer put it, was to know everything about everything in “a constant, furious absorption of material.”5 Yet today we know more about how the mind works. Chabris and Simons, among other psychologists, have shown that a central aspect of decision-making is not the absorption of massive amounts of material but instead the capacity to ignore the bulk of it while focusing on the few key data points that truly matter. “Intuitively, most people think that experts consider more alternatives and more possible diagnoses rather than fewer. Yet the mark of true expertise is not the ability to consider more options,” Chabris says, “but the ability to filter out irrelevant ones.”6 As leaders filter out the noise, they must also sense where to find the signal.
One key to strategic empathy comes not from the pattern of past behavior but from the behavior at pattern breaks.
We can better understand the past century of international conflict by scrutinizing how leaders struggled to think like their enemies. When they did it poorly, they tended either to assume that their opponents’ future behavior would resemble their past behavior or they assumed that their enemies would think and act as they themselves would do. But when leaders succeeded in thinking like their enemies, they focused on the enemy’s behavior during meaningful pattern breaks.
In a twenty-first century marked by mind-boggling amounts of accessible data, we naturally assume that pattern recognition is supreme. When sophisticated algorithms met super-fast computing, and when network analysis joined with social science, we dramatically expanded our predictive power over individuals as well as masses. Today whole industries have arisen on the backs of pattern spotters. Nearly every major corporation hopes to transform mass consumer behavior into profit streams. Amazon can suggest the books we might enjoy. Netflix does the same for films. And Pandora predicts which songs we’ll want to hear. These types of predictions of our preferences rely on pattern recognition. What many people may not realize is that, although these preference predictions seem targeted specifically at you and me, they are largely based on analysis of how vast numbers of people similar to us have previously behaved. As impressive as these algorithms are, there remains a limit to their magic.
Quantitative analysis fails us when statesmen are the subject. Knowing what past dictators have done in similar situations, for example, cannot shed much light on what the current autocrat may do. Each case of international conflict and every ruler is sufficiently unique to make analogical reasoning a dangerous endeavor.7 Left to consider only the past behavior of a particular ruler, foreign leaders are caught in a quandary. Most of the time, the record of actions is mixed—full of seemingly conflicting behavior, out of which opposing interpretations can easily be drawn. In other words, if one seeks evidence of malignant or benign intentions, both can usually be found. This is why fixating on the patterns in enemy behavior can easily lead us in circles. We need to be aware of prior patterns, but we also need a better heuristic for making sense of what drives the other side.
Consider two historical examples that make this point. Throughout the 1930s, some British and French officials concluded it was best not to confront Hitler because they believed in his repeated assurances of peace. They observed his pattern of conciliatory behavior after each new demand, and they assumed he could therefore be appeased. In contrast, during the Cuban Missile Crisis, some American officials, such as Air Force General Curtis LeMay, insisted that war with the Soviet Union was inevitable based on evidence of prior Soviet aggression. For that reason, LeMay strenuously argued for a full-scale attack on Cuba. Both sets of officials advocated policies based on patterns they perceived—patterns drawn from a selected sampling of data—and both sets of officials were wrong. If these policymakers had focused on a different sampling of data, they could have located the opposite patterns in their enemies’ behavior. Winston Churchill, for example, saw in Hitler’s behavior a pattern of insatiable aggression, whereas President John F. Kennedy saw evidence of Nikita Khrushchev’s reluctance to go to war. Had statesmen in the 1930s correctly read Hitler, many lives might have been saved. Had statesmen in 1962 forced a direct confrontation with the Soviets, in the age of nuclear weapons, many more lives might have been lost. Reading the enemy right is clearly a priceless skill, and leaders cannot afford to ground their assessments in a select sampling of past behavior. Instead, they must understand what makes the current enemy tick. But how can we know in the moment which patterns reveal the enemy’s true motives?
Leaders are better served not by straining to perceive patterns of behavior but by focusing their attention on behaviors at pattern breaks. It is at these moments when statesmen typically reveal their underlying drivers—those goals that are most important to them. These episodes can also expose much about a leader’s character, showing the kind of measures he is willing to employ.
Pattern breaks are merely deviations from the routine. These deviations can involve sudden spikes in violence, sharp reversals of policy, unexpected alterations in relations, or any substantial disruption from the norm. There are two main types of pattern breaks: pattern-break events and pattern-break behaviors. The behaviors provide valuable information about an enemy but only under certain conditions.
Naturally, pattern breaks frequently occur. Most are meaningless, but some are meaningful. To distinguish one from the other we must focus on costs. Henry Kissinger, the American National Security Adviser who negotiated with the North Vietnamese over an end to the war, offered an excellent example of a meaningless pattern break. In his reflections on those negotiations, Kissinger explained how Hanoi invariably lectured America in its pronouncements, always insisting that the United States “must” do this or that. At one point Hanoi suddenly used a different word, declaring that the United States “should” meet a particular demand. Kissinger and his team thought they were on the brink of a major breakthrough. It proved a fleeting fancy. The next communication returned to the usual insistent language.8 This momentary word change cost Hanoi nothing. In theory it could have marked a shift in Hanoi’s attitude, but it revealed nothing about Hanoi’s underlying intentions.
In contrast to Kissinger’s experience, consider the Chernobyl nuclear accident of 1986 for a brief example of a meaningful pattern break. Although it took him more than two weeks to issue a public announcement, Gorbachev did disclose the true horror of what had occurred. Prior to that moment, Soviet leaders had typically denied any weaknesses in their economy or their society, invariably extolling the superiority of communism. By admitting that the nuclear disaster had occurred, and by inviting American medical experts into the Soviet Union to assist in caring for the sick, Gorbachev openly acknowledged certain failings of the Soviet system. He thereby risked incurring the enmity of the old-school hardliners within his regime. This was a pattern-break behavior, and it revealed much about the Soviet leader. When he first came to power, Gorbachev initiated the reforms dubbed Glasnost and Perestroika (an openness to free speech and a rebuilding of the economy). For those outside observers who doubted the sincerity of these reforms, Gorbachev’s behavior during Chernobyl indicated that he was a truly different leader from those who had preceded him. Chernobyl itself was a pattern-break event, and Gorbachev’s behavior surrounding it revealed much about his underlying intentions.9
We must first be able to recognize patterns before we can spot a pattern break. When no breaks are apparent, we may have no better option than to assume that the enemy’s future behavior will resemble his past. I call that the “continuity heuristic,” and I will say more about it in chapter 8. It is a method with many flaws, though at times it is all we have to go on. Sometimes, however, there is a better way, a way that leaders have employed to good effect and one that can be just as valuable today.
To summarize, meaningful pattern breaks are those episodes that expose an enemy’s underlying drivers or constraints. Those less obvious factors become apparent when an opponent behaves in a way that imposes genuine costs upon himself—costs with long-term implications. The enemy need not change his behavior at those times. He might continue on exactly as he had done before. The pattern break simply provides an opportunity for revealing what he values most. It acts as a spotlight, illuminating qualities that might otherwise be hidden. In the chapters that follow, we will witness cases of pattern-break events as well as pattern-break behaviors, and we will dig deeply into examples of meaningful pattern breaks that involved behaviors costly to the enemy in question. Above all, we will see how talented strategic empaths used those pattern breaks as teachable moments to help them gain a sharper sense of their enemy. As you encounter their stories, remember that strategic empathy is not a trait—a superior quality with which one is simply born. This might be true of empathy itself, though even empathy may be something that can be cultivated; that, however, would be the subject of a different book. Strategic empathy, on the other hand, should be thought of as a skill that you can develop and enhance. Like all skills, no matter how much you might practice, you can never achieve perfect results every time. Focusing on behaviors at pattern-break moments cannot guarantee an accurate reading of your rival’s mind, but it can certainly improve the odds.
Each chapter investigates cases of how particular statesmen struggled with strategic empathy. Chapter 1 considers how Mahatma Gandhi read British leaders shortly after World War I, when the Indian independence movement was just about to blossom. Chapters 2 and 3 center on German Foreign Minister Gustav Stresemann’s attempts to read the Russians in the 1920s. Amid the tumult of ever-changing Weimar coalitions, Stresemann remained the one steady leader of German foreign policy. His diplomatic acumen helped restore his defeated nation to a position of strength. These chapters ask how he did it.
Chapters 4 and 5 explore both Stalin’s and Roosevelt’s attempts to read Hitler in the years prior to Hitler’s invasion of Russia. The central question is this: How did Stalin and FDR think about Hitler? To borrow a term from cognitive science, I am asking how they “mentalized” about their enemies. I explain what it means to mentalize by discussing the findings of cognitive scientists and relating their discoveries to historical subjects. Chapters 6 and 7 turn to Vietnam. They probe the North Vietnamese leadership’s efforts to understand America in the years preceding the war’s escalation. While we know a great deal about what American leaders thought about Vietnam during the war, far less has been written on what the North Vietnamese leaders thought about the United States. Few Americans even realize that the man largely running North Vietnam just prior to and during the war was not Ho Chi Minh but rather a shadowy figure named Le Duan. These chapters try to gain a sense of how Le Duan struggled to grasp America’s drivers and constraints.
The first two of these historical cases (the chapters on Gandhi and Stresemann) provide examples of talented strategic empaths. The case of Stalin presents an example of empathic failure. In the case regarding Vietnam’s Le Duan, the record is mixed. In each case I show that understanding pattern breaks provided an essential heuristic for achieving strategic empathy.
Chapter 8 steps back from particular case studies to consider several notable efforts to assess an enemy in the twentieth century. As a useful comparison, I consider the thinking behind the opposite of the pattern-break heuristic—what I refer to as the “continuity heuristic.” By using past behavior as a guide, leaders and their advisors have often missed the mark. Finally, chapter 9 examines a present-day trend: a troubled love affair with quantitative analysis as the basis for predicting enemy behavior.
In an afterword I briefly describe how this book fits into the existing history and political science literature on enemy assessments, while I also spotlight some of the theories and concepts from other fields, which help illuminate our task. I then discuss my basic approach: the methodology I employ for tackling the questions surrounding the history of war and peace.
Most of the cases I consider in this book involve states in militarily and economically weaker positions with respect to their chief opponents. All nations need strategic empathy, but for the weaker states in any conflict, strategic empathy can be necessary for survival. If the United States is entering a period in which its relative power is declining, the lessons from past strategic empaths will only rise in value. And even if this were not the case, America, or any nation in the stronger position, can always profit from a clearer sense of its enemies.
My primary aim in this book is to write a history of international conflict through an alternative lens. In essence, I am conducting a meta-exercise: to think about how leaders thought about their enemies. Historians typically try to reconstruct the past through the eyes of history’s key actors. We do this mainly by attempting to see the world as those people saw it. In the pages that follow, however, I am attempting to enter the minds of certain leaders, to see how they in turn tried to enter the minds of others.
Because this book is a work of history, it is more descriptive than prescriptive. It asks not what statesmen should have done but rather what they actually did, how they thought, and what they believed about their opponents. That said, the book has a secondary aim. This study may hold value for present-day analysts by highlighting ways of thinking about the problem of prediction.
This study has a third aim as well. As with each of my previous books, I want to allow history to illuminate how we think. While the cognitive sciences, from psychology to behavioral economics and the like, are steadily advancing our knowledge of how the mind works, those fields suffer from a serious constraint. Their conclusions are based on carefully controlled laboratory experiments. As a result, it is much harder to say how people would behave under actual conditions. History, however, provides us with precisely that. It enables us to reconstruct how people thought and made decisions in real life, under the complex, uncontrolled, and uncontrollable conditions of their realities. This book, then, is also a study of historical decision-making.
The chapters that follow provide an alternative way of thinking about modern international affairs. Together they tell the story of how pivotal moments in history resulted from the ways that leaders identified their enemies’ underlying drivers and constraints. I do not assume that strategic empathy is the sole cause of foreign policy successes. Multiple factors invariably combine to shape outcomes. Contingency and chance are always at play. A statesman’s strategic empathy is only one factor in success, though I argue it is often a crucial one. How leaders came to think like their opponents is a telling and too often overlooked aspect of international conflict. If we can deepen our understanding of how key figures thought, we will better comprehend why wars are fought, lost, and won. And if we could actually apply those insights, we might just take one step closer to making war no more.
Before we can begin the exploration of the past century’s greatest struggles, we first need to understand a bit about how we mentalize—meaning how we all try to enter into someone else’s head—and we need to know about heuristics—the decision-making shortcuts we all employ. One powerful example of each can be found in the story of an eighteen-year-old orphan who had a chance to win a fortune. With the spotlights on and cameras rolling, the young man had to mentalize about a stranger. Riches were in reach. All he had to do was to penetrate the mind of the enigmatic TV host who was offering a cryptic clue.
Jamal was on the spot. He had just two options left. If he chose wisely, he would win 10 million rupees and advance to the final round. If he chose poorly, he would lose it all. The entire Indian nation was watching. The problem was that Jamal did not know the answer. He had no choice but to guess.
In the blockbuster film Slumdog Millionaire, a young man from the Mumbai slums lands a spot on a popular game show. By an amazing run of good luck, despite his lack of formal education, Jamal is asked a series of questions to which he always knows the answers. The show’s host, however, continues to belittle Jamal’s success, demeaning him as a mere tea server from the slums. By the penultimate round the stakes have grown exceedingly high, and Jamal is stuck. Which cricketer has scored the most first-class centuries in history? His options are reduced to B or D. A commercial break allows the tension to stretch out. It also presents Jamal with a strategic conundrum.
During the break, Jamal and the host meet in the men’s room. Jamal admits that he is clueless and will lose. To our surprise, the host encourages Jamal, telling him that if he selects the right answer, he will become the most fortunate slum dweller in India, the only person other than the host himself to have risen from extreme poverty to riches and fame. The host tells him not to lose heart. Before exiting the men’s room, the host cryptically suggests: “Perhaps it is written.” Jamal then sees that in the steam on the bathroom mirror the host has traced the letter B.
Jamal now needs to think strategically. If he trusts the host and believes he is giving him the correct answer, he can choose B with confidence. But if he thinks there is a chance that the host could be lying, wanting him to lose, then Jamal’s situation becomes infinitely more complex. He cannot simply choose D, the cricketer Jack Hobbs, and be sure that this is correct. He must instead assess his enemy on two counts: how clever the host is, and how clever the host thinks Jamal is.
The host might be setting a trap. The correct answer might in fact be B, but the host could be psyching Jamal out, giving him the right answer but expecting that Jamal will choose D just because it is the opposite of what the host advised. Of course, if Jamal thinks that the host expects him to expect this trap, then Jamal should instead choose D. At this point, Jamal could fall into an impossibly complicated loop of “if he thinks that I think that he thinks,” ad infinitum.
But does anyone really think this way?10 In fact, we almost never do. Instead, we all rely on heuristics—shortcuts for decision-making. Because no one can keep track of so many levels of second-guessing, heuristics help us to simplify strategic decisions with rules of thumb.
Jamal chose D, the cricketer Jack Hobbs, and he was right. The host had fed him the wrong answer in the mirror. Obviously, we cannot know for certain what Jamal, a fictional character, used for a shortcut to reach his decision, but the film does give us a number of clues. Up to this point we have witnessed flashbacks of key moments in Jamal’s life, and they have been painful to watch. As a young boy he lost his mother to rampaging Hindu nationalists. He was nearly blinded by a cruel orphanage operator who drugged the children and scooped out their eyes with a spoon, then sent them out to beg for money on street corners. And his own brother separated him from the woman he loved. Given the information we have about Jamal, it is likely that he employed a simple heuristic: Trust no one.
Like Jamal, leaders also use heuristics in the game of international affairs, even though they have vastly more to lose than any quiz-show contestant. Sometimes national leaders formulate their heuristics as maxims: The enemy of my enemy is my friend. Sometimes their heuristics are analogies: If aggressors are appeased, they will become as aggressive as Hitler after Munich. However dubious they might be, heuristics ease decision-making by simplifying the thinking process. Jamal’s experience spotlights a hidden truth: When the stakes are high, we all need shortcuts for predicting our enemy’s moves.
While Jamal’s story gives us a hint of how we all try to strategize in high-stakes situations, that fictional tale can only take us so far. It’s time to examine a true Indian hero, upon whom the eyes of every Indian, and indeed the entire world, were fixed. Mahatma Gandhi holds special interest in a study of strategic empathy precisely because he managed to win his nation’s freedom without ever firing a single shot.