7

Analyzing Risks Like a Physicist

The following scenario involves a fictional Japanese telecommunications company operating in Burma:

Although we made up this crisis scenario for our course several years ago, the general conditions in Burma are real. The country has been riven by ethnic conflict, corruption, and authoritarian rule for decades. Starting in 2010, however, Burma began opening to the outside world. The military junta was replaced, national elections were held, and Nobel Peace Prize winner Aung San Suu Kyi was released from house arrest. In response, the United States and the European Union lifted a number of sanctions, opening new opportunities for foreign investment.

In 2017, Burma’s military, its security forces, and others launched a brutal crackdown against the country’s Muslim Rohingya minority, making our hypothetical scenario tragically realistic.

We use this case to walk students through the nuts and bolts of analyzing political risks in a challenging new market. The full case includes information about Burma’s history, sanctions regimes, human rights concerns, and telecommunications industry dynamics. As in our Mexican cruise industry case, there is no right answer. But there are three important lessons about analytic pitfalls and how to avoid them.

The first lesson is about what constitutes useful data. Each year, when our MBA students discuss whether entering Burma was a good idea, they seize on national-level data in the case materials such as literacy rates, GDP, and cell phone penetration. These figures are quantifiable, available, and useful for assessing the business opportunity. But they do not say much about political risks. Instead, as we note in chapter 4, political risk data are often localized and hard to quantify. What are the prospects for democratization in Burma? Where is ethnic conflict most severe and likely? Answering these questions requires more specific and qualitative data.

We write the case deliberately so that Kiku Telecom begins its Burma operations in a western region called the Rakhine State. In real life, this is one of the worst locations from a political risk perspective. While central Burma is populated by an ethnic Bamar majority and is under firm control of the state, it is surrounded by a horseshoe of ethnic and separatist conflict involving nearly two dozen groups. The Rakhine State is wracked by poverty and conflict between Buddhists and Rohingya Muslims, a group described by the United Nations as one of the most persecuted minorities in the world.2 Although we include this information about ethnic conflict in the case materials, it tends to be overlooked. Why? Because for many, “data” means numbers, not words. People find hard numbers alluring and reassuring.

The “aha” moment in class usually comes when we point to the paragraph discussing the Rakhine State and ask, “So why did the company start here, of all places?” Students realize that they overlooked a major risk. Investing in Burma is one thing. Making your investment beachhead in one of the most conflict-prone areas of the country is quite another. The location and phase of the investment increased the probability of an adverse event and narrowed risk mitigation options from the start.

The second lesson is that political risk analysis should not stop once an investment begins. In our Burma case, fictitious company executives assess political risks before signing the joint venture deal but do not conduct ongoing analysis afterward. That’s surprisingly common in the real world, too. Ian Bremmer, CEO and founder of the Eurasia Group, notes that most businesses analyze political risks for new investments but few continue to assess political risks once the investment is made.3 A survey conducted by Bremmer’s firm and PricewaterhouseCoopers in 2006 found that only 24 percent of respondents reported on political risk on a biannual or more frequent basis.4 Ten years later, a McKinsey global survey of executives found that only a quarter had integrated risk analysis into a formal process rather than conducting ad hoc analyses as events arose.5

Especially in today's environment, political conditions change, sometimes quickly. Good political risk analysis is not a one-shot deal; it is a continuous endeavor.

The third lesson is to beware of optimism bias. Most students walk into class enthusiastic about Burma, seeing a rare opening and a historic business opportunity.6 After three hours of political risk discussion, they reach a more sober assessment of the trade-offs. Although many say they would enter Burma anyway, nearly all say they would enter differently. Once they focus on political risks, students come up with all sorts of mitigation ideas—from starting construction in central Burma to building relationships with human rights groups to insisting on legislation that limits the government’s ability to cut transmissions. Every year, we find that optimism bias keeps students from seeing risks and ways to reduce them.

Integrating political risk analysis into business decision-making helps guard against optimism bias. Not enough companies do this. SeaWorld executives saw profits, not exposure, in the company’s dependence on orcas for theme park attendance and brand reputation. Jack Welch got so excited about the business potential of the GE-Honeywell merger that he did not take seriously enough the risk that the European Commission would nix the deal. Boeing executives were so buoyed by demand for the 787 Dreamliner that they did not see the slow-motion supply chain disruption coming when the fastener industry contracted after 9/11. By contrast, companies that manage political risks well make risk analysis a core part of the business. They bake political risk in. They do not bolt it on. And importantly, they see political risk as helping the business. The “risk folks” are not the voice of doom. As Pat Donovan, Chevron’s director for global security, told us, “We have to speak to people in a way that we aren’t trying to prevent them from doing their job, but showing them the best way to go about doing it—politely, diplomatically, but candidly. No doesn’t work. Ninety-nine percent, yes, we will find a solution, does.”

These takeaways from our Burma case raise three questions that every organization should ask to analyze political risks.

1. How can we get good information about the political risks we face?

General Michael Hayden, who ran the NSA and the CIA, has a knack for explaining complex ideas in colorful ways. At his 2006 confirmation hearing, Hayden described the perils of analysis. “I have three great kids,” he told the Senate Intelligence Committee, “but if you tell me, ‘Go out and find all the bad things they’ve done, Hayden,’ I can build you a pretty good dossier, and you’d think they were pretty bad people, because that was what I was looking for and that’s what I’d build up. That would be very wrong. That would be inaccurate. That would be misleading.”7 Hayden’s message was: Be careful. Good information is not so objective. Context matters. What you find depends on what you seek.

Companies analyzing political risks confront many of the same challenges the CIA does. Information often seems straightforward when it isn’t. The story of Hayden’s kids reminds us that context is crucial. Even cold, hard facts can tell different stories. Is the drug-related violence in Mexico increasing or decreasing? It depends on when you start counting and what cities you use for comparison. Where should Kiku Telecom begin its operations in Burma? If you are looking to start where cell phone penetration is lowest, the Rakhine State is a good bet. If you are looking to mitigate political risks, not so much. Identifying what constitutes “good information” is itself an act of judgment. What you find depends on what you seek.

Three rules of thumb are useful.

Rule 1: Good information is specific, not generic.

Good information does more than indicate the political risks of operating in Country X. It helps answer the question, “What are the political risks to my organization from this place at this particular time?”

Note the words “my organization.” Political risk information should be tailored to your company, its risk appetite, alternatives, strategy, and strengths. Recall that in Iraq, ExxonMobil was willing to assume far greater political risk in pursuing oil exploration and production contracts with Iraqi Kurds than Royal Dutch Shell was. That’s because even in the same industry and location, the two companies had different risk appetites and strategies.

For this reason, off-the-shelf products are likely to be off the mark. Country reports, corruption indices, global industry analyses, and other generic products are fine places to start analyzing risks, but poor places to end. For example, Transparency International conducts an annual poll assessing perceptions of corruption by country. The poll is a valuable indicator of national-level corruption over time, but not of how corruption varies within a country. Research finds that within-country variations in corruption are significant even among EU member states such as Belgium, Bulgaria, Portugal, Romania, and Spain.8

Similarly, industry-based information about political risks in a particular market tends to rely on assessments at a single point in time. With today's pace of change, the lag between assessment and reality can be substantial. Argentina is the poster child for just how fast a nation's economic policies and political circumstances can shift. In October 2015, Argentina was poised to continue its leftist rule of heavy government subsidies and interventionist economic policies. Populism was so deeply rooted that no center-right candidate had led the nation since democracy was restored in 1983. Daniel Scioli, the ruling-party candidate, was expected to win. Instead, Mauricio Macri, a millionaire conservative businessman and the mayor of Buenos Aires, pulled off an upset victory. Suddenly, Argentina appeared poised for transformation. "Today is a historic day," Macri declared in his victory speech. "We need to build an Argentina with zero poverty. A marvelous phase is beginning for Argentina."9 He moved with lightning speed, dismantling subsidies and lifting currency controls on the peso to spur foreign investment. In March 2016, he negotiated an end to the country's debt dispute with holdout creditors, clearing the way for Argentina to access global credit markets.10 Later that month, he restored relations with the United States, hosting President Obama on the first visit by an American president in two decades. "Argentina is back," then finance minister Alfonso Prat-Gay proclaimed.11

But by summer, Argentina's political situation was changing yet again. With a shrinking economy and surging inflation and unemployment, Macri faced rising popular discontent. In July, angry citizens reacted to rising utility prices by banging pots and pans in the streets.12 In August, more protests erupted, with tens of thousands gathering outside the presidential palace. In September, pilots halted flights of Argentina's airline to demand higher wages. Macri's approval plummeted. While economic analysts praised his efforts to reform the nation's economy and urged patience, political uncertainty grew about whether he would suffer the same fate as so many Argentine leaders ousted by popular economic discontent. All of these developments—the predicted continuation of Cristina Kirchner's socialist policies, the surprise election of a pro-business conservative president, the breakneck pace of economic reforms followed by political backlash, social unrest, and rising uncertainty about whether Macri would even finish his term—transpired in less than a year.

In short, generic products are likely to be incomplete and quickly outdated. Companies need to develop the capability to drill down into national data, gathering additional information that is more localized, contextual, and dynamic about the risks that matter most for them.

Doing this well does not require hiring an army of doctoral students or regional experts. Marriott International, one of the pioneers in global political risk management, has a full-time team of just two professional intelligence analysts, one in Hong Kong, one in Washington, D.C., so that someone is assessing intelligence all the time. Because Marriott operates properties around the world, including in some high-threat terrorist locations, Vice President for Global Safety and Security Alan Orlob is continuously assessing whether to move a hotel up or down a five-tiered color-coded alert system. He and his team do this by taking the intelligence they receive and analyzing its implications for specific properties. “We’re very careful about not spreading threat levels across a country,” Orlob told us. “For example we don’t say, ‘All of India is at our highest level.’ We look at it city by city, product by product (what kind of hotel it is), and we look at it by history.”

Management by walking around goes a long way, too. The best information about political risks in the field often comes from visiting the field. Consider Royal Caribbean International’s assessment of the political risk of developing its port destination in Labadee, Haiti, during the 1980s. As the crow flies, Labadee is located just eighty-five miles from the violence-ridden capital of Port-au-Prince, a seemingly poor choice from a political risk perspective. But when executive Peter Whelpton visited, he found that Labadee was actually much farther, since there was only one poorly maintained road that took several hours to traverse.13 “People say, ‘Oh, there’s insurrection in Haiti; my God, there’s shooting in the street,’” Whelpton told us. “But Labadee is way, way, way from Port-au-Prince.”

Back to the question, “What are the political risks to my organization from this place at this particular time?” We use the phrase “from this place,” not “in this place.” As we noted earlier, political risks in one location or time often produce effects elsewhere. They cascade, thanks to connective technologies, global supply chains, and politics. In our hypothetical Burma case, for example, one of the issues we discuss is how human rights abuses by the Burmese military, which is Kiku Telecom’s joint venture partner, could generate corporate social responsibility concerns among Kiku’s Japanese and American customers. Political risks in Burma may not stay in Burma.

In the real world, examples of cascade effects abound. The outbreak of unrest in Yemen hurt tea farmers in Kenya. Progress against the illegal drug trade in Colombia shifted trafficking to Mexico, which in turn contributed to rising drug-related violence there. During the Arab Spring, the self-immolation of a fruit seller in Tunisia ended up bringing down the Mubarak regime in Egypt. In 2014, China moved an offshore oil rig close to Vietnam, sparking local Vietnamese protests, factory shutdowns, and clothing stock-outs in American cities.

Corruption is the most obvious cascading political risk that global companies face. As we note in chapter 2, the extraterritorial reach of U.S. and U.K. antibribery laws is extensive, covering activities anywhere in the world for companies that have some portion of their business operations in the United States or the United Kingdom. These laws also cover the activities of third-party contractors. “The middleman did it” is no defense. Small bribery requests in faraway places may pose no serious political risk where they occur, but they present substantial political risks of fines and prosecution in the United States and the United Kingdom.

When we asked Royal Caribbean executive Adam Goldstein to identify the hardest political risks his company faces, he immediately brought up the Foreign Corrupt Practices Act, sharing an incident that highlights the challenge. A few years ago, the company was charged a fee at one of its ports of call that appeared suspicious. Executives tried to get a justification from the host government, but as the ship neared the port, no answer came. “In the end, we were never provided with the proper documentation to justify the charge and they never backed off insisting on the charge,” said Goldstein. “And so we did not call on the port. The restitution to our guests for missing that port of call for a reason that was not an act of God was $1 million to avoid a $5,000 bribe.” Goldstein and his team knew the political risk was from the foreign port of call but was not confined there. The real risk was back home, in the potential to expose the company to violations of the Foreign Corrupt Practices Act.

Rule 2: Good information includes perception and emotion.

Good information also provides insight about perception and emotion. Perception and emotion are tightly intertwined drivers of human behavior, whether it’s consumers in department stores, protesters on streets, or lawmakers in Congress.

Perhaps no episode in Condi’s government service illustrates the power of perception and emotion better than the Dubai Ports World controversy. She taught this case to all incoming MBA students for several years. Here’s what happened.14

In 2006, Dubai Ports World, an award-winning port management company owned by the government of the United Arab Emirates, acquired the London-based Peninsular and Oriental Steam Navigation Company (P&O), one of the world’s oldest and largest port operators. The $6.8 billion deal made Dubai Ports World the fourth-largest container port operator (by throughput) in the world, with terminals in the Middle East, Europe, Asia, Australia, Latin America, and the United States. Because the acquisition gave Dubai Ports World control over operating container terminals in six U.S. ports (Baltimore, Miami, New Orleans, New York, New Jersey, and Philadelphia), it needed to be approved by the Committee on Foreign Investment in the United States (CFIUS), a federal interagency panel charged with reviewing the national security implications of transactions granting foreign control over U.S. businesses. At the time, twelve government agencies were represented on CFIUS, including the National Security Council and the Departments of Defense, Homeland Security, State, and Justice.

The approval process began smoothly. Foreign operation of shipping terminals in the United States was in fact common: Seventy-five percent of shipping at the time passed through terminals leased by foreign companies.15 Security at American ports was the primary responsibility of the Coast Guard, not terminal operators. What’s more, the CFIUS review found no specific cause for concern. The Department of Homeland Security found Dubai Ports World to be fully cooperative with the mission of protecting American ports and American ships in foreign ports. The UAE was a longtime American ally and partner in the war on terror. In fact, UAE ports hosted more U.S. Navy ships than any port outside the United States. In addition, the UAE was providing valuable assistance to American missions in both Iraq and Afghanistan, and the government had played an instrumental role in arresting Abd al-Rahim al-Nashiri, the suspected mastermind of the USS Cole bombing (which killed seventeen American sailors) and the 1998 U.S. embassy bombings in Kenya and Tanzania.

On paper, everything looked good. CFIUS unanimously approved the deal.

But perception and emotion were quite another matter. Just four years after 9/11, fears of terrorist attacks on the U.S. homeland remained acute. The prospect of allowing an Arab government to operate shipping terminals in American ports sparked concern and outrage in Congress. The UAE was an Arab country, one of only three nations that recognized the Taliban regime in Afghanistan before 9/11, home to two of the 9/11 hijackers, and the place where financial transactions supporting the attack appeared to have occurred.16 The Dubai Ports World deal quickly came under fire. Opposition was visceral, bipartisan, and widespread. Democratic senator Chuck Schumer declared, “Foreign control of our ports, which are vital to homeland security, is a risky proposition. Riskier yet is that we are turning it over to a country that has been linked to terrorism previously.”17 Republican representative Sue Myrick wrote a one-sentence letter to President Bush that read, “Dear Mr. President: In regards to selling American ports to the United Arab Emirates, not just NO but HELL NO!”18 A Gallup poll found that 66 percent of the American public opposed the deal, and 45 percent opposed it strongly. On March 8, the House Appropriations Committee voted 62–2 to block the transaction. Although the White House wanted the deal to go through, it had little room to maneuver. On March 9, one day after the House committee vote, Dubai Ports World announced that it would transfer its shipping terminal operations to an American entity.

Condi was secretary of state at the time. She remembers thinking that the facts showed there was no security risk in the deal. No agency in the U.S. government, including the Departments of Defense and Homeland Security, believed that having a UAE company operating shipping terminals in American ports posed a threat to national security. Especially in the aftermath of 9/11, every department and agency was hypervigilant about potential homeland security vulnerabilities. But none of that mattered. The timing and ownership of the company were political killers. As Condi later reflected, “Americans heard ‘Arabs, ports, and 9/11,’ and those three things just couldn’t go together. In the aftermath of 9/11 there was no way this deal was going to go through.”

How can companies avoid a Dubai Ports World moment? For starters, by realizing that good information includes a sense of the heated feelings and passions of key audiences. Think of "perception information" as anything that helps put a finger on the pulse of a group critical to your business, whether it's a segment of customers, a leadership group of government officials, or the political mood of a population. No telepathic powers or pinpoint accuracy is necessary. Perception is about general feel, broad trends, the arc of emotion.

Condi spent a great deal of time in the Middle East as secretary of state, traveling to the region thirty-one times in four years. In 2005, six years before protests erupted in Tahrir Square and brought down the regime of President Hosni Mubarak, she went to Cairo and delivered a speech urging Mubarak to embrace political reform before it was too late. “The Egyptian government must put its faith in its own people,” she said, “[and] the day must come when the rule of law replaces emergency decrees.” Years later, Amy asked why she gave the Cairo speech long before anyone thought Egypt would experience a major political change, through either evolution or revolution. “I could not tell you when the Arab Spring was going to ignite or exactly how a Tunisian fruit seller would lead to the fall of an Egyptian monarch,” Condi reflected, “but I could tell you even then that the Middle East had some brittle regimes, bad demographics, rising social unrest, few outlets for all that pressure, and that time was definitely not on the side of the autocrats.” Close contact with the region gave her insight.

Rule 3: Good information comes from asking good questions.

Third, and finally, good data come from asking good questions. How you label a problem can send your team searching in the right direction or in a very wrong one.

On March 11, 2011, a 9.0 magnitude earthquake struck Japan, triggering a catastrophic tsunami that hit the Fukushima Daiichi Nuclear Power Plant. Immediately after the temblor, the plant shut down its nuclear reactors according to emergency protocols. Then the power went out. Because power is essential for cooling nuclear fuel rods, diesel backup generators kicked into action. So far, safety systems were running as planned. But then the tsunami hit. The wave flooded the backup power generators for five of the six reactors, knocking them out. Without sufficient power, the plant suffered partial meltdowns in two reactors. The result was the worst nuclear disaster since Chernobyl. Nearly two hundred thousand residents were forced to evacuate as several hydrogen explosions erupted, radioactive steam and water were released, and officials desperately sought to contain the contamination and prevent a full core meltdown.19

The conventional wisdom about Fukushima is that nobody could have seen this disaster coming. The 9.0 magnitude earthquake was one of the most severe ever recorded. The tsunami, too, was a rare event: There had been only three tsunamis in fifty years in Japan. Fukushima seemed like a black swan, so unlikely as to be unpredictable.

But not if you frame the risk differently. Our Stanford colleague Rod Ewing, a leading expert on nuclear waste, has examined what went wrong at Fukushima. He found that while the immediate risk of this specific event might seem remote, the longer-term risk of a major seismic event affecting a Japanese nuclear power reactor was far more likely. “You could ask, ‘What if I have a string of reactors along the eastern coast of Japan? What is the risk of a tsunami hitting one of those reactors over their lifetime, say, 100 years?’” said Ewing.20

The answer is, pretty high. For starters, Japan has a lot of nuclear reactors—fifty-four in 2011, the third most of any country in the world.21 Most are located on coasts. Before the disaster, Japan planned to double the number of reactors and expected to rely heavily on nuclear power for hundreds of years. Japan is also located in the Pacific “Ring of Fire,” where 90 percent of the world’s earthquakes originate. Finally, the fact that major tsunamis had not struck recently gave false comfort: Ten years before Fukushima, some of Japan’s leading earth scientists, led by Koji Minoura at Tohoku University, found the interval of major earthquakes and tsunamis to be about a thousand years, with the last one occurring in 869. “From a geologic perspective,” Ewing writes, “the earthquake and its great magnitude should not have been a surprise.”22
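Ewing's reframing is, at bottom, a base-rate calculation: even if the annual risk at any single site is tiny, many sites over a long horizon make at least one strike likely. Here is a minimal sketch of that arithmetic. The annual per-site probability is an assumption loosely pegged to the thousand-year recurrence interval Minoura's team found; the figures are illustrative, not measured hazard estimates.

```python
# Back-of-the-envelope version of Ewing's question. All rates here are
# illustrative assumptions, not measured hazard estimates.

n_reactors = 54     # Japan's reactor count in 2011 (from the text)
years = 100         # the lifetime horizon Ewing suggests
p_annual = 1e-3     # assumed chance a given coastal site is hit in a year,
                    # loosely pegged to a ~1,000-year recurrence interval

# Chance that no site is ever hit, treating each site-year as independent.
# (A simplification: one large tsunami can strike several sites at once,
# which would reduce the effective number of independent trials.)
p_none = (1 - p_annual) ** (n_reactors * years)
p_at_least_one = 1 - p_none

print(f"P(at least one strike over {years} years): {p_at_least_one:.1%}")
```

Under these assumptions the probability comes out above 99 percent: the "black swan" at any single plant becomes close to a sure thing somewhere in the fleet, which is exactly the shift in framing Ewing describes.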

Considering long-term seismic risks to nuclear facilities throughout Japan could have unearthed what turned out to be the critical design failure at Fukushima: the location of backup power generators. All five failing generators were located near the low-lying coast, which is why they flooded so fast. Only backup generator number 6, located on higher ground, kept working. The flooding of five generators was not an accident; it was a design flaw. All of the backup generators could have been located farther from the ocean on higher ground had the risk been framed differently.23

Advances in evidence-based medicine also reveal how finding useful data begins with asking the right questions. For years, researchers did not think to conduct experiments gathering data on certain medical procedures because they assumed the outcomes were obviously better than any alternative course of treatment. But the evidence-based medicine movement has shown how doctors’ experience and judgment can be wrong—sometimes frighteningly so. David S. Jones, a Harvard Medical School professor, recounts how two of the most common treatments for heart disease, coronary bypass surgery and angioplasty, have been widely used for years because doctors believed—falsely, it turns out—that these procedures would extend life expectancy.24 Physicians believed that patients suffering from blocked arteries would live longer if the clogs could somehow be removed or circumvented. Bypass surgery did this by grafting veins or arteries from another part of the body into the heart vasculature. Angioplasty involved inserting a balloon into the blocked artery to compress and shrink the blockage, and then inserting a meshlike stent to keep future clots from forming. In 1996, doctors performed a peak of six hundred thousand heart bypass operations. In the 2000s, more than a million angioplasties were performed annually. Yet when randomized clinical trials were finally conducted, results showed that bypass surgery and angioplasty did not extend life expectancy any more than medication and lifestyle changes did except for a few of the sickest patients. And importantly, surgery imposed significant risks of side effects, including brain damage, that medication and lifestyle changes did not. The bottom line was that for most cardiac patients, noninvasive treatment options produced equivalent outcomes at a lower risk of serious complications.

For years, the data about bypass surgery and angioplasty were out there, waiting to be collected. But nobody ever asked: Do these surgical procedures lead to overall better outcomes than less invasive alternatives when we consider possible side effects?25 Good information remained hidden until doctors began asking good questions.

Harnessing the power of information requires searching for data tailored to your organization’s needs and strengths, considering perception and emotion, and asking the right questions to search in the right direction.

2. How can we ensure rigorous analysis?

The first step in good political risk analysis is getting good information. Step 2 is analyzing that information well.

Richard Feynman, one of the world’s great physicists, said that analysis is how we try not to fool ourselves.26 He was onto something. Whether you are analyzing next quarter’s prospects in a foreign market or geostrategic trends over the next five years, good political risk analysis challenges assumptions and mental models about how the future could unfold. The goal of political risk analysis is not to predict the future. Nobody can do that. The goal is to create better decisions for your organization by developing insight about key drivers and possibilities. Rigorous analysis hinges on traps, tools, and teams—understanding the cognitive traps and group pathologies that lead analysts astray, utilizing analytic tools to overcome these barriers, and enlisting the entire business team.

Traps

Cognitive traps are deadly and they are everywhere. In chapter 4, we discussed how humans are terrible when it comes to statistics and calculating risk. People are more worried about dying in an airplane crash than a car crash even though airplanes are about seventy times safer than cars. All of us suffer from the “availability heuristic,” believing that bad events which can be recalled easily (usually because of press coverage) are more likely to occur than they actually are. Optimism bias is also pervasive, explaining why investors believe their investments will perform better than average, why NFL fans overpredict wins for their favorite teams, and why so many were taken by surprise by the “Brexit” vote to leave the European Union in the summer of 2016 even though polls consistently showed the vote was statistically too close to call.

Mental mind-sets are particularly challenging. Everyone uses them. Mind-sets are unconscious analytic frames used to organize information and make sense of complexity.27 While frequently useful, mind-sets can also distort thinking in hidden ways. To see how, try your hand at the following exercise. It first appeared in Norman Maier’s 1930 article “Reasoning in Humans” and for years has been part of Richards Heuer’s book on intelligence analysis, a staple in CIA analytic training.28

The Nine-Dot Exercise

Instructions: Without lifting your pencil from the paper, draw no more than four straight lines that cross through all nine dots.

[Figure: nine dots arranged in a three-by-three grid]

Many people find this puzzle difficult to solve because they assume they are not supposed to let the pencil go outside of an imaginary square drawn around the dots. They try drawing the lines this way:

[Figure: a failed four-line attempt that stays inside the square of dots]

This obviously does not work: Drawing lines around the perimeter omits the center dot. However, once you discard the assumption that the lines must stay inside the imaginary square, it should be pretty easy to draw the four-line answer below:

image

Many also assume that the lines must pass through the center of the dots. This constraint is entirely in the mind of the problem-solver as well. Once this constraint is relaxed, you can reach a solution using only three lines, not four.

Another answer can be found if you relax the mental mind-set that the puzzle has to be solved in a two-dimensional plane. If you roll the paper into a cylinder, you can draw a single line passing through all nine dots.

image

Amy has used this exercise in a number of classes over the years. Each time, students react the same way: “You never told us we could solve the puzzle that way!” And that is precisely the point. Students’ own mind-sets impose barriers to analyzing the problem and finding a solution.

As the nine-dot exercise shows, people are routinely constrained by mind-sets they do not even realize exist. These mind-sets are formed by many inputs—past experience, cultural norms, organizational standard operating procedures, situational context, education, and training. Recognizing them is the first step toward overcoming them. The simple act of awareness can unlock a host of possibilities.

In addition to cognitive traps, group dynamics pose analytic challenges. Nobody wants to be seen disagreeing with the boss. Many bosses do not like to hear dissenting views. Hierarchy and status often stifle discussion of vital information without anyone realizing it.

In his bestselling book The Checklist Manifesto, physician Atul Gawande finds that one of the reasons surgical complications arise with such frequency has to do with group dynamics: Nurses, doctors, and other operating room staff typically come together in ad hoc teams for each procedure. Often, not all the people in the room even know one another’s names. Add to this the natural tendency to defer to the doctor and you get a silent social system where dissenting information is hard to elicit. A 2008 study instituted at eight hospitals worldwide a checklist that included a simple step: Before surgery began, every member of the team had to introduce herself and say what her job was. That simple act of learning the names of the team generated different team dynamics and better patient outcomes. Nobody quite knows how or why, but it is believed that a basic act generating familiarity and camaraderie also generated more valuable dissenting information—a process known as the “activation phenomenon.”29 In one case in Jordan, a surgeon inadvertently contaminated his glove while adjusting an overhead light. A nurse noticed and spoke up, requesting that he change his glove to avoid infecting the patient. The surgeon initially brushed it off, but after the nurse told him not to be stupid and demanded that he change it, he obliged.30

Research finds that even when dissenting views are encouraged, there are strong psychological pressures toward conformity. Particularly in high-pressure situations, individuals may come to value their membership in a decision-making group more than anything else. Preserving the group’s cohesiveness takes precedence over considering alternative views, leading members to adopt a distorted view of reality, unwittingly suppress their own nagging doubts, silence dissent, and strive for unanimity—a process Irving Janis called “groupthink.” In his pioneering 1972 study, Janis examined how psychological dynamics in small groups led to foreign policy fiascoes, including the Bay of Pigs invasion and escalation of the war in Vietnam.31

Cognitive traps and group dynamics are big challenges. The good news is that there is help; political officials, intelligence professionals, and business leaders have developed and deployed a number of tools to combat cognitive traps, groupthink, and other pitfalls. We discuss some of our favorites below so that you can use them, too.

Scenario Planning

In 1965, Ted Newland was tapped to start a unit called Long-Term Studies at Royal Dutch Shell’s London headquarters. “I was placed in a little cubicle on the 18th floor and told to think about the future, with no real indications of what was required of me,” Newland later recalled.32 It was the beginning of Shell’s pioneering use of scenario planning for political risk analysis. Soon Newland was joined by Pierre Wack, a former magazine editor who believed in the value of storytelling. In 1971, they began a major scenario planning exercise, looking for events that could affect the price of oil. The task was more radical than it sounds: Because the price of oil had experienced low volatility since the end of World War II, imagining factors that could dramatically affect oil prices was a venture into the unfamiliar. The conventional wisdom at Shell was that stable oil prices would continue.

Wack and Newland found a number of reasons why oil prices might spike at some point down the road. American demand for energy was rising while domestic reserves were dwindling. The Organization of the Petroleum Exporting Countries (OPEC) was dominated by Arab countries. While OPEC had not yet coordinated to boost oil prices, its members’ opposition to Western support of Israel during the 1967 Six-Day War might give them reason to, and a renegotiation of the Tehran price agreement was already slated for 1975. In September 1972, Wack and Newland developed two scenarios: a stable price scenario and a drastic price change scenario. The stable price scenario was eventually called “the three miracles” because its occurrence hinged on wildly optimistic exploration and production; all major countries’ willing depletion of their hydrocarbon resources to meet consumer demand; and no major supply or demand changes (including regional wars or demand spikes).

Wack and Newland did not know if, when, or specifically why a major price hike might occur. But their scenario planning revealed something essential: The possibility was much more likely than Shell executives had imagined. As Wack later put it, “We wanted to change our managers’ view of reality.”33

Sure enough, in October 1973, about a year later, OPEC did suddenly boost oil prices, triggering an energy crisis. Shell was the only major oil company whose executives were prepared emotionally, strategically, and operationally, thanks to the scenario planning process.34 Scenario planning has been at Shell ever since. Angela Wilkinson, formerly on Shell’s corporate scenario team, and Roland Kupers, a former Shell senior executive, wrote, “For an operation that doesn’t contribute directly to the bottom line, and that emphasizes the uncertainty of the future rather than making bold predictions, this is remarkable.”35

Scenario planning is used more frequently today.36 Bain & Company’s annual survey queries thirteen thousand respondents from more than seventy countries about the use and utility of management tools. In 2014, 18 percent of respondents reported using scenario planning and 60 percent said they expected to use it in 2015.37

As with any tool, there is scenario planning and then there is effective scenario planning. Simply spinning out possible futures is unlikely to get attention or action in the C-suite. Pierre Wack and Peter Schwartz, who worked together at Shell, have each written extensively about what makes for good scenario planning. We summarize their top tips in the box below.38

It is important to underscore that the test of good scenario planning is not whether it pinpoints the future but whether it enables managers to view the future differently—helping them get out of their mental mind-sets. Wack and Schwartz call this “re-perceiving.”40 Scenarios are not prediction tools. They are learning tools. They help managers cross the bridge between their mental mind-set view of the world and future realities that may be far different.

Red Teams, Devil’s Advocates, Thinking Backwards, and Other Tools

Scenario planning is probably the best-known tool but certainly not the only one to illuminate “risks around the corner.” Organizations ranging from military units to start-up companies use many other mechanisms and processes to help managers “re-perceive” the world, step out of their mental mind-sets, and overcome team barriers like groupthink.

The Lego Group, for example, uses both scenario planning and Monte Carlo computer-based simulations, which employ mathematical models to quantify the likelihood of possible outcomes, to get a better handle on various risks. Paychex employs a “Tournament of Risk” modeled after the NCAA men’s basketball tournament, in which managers vote on risks in head-to-head competition to flesh out ideas.

When Carmen Medina ran the Central Intelligence Agency’s analytic directorate, she asked a handful of top analysts to create graphic designs of threat landscapes. The end product had to fit on a single page, and it had to use images to depict the drivers of conflict and their interconnections. For analysts accustomed to writing reports in prose, it was a creative, powerful way to unpack hidden assumptions about the evolving threat environment.

U.S. Strategic Command, the combatant command that oversees U.S. nuclear forces and has to think long and hard about conflict escalation, has a dedicated war-gaming unit that designs and role-plays realistic future conflicts to stress-test assumptions about how interactions might unfold. At STRATCOM headquarters in Nebraska, there is even a special war game room with a Hollywood-like center stage whose floor is a map of the world. We use role-playing simulations extensively in all of our Stanford courses as well as in our cyber “boot camps” for policymakers and have found them to be invaluable tools for breaking mind-sets and generating fresh insight.
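The Monte Carlo approach mentioned above can be illustrated with a minimal sketch. Everything here (the cost drivers, the distributions, and the budget figure) is invented for illustration and is not the Lego Group's actual model; the point is simply that you sample uncertain inputs many times and count how often an outcome of interest occurs.

```python
import random

def simulate_budget_overrun(n_trials=100_000, seed=42):
    """Estimate the probability that total project cost exceeds a budget
    when two cost drivers are uncertain. All numbers are illustrative."""
    rng = random.Random(seed)
    budget = 130.0
    exceed = 0
    for _ in range(n_trials):
        labor = rng.gauss(70, 10)        # labor cost: normal, mean 70, sd 10
        materials = rng.uniform(40, 70)  # materials cost: uniform between 40 and 70
        if labor + materials > budget:
            exceed += 1
    return exceed / n_trials

p = simulate_budget_overrun()
print(f"Estimated probability of exceeding budget: {p:.1%}")
```

The value of the exercise is less the point estimate than the discipline of making the uncertain inputs, and their assumed distributions, explicit and debatable.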

The Intel Corporation, Goldman Sachs, the Central Intelligence Agency, the New York Police Department, and many others use “red teams,” people assigned to assume the role of competitors or adversaries.41 Some red teams conduct cyber hacks of the companies’ own systems to expose hidden technical and human vulnerabilities.42 Some pretend to be terrorists smuggling weapons through airport security checkpoints to improve Transportation Security Administration performance. Some scrub a potential new business strategy or government policy by exposing it to contrarian thinking. All are designed to push insiders to “think like the enemy.”43

Jayson E. Street is one of the most colorful and successful cyber red teamers around. “I promise you, I would never try to steal from you, kill you, or ruin you financially unless you pay me first,” he declared at a DEF CON hacker conference. “I am just going to F you up the best possible way.”44 He makes his penetration tests as easy as possible, spending no more than two hours conducting reconnaissance on Google and the client’s own website. His goal is to show that anyone can get in. Using simple “social engineering” techniques, such as posing as a tech repairman, customer, or job applicant, he convinces employees to help him compromise their own systems. He also deliberately escalates until he gets caught—Street wants to create teachable moments for frontline employees, not just reports for managers.45 He has conducted red team exercises for financial institutions around the world. In one, the Beirut Bank of Lebanon wanted to know whether an online compromise could come from a physical source. With access to just three branches selected at random, and no advance research or preparation, Street was able to execute a fraudulent wire transfer. It took less than three minutes for him to gain full physical access to the first branch, including the manager’s office. Within twenty minutes, he was given an employee’s ID card, network password, and smart card for authenticating network access.46

Not every tool for political risk analysis involves flying wily penetration testers around the globe, spending half a million dollars on a corporate war game,47 or building a cool auditorium with a James Bond–like map of the world. Two frugal tools can help analyze political risks from your desk or nearby conference room: thinking backwards and devil’s advocates.

Thinking backwards is an exercise that starts by imagining that a surprising and significant future event has transpired. Your job is then to look back from this future and understand how the event could have come to pass.48 Like scenario planning, thinking backwards is not designed to predict the future. It is designed to help individuals learn, to challenge mental mind-sets and resist the thinking of the herd.49

Devil’s advocates can be a “quicker and dirtier” version of red teams. They were first used by the Catholic Church centuries ago to inject more rigor into the saint-making process. In 1587, church authorities created the official position of Advocatus Diaboli to serve as an in-house skeptic, arguing against a candidate’s saintliness, questioning miracles performed, and exhaustively investigating claims and evidence in a process that could take decades.50 By the eighteenth century, the term was being used secularly and broadly to describe any in-house dissenter whose job was to challenge the predominant view and defend unpopular positions even without actually believing them. As many have noted, real devils are always better than devil’s advocates. Dissenting views carry more weight when they are genuine. And the more dissent becomes a routine process, the greater the danger that it will be dismissed or perceived as simply an exercise in “ritualized compliance.”51 However, devil’s advocates can play a useful role, particularly on matters where there is strong consensus.

The goal of all these analytic tools is the same: opening mental mind-sets and group processes so that information can be assessed from a range of perspectives. The most common analytic mistake organizations make is assuming the future will look like the present. It almost never does.

3. How can we integrate political risk analysis into business decisions?

The team, finally, is critical. Business leaders throughout the organization have to believe that political risk is integral to their jobs or the best information and analysis won’t accomplish much. Companies that manage political risk well go out of their way to integrate political risk analysis into the everyday rhythms and decisions of the business.

Here, too, the Lego Group is an innovative leader. Remember that when Hans Læssøe started building Lego’s strategic risk management capability in 2006, he literally began by Googling it. He quickly realized that instituting a robust risk analysis process was just half the battle: Ensuring that strategic risks were “owned” across the company was the other half. He divided the practice of strategic risk management into four building blocks. Block number 1 was spreading risk management across the company so that it was genuinely seen as everyone’s business.52 As we mentioned in chapter 6, Læssøe created a systematic, continuous process to engage every important business leader, including the board, in setting the company’s risk appetite, understanding risks, and integrating risk assessment and mitigation into business planning.53 Among his innovations, Læssøe created a database and risk process that every project manager could easily use, and he developed risk management training for them all.54 In addition, rather than putting risk management in the reporting chain to the general counsel, where it could be seen as another onerous compliance activity, he reported to the chief financial officer and gave himself the title of Lego’s “Professional Paranoid.” He told the Wall Street Journal, “I use the title jokingly whenever I get the chance, because it lets me pose questions that I couldn’t have gotten away with as a business controller. It removes some of the defensiveness that managers feel when they are being questioned.”55 Læssøe realized that getting risk analysis embraced by the company’s board, executive leaders, nineteen senior vice presidents, and ten thousand employees was an act of persuasion, not authority.

A second building block in Læssøe’s plan was creating a standardized decision-making process that incorporated strategic risk analysis. He called it Active Risk and Opportunity Planning, or AROP. As the name suggests, AROP was designed to ensure that managers consider both downside risks and upside opportunities. And while Læssøe utilized a number of risk assessment tools, including Monte Carlo computer simulations, Google Trends word searches, and scenario planning, keeping business leaders engaged was always top of mind. The Lego Group’s scenarios used attention-getting names like “Cut Throat Competition” and “Murphy’s Surprise.”56 As a result of Læssøe’s efforts, strategic risk analysis that included political risks became tightly integrated into business decision-making. Making strategic risk everyone’s business helped the Lego Group turn the corner after nearly going bankrupt.

We interviewed a number of political risk officers from different industries. All of them emphasized the same lesson that Læssøe learned early on: Their most important job is making political risk useful throughout the company. That starts with developing a deep understanding of the business and the needs of managers. As one political risk officer at a major international airline told us, “You’ve got to know what’s going on and ask, ‘So what for my company? Why does this matter?’ Political risk analysis has got to be proactive and forward-looking.” This risk officer used to work in an organization with many academics. “They had a genuine interest in the issues. They’d sit around and talk about political risks over morning coffee,” he said. “But company leaders can’t do that. The people who are consuming what you’re putting out cannot necessarily have a natural interest in it. It will take them away from running the business, so you’d better be able to say why you’re telling them what you’re telling them and why it matters.”

When Pat Donovan was hired to take over Chevron’s global security unit, the first thing he did was embark on a company listening tour. “I needed to figure out what they wanted to know but didn’t know yet,” he told us. “You have to know the business,” he emphasized. “What is it like to have two hundred customers out there? You don’t take all your problems to the chairman. You need to go out and learn what the business is and how you can bring value to that.” Donovan sees business leaders, including the chairman, regularly, and works hard to ensure that every member of his eight-person global risk team is “forward deployed,” working as closely as possible with each business unit. Forward deployment, understanding his “customers,” and getting buy-in from the top have been essential for his successful integration of political risk into Chevron’s business decision-making. When we asked how he judged success, Donovan answered without hesitation: “Whether the business unit keeps coming back to us and wanting more, and whether they are considering what we say when they make their decisions.”

At FedEx, the world’s largest express transportation company, natural and man-made disruptions to delivery service are a constant concern—from labor strikes to civil unrest to tropical storms and volcanic eruptions. The company constantly monitors and mitigates risks of all kinds at its Global Operations Control Center in Memphis, Tennessee. As we will see in the next chapter, part of the reason FedEx excels at risk management is its focus on the human factor: Many company leaders, including Paul Tronsor, FedEx’s vice president of global operations and service quality assurance, started out as couriers and package handlers. FedEx’s “Purple Promise” to make every FedEx experience outstanding is not just a slogan. It is personal. Company leaders, including risk managers, understand firsthand how what they do affects the customer.

Ultimately, integrating political risk analysis into business decisions requires encouraging different perspectives and building trust. In the last chapter, we talked about how understanding political risks requires seeing how others may view the world in ways that could generate challenges and opportunities for your business. Recognizing the incentives, priorities, and perspectives of others does not apply just to external stakeholders, but within companies as well. When it comes to corporate roles, where you stand does depend largely on where you sit. General counsels get paid to focus on legal restrictions. CFOs naturally gravitate to the financial implications of decisions. Sales teams are focused on driving revenues. Strategy shops think about longer-term industry trends, consumer tastes, competitor moves, market opportunities, and company advantages. Political risk officers are hired to assess how political actions could affect business operations and opportunities around the globe. None of these functions work well in isolation. Effectively integrating political risk analysis requires developing mechanisms to see these different perspectives so that political risk can be baked into assessing the trade-offs of a business decision, not considered as an afterthought.

One final example drives this point home. It involves Nike,57 one of the world’s largest suppliers of athletic footwear and apparel and the most valuable brand in sports.58 In 2013, Nike faced a major decision: Should the company increase its manufacturing relationship with Lyric Industries, a Bangladeshi supplier? Bangladesh was home to a $20 billion garment industry, offering some of the lowest-priced labor in the world, along with notoriously unsafe working conditions. “Our competitors were moving fast into Bangladesh and the pressure was getting bigger and bigger,” noted Nike chief operating officer Eric Sprunk. “We needed a strong point of view to say, ‘Are we going to increase our source base there or not?’”59

While Sprunk and other manufacturing executives believed they could cut costs, improve margins, and put adequate safety conditions in place, Hannah Jones, who headed Nike’s sustainable business team, warned that Nike could not ensure safe working conditions in Bangladesh and that the cost advantages would not be worth the risk to its brand of another outsourced labor scandal. Jones’s team had hired a consultant to create a country risk index for Nike’s global operations, and Bangladesh ranked near the bottom because of its labor practices and unsafe working conditions. “We faced a decision,” said Jones. “There was a lot of pressure to say, ‘Let’s go into Bangladesh like everyone else and get great margins and low costs.’ So we went through this discussion and looked at the risk index and said, ‘We’re not going to do that.’”60 The issue was especially salient for Nike. The company had come under heavy fire in the 1990s for child labor violations, and despite its devoting tremendous resources and effort to improving working safety and labor conditions in its suppliers since then and serving as a corporate social responsibility leader, challenges remained. In 2006, Nike faced another public crisis when a supplier was found to be using Pakistani children to stitch World Cup soccer balls, forcing Nike to pull $100 million worth of balls from stores just weeks before the World Cup began.61
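A country risk index like the one Jones's team commissioned is, at bottom, a weighted scoring exercise. The sketch below is purely illustrative: the factor names, weights, scores, and country labels are all invented here, since the methodology of the consultant's actual index is not public.

```python
# Weights for each risk factor (they sum to 1.0); higher scores are better.
FACTORS = {
    "labor_practices": 0.4,
    "workplace_safety": 0.4,
    "political_stability": 0.2,
}

def risk_score(scores):
    """Composite score as a weighted average of factor scores (0 worst, 10 best)."""
    return sum(FACTORS[f] * scores[f] for f in FACTORS)

# Hypothetical factor scores for two unnamed countries.
countries = {
    "Country A": {"labor_practices": 8, "workplace_safety": 7, "political_stability": 6},
    "Country B": {"labor_practices": 2, "workplace_safety": 1, "political_stability": 4},
}

# Sort ascending by composite score: the riskiest country comes first.
ranked = sorted(countries, key=lambda c: risk_score(countries[c]))
print(ranked)  # riskiest first
```

The mechanics are trivial; the analytic value lies in forcing a debate over which factors belong in the index and how heavily each should weigh.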

Sprunk and Jones, both of whom reported to the CEO, could not agree about Lyric Industries. So they decided that their teams should go on a field trip together to see conditions on the ground and make a joint decision about what Nike should do. “They had to go together,” said Sprunk, “because if Hannah came back and said we couldn’t do it there, the manufacturing guys would be unconvinced, and if only Nick [Athanasakos, the vice president of global sourcing and manufacturing] went, there would still be doubts.”62 After visiting the facility and meeting with Lyric managers, employees, and local residents, the joint team decided to cut ties with Lyric and reduce its manufacturing footprint in Bangladesh. Months later, Bangladesh experienced the worst industrial disaster in its history when the Rana Plaza factory collapsed, killing more than a thousand people. The disaster caused some companies, including Disney, to pull manufacturing from Bangladesh, while others stayed and vowed to institute new safety accords and inspection regimes.63

For Nike, the political risk perspective and the business perspective suggested different decisions about Bangladesh. But rather than handing political risk analysis off to business leaders and calling it a day, Nike executives realized they needed to bring the two teams together, in the field, to see conditions. Production teams are going to gravitate to the financial implications of a decision. Corporate social responsibility teams are more acutely attuned to the potential NGO reaction and damage to the brand. Spending time together in Bangladesh gave both teams a better feel for conditions and something else that proved just as essential: trust. Nike integrated political risk analysis into the business through shared experience.