Chapter 8
Forecasting Your Multiple Futures

Memories of the Future, the Hindsight Bias, Minus 148°, and the Principle of Multiple Futures

A business plan is based on a forecast, and a forecast is based on your ability to predict the future. The reason that no plan ever survives the first contact with the enemy is that our ability to predict the future is inherently flawed.

Bijan Moazami is a very smart guy. He graduated from McGill University with a bachelor’s degree in commercial finance and then went on to get a master’s in international finance and monetary theory. In 1995, when he graduated, he went to work for Morgan Stanley as a research analyst studying and reporting on property/casualty insurance and reinsurance stocks.

For the next twelve years Moazami learned more about insurance and underwriting. He spent his days studying the large insurance companies and their portfolios, using the most advanced predictive analysis tools to report on them. He even consulted, part-time, for McKinsey & Company, which was developing an advanced claims-adjusting software program. Moazami is an expert, a genius if you will, in the insurance industry. In 2003, his expertise and reputation took him to Arlington, Virginia, where he became the managing director and group head of insurance for the investment banking firm Friedman, Billings, Ramsey (FBR). In 2007, Forbes magazine named Moazami the third best stock analyst in the country.

A year later, FBR issued a press release, based on Moazami’s predictive analytical modeling, that said: “AIG [the insurance company] could have huge gains in the second quarter.” FBR declared it a “strong buy” and a top pick for investment portfolios. It was advice coming from the third best analyst in the country. A month later, however, AIG reported a loss of $5 billion for the second quarter. In the three months that followed, it lost an additional $25 billion. In September, it spiraled out of control and the U.S. government had to step in and bail it out to the tune of $150 billion. In other words, it cost every household in the United States more than $1,000 to keep AIG afloat. Unbelievable—a tragedy that no one saw coming, not even the third best analyst in the country, who spends his days—his life—focused on and studying such things.

How can this happen? How can a top analyst—a person with a deep understanding of the latest underwriting theories, who spends his days poring over financial statements with the latest technology and access to mounds of data—make such a horrid prediction? The answer, I say, has nothing to do with Bijan; it’s that predicting this type of thing with any real accuracy is simply impossible.

There are more than 57,000 books available today on Amazon.com that deal with the future. More than 6,000 titles on prediction. Another 4,000 on forecasting. Faith Popcorn makes a good living telling us what’s going to happen next and predicting future trends. But the reality is, when it comes to certain things, such as economic forecasting, long-term global weather patterns, business trends, and insurance underwriting, it’s impossible to tell the future. The reason is simple mathematics: if the outcome of an event is determined by three independent variables, each of which can take on just ten values, there are a thousand possible outcomes. If you have a hundred independent variables, the number of permutations exceeds the number of atoms in the observable universe. Of course, Moazami and his counterparts are well aware of this, so what they do is analyze the past, determine which variables drive various outcomes, simplify the reality, then monitor those variables to determine future outcomes. In other words, they develop elaborate mathematical models based upon the past and then use those models to predict the future. At its core, though, it’s a flawed process, because it’s impossible to know which variables are the determining ones. And the math and science behind it blind us to what’s really happening. As Nassim Taleb put it in his book The Black Swan, “The more science behind a prediction, the less likely I am to believe it.”
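To see how quickly the possibilities explode, here’s a minimal sketch (assuming, purely for illustration, that each independent variable can take on ten distinct values):

```python
# Combinatorial explosion: with n independent variables, each able to
# take ten distinct values, the number of possible outcomes is 10**n.
for n in (3, 10, 100):
    print(f"{n:>3} variables -> {10**n:.1e} possible outcomes")
```

No amount of modeling horsepower closes a gap like that.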

Likewise, in 1981, before the inaugural flight of the space shuttle, NASA put the odds of a catastrophic failure at 1 in 100,000 flights. Now, those are some smart guys at NASA; I know because I used to work with them (seriously, I’m not being sarcastic, they’re friggin’ rocket scientists, for Christ’s sake). However, history gives us quite a different answer. Challenger was destroyed on the program’s twenty-fifth flight; Columbia, seventeen years later, on its own twenty-eighth. That’s two failures in the program’s first 113 flights, a rate of about one in fifty-seven; not exactly consistent with the well-thought-out prediction, is it?

And consider this. In 1999, Larry Page and Sergey Brin decided that their Internet start-up was interfering with their schoolwork and decided to sell it. They thought it was worth $1 million. Yahoo! wasn’t interested. AltaVista passed. Excite (an Internet portal that was riding high in the nineties) was interested but thought the price too high, so the guys lowered it to $750,000. Still too high, Excite said. Okay, never mind, Page and Brin said, and decided not to sell. Today Google is worth almost $200 billion.

Business history is rife with really bad predictions. The board of directors of Apple fired Steve Jobs in 1985 after a fight between Jobs and the CEO, John Sculley. Jobs was an entrepreneur, not a CEO, they said. Ross Perot tried to buy Microsoft from Bill Gates in 1979. Gates wanted about $40 million for the company, and Perot thought it was worth about a quarter of that. They never came to terms, and Perot calls it the “worst business decision I ever made.” Decca Records turned down an offer to produce an up-and-coming new rock group, saying, “The Beatles have no future in show business.” In the late 1950s, BusinessWeek said, “The Japanese auto industry isn’t likely to carve out a big slice of the U.S. market.” Thomas Watson, Sr., the chairman of IBM, reportedly said, “I think there is a world market for maybe five computers.” And Bill Gates reportedly once predicted that “640K ought to be enough for anybody.”

Why do smart people make bad predictions? And why do we even think we can make good forecasts in the first place?

Memories of the Future

Actually, human beings are very good at making predictions. You and I do it all the time, and we do it with great precision. For example, I can predict what’ll happen if I drop a glass of milk onto the floor: it’ll shatter into a hundred pieces. I can also predict, with similar accuracy, what my wife will say if I walk into the house at two in the morning with liquor on my breath and no explanation of where I’ve been. I can make such forecasts because I’ve done those things before. This is what’s called backward reasoning.

For simple things, such as dropping a glass of milk or coming home late, backward reasoning works well. Our ancient ancestors on the open savanna honed these skills long before we developed agriculture, before we domesticated animals, before we began to live in complex societies. It worked for the simple stuff, such as determining whether the rustling of the grass was an approaching saber-toothed tiger or merely a harmless wind. However, for more complex things, backward reasoning is less effective, working only in certain cases and for short time frames. Sometimes it can be dangerously misleading, causing spacecraft to blow up or thousands of soldiers to die, or bankrupting once-dominant and storied companies. Yet we do it all the time, because those are the skills that natural selection has left us with. We project the past onto the future, making a forecast an exercise in recalling memories of the future.

Even a simple game such as tic-tac-toe is difficult to predict. Despite its simplicity, there are 255,168 possible games. If “X” is the first mover, there are 131,184 possible ways for “X” to win, 77,904 for “O” to win, and 46,080 ways to draw. Chess, on the other hand, is impossible to predict through backward reasoning. There are too many pieces, too many possible moves, so we can’t begin the game with the end in mind. Even though chess has a strict set of rules, all moves are observable, and there is no uncertainty about the positions or the players’ motives, it’s still impossible to predict the way a game will unfold. Chess experts estimate that there are 10^120 possible games; that’s a 1 with 120 zeros behind it. Even the most powerful computers would take billions of years to lay out all of the possible game combinations. You see, the more complex the game, the more players, and the more moves, the harder it is to predict an outcome through backward reasoning. We can use it near the end of the game, when there are only a few pieces left on each side, but not early in the game.
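If you doubt those numbers, tic-tac-toe is small enough to check by brute force. Here’s a minimal sketch (the code is mine, just for illustration) that plays out every possible move sequence and tallies the results:

```python
# Enumerate every possible tic-tac-toe game by brute force, counting
# the move sequences that end in an X win, an O win, or a draw.
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    for a, b, c in LINES:
        if board[a] and board[a] == board[b] == board[c]:
            return board[a]
    return None

def play(board, player, counts):
    for cell in range(9):
        if board[cell]:
            continue
        board[cell] = player
        w = winner(board)
        if w:                      # game over: someone just won
            counts[w] += 1
        elif all(board):           # board full, no winner: a draw
            counts["draw"] += 1
        else:                      # otherwise keep playing
            play(board, "O" if player == "X" else "X", counts)
        board[cell] = None         # undo the move, try the next cell

counts = {"X": 0, "O": 0, "draw": 0}
play([None] * 9, "X", counts)
print(counts, "total:", sum(counts.values()))
# -> {'X': 131184, 'O': 77904, 'draw': 46080} total: 255168
```

The whole enumeration runs in well under a second. Scale the same idea up to chess and you’d need longer than the age of the universe.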

Business is more like chess than it is like tic-tac-toe. In fact, it’s far more complex than chess: it has more variables and more possible moves and so is far less deterministic. Unlike in chess, we often don’t see the important moves, there are multiple players, and we’re rarely privy to their goals and motivations. And whereas chess has three possible outcomes (win, lose, or draw), business has an infinite number. The possible ways that things could play out don’t just exceed a 1 followed by 120 zeros; for all practical purposes, they’re infinite. Yet we use backward reasoning, thinking that we know how to predict these complex things.

The problem is that our view is biased by past events. Looking back at them creates a bias in how we perceive future events: they may look the same, but they never truly are. The Danish philosopher and theologian Søren Kierkegaard noted this a century and a half ago when he said, “Life is lived forward but understood backward.”

Living life with backward reasoning is like trying to drive a car by looking in the rearview mirror. Yet we all do it. We have to; it’s all we have. We just need to be careful and understand its limitations, or else we’ll make some serious mistakes. As a mountaineer, I can tell you that looking up a steep, ice-filled couloir is far different from looking down from the same spot. Things don’t look as steep when you’re looking up as they do when you’re looking down. The same is true of looking into the future. Looking back creates a bias in how we perceive things, and it’s important for us to understand this prejudice. The views are fundamentally different.

The Hindsight Bias

The difference between looking forward and looking backward is the difference between walking a trail that’s a single path versus walking one with multiple pathways. Looking forward, there are numerous options, different roads you can take, alternative futures. Looking backward, there’s only a single road, a solitary path, just one way that things actually happened. So when you look at the past, it always looks deterministic, since only a single sequence of events took place. It makes the path to the future look obvious and the branches nonexistent. We don’t see the alternative limbs, the optional futures, or what could have been when we look behind us and study history.

Jeffrey Rachlinski, a professor at Cornell Law School who studies the psychology of legal decision making, conducted an experiment to see how the bias plays out in the courtroom. In an experiment based on an actual case, he asked two groups to estimate the probability of flood damage caused by a drawbridge that blocked the flow of water. The control group was told only what the city managers had known when they decided not to hire a watchman for the bridge; they knew nothing about the actual flood or how the bridge had backed up the water and caused millions of dollars’ worth of damage. The second group, the experimental group, was given the same information plus the fact that the flood had actually occurred and that the drawbridge had acted as a dam and caused the damage. Then each group was asked the likelihood of multimillion-dollar flood damage, given what the city managers knew. Only 24 percent of the control group, who had no knowledge of the actual flood, predicted a high-impact flood and said the city should take precautions against it. However, more than 75 percent of the experimental group said that a flood was imminent and that the city would be legally negligent if it didn’t take precautions. The experiment was run numerous times, and each time the results were the same. Remember, both groups were given the same basic information, yet their predictions were dramatically different. The only difference was that the experimental group knew the flood had actually happened, and so, in hindsight, their sense of what could have been predicted was highly prejudiced by actual history.

We see this play out everywhere. We think it was so obvious that the Challenger was in trouble on that January morning. Looking back, the warnings look blatant. However, what we don’t see are the hundreds of other concerns with the shuttle system, the thousands of things the NASA managers were considering. And their minds were biased by all the successful launches, dozens of them. They were projecting that past onto the future, just as any of us would do.

Even in mountaineering, the hindsight bias plays tricks on your mind. For example, in order to qualify to climb McKinley, you have to have a résumé of lesser peaks climbed in winter conditions. The guide service doesn’t want to lead inexperienced people into such a harsh environment. It’s dangerous. So as preparation, my friend and I decided to climb Mount Whitney in January. At 14,495 feet, Whitney is the tallest mountain in the lower forty-eight states, and we decided to climb the mountaineers’ route—the steep escarpment of the east face. It was a four-day expedition, and it was the worst four days of my life. As luck would have it, it was the coldest weekend of the year. Our first night in the tent, at 11,000 feet, the temperature dropped to minus 25 degrees—inside the tent! To make matters worse, I became incredibly claustrophobic and refused to zip my sleeping bag up all the way—it felt too much as though I was in a tomb. I thought I was going to die. A few times I drifted off to sleep but woke up gasping for air, out of breath, as though someone had punched me in the stomach. It’s a typical reaction to sleeping at high altitude, a form of sleep apnea. The air is so thin that while you sleep your body realizes it needs more oxygen, so it stops breathing in order to wake you up gasping for air. You wake up with the wind knocked out of you. It’s a cruel biological trick, and it’s terrifying the first couple of times it happens to you. I didn’t fall asleep the rest of the night and wasn’t sure I would survive till dawn. I wasn’t equipped physically or mentally for the harsh environment. We never summited the mountain, and at the end of the trip I told my friend there was no way I could ever handle the (even harsher) environment of Mount McKinley. Then a funny thing happened.

A few days after the trip, after I recounted the horror of it to friends and family, my recollection of it started to change. Looking back, it was a great adventure and I was proud of myself for surviving it. Looking back, I knew that I would survive it, so the horror and uncertainty of it became lost in the remembrance. A week later I was committed to take on Alaska. I had completely forgotten the pain and suffering, how the mountain had knocked the wind out of me. I recalled only the adventure, and I perceived only a single future, not the possibility that I could have died. Hindsight had biased my mind, and a few months later, in Alaska, it would all come back to me and I’d be pissed at myself for forgetting. Go figure, right?

The bias is a result, as I’ve said, of the psychological difference between looking backward and looking forward. It clouds our perception of things. However, our business model is built around a set of problems, and we use the scientific method to solve them. We develop a hypothesis, based on our understanding of cause and effect, and then begin solving the problem by managing through metrics that are driven by our tactics. We analyze the results and then make adjustments based on our analysis. The problem is that using this method creates its own set of biases. So although we use the scientific method to construct our model and make adjustments to it, we need to be aware that business is a pseudoscience at best.

The Curse of Isaac Newton

Isaac Newton was arguably the most influential person of the last four hundred years. He made science practical. His three laws of motion enabled engineers to construct larger buildings, longer bridges, and safer rocket ships. They allowed engineers to predict, with great accuracy, the strength of structures and vehicles. Before Newton, construction was guided by trial and error. Lessons were learned when a building or bridge collapsed. After Newton, engineers were able to predict how things would hold up. They could “try things out” with paper and pencil before they built them. And so Newton created a mass belief in the power of the scientific method and the role of the scientist—a new religion, if you will, and one that was really, really good at making predictions.

The scientific method is simple: The scientist makes an observation of a natural phenomenon or effect. Then he makes a hypothesis as to the cause of the effect. Then he uses this hypothesis to predict future behavior and verifies it through experimentation. Newton used the observations of Johannes Kepler—how things behaved in the sky—and the observations of Galileo—how things behaved on the surface of the earth—and from those he created a series of hypotheses, which became his three laws of motion, and then made predictions on how things would play out (for example, calculating the orbital path of a comet and so predicting its arrival). He called the cause “gravity” and the effect “attraction.” Science is a subject that’s based on backward reasoning. We look at what’s happened, determine cause and effect, and then project them into the future.

Science made us masters at forecasting, so we began applying this method to a whole range of subjects. We used it in chemistry, geology, and astronomy. It worked well. But then we began using it in politics, economics, psychology, and sociology. It didn’t work as well. Today we refer to these domains as the “soft” sciences. But are they really sciences at all?

The difference between them is the complexity of the subjects. While you may think of Newtonian physics as highly complex, relative to other domains it’s not. It’s the study of very simplified things. Calculating the trajectory of a ball or a missile is a relatively simple process with only a half-dozen variables. However, in the soft sciences, the number of variables increases dramatically. Newtonian physics is like tic-tac-toe; you can easily calculate the outcomes because the variables are controlled and the laws are in place and easy to follow. You can set up experiments to verify a hypothesis. Business, on the other hand, is incredibly complex; there are far more moving parts and interrelated variables, we don’t really understand the governing laws, and it’s difficult to set up experiments to verify any hypothesis about them. This makes economics more like chess. We can only guess and look at historical information, employing backward reasoning and so compounding the hindsight bias. We refer to economics as a science, but it really isn’t, not in the same way that physics is. I’m not saying that the work economists do is wrong or meaningless; I’m just saying that economics is not a science, not as deterministic as Newtonian physics. Calling it a science lulls us into a false sense of security. It makes us think there is only one possible future.

Adaptive management, ironically, uses the scientific method to construct and implement business plans. The danger of this method is that it makes us think of business as a deterministic subject, something that we can predict if we know the causes and effects, all of the variables, and how they react. This is the curse of Newtonian physics, because we’re applying the scientific method in a place and to a subject that has too many variables, one in which the causes and effects are hard to quantify, hard to predict, and difficult to verify with experimentation.

Instead, in order to make reasonable guesses and develop hypotheses about our markets and how to control those markets, we have to understand chaotic systems and the difference between organic design (a bottom-up process) and intentional design (a top-down process). Random things happen, and we have to use those things to our advantage.

Your business model operates in a chaotic system, ruled in part by seemingly random events that are impossible to predict. Often we forget this, especially when we are successful in predicting the outcome of certain events. It lulls us into a false sense of security. It makes us think that everything happens for a reason, that the world is deterministic, that if we study things hard enough we will be able to see into the future. Nothing could be further from the truth.

Chaotic Systems

Human conduct is incredibly complex. We don’t have a good understanding of what drives decision making; it’s a strange combination of logic, emotion, and subconscious desires and drives. It’s very difficult to predict, especially when the situation is made up of hundreds, thousands, or even millions of individual decisions, as in business situations. The theoretical physicist Leonard Mlodinow says that, unlike physics, social systems such as businesses are not governed by fundamental laws, at least not ones that we fully understand. “Human affairs are so complex,” he says, “that it is doubtful that we could carry out the necessary calculations even if we understood the laws and possessed the data. As a result, determinism is a poor model for the human experience.” A better model, it turns out, is one created by a meteorologist in the 1960s.

Edward Lorenz was at the forefront of meteorology. He was one of the first in his field to have access to a powerful computer, and he used it to develop predictive models for weather forecasting. Using his understanding of the laws that govern weather patterns, he wrote a complex computer program that turned those laws into mathematical equations. He’d then plug in certain meteorological data, such as barometric pressure, frontal systems, and the moisture content of those systems. The computer would then predict future weather patterns based upon this information. It worked pretty well.

Then one day Lorenz decided he wanted to run a particular scenario again and see how it would look if he extended his forecast. Instead of starting from the beginning, he plugged in data from halfway through the scenario, and something strange happened. Instead of the same prediction as before, the computer predicted a completely different scenario. Lorenz was baffled. How could that be? He had entered the data exactly as it appeared in the other scenario, even though he had started at a different point in time. Instead of dismissing the error, he spent days poring over the data to determine the cause of the mistake. He discovered that he hadn’t actually entered the data exactly as in the previous scenario. Though the computer calculated things out to ten decimal places, it printed out the numbers in only three decimal places. So instead of entering 0.293141345, for example, he had simply entered 0.293. Now, you’d think that this wouldn’t make much of a difference, that a slight variation in one condition would simply lead to a slight variation in the overall prediction. But it didn’t. The slight variation led to a completely different prediction down the line. A small thing had a huge consequence over time.

Lorenz would go on to poetically label this the “butterfly effect”; in other words, the tiny flap of a butterfly’s wing in California (a change in the tenth decimal place of the data) could result in a tornado in Kansas City or a hurricane on the other side of the planet. Not only is this a memorable metaphor, it’s a useful one for more than just meteorology. The butterfly effect dominates any endeavor in which human behavior is involved.
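You can reproduce Lorenz’s discovery on any laptop. Here’s a minimal sketch (my own illustrative code, not Lorenz’s program) that integrates his famous convection equations twice, once from a “full-precision” starting point and once from the same value rounded to three decimal places, the way his printout rounded it:

```python
# Two runs of the Lorenz system that differ only in the rounding of
# one starting value; a crude forward-Euler integrator is enough.
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (0.293141345, 1.0, 1.0)  # the "full precision" initial condition
b = (0.293, 1.0, 1.0)        # the same condition as the printout rounded it

for step in range(1, 3001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 500 == 0:
        print(f"t={step * 0.01:5.1f}  x_a={a[0]:8.3f}  "
              f"x_b={b[0]:8.3f}  gap={abs(a[0] - b[0]):9.5f}")
# Early on the gap is microscopic; well before t = 30 the two
# "forecasts" bear no resemblance to each other.
```

The starting values here are Lorenz’s situation in miniature, not his actual data; the point is that a difference in the fourth decimal place eventually swallows the whole forecast.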

If you don’t believe me, simply trace the important things that have happened in your own life, and you’ll find that, at some point, a completely random event was involved. For example, thirty years ago, right before my graduation from the University of Vermont, I accepted a job with Newport News Shipbuilding in Virginia. I was going to design and engineer nuclear submarines for the navy. The week before graduation was filled with all-night parties, daylong hikes in the Green Mountains, and dreams of the new life that awaited me. One morning, as I sat in my kitchen plotting out the day’s adventure, I noticed a scrap of paper lying on the floor. I picked it up and went to toss it into the trash but noticed writing on the other side and so turned it over. It was a message for me, a man’s name and a telephone number that my roommate had taken for me the day before. Or the day before that, for I’m not sure how long it had been lying there. I didn’t recognize the name or the area code.

It turned out that the message was from Ernie Oddo, the director of the Structures Laboratory of the McDonnell Douglas Astronautics Company in Huntington Beach, California. I had sent out two hundred résumés in the months before graduation, and one of them had found its way to his desk. When I returned his call, he said he had a job opening, but I explained that I had already accepted a job in Virginia. He said he was prepared to offer me a job over the phone. I’d be working on the space shuttle, the Delta rocket, and the MX missile program. He also offered me more money and the opportunity to live on the beach in southern California. My fate was sealed. I took the offer, became an aerospace engineer, bought a surfboard and a Corvette, and the rest, as they say, is history. Destiny. I met my wife, Terri, at McDonnell Douglas, and years later she’d give birth to a beautiful baby girl. So now Katie lives in San Clemente, California. She was captain of the high school cheerleading team and is now studying to be a psychologist. Her very existence is due to my picking up a scrap of paper off my kitchen floor. Go figure, right? I suspect that your existence is also the result of a chance occurrence. You’re indebted to randomness. Although it doesn’t seem like it, the world is ruled by chance occurrence, and so it is a chaotic system. As the Nobel laureate Max Born once said, “Chance is a more fundamental conception than causality.”

So instead of gambling the future of our business on a single outcome, we need to prepare for multiple outcomes. The key to forecasting, it turns out, is to make numerous predictions. The reason Nostradamus was so accurate in his prophecies is that he made a shitload of them. As the economist Edgar Fiedler said, “If you have to forecast, forecast often.”

The Principle of Multiple Futures

Herman Kahn was a likable fellow. Portly, balding, and bespectacled, he looked like your neighbor’s dad or a nerdy accountant. But beneath the fatherly mannerisms was the mind of a devious, and some say dangerous, strategic seer. Kahn, the founder of the Hudson Institute think tank, began his career at the RAND Corporation as a systems theorist and became well known for his analysis of the consequences of an all-out nuclear war. In 1960, he published a book called On Thermonuclear War, an obvious homage to Clausewitz’s On War, which laid out various situations following a nuclear exchange with the Soviet Union. Whereas most military and political strategists considered a nuclear war unwinnable, resulting in complete annihilation, Kahn posited that the United States could survive a nuclear attack from the Soviets. Hundreds of millions of people might die, major cities would be destroyed, life as we know it would change, but there would still be survivors, and for them life would go on. He described various outcomes, most of them horrific, but argued that the survivors would not envy the dead.

This led to a major shift in strategic thinking. Using basic game theory, Kahn explained that it was crucial that the United States prepare for an attack and have the resources to retaliate in the case of a first strike by the Soviets. No matter how effective their first strike, Kahn reasoned, we had to have the capability to strike back. This led to the doctrine known as “mutual assured destruction,” with the macabre but appropriate acronym MAD. In other words, the Soviets would experience large-scale destruction if they ever launched an attack on the United States. And vice versa. One would have to be suicidal or “mad” to launch a first strike. That led to the arms race, the idea that the more bombs, the more likely some would survive an attack, and the Cold War strategy known as “the triad”: land-based missiles in hardened silos able to withstand anything but a direct blow from a Soviet warhead; submarine-based missiles, able to avoid detection beneath the sea; and nuclear-armed bombers, some always in the air, hard to locate, and ready to strike. Stanley Kubrick’s classic movie Dr. Strangelove was based, in part, on Kahn’s work, and Kubrick is said to have carried a copy of Kahn’s book with him as he wrote and filmed the movie. He even met with Kahn a few times to consult on some of the finer points of nuclear strategy. In fact, it was Kahn who suggested the idea of the Doomsday Device (a machine automatically triggered by a first strike that would destroy the entire world), which became a critical plot point in the Kubrick screenplay.

For us, though, it isn’t Kahn’s contribution to thermonuclear strategy that’s so important, it’s the process he used to develop it. Using game theory and systems analysis, he developed the idea of scenario planning as a means of generating strategic alternatives and ultimately crafting strategic doctrine. Most futurists, up to that time and still today, rely on a single “most likely” scenario to develop their predictions and base their strategy on. Kahn pioneered the idea of multiple scenarios when it came to strategic development, a routine still practiced today in the bowels of the Pentagon. A decade later, Royal Dutch Shell, the giant oil corporation, would institute a similar process for its business strategy, one that helped it anticipate and weather the oil shortage brought on by the OPEC cartel. (Later, it would become the largest industrial company in the world.)

The Principle of Multiple Futures is an homage to Kahn. It says that it’s impossible to predict the future, because business operates in a chaotic environment in which there are multiple competitors and multiple customers, all interacting in ways that are impossible to quantify. So the adaptive manager engages in scenario planning, which gives him the insight and intelligence to make real-time adjustments to his plan. The reason that Lieutenant Colonel Rudder had grappling hooks on D-Day was that Ike had considered several scenarios in which the Allies weren’t able to knock out the howitzers with tank or aircraft fire, and he wanted them to have the ability to scale the cliffs if one of those scenarios came to pass. And it did. That’s why we create a tactical inventory. Even if we never need to use the grappling hook, it’s nice to have just in case.

It’s because of multiple futures that we need adaptive management in the first place. We realize that our primary scenario probably won’t play itself out, and so we consider the others. This allows us to build our tactical inventory and have other tactics available, like grappling hooks, just in case. Though scenario planning has been embraced by the military, it’s never really come into full acceptance in the business world. Some business leaders consider it sacrilegious, or an insult to their perceived ability to be business prophets.

It’s important to take the time to identify multiple scenarios in addition to the one you think most likely. This forces you to think in terms of multiple futures and makes you sensitive to the reality of the situation as it unfolds. Scenario planning helps you determine the weaknesses in your business model and enables you to reinforce it if possible. The process is very simple.

First, you have to get a team together to craft the different scenarios. These should include both plausible scenarios and ones you consider highly unlikely. Some of your scenarios should be uncomfortable. For example, what will happen if Microsoft gets into the business and copies our model? How will our business fare in the case of a massive recession? A depression? What will happen if we lose our top three customers? For example, a small company I consult with is in high-growth mode, but all of that growth is from a single customer, Wal-Mart. What will happen if Wal-Mart decides to pull the plug? It would be devastating to the company, but it’s a real possibility. Working through that scenario results in a contingency plan: cash reserves, layoffs. It’s important for business leaders to consider these different scenarios.

Scenario planning is different from contingency planning. With contingencies, we typically take into account a single uncertainty. With scenarios, we consider a combination of uncertainties. This allows us to identify the key external and internal drivers of our business model. How sensitive are certain aspects of the model? For example, at Preferred Capital, we were very sensitive to the availability of our funding sources. We had only a few large sources, and losing one would have been devastating to our business. That sensitivity meant we developed a number of different ways to cement those relationships.

Here are a few more guidelines for developing scenarios. First, choose your time frame. Remember, time frames are key both in predicting and in evaluating the plan as you implement it. Second, define the driving forces in your business model. These are both external things, such as the market, competitors, customers, and suppliers, and internal things, such as resources, cash flow and available cash, employees, and strengths and weaknesses. Also define trends in the market and even in society in general. Third, brainstorm possible scenarios. Choose uncomfortable ones. Define the worst-case scenarios. Play the trends out. Define possible catastrophic events. Scenarios should include legal problems, cultural trends, competitive moves, and internal resources. Fourth, choose several scenarios and define how you would respond to them. Write out the scenarios and create a narrative, just as you did with your strategic plan, for the narrative will help you “play them out” just as the military does when it plays war games.
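If it helps to make that concrete, here’s a minimal sketch of one way to capture a scenario inventory. Every name and field below is illustrative, my own shorthand for the process rather than a formal method:

```python
# A toy scenario inventory: each scenario records its time frame,
# driving forces, narrative, and planned responses.
from dataclasses import dataclass, field

@dataclass
class Scenario:
    name: str
    time_frame: str                # e.g., "next 18 months"
    driving_forces: list[str]      # external and internal drivers
    narrative: str                 # the written story you "play out"
    defensive_responses: list[str] = field(default_factory=list)
    offensive_responses: list[str] = field(default_factory=list)

inventory = [
    Scenario(
        name="Largest customer pulls the plug",
        time_frame="next 18 months",
        driving_forces=["customer concentration", "cash reserves"],
        narrative="Wal-Mart drops us at the spring reset; revenue falls...",
        defensive_responses=["cash reserve target", "staged layoff plan"],
        offensive_responses=["court two mid-size retailers now"],
    ),
]

for s in inventory:
    print(f"{s.name}: defend with {s.defensive_responses}, "
          f"attack with {s.offensive_responses}")
```

The point isn’t the code; it’s that writing the scenarios down, with their drivers and responses in one place, is what builds the tactical inventory.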

Once you have the scenarios, look and see how you’d respond. Think in terms of defensive responses. Can you weather the storm if your largest customer leaves? Think also in terms of offensive responses. What can you do if that happens? You’ll begin to develop a sense of your risks, be able to craft a more realistic, “futureproof” strategy, and be better able to see and sense opportunities as they unfold. That’s really the key to Plan B. The evolution of your plan is based on your ability to sense opportunity when it knocks and to open the door and walk through when it does. You want to leave yourself options; in fact, options are the key to opportunity. Hernán Cortés burned his boats to motivate his troops when they landed on the Mexican mainland—an impressive move but one that limited his options. Don’t make that mistake. In fact, we want the boats, we may need the boats, for we may want to sail somewhere else to find opportunity. This isn’t just true for business; it’s true for life in general.

Minus 148°

As I trained for my McKinley expedition, I did a lot of scenario planning. Mountaineering is full of worst-case situations. For me, the obvious one was climbing in a place with extreme exposure: huge drops, wind-scarred slopes, rime ice. I developed some tactics for dealing with those, such as walking on a slick ladder laid across two rocks in my backyard. This taught me to focus and to walk on small outcroppings with mountaineering boots and front-point crampons. After a while I felt comfortable doing it.

However, it wasn’t until I read the book Minus 148° that I took my scenario planning really seriously. It tells the story of a 1967 expedition that was caught just above the 17,000-foot camp at Denali Pass and endured, as the title implies, some of the harshest conditions ever suffered by explorers anywhere in the world. McKinley is one of the coldest places on Earth, and Art Davidson, Ray Genet, and Dave Johnston became Alaskan heroes when they faced the worst that the mountain has ever thrown at an expedition party—terrifying winds, massive snowfall, and temperatures that should have killed them. It’s a story of incredible hardship, and it terrified me. The idea of camping on the same exposed ridge as the 1967 party haunted my dreams. I dreaded the 17,000-foot camp, and it became my new worst-case scenario. Over the next few months, I devised a contingency plan to help me endure it that included the purchase of a down sleeping bag rated at minus 40 degrees, a pair of Millet high-altitude boots, and an Absolute Zero down-filled mountaineering suit. They were my grappling hooks.