An executive is a person who always decides; sometimes he decides correctly, but he always decides.
—John H. Patterson
The entrepreneur’s world is an uncertain one full of risks and messy, incomplete, and incoherent data. That’s a tough environment for making decisions, but indecision is fatal. John H. Patterson (1844–1922) was the founder of the National Cash Register Company; NCR’s 1925 public offering was the largest of its time. As CEO he pioneered the notion of creating humane working conditions and established the world’s first sales training school and, in 1893, the first “daylight factory” buildings. (He was also Dartmouth class of 1867.) He both hired and fired Thomas Watson, Sr., who later built the Computing-Tabulating-Recording Company into one of the most effective selling companies in history; in 1924 it was renamed IBM. Patterson was fond of saying, “An executive is a person who always decides; sometimes he decides correctly, but he always decides.” This should be taped to the bathroom mirror of every start-up founder and leader. Decision making is one of the most important attributes of a leader in any organization, especially one that is starting up, because the decisions come every day, and they have to be made or they will make themselves.
In June 2007 Jerry Yang assumed the role of CEO of Yahoo, the company he had cofounded, hoping to pull it together in tough times. Employees were excited, but the mood quickly turned as Yang began “punting tough decisions.” At the time one of Yahoo’s former executives said, “The thing that Yahoo needs most is someone who can make decisions and who is comfortable with the risk of making fast decisions.” It had become clear that “Mr. Yang’s inability to make tough decisions on matters ranging from new products to strategic alliances stymied his effectiveness and failed to get Yahoo out of its hole.” In November 2008, Yang stepped down as CEO. He didn’t get fired for turning down a purchase offer of $33 a share from Microsoft in April 2008 and running the stock price down to $9; he got fired for being indecisive. The stock went up 9 percent when the board announced his resignation.1
A 2005 Wall Street Journal article titled “Non-Deciders Make Everyone Else Suffer” said it perfectly: “A non-decider possesses a hypnotic ability to redirect everyone’s attention. … Indecisive managers may not accomplish much. But on the long list of things they don’t do is this: get fired.”2
This is the challenge entrepreneurial founders and leaders face every day: decision or indecision? There’s a fully developed field of decision science, complete with models, texts, a research canon, and a curriculum in most business schools. However, no entrepreneurship book we know of has a chapter about what it’s like and what it takes to make decisions in an entrepreneurial setting. Perhaps the assumption is that entrepreneurs already know how to decide effectively, however doubtful that seems.
Decisions—big decisions, often with long-lasting and profound consequences—come fast and furious in the entrepreneurial setting, seldom leaving much time to make them. In fact, often it’s not clear which decisions are called for and when. Defining decision-level issues and their timing is in fact part of the art—and make no mistake, decision making is at least as much art as science, if not more. Rational decision making has long been studied. Recently, investigators have begun examining the role of experience and intuition in decision making. Good decisions are hard to come by, but they’re worth a lot. Hard work and good execution are keys to success, but they can’t make up for bad decisions.
It’s hardly possible, of course, to cover the rich field of decision making in a short chapter in a book on start-up entrepreneurship. As with most of the chapter topics in this book, there is ample literature if you want to go deeper. Some of it is quite good and relevant to instinctive decision making in entrepreneurial settings, where analytical methods based on reams of data are almost useless and creativity and framing are essential. We do, however, want to sensitize you to a few key points. We particularly want to help frame for you what really goes on in the brain when humans make decisions. We hope this can help you get into a mindset in which you are comfortable making complex, difficult decisions when you don’t have all the information you need (you seldom will) and there isn’t the luxury of time to deliberate and ponder (there hardly ever is). Nondeciders make everyone suffer, but in an early-stage entrepreneurial setting it’s the enterprise that suffers most of all, sometimes terminally.
Much of decision science focuses on the notion of rational decision making. Here is a popular definition: the “systematic, step-by-step method in which ‘hard’ (quantitative) data obtained through observation or mathematical (statistical) analysis or modeling is used for making long-term decisions.”3 A quick search of the literature turns up a rich sampling of various rational decision-making models:
Get the idea. Evaluate the idea. Find the problem. Devise alternatives. Choose.

Understand the problem. Devise a plan. Carry out the plan. Evaluate the results.

Identify the problem. Define the problem. Evaluate possible solutions. Act. Evaluate successes.

Identify the problem. Obtain information to diagnose causes. Generate possible solutions. Evaluate the various solutions. Select a strategy for performance. Perform and revise the solution.

Sense the problem. Define the problem. Generate alternatives. Evaluate alternatives. Choose. Plan action. Implement.
You get the idea.
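All these models share the same skeleton: define a problem, generate alternatives, evaluate them, choose one, act. As a minimal sketch of that shared loop (every name and callable here is an illustrative placeholder, not a standard library or any author’s method), it might look like this:

```python
# A generic sketch of the rational-model skeleton the lists above share.
# All the callables are hypothetical placeholders supplied by the caller.
def rational_decision(problem, generate_alternatives, evaluate, act):
    alternatives = generate_alternatives(problem)            # devise alternatives
    scored = [(evaluate(problem, alt), alt) for alt in alternatives]
    best_score, best = max(scored)                           # choose the top-scoring option
    return best, act(best)                                   # carry out the plan

# Toy usage: pick the price point closest to a target of 19.
choice, outcome = rational_decision(
    problem="pick a price",
    generate_alternatives=lambda p: [9, 19, 29],
    evaluate=lambda p, alt: -abs(alt - 19),
    act=lambda alt: f"chose {alt}",
)
# choice == 19
```

The point of this chapter, of course, is that real entrepreneurial decisions rarely fit this tidy loop: the list of alternatives and the scoring function are exactly what you don’t have.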
Unfortunately, much rational decision theory is drawn from algorithm-based models or has been developed through experimental work, which mainly uses artificial tasks. These are based on comparisons of multiple options defined by the experimenter that don’t require problem definition or domain experience. Rational decision making often is taught by people who haven’t had careers infused with the trial and error of real-world problem solving.4 The roles of experience, intuition, intangible insight, and emotion are deemphasized or unclear.5
How useful are models like these? Mechanistic, rational problem solving teaches rules, facts, and procedures. It may make sense for simple procedural tasks, but it is not clear that it’s useful at higher levels of expertise or for making complicated judgment calls that take into account what is going to happen in the future. Inputs consist primarily of information about the past and present. The future is largely ignored as unknowable. It’s like trying to drive a car by looking in the rearview mirror.6 Trends are often the least reliable indicator of the future. Yogi Berra had it right: “It’s tough to make predictions, especially about the future.”
“Decision Theory is frequently inapplicable because too often its information requirements cannot be satisfied. The theory is irrelevant because real-world decisions are frequently made by matching or assessment.”7
In a seminal paper in 1955, Herbert Simon suggested that because people cannot possibly take into account all the factors involved in economic decisions or make all the necessary computations, they put boundaries on the decision process and settle for options that are “good enough”—what he called “satisficing.” This is the concept of bounded rationality in decision making.8
Never mind the data or the algorithms; in the entrepreneurial world, even the problems often are ill-defined. You have to continue clarifying the goal even as you try to achieve it, and often the goal does not remain constant because changes of information are so frequent. “The first dreary conclusion is that unknowable situations are widespread and inevitable. … The second dreary conclusion is that most investors—whose training fits a world where states and probabilities are known—have little idea of how to deal with the unknowable.”9
Thus, rational decision-making models work mostly with lots of data from the past and present. However, they fail to take into account the idea that past trends often are poor predictors of the future, and they are unwieldy in an entrepreneurial world of messy, incomplete, and incoherent data. Even more important, it’s not clear that the models take into account how our minds really think about decisions—especially in circumstances of data overload and massive uncertainty—or show how to use that information to make more effective decisions. That said, researchers looking at decisions made under constraints of time and uncertain information find that fast decision makers in such environments tend to use more information than slow decision makers, develop more alternatives, and make better decisions.10
Decision research published by Lowell Busenitz of the University of Oklahoma and Jay Barney of Ohio State University compared decision making by entrepreneurs and managers in large organizations and found that entrepreneurs and innovators are more likely to use nonrational decision processes and more frequently employ biases and heuristics: rules- or intuition-based methods for making decisions. Two in particular stand out: overconfidence (overestimating the probability of being right) and representativeness (overgeneralizing from a few characteristics or observations).11 Many entrepreneurial contexts involve complex problems and multiple hurdles that render rational decision analysis overwhelming or impossible. In contrast to large, long-standing organizations in which there is history, performance, and other information, in the start-up environment entrepreneurs have no past history, only a bewildering complex of information from all directions in the environment. In such circumstances, decisions are nearly impossible without the simplifying power of a number of biases and heuristics that different people use at different times.12,13 Also, overconfidence helps inspire and motivate a team and investors. Busenitz and Barney note, however, that the tendency to rely on biases and heuristics may be a handicap for entrepreneurs who want to make the transition to being managers of their growing organizations.
Other researchers have looked beyond compartmentalized behaviors such as biases and heuristics. Writing about his best-selling book Blink, Malcolm Gladwell posted this online at Slate:
One of the key arguments in my book is that human beings think in two very different ways. Sometimes we consciously and carefully gather all facts, rationally sort through them, and draw what we take to be a rational conclusion (the Standard Model). And sometimes we reach conclusions unconsciously—our mind quickly and silently sorts through the available information and draws an immediate judgment, which may be done so quickly and so far below the level of awareness that we may have no understanding of where our conclusions came from.14
Blink popularized a growing field of decision research that is focused not on models and systematics but on observing what goes on when people actually make decisions in real time and real circumstances. Gladwell talks about what he calls thin-slicing—finding significant patterns in narrow slices of experience, for example, marriage counselors accurately judging the likelihood of future divorces by recognizing “contempt” signals such as eye rolls and tone.15 Gladwell summarizes research showing that thin-slicing doesn’t pay attention to everything head on; it removes the “fog of too much data”16 and the distractions of the stereotype,17 instead coming at an issue “sideways.”18 We can change the way we thin-slice by changing the experiences that constitute our impressions.19
One of Gladwell’s main sources is the work of the research psychologist Gary Klein, who began studying what he now calls naturalistic decision making in the early 1970s. His work with firefighters, medical professionals, and the military led him to conclude that laboratory-based decision-making models aren’t representative of what happens when decisions are made in contexts of uncertainty. His book Sources of Power summarizes his work and the model he calls Recognition Primed Decision Making.20
Here is an example of what Klein and Gladwell are talking about:
The Sixth Sense
It is a simple house fire in a one-story house in a residential neighborhood. The fire is in the back, in the kitchen area. The lieutenant leads his hose crew into the building, to the back, to spray water on the fire, but the fire just roars back at them.
“Odd,” he thinks. The water should have more of an impact. They try dousing it again and get the same results. They retreat a few steps to regroup.
Then the lieutenant starts to feel as if something is not right. He doesn’t have any clues; he just doesn’t feel right about being in that house, so he orders his men out of the building – a perfectly standard building with nothing out of the ordinary.
As soon as his men leave the building, the floor where they had been standing collapses. Had they still been inside, they would have plunged into the fire below.21
What does Klein want us to see in these circumstances? Experience sees a situation as a prototype of things seen and experienced before. It identifies a reasonable reaction without having to see all the facts and without having to generate and compare options. These deciders don’t refuse to compare options; they don’t have to. Klein quotes a fire ground commander: “I don’t make decisions. I don’t remember when I’ve ever made a decision.”22 These deciders don’t generate options one at a time; they look for the first workable solution, not the best. They evaluate by imagining how it will play out, not by analysis and comparison to others. Imagining suggests improvements, not just choices.
There is an impulse to act rather than waiting to evaluate all choices. Often they use analogues and metaphors to direct thinking, framing situation awareness, identifying appropriate goals, and flagging relevant pieces of information. This provides a structure for making predictions when there are many unknown factors, linking familiar sets of causes to past outcomes. Klein believes that most of this happens without a deliberate thought process. Like the firefighter, these people hear emotion talking to them in their heads, not calculation. Granted, the context of the example is one of crisis, with no time for contemplation or calculation, but Klein’s research shows that the same thought processes seem to be at work in less pressed and stressful situations too.
There are some surprising empirical studies supporting this less-than-rational model for how people make decisions. In a psychology study done at the University of Amsterdam, Ap Dijksterhuis and colleagues looked at how consumers made a number of purchase decisions and what they thought later about how good those decisions were. The researchers concluded:
Contrary to conventional wisdom, it is not always advantageous to engage in thorough conscious deliberation before choosing. On the basis of recent insights into the characteristics of conscious and unconscious thought, we tested the hypothesis that simple choices (such as between different towels or different sets of oven mitts) indeed produce better results after conscious thought, but that choices in complex matters (such as between different houses or different cars) should be left to unconscious thought. Named the “deliberation-without-attention” hypothesis, it was confirmed in four studies on consumer choice, both in the laboratory as well as among actual shoppers, that purchases of complex products were viewed more favorably when decisions had been made in the absence of attentive deliberation.23
Admittedly, in their study Dijksterhuis and his colleagues couldn’t fully equate “post choice satisfaction” with objectively “advantageous choices.” Their focus was not on that but on the mechanics of how decisions are made. They talk about “thought or deliberation in the absence of conscious attention directed at the problem.” Unconscious choosers made better choices than did conscious thinkers or immediate choosers. In fact, the authors concluded that conscious thought, though precise, “has a low capacity.” What’s going on? Is there another channel that has more capacity? Klein thinks so. He thinks he sees people making complex decisions by analogy, pattern match, and emotion. The key question is how deciders’ brains communicate the analogies and pattern matches they find.
Science writer Gretchen Vogel says, “Psychologists have long known that when people make decisions, whether it’s choosing whom to marry or which breakfast cereal to buy, they draw on more than just rational thought.”24 As cited by Vogel in her article, the psychologist Stephen Kosslyn of Harvard University says, “Emotion apparently is not something that necessarily clouds reasoning, but rather seems to provide an essential foundation for at least some kinds of reasoning.”25 Vogel reported on an important study done at the University of Iowa by a group of psychologists studying patients with damage to specific centers of the brain and how those patients fared in decision-focused experiments. Quoting the Harvard psychologist and author Howard Gardner, she further noted that “the new work ‘fits in with an impressive heap of individual studies showing that people rely on a variety of emotional cues—ranging from a general sense of déjà vu to specific feelings like fear—when making decisions.’”26
Listening to emotions is critical in making decisions, “recognition-primed” or otherwise. If you really pay attention to how you make decisions, we think you eventually will concede that usually it’s an experience of emotion, often several, and that the reasons for the decision often follow rather than precede the moment of choice. Here’s the study Vogel was writing about. The details are fascinating and help illustrate the complex process going on in our heads when we are making decisions.
A group of neuroscientists at the University of Iowa College of Medicine led by Antoine Bechara have a registry of over 2,000 brain-damaged patients, among whom is a valuable group of people with lesions in the ventromedial prefrontal cortex, an area of the brain above the eyes that has been linked with emotion. Patients with damage to this area display little to no emotion. They score well on IQ and memory tests but are terrible decision makers. Notes Vogel: “Some drift in and out of marriages; others squander money or often offend co-workers inadvertently … when faced with real-life decisions, they at first waffle, then make unwise choices.”27
Bechara and Hanna Damasio state in the 1997 paper that Vogel referenced, “Deciding advantageously in a complex situation is thought to require overt reasoning on declarative knowledge, namely, on facts pertaining to premises, options for action, and outcomes of actions that embody the pertinent previous experience.” Sound familiar? “An alternative possibility was investigated: that overt reasoning is preceded by a nonconscious biasing step that uses neural systems other than those that support declarative knowledge” (emphasis added).28
Bechara and Damasio tested this idea by comparing 10 “normal” control subjects and 6 patients with bilateral damage to the ventromedial sector of the prefrontal cortex. They were given a gambling task with rewards and penalties designed to simulate real-life decision making under uncertainty. They played for money, choosing to turn over cards from four decks. Turning each card carried an immediate reward ($100 in two decks, A and B, and $50 in the other two decks, C and D). Unpredictably, turning over some cards also carried a penalty, large in A and B and small in C and D. Playing mostly from the disadvantageous decks A and B led to an overall loss. Playing from the advantageous decks C and D led to an overall gain. The players had no way of predicting when a penalty would arise in a deck, no way to calculate with precision the net gain or loss from each deck, and no knowledge of how many cards they had to turn to end the game. All subjects were wired to measure skin conductance responses (SCRs), a rapid, direct physiological indicator of emotional arousal.
After experiencing a few losses, normal participants began to generate SCRs before selecting a card from the bad decks, indicating an emotional response even though they were not consciously aware of it. They also began avoiding the decks with large losses. Patients with bilateral damage to the ventromedial prefrontal cortex did neither. At the twentieth card, none of the subjects—normal subjects and patients—indicated that they had a clue about what was going on. By about card 50, all 10 of the normal participants began to express a “hunch” that decks A and B were riskier and all generated anticipatory SCRs whenever they pondered a choice from those decks. None of the patients generated anticipatory SCRs or expressed a “hunch.”
By card 80, 7 of the 10 normal participants expressed accurate knowledge about why, in the long run, decks A and B were bad and decks C and D were good. Even the three normal participants who did not reach the conceptual period still made advantageous choices. Three of the six patients eventually could describe which were the bad and good decks correctly, but they still chose disadvantageously. None of the six patients ever generated anticipatory SCRs; they never activated emotions. Write Bechara and Damasio:
Thus, despite an accurate account of the task and of the correct strategy, these patients failed to generate autonomic responses and continued to select cards from the bad decks. The patients failed to act according to their correct conceptual knowledge. … We suspect that the autonomic responses we detected are evidence for a complex process of nonconscious signaling, which reflects access to records of previous individual experience—specifically, of records shaped by reward, punishment, and the emotional state that attends them.29
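The deck economics of the experiment can be made concrete with a small simulation. The penalty probability and amounts below are illustrative assumptions chosen so that decks A and B lose on average while C and D win, as the text describes; they are not the study’s exact schedule:

```python
import random

# Hypothetical payoff schedule in the spirit of the task described above;
# the 10% penalty chance and the penalty sizes are assumptions, not the study's values.
DECKS = {
    "A": {"reward": 100, "penalty": 1250, "p_penalty": 0.10},  # disadvantageous
    "B": {"reward": 100, "penalty": 1250, "p_penalty": 0.10},  # disadvantageous
    "C": {"reward": 50,  "penalty": 250,  "p_penalty": 0.10},  # advantageous
    "D": {"reward": 50,  "penalty": 250,  "p_penalty": 0.10},  # advantageous
}

def expected_net_per_card(deck):
    """Average win/loss per card: immediate reward minus the expected penalty."""
    d = DECKS[deck]
    return d["reward"] - d["penalty"] * d["p_penalty"]

def draw(deck, rng):
    """One card: a sure reward, minus an unpredictable penalty."""
    d = DECKS[deck]
    penalty = d["penalty"] if rng.random() < d["p_penalty"] else 0
    return d["reward"] - penalty
```

With these numbers the “bad” decks pay more per card up front but lose $25 per card on average, while the “good” decks gain $25 on average. That hidden structure is what the normal subjects’ hunches, and their anticipatory SCRs, picked up long before they could articulate it.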
In a subsequent 1999 paper the Bechara group went further, reporting on the results of a study comparing decision making in patients with damage to either of two major centers of emotion in the brain: the amygdala and the ventromedial prefrontal cortex. They found that damage to each of the centers critically impairs decision making and, even more interesting, does so in different ways.30
What is the takeaway? Emotions play a critical role in decision making, something rational decision-making strategies ignore at their peril. It’s your peril too if you follow them. Consideration, deliberation, and doing lots of homework and analysis are valuable in framing a decision—within the time allowed, of course. But in the end your emotions will decide, and the only task your rational brain will fulfill is to rationalize to yourself why you decided what you did. Emotions decide; reason explains. Therefore, the more tuned you can be to your emotions, the more effective you will be at getting all the centers of your brain working together. Remember, even though the patients with impaired emotions figured out the right strategy to play the gambling game, it didn’t affect their decisions.
Gladwell says that quick, intuition-guided decisions can be good ones, sometimes better ones. The acuity of emotions in analogizing from relevant factors among complex patterns appears to be a major asset in deciding effectively. This may explain why seasoning and experience seem to be so highly valued in decision makers. Gladwell offers some important lessons in decision making:
Successful decision making relies on a balance between deliberate and instinctive thinking.31
Try to screen out sources of first impressions not relevant to the question.32
Frugality matters—reduce a problem to its simplest elements. Information overload can paralyze decision making.33
Sometimes you have to consciously resist a particular kind of snap judgment: In some circumstances, it’s better to slow things down to get more information. At other times this can only cloud the issue.34
Analytic and intuitive decision making can both be effective, but only if used in the appropriate circumstances.35
How do you know when intuitive decisions are appropriate? You don’t. That’s just one more intuitive decision, at best an educated guess based on a library of learned experiences.36
It isn’t that they can’t see the solution. It’s that they can’t see the problem.
—G. K. Chesterton
What do you really know? Most knowing is not disinterested knowing. Perhaps your greatest enemy as a decision maker, after ignoring the central role of emotions, is failing to take into account the biases and preferences that shape your sense of what you think you know. The best thinking sorts through this maze of subjectivity and self-interest.
In almost any setting involving responsibility, the quality of decisions is no better than the knowledge and judgment that support those decisions. Critical to good decision making is honest discipline in determining what “knowledge” is valid and relevant. What is the context? What issue needs deciding? This is another way of saying that before you let your emotions loose on a question, you need to be sure you are thinking about the right question. This is epistemology in decision making: How do you know what you know? How much do you really know? What is the degree of validity or the level of confidence that your facts are complete and accurate? For this, you need to steel yourself to be a bit of a skeptic about knowledge. Pause, and before you think about deciding, think through some careful epistemology.
In philosophy, epistemology means being deliberate and disciplined in determining what we think we know. This is not to say that entrepreneurs need to become philosophers about the nature of knowledge or beat a question to death. But deciding on the basis of poor or incorrect knowledge is bad business. Defining the right question and getting the facts right the first time—as well as you can, anyway—is the beginning of good business. In our entrepreneurial world of messy, incomplete, and incoherent information, this is not always as easy or obvious as it seems. More often than we might like, it turns out that many things we think we know are either unknowable or just not so.
It ain’t so much the things we don’t know that get us into trouble. It’s the things we know that just ain’t so.
—Artemus Ward
There are some consistent glitches in our thought processes that seem to be hardwired in all of us, and they lead to predictable epistemology errors. Decision psychologists talk about things such as the following:
Priming or anchoring effect. Being influenced by recently absorbed information, especially in a guess or estimate. For example, studies show that when people are asked to make estimates, they are predictably influenced by a recently heard number even if that number has no relation to the issue at hand.
Selective perception bias. Internalizing or screening out information because of prior experience, attitudes, beliefs, preferences, or habits. Selective perception bias makes pure objectivity almost impossible. We confuse the reliability of a source of information with its predictive ability, for example, using trade statistics to forecast the stock market because they are hard numbers, not because we know that they have a connection to stock prices.
Confirmation bias. Giving disproportionate weight to what supports our prevailing views and ignoring information we don’t want to hear.
Representation bias. Assessing the likelihood of an event on the basis of how closely it resembles a known event or events. Web site founders who pitch their marketing plans on the basis of the success of Facebook’s launch are falling for a representation bias.
Framing bias. Letting definitions or the framing of a problem affect what we attend to and what we filter out.
Availability bias. Remembering and attending to information that is most recent, vivid, or frequently heard. We put too much confidence in memory; we remember events better than statistics. For example, after the 9/11 terrorist attacks, many people avoided airplanes and drove instead, even though the odds of dying in a traffic accident (1 in 6,000) are astronomically higher than those of dying in a terrorist attack on an airliner (estimated at 1 in 135,000).37 A Dutch decision researcher estimated that over 1,500 people died as a result of this increase in driving, more than died on all the 9/11 airliners put together.38 Related is the Von Restorff effect: selectively remembering the striking or unusual.
Affect heuristic. Biasing perceptions of risk based on perceived benefits.
The winners’ effect. Attributing genius to winners and stupidity to losers even when the quality of their thinking about a problem is the same. Every day some people come out of Las Vegas casinos big winners, but that doesn’t mean they have a system.
Limited search effect. Perhaps the most tragic of all, oversimplifying a complex or challenging situation by limiting the search for more information or alternatives. We indulge in what psychologists call bounded rationality, constructing models that are based on essential features and ignoring subtleties or richness of information. An example would be a college graduate who takes the first job to come along instead of planning a long-term career path.
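The odds quoted in the availability-bias example above can be turned into a quick relative-risk calculation, taking the chapter’s figures at face value:

```python
# Relative risk implied by the odds cited in the availability-bias example.
p_traffic = 1 / 6_000     # quoted odds of dying in a traffic accident
p_airline = 1 / 135_000   # quoted odds of dying in a terrorist attack on an airliner
relative_risk = p_traffic / p_airline
print(relative_risk)  # driving is 22.5x riskier by these figures
```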
Thomas Gilovich, a professor of psychology at Cornell University, has spent a career studying these kinds of things, many of which he summarizes in a book every entrepreneur ought to read: How We Know What Isn’t So.39 Among the more annoying of these practices are the following:
Seeing regularity and order where only the vagaries of chance are operating. He calls this a “Clustering Illusion,” and it consists of failing to appreciate the nature of randomness or misunderstanding statistical regression to the mean. For example, people talk about the so-called hot hand in sports. Random strings of data are not uniform—clusters are normal. If you flip a coin 100 times, you will get close to 50 heads and 50 tails, but runs of several consecutive heads or tails will be common.
Failing to detect and correct for biases that result from incomplete or unrepresentative data.
Interpreting ambiguous and inconsistent data in light of preferences, pet theories, or a priori expectations.
Believing things because of wishful thinking and self-serving distortions of reality.
Relying on secondhand information and distortions introduced by communicators such as the media and celebrities.
Thinking that others believe what we believe and are more like us than they really are.
Failing to balance the import of outcomes: “the tendency to be more impressed by what has happened than by what has failed to happen, and the temptation to draw conclusions from what has occurred under present circumstances without comparing it to what would have occurred under alternate circumstances.”40
Failing to distinguish correlation from cause and effect. The fact that two events correlate does not mean that one causes the other. For something to be said to cause an effect, it has to be both necessary and sufficient. When human behavior is involved, cause and effect can be almost impossible to establish. Attributing an event to cause and effect anyway is little more than wishful thinking or prejudice.
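The clustering illusion in the first item above is easy to demonstrate for yourself. A short simulation of fair coin flips (a sketch of our own, not anything from Gilovich’s book) shows that long runs are the norm in random data:

```python
import random

def longest_run(flips):
    """Length of the longest streak of identical consecutive outcomes."""
    if not flips:
        return 0
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

# In most sequences of 100 fair flips, a run of five or more identical
# outcomes appears; a "cluster" that looks meaningful but is pure chance.
rng = random.Random()
flips = [rng.choice("HT") for _ in range(100)]
print(flips.count("H"), longest_run(flips))
```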
Sloppy thinking about cause and effect is one of the great decision bloopers. Incorrectly learning from hindsight is another. Every decision maker should read the article “The Silly Certainty of Hindsight” by the psychologist Baruch Fischhoff.41 That silly certainty dooms us to learning the wrong lessons from our past. Fischhoff found that hindsight gives a false sense of inevitability. When we know an outcome, we tend to believe that others, with foresight, should have recognized things we know only from hindsight. He cites an example: In 1974 in Eugene, Oregon, Cletus Bowles, a convicted murderer and bank robber, left the Oregon penitentiary on a four-hour pass. While out, he fled, then kidnapped and murdered two people. There was an outcry over incompetence and failure to foresee the danger. The prison warden was pressured to resign. “Bowles’ record, in the prison and out, shows he could not be trusted,” ran one newspaper editorial. However, it turns out that Bowles was a model prisoner before he left the penitentiary.
Hindsight affects the way people judge results. We rationalize results, retrospectively manipulating information or selectively attending to data. This is so pervasive that we tend to have difficulty remembering accurately how we viewed a situation before knowing the outcome. In experiments, people interpret the same facts differently depending on the reported outcome. In one of Fischhoff’s experiments, groups of subjects were given the background of a colonial struggle in Nepal and asked to judge the probabilities of four possible outcomes. Some of the groups were told in advance that one of the four outcomes actually happened. Those given the outcome judged it twice as likely as did those who were given no outcome, even when the outcome was fictitious and wrong.
Fischhoff says that a human tendency to see events in retrospect as foreseeable gives people a false sense of inevitability, denying the element of surprise in the way things work out: “Without a sense of surprise, we feel little need to reevaluate the implicit assumptions and rules we use to interpret what goes on in the world, and we repeat our mistakes. A surprise-free past is prologue to a surprise-full future” (emphasis added).
It seems almost daunting—messy, incomplete, and incoherent data sloshing around in a sea of emotions and psychological biases, haunted by the certainty of hindsight that is almost certainly wrong. What to do? Here are some suggestions:
Start by reminding yourself constantly about the uncertainty in people’s judgment of the past.
Hunt down objective records of the past as it was originally experienced. Keep written records. Take notes and date them, especially in regard to predictions and the considerations that guided them.
For past events with a known outcome, try to imagine alternative futures for them. Think about what could have gone wrong.
Disguise an actual event and present the circumstances to other experts for their opinions.
Ask yourself whether a hypothesis of random outcomes is equally likely or even more likely. By giving up illusory control and concentrating on the nonrandom aspects of a situation, you can often exercise greater control.
Start by asking yourself, What are the key uncertainties? and To what level of detail does the question have to be answered?
Have at least a working familiarity with the basic principles of statistics and probability. Exposure to the “probabilistic” sciences may be more effective than experience with the “deterministic” sciences in evaluating the messy statistical phenomena of everyday life.
Never confuse correlation with cause and effect. Be sure you see necessity and sufficiency before assuming cause and effect. Accept the fact that there are many things we can’t know even when compelling correlations make us think we do.
Groups trump individuals. Find ways to access the power of groups. Assemble them from your team. Use your boards and advisor networks.
Don’t underestimate the value of mistakes. Oscar Wilde said, “Experience is the name everyone gives to their mistakes.” Mistakes are often the best teachers. In fact, most businesspeople will tell you they learned far more from their mistakes than from their successes. However, you can benefit only if you realize you made a mistake and examine it carefully to learn from it, all the while recognizing the distortions of the silly certainty of hindsight.
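The warning above about confusing correlation with cause and effect is easy to demonstrate for yourself. The sketch below is purely illustrative (the function names are invented for this example): it generates pairs of completely independent random walks and counts how often two series that have nothing to do with each other nonetheless show a strong correlation.

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def random_walk(steps):
    """A series built purely from accumulated random noise."""
    pos, path = 0.0, []
    for _ in range(steps):
        pos += random.gauss(0, 1)
        path.append(pos)
    return path

random.seed(7)  # fixed seed so the run is repeatable

# Count how many independent pairs look "strongly correlated" anyway.
trials = 200
high = sum(abs(pearson(random_walk(100), random_walk(100))) > 0.5
           for _ in range(trials))
print(f"{high} of {trials} independent pairs showed |r| > 0.5")
```

Because each walk drifts, two unrelated series often track each other for long stretches, so a striking correlation coefficient can arise where there is plainly no causation at all. This is the necessity-and-sufficiency test in miniature: correlation by itself establishes neither.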
There is one last point about decision making that is often underemphasized: When do you need to decide? Just because a decision has presented itself does not mean it has to be decided on the spot. Indecision can be paralyzing, but deciding peremptorily can be damaging too. Sometimes deferring a decision for lack of information or conviction can be a good idea—if there’s no time pressure and more data will improve the decision rather than complicate or impair it. This happens more often than we think. Decide or delay—it’s a tension. Deferring a decision can be wise or tantamount to choosing one alternative by default; no decision can end up being a “no” decision.
You can balance between deciding too quickly and deciding too late—when there is the luxury of time—by developing multiple options to choose from. Having more than one option for the way things can work out can be valuable. All things equal, develop as many options as you can effectively (and cost-effectively) manage and hold them open as long as circumstances don’t force you to choose. In this way, you will be taking advantage of the possibility that additional information will surface that may favor one or more of the options. Think about it this way: Deciding, in the end, is more about eliminating poorer options and keeping the best one than about choosing one alternative. As options demand a take-it-or-leave-it decision, look one more time at all the remaining alternatives; is the one calling for a take-or-drop choice more attractive than the potential of all the remaining options taken together? If so, take it. If not, drop it. Keep working the options until only the one you like the best is left standing. It’s sort of a natural selection evolution of opportunities—the fittest survive. Just as in nature, on the surface the process can seem wasteful, generating more options than you will use. But the cost of too quickly choosing a suboptimal alternative because you didn’t get information that was available or didn’t properly conceive alternatives can be expensive too. More often than not, what really happens is that multiple options suggest combinations and creative solutions that single-channel thinking might not have revealed.
All things considered, in decision making you can’t always consider everything, but you always need to decide eventually. You can get lost in a fog of precision and granularity or just as easily get lost in a fit of intuition and the limited-search effect. You can hip-shoot a decision that quickly appears cavalier and stupid. There are no right answers when it comes to decision making, only opportunities and pitfalls. Most entrepreneurs find it exhilarating to have the challenge of making decisions that matter, day after day, decisions that they have to make or no one will. Thrilling or not, decisions have to be made. When your head, your heart, your experience, and your homework line up, that’s the best you can hope for. Concentrate, be disciplined, and make that happen whenever you can.
Can you describe clearly to someone you know how you approach making decisions?
Ask half a dozen people who work with you or know you well how willing they think you are to make decisions when they’re needed. Do you avoid making decisions? Do you make them too quickly or casually?
What do you think is the proper balance between getting more facts to work the details and making a decision and getting it carried out? (There is no single right answer.)
What do you think is the proper balance between intuition and deliberate, calculated decisions?
1. Vascellaro, Jessica E., and Joann S. Lublin, “Yang’s Exit Doesn’t Fix Yahoo,” Wall Street Journal, November 19, 2008: B2.
2. Sandberg, Jared, “Non-Deciders Make Everyone Else Suffer,” Wall Street Journal, November 8, 2005: B1.
3. http://www.businessdictionary.com/definition/rational-decision-making-approach.html (accessed September 11, 2010).
4. Lipschitz, Raanan, and Orna Strauss, “Coping with Uncertainty: A Naturalistic Decision-Making Analysis,” Organizational Behavior and Human Decision Processes 69, no. 2 (February 1997): 149–163.
5. Klein, Gary, Sources of Power (Cambridge, MA: MIT Press, 1998): 28–30, 286–292.
6. Ibid.: 155.
7. Lipschitz, Raanan, “Decision Making in Three Modes,” Journal for the Theory of Social Behavior 24, no. 1 (1994): 47–65.
8. Simon, Herbert A., “A Behavioral Model of Rational Choice,” Quarterly Journal of Economics, 69, issue 1 (February 1955): 99–118.
9. Zeckhauser, Richard, “Investing in the Unknown and Unknowable,” Capitalism and Society 1, issue 2 (2006), article 5.
10. Eisenhardt, Kathleen M., “Making Fast Strategic Decisions in High-Velocity Environments,” Academy of Management Journal 32, no. 3 (1989): 543–576.
11. Busenitz, Lowell W., and Jay B. Barney, “Differences between Entrepreneurs and Managers in Large Organizations: Biases and Heuristics in Strategic Decision-Making,” Journal of Business Venturing 12, issue 1 (1997): 9–30.
12. Pitz, Gordon F., and Natalie J. Sachs, “Judgment and Decision: Theory and Application,” Annual Review of Psychology 35 (1984): 139–163.
13. Haley, Usha C. V., and Stephen A. Stumpf, “Cognitive Trails in Strategic Decision-Making: Linking Theories of Personalities and Cognitions,” Journal of Management Studies 26, no. 5 (September 1989): 477–497.
14. Gladwell, Malcolm, Web posting, “Challenging the Standard Model of Decision-Making, a Reply to James Surowiecki,” Slate, January 10, 2005, http://slate.msn.com/id/2111894/entry/2112064/ (accessed September 11, 2010).
15. Gottman, John M., James Coan, Sybil Carerre, and Catherine Swanson, “Predicting Marital Happiness and Stability from Newlywed Interactions,” Journal of Marriage and Family 60, no. 1 (February 1998): 5–22.
16. Gladwell, Malcolm, Blink (New York: Little, Brown, 2005): 32, 34.
17. Ibid.: 37–38, 245–254.
18. Ibid.: 39.
19. Ibid.: 71.
20. Klein, 1998.
21. Ibid.: 32.
22. Ibid.: 11.
23. Dijksterhuis, Ap, Maarten W. Bos, Loran F. Nordgren, and Rick B. van Baaren, “On Making the Right Choice: The Deliberation-without-Attention Effect,” Science 311, no. 5763 (February 17, 2006): 1005–1007.
24. Vogel, Gretchen, “Scientists Probe Feelings behind Decision Making,” Science 275, no. 5304 (February 28, 1997): 1269.
25. Ibid.
26. Ibid.
27. Ibid.
28. Bechara, Antoine, and Hanna Damasio, “Deciding Advantageously before Knowing the Advantageous Strategy,” Science 275, no. 5304 (February 28, 1997): 1293.
29. Bechara and Damasio, 1997.
30. Bechara, Antoine, Hanna Damasio, Antonio R. Damasio, and Gregory P. Lee, “Different Contributions of the Human Amygdala and Ventromedial Prefrontal Cortex to Decision-Making,” Journal of Neuroscience 19, no. 13 (July 1, 1999): 5473–5481.
31. Gladwell, 2005: 141.
32. Ibid.: 154–155, 173, 179–183.
33. Ibid.: 141–142.
34. Ibid.: 183.
35. Ibid.: 70–71.
36. Hogarth, Robin M., University of Chicago Graduate School of Business, “Judgment and Choice,” in Educating Intuition (Chicago: University of Chicago Press, 2001).
37. Gardner, Daniel, The Science of Fear (New York: Dutton, 2008): 3.
38. Gigerenzer, Gerd, “Out of the Frying Pan into the Fire: Behavioral Reactions to Terrorist Attacks,” Risk Analysis 26, no. 2 (2006): 347–351.
39. Gilovich, Thomas, How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life (New York: Free Press, 1991).
40. Ibid.: 186.
41. Fischhoff, Baruch, “The Silly Certainty of Hindsight,” Psychology Today 8, no. 11 (August 1975): 71–76.