
Our knowledge can only be finite, while our ignorance must necessarily be infinite.

—Karl Popper

The fundamental cause of trouble in the world today is that the stupid are cocksure while the intelligent are full of doubt.

—Bertrand Russell

So there you have it. Our journey through the minefields of thinking and deciding is just about over. As we've seen, we have a number of cognitive tendencies that lead us to form incorrect beliefs and make erroneous decisions. Of course, it's not all bad. We've done pretty well at surviving on this rotating ball we call home—but we could do much better. Let's take a few minutes to revisit the six major mistakes that often get us into trouble.

We prefer stories to statistics. Since we have evolved as storytelling creatures, our mind naturally gravitates toward stories and away from statistics. As a result, we overemphasize anecdotal information when forming beliefs and making decisions. Our preference for anecdotal data cannot be overestimated. In fact, you may have noticed that I've discussed a number of personal stories in this book. Knowing that we pay more attention to anecdotes, I thought they would be the best way to get the main points across. Of course, the conclusions reached here are backed by rigorous scientific investigation. The problem is, when we rely purely on anecdotal information in our everyday decision making, we typically disregard the statistics that may conflict with the anecdotes. Our failure to rely on the statistics of science leads us to believe in homeopathic remedies, dowsing, facilitated communication, and a host of other weird and/or erroneous claims.

We seek to confirm. In order to make balanced and informed decisions, we should pay attention to both supporting and contradictory information. But we don't. Instead, we emphasize information that confirms our existing beliefs and expectations, and disregard or reinterpret information that contradicts them. In essence, once we develop a preference or expectation, we have an ingrained tendency to interpret new information in a way that supports what we expect or want to believe. As we've seen, this biased evaluation of evidence is a main contributor to holding countless faulty beliefs.

We rarely appreciate the role of chance and coincidence in life. We are cause-seeking animals. From an evolutionary standpoint, this tendency has served us well, because when we discover the cause for something, our knowledge increases, as does our chance of survival. However, our penchant for finding causes is so overpowering that we see associations when none exist—we begin to see causes for things that are random or simply the result of coincidence. We consequently believe that a hot hand can affect the outcome of a basketball game, that an evaluation of past stock prices allows us to predict future prices, and that superstitious behavior can affect our performance.
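The hot-hand intuition can be checked with a small simulation. The sketch below (my own illustration, not from the original text; the function names and parameters are invented) models shooters whose makes and misses are independent coin flips, then counts how often a purely random shooter still produces an impressive-looking streak:

```python
import random

def longest_streak(shots):
    """Length of the longest run of consecutive makes (True values)."""
    best = run = 0
    for made in shots:
        run = run + 1 if made else 0
        best = max(best, run)
    return best

def simulate(num_players=1000, shots_per_player=20, p_make=0.5, seed=42):
    """Simulate players whose shots are independent 50/50 flips and
    report the average longest streak, plus the fraction of players
    who show a streak of four or more makes."""
    rng = random.Random(seed)
    streaks = [
        longest_streak(rng.random() < p_make for _ in range(shots_per_player))
        for _ in range(num_players)
    ]
    avg = sum(streaks) / len(streaks)
    frac_hot = sum(s >= 4 for s in streaks) / len(streaks)
    return avg, frac_hot

# Nearly half of these purely random shooters produce a streak of
# four or more makes in just 20 shots -- streaks that look "hot"
# even though every shot was an independent coin flip.
```

Nothing in the simulated data involves any real "heat," yet long runs appear routinely; that is exactly the kind of pattern our cause-seeking minds latch onto.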

We can misperceive our world. We like to think that we perceive the world as it actually is, but our senses can be deceived. We can see and hear things that don't really exist. A considerable amount of research indicates that our perceptions are greatly influenced by what we expect to see and what we want to see. And so, our biases can result in hallucinations—if we believe in ghosts or aliens, we're more likely to see them. Misperceiving the world is one of the main reasons why anecdotal data can lead us astray.

We oversimplify. Since we lead very complex lives, we're constantly on the lookout for ways to simplify things. This also happens in our decision making. We use a number of simplifying heuristics when we make decisions, and while those heuristics often serve us well, they can also lead to serious errors. When we base our decisions on similarity assessments, for example, we ignore other relevant information, like the impact of base rates, sample size, and regression to the mean. When we rely on what comes easily to mind, we overestimate the likelihood of sensational events. As a result, our beliefs and decisions can be greatly influenced by unreliable information, and insufficiently influenced by relevant and reliable data.
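Base-rate neglect, mentioned above, is easy to make concrete with Bayes' rule. The following sketch (my own illustrative numbers, not the author's) works through a standard diagnostic-test example: even a test that is 90 percent accurate in both directions gives a surprisingly weak positive result when the condition is rare.

```python
def posterior(base_rate, sensitivity, specificity):
    """P(condition | positive test) computed via Bayes' rule.

    base_rate   -- prior probability of having the condition
    sensitivity -- P(positive test | condition)
    specificity -- P(negative test | no condition)
    """
    true_positives = base_rate * sensitivity
    false_positives = (1 - base_rate) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# A condition with a 1% base rate and a test that is 90% accurate
# both ways: a positive result still means only about an 8% chance
# of actually having the condition.
p = posterior(base_rate=0.01, sensitivity=0.9, specificity=0.9)
```

Judging only by the similarity between "positive test" and "has the condition," most people guess something near 90 percent; the base rate drags the true answer down to roughly 8 percent.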

We have faulty memories. Although we often complain about our forgetfulness, we tend to think that the things we manage to remember are recalled quite accurately, especially if we have confidence in the memory. But research indicates that our memories can be very wrong, even when we're very confident. This even occurs in regard to sensational and tragic events. How did you hear about the World Trade Center disaster? Your answer may be quite different if you're asked the question three years, rather than three days, after the tragedy occurred. Current beliefs, expectations, and even suggestive questioning can affect our memories. In effect, we may reconstruct our memories, and with each successive reconstruction, memories can get further and further from the truth. Given that much of the information we use in our thinking and deciding is retrieved from memory, those faulty memories can have a major impact on our forming erroneous beliefs and decisions.

Of course, we've talked about a number of other pitfalls in our thinking, but the six listed above are the main categories. As I've tried to stress, you shouldn't feel bad if you make these mistakes—everybody I know makes them. Why? Most of the problems are the result of our evolutionary development or our desire—and need—to simplify our thinking. We can't pay attention to all the information that barrages us every day. Fortunately, our simplifying strategies work well in many cases—they give us decisions that are good enough. The problem is, we start to rely on them when we shouldn't, producing grossly inaccurate beliefs and decisions that can end in disaster.

One other point must be kept in mind. Knowledge of these pitfalls is the first step to improving our beliefs and decisions. But that knowledge doesn't ensure that our decisions will yield the best possible outcomes. As we've seen, chance has an important influence on our lives, so even if we follow the best possible decision strategy, the outcomes of our decisions can still go horribly wrong. To see what I mean, consider the current interest in high-stakes poker, played almost every night on ESPN, Bravo, and the Travel Channel. On a recent show, the announcer evaluated the hands of two players, Mark and Steve, and said, “At this point, Mark is a 90 percent favorite to win the hand.” How did he know? Mark had a strong winning hand at the time, and Steve's only chance to beat him was to draw to an inside straight, a very unlikely event. Accordingly, Mark bet big. Steve decided to stay in and, amazingly, filled the straight to win the hand. Was Mark's decision to bet big a bad one because he lost the hand? Not at all. Given the information at the time, his decision was right, even though the outcome was bad. So it is with many decisions in life. When judging if someone is a good decision maker, we have to judge the quality of his decision process (how did he go about making the decision?), not the quality of the decision outcome.
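The announcer's "90 percent favorite" figure can be checked with a little card-counting arithmetic. The sketch below is my own illustration (the helper name and the assumption of four outs among forty-six unseen cards, with one card to come, are mine, not the book's): drawing to an inside straight means exactly four cards in the deck complete the hand.

```python
from math import comb

def hit_probability(outs, unseen, cards_to_come):
    """Chance of hitting at least one 'out' among the cards still to come.

    Computed as 1 minus the probability that every remaining card
    misses: C(unseen - outs, cards_to_come) / C(unseen, cards_to_come).
    """
    miss_all = comb(unseen - outs, cards_to_come) / comb(unseen, cards_to_come)
    return 1 - miss_all

# Inside straight with one card to come: 4 outs among 46 unseen cards.
p_one_card = hit_probability(outs=4, unseen=46, cards_to_come=1)   # 4/46 ≈ 0.087
```

So Steve had roughly a 9 percent chance of filling his straight, which matches the announcer calling Mark a 90 percent favorite—and which is exactly why Mark's big bet was the right decision despite the bad outcome.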

I've tried to emphasize that the best way to improve our thinking and deciding is to take a skeptical and critical approach. Unfortunately, we are quick to believe things on the basis of incomplete or inappropriate evidence—critical thinking does not come naturally to us. As psychologist Alfred Mander stated back in 1947, “Thinking is skilled work. It is not true that we are naturally endowed with the ability to think clearly and logically—without learning how, or without practicing. . . . People with untrained minds should no more expect to think clearly and logically than people who have never learned and never practiced can expect to find themselves good carpenters, golfers, bridge-players, or pianists.”1

One thing, above all, must be kept in mind. We humans are believing creatures—we want to believe things. But as Theodore Schick and Lewis Vaughn noted, if we have a good reason to question a belief, we can't accept it as true. Wanting something to be true will not make it true, no matter how hard we try. The best we can do is proportion the extent of our belief to the extent of the evidence for that belief. And if the evidence doesn't strongly support a belief, a leap of faith will never help us know that the belief is true.2 Amazingly, one of the paradoxes of human nature is that we hold some of our strongest beliefs in areas that we know the least about.

We want to believe things because we want certainty in life. But life can be very complex and unpredictable. While we might find it more comfortable to be certain in our beliefs—to think in terms of black and white—we must learn to accept how much we don't know. Sometimes we have to live with the various shades of gray in our knowledge. This is particularly significant because erroneous beliefs can cause more problems than not believing at all. As psychologist Tom Gilovich said, “Sometimes it's not the things we don't know that get us into trouble; it's the things we know that just ain't so.”3 We therefore have to be stingy with our beliefs—to withhold belief in something until compelling evidence exists in its support. While this may go against our deeply ingrained predispositions, it is, without a doubt, one of the most important things we can do. On a personal level, and as a society, we will benefit from this skeptical stance, and make more informed judgments and decisions.