
You are what you think…geez, that's frightening.

—Lily Tomlin

I saw a ghost. It was the middle of the night, and I had just woken up to get a drink of water. Before getting out of bed, I looked over to my right where Kathy was sleeping, and there, floating about a foot above her, was a glowing vision of an old woman staring back at me. The apparition was lying over Kathy as if she wanted to be near her. She looked to be about ninety, with long white hair and a deeply wrinkled face. There was a definite family resemblance; she looked like she could have been Kathy's great-grandmother or some other distant relative. Of course, this startled the hell out of me. I looked away, shook my head a couple of times, and looked back. The old woman was still there, staring right back at me. Our eyes were locked on one another for what seemed to be an hour, although it was probably just a few moments. Her face was expressionless—no smile, no warmth, just a penetrating stare. Once again, I looked away, shook my head, and then looked back. This time she was gone. A bit unnerved, I got up to get some water and then went back to bed. When we woke the next morning I told Kathy what I had seen. She remembered hearing me get up and leave the room, so we thought it wasn't just a dream. Also, what I had experienced seemed so real. It was as if I could reach out and touch the old woman.

Now, I don't believe in ghosts. While the experience was very compelling, I thought that my mind was just playing tricks on me. However, an amazing thing happened when I began to tell my story to other people. Many of them immediately took the experience as proof that ghosts exist. They often would say, “What else could explain it? Just look at the facts. You were awake; Kathy heard you get up. You saw it, and we don't see things that aren't there. It was floating and it disappeared mysteriously. That's what ghosts do. And, the woman resembled Kathy, so it was probably one of her deceased relatives watching over her.” Clearly, all the evidence leads to one conclusion. I had seen a ghost! It just makes sense. Doesn't it?

It turns out that there is another, more prosaic, explanation for my “ghostly” encounter. Research has documented that we can experience hypnopompic hallucinations, which occur as we're coming out of sleep. Individuals have reported seeing all kinds of visions in this state, including alien creatures, departed relatives, and monsters. There's also a strong sense of being awake during these hallucinations. Since they occur just before waking, Kathy could have heard me get up, because I eventually did wake and leave the room. As for the family resemblance, for all I know I was just projecting what I thought Kathy would look like when she got very old. And so, my experience can easily be explained by what we know from scientific research on human perception (and misperception)—without the need to invoke supernatural phenomena. Nevertheless, those individuals who were predisposed to believe in ghosts immediately interpreted my story as proof that ghosts exist. We are often willing to form very extraordinary beliefs on the basis of very flimsy evidence.

Maybe you don't believe in things as “far-fetched” as ghosts, especially given such evidence. However, most of us hold some beliefs that have very little evidence to support them (and have considerable evidence against them). Do you think that some alternative medicines, like homeopathic remedies, cure disease, or that some people have extrasensory perception (ESP), or that near-death experiences provide evidence of an afterlife? Don't believe everything you think. Why? We often believe things because we want to believe them, not because of the evidence. Even if we don't have a preconceived notion of what we want to believe, we can still believe things that are not true. Do you think that silicone breast implants cause major disease, that low self-esteem is a cause of aggression, or that crime in America is steadily increasing? Many people hold such beliefs, but research indicates that they're not true. As we will see, we form many incorrect beliefs because we have natural tendencies to evaluate evidence in a biased and faulty manner.

The beliefs we hold are closely tied to the decisions we make. In effect, what we believe affects what we decide. My good friend Chris thinks that he can beat the stock market. He believes that, if he puts enough time into learning the ins and outs of the market, he'll earn substantially more on his stock picks than what the average market return will give him. He once told me that a friend of his bought a couple of stocks and later sold them for a profit of over $30,000. Chris reasoned that he could do the same. He even went so far as to talk about paying off his mortgage from his stock proceeds. When I asked about the other stocks his friend bought, Chris was a little more evasive. When pressed, he said that some were losers, but he continued to focus on the winners.

My friend Chris is a very smart person. Many intelligent people believe they can beat the market if they put in the time and energy. This belief is fueled by a number of books written by “experts” who say that, if you use their method, you too can make a killing in the market. The Internet is replete with such claims. However, research has shown that it is very difficult, if not impossible, to analyze a firm's financial condition and consistently pick stocks that outperform the market at a given risk level. Experienced fund managers, who are so-called experts at picking stocks, can't consistently beat the market. In fact, a monkey throwing darts at a Wall Street Journal stock listing can usually pick stocks as well as the experts!1 Of course, you may get lucky and pick a stock that really takes off. By the same token, you could put $20,000 on red at a Vegas roulette wheel and win—but that's a very risky proposition, and you won't be able to win consistently.

A wealth of research demonstrates that it's better to put your money in a general index fund, like the Standard & Poor's 500, rather than bet on just a few stock picks. Still, people want to believe that they can beat the market. Chris's belief led him to invest a significant amount of cash in a few select stocks, which caused him to lose over $20,000 in a period when the overall market rose by about 25 percent. Erroneously held beliefs can be dangerous to your wealth!

WHAT IT'S ALL ABOUT

We humans are amazing creatures. We have the capacity to think creatively and solve complex problems. We've made technological advances that have made our lives easier and more enjoyable. We've built machines that allow us to explore the outer reaches of space and the depths of the oceans. We've made medical advancements that have significantly extended our life spans. We've built sophisticated civilizations. And yet, despite all we have achieved, we still fall prey to flawed thinking.

This book is about how we form our beliefs and make our decisions. More important, it's about the many ways that our beliefs and decisions can go wrong. A belief is essentially a point of view that we hold to be true. We can arrive at our beliefs in different ways. Sometimes, they come from a quick “gut” reaction. In other cases, we spend considerable time and effort thinking about a topic before we form a belief. In addition, variables such as parental predilections; sibling influences; peer pressures; and educational, social, and cultural influences can affect the beliefs we form.2 Irrespective of how we arrive at a belief, if we accept a viewpoint to be true, the belief we hold can have a major impact on the decisions we make.

We obviously make many very good decisions every day of our lives. If not, we wouldn't survive very long. However, we also make many mistakes, and we're often not even aware that we make them. And yet, those mistakes can have significant consequences for our well-being. They can result in spending considerable time and money on things that don't work, and worse, they can lead us to make decisions that negatively affect our health and even our lives.

If we believe in the ability of psychics, fortunetellers, and astrologers, we're more likely to lay down our hard-earned money to find out if Uncle Harry is still mad at us from his grave, or if we should marry the person we just met last night. When the Reagans were in the White House, Nancy Reagan's belief in astrology led her to consult an astrologer when deciding President Reagan's schedule.3 If we believe that an alternative medicine treatment works, we're likely to spend considerable money on the treatment, even if there's little reliable evidence to support it. In fact, a number of people who have avoided traditional medicine could have easily been cured by it. Instead, they embraced alternative healing techniques, and because of this, many died.4

Faulty beliefs and decisions affect not only our everyday personal lives, but also societal decisions that have an impact on us all. Public officials set policy, pass laws, and spend our money. Many of these decisions are based upon faulty beliefs, which can result in allocating billions of our tax dollars to solve a problem that has little impact on society's welfare, while neglecting, or actually causing, more serious problems. As an example, US cities spent around $10 billion in the 1990s to eliminate asbestos from public buildings. While asbestos may be dangerous if inhaled, its presence in most buildings was not a serious health hazard. In fact, its removal is often more dangerous than leaving it in place.5

So why do we fall prey to erroneous thinking? Are we stupid? It certainly doesn't seem so! All of us make the kinds of mistakes in thinking and deciding that are discussed in this book, including highly trained professionals such as doctors, lawyers, and CEOs of major corporations. Instead, two basic reasons come to mind. First, we all have natural tendencies to search for and evaluate evidence in a faulty manner. The reasons for these tendencies range from evolutionary considerations to just wanting to simplify the thinking process. Second, critical thinking and decision-making skills, which could counteract our natural tendencies to err, are typically not taught in our schools. Our education system requires courses in English, history, math, and the sciences, but not in critical thinking and decision making. Yet such courses would develop skills that could have a significant impact on the decisions we face every day of our lives.

Most of the topics discussed here come from two fascinating areas. One concerns the psychology of judgment and decision making, which has uncovered a wealth of information on how we think, and how our thinking can go wrong. The other concerns the difference between science and pseudoscience. Much of what is reported on TV and other media outlets is actually pseudo or junk science, which is not real science, but is passed off as such. You can surf the channels on any night and find so-called scientific investigators reporting on such things as ESP, alien encounters, Bigfoot, and the search for Atlantis. Given the proliferation of pseudoscientific thinking that permeates the media, we are increasingly susceptible to thinking like a pseudoscientist—which contributes profoundly to errors in our beliefs and decisions.

The essence of many of the ideas in this book relates to being a skeptical thinker. The term skeptic has received a bad rap in our society. People often think of a skeptic as someone who's cynical, always looking for the fault in things. However, a skeptic is just a person who wants to see and evaluate the evidence before believing. In its truest sense, a skeptic is someone who keeps an open mind, but requires rigorous investigation before choosing to believe something. It's the quality of our reasons for believing that makes us intelligent and thoughtful individuals, and the more important or extraordinary the belief, the more compelling the evidence should be before we believe it. If that's how you form your beliefs, then you're a skeptic. In essence, a skeptic follows the motto of Missouri, the “show me” state—if you've got a claim, show me the evidence.

A SIX PACK OF PROBLEMS

We make too many wrong mistakes.

—Yogi Berra

When reading any book, the details seem to fall away from memory over time. At this stage of my life, I'm lucky to remember just a few of the main points. So I think it would be a good idea to start off by listing the key points of this book. It is my hope that you'll find the rest of the book full of interesting examples of these main ideas. I've narrowed them down to six—I call them our six pack of problems.

1) We prefer stories to statistics.

We have evolved as storytelling creatures. From our very beginnings, our history and knowledge have been passed from one generation to the next by means of personal stories. In evolutionary terms, it's only recently that we recorded and stored our knowledge in forms that are easily accessible. As a consequence, we have a penchant to pay close attention to information that comes to us in the form of a story or personal account.6

Stories are wonderful. They add enjoyment to our lives, they engage our imagination, they move us. We are social animals, so we're particularly interested in the personal stories of others. As we will see, however, relying on this anecdotal evidence to form our beliefs and decisions can be fraught with errors. Why? It means that we ignore other more relevant information. For example, we shy away from statistics. The mere word can cause otherwise intelligent individuals' eyes to glaze over. At our core, we are storytellers, not statisticians. But statistics often provide us with the best and most reliable information with which to make our decisions. In many cases, unfortunately, our knowledge of even simple statistics is rudimentary. Former president Dwight Eisenhower was appalled to learn that about half of our children had below average intelligence, thinking that something had to be done about such poor performance. But, of course, about half of our children will be below (and half above) average intelligence.7 In other cases, we ignore statistics because they seem abstract and boring. As a result, even if we know the statistics, we let personal stories affect us more.

Consider the following. You're thinking about buying a new car, so you check Consumer Reports to investigate its reliability. The statistics from prior years' models indicate that the car is very reliable. Happy with your research, you go to a party where a friend informs you that he recently bought that very same car. “It's been nothing but trouble!” he exclaims. “It's in the shop every few months. I've replaced the clutch, there were brake problems, and it keeps stalling out on me.” How do you react to this information? For many of us, learning of our friend's plight would make us question our decision and possibly not buy the car. However, it's better to rely on the frequency of repairs, as summarized in Consumer Reports. That data is based upon a large sample of similar cars, while our friend's experience is based on only one car. There's variance in everything—there can be lemons with any model of car. Your friend may have just been unlucky to get one of the few problem cars. The point is, if you listen to your friend, you're basing your decision upon anecdotal evidence that is much less relevant. And yet, most of us have a tendency to give considerable attention to such personal experiences when making our decisions.

2) We seek to confirm.

If you support gun control, do you give more credence to information that supports a ban on guns? If you have a favorite presidential candidate, do you pay more attention to information that's favorable to the candidate? If you believe in the ability of psychics to predict the future, do you remember the few times they were right, and forget the vast majority of times they were wrong? It turns out that this is how we think. We have a natural tendency to use “confirming” decision strategies. That is, we place greater importance on information that supports our existing beliefs and expectations, or what we want to believe, and less importance on information that is contradictory to those beliefs. In effect, we remember the hits and forget the misses.

Our penchant to attend to confirming evidence is so deeply ingrained in our thinking processes that we'll often seek out supporting data even when we don't have a strong belief or expectation. To see what I mean, think of someone you know and try to decide if the person is charitable. More likely than not, you will think of instances when the person exhibited charitable behavior, such as donating money, helping others, etc. You won't think of all the times the person wasn't charitable but could have been. Why is that? We seem to find it easier to think in terms of those instances that support whatever notion we're testing. The problem is, by selectively focusing on supporting information, we ignore contradictory information that may be very relevant to the decisions we make.

3) We rarely appreciate the role of chance and coincidence in life.

Suppose you see an ad in the Wall Street Journal touting the performance of the “Super-growth” mutual fund. “This fund has earned more than the average of all other funds over the past five years!” the ad proclaims. A photo of its big-name manager is also prominently displayed, leading you to believe that the fund's earnings are directly related to the manager's stock-picking prowess. Sounds convincing, but does that performance demonstrate superior knowledge of the stock market? Should you invest in the fund? Before deciding, you have to ask yourself, Could this superior performance be due to just chance? If you flip a coin five times, it will sometimes come up heads five times in a row simply due to chance. As we'll see later, the evidence suggests that the long-term performance of mutual funds is similar to flipping a coin. Thus, the so-called experts typically don't achieve superior returns over the long run. In fact, it may be prudent not to invest in a fund that has outperformed the average recently because it will likely drop in the future due to a phenomenon called regression to the mean.
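To make the role of chance concrete, here is a short illustrative simulation (the numbers are invented for illustration, not real fund data): if each of 1,000 funds effectively flips a coin each year, a fair number will still “beat the average” five years in a row through luck alone.

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

# Illustrative assumption: each fund "beats the average" in a given
# year with probability 1/2, independently -- pure chance, no skill.
n_funds = 1000
n_years = 5

perfect_records = sum(
    1
    for _ in range(n_funds)
    if all(random.random() < 0.5 for _ in range(n_years))
)

# We expect about 1000 / 2**5, i.e., roughly 31 funds, to compile a
# "perfect" five-year record by luck alone.
print(perfect_records)
```

An ad can therefore always find a fund with a sparkling five-year record, even in a world where no manager has any skill at all.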

Why are we so quick to believe that superior knowledge led to the fund's above average performance? We generally don't appreciate the role that chance and coincidence play in our lives. While chance affects many aspects of our world, we don't like to think that things happen by chance. Instead, we want to believe that things happen for a reason. We are causal-seeking animals—we have an ingrained desire to find cause and effect relationships in the world. This desire to look for causes likely arose because of our evolutionary development. Our early ancestors who discovered the causes for things survived and passed on their genes; for example, those who noticed that a spark starts a fire began using fire and were more likely to survive. This preference to seek out causes usually serves us well. The problem is, the tendency is so central to our cognitive makeup and thought processes that we overapply it. We start seeing causes for things that are simply the result of chance occurrences.

4) We can misperceive our world.

We like to think that we perceive the world as it actually is. How many times have we heard someone say, “I know what I saw.” However, our senses can be deceived. Sometimes the problem is selective perception, where we don't see certain things because our focus is elsewhere. In other cases, we can actually see things that aren't there. Remember my ghostly encounter? Well, studies show that a significant number of us have hallucinated at some point in our lives. Of course, problems are likely to arise when we use these inaccurate perceptions in our thinking. Two factors have a particularly important effect on how we perceive the world: our expectations and our desires. That is, our perceptions are greatly influenced by what we expect to see and what we want to see. Consider the following events.

A newsflash just reported that a large and dangerous bear has escaped from the city zoo. What happens? The 911 switchboard lights up. The bear has been spotted in a tree, running across the park, and rummaging in a dumpster down a back alley. People report seeing the bear all over town. But it turns out the bear never wandered more than 100 yards from the zoo.8 Our expectations create perceptions. Suppose we're at a football game when our favorite team is playing its archrival. Chances are we'll notice many more infractions committed by the other team than by our team. Of course, those rooting for the other team will likely see more penalties committed by our side.9 We see what we want to see.

Our faulty perceptions have led to a number of bizarre occurrences throughout human history. Every now and then, a collective delusion occurs that causes mass hysteria in some segment of society. A “monkey man” scare recently occurred in India, where people reported seeing a half monkey–half human creature with razor-sharp fingernails and superhuman strength. Outbreaks of “penis shrinking” panics occur in various parts of Asia, where people perceive their genitalia shriveling up into their bodies.10 And, of course, in this country there are the numerous reports of alien abduction. Clearly, our perception of reality can be unreliable, which should make us wary of beliefs that are based only on our personal experiences, especially if those beliefs are extraordinary.

5) We oversimplify.

Life can be very complex. We often have to juggle many different things just to get through the day. This also happens when we make decisions. Sometimes the amount of information available is overwhelming. In fact, if we paid attention to all of it, we'd spend most of our time just gathering and evaluating information. To avoid this “analysis paralysis,” we use a number of simplifying strategies. For example, we often base our decisions upon information that can easily be brought to mind. If we're deciding whether a sport, like downhill skiing, is risky, we don't conduct an exhaustive search of the ways a person could get hurt skiing, or search out the number of skiing injuries per year. Instead, we often simplify the task by thinking about our friends' skiing experiences or about skiing accidents we heard reported on TV. We may remember, for example, that both Sonny Bono and Michael Kennedy were killed while skiing in the same year and conclude that skiing is a very dangerous sport (even though there are more injuries in many other recreational activities, such as boating and bike riding).

Simplifying strategies can be quite beneficial. They save time and effort, and allow us to make a decision quickly and move on to something else. Fortunately, they often result in reasonably good decisions. While they may not give us the best decision, they often are “good enough.” However, when we use these simplifying strategies we don't pay attention to all the information that's relevant to a decision, which can get us into trouble.

Suppose you go to your doctor and get tested for a debilitating viral disease. The test comes back positive—it says you have the virus! How worried should you be? The doctor tells you, “The test is 100 percent accurate in indicating a person has the virus when they actually have it, but it also says a person has the virus when they don't have it 5 percent of the time.” You also learn that about one in five hundred people have the virus. So, what's the probability you have the virus if the test says you do? Most people say it's around 95 percent. In fact, the correct answer is only 4 percent! As we will see, our use of simplifying strategies results in ignoring very important information, which can lead to grossly inaccurate judgments.
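The arithmetic behind that surprising answer is a direct application of Bayes' theorem. Here is a short sketch using the numbers from the example (a 1-in-500 base rate, 100 percent sensitivity, and a 5 percent false-positive rate):

```python
# Numbers from the example above.
prevalence = 1 / 500          # 1 in 500 people have the virus
sensitivity = 1.0             # the test always flags a true carrier
false_positive_rate = 0.05    # it also wrongly flags 5% of healthy people

# Bayes' theorem: P(virus | positive test)
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))
p_virus_given_positive = sensitivity * prevalence / p_positive

print(f"{p_virus_given_positive:.1%}")  # about 3.9%, i.e., roughly 4 percent
```

The intuition: out of 500 people, the one true carrier tests positive, but so do about 25 of the 499 healthy people, so a positive result usually comes from someone who is healthy.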

6) We have faulty memories.

Imagine that you're having a relaxing evening watching TV when you hear a knock at the door. As you open it, a policeman slaps a pair of handcuffs on you and states, “You're under arrest for sexual assault.” Amazingly, you find out that your daughter has just accused you of sexually molesting her when she was a little girl, some twenty years ago. You can't believe it because you've always had a great relationship with your daughter, and you know that you never abused her. However, to resolve some emotional problems, she recently saw a therapist who thought that her problems could be the result of childhood sexual abuse. After several sessions of hypnosis, your daughter started to remember a number of instances in which you sexually assaulted her. Based on these repressed memories, you're convicted and sent to prison, even with a complete lack of physical evidence for the abuse.

Sounds crazy? You don't think it could happen? Well, it has happened in a number of cases in the United States.11 Why is that? Many of us—including those who testify as witnesses—think that our memory is a permanent record of past experiences. Of course, we know that we can't remember everything, but many of us think that if we use special techniques, like hypnosis, we'll be able to recall previously inaccessible events. In fact, surveys indicate that most Americans hold this view of memory.12 And when we're confident in our memory, we believe that we remember things as they actually occurred.

However, considerable research indicates that our memories can change. We can even create new memories for events that never actually happened! In effect, our memory is not a literal snapshot of events which we later retrieve from our album of past experiences. Instead, memory is constructive. Current beliefs, expectations, environment, and even suggestive questioning can influence our memory of past events. It's more accurate to think of memory as a reconstruction of the past—and with each successive reconstruction, our memories can get further and further from the truth. Memories thus change over time, even when we're confident that they haven't, and those memories can have a significant influence on the beliefs we form and the decisions we make.

SUMMING UP

As you can see, we have a number of tendencies that can lead to faulty beliefs and decisions. Some of them are deeply rooted in our cognitive processes because of our evolutionary development, as in our preference for stories over statistics. Others are there to simplify our complex lives and decision making. Of course, we don't always fall prey to these problems. While we often seek out confirming data, we sometimes pay attention to disconfirming information. In addition, these cognitive characteristics can serve us very well in many instances. If we didn't use simplifying strategies, we would often become overloaded with information, making it difficult to reach any decision. However, these tendencies also cause us a number of problems when we form our beliefs and make our decisions.

One other thing must be kept in mind. Don't feel bad if you find yourself making the kinds of errors in thinking that are discussed in this book. I've made them, my friends have made them, and everyone I've ever known has made them. That's how ingrained they are in our cognitive makeup. Since we're typically not even aware they exist, the first step in making better decisions is to identify the pitfalls in our thinking. So let's take a look at where and why we can go wrong.