WHY DO SMART PEOPLE MAKE STUPID MISTAKES?
How our brains continue to protect us against threats that no longer exist
Close your eyes and imagine that the entire history of the universe, from the Big Bang to now, occurred in 24 hours. It started at midnight, but our precious sun rose only at 3:43 PM. Bacteria, the first sign of life, appeared at 6:40 PM. Insects showed up five hours later, and an asteroid hit the earth with a huge blast at six minutes to midnight, obliterating the dinosaurs. Our ape ancestors came down from the trees only at 31 seconds to midnight, and we Homo sapiens appeared at less than a second to midnight. All of human history as we know it happened in the last three hundredths of that second.
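For the curious, the arithmetic behind this compression is a simple rescaling. The sketch below maps "years ago" onto the 24-hour clock, assuming an age of roughly 13.8 billion years for the universe; the event ages are round-number estimates, so the clock times come out close to, but not exactly, the figures above.

```python
# Sketch of the "universe in 24 hours" rescaling.
# Assumption: the universe is ~13.8 billion years old; the event ages below are
# round-number estimates, so the resulting clock times are approximate.
AGE_OF_UNIVERSE_YEARS = 13.8e9
SECONDS_PER_DAY = 24 * 60 * 60

def clock_time(years_ago):
    """Map 'this many years ago' onto a time of day on the compressed 24-hour clock."""
    seconds_before_midnight = years_ago / AGE_OF_UNIVERSE_YEARS * SECONDS_PER_DAY
    elapsed = SECONDS_PER_DAY - seconds_before_midnight
    hours, rest = divmod(int(elapsed), 3600)
    minutes, seconds = divmod(rest, 60)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}"

events = {
    "Sun forms": 4.6e9,
    "First bacteria": 3.8e9,
    "Dinosaurs wiped out": 66e6,
    "Homo sapiens appears": 0.3e6,
    "Recorded history begins": 5e3,
}
for name, years_ago in events.items():
    print(f"{name:24s} {clock_time(years_ago)}")
```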
In that minuscule time span, in galactic terms, we have invented medieval romance, the atom bomb, careers and nationalism (the latter two only about 150 years old), and the wondrous phenomenon of reality shows. Our daily concerns are nothing more than an evolutionary blink of the eye. This raises the question: if our brain spent far more of its evolutionary history among insects and monkeys than among other humans, shouldn’t that influence the way we respond to modern-day challenges? Indeed, it appears that human development in recent centuries has been too rapid for the slow process of evolution, and various parts of our brains have still not adapted to the requirements of modern life.
Our similarity to other species indicates our limitations: we share 98.76% of our genetic makeup with chimpanzees, the species most like us. It is hardly surprising, therefore, that when facing a challenge, we are most likely to use a skill that we share with chimpanzees and lesser species, without even knowing we’re doing it. Our brain, like theirs, is a sophisticated machine constantly on the lookout for threats. In its futile search for existential threats long gone from our world, the human brain enlists its primitive ability to detect patterns: it is a law of nature that survival depends on early detection of threat patterns.
For our brain, it is better to pay the price of 99 false alarms than to miss one real threat. The result is an apparently wasteful mechanism that does not “punish” false alarms, or the resources squandered on warding off every potential threat it detects. In financial investment, by comparison, falsely detecting patterns or trends is penalized almost immediately.
Much attention has been paid recently to the two systems that amicably share our brains: the emotional system (System 1) and the rational system (System 2). The emotional system is what we perceive as our “gut feeling”; its quick, automatic responses are the brain’s default reactions. System 1 is responsible for our survival and manages the vast quantities of information coming our way through heuristics, or shortcuts, often at the expense of sound judgment. But when your life is on the line, speedy decisions are paramount. When we speak in our mother tongue or tell the truth, it is the emotional system in action. The same system makes us pay closer attention to changes in a speaker’s tone than to what they are saying. System 2 makes more complicated evaluations and subsequently approves or overrides decisions made by System 1. However, when System 2 is distracted or too slow (the emotional system is twice as fast), we wrongly assume we’re being rational when we’re actually responding through our highly bias-prone emotional system.
The major drawback of the emotional system is its utter blindness to probability. As a result, we allow random events to influence how we think. Einstein thought that “coincidence is God’s way of remaining anonymous.” In a similar vein, Nobel-winning physicist Wolfgang Pauli suggested that coincidences are visible traces of invisible principles. When my wife and I ran into a long-lost childhood friend of hers in London, it took an effort to remind ourselves that in a city of ten million residents, a one-in-a-million occurrence happens ten times a day. When it comes to coincidence, we tend to reject statistics and succumb to the temptation of lending our lives a semblance of meaning, which in turn creates a sense of control. Ignoring the role that sheer randomness plays in our lives makes us attribute success to talent and put failure down to bad luck.
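A back-of-the-envelope check of that London arithmetic: if each of ten million residents independently has a one-in-a-million chance of some specific coincidence on any given day, roughly ten such coincidences are expected in the city every single day. The simulation below is a minimal sketch of that calculation; the population and probability are simply the figures from the anecdote.

```python
import numpy as np

# Ten million residents, each with a one-in-a-million chance per day of some
# specific "amazing" coincidence. Expected coincidences per day: 10.
rng = np.random.default_rng(0)
population = 10_000_000
p_coincidence = 1e-6
days = 365

# Total number of coincidences in the city on each day of a simulated year.
daily_hits = rng.binomial(population, p_coincidence, size=days)

print("expected per day:", population * p_coincidence)              # 10.0
print("simulated average:", daily_hits.mean())                       # about 10
print("share of days with at least one:", (daily_hits > 0).mean())   # essentially 1.0
```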
It is much more boring to acknowledge statistical truths like “regression toward the mean”: a phenomenon in which a variable that is extreme on its first measurement tends to be closer to the average on its second. The tendency was first noticed by Francis Galton in the late 19th century, when he observed that extreme characteristics (such as height) in parents are not fully passed on to their offspring. Children of tall parents tend to be tall, but not as tall as their parents; children of short parents tend to be shorter than most, but probably taller than their parents. The popular magazine Sports Illustrated had a real problem on its hands when top athletes refused to appear on its cover, because of a rumor that doing so would lead to poor performance or injury in the following weeks. The editors investigated what they thought was an urban legend and discovered, much to their dismay, that it was true: in 913 of the 2,456 issues checked, athletes’ performance dropped after they appeared on the cover. Regression toward the mean explains this perfectly: athletes grace the cover after outstanding achievements. After that, barring some immediate leap in performance-enhancing technology, they naturally return to their average scores, which are not the ones that garner headlines.
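The Sports Illustrated “jinx” is easy to reproduce with nothing but noise. The sketch below is a toy model, not the magazine’s data: every athlete has a fixed true ability plus random luck each season, the top one percent of one season’s results stand in for the cover athletes, and their scores the following season drift back toward their own average with no curse involved.

```python
import numpy as np

# Toy model of regression toward the mean; hypothetical numbers, not real athletes.
rng = np.random.default_rng(1)
n_athletes = 10_000
true_ability = rng.normal(loc=100, scale=10, size=n_athletes)

# Observed result = fixed ability + season-specific luck.
season_1 = true_ability + rng.normal(scale=15, size=n_athletes)
season_2 = true_ability + rng.normal(scale=15, size=n_athletes)

# "Cover athletes": the top 1% of season 1 results.
cover = season_1 >= np.quantile(season_1, 0.99)

print("cover athletes, season 1:", round(season_1[cover].mean(), 1))   # far above 100
print("cover athletes, season 2:", round(season_2[cover].mean(), 1))   # much closer to 100
print("all athletes, both seasons:", round(season_1.mean(), 1), round(season_2.mean(), 1))
```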
The media loves an unusual story but never follows up on the dreary return to average. All we get is the number of road casualties on a particularly sad weekend, a record high or low of soccer goals, or the performance of Air Force Academy pilots, famously discussed by Daniel Kahneman in his Nobel Prize acceptance speech. If you don’t know about regression toward the mean and tend to seek out meaning and patterns, you’re bound to place too much weight on a single unusual occurrence.
To be sure, nature provides us with a tempting abundance of random patterns. With a large enough sample, we can detect any pattern we want: our brains will connect the dots on their own. Look up at the sky on a starry night and you’ll see whatever you wish: a lion, a scorpion, a dipper, you name it. The British mathematician and philosopher Frank Ramsey, who died at only 26, gave his name to a branch of mathematics that studies how pockets of order inevitably appear inside apparent chaos. Such order can be found even in surprisingly small samples: arrange the numbers 1 through 101 in any order you like, and you will always find 11 of them forming an increasing or a decreasing subsequence (the 11 need not sit next to one another).
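That 101-number claim is a classic combinatorial fact (formally the Erdős-Szekeres theorem, a result very much in the Ramsey spirit described above), and it is easy to check by brute force. The sketch below shuffles the numbers 1 to 101 repeatedly and measures the longest increasing and decreasing subsequences; one of the two always reaches at least 11.

```python
import random
from bisect import bisect_left

def longest_increasing_subsequence(seq):
    """Length of the longest increasing subsequence (entries need not be adjacent)."""
    tails = []  # tails[k] = smallest possible last element of an increasing subsequence of length k + 1
    for x in seq:
        i = bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)
        else:
            tails[i] = x
    return len(tails)

numbers = list(range(1, 102))  # the numbers 1..101
for _ in range(1_000):
    random.shuffle(numbers)
    up = longest_increasing_subsequence(numbers)
    down = longest_increasing_subsequence(numbers[::-1])  # a decreasing run read backwards is increasing
    assert max(up, down) >= 11  # guaranteed for any ordering of 101 distinct numbers

print("every shuffle contained a monotone subsequence of length 11 or more")
```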
Can you see the bad decision just waiting to happen here? On one hand, primal parts of our brain look for patterns to detect threats; on the other hand, nature (and humanity) creates random patterns all the time. The meeting of the two generates that all-too-familiar human weakness: overconfidence. Eighty percent of us believe that we’re better drivers, lovers and parents than average, and that we’ll outlive our peers. Seventy percent of lawyers believe that their case is better founded than their adversary’s. Given that 19% of U.S. citizens believe they’re in the top 1% in terms of wealth, overconfidence is bound to make you attractive at any job interview or party.
Overconfidence has its roots in skills that our ancestors acquired in order to deceive others and, more importantly, themselves. To steal some of the hunt’s loot before it was officially shared out, you had to know how to deceive the others. Nowadays, we use deception mostly to improve our social status. How annoying, then, that our brain doesn’t like it when we lie and keeps giving us away through changes in our physiological responses, our choice of words, and a variety of other tell-tale signs. To get away with deception, we have to fool our own brains first. It’s hard to cope with the failures that life throws at us without developing some feisty repression mechanisms. The path to overconfidence begins there.
Our present-day information overdose cultivates overconfidence: we think we can efficiently process the endless stream of information coming at us from every direction and always separate the wheat from the chaff. Both assumptions, unfortunately, are wrong. In a famous experiment, psychologist Paul Slovic tested the connection between the amount of information people have and their overconfidence. He asked eight horseracing bookmakers what factors they thought determined a horse’s chances of winning a race. The outcome was a list of 88 variables such as the horse’s performance history, the quality of the racecourse, and the jockey’s weight. Next, Slovic gave them data on 40 past races, supplying the variables in the order of importance they had listed. The data came in four stages: first 5 variables, then 10, then 20, and finally all 40. At every stage, he asked the bookies to pick the top five finishers and say how sure they were of their answer. The accuracy of the picks stayed the same no matter how much information the bookmakers had; their confidence, however, rose sharply as the information grew. Why go as far as a racecourse? Google’s search engine works better the fewer words you feed it.
Overconfidence leads us to bad decisions because it makes us act even when we shouldn’t. Investors know that excessive trading is terrible for returns, and a particularly original study found that goalkeepers would do better staying put in the middle of the goal during a penalty kick than diving to either side. Overconfidence also makes us deny sheer chance, gives us a false sense of control and leads us to underestimate other people’s reactions, all classic blunders that engender bad decisions. People who get constant feedback on their decisions, like meteorologists, are less prone to overconfidence. Also, apparently, being depressed can keep you from being overconfident and gives you a better grip on reality than that annoyingly jolly colleague.
We make decisions based on three kinds of information: things we know that we know, things we know that we don’t know, and things we don’t know that we don’t know. It is the last category that overconfidence shrinks: we make better decisions the more we acknowledge what we don’t know. But do we have to be clinically depressed to do that? As previously mentioned, a relatively new branch of psychology offers a more optimistic option termed “intellectual humility” (see page 59, “A Bit of Humble Pie Goes a Long Way”).
Two leading researchers of intellectual humility, Peter Samuelson and Ian Church, published an article in 2014 titled “Known Unknowns or: How we learned to stop worrying about uncertainty and love intellectual humility.” They stress how important it is for policy makers (Donald Rumsfeld, in this case) and the rest of us to acknowledge that some information out there just isn’t known. They define intellectual humility as “holding a belief with the firmness the belief merits. Some beliefs, like the belief that 2+2=4, merit being held with the utmost firmness; to do otherwise—to have serious, lingering doubts as to whether or not 2+2=4—is to be intellectually diffident or intellectually self-deprecating. Other beliefs, like the beliefs regarding the number of angels that can dance on the head of a pin, merit being held with very little firmness; to do otherwise, to be convinced that exactly five angels can dance on the head of a pin—is to be intellectually arrogant.”
The basic questions, however, remain unresolved: can one be aware of being intellectually humble? If you proclaim that you’re intellectually humble, isn’t that a form of hubris? Crime writer Helen Nielsen offered one answer: “Humility is like underwear; essential, but indecent if it shows.”