Harry Watanabe opened a small gift shop in Omaha, Nebraska, in 1932. His inventory was unique, consisting mostly of trinkets from Japan. His business, which he eventually named Oriental Trading Company, soon expanded into seventeen shops throughout the Midwest. Harry had two children, Terrance and Pam, and in keeping with Japanese tradition, Harry dreamed that one day Terrance would take over for him as head of the family business. In 1977 his dream was realized: Terrance became president of the company. Moving the focus of production to party supplies and favors, Terrance oversaw the rise of a business that would eventually serve 18 million customers, churn out 25,000 products, employ 3,000 workers, and earn $300 million.
You know the sort of trinket we’re talking about: spider rings, rubber bouncy balls, key chains, and those miniature pink muscle men that expand when placed in water. Americans have been rooting through cereal boxes in search of just such prizes for decades. For us, these small plastic delights have been the surprise found in the hollow center of the traditional Italian uova di Pasqua (chocolate Easter eggs) passed around at our own family Easter celebrations for as long as we can remember. Indeed, a browse through the Oriental Trading Company’s online catalog will surely be a walk down memory lane for any child raised in the last half century.
Terrance was incredibly devoted to the success of the company, so much so that his friends and family couldn’t help noting how he was never able to maintain a close romantic relationship. How proud Harry must have been of Terrance, making such sacrifices to devote his life to the family business. How happy Terrance must have been to be the source of such pride. And indeed, how much pleasure Terrance himself must have taken in his own professional and financial success. How surprising it is, then, that in 2000, after shepherding the family business so responsibly for over two decades, Terrance sold the company and proceeded to blow through most of his hard-earned proceeds at Vegas casinos. We’re not talking just a few thousand dollars. Terrance Watanabe lost a mind-blowing $127 million in a single year. Doesn’t it seem quite strange that someone so successful, who had earned a fortune by making intelligent, calculated decisions about costs and benefits, could so foolishly fall prey to the lure of the flashing casino lights?1
You may be tempted to lump Terrance, who clearly had developed a gambling problem, in with all other addicts. And depending upon your views of addiction, you might see him as weak—unable to overcome the temptation of winning on the next hand. You might see compulsive gambling as a character flaw, signaling a type of person who is unreliable, untrustworthy, and certainly not the sort with whom one would want to conduct business. But is the person who loses $127 million at the casino so different from the person who plays the stock market or dabbles in real estate? The difference among these types of individuals, we’ll argue, lies not so much in their character—after all, all three of these activities are high-stakes gambling—as in their sensitivities to risk and reward. As we intend to show in this chapter, our perceptions of risk and probability can change on a dime and are subject to the push and pull of dueling forces in the mind. If you acknowledge this fact, then suddenly the woman pouring her weekly paycheck into the slot machines in Atlantic City might not seem as deviant or flawed as you think.
Consider an entirely different kind of situation involving a risk that most people have personally experienced: flying on airplanes. Surely you’ve heard the statistic that you’re more likely to get into a fatal accident in your car on the way to the airport than to be killed in a plane crash, but for the many people who fear flying, this fact doesn’t always provide much comfort. Though we may know that the probability of a plane crash is low, our intuitions are harder to convince. That’s because when emotions run high, our assessments of probability and risk are skewed by all kinds of cognitive biases. For example, studies show that after a high-profile plane crash hits the headlines, people estimate the likelihood of being killed in a crash as being much higher than they might have the day before. Again, rationally, this doesn’t make sense. The likelihood of dying in a crash on March 26, 1977, was almost exactly the same as the likelihood of dying in a crash on March 28, 1977, but it probably didn’t feel that way to the many people who watched the footage of the 583 bodies being pulled from the wreckage after two 747s collided at the Tenerife airport in the Canary Islands on March 27. Indeed, much research has shown that simply being able to recall something easily and vividly, like a recent and well-publicized tragedy, makes it suddenly seem more likely to occur again even though the odds have not objectively changed.2
To understand why this irrational fear persists, think about the relative visibility of plane crashes vs. car crashes. Every day most of us see hundreds of cars safely traveling through city streets. Very occasionally we’ll witness an accident, but we witness infinitely more safe trips than crashes. Not the case with airplanes. Unless you’re an air traffic controller, chances are you’ve been exposed to a fairly high ratio of accidents to safe landings. After all, every time a plane crashes, the images are splattered all over the news for days, sometimes weeks, but we don’t see the thousands and thousands of planes that take off and land safely every day. In short, we are selectively exposed to the catastrophes. And these vivid images are seared in our brains, creating expectations of harm that are detached from the statistical realities.
These same kinds of cognitive errors are at work when someone such as Terrance Watanabe steps up to the craps table. In the same way that rational judgments of the probability of a crash take a backseat when evaluating the safety of air travel, the logical probabilities involved in losing or winning at the craps table can be lost on gamblers. When we gamble, we tend to focus on the possibility, not the probability, of winning, much the way fliers focus on the possibility, not the probability, of dying in a fiery crash.
It turns out that almost everyone—not just compulsive gamblers and fearful fliers—can be biased when it comes to judging probability and weighing the potential for risks and rewards. This fact certainly flies in the face of rational models of human decision making, which suggest that people make decisions about risk by carefully and methodically calculating the likelihood of possible outcomes. But if you haven’t guessed yet, we and other psychologists of our ilk don’t put much stock in such models. Sure, it would be nice if decisions were made by making use of all available information and rationally weighing all the costs and benefits. If this were the case, then our decisions about whether to play the next hand, get on that transatlantic flight, risk taking that shortcut through a bad neighborhood, or skip the birth control this time would generally turn out all right. But unfortunately, the ant and the grasshopper are not truth seekers—each recruits all the psychological ammunition it can to convince you to go all in with a pair of twos, pop four Xanax to get you through the trip home, make that condom seem too far away to reach for, or have you reach for the Purell gel every time you shake a hand.
Food and sex. These are two things that are pretty much universally enjoyed. But they are also two things that consistently cause us to make errors in judgments of risk. Just as with gambling, when it comes to food and sex, what seem to be failures of will, like eating that second piece of cake or cheating on one’s significant other, often actually boil down to our inability to accurately weigh the short-term rewards of our actions (e.g., satisfying a sweet tooth or a carnal urge) against the long-term risks (e.g., weight gain or ruining a relationship). When you’re considering whether or not to add extra cheese to that pizza or buy those reduced-fat Wheat Thins, chances are you don’t often stand there calculating the long-term risks involved in eating too much salt or fat, right? Similarly, if a partner tells you, in the heat of the moment, to forget protection and get on with it already, are you going to stop and rationally evaluate his or her sexual history? No. You have an urge, and you act on it. In other words, when it comes to food and sex, short-term pleasure seems to win every time. But are these urges to engage in risky behavior rooted only in our brains, or are they also sensitive to cues in our external environment?
That’s what Peter Ditto and his colleagues at the University of California wanted to find out.3 More specifically, Ditto and his team were interested in the extent to which people’s sensitivity to risk hinges on the proximity of reward. In one experiment, they told participants that they would be playing a game of chance. If they won, they would get some freshly baked chocolate chip cookies that were waiting in the next room. If they lost, they would have to spend an extra thirty minutes filling out boring questionnaires. The rules of the game were as follows: Participants would pick a card from one of four decks of ten cards. Each card would be either a win or a loss. But different decks would have different odds of winning, and participants would be told these odds before they drew, at which point they could choose whether or not to play. The experimenters wanted to see how many people would choose to play the game at the varying levels of risk.
Now, if participants were at all sensitive to objective information about risk, then the results should be obvious: more people would choose to play the game when the risk of losing was lower (or the odds of winning were higher). And this is indeed what happened. But wait—the experiment wasn’t over yet. Now the researchers wanted to see what would happen when the rewards stayed the same but were brought a little closer to home. So they conducted the experiment a second time. Here, instead of simply telling the participants that they could win cookies, they set up a small oven in their lab and actually baked the cookies right in front of the subjects. Would sitting in that room, with the cookies turning golden in front of them and the smell of freshly baked deliciousness wafting through the air, change their decision making? Yes. As the experimenter slipped on an oven mitt and pulled the hot tray from the oven to let the morsels cool ever so slightly, somehow the participants’ willingness to take risks miraculously skyrocketed. As suspected, the temptation to gamble, even when the odds of winning were low, was now too much to resist. Participants’ inner grasshoppers wanted those damn cookies, and they wanted them bad: “To hell with the possibility of consequences later! I have a chance to win chocolate now!” With this voice echoing in their subconscious, just as many people chose to play the game when the deck was stacked against them as when the odds of winning were high. It seemed, the researchers concluded, that making the reward more vivid and immediate can overwhelm the ability to weigh risks rationally. In other words, when the reward looms close, the risk becomes harder to resist.
Our perceptions of risk seem to be similarly swayed when we make decisions about sex. When you show men pictures of women and ask them to gauge the odds of contracting a disease from the women, the more physically attractive the woman, the lower the estimate.4 When you think about it, this is completely irrational; after all, if anything, a sexier woman should be expected to have had more partners and therefore more opportunities to contract a disease. But we don’t take the time to assess this logically when such an alluring immediate reward—sex with a gorgeous woman—is on the line. In a similar demonstration of the power of immediate rewards, another study showed that men who were sexually aroused reported being more willing to engage in risky sexual behavior than men who weren’t aroused.5 It’s not that any of these men were inherently bigger risk takers; it was that when the visual and sensory cues of sexual opportunity are there, the desire for immediate pleasure takes over, turning even the most responsible guy into a carefree Lothario.
So the more appealing and immediate the reward, the more we instinctively ignore or downplay the risks involved. This may not seem particularly shocking. We’ve probably all been in situations where we’re more than willing to throw caution to the wind in pursuit of something or someone we really wanted. But as we’re about to see, when simple shifts in our environments completely blind us to the long-term consequences of our actions, the results can be pretty surprising—and often disastrous.
It turns out that many of the most important decisions of our lives, as well as the ones that seem to have direct implications for our character, are rooted in our subconscious assessments of risk. Whether he’s aware of it or not, a smoker’s decision about whether to quit is directly related to his belief about the odds that smoking causes cancer. Similarly, a voter’s support for a policy geared toward, say, ending workplace discrimination and harassment will hinge on her judgment of how frequently these types of offenses occur. At first this may appear fine; after all, these people are grown-ups and free to make their own decisions. But the problem is, as shown by the studies described above, people rarely make these decisions rationally, although they like to believe they do. Rather, they allow emotional cues to override logic, which, more often than not, results in flawed decisions or judgment. We teamed up with colleagues Richard Petty and Duane Wegener at Ohio State and Derek Rucker at Northwestern’s Kellogg School of Business to look at how small shifts in people’s emotional states affect their assessments of risk and reward. If we were correct in thinking that simple changes in mood could alter your view of what awaited you behind the next door, the implications for our lives could be profound. For example, what if a smoker were suddenly less willing to go to a cancer screening because her good feelings about a recent job promotion made her underestimate the risk of developing cancer? Or what if a star athlete’s elation after winning a big game inured him to the risks of unprotected sex?
To see how this might work, let’s conduct a simple thought experiment. Imagine the scene during Hurricane Katrina. Think about the thousands of people desperately scrambling through the storm to find shelter, leaving their homes and, in many cases, their friends and family behind as they fought for survival. Picture those children who clung to their pets, only to be torn away by rescue workers and forced to leave the dogs and cats to certain doom. Think of the overwhelming grief of far-flung friends and relatives as they slowly received word of the loved ones they lost. Feeling a little sad yet? Now, answer this question: of the four million people in the United States who will propose marriage to someone this year, how many will be refused by the person they love?
This is more or less the exercise we put our participants through in our study. We had them read a news story that was intended to elicit a particular emotional state, such as sadness or anger, and then asked them to predict the likelihood of various other events. We overwhelmingly found that feeling sad or angry, simply from reading about an event such as a natural disaster or an anti-American protest in Iraq, was all it took to color their judgments about the odds of completely unrelated events occurring. It wasn’t that hearing about an event such as a plane crash made them think plane crashes were more likely; it was that their emotional state swayed their general perception of the world around them. When people felt sad, they believed tragedy to be more prevalent; for example, they estimated that there were higher numbers of children starving in Romanian orphanages and brides being left at the altar. By the same token, people who were feeling angry overestimated the frequency of infuriating events, such as being screwed over by a used-car salesperson or being stuck in traffic.6
It may seem disconcerting at first to learn that not only do we fail to use logic when weighing probabilities but feelings and moods that have absolutely nothing to do with the decision being made can bias our judgments. But don’t fret. It turns out that this tendency to overestimate risks can actually have its advantages, evolutionarily speaking.
Consider the following example: You’re walking through the savannah with some of your family in search of a little breakfast. You come across a type of animal you’ve never seen before. It has dark brown fur with a white stripe down its spine. As you approach, it lunges at your merry band, sinking its teeth into your eldest daughter’s neck and killing her. Now let’s say we asked you what the probability is that the next animal with dark brown fur and a white stripe down its spine you see would be dangerous. You’d probably say 100 percent, and that’s the most rational guess you could make since the single dark-furred, white-striped animal you’ve encountered proved to be dangerous.
Now, let’s say you accidentally happen upon another one of these creatures. This time the animal sits there peacefully, even assuming a deferential posture as you pass. Again we ask you, what is the probability that the next animal with dark brown fur and a white stripe down its spine will be dangerous? You’d probably pause. Rationally, your answer should be 50 percent, since as of this moment, one of two has proved dangerous. But your gut says something different. It’s true that it is no longer reasonable to expect that all individuals of this species are dangerous, but on an intuitive level you know it’s better to be safe than sorry. In your heightened emotional state, the cost of taking a longer path to avoid the brown and white critter is far less than the risk of losing another life. And in this case, your intuitive mind is right. While avoiding all animals with dark fur and white stripes would be an irrational calculation rooted in emotion (namely, fear), it is also an adaptive one.
Of course, this isn’t just true on the savannah. In modern life too, listening to intuition and being more sensitive to the possibility of harm will serve you better on average than evaluating each individual situation rationally and objectively, particularly in situations that require rapid decisions for which you have incomplete information. It’s hard, if not impossible, to know the odds involved in any given risk. What is the probability that you will get attacked if you walk down your own street? If you asked Kitty Genovese this question early on the night of March 13, 1964, she probably would have said it wasn’t that high. But she was attacked. And she was killed. What are the chances you will get sick if you share a cup or if you eat a serrano pepper? Again, probably not that high. But tell that to the college students who contracted swine flu or fell victim to the salmonella outbreak of 2008.
The point is that our past experiences play a large role in our assessment of risk—perhaps an even bigger role than our mood or proximity to reward. When we undergo a painful experience, the desire to prevent such a thing from ever happening again can be so strong that we’d rather ignore the probabilities and just play it safe. If that means you have to avoid serrano peppers for a year, so be it. Our intuitive systems don’t give much credence to that old maxim about lightning never striking the same place twice.
At the same time, having missed out on a reward in the past can make us more willing to take a risk in the future. For example, if you fold your hand in a poker game and the next card that’s turned is the one you were waiting for, it’s hard to convince yourself you made the right decision. Now the money you could have won is staring you in the face, coaxing you to go for it the next time and put it all on the line.
Studies such as ours have shown that not only does feeling sad or angry lead people to overestimate the prevalence of tragic or infuriating events, but feeling happy leads people to overestimate the likelihood of positive events. This too is adaptive. How? Because it might compel you to take a chance on something you otherwise wouldn’t have. Take a promotion, for example. Let’s say only 10 percent of the people in your company get promoted to the next level. Logic and reason would tell you these are terrible odds and that you shouldn’t even bother trying. But what if on one particularly sunny and cheerful morning your gut tells you just to go ask for that promotion even if, logically speaking, it’s a fool’s errand? What often seems like a fool’s errand isn’t, and if you put in the effort, you may just be rewarded. Sometimes you have to be in it to win it. So it can often be better to listen to our intuition and play the possibilities rather than the probabilities.
But if following our intuition often leads to better outcomes in the long run, how does this explain Terrance Watanabe’s gambling losses? It seems as though he had the opposite problem. Terrance wasn’t in a situation where he had to make split-second decisions. The massive losses at the casinos unfolded over time. The answer isn’t that Terrance was underestimating the risks. Instead, like the people who were more likely to gamble when they could smell the warm cookies, he was overly focused on the immediate reward. Each time he bet, the possibility that the next spin of the roulette wheel or the next turn of the card would win him the jackpot was so seductive, it blocked out all rational concerns about his long-term financial well-being or his family’s reaction to his blowing their nest egg on a few rolls of the dice. When we think about judgments of risk and reward in terms of the battle between the ant and the grasshopper, Terrance’s behavior and phenomena like it begin to make a lot more sense. The desires to avoid immediate losses and to obtain immediate rewards—whether on the savannah or in the poker room—all stem from the psychological processes geared toward our short-term interests. The processes that govern long-term interests are the voices in the back of our head advising us to forget about what’s in front of our eyes and focus on what will be there much later on. And as we know, these are the voices that are so often ignored.
So we see that gambling, or taking risks, is less about our “character” and more about situation and circumstance: our past experiences, our moods and emotions, and the visibility of rewards in that moment. The variability of all these factors is exactly what makes us seem to be daredevils one minute and straight arrows the next. When it comes to risk, our decisions are under the control of the ant and grasshopper, with important implications for how we are judged by those around us. In fact, understanding the processes underlying risk taking provides a compelling explanation for why we consider some types of people valiant heroes and others meek cowards.
In our culture, heroes tend to be risk takers: the general who orders a daring assault to win a battle, the investor who makes a wild gamble and ends up with a windfall, the politician who puts his career on the line to champion a noble cause. But why do we have so much respect for those who run headfirst into danger, who don’t think twice before acting? Why is this considered so heroic, whereas careful, cautious, and reasoned behavior isn’t?
The answer to this conundrum lies in an unlikely place: sports. Ask yourself why few figures in sports are more beloved than the underdog. It’s because people are fascinated by those who “beat the odds.” As any Red Sox fan will tell you, no moment in recent sports history comes close to the thrill of seeing the 2004 team come back from a 0–3 deficit to beat their long-standing rival, the New York Yankees, in the American League Championship Series. But this thrill wasn’t just about the win; the victory was icing on the cake. This defiance of odds and expectations, the unlikely becoming reality, is what captured our hearts. The marketers at major television networks are well aware of this fact, which is why it seems impossible to watch a sporting event or even a reality show such as Dancing with the Stars without being bombarded with information about the unlikely circumstances from which particular athletes or contestants emerged. Indeed, it’s become increasingly difficult to tell the difference between coverage of an Olympic event and a heartwarming biopic. The announcers know that what is likely isn’t interesting (the record-breaking quarterback with a twelve-game winning streak throws another touchdown pass, yawn); it’s the unlikely that gets the ratings.
Case in point: During the 2010 NCAA college basketball tournament, the Butler Bulldogs knocked out a series of higher-ranked opponents on their way to a national championship showdown with the heavily favored Duke Blue Devils. It was painted as a David vs. Goliath matchup, and the nation was captivated. Even those who had absolutely no interest in college basketball were tuning in to see the drama unfold. Butler lost by two points after their last-second shot clanked off the rim, but no one cared all that much about the outcome; the nation loved the Bulldogs for the mere fact that they’d gotten there by beating the odds. Taking on odds that seem insurmountable may be the key to being seen as a hero.
Let’s see how this psychological bias for the unlikely plays out in another competitive context: Wall Street. There is perhaps no group of individuals toward whom more vitriol and scorn have been directed over the past several years than Wall Street traders (or the greedy, callous, irresponsible, money-hungry leeches, as they’re usually referred to). But it turns out that the psychological processes that cause us to root for the underdog (this attraction to beating the odds) might be the exact same ones that are responsible for the risky investment strategies that most likely contributed to the 2008 economic collapse.
Wall Street traders feel the same way about the high-stakes game of buying and selling that most people feel about sex and warm cookies: they like it. They like it a lot. To see just how much, Brian Knutson, a neuroscientist at Stanford, put traders into fMRI machines. Not surprisingly, when the traders were making high-risk decisions, the pleasure centers of their brains lit up like Christmas trees. And the riskier the decisions became (i.e., the worse the odds), the more pleasure they brought the traders.7 In a way, the same thing is true for sports fans—the less likely the dark horse is to win, the more excited we are just to watch them play. And the less likely it is that a firefighter will come out of a burning building alive, the more praise we heap upon him or her for surviving.
So the next time you curse those bankers on Wall Street and wonder at how they could possibly be so indifferent to the risks they were taking and the choices they were making, remember the pleasure you take in seeing Cinderella stories unfold. Sure, rooting for Seabiscuit doesn’t have the same consequences as gambling away millions of dollars of taxpayers’ money, but the psychology behind it is much the same. And, by the same token, the next time you’re tempted to judge someone such as Terrance Watanabe for gambling away his family’s fortune, remember that the mental mechanisms that bring you so much joy in the fortunes of unlikely winners are much the same as those that repeatedly drove him to bet thousands of dollars on a measly pair of twos. Again, when we look at risk in terms of the battle between the ant and the grasshopper, what seem at first glance to be deficiencies in character suddenly become a little more understandable after all.
So if risk takers are heroes, then what about those who avoid risk at all costs? What about the cowards? To understand how common an aversion to risk can be, and why it is rooted in a fundamental property of the mind, let’s first consider the little-known eccentricities of a famous figure: Charles Darwin. Darwin was nothing if not meticulous. And one particularly interesting detail about his travels that’s not mentioned very often is the fact that not only did he keep detailed logs of the many species he encountered, he also kept a detailed log of his flatulence and bowel movements (as well as daily records of the severity and frequency of his tinnitus, or the ringing in his ears).8 His writing on this matter certainly does not rank up there with On the Origin of Species, but it was something he evidently spent a considerable amount of time on. After all, he was known to be a hypochondriac. Hypochondria is a classic example of the human tendency to overestimate the probability of immediate risks in our environment. If we were to ask Darwin or any other hypochondriac the likelihood that his stomach grumblings were symptoms of a serious ailment, he would most likely say close to 100 percent. Clearly, this would be inaccurate, but when our minds are always attuned to danger, we see it wherever we look.
This kind of mentality takes many forms. Agoraphobics confine themselves to their home because they’ve overestimated the risks they perceive in the outside world. Hoarders can’t bear to throw anything away because they can’t risk not having that old flowerpot when they need it. Of course, these are extreme situations, but in milder forms, risk aversion is actually an extremely common psychological trait.
Consider the following example. If we were to ask whether you’d rather have $50 right now or flip a coin for the chance to win $100, which would you choose? If you’re like most people, you’d go with the former, and this makes sense. Though the expected outcome of each decision is the same ($50), there is risk involved in the coin toss—you might end up with nothing. But what if we asked if you’d rather have $40 or flip the coin for a chance at $100? Logically, if you calculated the risk, the odds of the coin toss would be in your favor, but you’d probably still choose the guaranteed $40.9 This is an example of irrational risk aversion, also known as loss aversion, and most people experience it in one way or another. We seem to be wired to avoid immediate losses, even when it means sacrificing potential long-term gain. Yet as we’ve noted, in our culture this kind of behavior is often construed as a weakness in character. In fact, we reserve a word for those who avoid any kind of risky behavior: cowards.
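For readers who like to see the arithmetic spelled out, the expected values behind these two choices can be checked in a few lines of Python. The dollar amounts are the ones from the example above; the simulation itself is purely illustrative:

```python
import random

def expected_value(outcomes):
    """Expected value of a gamble given (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

# Choice 1: a guaranteed $50 vs. a fair coin flip for $100 or nothing.
sure_50 = expected_value([(1.0, 50)])
coin_flip = expected_value([(0.5, 100), (0.5, 0)])
print(sure_50, coin_flip)  # both gambles are worth $50 on average

# Choice 2: a guaranteed $40 vs. the same coin flip.
sure_40 = expected_value([(1.0, 40)])
print(coin_flip - sure_40)  # the flip is worth $10 more on average

# A quick simulation makes the same point empirically: the average
# payoff of many coin flips hovers near $50, comfortably above $40.
random.seed(0)
flips = [random.choice([100, 0]) for _ in range(100_000)]
print(sum(flips) / len(flips))
```

Loss aversion is precisely the tendency to take the sure $40 anyway: the felt sting of possibly walking away with nothing outweighs the extra $10 of expected value.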
Many of us feel like cowards at some point in our life. When we can’t muster up the will to go talk to that person we’ve been eyeing all night, for fear of being rejected. When we’d prefer to keep all our money in savings accounts (or under our mattresses) so we don’t lose it all in the stock market. When we refuse to walk home alone in the dark for fear of being mugged. When we don’t let our kid eat that candy bar with the slightly torn wrapper in case it has a razor blade inside. These fears may not be rational, and they certainly aren’t sexy, but again, they can be adaptive. In the long run, cowards are less likely to get rejected, lose their nest eggs, get mugged, and feed their kids razor blades.

Which brings us back to the question at the heart of the chapter. What makes a person a risk taker in one context, and a coward in another? Once again, we see it has to do with our subjective understanding of the risks involved. Consider the child of a lifelong firefighter. Every day he sees Dad leave the house in the morning to go extinguish burning buildings and then come home safe and sound. Might this child grow up with a different idea about the risks associated with running into burning buildings than a child of a firefighter who died in a blaze? Of course he would. As we saw when we talked about the irrational fear of air travel, experience and exposure powerfully sway our perceptions of risk. So would the former child be more willing, later in life, to climb a fire escape to pull a baby out of a fourth-floor window than the latter child? Probably. But would that mean he’s a braver person, a person of better character? A hero instead of a coward? Well, not necessarily.
The point is that “heroes” aren’t necessarily braver people; they may simply have different estimates of the probabilities involved with the events. If you don’t buy this, then you may have to reevaluate your opinions about adolescents, especially boys. Most people (over the age of eighteen, at least) would not agree that teenagers are necessarily more courageous or heroic than adults. But research has found that they certainly are less risk-averse.10 Suggest to a fifteen-year-old boy that the two of you grab your skateboards and careen down the steps of city hall and he’d probably give you a high five, whereas most adults would look at you like you’d lost your mind. This isn’t just because most adults look ridiculous on a skateboard. It’s because the teen and the adult are wired to think differently about the risks involved. Research has shown that the teen brain hasn’t yet fully developed the capacity to generate what psychologists call “counterfactuals.” In other words, teens lack the cognitive ability to imagine the potential consequences of their actions (i.e., the skateboard going into the street and its rider getting flattened by an oncoming bus). And if a teenager can’t even envision breaking his neck by skateboarding down a steep staircase, then how can he accurately assess the risk that it might happen? How could he be considered a hero for taking a risk he can’t even fathom? So whether we act like heroes or cowards is not as much a matter of character as people tend to think it is. When the grasshopper is in charge, it can turn us into heroes, addicts, or cowards, depending on the context.
Imagine that on the table in front of you are four decks of cards. You know only two things about these decks. First, every card will have a number on it that represents the amount of money you will either win or lose, depending on the card. Second, the cards differ among the decks. But what you don’t know is that in this game, known as the Iowa Gambling Task, some decks have better odds than others. The risky decks offer greater potential payoffs but have more “loss” cards; the safe decks offer smaller payoffs but at a more constant rate. But again, you know none of this, at least not yet. So how do you decide which deck to choose from?
When people play this game, at first they use trial and error; they pick from the different decks more or less randomly and see what happens. After about forty or fifty trials, however, they have developed a pretty good sense of which decks are safe and which are not, and then begin to choose cards almost entirely from safe ones. Why? They know that the game is going to go on for a while and therefore that their ultimate profit will be determined over the course of the game, not just on the next draw. In other words, somehow the systems of the ant kick in and shift people’s attention away from short-term wins and onto the accrual of money over the long haul.
Intuitively this makes sense. Imagine playing the game again, but this time the experimenters have placed sensors on your skin that can gauge your arousal level by measuring increases in perspiration. That is, they can literally see you sweat. When Antoine Bechara and his colleagues did this, they found something fascinating: around the tenth card draw—long before you have any conscious inkling of which decks are risky—you begin to show anxiety (as measured by arousal level) each time your hand reaches to draw from what you will only later consciously realize is a risky deck. You’re nervous, but you aren’t even aware of it.11
This is a compelling demonstration of the ant at work. It acts as a silent statistician, calculating the risks and rewards associated with each deck and trying to steer you one way or the other based not on each individual draw but on the effect multiple draws will have over the long term. Left to the devices of the grasshopper, people might continue to draw from a deck from which they get immediate positive feedback or avoid a deck from which they’ve just been burned. But remember, the ant is focused on the probability, not the possibility, of rewards. After all, playing the probabilities is the key to success over the long term. The power of the Bechara study lies in its demonstration of just how subtly, how deeply below our level of consciousness, the ant can work. Clearly, we know on an intuitive level which of the decks are risky; otherwise we wouldn’t be experiencing that anxiety. Yet it takes us thirty more rounds—300 percent longer—to be able to consciously report this knowledge and adjust our behavior to minimize losses. Why? The grasshopper doesn’t go down without a fight. The impulse to avoid immediate harms and gravitate toward immediate gains competes with the anxiety generated by the ant. In this kind of controlled situation, over time the scales tip toward long-term concerns and the players wise up. But in the real world, unfortunately, this isn’t always the case. For many of the most important decisions in our lives, sometimes the ant needs a little help.
Earlier in the chapter we talked about how our perceptions of risk can impact health-related decisions such as whether or not to quit smoking or go for cancer screenings. In both cases we make these choices by subconsciously weighing the short-term benefits against the long-term risks. In the case of smoking, it’s the pleasure of cigarettes vs. the risk of cancer. With the screenings, it’s the reward of avoiding all that unpleasant poking and prodding (and the worry about receiving bad news) vs. the risk that a disease will go undetected. In their best-selling book Nudge, Richard Thaler and Cass Sunstein talk about how, by understanding the ways in which people think irrationally, we can help nudge them toward healthier, more responsible, and more productive behaviors.12 Building on that idea, how can we use what we know about the psychology of risk taking to encourage people to be more responsible in looking after their health? In other words, how do we get people not only to hear the voice of the ant telling them to focus on the long term but actually to heed it? When it comes to our health, it’s not enough to intuitively know those risks are there, like the players in the early rounds of the Iowa Gambling Task did. We have to actually act on them!
If you still believe that focusing disproportionately on risks makes you a coward, consider the following field experiment. Yale psychologist Peter Salovey was interested in how to get more women to go for mammograms. He quickly realized that in order to voluntarily subject themselves to the unpleasant procedure, women would have to judge the long-term risks of not going (cancer, possibly death) as being greater than the short-term costs of going (the hassle of going to the doctor, the physical discomfort of the X-ray, the mental anguish of worrying about a bad result, and so on). Logically, this seems like a no-brainer, but we shouldn’t have to tell you at this point that logic has little to do with it. Manipulating mental and physical discomfort would be tricky, so Salovey and his team decided to focus on the risk part of the equation. They teamed up with a local phone company to recruit women in the New Haven area to come into his lab and watch short public service announcements on their lunch break. The announcements were of two types. Both urged women to get mammograms, but one video talked about the benefits of mammography (e.g., finding a tumor early increases survival odds); the other talked about the risks (e.g., not finding a tumor early can lead to death).
This seems like a trivial difference, but it actually turned out to have a huge impact on the women’s decisions. Those who were made to focus on the long-term risks rather than the benefits were much more likely to later act responsibly and go for a screening. Why? Simple. When the announcement was framed in such a way that the ultimate long-term consequence was front and center, the ant suddenly couldn’t be ignored. Here again we see how the gambles we take, even the big ones such as whether we’re willing to risk our long-term health for short-term conveniences, can be greatly influenced by small and subtle differences.13
This may make it sound as if all would be well with the world if we always listened to the ant and focused on the long term. We might not have as much fun, but we’d be responsible and better off in the end, right? Well, that’s true when it comes to our health, since the stakes are so high. But in other situations that rule of thumb doesn’t always work because, as we’ve learned, the ant’s foresight isn’t always 20/20.
Ask any new professor what’s the worst thing that can happen to his or her career and nine out of ten will give you this answer: being denied tenure. To avoid that future horror, they will make great sacrifices: working twenty-hour days, not spending as much time with their families as they’d like, letting their teaching responsibilities slide, and so on (trust us, we’ve seen it). But as work by Dan Gilbert and his colleagues has shown, all this extra effort may not, in the end, be justified.14 Sure, being denied tenure is bad, but when Gilbert assessed the actual levels of unhappiness among professors who had been denied tenure, it quickly became clear that they were actually a lot happier than their younger selves would have predicted. And as Gilbert’s team has shown, this type of prediction error is quite pervasive; we’re as bad at predicting the happiness we’ll derive from all kinds of long-term rewards—everything from wealth to the outcome of an election and more—as we are at predicting risk. It’s hard to make decisions regarding our long-term welfare if we can’t accurately predict what will make us better off. Here again, neither intuition nor rationality always provides the answer.
So what does this all mean? Our decisions and behaviors are guided in large part by what our minds and circumstances trick us into believing about relative risks and rewards. Add to this the fact that our estimations of risks and rewards not only are very frequently flawed but are also quite fluid, and the mechanisms shaping character quickly become more complex. Once we come to grips with these dueling forces and how they can sway us—once we realize that we too are just one or two big poker wins away from a whole lot more losses—then we can start making better decisions about when to gamble and when to play it safe.