CHAPTER 3

The Joy of Being Wrong

The Thrill of Not Believing Everything You Think

I have a degree from Harvard. Whenever I’m wrong, the world makes a little less sense.

—Dr. Frasier Crane, played by Kelsey Grammer

In the fall of 1959, a prominent psychologist welcomed new participants into a wildly unethical study. He had handpicked a group of Harvard sophomores to join a series of experiments that would run through the rest of their time in college. The students volunteered to spend a couple of hours a week contributing to knowledge about how personality develops and how psychological problems can be solved. They had no idea that they were actually signing up to have their beliefs attacked.

The researcher, Henry Murray, had originally trained as a physician and biochemist. After becoming a distinguished psychologist, he was disillusioned that his field paid little attention to how people navigate difficult interactions, so he decided to create them in his own lab. He gave students a month to write out their personal philosophy of life, including their core values and guiding principles. When they showed up to submit their work, they were paired with another student who had done the same exercise. They would have a day or two to read each other’s philosophies, and then they would be filmed debating them. The experience would be much more intense than they anticipated.

Murray modeled the study on psychological assessments he had developed for spies in World War II. As a lieutenant colonel, Murray had been recruited to vet potential agents for the Office of Strategic Services, the precursor to the CIA. To gauge how candidates would handle pressure, he sent them down to a basement to be interrogated with a bright light shining in their faces. The examiner would wait for an inconsistency in their accounts to pop up and then scream, “You’re a liar!” Some candidates quit on the spot; others were reduced to tears. Those who withstood the onslaught got the gig.

Now Murray was ready for a more systematic study of reactions to stress. He had carefully screened students to create a sample that included a wide range of personalities and mental health profiles. He gave them code names based on their character traits, including Drill, Quartz, Locust, Hinge, and Lawful—more on him later.

When students arrived for the debate, they discovered that their sparring partner was not a peer but a law student. What they didn’t know was that the law student was in cahoots with the research team: his task was to spend eighteen minutes launching an aggressive assault on their worldviews. Murray called it a “stressful interpersonal disputation,” having directed the law student to make the participants angry and anxious with a “mode of attack” that was “vehement, sweeping, and personally abusive.” The poor students sweated and shouted as they struggled to defend their ideals.

The pain didn’t stop there. In the weeks that followed, the students were invited back to the lab to discuss the films of their own interactions. They watched themselves grimacing and stringing together incoherent sentences. All in all, they spent about eight hours reliving those humiliating eighteen minutes. A quarter century later, when the participants reflected on the experience, it was clear that many had found it agonizing. Drill described feeling “unabating rage.” Locust recalled his bewilderment, anger, chagrin, and discomfort. “They have deceived me, telling me there was going to be a discussion, when in fact there was an attack,” he wrote. “How could they have done this to me; what is the point of this?”

Other participants had a strikingly different response: they actually seemed to get a kick out of being forced to rethink their beliefs. “Some may have found the experience mildly discomforting, in that their cherished (and in my case, at least, sophomoric) philosophies were challenged in an aggressive manner,” one participant recalled. “But it was hardly an experience that would blight one for a week, let alone a life.” Another described the whole series of events as “highly agreeable.” A third went so far as to call it “fun.”

Ever since I first read about the participants who reacted enthusiastically, I’ve been fascinated by what made them tick. How did they manage to enjoy the experience of having their beliefs eviscerated—and how can the rest of us learn to do the same?

Since the records of the study are still sealed and the vast majority of the participants haven’t revealed their identities, I did the next best thing: I went searching for people like them. I found a Nobel Prize–winning scientist and two of the world’s top election forecasters. They aren’t just comfortable being wrong; they actually seem to be thrilled by it. I think they can teach us something about how to be more graceful and accepting in moments when we discover that our beliefs might not be true. The goal is not to be wrong more often. It’s to recognize that we’re all wrong more often than we’d like to admit, and the more we deny it, the deeper the hole we dig for ourselves.

THE DICTATOR POLICING YOUR THOUGHTS

When our son was five, he was excited to learn that his uncle was expecting a child. My wife and I both predicted a boy, and so did our son. A few weeks later, we found out the baby would be a girl. When we broke the news to our son, he burst into tears. “Why are you crying?” I asked. “Is it because you were hoping your new cousin would be a boy?”

“No!” he shouted, pounding his fists on the floor. “Because we were wrong!”

I explained that being wrong isn’t always a bad thing. It can be a sign that we’ve learned something new—and that discovery itself can be a delight.

This realization didn’t come naturally to me. Growing up, I was determined to be right. In second grade I corrected my teacher for misspelling the word lightning as lightening. When trading baseball cards I would rattle off statistics from recent games as proof that the price guide was valuing players inaccurately. My friends found this annoying and started calling me Mr. Facts. It got so bad that one day my best friend announced that he wouldn’t talk to me until I admitted I was wrong. It was the beginning of my journey to become more accepting of my own fallibility.

In a classic paper, sociologist Murray Davis argued that when ideas survive, it’s not because they’re true—it’s because they’re interesting. What makes an idea interesting is that it challenges our weakly held opinions. Did you know that the moon might originally have formed inside a vaporous Earth out of magma rain? That a narwhal’s tusk is actually a tooth? When an idea or assumption doesn’t matter deeply to us, we’re often excited to question it. The natural sequence of emotions is surprise (“Really?”) followed by curiosity (“Tell me more!”) and thrill (“Whoa!”). To paraphrase a line attributed to Isaac Asimov, great discoveries often begin not with “Eureka!” but with “That’s funny . . .”

When a core belief is questioned, though, we tend to shut down rather than open up. It’s as if there’s a miniature dictator living inside our heads, controlling the flow of facts to our minds, much like Kim Jong-un controls the press in North Korea. The technical term for this in psychology is the totalitarian ego, and its job is to keep out threatening information.

It’s easy to see how an inner dictator comes in handy when someone attacks our character or intelligence. Those kinds of personal affronts threaten to shatter aspects of our identities that are important to us and might be difficult to change. The totalitarian ego steps in like a bodyguard for our minds, protecting our self-image by feeding us comforting lies. They’re all just jealous. You’re really, really, ridiculously good-looking. You’re on the verge of inventing the next Pet Rock. As physicist Richard Feynman quipped, “You must not fool yourself—and you are the easiest person to fool.”

Our inner dictator also likes to take charge when our deeply held opinions are threatened. In the Harvard study of attacking students’ worldviews, the participant who had the strongest negative reaction was code-named Lawful. He came from a blue-collar background and was unusually precocious, having started college at sixteen and joined the study at seventeen. One of his beliefs was that technology was harming civilization, and he became hostile when his views were questioned. Lawful went on to become an academic, and when he penned his magnum opus, it was clear that he hadn’t changed his mind. His concerns about technology had only intensified:

The Industrial Revolution and its consequences have been a disaster for the human race. They have greatly increased the life-expectancy of those of us who live in “advanced” countries, but they have destabilized society, have made life unfulfilling, have subjected human beings to indignities . . . to physical suffering as well . . . and have inflicted severe damage on the natural world.

That kind of conviction is a common response to threats. Neuroscientists find that when our core beliefs are challenged, it can trigger the amygdala, the primitive “lizard brain” that breezes right past cool rationality and activates a hot fight-or-flight response. The anger and fear are visceral: it feels as if we’ve been punched in the mind. The totalitarian ego comes to the rescue with mental armor. We become preachers or prosecutors striving to convert or condemn the unenlightened. “Presented with someone else’s argument, we’re quite adept at spotting the weaknesses,” journalist Elizabeth Kolbert writes, but “the positions we’re blind about are our own.”

I find this odd, because we weren’t born with our opinions. Unlike our height or raw intelligence, we have full control over what we believe is true. We choose our views, and we can choose to rethink them any time we want. This should be a familiar task, because we have a lifetime of evidence that we’re wrong on a regular basis. I was sure I’d finish a draft of this chapter by Friday. I was certain the cereal with the toucan on the box was Fruit Loops, but I just noticed the box says Froot Loops. I was sure I put the milk back in the fridge last night, but strangely it’s sitting on the counter this morning.

The inner dictator manages to prevail by activating an overconfidence cycle. First, our wrong opinions are shielded in filter bubbles, where we feel pride when we see only information that supports our convictions. Then our beliefs are sealed in echo chambers, where we hear only from people who intensify and validate them. Although the resulting fortress can appear impenetrable, there’s a growing community of experts who are determined to break through.

ATTACHMENT ISSUES

Not long ago I gave a speech at a conference about my research on givers, takers, and matchers. I was studying whether generous, selfish, or fair people were more productive in jobs like sales and engineering. One of the attendees was Daniel Kahneman, the Nobel Prize–winning psychologist who has spent much of his career demonstrating how flawed our intuitions are. He told me afterward that he was surprised by my finding that givers had higher rates of failure than takers and matchers—but higher rates of success, too.

When you read a study that surprises you, how do you react? Many people would get defensive, searching for flaws in the study’s design or the statistical analysis. Danny did the opposite. His eyes lit up, and a huge grin appeared on his face. “That was wonderful,” he said. “I was wrong.”

Later, I sat down with Danny for lunch and asked him about his reaction. It looked a lot to me like the joy of being wrong—his eyes twinkled as if he was having fun. He said that in his eighty-five years, no one had pointed that out before, but yes, he genuinely enjoys discovering that he was wrong, because it means he is now less wrong than before.

I knew the feeling. In college, what first attracted me to social science was reading studies that clashed with my expectations; I couldn’t wait to tell my roommates about all the assumptions I’d been rethinking. In my first independent research project, I tested some predictions of my own, and more than a dozen of my hypotheses turned out to be false.* It was a major lesson in intellectual humility, but I wasn’t devastated. I felt an immediate rush of excitement. Discovering I was wrong felt joyful because it meant I’d learned something. As Danny told me, “Being wrong is the only way I feel sure I’ve learned anything.”

Danny isn’t interested in preaching, prosecuting, or politicking. He’s a scientist devoted to the truth. When I asked him how he stays in that mode, he said he refuses to let his beliefs become part of his identity. “I change my mind at a speed that drives my collaborators crazy,” he explained. “My attachment to my ideas is provisional. There’s no unconditional love for them.”

Attachment. That’s what keeps us from recognizing when our opinions are off the mark and rethinking them. To unlock the joy of being wrong, we need to detach. I’ve learned that two kinds of detachment are especially useful: detaching your present from your past and detaching your opinions from your identity.

Let’s start with detaching your present from your past. In psychology, one way of measuring the similarity between the person you are right now and your former self is to ask: which pair of circles best describes how you see yourself?

In the moment, separating your past self from your current self can be unsettling. Even positive changes can lead to negative emotions; evolving your identity can leave you feeling derailed and disconnected. Over time, though, rethinking who you are appears to become mentally healthy—as long as you can tell a coherent story about how you got from past to present you. In one study, when people felt detached from their past selves, they became less depressed over the course of the year. When you feel as if your life is changing direction, and you’re in the process of shifting who you are, it’s easier to walk away from foolish beliefs you once held.

My past self was Mr. Facts—I was too fixated on knowing. Now I’m more interested in finding out what I don’t know. As Bridgewater founder Ray Dalio told me, “If you don’t look back at yourself and think, ‘Wow, how stupid I was a year ago,’ then you must not have learned much in the last year.”

The second kind of detachment is separating your opinions from your identity. I’m guessing you wouldn’t want to see a doctor whose identity is Professional Lobotomist, send your kids to a teacher whose identity is Corporal Punisher, or live in a town where the police chief’s identity is Stop-and-Frisker. Once upon a time, all of these practices were seen as reasonable and effective.

Most of us are accustomed to defining ourselves in terms of our beliefs, ideas, and ideologies. This can become a problem when it prevents us from changing our minds as the world changes and knowledge evolves. Our opinions can become so sacred that we grow hostile to the mere thought of being wrong, and the totalitarian ego leaps in to silence counterarguments, squash contrary evidence, and close the door on learning.

Who you are should be a question of what you value, not what you believe. Values are your core principles in life—they might be excellence and generosity, freedom and fairness, or security and integrity. Basing your identity on these kinds of principles enables you to remain open-minded about the best ways to advance them. You want the doctor whose identity is protecting health, the teacher whose identity is helping students learn, and the police chief whose identity is promoting safety and justice. When they define themselves by values rather than opinions, they buy themselves the flexibility to update their practices in light of new evidence.

THE YODA EFFECT: “YOU MUST UNLEARN WHAT YOU HAVE LEARNED”

On my quest to find people who enjoy discovering they were wrong, a trusted colleague told me I had to meet Jean-Pierre Beugoms. He’s in his late forties, and he’s the sort of person who’s honest to a fault; he tells the truth even if it hurts. When his son was a toddler, they were watching a space documentary together, and Jean-Pierre casually mentioned that the sun would one day turn into a red giant and engulf the Earth. His son was not amused. Between tears, he cried, “But I love this planet!” Jean-Pierre felt so terrible that he decided to bite his tongue instead of mentioning threats that could prevent the Earth from even lasting that long.

Back in the 1990s, Jean-Pierre had a hobby of collecting the predictions that pundits made on the news and scoring his own forecasts against them. Eventually he started competing in forecasting tournaments—international contests hosted by Good Judgment, where people try to predict the future. It’s a daunting task; there’s an old saying that historians can’t even predict the past. A typical tournament draws thousands of entrants from around the world to anticipate big political, economic, and technological events. The questions are time-bound, with measurable, specific results. Will the current president of Iran still be in office in six months? Which soccer team will win the next World Cup? In the following year, will an individual or a company face criminal charges for an accident involving a self-driving vehicle?

Participants don’t just answer yes or no; they have to give their odds. It’s a systematic way of testing whether they know what they don’t know. They get scored months later on accuracy and calibration—earning points not just for giving the right answer, but also for having the right level of conviction. The best forecasters have confidence in their predictions that come true and doubt in their predictions that prove false.
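Scoring forecasters on both accuracy and calibration is commonly done with a Brier score: the mean squared error between the probabilities a forecaster states and what actually happens. The sketch below is illustrative only; the function and the sample numbers are mine, not the tournaments’.

```python
def brier_score(forecasts):
    """Mean squared error between stated probabilities and outcomes (0 or 1).

    Lower is better: 0.0 is perfect, and 0.25 is what a forecaster earns
    by always hedging at 50 percent.
    """
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Confidence in predictions that come true is rewarded;
# confidence in predictions that prove false is punished.
confident_right = brier_score([(0.9, 1), (0.8, 1)])  # low score (good)
always_fifty = brier_score([(0.5, 1), (0.5, 0)])     # 0.25 (uninformative)
confident_wrong = brier_score([(0.9, 0), (0.8, 0)])  # high score (bad)
```

Note the asymmetry the rule creates: stating 90 percent and being wrong costs far more than stating 50 percent, which is exactly why the best forecasters put doubt into the predictions they’re unsure of.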

On November 18, 2015, Jean-Pierre registered a prediction that stunned his opponents. A day earlier, a new question had popped up in an open forecasting tournament: in July 2016, who would win the U.S. Republican presidential primary? The options were Jeb Bush, Ben Carson, Ted Cruz, Carly Fiorina, Marco Rubio, Donald Trump, and none of the above. With eight months to go before the Republican National Convention, Trump was largely seen as a joke. His odds of becoming the Republican nominee were only 6 percent according to Nate Silver, the celebrated statistician behind the website FiveThirtyEight. When Jean-Pierre peered into his crystal ball, though, he decided Trump had a 68 percent chance of winning.

Jean-Pierre didn’t just excel in predicting the results of American events. His Brexit forecasts hovered in the 50 percent range when most of his competitors thought the referendum had little chance of passing. He successfully predicted that the incumbent would lose a presidential election in Senegal, even though the base rates of reelection were extremely high and other forecasters were expecting a decisive win. And he had, in fact, pegged Trump as the favorite long before pundits and pollsters even considered him a viable contender. “It’s striking,” Jean-Pierre wrote early on, back in 2015, that so many forecasters are “still in denial about his chances.”

Based on his performance, Jean-Pierre might be the world’s best election forecaster. His advantage: he thinks like a scientist. He’s passionately dispassionate. At various points in his life, Jean-Pierre has changed his political ideologies and religious beliefs.* He doesn’t come from a polling or statistics background; he’s a military historian, which means he has no stake in the way things have always been done in forecasting. The statisticians were attached to their views about how to aggregate polls. Jean-Pierre paid more attention to factors that were hard to measure and overlooked. For Trump, those included “Mastery at manipulating the media; Name recognition; and A winning issue (i.e., immigration and ‘the wall’).”

Even if forecasting isn’t your hobby, there’s a lot to be learned from studying how forecasters like Jean-Pierre form their opinions. My colleague Phil Tetlock finds that forecasting skill is less a matter of what we know than of how we think. When he and his collaborators studied a host of factors that predict excellence in forecasting, grit and ambition didn’t rise to the top. Neither did intelligence, though it came in second. There was another factor that had roughly triple the predictive power of brainpower.

The single most important driver of forecasters’ success was how often they updated their beliefs. The best forecasters went through more rethinking cycles. They had the confident humility to doubt their judgments and the curiosity to discover new information that led them to revise their predictions.

A key question here is how much rethinking is necessary. Although the sweet spot will always vary from one person and situation to the next, the averages can give us a clue. A few years into their tournaments, typical competitors updated their predictions about twice per question. The superforecasters updated their predictions more than four times per question.

Think about how manageable that is. Better judgment doesn’t necessarily require hundreds or even dozens of updates. Just a few more efforts at rethinking can move the needle. It’s also worth noting, though, how unusual that level of rethinking is. How many of us can even remember the last time we admitted being wrong and revised our opinions accordingly? As journalist Kathryn Schulz observes, “Although small amounts of evidence are sufficient to make us draw conclusions, they are seldom sufficient to make us revise them.”

That’s where the best forecasters excelled: they were eager to think again. They saw their opinions more as hunches than as truths—as possibilities to entertain rather than facts to embrace. They questioned ideas before accepting them, and they were willing to keep questioning them even after accepting them. They were constantly seeking new information and better evidence—especially disconfirming evidence.

On Seinfeld, George Costanza famously said, “It’s not a lie if you believe it.” I might add that it doesn’t become the truth just because you believe it. It’s a sign of wisdom to avoid believing every thought that enters your mind. It’s a mark of emotional intelligence to avoid internalizing every feeling that enters your heart.

[Cartoon: Ellis Rosen/The New Yorker Collection/The Cartoon Bank]

Another of the world’s top forecasters is Kjirste Morrell. She’s obviously bright—she has a doctorate from MIT in mechanical engineering—but her academic and professional experience wasn’t exactly relevant to predicting world events. Her background was in human hip joint mechanics, designing better shoes, and building robotic wheelchairs. When I asked Kjirste what made her so good at forecasting, she replied, “There’s no benefit to me for being wrong for longer. It’s much better if I change my beliefs sooner, and it’s a good feeling to have that sense of a discovery, that surprise—I would think people would enjoy that.”

Kjirste hasn’t just figured out how to erase the pain of being wrong. She’s transformed it into a source of pleasure. She landed there through a form of classical conditioning, like when Pavlov’s dog learned to salivate at the sound of a bell. If being wrong repeatedly leads us to the right answer, the experience of being wrong itself can become joyful.

That doesn’t mean we’ll enjoy it every step of the way. One of Kjirste’s biggest misses was her forecast for the 2016 U.S. presidential election, where she bet on Hillary Clinton to beat Donald Trump. Since she wasn’t a Trump supporter, the prospect of being wrong was painful—it was too central to her identity. She knew a Trump presidency was possible, but she didn’t want to think it was probable, so she couldn’t bring herself to forecast it.

That was a common mistake in 2016. Countless experts, pollsters, and pundits underestimated Trump—and Brexit—because they were too emotionally invested in their past predictions and identities. If you want to be a better forecaster today, it helps to let go of your commitment to the opinions you held yesterday. Just wake up in the morning, snap your fingers, and decide you don’t care. It doesn’t matter who’s president or what happens to your country. The world is unjust and the expertise you spent decades developing is obsolete! It’s a piece of cake, right? About as easy as willing yourself to fall out of love. Somehow, Jean-Pierre Beugoms managed to pull it off.

When Donald Trump first declared his candidacy in the spring of 2015, Jean-Pierre gave him only a 2 percent chance of becoming the nominee. As Trump began rising in the August polls, Jean-Pierre was motivated to question himself. He detached his present from his past, acknowledging that his original prediction was understandable, given the information he had at the time.

Detaching his opinions from his identity was harder. Jean-Pierre didn’t want Trump to win, so it would’ve been easy to fall into the trap of desirability bias. He overcame it by focusing on a different goal. “I wasn’t so attached to my original forecast,” he explained, because of “the desire to win, the desire to be the best forecaster.” He still had a stake in the outcome he actually preferred, but he had an even bigger stake in not making a mistake. His values put truth above tribe: “If the evidence strongly suggests that my tribe is wrong on a particular issue, then so be it. I consider all of my opinions tentative. When the facts change, I change my opinions.”

Research suggests that identifying even a single reason why we might be wrong can be enough to curb overconfidence. Jean-Pierre went further; he made a list of all the arguments that pundits were making about why Trump couldn’t win and went looking for evidence that they (and he) were wrong. He found that evidence within the polls: in contrast with widespread claims that Trump was a factional candidate with narrow appeal, Jean-Pierre saw that Trump was popular across key Republican demographic groups. By mid-September, Jean-Pierre was an outlier, putting Trump’s odds of becoming the nominee over 50 percent. “Accept the fact that you’re going to be wrong,” Jean-Pierre advises. “Try to disprove yourself. When you’re wrong, it’s not something to be depressed about. Say, ‘Hey, I discovered something!’”

MISTAKES WERE MADE . . . MOST LIKELY BY ME

As prescient as Jean-Pierre’s bet on Trump was, he still had trouble sticking to it in the face of his feelings. In the spring of 2016, he identified the media coverage of Hillary Clinton’s emails as a red flag and kept predicting a Trump victory for two more months. By the summer, though, as he contemplated the impending possibility of a Trump presidency, he found himself struggling to sleep at night. He changed his forecast to Clinton.

Looking back, Jean-Pierre isn’t defensive about his decision. He freely admits that despite being an experienced forecaster, he made the rookie mistake of falling victim to desirability bias, allowing his preference to cloud his judgment. He focused on the forces that would enable him to predict a Clinton win because he desperately wanted a Trump loss. “That was just a way of me trying to deal with this unpleasant forecast I had issued,” he says. Then he does something unexpected: he laughs at himself.

If we’re insecure, we make fun of others. If we’re comfortable being wrong, we’re not afraid to poke fun at ourselves. Laughing at ourselves reminds us that although we might take our decisions seriously, we don’t have to take ourselves too seriously. Research suggests that the more frequently we make fun of ourselves, the happier we tend to be.* Instead of beating ourselves up about our mistakes, we can turn some of our past misconceptions into sources of present amusement.

Being wrong won’t always be joyful. The path to embracing mistakes is full of painful moments, and we handle those moments better when we remember they’re essential for progress. But if we can’t learn to find occasional glee in discovering we were wrong, it will be awfully hard to get anything right.

I’ve noticed a paradox in great scientists and superforecasters: the reason they’re so comfortable being wrong is that they’re terrified of being wrong. What sets them apart is the time horizon. They’re determined to reach the correct answer in the long run, and they know that means they have to be open to stumbling, backtracking, and rerouting in the short run. They shun rose-colored glasses in favor of a sturdy mirror. The fear of missing the mark next year is a powerful motivator to get a crystal-clear view of last year’s mistakes. “People who are right a lot listen a lot, and they change their mind a lot,” Jeff Bezos says. “If you don’t change your mind frequently, you’re going to be wrong a lot.”

Jean-Pierre Beugoms has a favorite trick for catching himself when he’s wrong. When he makes a forecast, he also makes a list of the conditions in which it should hold true—as well as the conditions under which he would change his mind. He explains that this keeps him honest, preventing him from getting attached to a bad prediction.

What forecasters do in tournaments is good practice in life. When you form an opinion, ask yourself what would have to happen to prove it false. Then keep track of your views so you can see when you were right, when you were wrong, and how your thinking has evolved. “I started out just wanting to prove myself,” Jean-Pierre says. “Now I want to improve myself—to see how good I can get.”
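Jean-Pierre’s habit of recording a forecast alongside the conditions that would change his mind can be sketched as a simple journal entry. Everything in this sketch is hypothetical (the class, field names, and numbers are mine); it just shows what “keep track of your views” might look like in practice.

```python
from dataclasses import dataclass, field


@dataclass
class Forecast:
    """A hypothetical journal entry for one opinion, tracked over time."""
    claim: str
    probability: float          # current confidence, between 0.0 and 1.0
    change_mind_if: list        # conditions that should trigger an update
    history: list = field(default_factory=list)

    def update(self, new_probability, reason):
        # Keep the old estimate so you can see how your thinking evolved.
        self.history.append((self.probability, reason))
        self.probability = new_probability


f = Forecast(
    claim="Candidate X wins the nomination",
    probability=0.02,
    change_mind_if=["sustained polling lead", "broad demographic support"],
)
f.update(0.68, "polls show broad support across key demographic groups")
```

The point of the `change_mind_if` list is the same as Jean-Pierre’s trick: writing down the disconfirming conditions in advance keeps you honest about when a revision is due, rather than letting attachment decide.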

It’s one thing to admit to ourselves that we’ve been wrong. It’s another thing to confess that to other people. Even if we manage to overthrow our inner dictator, we run the risk of facing outer ridicule. In some cases we fear that if others find out we were wrong, it could destroy our reputations. How do people who accept being wrong cope with that?

In the early 1990s, the British physicist Andrew Lyne published a major discovery in the world’s most prestigious science journal. He presented the first evidence that a planet could orbit a neutron star—the ultradense remnant left behind after a star explodes in a supernova. Several months later, while preparing to give a presentation at an astronomy conference, he noticed that he hadn’t adjusted for the fact that the Earth moves in an elliptical orbit, not a circular one. He was embarrassingly, horribly wrong. The planet he had discovered didn’t exist.

In front of hundreds of colleagues, Andrew walked onto the ballroom stage and admitted his mistake. When he finished his confession, the room exploded in a standing ovation. One astrophysicist called it “the most honorable thing I’ve ever seen.”

Andrew Lyne is not alone. Psychologists find that admitting we were wrong doesn’t make us look less competent. It’s a display of honesty and a willingness to learn. Although scientists believe it will damage their reputation to admit that their studies failed to replicate, the reverse is true: they’re judged more favorably if they acknowledge the new data rather than deny them. After all, it doesn’t matter “whose fault it is that something is broken if it’s your responsibility to fix it,” actor Will Smith has said. “Taking responsibility is taking your power back.”

When we find out we might be wrong, a standard defense is “I’m entitled to my opinion.” I’d like to modify that: yes, we’re entitled to hold opinions inside our own heads. If we choose to express them out loud, though, I think it’s our responsibility to ground them in logic and facts, share our reasoning with others, and change our minds when better evidence emerges.

This philosophy takes us back to the Harvard students who had their worldviews attacked in that unethical study by Henry Murray. If I had to guess, I’d say the students who enjoyed the experience had a mindset similar to that of great scientists and superforecasters. They saw challenges to their opinions as an exciting opportunity to develop and evolve their thinking. The students who found it stressful didn’t know how to detach. Their opinions were their identities. An assault on their worldviews was a threat to their very sense of self. Their inner dictator rushed in to protect them.

Take it from the student with the code name Lawful. He felt he had been damaged emotionally by the study. “Our adversary in the debate subjected us to various insults,” Lawful reflected four decades later. “It was a highly unpleasant experience.”

Today, Lawful has a different code name, one that’s familiar to most Americans. He’s known as the Unabomber.

Ted Kaczynski was a math professor turned anarchist and domestic terrorist. He mailed bombs that killed three people and injured twenty-three more. An eighteen-year-long FBI investigation culminated in his arrest after The New York Times and The Washington Post published his manifesto and his brother recognized his writing. He is now serving life in prison without parole.

The excerpt I quoted earlier was from Kaczynski’s manifesto. If you read the entire document, you’re unlikely to be unsettled by the content or the structure. What’s disturbing is the level of conviction. Kaczynski displays little consideration of alternative views, barely a hint that he might be wrong. Consider just the opening:

The Industrial Revolution and its consequences have been a disaster for the human race. . . . They have destabilized society, have made life unfulfilling. . . . The continued development of technology will worsen the situation. It will certainly subject human beings to greater indignities and inflict greater damage on the natural world. . . . If the system survives, the consequences will be inevitable: There is no way of reforming or modifying the system. . . .

Kaczynski’s case leaves many questions about his mental health unanswered. Still, I can’t help but wonder: If he had learned to question his opinions, would he still have been able to justify resorting to violence? If he had developed the capacity to discover that he was wrong, would he still have ended up doing something so wrong?

Every time we encounter new information, we have a choice. We can attach our opinions to our identities and stand our ground in the stubbornness of preaching and prosecuting. Or we can operate more like scientists, defining ourselves as people committed to the pursuit of truth—even if it means proving our own views wrong.