When conflict is cliché, complexity is breaking news.
—Amanda Ripley
Eager to have a jaw-clenching, emotionally fraught argument about abortion? How about immigration, the death penalty, or climate change? If you think you can handle it, head for the second floor of a brick building on the Columbia University campus in New York. It’s the home of the Difficult Conversations Lab.
If you’re brave enough to visit, you’ll be matched up with a stranger who strongly disagrees with your views on a controversial topic. You’ll be given just twenty minutes to discuss the issue, and then you’ll both have to decide whether you’ve aligned enough to write and sign a joint statement on your shared views around abortion laws. If you’re able to do so—no small feat—your statement will be posted on a public forum.
For two decades, the psychologist who runs the lab, Peter T. Coleman, has been bringing people together to talk about polarizing issues. His mission is to reverse-engineer the successful conversations and then experiment with recipes to make more of them.
To put you in the right mindset before you begin your conversation about abortion, Peter gives you and the stranger a news article about another divisive issue: gun control. What you don’t know is that there are different versions of the gun control article, and which one you read is going to have a major impact on whether you land on the same page about abortion.
If the gun control article covers both sides of the issue, making a balanced case for both gun rights and gun legislation, you and your adversary have a decent chance at reaching consensus on abortion. In one of Peter’s experiments, after reading a “both-sides” article, 46 percent of pairs were able to find enough common ground to draft and sign a statement together. That’s a remarkable result.
But Peter went on to do something far more impressive. He randomly assigned some pairs to read another version of the same article, which led 100 percent of them to generate and sign a joint statement about abortion laws.
That version of the article featured the same information but presented it differently. Instead of describing the issue as a black-and-white disagreement between two sides, the article framed the debate as a complex problem with many shades of gray, representing a number of different viewpoints.
At the turn of the millennium, the great hope for the internet was that it would expose us to different views. But as the web welcomed a few billion fresh voices and vantage points into the conversation, it also became a weapon of misinformation and disinformation. By the 2016 elections, as the problem of political polarization became more extreme and more visible, the solution seemed obvious to me. We needed to burst filter bubbles in our news feeds and shatter echo chambers in our networks. If we could just show people the other side of an issue, they would open their minds and become more informed. Peter’s research challenges that assumption.
We now know that where complicated issues are concerned, seeing the opinions of the other side is not enough. Social media platforms have exposed us to them, but they haven’t changed our minds. Knowing another side exists isn’t sufficient to leave preachers doubting whether they’re on the right side of morality, prosecutors questioning whether they’re on the right side of the case, or politicians wondering whether they’re on the right side of history. Hearing an opposing opinion doesn’t necessarily motivate you to rethink your own stance; it makes it easier for you to stick to your guns (or your gun bans). Presenting two extremes isn’t the solution; it’s part of the polarization problem.
Psychologists have a name for this: binary bias. It’s a basic human tendency to seek clarity and closure by simplifying a complex continuum into two categories. To paraphrase the humorist Robert Benchley, there are two kinds of people: those who divide the world into two kinds of people, and those who don’t.
An antidote to this proclivity is complexifying: showcasing the range of perspectives on a given topic. We might believe we’re making progress by discussing hot-button issues as two sides of a coin, but people are actually more inclined to think again if we present these topics through the many lenses of a prism. To borrow a phrase from Walt Whitman, it takes a multitude of views to help people realize that they too contain multitudes.
A dose of complexity can disrupt overconfidence cycles and spur rethinking cycles. It gives us more humility about our knowledge and more doubts about our opinions, and it can make us curious enough to discover information we were lacking. In Peter’s experiment, all it took was framing gun control not as an issue with only two extreme positions but rather as one involving many interrelated dilemmas. As journalist Amanda Ripley describes it, the gun control article “read less like a lawyer’s opening statement and more like an anthropologist’s field notes.” Those field notes were enough to help pro-life and pro-choice advocates find some areas of agreement on abortion in only twenty minutes.
The article didn’t just leave people open to rethinking their views on abortion; it also prompted them to reconsider their positions on other divisive issues like affirmative action and the death penalty.* If people read the binary version of the article, they defended their own perspective more often than they showed an interest in their opponent’s. If they read the complexified version, they made about twice as many comments about common ground as about their own views. They asserted fewer opinions and asked more questions. At the end of the conversation, they generated more sophisticated, higher-quality position statements—and both parties came away more satisfied.
For a long time, I struggled with how to handle politics in this book. I don’t have any silver bullets or simple bridges across a widening gulf. I don’t really even believe in political parties. As an organizational psychologist, I want to vet candidates’ leadership skills before I worry about their policy positions. As a citizen, I believe it’s my responsibility to form an independent opinion on each issue. Eventually, I decided that the best way to stay above the fray was to explore the moments that affect us all as individuals: the charged conversations we have in person and online.
Resisting the impulse to simplify is a step toward becoming more argument literate. Doing so has profound implications for how we communicate about polarizing issues. In the traditional media, it can help journalists open people’s minds to uncomfortable facts. On social media, it can help all of us have more productive Twitter tiffs and Facebook fights. At family gatherings, it might not land you on the same page as your least favorite uncle, but it could very well prevent a seemingly innocent conversation from exploding into an emotional inferno. And in discussions of policies that affect all of our lives, it might bring us better, more practical solutions sooner. That’s what this section of the book is about: applying rethinking to different parts of our lives, so that we can keep learning at every stage.
[Cartoon: Non Sequitur © 2016 Wiley Ink, Inc. Dist. by Andrews McMeel Syndication. Reprinted with permission. All rights reserved.]
In 2006, Al Gore starred in a blockbuster film on climate change, An Inconvenient Truth. It won the Academy Award for Best Documentary and spawned a wave of activism, motivating businesses to go green and governments to pass legislation and sign landmark agreements to protect the planet. History teaches us that it sometimes takes a combination of preaching, prosecuting, and politicking to fuel that kind of dramatic swing.
Yet by 2018, only 59 percent of Americans saw climate change as a major threat—and 16 percent believed it wasn’t a threat at all. Across many countries in Western Europe and Southeast Asia, higher percentages of the populations had opened their minds to the evidence that climate change is a dire problem. In the past decade in the United States, beliefs about climate change have hardly budged.
This thorny issue is a natural place to explore how we can bring more complexity into our conversations. Fundamentally, that involves drawing attention to the nuances that often get overlooked. It starts with seeking and spotlighting shades of gray.
A fundamental lesson of desirability bias is that our beliefs are shaped by our motivations. What we believe depends on what we want to believe. Emotionally, it can be unsettling for anyone to admit that all life as we know it might be in danger, but Americans have some additional reasons to be dubious about climate change. Politically, climate change has been branded in the United States as a liberal issue; in some conservative circles, merely acknowledging the fact that it might exist puts people on a fast track to exile. There’s evidence that higher levels of education predict heightened concern about climate change among Democrats but dampened concern among Republicans. Economically, we remain confident that America will be more resilient in response to a changing climate than most of the world, and we’re reluctant to sacrifice our current ways of achieving prosperity. These deep-seated beliefs are hard to change.
As a psychologist, I want to zoom in on another factor. It’s one we can all control: the way we communicate about climate change. Many people believe that preaching with passion and conviction is necessary for persuasion. A clear example is Al Gore. When he narrowly lost the U.S. presidential election in 2000, one of the knocks against him was his energy—or lack thereof. People called him dry. Boring. Robotic. Fast-forward a few years: his film was riveting and his own platform skills had evolved dramatically. In 2016, when I watched Gore speak in the red circle at TED, his language was vivid, his voice pulsated with emotion, and his passion literally dripped off him in the form of sweat. If a robot was ever controlling his brain, it short-circuited and left the human in charge. “Some still doubt that we have the will to act,” he boomed, “but I say the will to act is itself a renewable resource.” The audience erupted in a standing ovation, and afterward he was called the Elvis of TED. If it’s not his communication style that’s failing to reach people, what is?
At TED, Gore was preaching to the choir: his audience was heavily progressive. For audiences with more varied beliefs, his language hasn’t always resonated. In An Inconvenient Truth, Gore contrasted the “truth” with claims made by “so-called skeptics.” In a 2010 op-ed, he contrasted scientists with “climate deniers.”
This is binary bias in action. It presumes that the world is divided into two sides: believers and nonbelievers. Only one side can be right, because there is only one truth. I don’t blame Al Gore for taking that position; he was presenting rigorous data and representing the consensus of the scientific community. Because he was a recovering politician, seeing two sides to an issue must have been second nature. But when the only available options are black and white, it’s natural to slip into a mentality of us versus them and to focus on the sides over the science. For those on the fence, when forced to choose a side, the emotional, political, and economic pressures tilt in favor of disengaging or dismissing the problem.
To overcome binary bias, a good starting point is to become aware of the range of perspectives across a given spectrum. Polls suggest that on climate change, there are at least six camps of thought. Believers represent more than half of Americans, but some are concerned while others are alarmed. The so-called nonbelievers actually range from cautious to disengaged to doubtful to dismissive.
It’s especially important to distinguish skeptics from deniers. Skeptics have a healthy scientific stance: They don’t believe everything they see, hear, or read. They ask critical questions and update their thinking as they gain access to new information. Deniers are in the dismissive camp, locked in preacher, prosecutor, or politician mode: They don’t believe anything that comes from the other side. They ignore or twist facts to support their predetermined conclusions. As the Committee for Skeptical Inquiry put it in a plea to the media, skepticism is “foundational to the scientific method,” whereas denial is “the a priori rejection of ideas without objective consideration.”*
The complexity of this spectrum of beliefs is often missing from coverage of climate change. Although no more than 10 percent of Americans are dismissive of climate change, it’s these rare deniers who get the most press. In an analysis of some hundred thousand media articles on climate change between 2000 and 2016, prominent climate contrarians received disproportionate coverage: they were featured 49 percent more often than expert scientists. As a result, people end up overestimating how common denial is—which in turn makes them more hesitant to advocate for policies that protect the environment. When the middle of the spectrum is invisible, the majority’s will to act vanishes with it. If other people aren’t going to do anything about it, why should I bother? When people become aware of just how many others are concerned about climate change, they’re more prepared to do something about it.
As consumers of information, we have a role to play in embracing a more nuanced point of view. When we’re reading, listening, or watching, we can learn to recognize complexity as a signal of credibility. We can favor content and sources that present many sides of an issue rather than just one or two. When we come across simplifying headlines, we can fight our tendency to accept binaries by asking what additional perspectives are missing between the extremes.
This applies when we’re the ones producing and communicating information, too. New research suggests that when journalists acknowledge the uncertainties around facts on complex issues like climate change and immigration, it doesn’t undermine their readers’ trust. And multiple experiments have shown that when experts express doubt, they become more persuasive. When someone knowledgeable admits uncertainty, it surprises people, and they end up paying more attention to the substance of the argument.
Of course, a potential challenge of nuance is that it doesn’t seem to go viral. Attention spans are short: we have only a few seconds to capture eyeballs with a catchy headline. It’s true that complexity doesn’t always make for good sound bites, but it does seed great conversations. And some journalists have found clever ways to capture it in few words.
A few years ago, the media reported on a study of the cognitive consequences of coffee consumption. Although their headlines were drawn from the same data, some newspapers praised the benefits of coffee, while other outlets warned about the costs.
The actual study showed that older adults who drank a daily cup or two of coffee had a lower risk of mild cognitive impairment, relative to abstainers, occasional consumers, and heavier consumers. If they increased their consumption by another cup or more per day, they had a higher risk than those who stayed at or below a single cup a day. Each of the one-sided headlines took seven to twelve words to mislead the reader about the effects of drinking coffee. A more accurate headline needed just twelve words to serve up a jolt of instant complexity.
Imagine if even this kind of minimal nod to complexity appeared in articles on climate change. Scientists overwhelmingly agree about its human causes, but even they have a range of views on the actual effects—and the potential remedies. It’s possible to be alarmed about the situation while recognizing the variety of ways to improve it.*
Psychologists find that people will ignore or even deny the existence of a problem if they’re not fond of the solution. Liberals were more dismissive of the issue of intruder violence when they read an argument that strict gun control laws could make it difficult for homeowners to protect themselves. Conservatives were more receptive to climate science when they read about a green technology policy proposal than about an emissions restriction proposal.
Featuring shades of gray in discussions of solutions can help to shift attention from why climate change is a problem to how we can do something about it. As we’ve seen from the evidence on the illusion of explanatory depth, asking “how” tends to reduce polarization, setting the stage for more constructive conversations about action. Here are examples of headlines in which writers have hinted at the complexity of the solutions:
I WORK IN THE ENVIRONMENTAL MOVEMENT. I DON’T CARE IF YOU RECYCLE
CAN PLANTING A TRILLION TREES STOP CLIMATE CHANGE? SCIENTISTS SAY IT’S A LOT MORE COMPLICATED
If you want to get better at conveying complexity, it’s worth taking a close look at how scientists communicate. One key step is to include caveats. It’s rare that a single study or even a series of studies is conclusive. Researchers typically feature multiple paragraphs about the limitations of each study in their articles. We see them less as holes in our work and more as portholes to future discoveries. When we share the findings with nonscientists, though, we sometimes gloss over these caveats.
That’s a mistake, according to recent research. In a series of experiments, psychologists demonstrated that when news reports about science included caveats, they succeeded in capturing readers’ interest and keeping their minds open. Take a study suggesting that a poor diet accelerates aging. Readers were just as engaged in the story—but more flexible in their beliefs—when it mentioned that scientists remained hesitant to draw strong causal conclusions given the number of factors that can affect aging. It even helped just to note that scientists believed more work needed to be done in this area.
We can also convey complexity by highlighting contingencies. Every empirical finding raises unanswered questions about when and where results will be replicated, nullified, or reversed. Contingencies are all the places and populations where an effect may change.
Consider diversity: although headlines often say “Diversity is good,” the evidence is full of contingencies. Although diversity of background and thought has the potential to help groups think more broadly and process information more deeply, that potential is realized in some situations but not others. New research reveals that people are more likely to promote diversity and inclusion when the message is more nuanced (and more accurate): “Diversity is good, but it isn’t easy.”* Acknowledging complexity doesn’t make speakers and writers less convincing; it makes them more credible. It doesn’t lose viewers and readers; it maintains their engagement while stoking their curiosity.
In social science, rather than cherry-picking information to fit our existing narratives, we’re trained to ask whether we should rethink and revise those narratives. When we find evidence that doesn’t fit neatly into our belief systems, we’re expected to share it anyway.* In some of my past writing for the public, though, I regret not having done enough to emphasize areas where evidence was incomplete or conflicting. I sometimes shied away from discussing mixed results because I didn’t want to leave readers confused. Research suggests that many writers fall into the same trap, caught up in trying to “maintain a consistent narrative rather than an accurate record.”
A fascinating example is the divide around emotional intelligence. On one extreme is Daniel Goleman, who popularized the concept. He preaches that emotional intelligence matters more for performance than cognitive ability (IQ) and accounts for “nearly 90 percent” of success in leadership jobs. At the other extreme is Jordan Peterson, writing that “There is NO SUCH THING AS EQ” and prosecuting emotional intelligence as “a fraudulent concept, a fad, a convenient band-wagon, a corporate marketing scheme.”
Both men hold doctorates in psychology, but neither seems particularly interested in creating an accurate record. If Peterson had bothered to read the comprehensive meta-analyses of studies spanning nearly two hundred jobs, he’d have discovered that—contrary to his claims—emotional intelligence is real and it does matter. Emotional intelligence tests predict performance even after controlling for IQ and personality. If Goleman hadn’t ignored those same data, he’d have learned that if you want to predict performance across jobs, IQ is more than twice as important as emotional intelligence (which accounts for only 3 to 8 percent of performance).
I think they’re both missing the point. Instead of arguing about whether emotional intelligence is meaningful, we should be focusing on the contingencies that explain when it’s more and less consequential. It turns out that emotional intelligence is beneficial in jobs that involve dealing with emotions, but less relevant—and maybe even detrimental—in work where emotions are less central. If you’re a real estate agent, a customer service representative, or a counselor, being skilled at perceiving, understanding, and managing emotions can help you support your clients and address their problems. If you’re a mechanic or an accountant, being an emotional genius is less useful and could even become a distraction. If you’re fixing my car or doing my taxes, I’d rather you didn’t pay too much attention to my emotions.
In an effort to set the record straight, I wrote a short LinkedIn post arguing that emotional intelligence is overrated. I did my best to follow my own guidelines for complexity:
Nuance: This isn’t to say that emotional intelligence is useless.
Caveats: As better tests of emotional intelligence are designed, our knowledge may change.
Contingencies: For now, the best available evidence suggests that emotional intelligence is not a panacea. Let’s recognize it for what it is: a set of skills that can be beneficial in situations where emotional information is rich or vital.
Over a thousand comments poured in, and I was pleasantly surprised that many reacted enthusiastically to the complexified message. Some mentioned that nothing is either/or and that data can help us reexamine even our closely held beliefs. Others were downright hostile. They turned a blind eye to the evidence and insisted that emotional intelligence was the sine qua non of success. It was as if they belonged to an emotional intelligence cult.
From time to time I’ve run into idea cults—groups that stir up a batch of oversimplified intellectual Kool-Aid and recruit followers to serve it widely. They preach the merits of their pet concept and prosecute anyone who calls for nuance or complexity. In the area of health, idea cults defend detox diets and cleanses long after they’ve been exposed as snake oil. In education, there are idea cults around learning styles—the notion that instruction should be tailored to each student’s preference for learning through auditory, visual, or kinesthetic modes. Some teachers are determined to tailor their instruction accordingly despite decades of evidence that although students might enjoy listening, reading, or doing, they don’t actually learn better that way. In psychology, I’ve inadvertently offended members of idea cults when I’ve shared evidence that meditation isn’t the only way to prevent stress or promote mindfulness; that when it comes to reliability and validity, the Myers-Briggs personality tool falls somewhere between a horoscope and a heart monitor; and that being more authentic can sometimes make us less successful. If you find yourself saying ____ is always good or ____ is never bad, you may be a member of an idea cult. Appreciating complexity reminds us that no behavior is always effective and that all cures have unintended consequences.
[Cartoon: xkcd.com]
In the moral philosophy of John Rawls, the veil of ignorance asks us to judge the justice of a society by whether we’d join it without knowing our place in it. I think the scientist’s veil of ignorance is to ask whether we’d accept the results of a study based on the methods involved, without knowing what the conclusion will be.
In polarized discussions, a common piece of advice is to take the other side’s perspective. In theory, putting ourselves in another person’s shoes enables us to walk in lockstep with them. In practice, though, it’s not that simple.
In a pair of experiments, randomly assigning people to reflect on the intentions and interests of their political opposites made them less receptive to rethinking their own attitudes on health care and universal basic income. Across twenty-five experiments, imagining other people’s perspectives failed to elicit more accurate insights—and occasionally made participants more confident in their own inaccurate judgments. Perspective-taking consistently fails because we’re terrible mind readers. We’re just guessing.
If we don’t understand someone, we can’t have a eureka moment by imagining his perspective. Polls show that Democrats underestimate the number of Republicans who recognize the prevalence of racism and sexism—and Republicans underestimate the number of Democrats who are proud to be Americans and oppose open borders. The greater the distance between us and an adversary, the more likely we are to oversimplify their actual motives and invent explanations that stray far from their reality. What works is not perspective-taking but perspective-seeking: actually talking to people to gain insight into the nuances of their views. That’s what good scientists do: instead of drawing conclusions about people based on minimal clues, they test their hypotheses by striking up conversations.
For a long time, I believed that the best way to make those conversations less polarizing was to leave emotions out of them. If only we could keep our feelings off the table, we’d all be more open to rethinking. Then I read evidence that complicated my thinking.
It turns out that even if we disagree strongly with someone on a social issue, when we discover that she cares deeply about the issue, we trust her more. We might still dislike her, but we see her passion for a principle as a sign of integrity. We reject the belief but grow to respect the person behind it.
It can help to make that respect explicit at the start of a conversation. In one experiment, if an ideological opponent merely began by acknowledging that “I have a lot of respect for people like you who stand by their principles,” people were less likely to see her as an adversary—and showed her more generosity.
When Peter Coleman brings people together in his Difficult Conversations Lab, he plays them the recording of their discussions afterward. What he wants to learn is how they were feeling, moment by moment, as they listened to themselves. After studying over five hundred of these conversations, he found that the unproductive ones feature a more limited set of both positive and negative emotions. People get trapped in emotional simplicity, with one or two dominant feelings.
The productive conversations, by contrast, cover a much more varied spectrum of emotions. They’re not less emotional—they’re more emotionally complex. At one point, people might be angry about the other person’s views, but by the next minute they’re curious to learn more. Soon they could be shifting into anxiety and then excitement about considering a new perspective. Sometimes they even stumble into the joy of being wrong.
In a productive conversation, people treat their feelings as a rough draft. Like art, emotions are works in progress. It rarely serves us well to frame our first sketch. As we gain perspective, we revise what we feel. Sometimes we even start over from scratch.
What stands in the way of rethinking isn’t the expression of emotion; it’s a restricted range of emotion. So how do we infuse our charged conversations with greater emotional variety—and thereby greater potential for mutual understanding and rethinking?
It helps to remember that we can fall victim to binary bias with emotions, not only with issues. Just as the spectrum of beliefs on charged topics is much more complex than two extremes, our emotions are often more mixed than we realize.* If you come across evidence that you might be wrong about the best path to gun safety, you can simultaneously feel upset by and intrigued with what you’ve learned. If you feel wronged by someone with a different set of beliefs, you can be simultaneously angry about your past interactions and hopeful about a future relationship. If someone says your actions haven’t lived up to your antiracist rhetoric, you can experience both defensiveness (I’m a good person!) and remorse (I could’ve done a lot more).
In the spring of 2020, a Black man named Christian Cooper was bird-watching in Central Park when a white woman walked by with her dog. He respectfully asked her to put the dog on a leash, as the nearby signs required. When she refused, he stayed calm and started filming her on his phone. She responded by informing him that she was going to call the police and “tell them there’s an African American man threatening my life.” She went on to do exactly that with a 911 operator.
When the video of the encounter went viral, the continuum of emotional reactions on social media rightfully spanned from moral outrage to sheer rage. The incident called to mind a painful history of false criminal accusations made against Black men by white women, which often ended with devastating consequences. It was appalling that the woman didn’t leash her dog—and her prejudice.
“I’m not a racist. I did not mean to harm that man in any way,” the woman declared in her public apology. “I think I was just scared.” Her simple explanation overlooks the complex emotions that fueled her actions. She could have stopped to ask why she had been afraid—what views about Black men had led her to feel threatened in a polite conversation? She could have paused to consider why she had felt entitled to lie to the police—what power dynamics had made her feel this was acceptable?
Her simple denial overlooks the complex reality that racism is a function of our actions, not merely our intentions. As historian Ibram X. Kendi writes, “Racist and antiracist are not fixed identities. We can be a racist one minute and an antiracist the next.” Humans, like polarizing issues, rarely come in binaries.
When asked whether he accepted her apology, Christian Cooper refused to make a simple judgment, offering a nuanced assessment:
I think her apology is sincere. I’m not sure if in that apology she recognizes that while she may not be or consider herself a racist, that particular act was definitely racist. . . .
Granted, it was a stressful situation, a sudden situation, maybe a moment of spectacularly poor judgment, but she went there. . . .
Is she a racist? I can’t answer that—only she can answer that . . . going forward with how she conducts herself, and how she chooses to reflect on the situation and examine it.
By expressing his mixed emotions and his uncertainty about how to judge the woman, Christian signaled his willingness to rethink the situation and encouraged others to rethink their own reactions. You might even be experiencing some complex emotions as you read this.
It shouldn’t be up to the victim to inject complexity into a difficult conversation. Rethinking should start with the offender. If the woman had taken responsibility for reevaluating her beliefs and behaviors, she might have become an example to others who recognized a bit of themselves in her reaction. Although she couldn’t change what she’d already done, by recognizing the complex power dynamics that breed and perpetuate systemic racism, she might have spurred deeper discussions of the range of possible steps toward justice.
Charged conversations cry out for nuance. When we’re preaching, prosecuting, or politicking, the complexity of reality can seem like an inconvenient truth. In scientist mode, it can be an invigorating truth—it means there are new opportunities for understanding and for progress.