Chapter 8

Rational Communication and the Backfire Effect

Imagine you’ve emailed someone to propose a phone meeting about the Pro-Truth Pledge and the Pro-Truth Movement as a whole. The person has expressed enthusiasm, and you’ve set up the call. After exchanging pleasantries and a bit of small talk, you get into the subject of the meeting. The person you’re trying to convince to join the movement starts the conversation by talking about his perspective on “truth in politics.” He discusses the 2016 US presidential election and says: “Trump was the one candidate that was speaking the truth that a lot of people in this country did not want to hear.”

How would you feel? Would you be aghast? Would you start arguing with that person, trying to convince him that, in reality, Trump had lied way more than Clinton? After all, the facts would be on your side about Trump’s lies. You could cite so many articles and analyses, such as the well-regarded fact-checking column of the Washington Post, which gave Trump a rating of Four Pinocchios (meaning “totally false”) for 57 of 91 statements by Trump that the column checked (that is, 63 percent).
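
For readers who like to verify the arithmetic, here is a quick check of that percentage; the snippet is just an illustration written in Python, not part of the Post’s methodology:

# The fact-checker tally cited above: 57 of 91 checked statements
# rated Four Pinocchios works out to about 63 percent.
four_pinocchios = 57
total_checked = 91
print(f"{four_pinocchios / total_checked:.0%}")  # prints 63%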

It’s tempting to throw numbers at the person you’re talking to in an effort to convince them head-on that they are misinformed. However, behavioral science research suggests that this doesn’t work. In fact, it’s exactly the wrong thing to do. Too often, this approach triggers the backfire effect. When people are presented with information that challenges their beliefs, they tend to defend their positions, and by arguing for them they become more convinced their original beliefs are true. This has huge implications for anyone seeking to engage others in political discussions, especially across the partisan divide. This finding may seem counterintuitive to you, as it did to me when I first learned about it. However, the more I recalled discussions I’d had with people before learning about this research, the more I saw the point. Leading with the facts failed to convince most of my conversation partners. So how do you deal with people who express beliefs that are clearly at odds with the facts?

Let’s go back to our example, which is close to a real situation in which I found myself. That real situation arose in a much higher-stakes environment than a simple phone call: the start of my interview on The Douglas Coleman Show. Douglas Coleman has a sizable social media following: over 33,000 followers on Twitter and more than 17,000 on Facebook (as of July 2017). While I’ve done several radio and podcast interviews with conservatives, I always knew they were conservatives, and I prepared for that in advance. However, Coleman’s social media and bio descriptions offered no clear advance signal of his Trump-friendly perspective. So you can bet I was surprised by Coleman’s claims of Trump’s truthfulness.

What happened in the interview? Did it deteriorate into a shouting match? Well, no, though perhaps it could have. I didn’t start throwing the facts about Trump’s lies at Coleman. Instead, I quickly switched gears into a different mode of engagement and sought emotionally intelligent communication. Validating Coleman’s emotions and mirroring his wording while subtly redirecting our conversation, I looked for and named common ground, and suggested goals I thought we might share. I laid out the Pro-Truth Pledge, and described the Pro-Truth Movement as an effective way forward to fight lies and promote truth in politics. By the end of our conversation, Coleman committed to the Pro-Truth Pledge. He did so even though he must have realized that the pledge has the potential to harm the political prospects of Trump and other politicians he supports who rely heavily on post-truth methods. Soon after our interview, he tweeted about it to his Twitter followers (see Figure 8.1).

Fig. 8.1 Douglas Coleman’s tweet about taking the Pro-Truth Pledge (courtesy of Twitter)

This chapter lays out the science and the tactics that can help you communicate rationally and avoid the backfire effect with people who express beliefs at odds with the facts, as long as you are able to find some overarching goals or values that you share with the other person. By communicating rationally, I do not mean communication that relies solely on reason or logic. Instead, I use the behavioral science definition of rational communication: communication that uses accurate information to make wise decisions about how to communicate, and thereby to reach your goals by getting your message across.

The Science Behind the Backfire Effect

The backfire effect stems from confirmation bias: the tendency to look for and interpret information in ways that accord with one’s existing beliefs. When confronted with information that conflicts with those beliefs, rather than sit with the discomfort, people search for a way to prove themselves right. And the more intensely they are forced to defend their original position, the more strongly committed they feel to it after the argument.

The research that brought this phenomenon to light was a study by Brendan Nyhan and Jason Reifler (reported in their 2010 Political Behavior article). They were seeking to learn whether providing corrective information to people who hold incorrect beliefs would reduce their misperceptions. Specifically, Nyhan and Reifler wanted to study the most typical way that people receive corrective information—in news reports that present both sides of an issue. Unfortunately, as Chapter 4 explained, news reports usually follow the journalistic practice of false equivalence: treating the two sides of a controversy as equal, even if one side has all the facts correct, and the other side merely presents misinformation. To mimic this real-life situation, the experimenters provided the study participants with mock newspaper articles that contained statements from political figures reinforcing misperceptions on three hot policy issues: 1) weapons of mass destruction in Iraq prior to the US invasion of Iraq in 2003; 2) federal policy on stem cell research; 3) the effects of tax cuts on tax revenue. Some of the articles included a corrective statement immediately following the misleading statement from a politician; other articles did not.

For instance, one of the news articles had a misleading quote from George W. Bush that might lead readers to believe that Iraq had a huge weapons-of-mass-destruction (WMD) program before the invasion, but hid or destroyed the weapons right before the arrival of US forces. In some of the articles, the quote was followed by corrective information in the form of the Duelfer Report (the highly authoritative report from the Iraq Survey Group, organized by the Pentagon and Central Intelligence Agency to hunt for WMD in Iraq, which found only a very small quantity of time-expired chemical weapons that posed a negligible military threat). The study asked participants who had read the article whether they agreed with the following statement: “Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived.”

Study participants were not asked for a yes-or-no answer, but for an answer on a scale ranging from “strongly disagree” (1) to “strongly agree” (5). On average, the corrective statement did not cause a statistically significant change in incorrect beliefs.

However, the researchers also divided their subject pool by ideological beliefs, using a 7-point scale ranging from strongly liberal (-3) to strongly conservative (3). They found that providing corrective information to subjects who held strong ideological beliefs did have a statistically significant impact. Providing very liberal subjects (-3) with corrective statements that accorded with their ideological perspective (in this case, the information from the Pentagon and CIA that Iraq did not have large stockpiles of WMD) resulted in a statistically significant reduction of incorrect beliefs. Those who described themselves as liberal (-2), slightly liberal (-1), or centrist (0) showed no statistically significant belief change. What happened with those who identified as slightly conservative (1), conservative (2), or strongly conservative (3) was the most surprising finding, and perhaps the most depressing for those who believe that explaining the facts helps people correct their opinions. When provided with clear and authoritative corrective information from highly credible sources that went against their ideological perspective, conservatives in the study actually expressed a stronger belief that Iraq had a large WMD program and stockpile than did subjects in the control condition, who read only the misleading quote from Bush without the data from the Duelfer Report. The correction backfired, leaving beliefs further from the truth than no correction at all!
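
To make the shape of this finding concrete, here is a minimal sketch in Python of how one might tabulate such an experiment. The numbers are invented purely for illustration (they are not the study’s actual data), and the record layout and the mean_agreement helper are mine: group responses by ideology and by whether a correction was shown, then compare mean agreement with the misperception. A backfire appears wherever the corrected group agrees more than the uncorrected group.

# Toy tabulation of a Nyhan-Reifler-style design. Numbers are invented
# for illustration; see their 2010 Political Behavior article for the
# real data. Each record: (ideology, saw_correction, agreement), where
# ideology runs from -3 (strongly liberal) to +3 (strongly conservative)
# and agreement is the 1-5 scale described above.
from statistics import mean

responses = [
    (-3, False, 3.2), (-3, False, 3.0), (-3, True, 2.3), (-3, True, 2.1),
    ( 0, False, 3.1), ( 0, False, 2.9), ( 0, True, 3.0), ( 0, True, 3.0),
    ( 3, False, 3.3), ( 3, False, 3.1), ( 3, True, 4.0), ( 3, True, 4.2),
]

def mean_agreement(ideology: int, corrected: bool) -> float:
    """Mean 1-5 agreement with the misperception for one cell of the design."""
    return mean(a for i, c, a in responses if i == ideology and c == corrected)

for ideology in (-3, 0, 3):
    before = mean_agreement(ideology, corrected=False)
    after = mean_agreement(ideology, corrected=True)
    # A correction "backfires" when it raises, rather than lowers, agreement.
    label = "backfire" if after > before else "correction helps or is neutral"
    print(f"ideology {ideology:+d}: {before:.1f} -> {after:.1f} ({label})")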

The backfire effect thus shows the danger of just providing facts in the hope that doing so will cause people who hold ideologically motivated irrational beliefs to change their minds. Another experiment described in the same study found that liberals, too, are not very likely to change their minds when provided with accurate information that does not accord with their worldview, so we should not suppose that this phenomenon happens only with conservatives. However, other authors, such as Thomas Wood and Ethan Porter in a 2016 SSRN paper, found that conservatives tended to respond less readily than liberals to corrective information.

So, what do you do if you have corrective information and you don’t want to trigger the backfire effect in others? In the course of their research, Nyhan and Reifler also found some helpful insights on the backfire effect, and even some good news. In a manuscript they submitted for publication in 2015, they hypothesized that people reject corrections of their beliefs when they perceive the correction as a threat to their sense of personal identity, self-worth, and worldview. They also posited that compelling presentations of accurate information should reduce misinformation. Testing the latter hypothesis, they found that information presented in a visual format reduces misperceptions more than equivalent information presented as text, evidence for the efficacy of compelling visual imagery in encouraging accurate beliefs. A moment of reflection suggests why: when we take in new information via text, if we disagree and it makes us uncomfortable, our eyes can easily flit over the words without really taking them in. But when we see visual information, especially photographs or charts that show clear trends, we can’t help but take it in (though this is not the case for complex graphs and data sets).

Nyhan and Reifler tested this idea by evaluating whether self-affirmation would increase people’s ability to correct their mistaken beliefs. They hypothesized that starting with a self-affirmation exercise that supported people’s self-worth would reduce the sense of threat to their worldview and personal identity when corrective information was given to them. They hoped this approach would decrease the psychological costs of internalizing uncomfortable facts and updating beliefs. The experiment focused on evaluating people’s beliefs about whether the Iraq surge (the large increase in US troops by the Bush administration after the 2006 election) proved effective in decreasing civilian deaths and attacks by insurgents against the US-led coalition forces. Evidence shows that it did. The study participants were given information about the surge and its impact, and then asked to evaluate on a 5-point scale whether attacks “decreased substantially” (1) or “increased substantially” (5).

The specific intervention used by the researchers for self-affirmation asked participants to choose a value important to them from a list of values, and then write about some past experience when the value was “especially important to you and made you feel good about yourself.” This exercise might sound silly to some readers, but it worked. Since the surge resulted in better outcomes, subjects who had advocated for withdrawal had the most to lose by looking at the evidence. But if they also engaged in the affirmation intervention, they ended the experiment holding more accurate beliefs.

While this aspect of the study addressed liberal bias, since liberals tended to be most skeptical about the surge, the manuscript also described a similar intervention for self-identified Republicans who believed that “global warming is just a theory.” When asked to engage in self-affirmation, these conservatives proved more willing to update their beliefs and accept the reality of climate change.

Self-affirmations work best in situations where people already know of some facts that go against their incorrectly held beliefs; they work less well when people are presented with brand-new information that counters those beliefs. This is encouraging, since on most politically relevant issues in the real world, people do tend to know of some facts that contradict their beliefs. They usually just ignore those facts rather than try to square them with their worldview, which can feel uncomfortable. Thus, tactics that reduce the psychological costs of accepting uncomfortable factual information (even something as simple as making people feel good) have evidence-based backing. These links between identity, emotions, and the backfire effect have been confirmed by researchers other than Brendan Nyhan and Jason Reifler, as described in a 2016 article by Gregory Trevors et al. in Discourse Processes.

How You Can Engage in More Productive Political Conversations

First, resist the all-too-human temptation to convince the other person head-on that he or she is wrong. Remind yourself that when presented with facts that don’t fit their worldview, most people would rather fight than switch. You will mostly end up strengthening their commitment to their mistaken thinking.

Second, engage them on the level of values instead of information. Allow them to express not just what they believe, but something positive they did that affirmed their values. Increasing their sense of self-worth makes it easier for them to hear new information as less threatening. Keep in mind that if you demonstrate genuine respect for what the other person values, they are less likely to perceive you as a threat, and your new information as a hostile attack. If we’re honest with ourselves, isn’t that often what we do in political arguments: attempt to beat each other over the head with facts that support our case, in order to make the other person look wrong or uncaring? That’s a sure way to trigger the backfire effect.

Third, rather than put forth your new information aggressively, as a fact you know about the world, reframe your information as something you have heard from a reliable source—a source the other person would find reliable, too. Invite them to consider what this fact would mean if it were true: “So, how would this information fit with your current thinking, if it were true?” Reframing difficult-to-believe information as a hypothetical situation gives the other person a chance to consider the information and how it fits with their worldview without being threatened by it. If you can allow them the psychological space to reject it, without forcing them to argue the point, you are much less likely to trigger the backfire effect. And, as a valuable byproduct, you are more likely to have a civil conversation and a better relationship when you part company.

Three Rs to Avoid Triggering the Backfire Effect

Resist the urge to start with facts.

Respect the other person’s values, and ask them to talk about those instead.

Reframe your information in hypothetical terms, so the other person can consider it safely: “What difference might it make if this new information were true?”

Emotional and Social Intelligence

The research findings we just reviewed have wide relevance to the vital question of how best to handle conversations on controversial subjects. To sum up: the findings suggest that people are more likely to feel good about your interactions with them, and pay heed to what you have to tell them, if you communicate in a way that they don’t experience as threatening to their worldview, sense of identity, and self-worth. This should not come as a surprise. A lot of research on effective communication points the same way, using the concepts of emotional intelligence and social intelligence. In a classic 1937 article in Psychological Bulletin, Robert Thorndike and Saul Stein defined social intelligence as “the ability to understand and manage people.” Emotional intelligence, to quote Peter Salovey and John Mayer, who coined the term in a 1990 piece in Imagination, Cognition and Personality, refers to a particular kind of social intelligence that involves “the ability to monitor one’s own and others’ feelings and emotions, to discriminate among them and to use this information to guide one’s thinking and actions.” Daniel Goleman pulled together the extensive research on these topics into a comprehensive overview in his 1995 book, Emotional Intelligence: Why It Can Matter More Than IQ, and again in 2006 in another book, Social Intelligence: The New Science of Human Relationships.

Awareness and regulation of one’s own emotions represent one of the most fundamental skills included in the concept of emotional intelligence. When hearing statements that we do not like, we tend to perceive them as a threat: to our worldview, our identity, or maybe to a ‘tribal group’ with which we identify. Our brain goes into overdrive, triggering the release of cortisol (the stress hormone) and preparing us for a ‘fight or flight’ response. As a result, regardless of the accuracy of the statement itself, we tend to have either an aggressive response, such as arguing against it, or a defensive response, such as closing down the discussion and ignoring the unwelcome statement. When speaking to people who hold irrational beliefs, it’s highly likely that we’ll encounter these kinds of aggressive or defensive responses. Consequently, we need to prepare ourselves well for such conversations, keeping these likely patterns of behavior in mind. It also helps to be in a well-rested mental state: we’ll need mental alertness and nimbleness, and lots of willpower.

To succeed, we’ll need self-knowledge and self-discipline too. In practice this means we need to develop skills at being fully aware of our own experience of aggressive or defensive emotions when we hear statements that we do not like. The easiest way to accomplish that involves paying attention to the bodily sensations that usually accompany our perceptions of threat, and the consequent emotions: anxiety, frustration, anger, and resentment. Each person exhibits these emotions in different ways. For me, they involve a wave of heat rising up in my shoulders, a prickling sensation in my head, agitated movements and speech, impulsiveness, rapid breathing, increased sensitivity to sound, and—if intense enough—a strong sense of fatigue. Others report increased sweating, sensitivity to light and other forms of sensitivity, tapping and fidgeting, clenching hands, grinding teeth, growing headaches, and many other symptoms. Awareness of uncomfortable emotional body sensations is the first necessary step in addressing them.

When we notice these physical sensations, we need to deploy self-regulation techniques to calm down. Abundant research evidence shows that decisions made in an emotionally excited state tend to be poor ones. People take more risks, not always wise ones, and they don’t behave according to their own ideal vision of themselves. We’ve probably all experienced this at one time or another. (The research on this can be found in a 2006 article by Dan Ariely and George Loewenstein in the Journal of Behavioral Decision Making, and also a 2006 piece by Loewenstein and Jennifer Lerner in the Handbook of Affective Science.)

Most self-regulation techniques for addressing this problem aim to give us better conscious control of what we say and do. They typically work by delaying our responses so we have time to think, or by distracting us from the perceived threat. One good technique is deep breathing: take ten deep breaths, spending three seconds drawing each breath in and five seconds breathing it out.

Meditating is an excellent strategy (though it’s not something you can do in the heat of the moment): take 5 minutes to just close your eyes, sit in a relaxed position, and focus on your breath as it goes into and out of your body. For those who want to read up on the science behind meditation, I recommend Jon Kabat-Zinn’s 1990 Full Catastrophe Living: Using the Wisdom of Your Body and Mind to Face Stress, Pain, and Illness. Deep breathing and meditation work particularly well for those who experience defensive responses. Another helpful technique is focusing your attention directly on the uncomfortable sensation. Our natural tendency in dealing with physical and emotional discomfort is to divert our attention away from it, that is, take our mind off it. Although focusing directly on the sensation initially intensifies the discomfort, it can also “dissolve” the discomfort shortly thereafter, though this usually takes some training.

For those who feel aggression, a good technique is clenching your fists or toes and relaxing them: clench for three seconds and relax for another three seconds, ten times. Another way to work out your aggression is exercise: the punching bag in my basement gets a lot of use during the winter, and the weeds in my garden suffer in the summer. In face-to-face communication, or on the phone (as during my conversation with Douglas Coleman), deep breathing and clenching are accessible, while meditation and exercise are not; the latter two can easily be used with forms of communication, such as emails or social media discussions, that do not have to take place in “real time.” In fact, one of the gifts of the Internet is that you can write an email or text and then sleep on it. As a general rule, never hit the send button while you are angry.
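
For readers who find it easier to keep a rhythm with a prompt, here is a minimal sketch of a pacing timer for the two timed exercises above, written in Python as a toy illustration; the function names are mine, and the counts (ten breaths at three seconds in and five out, ten clench-and-release rounds at three seconds each) follow the ones given in the text:

# A simple pacing timer for the two self-regulation exercises above.
import time

def paced_breathing(reps: int = 10, inhale: int = 3, exhale: int = 5) -> None:
    """Ten deep breaths: three seconds in, five seconds out."""
    for i in range(1, reps + 1):
        print(f"breath {i}: inhale")
        time.sleep(inhale)
        print(f"breath {i}: exhale")
        time.sleep(exhale)

def clench_and_release(reps: int = 10, hold: int = 3, rest: int = 3) -> None:
    """Clench your fists or toes for three seconds, relax for three; ten times."""
    for i in range(1, reps + 1):
        print(f"round {i}: clench")
        time.sleep(hold)
        print(f"round {i}: relax")
        time.sleep(rest)

if __name__ == "__main__":
    paced_breathing()
    clench_and_release()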

While you may already be aware of your emotions, and capable of managing them to some degree, these are challenging skills to develop. To give ourselves the best chance of communicating effectively, we should assume that the other person—the one holding irrational beliefs—does not have those skills. They are likely to have strong and immediate defensive or aggressive responses when you make statements they don’t like.

So how do you deal with this situation? Be sensitive to the emotional charge your statements may carry for the other person, and find a way to communicate accurate information while avoiding the phrases that will provoke an immediate aggressive or defensive response. Yet in political conversations, people provoke each other this way again and again, right up to the highest levels of public office. When President Trump shut down the government in late 2018, insisting he would only reopen it when Congress authorized $5 billion to build a border wall, the Democratic leadership called the move a “temper tantrum” and the wall “immoral.” Trump, never one to shy away from name-calling, tweeted back that Speaker Nancy Pelosi was behaving “irrationally,” adding: “And by the way, clean up the streets in San Francisco, they are disgusting!” (@realDonaldTrump, January 20, 2019). Clearly, this approach ratchets up the emotions, drives people apart, and makes clearheaded communication next to impossible.

Avoiding emotionally charged language requires developing your empathy: your ability to understand how other people feel. Empathy is often confused with sympathy (sharing the other person’s feelings) and with compassion (caring about the misfortunes of other people), but it is neither. Empathy is indispensable for communicating effectively about emotionally charged topics in a way that helps people update their beliefs with truthful information, nudging their perceptions closer to reality.

A first step to developing empathy involves placing yourself, in imagination, in the other person’s shoes. Given the other person’s background and beliefs, how is that person likely to feel on hearing what you’re about to say? Might they feel any sense of threat to their worldview, identity, “tribe,” or perceptions of self-worth? In doing so, please be on your guard against the false consensus effect: our tendency to overestimate how far other people share our characteristics, beliefs, and emotions. Kathleen Bauman and Glenn Geher showed how significant this effect is in a 2002 article in Current Psychology, as did Gary Marks and Norman Miller earlier, in a 1987 Psychological Bulletin piece. Try hard to avoid making assumptions; instead, use curiosity and gentle questions to explore the perspectives the people you’re speaking with have on the topic at hand. But don’t ask so many questions that they feel probed or pressured.

To help you imagine how the other person might feel, consider reading some books describing the worldview and self-identity of people who hold differing ideologies. Two good research-based books to start with are Jonathan Haidt’s 2012 The Righteous Mind: Why Good People Are Divided by Politics and Religion and George Lakoff’s 1997 Moral Politics: What Conservatives Know That Liberals Don’t. When engaging in face-to-face conversations with people who hold differing ideologies, pay particularly close attention to signs of negative emotions in their facial expressions. We all have some idea of what these look like, but we can learn quite a lot more from those who’ve studied the subject systematically; Paul Ekman’s 2003 Emotions Revealed: Recognizing Faces and Feelings to Improve Communication and Emotional Life provides a wealth of valuable insights. With those insights in mind, here are some pointers: on the phone, strive to note changes in tone; in online discussions, pay attention to cues such as changing patterns of engagement, including the use of emojis and the length and timing of comments.

Our ability to understand other people’s feelings is often hindered by the interpersonal empathy gap: a cognitive bias that leads us to misjudge the intensity of other people’s emotions, usually by underestimating it. Leaf Van Boven and colleagues demonstrated, in a 2013 article in Advances in Experimental Social Psychology, that people typically underestimate the impact of emotions in influencing other people’s judgments, as well as their own. We typically don’t sufficiently appreciate the pain of social suffering, or its effect on outcomes. The interpersonal empathy gap is especially problematic from the perspective of the backfire effect: if we don’t appreciate how distressed or provoked somebody feels when force-fed information that’s contrary to their prior beliefs, we’re unlikely to succeed in modifying those beliefs.

An article in Social Cognitive and Affective Neuroscience by Jennifer Gutsell and Michael Inzlicht, published in 2011, showed that people tend to show less spontaneous understanding of other people’s sadness if they perceive those people not as “one of us” but as members of an out-group. Gutsell and Inzlicht measured the brainwaves (electroencephalographic alpha oscillations) of study participants while these participants observed those they perceived as part of their in-group or out-group, which mostly meant people they liked and people they did not. When observing in-group members who expressed sadness, participants’ brainwaves showed strong activation patterns, suggesting that their brains were paying attention to the signs that the in-group members were sad. But when observing out-group members, subjects showed much weaker activation. That’s not a surprise: most of us care more about the feelings of close family, close friends, or close colleagues. If, however, we want to have meaningful communication with people who neither share our views nor belong to our in-group, the research suggests we need to focus especially hard on empathizing. This will often be the case when conversing with people who hold irrational beliefs (that is, beliefs that we consider irrational).

Once you understand the other person’s feelings, the next vital step is to convey your message in an emotionally and socially intelligent manner. One of the best works on this subject that’s grounded in serious research, Marshall Rosenberg’s 2003 Nonviolent Communication: A Language of Life, offers a wealth of valuable guidelines and suggestions. One suggestion is to echo what the other person says: paraphrase what they have said in a short summary, using your own words. That shows the other person you’re respectfully paying attention to what they have to say. As part of echoing, aim also to echo the other person’s emotions, whether or not the person states them explicitly: for example, you might say, “I can see you’re very concerned that such-and-such an outcome might occur.”

Doing so helps the other person feel emotionally validated and heard. Remember, while the person’s beliefs may be irrational, their emotions about those beliefs are real: we have to understand them and act according to that understanding. By showing that you heard and understood the person’s ideas and beliefs, while also acknowledging the reality of their emotions, you help them feel respected and good about the exchange, and you reduce the risk that what you have to say will be perceived as a threat. Doing so is helpful, and often indispensable, for connecting with others and helping them update their understanding by means of truthful information.

To be clear, by “echoing” Rosenberg doesn’t mean endorsing, or appearing to endorse, false beliefs the other person may hold. If they say “people have not contributed significantly to global warming,” agreeing would go against reality and might reinforce that person’s state of misinformation. However, you might say, “I understand: you’re skeptical about the scientific evidence that industrial production is causing climate change,” thus showing them that you heard their message, without stating in any way that their message accurately reflects reality. Then you can, for example, inquire into why they would find it problematic if industrial production did lead to global climate change. At that stage, perhaps you would uncover that their underlying fears stem from economic security concerns: they might not want carbon emission limits to undercut job growth. Armed with that information about their emotional concerns, you can have a much healthier conversation, for example about the benefits of green energy.

Echoing has many benefits. It helps address the illusion of transparency, our tendency to believe that other people understand us better than they really do (as Thomas Gilovich and colleagues showed in 1998, in an article in the Journal of Personality and Social Psychology). A related error, called the curse of knowledge, refers to the fact that most of us often forget that others do not have the knowledge and understanding that we do (described by Susan Birch in a 2005 piece in Current Directions in Psychological Science). This curse affects both the content of our interactions and the skills needed for effective communication: overestimating what others already know and understand can lead us to leave out the context they need to grasp our perspective and the content we are trying to convey. Because of all this, it’s better to over-communicate than to under-communicate, both about your message and about the context behind it.

It’s best to assume that your conversational partner does not possess the rational communication skills described and advocated here. You’ll need to assume responsibility for the unspoken interaction between you and your conversational partner, such as what influences whether your conversational partner can comfortably pay attention to what you have to say. Psychologists call such “unspoken interaction” (“unspoken” because it’s implicit, not explicit, in what’s said in the conversation) the meta-conversation.

In addition to echoing, you’ll need to convey your own emotions to the other person so as to help them be emotionally attuned to you. Success here leads them to perceive you as authentic and to open up to you, becoming vulnerable and willing to listen, really listen. Conveying your own emotions to other people involves a broad skill set usually called charisma, or personal magnetism. Charisma is often seen as a talent that you either have or don’t have. However, a lot of research shows that such magnetism mostly involves a range of strategies and techniques that you can master with study and practice. The research was well summarized by Olivia Fox Cabane in her book The Charisma Myth: How Anyone Can Master the Art and Science of Personal Magnetism, published in 2013. She advocates listening actively to the other person, learning to read their emotional state, echoing their points, and validating their emotions: methods we’ve already discussed. She offers a range of other useful tactics as well.

All of these ideas concern how the conversation is managed; other contributions focus more on what is said. Research by Chip and Dan Heath, described in their 2007 book Made to Stick: Why Some Ideas Survive and Others Die, points to the benefits of using stories that have emotional narratives and characters to which the other person (not you) can relate. For the same reason, it’s desirable to use language, and supporting points or examples, that will be comfortable for the person to whom you’re speaking. While the Heaths’ research focused on everyday life and the business environment, more recent research shows that similar considerations apply to politically sensitive conversations. An article in the journal Science by David Broockman and Joshua Kalla in 2016 showed that effective conversations can have long-term impacts on people’s views about politically sensitive issues such as transgender rights: ten-minute conversations with door-to-door canvassers trained in effective communication techniques reduced transphobia for at least three months.

Communicating the way that I’m recommending sounds like a lot of work, right? Well, yes, it is. I remember learning all of these skills one by one over time, and I still would certainly not consider myself a master at any of them. However, I’ve had a lot of such conversations, because I know they’re necessary and important. The techniques I’ve described in this chapter helped a lot: both the psychological research and my own experience of activism have convinced me of their effectiveness. To sum it up:

Put Empathy Before Information: Five Es to Help You Communicate Rationally

E1. Emotional states: Become aware of your emotional states and learn to manage them (especially aggression). Own them; don’t let them own you.

E2. Empathy: Tune in, and turn it on. Develop your capacity to tune in to what others feel, and learn to turn it on especially when speaking with those who are not of your “tribe.”

E3. Eschew emotionally charged utterances. Use your empathy to avoid statements that will trigger difficult emotions in your conversation partners.

E4. Echo what you hear the other person saying, so that they realize you are listening to them and getting their messages.

E5. Express your own emotional reality using the techniques of charisma, so that you build an authentic connection with the other person.

A Case Study in Rational Communication

I’ve had numerous successful conversations using these techniques to change people’s minds when they held irrational beliefs that are at odds with their goals. After getting practice in day-to-day conversations, I began to go on radio shows and do podcasts. These are high-stakes environments with little room for error. Let’s return to my radio interview with host Douglas Coleman, mentioned at the start of this chapter. For readers who want to listen to the interview as they read along, here’s the link: https://www.youtube.com/watch?v=Vkyu538T4ts

Around 16 minutes and 45 seconds into the clip, Coleman said that Trump “spoke the truth that many people did not want to hear.” I had to shift gears very quickly, reassessing the whole shape and course of the interview. I updated my evaluation of the situation, and my intended approach, in order to effectively target Coleman’s conservative audience. My response—in this and all other similar interviews—had to walk a fine line. I had to avoid inspiring defensive or aggressive responses by making the other person feel wrong or threatened, while still conveying my points effectively. Around 16:55, you will find my response. I first echoed Coleman’s point, saying that Trump had indeed expressed many ideas that people in this country did not want to hear. That created an immediate atmosphere of agreement between Coleman and me on something and helped him feel good about our interaction, validating his emotions. At the same time, I avoided saying anything untruthful or inaccurate: the plain statement that Trump said many things that plenty of people in the US did not want to hear is very accurate. Going onward, as part of that same response, I talked about Trump speaking to people’s guts, their emotions, and discussed how some people thought he was authentic. The point about Trump’s authenticity is something that conservatives often bring up as a point of pride, and I thought that Coleman might be gesturing at this in his original comment. Of course, I didn’t have the opportunity to use gentle questioning and curiosity here on the radio show, but my experience in these sorts of interviews enabled me to have a pretty good assessment of what Coleman’s perspective would probably turn out to be. While I wanted to make sure to acknowledge the perception of Trump as authentic, I also aimed to highlight how perceptions of authenticity came from speaking to people’s guts, not reason. I was willing to let audience members judge for themselves whether they think that’s a good idea.

Naturally, I could not leave unquestioned the idea that Trump spoke the truth: if I had done so, I’d have done more harm than good. So, after my two initial comments designed to help Douglas Coleman be comfortable with the conversation, at around 17:05 I stated that we need to be careful about what we mean by “truth.” Gently nudging the conversation toward the definition of the truth that I wanted to use, I posed the rhetorical question of whether Trump actually described reality on the ground, whether he conveyed the facts. I then provided the answer myself: sometimes he did, and sometimes he didn’t. To lessen the sting of that comment, I made sure to then quickly say that Clinton sometimes conveyed the facts, and sometimes did not. Then, I went on to say that we can discuss—if Coleman wishes—how to compare who is more truthful. I made sure to finish with a statement that again echoed Coleman, stating “there are certain truths, like you say yourself, that Trump was expressing that other politicians were not expressing.” Coleman responded at 17:37, and the first thing he said was “right.” Probably he was pleased by my near-quoting of his words. He then began to discuss the difference between objective, scientific truths—the facts—and personal, subjective truths, specifically bringing up belief in what is written in the Bible as an example of the latter.

Starting my response at 18:13, I agreed, and emphasized that we need to differentiate the truth about physical reality from the truth about personal beliefs. That set up a really good basis for the rest of our conversation. Around 19:00, Coleman steered the conversation to the Pro-Truth Pledge, and asked whether the goal is to get politicians and other public figures to describe their beliefs accurately, or to speak the truth about the issues. My response was that we can’t read people’s minds and thus are unable to verify whether they accurately report their beliefs. However, we can verify the facts about the issues, what Coleman referred to as “scientific facts.” Again, I used Coleman’s own language. As the discussion moved on, I used a number of conservative-friendly talking points, for example disparaging the myth, spread mainly by liberals, that the September 11 terrorist attack was an inside job by George W. Bush. Whenever I brought up mainstream news sources, I focused on conservative ones such as the Wall Street Journal. My comments emphasized how the Pro-Truth Pledge offers an opportunity to fight myths from the liberal side that lack a factual basis, while omitting to mention that most myths come from conservatives, and that many of the attacks on mainstream news sources stem from the right. This made sense, since my goal was to establish the pledge as a tool for fighting myths, not for blaming one side or the other for creating them in the first place. The conversation flowed very smoothly after that, and by the end, Douglas Coleman decided to take the Pro-Truth Pledge.

I used these rational communication techniques in many other interviews with conservative radio show hosts on topics ranging from Trump’s firing of FBI Director James Comey, and the baseless allegation that Obama wiretapped Trump Tower, to Muslim terrorist attacks. In these cases, and others, the hosts were able to shift their views at least somewhat. I believe it’s likely that the conversations swayed some of the audience to change their perspectives as well. Do conservative radio show hosts like Coleman feel hoodwinked and recant their agreements with me later? My experience so far indicates that’s not the case. (For a fuller analysis of three additional interviews, you can find a free PDF download at www.protruthpledge.org/book-interviews.) Given the scientific basis for the effectiveness of these rational communication methods, my hypothesis is that they will prove as effective for you as they have for me—and will be much more productive than just plain arguing.

How Can You Practice Rational Communication?

Remember, rational communication does not mean arguing facts first. It means thinking rationally about the psychological reality of the other person, especially if they hold irrational beliefs. If you argue facts first, you will most likely trigger the backfire effect, further entrenching the other person in their irrational ideas. To be rational means to use empathy before information. Take the time to understand the other person’s values, in a curious and respectful way. Create a sense of common goals for the conversation, so that the discussion of the facts works toward a common purpose. Make sure you do not threaten the other person’s sense of self and worldview. Then introduce information in a “what if” hypothetical manner, so they can consider whether the new information fits with their worldview. Here’s a quick summary of these techniques.

Seek first: to understand the other’s worldview

Seek second: to understand the other’s emotions

Seek third: to find common goals and values

Seek fourth: to find a common ground of facts and logic

Seek fifth: a course of action you mutually agree on

Please experiment with these five steps in your own political conversations, including with close friends and family members. Indeed, one of the worst features of a politically polarized society is that people stop talking to each other. Without conversations across the political spectrum, it’s all too easy to assume the worst motives of the “other side.” Because we fail to understand each other, we impute motives to them: they are “stupid,” they “hate America,” and so on. Even if you don’t succeed in changing another person’s mind, you will better understand their thinking, and you will leave an impression of openness and willingness to communicate, which is desperately needed.

Important Terms Referenced in This Chapter

Rational communication: communication that uses accurate information to make wise decisions about how to communicate, and thereby to reach your goals by getting your message across.

Backfire effect: the tendency, when presented with information that challenges one’s beliefs, to defend one’s position and, through arguing for it, to become more convinced that the original belief is true.

Confirmation bias: a tendency to look for and interpret information in ways that accord with one’s existing beliefs.

Social intelligence: the ability to understand and manage people.

Emotional intelligence: the ability to monitor one’s own and others’ feelings and emotions, to discriminate among them and to use this information to guide one’s thinking and actions.

False equivalence: treating the two sides of a controversy as equal, even if one side has all the facts correct, and the other side merely presents misinformation.

False consensus effect: our tendency to overestimate how far other people share our characteristics, beliefs, and emotions.

Interpersonal empathy gap: a cognitive bias that leads us to misjudge the intensity of other people’s emotions, usually by underestimating it.

Meta-conversation: the implicit, unspoken interaction in a conversation.

Illusion of transparency: our tendency to believe that other people understand us better than they really do.

Curse of knowledge: the fact that most of us often forget that others do not have the knowledge and understanding that we do.

Echoing: in a conversation, paraphrasing what the other person has said in a short summary using your own words.

Charisma: “personal magnetism”; the ability to convey your own emotions to other people in a way that attracts or “charms” them, and thus to influence them more easily.