The rush to judgement stops us listening and learning. Instead of trying to win the argument, try to be interested – and interesting.
Daniel Kahan, a professor of law at Yale University, studies the way that our political opinions dumb us down. More specifically, he investigates how people unconsciously distort new information to make it fit what they already believe, on controversial topics from vaccination to climate change. A commonly heard complaint about political culture is that voters aren’t presented with enough facts. Kahan’s work suggests that giving people facts won’t necessarily help.
In one of his studies, people were given a maths problem to solve. Using data from a (fictional) clinical trial, the respondents had to make a series of calculations to work out whether a new skin cream had been shown to make a rash better or worse. Most respondents got the answer right. Next, they were invited to interpret exactly the same set of statistics, this time in the guise of a question about gun laws, a highly polarising topic in the United States. Some respondents were given data suggesting that gun crime was on the rise after a change in the law, others that it was falling. This time, how accurately people answered the statistical question depended on their political persuasion. Faced with a result they didn’t like, pro-gun respondents suddenly got worse at maths; the same went for anti-gun respondents.
Kahan points out that this shouldn’t be so surprising. If a person reads about a potentially dangerous skin cream, or a change in how much tax they pay, it makes sense to absorb the new information rather than sticking to what they already believe. To do otherwise would clearly be self-defeating. But most people get little tangible benefit from being correct about, say, climate change. They do, however, get an immediate benefit from expressing beliefs that others like them share: the feeling of belonging. We care more about people than about being right, and the risk of abandoning a shared belief is that you no longer have people with whom to share it.
Say you’re discussing the night sky with a friend and he mentions that Mars is Earth’s nearest planetary neighbour. If you correct him (it’s actually Venus), he’ll probably accept he was wrong. Maybe he’ll be mildly embarrassed, but the conversation will move on. Now imagine a similar conversation taking place in the seventeenth century. Your friend says something about the Sun going round the Earth, and you correct him, noting that according to the findings of this guy Galileo, it’s the other way around. Your friend is likely to get furious, to deny all the evidence you present, and to denounce you as a wicked heretic. That’s because, at the time, astronomy was not just about astronomy. It was bound up with people’s deeply held beliefs about the social and spiritual order. By telling your friend that the Earth goes round the Sun, you were not just correcting an error in his conception of the physical universe, you were threatening his place in the social universe and thus his very sense of self. That’s why, faced with information on a topic in which we have some personal investment, we perceive what supports our identity and ignore what does not.
Kahan’s name for this phenomenon is ‘Identity-Protective Cognition’. You might assume it only applies to people of low intelligence or education, but Kahan has found that highly intelligent and educated people are, if anything, more likely to distort and mould facts to fit their worldview. Clever people are better at finding reasons to support their beliefs, even when those beliefs are false. They make more convincing arguments, to others and to themselves, and they’re better at reasoning away contradictory information. In online forums devoted to the flatness of the Earth or the lies of climate science, you can observe people using considerable scientific erudition to reach wholly erroneous conclusions.
For those hoping for more productive political disagreements, this suggests a bleak prognosis. More facts won’t help, and neither will better reasoning. So what might? Kahan discovered an answer to that question by accident, when he was approached by a group of documentary makers who wanted some guidelines on how to interest viewers in science-based topics. The group asked the professor to help them identify members of the public with high levels of scientific curiosity. Kahan and his team of researchers devised a survey tool called the Science Curiosity Scale (SCS): a series of questions designed to predict how likely a person is to have their attention held by a science documentary. It includes questions about how likely the respondent is to read science books, and invites them to choose between a few articles with different levels of scientific content.
Kahan’s team surveyed thousands of people and found that individuals with high levels of scientific curiosity were equally distributed across the population: men and women, lower class and upper class, right-leaning and left-leaning. They discovered something else, too – something totally unexpected. Out of his own curiosity, Kahan had inserted some questions on politically polarised issues into the survey. When the answers came back, he noticed that the higher a person’s level of scientific curiosity, the less partisan bias she displayed.
For Kahan, this was counter-intuitive. Previously, he had established that more knowledgeable people were also more likely to be partisan thinkers. But what the survey had done was distinguish people high in knowledge from those high in curiosity. The curious people didn’t necessarily know a lot about science, but they took a lot of pleasure in finding stuff out. It turned out that Republicans and Democrats who were highly curious were much closer in their views on, say, climate change, than Republicans and Democrats with significant levels of knowledge about it.
Kahan and his research colleagues designed another test. They gave participants a selection of articles about climate change and asked them to pick the one that they found most interesting. Some of the articles were supportive of climate change science, and some undermined it; some articles had headlines that framed the story as a surprise, others as confirmation of what was known.
Normally, partisan respondents would pick the article that supported their worldview. But science-curious Republicans picked articles that went against their prevailing political viewpoint when the headline framed the story as a surprise (‘Scientists Report Surprising Evidence: Arctic Ice Melting Even Faster Than Expected’). The same was true of science-curious Democrats, in reverse. For the scientifically curious, Kahan concluded, the intrinsic pleasures of surprise and wonder trump their desire to have what they already know confirmed. Curiosity beats bias.
* * *
Powering up your desire to learn is often the only way to make the most out of a difficult encounter. If you’re a climate change activist who meets someone who believes the whole thing is a hoax, the best thing you can do is to be intrigued by how they arrived at that view. What experiences have they had, what have they read or heard, that got them there? Knowing that won’t reconcile you to their view, but it gives you something to talk about.
You can get to a disagreement too soon. It’s usually wise to defer the point at which you say, ‘Well, actually . . .’; the longer you let the other person talk, unimpeded by interruption or the need to defend themselves, the more data you gather about their perspective. That inevitably puts you in a stronger position: either you will learn something that modifies your view or you’ll have gained a better understanding of the other person’s view, and of how to argue with them. And sometimes, the more a person talks, the more they talk themselves out of the position they started in.
Questions are a good way of showing curiosity, but they can also be a way of avoiding it. If I ask, ‘Are you serious?’ I’m really saying that I don’t take you seriously. Asking, ‘Why do you believe that?’ is better, but not by much. It sounds like a demand that the other person justify themselves. It positions you as the judge and puts them in the dock. Much better to ask, ‘Can you tell me more?’ or some variant. That kind of question shows that you’re willing to listen and that you see this as a conversation of equals. ‘Can you tell me more about why you believe that?’ is different to ‘Why do you believe that?’ in a subtle but significant way.
Some of this book was written during a stay in Paris. While there I was contacted by a businessman called Neil Janin who knew that I’d previously published a book on curiosity, the topic he wished to discuss. He didn’t know I was writing a book on disagreement, but that turned out to be his speciality. Janin spent thirty years at the management consultancy McKinsey, many of them running its Paris office. Now semi-retired, he coaches senior executives in how to deal with difficult, conflict-ridden conversations. When we met, he was recovering from illness and had lost his voice. From across a café table, he fixed me with his penetrating gaze and delivered aphoristic wisdom in an intense, rasping whisper. ‘The key to it all’, he said, ‘is connection. If you don’t connect, you can’t create. What stops me connecting to a colleague? Judgement. “He’s stupid, she doesn’t get it. They don’t have the facts; once I give them the facts they will change their mind; if not, they’re idiots.”’ When we are in a disagreement, he continued, we face a choice, with an easy option and a hard option. ‘We love judgement. It helps us be “right”, which is good for our ego, and requires no energy. Curiosity is energy-consuming, because you’re trying to figure things out. But it’s the only way through.’
Laurence Alison told me that in order to be effective, interrogators have to suspend moral judgement, no matter what terrible crime the suspect may have committed. ‘There’s a reason this person has ended up opposite you, and it’s not just because they’re evil. If you’re not interested in why they’re here, you’re not going to be a good interrogator.’ Janin echoes this sentiment in what he says is the most important piece of advice he gives his clients: ‘SUSPEND JUDGEMENT. GET CURIOUS!’
Disagreement is hard work even for management consultants and their clients, who at least share a culture: analytical and logical, responsive to incentives and interests. What should you do if the person you’re talking to seems emotionally led, irrational, possessed by bizarre beliefs? That’s a question Jayne Docherty addresses in her book on the Waco negotiations. The key to it, she suggests, is to assume that they are being rational, and make it your job to figure out what kind of rationality they are using.
Max Weber, the great sociologist, argued that we use the term ‘rational’ too narrowly. It usually describes people acting in a logical way designed to achieve a material goal. Weber called that instrumental rationality and proposed three other types of rational behaviour. There is affective rationality, when I make my relationships central to whatever I say and do; that’s the rationality used by the respondents in Daniel Kahan’s study. There’s traditional rationality, when we are happy to accept the steer that previous generations have given us, which is why we might put a tree in our house in December. Finally, there is values rationality, when everything we do is in the service of some higher value, almost regardless of outcome. That’s what the Davidians were employing, to the befuddlement of the more instrumentally rational FBI.
Few people rely on just one mode of rationality; most of us switch between them or use more than one at once. Docherty points out that the Davidians could actually be quite practical and analytical and willing to problem-solve, as long as it didn’t conflict with their ultimate values. This is one way that curiosity can help us. In a disagreement with someone who isn’t using instrumental rationality – whether a family member, colleague or political opponent – rather than assuming they’re crazy, you can try and get curious about what mode of rationality they’ve moved into. When your daughter is being irrationally stubborn about staying up later, she might be operating in affective rationality; she is looking for a way to spend more time with you. What’s the deeper logic of the other person’s behaviour? Come to think of it, what’s the deeper logic of your own?
* * *
You’re not only trying to get curious yourself, you’re trying to stimulate the other person’s curiosity. So how do you do that?
Gregory Trevors, a psychologist from the University of South Carolina, has studied ‘the backfire effect’: the paradoxical tendency of people to strengthen their belief in false information when its falsity is pointed out to them (the term was coined by political scientists who found, in 2009, that people who believed Iraq was behind 9/11 became more likely to believe it after being shown information that refuted it). It’s a similar reaction to the one addicts have on being told their habit is bad for them. The risk of correcting someone, as we’ve seen, is that you trigger an identity threat. That brings what Trevors calls ‘moral emotions’, like anger and anxiety, into play, which can quickly derail the conversation. Anger and anxiety lead people to focus narrowly on defending their position and attacking the source of any conflicting messages. An alternative strategy is to try and activate the other person’s ‘epistemic emotions’, like surprise and curiosity, which, according to Trevors, act as an antidote to anxiety and anger. Carli Leon, the former anti-vaccinator who spoke about how insults had made her dig her heels in, also said, ‘What helped me was people asking me questions that got me to think.’
Earlier on we discussed how to avoid triggering a threat reaction: convey your regard for the other person before getting into the disagreement (being curious about what they have to say is one way to do that). Beyond affirmation, you can also frame new information or new arguments in a way that intrigues the other person rather than putting them on the back foot. As Daniel Kahan found, surprise – ‘I didn’t know that’, or ‘I hadn’t thought of it like that before’ – loosens up rigid beliefs. Displaying your own curiosity about the topic indicates that you don’t think you have all the answers, and encourages them to feel curious too. Gregory Trevors suggested using stories, humour and metaphors to neutralise the other person’s defence system. In short, rather than trying to sound convincing, try to be interesting and interested.
It’s always easier to be incurious than curious. As Neil Janin suggested, curiosity is hard because it requires the allocation of scarce resources: energy, time and attention. If you have a different opinion to me on, say, immigration, that might be because your experience of it differs from mine. But contemplating that gulf of difference demands an expenditure of brainpower to which I’m often unwilling to commit. It’s simply quicker and more efficient to dismiss you as bigoted than it is to be interested in what you’re saying. In a world where we’re bombarded with opinions, that can seem like a necessary reaction, but it’s one we should resist. By shutting down our curiosity about different views we make ourselves less intelligent, less humane – and less interesting.