LUKE SKYWALKER: ‘No . . . that’s not true. That’s impossible!’
DARTH VADER: ‘Search your feelings, you know it to be true!’
—The Empire Strikes Back (1980)1
Abraham Bredius was nobody’s fool. An art critic and collector, he was the world’s leading scholar on Dutch painters, and particularly the seventeenth-century master Johannes Vermeer. As a young man in the 1880s, Bredius had made his name by spotting works wrongly credited to Vermeer. At the age of eighty-two, in 1937, he was enjoying something of a retirement swansong. He had just published a highly respected book in which he had identified two hundred fakes or imitations of Rembrandt.2
It was at this moment in Bredius’s life that a charming lawyer named Gerard Boon paid a visit to his Monaco villa. Boon wanted to ask Bredius’s opinion of a newly rediscovered work, Christ at Emmaus, thought to have been painted by Vermeer himself. The exacting old man was spellbound. He sent Boon away with his verdict: Emmaus was not only a Vermeer, it was the Dutch master’s finest work.
‘We have here – I am inclined to say – the masterpiece of Johannes Vermeer of Delft,’ wrote Bredius in a magazine article shortly after. ‘Quite different from all his other paintings and yet every inch a Vermeer.
‘When this masterpiece was shown to me I had difficulty controlling my emotion,’ he added, noting reverently that the work was ongerept – Dutch for virginally pure and untouched. It was an ironic choice of words: Emmaus could hardly have been more corrupt. It was a rotten fraud of a painting, stiffly applied to an old canvas just a few months before Bredius caught sight of it, and hardened with Bakelite.
Yet this crude trickery caught out not only Bredius but the entire Dutch art world. Christ at Emmaus soon sold for 520,000 guilders to the Boijmans Museum in Rotterdam. Compared to the wages of the time, that is the equivalent of about £10 million today. Bredius himself contributed to help the museum buy the picture.
Emmaus became the centrepiece of the Boijmans Museum, drawing admiring crowds and rave reviews. Several other paintings in a similar style soon emerged. Once the first forgery had been accepted as a Vermeer, it was easier to pass off these other fakes. They didn’t fool everyone, but like Emmaus they fooled the people who mattered. Critics certified them; museums exhibited them; collectors paid vast sums for them – a total of more than £100 million in today’s money. In financial terms alone, this was a monumental fraud.
But there was more. The Dutch art world revered Vermeer as one of the greatest painters who ever lived. Painting mostly in the 1660s, he had been rediscovered only in the late 1800s. Fewer than forty of his works survive. The apparent emergence of half a dozen Vermeers in just a few years was a major cultural event.
It was also an event that should have strained credulity. But it did not. Why?
Don’t look to the paintings themselves for an answer. If you compare a genuine Vermeer to the first forgery, Emmaus, it is hard to understand how anyone was fooled – let alone anyone as discerning as Abraham Bredius.
Vermeer was a true master. His most famous work is Girl With a Pearl Earring, a luminous portrait of a young woman: seductive, innocent, adoring and anxious all at once. The painting inspired a novel, and a movie starring Scarlett Johansson as the unnamed girl. In The Milkmaid, a simple scene of domesticity is lifted by details such as the rendering of a copper pot, and a display of fresh-baked bread that looks good enough to grab out of the painting. Then there’s Woman Reading a Letter. She stands in the soft light of an unseen window. Is she, perhaps, pregnant? We see her in profile as she holds the letter close to her chest, eyes cast down as she reads. There’s a dramatic stillness about the image – we feel that she’s holding her breath as she scans the letter for news; we hold our breath too. A masterpiece.
And Christ at Emmaus? It’s a static, awkward image by comparison. Rather than seeming to be an inferior imitation of Vermeer, it doesn’t look like a Vermeer at all. It’s not a terrible painting, but it’s not a brilliant one either. Set alongside Vermeer’s works it seems dour and clumsy. And yet it, and several others, fooled the world – and might continue to fool the world to this day, had not the forger been caught out by a combination of recklessness and bad luck.
In May 1945, with the war in Europe at an end, two officers from the Allied Art Commission knocked on the door of 321 Keizersgracht, one of Amsterdam’s most exclusive addresses. They were met by a charismatic little man called Han van Meegeren. The young van Meegeren had enjoyed some brief success as an artist. In middle age, as his jowls had loosened and his hair had silvered, he had grown rich as an art dealer.
But perhaps he had been dealing art with the wrong people, because the officers came with a serious charge: that van Meegeren had sold Johannes Vermeer’s newly discovered masterpiece, The Woman Taken in Adultery, to a German Nazi. And not just any Nazi, but Hitler’s right-hand man, Hermann Göring.
Van Meegeren was arrested and charged with treason. He responded with furious denials, trying to bluster his way to freedom. His forceful, fast-talking manner was usually enough to get him out of a sticky situation. Not this time. A few days into his incarceration, he cracked. He confessed not to treason but to a crime that caused astonishment across the Netherlands and the art world as a whole.
‘Fools!’ he sneered. ‘You think I sold a priceless Vermeer to Göring? There was no Vermeer! I painted it myself.’3
Van Meegeren admitted painting not only the work that had been found in Nazi hands, but Christ at Emmaus and several other supposed Vermeers. The fraud had unravelled not because anyone spotted these flawed forgeries, but because the forger himself confessed. And why wouldn’t he? Selling an irreplaceable Vermeer masterpiece to the Nazis would have been a hanging offence, whereas selling a forgery to Hermann Göring wasn’t just forgivable, it was admirable.
But the question remains: how could a man as expert as Abraham Bredius have been fooled by so crass a forgery? And why begin a book about statistics with a tale that has nothing at all to do with numbers?
Working out how van Meegeren fooled Bredius teaches us much more than a footnote in the history of art; it explains why we buy things we don’t need, fall for the wrong kind of romantic partner, and vote for politicians who betray our trust. In particular, it explains why so often we buy into statistical claims that even a moment’s thought would tell us cannot be true.
Van Meegeren wasn’t an artistic genius, but he intuitively understood something about human nature. Sometimes, we want to be fooled.
We’ll return to the cause of Abraham Bredius’s error in a short while. For now, it’s enough to understand that his deep knowledge of Vermeer’s paintings proved to be a liability rather than an asset. When he saw Christ at Emmaus, Bredius was undone by his emotional response. The same trap lies in wait for any of us.
The aim of this book is to help you be wiser about statistics. That means I also need to help you be wiser about yourself. All the statistical expertise in the world will not prevent you believing claims you shouldn’t believe and dismissing facts you shouldn’t dismiss. That expertise needs to be complemented by control of your own emotional reactions to the statistical claims you see.
In some cases there’s no emotional reaction to worry about. Let’s say I tell you that Mars is more than 50 million kilometres, or 30 million miles, away from the Earth. Very few people have a passionately held belief about that claim, so you can start asking sensible questions immediately.
For example: is 30 million miles a long way? (Sort of. It’s more than a hundred times further than the distance between Earth and the moon. Other planets are a lot further away, though.) Hang on, isn’t Mars in a totally different orbit? Doesn’t that mean the distance between the Earth and Mars varies all the time? (Indeed it does. The minimum distance between the two planets is a bit more than 30 million miles, but sometimes Mars is more than 200 million miles away.) Because there is no emotional response to the claim to trip you up, you can jump straight to trying to understand and evaluate it.
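Just to spell that comparison out, here is a rough, illustrative check, taking the moon’s average distance from Earth to be roughly 240,000 miles:

$$\frac{30{,}000{,}000\ \text{miles}}{240{,}000\ \text{miles}} \approx 125$$

That is comfortably more than a hundred times the Earth–moon distance, as claimed.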
It’s much more challenging when emotional reactions are involved, as we’ve seen with smokers and cancer statistics. Psychologist Ziva Kunda found the same effect in the lab, when she showed experimental subjects an article laying out the evidence that coffee or other sources of caffeine could increase the risk to women of developing breast cysts. Most people found the article pretty convincing. Women who drank a lot of coffee did not.4
We often find ways to dismiss evidence that we don’t like. And the opposite is true, too: when evidence seems to support our preconceptions, we are less likely to look too closely for flaws.
The more extreme the emotional reaction, the harder it is to think straight. What if your doctor told you that you had a rare form of cancer, and advised you not to look it up? What if you ignored that advice, consulted the scientific literature, and discovered that the average survival time was just eight months?
Exactly that situation confronted Stephen Jay Gould, a palaeontologist and wonderful science writer, at the age of forty. ‘I sat stunned for about fifteen minutes . . .’ he wrote in an essay that has become famous. You can well imagine his emotions. Eight months to live. Eight months to live. Eight months to live. ‘Then my mind started to work again, thank goodness.’5
Once his mind did start to work, Gould realised that his situation might not be so desperate. The eight months wasn’t an upper limit; it was the median average, which means that half of sufferers live longer than that. Some, possibly, live a great deal longer. Gould had a good chance: he was fairly young; his cancer had been spotted early; he’d get good treatment.
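To make ‘median’ concrete, here is a purely invented set of survival times (illustrative numbers only, not Gould’s data):

$$\text{survival times (months): } 3,\ 5,\ 8,\ 20,\ 60 \qquad \text{median} = 8, \quad \text{mean} = 19.2$$

Half of this made-up group outlives the eight-month figure, and the longest survivor lasts seven and a half times as long; the median on its own says nothing about how far that upper half stretches.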
Gould’s doctor was being kind in trying to steer him away from the literature, and many of us will go to some lengths to avoid hearing information we suspect we might not like. In another experiment, students had a blood sample taken and were then shown a frightening presentation about the dangers of herpes; they were then told that their blood sample would be tested for the herpes virus. Herpes can’t be cured, but it can be managed, and there are precautions a person can take to prevent transmitting the virus to sexual partners – so it would be useful to know whether or not you have herpes. Nevertheless, a significant minority, one in five, not only preferred not to know whether they were infected but were willing to pay good money to have their blood sample discarded instead. They told researchers they simply didn’t want to face the anxiety.6
Behavioural economists call this ‘the ostrich effect’. For example, when stock markets are falling, people are less likely to log in to check their investment accounts online.7 That makes no sense. If you use information about share prices to inform your investment strategy, you should be just as keen to get it in bad times as good. If you don’t, there’s little reason to log in at all – so why check your account so frequently when the market is rising?
It is not easy to master our emotions while assessing information that matters to us, not least because our emotions can lead us astray in different directions. Gould realised he hadn’t been thinking straight because of the initial shock – but how could he be sure, when he spotted those signs of hope in the statistics, that he wasn’t now in a state of denial? He couldn’t. With hindsight, he wasn’t: he lived for another twenty years, and died of an unrelated condition.
We don’t need to become emotionless processors of numerical information – just noticing our emotions and taking them into account may often be enough to improve our judgement. Rather than requiring superhuman control over our emotions, we need simply to develop good habits. Ask yourself: how does this information make me feel? Do I feel vindicated or smug? Anxious, angry or afraid? Am I in denial, scrambling to find a reason to dismiss the claim?
I’ve tried to get better at this myself. A few years ago, I shared a graph on social media which showed a rapid increase in support for same-sex marriage. As it happens, I have strong feelings about the matter and I wanted to share the good news. Pausing just long enough to note that the graph seemed to come from a reputable newspaper, I retweeted it.
The first reply was ‘Tim – have you looked at the axes on that graph?’ My heart sank. Five seconds looking at the graph would have told me that it was inaccurate, with the time scale a mess that distorted the rate of progress. Approval for marriage equality was increasing, as the graph showed, but I should have clipped it for my ‘bad data visualisation’ file rather than eagerly sharing it with the world. My emotions had got the better of me.
I still make that sort of mistake – but less often, I hope.
I’ve certainly become more cautious – and more aware of the behaviour when I see it in others. It was very much in evidence in the early days of the coronavirus epidemic, as helpful-seeming misinformation spread even faster than the virus itself. One viral post – circulating on Facebook and email newsgroups – all-too-confidently explained how to distinguish between Covid-19 and a cold, reassured people that the virus was destroyed by warm weather, and incorrectly advised that iced water was to be avoided, while warm water kills any virus. The post, sometimes attributed to ‘my friend’s uncle’, sometimes to ‘Stanford hospital board’ or some blameless and uninvolved paediatrician, was occasionally accurate but generally speculative and misleading. Yet people – normally sensible people – shared it again and again and again. Why? Because they wanted to help others. They felt confused, they saw apparently useful advice, and they felt impelled to share. That impulse was only human, and it was well-meaning – but it was not wise.8
Before I repeat any statistical claim, I first try to take note of how it makes me feel. It’s not a foolproof method against tricking myself, but it’s a habit that does little harm and is sometimes a great deal of help. Our emotions are powerful. We can’t make them vanish, and nor should we want to. But we can, and should, try to notice when they are clouding our judgement.
In 2011, Guy Mayraz, then a behavioural economist at the University of Oxford, conducted a test of wishful thinking.9
Mayraz showed his experimental subjects a graph of a price rising and falling over time. These graphs were actually historical snippets from the stock market, but Mayraz told people that the graphs showed recent fluctuations in the price of wheat. He asked each person to make a forecast of where the price would move next – and offered them a reward if their forecasts came true.
But Mayraz had also divided his experimental participants into two categories. Half of them were told that they were ‘farmers’, who would be paid extra if wheat prices were high. The rest were ‘bakers’, who would earn a bonus if wheat was cheap. So the subjects might earn two separate payments: one for making an accurate forecast, and the second a windfall if the price of wheat happened to move in their direction. Yet Mayraz found that the prospect of the windfall influenced the forecast itself. The farmers hoped that the price of wheat would rise, and they also predicted that the price of wheat would rise. The bakers hoped for – and predicted – the opposite. This is wishful thinking in its purest form: letting our reasoning be swayed by our hopes.
Another example was produced by economists Linda Babcock and George Loewenstein, who ran an experiment in which participants were given evidence from a real court case about a motorbike accident. They were then randomly assigned to play the role of plaintiff’s attorney (arguing that the injured motorcyclist should receive $100,000 in damages) or defence attorney (arguing that the case should be dismissed or the damages should be low).
The experimental subjects were given a financial incentive to argue their side of the case persuasively and to reach an advantageous settlement with the other side. They were also given a separate financial incentive to accurately guess what damages the judge in the real case had actually awarded. Their predictions should have been unrelated to their role-playing, but again, their judgement was strongly influenced by what they hoped would be true.*10
Psychologists call this ‘motivated reasoning’. Motivated reasoning is thinking through a topic with the aim, conscious or unconscious, of reaching a particular kind of conclusion. In a football game, we see the fouls committed by the other team but overlook the sins of our own side. We are more likely to notice what we want to notice.11
Perhaps the most striking example of this is among people who deny that the human immunodeficiency virus, HIV, causes AIDS. Some deny that HIV exists at all, but in any case HIV denialism implies rejecting the standard, and now highly effective, treatments. Some prominent believers in this idea have, tragically, doomed themselves and their children to death – but it must have been a comforting belief, particularly in the years when treatments for the condition were less effective and carried more severe side effects than they do today. One might assume that such a tragic belief would be vanishingly rare, but perhaps not. One survey of gay and bisexual men in the United States found that almost half believed HIV did not cause AIDS and more than half believed the standard treatments did more harm than good. Other surveys of people living with AIDS found the prevalence of denialist views at 15 to 20 per cent. These surveys weren’t rigorous randomised samples, so I would not take the precise numbers too seriously. However, it’s clear evidence that large numbers of people reject the scientific consensus in a way that could put them in real danger.12
I could see wishful thinking in operation in March 2020, too, when researchers at the University of Oxford published a ‘tip of the iceberg’ model of the pandemic. That model suggested that the coronavirus might be much more widespread but less dangerous than we thought, which had the joyful implication that the worst would soon be over. It was a minority view among epidemiologists, because the data detective work being done at that point saw little evidence that the vast majority of people had negligible symptoms. Indeed, one of the central points of the Oxford group was that we desperately needed better data to figure out the truth. That, however, was not the message that caught on. Instead, people widely shared the ‘good news’, because it was the kind of thing we all wanted to be true.13
Wishful thinking isn’t the only form of motivated reasoning, but it is a common one. We believe in part because we want to. A person who is HIV-positive would find it comforting to believe that the virus does not lead to AIDS and cannot be passed to breastfeeding children. A ‘farmer’ wants to be accurate in his forecast of wheat prices, but he also wants to make money, so his forecasts are swayed by his avarice. A political activist wants the politicians she supports to be smart and witty and incorruptible. She’ll go to some effort to ignore or dismiss evidence to the contrary.
And an art critic who loves Vermeer is motivated to conclude that the painting in front of him is not a forgery, but a masterpiece.
It was wishful thinking that undid Abraham Bredius. The art historian had a weak spot: his fascination with Vermeer’s religious paintings. Only two existed. He had discovered one of them himself: The Allegory of Faith. He still owned it. The other, Christ in the House of Martha and Mary, was the only Vermeer known to portray a scene from the Bible. Bredius had assessed it in 1901 and concluded quite firmly that it was not a Vermeer. Other critics disagreed, and eventually everyone reached the conclusion that Bredius had been wrong, including Bredius himself.
Stung by that experience, Bredius was determined not to repeat his mistake. He knew and loved Vermeer better than any man alive, and was always on the lookout for a chance to redeem himself by correctly identifying the next discovery of a Vermeer masterpiece.
And Bredius had become fascinated by the gap between the early, biblical Martha and Mary and Vermeer’s more characteristic works, which were painted some years later. What lurked undiscovered in that gap? Wouldn’t it be wonderful if another biblical work were found after all these years?
Bredius had another pet theory about Vermeer. The idea was that the Dutch master had, as a young man, travelled to Italy and been inspired by the religious works of the great Italian master Caravaggio. This was conjecture; not much was known about Vermeer’s life. Nobody knew if he had ever seen a Caravaggio.
Van Meegeren knew all about Bredius’s speculations. He painted Emmaus as a trap. It was a big, beautiful canvas, on a biblical theme, and – just as Bredius had argued all along – the composition was a homage to Caravaggio. Van Meegeren had planted some Vermeer-like touches in the painting, using seventeenth-century props. The bread that Christ is breaking is highlighted, just like that famous pearl earring, with thick dots of white paint called pointillés. And the paint was hard and cracked with age.
Bredius had no doubts. Why would he? Van Meegeren’s stooge, Gerard Boon, wasn’t just showing Bredius a painting: Boon was showing him evidence that he had been right all along. In the final years of his life, the old man had found the missing link at last. Bredius wanted to believe, and because he was an expert, he had no trouble in summoning up reasons to support his conclusion.
Those tell-tale pointillés on the bread, for instance: the white dots seem a bit clumsy to the untrained eye but they reminded Bredius of Vermeer’s highlights on that tempting loaf of bread in The Milkmaid. The fact that the composition echoed Caravaggio would have been lost on a casual viewer, but leaped off the canvas under Bredius’s gaze. He would have picked up other clues that Emmaus was the real thing. He would have noted the genuine seventeenth-century vase that van Meegeren had used as a prop. There were seventeenth-century pigments, too, or as close as possible. Van Meegeren had expertly duplicated Vermeer’s colour palette. There was the canvas itself: an expert such as Bredius could spot a nineteenth- or twentieth-century forgery simply by looking at the back of the painting and noting that the canvas was too new. Van Meegeren knew this. He had painted his work on a seventeenth-century canvas, carefully scrubbed of its surface pigments but retaining the undercoat and its distinctive pattern of cracking.
And then there was the simplest test of all: was the paint soft? The challenge for anyone who wants to forge an old master is that oil paints take half a century to dry completely. If you dip a cotton bud into some pure alcohol and gently rub the surface of an oil painting, then the cotton may come away stained with pigments. If it does, the painting is a modern fake. Only after several decades will the paint harden enough to pass this test.
Bredius had identified fakes using this method before – but the paint on Emmaus stubbornly refused to yield its pigment. This gave Bredius an excellent reason to believe that Emmaus was old, and therefore genuine. Van Meegeren had fooled him with a brilliant piece of amateur chemistry, the result of many months of experimentation. The forger had figured out a way to mix seventeenth-century oil paints with a brand-new material: phenol formaldehyde, a resin that when heated at 105°C for two hours turned into one of the first plastics, Bakelite. No wonder the paint was hard and unyielding: it was infused with industrial plastic.
Bredius had half a dozen subtle reasons to believe that Emmaus was a Vermeer. They were enough to dismiss one glaring reason to believe otherwise: that the picture doesn’t look like anything else Vermeer ever painted.
Take another look at that extraordinary statement from Abraham Bredius: ‘We have here – I am inclined to say – the masterpiece of Johannes Vermeer of Delft . . . quite different from all his other paintings and yet every inch a Vermeer.’
‘Quite different from all his other paintings’ – shouldn’t that be a warning? But the old man desperately wanted to believe that this painting was the Vermeer he’d been looking for all his life, the one that would provide the link back to Caravaggio himself. Van Meegeren set a trap into which only a true expert could stumble. Wishful thinking did the rest.
Abraham Bredius bears witness to the fact that experts are not immune to motivated reasoning. Under some circumstances their expertise can even become a disadvantage. The French satirist Molière once wrote, ‘A learned fool is more foolish than an ignorant one.’ Benjamin Franklin commented, ‘So convenient a thing is it to be a reasonable creature, since it enables us to find or make a reason for everything one has a mind to.’
Modern social science agrees with Molière and Franklin: people with deeper expertise are better equipped to spot deception, but if they fall into the trap of motivated reasoning, they are able to muster more reasons to believe whatever they really wish to believe.
One recent review of the evidence concluded that this tendency to evaluate evidence and test arguments in a way that’s biased towards our own preconceptions is not only common, but just as common among intelligent people. Being smart or educated is no defence.14 In some circumstances it may even be a weakness.
One illustration of this is a study published in 2006 by two political scientists, Charles Taber and Milton Lodge. Taber and Lodge were following in the footsteps of Kari Edwards and Edward Smith, whose work on politics and doubt we encountered in the introduction. As with Edwards and Smith, they wanted to examine the way Americans reasoned about controversial political issues. The two they chose were gun control and affirmative action.
Taber and Lodge asked their experimental participants to read a number of arguments on either side and to evaluate the strengths and weaknesses of each argument. One might hope that being asked to review these pros and cons would give people more of a shared appreciation of opposing viewpoints; instead, the new information pulled people further apart. This was because people mined the information they were given for ways to support their existing beliefs. When invited to search for more information, people would seek out data that backed their preconceived ideas. When invited to assess the strength of an opposing argument, they would spend considerable time thinking up ways to shoot it down.
This isn’t the only study to reach this sort of conclusion, but what’s particularly intriguing about Taber and Lodge’s experiment is that expertise made matters worse.* More sophisticated participants in the experiment found more material to back up their preconceptions. More surprisingly, they found less material that contradicted them – as though they were using their expertise actively to avoid uncomfortable information. They produced more arguments in favour of their own views, and picked up more flaws in the other side’s arguments. They were vastly better equipped to reach the conclusion they had wanted to reach all along.15
Of all the emotional responses we might have, the most politically relevant are motivated by partisanship. People with a strong political affiliation want to be on the right side of things. We see a claim, and our response is immediately shaped by whether we believe ‘that’s what people like me think’.
Consider this claim about climate change: ‘human activity is causing the Earth’s climate to warm up, posing serious risks to our way of life’. Many of us have an emotional reaction to a claim like that; it’s not like a claim about the distance to Mars. Believing it or denying it is part of our identity; it says something about who we are, who our friends are, and the sort of world we want to live in. If I put a claim about climate change in a news headline, or in a graph designed to be shared on social media, it will attract attention and engagement not because it is true or false but because of the way people feel about it.
If you doubt this, ponder the findings of a Gallup poll conducted in 2015. It found a huge gap between how much Democrats and Republicans in the United States worried about climate change. What rational reason could there be for that? Scientific evidence is scientific evidence. Our beliefs around climate change shouldn’t skew left and right. But they do.16
This gap became wider the more education people had. Among those with no college education, 45 per cent of Democrats and 22 per cent of Republicans worried ‘a great deal’ about climate change. Yet among those with a college education, the figures were 50 per cent of Democrats and 8 per cent of Republicans. A similar pattern holds if you measure scientific literacy: more scientifically literate Republicans and Democrats are further apart than those who know very little about science.17
If emotion didn’t come into it, surely more education and more information would help people to come to an agreement about what the truth is – or at least, the current best theory? But giving people more information seems actively to polarise them on the question of climate change. This fact alone tells us how important our emotions are. People are straining to reach the conclusion that fits with their other beliefs and values – and, like Abraham Bredius, the more they know, the more ammunition they have to reach the conclusion they hope to reach.
Psychologists call one of the processes driving this polarisation ‘biased assimilation’. Imagine that you happen to encounter a magazine article that is discussing what we know about the effects of the death penalty. You’re interested in the topic and so you read on, encountering the following brief account of a research study:
Researchers Palmer and Crandall compared murder rates in 10 pairs of neighboring states with different capital punishment laws. In 8 of the 10 pairs, murder rates were higher in the state with capital punishment. This research opposes the deterrent effect of the death penalty.
What do you think? Does that seem plausible?
If you’re opposed to the death penalty, then it probably does. But if you’re in favour of the death penalty, doubts might start to creep in – the kind of doubts that we’ve already seen were so powerful in the case of tobacco. Was this research professionally conducted? Did they consider alternative explanations? How did they handle their data? In short, do Palmer and Crandall really know what they’re doing, or are they a pair of hacks?
Palmer and Crandall won’t be offended by your doubts. The duo do not exist. They were dreamed up by three psychologists, Charles Lord, Lee Ross and Mark Lepper. In 1979, Lord, Ross and Lepper conducted an experiment that was designed to explore how people thought through arguments they felt passionately about. The researchers rounded up experimental subjects with strong views in favour of, or against, the death penalty. They showed the experimental subjects summaries of two imaginary studies. One of these made-up studies demonstrated that the death penalty deterred serious crime; the other, by the fictitious researchers Palmer and Crandall, showed the opposite.18
As one might expect, the experimental subjects were inclined to dismiss studies that contradicted their cherished beliefs. But Lord and his colleagues discovered something more surprising: the more detail people were presented with – graphs, research methods, commentary by other fictional academics – the easier they found it to disbelieve unwelcome evidence. If doubt is the weapon, detail is the ammunition.
When we encounter evidence that we dislike, we ask ourselves, ‘Must I believe this?’ More detail will often give us more opportunity to find holes in the argument. And when we encounter evidence that we approve of, we ask a different question: ‘Can I believe this?’ More detail means more toeholds on to which that belief can cling.19
The counterintuitive result is that presenting people with a detailed and balanced account of both sides of the argument may actually push people away from the centre rather than pull them in. If we already have strong opinions, then we’ll seize upon welcome evidence, but we’ll find opposing data or arguments irritating. This ‘biased assimilation’ of new evidence means that the more we know, the more partisan we’re able to be on a fraught issue.
Maybe this sounds absurd. Don’t we all want to figure out the truth? We certainly should when it will affect us personally – and the tragic case of HIV/AIDS denialism indicates that some people will go to extraordinary lengths to reject ideas that are uncomfortable and unwelcome, even if those ideas could save their lives. Wishful thinking can be astonishingly powerful.
But often being right doesn’t have such profound consequences. On many questions, reaching a factually incorrect conclusion causes us no harm at all. It can even help us.
To see why, ponder an issue where most people would agree that there is no objective ‘truth’ at all: the moral difference between eating beef, eating pork and eating dog. Which of these practices you think is right and which is wrong depends mostly on your culture. Few people will care to discuss the underlying logic of the matter. It’s better to fit in.
Less obviously, the same is often true of arguments where there is a correct answer. In the case of climate change, there is an objective truth even if we are unable to discern it with perfect certainty. But as you are one individual among nearly 8 billion on the planet, the environmental consequences of what you happen to think are irrelevant. With a handful of exceptions – say, if you’re the president of China – climate change is going to take its course regardless of what you say or do. From a self-centred point of view, the practical cost of being wrong is close to zero.
The social consequences of your beliefs, however, are real and immediate.
Imagine that you’re a barley farmer in Montana, and hot, dry summers are ruining your crop with increasing frequency. Climate change matters to you. And yet rural Montana is a conservative place, and the words ‘climate change’ are politically charged. Anyway, what can you personally do about it? Here’s how one farmer, Eric Somerfeld, threads that needle:
In the field, looking at his withering crop, Somerfeld was unequivocal about the cause of his damaged crop – ‘climate change.’ But back at the bar, with his friends, his language changed. He dropped those taboo words in favor of ‘erratic weather’ and ‘drier, hotter summers’ – a not-uncommon conversational tactic in farm country these days.20
If Somerfeld lived in Portland, Oregon, or Brighton, England, he wouldn’t need to be so circumspect at his local tavern – he’d be likely to have friends who took climate change very seriously indeed. But then those friends would quickly ostracise someone else in the social group who went around loudly claiming that climate change is a Chinese hoax.
So perhaps it is not so surprising after all to find educated Americans poles apart on the topic of climate change. Hundreds of thousands of years of human evolution have wired us to care deeply about fitting in with those around us. This helps to explain the findings of Taber and Lodge that better-informed people are actually more at risk of motivated reasoning on politically partisan topics: the more persuasively we can make the case for what our friends already believe, the more our friends will respect us.
HIV denialism shows we’re capable of being tragically wrong even in matters of life and death. But it’s far easier to lead ourselves astray when the practical consequences of being wrong are small or non-existent, while the social consequences of being ‘wrong’ are severe. It’s no coincidence that this describes many controversies that divide along partisan lines.
It’s tempting to assume that motivated reasoning is just something that happens to other people. I have political principles; you’re politically biased; he’s a fringe conspiracy theorist. But we’d be wiser to acknowledge that we all think with our hearts rather than our heads sometimes.
Kris De Meyer, a neuroscientist at King’s College, London, shows his students a message describing an environmental activist’s problem with climate change denialism:
To summarize the climate deniers’ activities I think we can say that:
(1) Their efforts have been aggressive while ours have been defensive.
(2) The deniers’ activities are rather orderly – almost as if they had a plan working for them.
I think the denialist forces can be characterized as dedicated opportunists. They are quick to act and seem to be totally unprincipled in the type of information they use to attack the scientific community. There is no question, though, that we have been inept in getting our side of the story, good though it may be, across to the news media and the public.21
(Here’s an example of this tendency that, for personal reasons, I can’t help but be sensitive about. My left-leaning, environmentally conscious friends are justifiably critical of ad hominem attacks on climate scientists. You know the kind of thing: claims that scientists are inventing data because of their political biases or because they’re scrambling for funding from big government. In short, smearing the person rather than engaging with the evidence. Yet the same friends are happy to embrace and amplify the same kind of tactics when they’re used to attack my fellow economists: that we’re inventing data because of our political biases, or scrambling for funding from big business. I tried to point out the parallel to one thoughtful person, and got nowhere. She was completely unable to comprehend what I was talking about. I’d call this a ‘double standard’, but that would be unfair – it would suggest that it was deliberate. It’s not. It’s an unconscious bias that’s easy to see in others and very hard to see in ourselves.)*
Our emotional reaction to a statistical or scientific claim isn’t a side issue. Our emotions can, and often do, shape our beliefs more than any logic. We are capable of persuading ourselves to believe strange things, and to doubt solid evidence, in service of our political partisanship, our desire to keep drinking coffee, our unwillingness to face up to the reality of our HIV diagnosis, or any other cause that invokes an emotional response.
But we shouldn’t despair. We can learn to control our emotions – that is part of the process of growing up. The first simple step is to notice those emotions. When you see a statistical claim, pay attention to your own reaction. If you feel outrage, triumph, denial, pause for a moment. Then reflect. You don’t need to be an emotionless robot, but you could and should think as well as feel.
Most of us do not actively wish to delude ourselves, even when that might be socially advantageous. We have motives to reach certain conclusions, but facts matter too. Lots of people would like to be movie stars, billionaires, or immune to hangovers, but very few people believe that they actually are. Wishful thinking has limits. The more we get into the habit of counting to three and noticing our knee-jerk reactions, the closer to the truth we are likely to get.
For example, one survey, conducted by a team of academics, found that most people were perfectly able to distinguish serious journalism from fake news, and also agreed that it was important to amplify the truth, not lies. Yet the same people would happily share headlines such as ‘Over 500 “Migrant Caravaners” Arrested With Suicide Vests’, because at the moment at which they clicked ‘share’, they weren’t stopping to think. They weren’t thinking, ‘is this true?’ and they weren’t thinking, ‘do I think the truth is important?’. Instead, as they skimmed the internet in that state of constant distraction that we all recognise, they were carried away with their emotions and their partisanship. The good news is that simply pausing for a moment to reflect was all it took to filter out a lot of the misinformation. It doesn’t take much; we can all do it. All we need to do is acquire the habit of stopping to think.22
Another study found that people who were best able to distinguish real from fake news were also the people who scored highly on what is called a ‘cognitive reflection test’.23 These tests – created by Shane Frederick, a behavioural economist, and made famous by Daniel Kahneman’s book Thinking, Fast and Slow – ask questions such as:
A bat and ball cost $1.10, and the bat costs a dollar more than the ball. How much does the ball cost?
and
A lake contains a patch of lily pads which doubles in size each day. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?*
Many people get the answers to these questions wrong the first time they hear them, but what’s required to reach the correct solution isn’t intelligence or mathematical training, but pausing for a moment to double-check your gut reaction. Shane Frederick points out that noticing your initial error is usually all that’s necessary to solve the problem.24
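For anyone who would rather check the footnoted answers than take them on trust, the working is short. For the bat and ball, call the ball’s price $x$, so the bat costs $x + 1.00$:

$$x + (x + 1.00) = 1.10 \;\Rightarrow\; 2x = 0.10 \;\Rightarrow\; x = 0.05$$

The snap answer of ten cents would make the pair cost $1.20. And since the lily patch doubles every day, it must cover half the lake exactly one day before it covers all of it: on day 47, not day 24.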
The cognitive reflection questions invite us to leap to the wrong conclusion without thinking. But so, too, do inflammatory memes or tub-thumping speeches. That’s why we need to be calm. And that is also why so much persuasion is designed to arouse us – our lust, our desire, our sympathy or our anger. When was the last time Donald Trump, or for that matter Greenpeace, tweeted something designed to make you pause in calm reflection? Today’s persuaders don’t want you to stop and think. They want you to hurry up and feel.
Don’t be rushed.
Han van Meegeren had been arrested almost immediately after German occupation ended. He should have been prosecuted and punished for collaboration with the Nazis.
The wily forger had prospered mightily under Nazi occupation. He owned several mansions. While Amsterdam starved during the war, he hosted regular orgies at which prostitutes helped themselves to fistfuls of jewels. If he wasn’t actually a Nazi himself, he went to extraordinary lengths to behave like one. He was friends with Nazis, and he bent over backwards to celebrate Nazi ideology.
Van Meegeren illustrated and published a lavishly evil book called Teekeningen 1, full of grotesque anti-Semitic poetry and illustrations, using Nazi iconography and colours. He spared no expense in the printing of the book, and no wonder, given who he imagined might read it. A copy was hand-delivered to Adolf Hitler, with a handwritten dedication in artist’s charcoal: ‘To My beloved Führer in grateful tribute – Han van Meegeren’.
It was found in Hitler’s library.
To understand what happened next, we need to understand emotion rather than logic. The Dutch were disillusioned with themselves after five years of German occupation. Anne Frank was just the most famous of the huge number of Jews deported from the Netherlands and murdered; it is less well known that a far higher proportion of Jews were deported from the Netherlands than from France or Belgium.25 Van Meegeren, of course, was yet another collaborator. But in the wake of the war, the Dutch had become tired of parading such men through their courts, month after month. They desperately wanted a more inspiring story – just as Abraham Bredius desperately wanted to find a Caravaggioesque Vermeer. Yet again, van Meegeren produced what was wanted: this time, a light-hearted tale of boldness and trickery in which a Dutchman had struck back against the Nazis.
The men responsible for prosecuting van Meegeren soon became his unwitting accomplices. They arranged an absurd publicity stunt where he ‘proved’ that he was a forger rather than a traitor by painting a picture in the style of Emmaus. One breathless headline reported, ‘He Paints for His Life’. Newspapers in the Netherlands and around the world couldn’t tear their gaze away from the great showman.
Then came the trial, a media circus in which the charismatic van Meegeren was the ringmaster. He spun his story: that he had only forged the art to prove his worth as an artist, and to unmask the art experts as fools. When the judge reminded him that he had sold the fakes for high prices, he replied, ‘Had I sold them for low prices, it would have been obvious they were fake.’ The courtroom laughed; van Meegeren had them all spellbound. A man who should have been viewed as a traitor reshaped his reputation into that of a patriot, even a hero. He manipulated the emotions of the Dutch people, as he had manipulated the emotions of Abraham Bredius before the war.
It wasn’t just the Dutch who swallowed the story of the man who played Göring for a fool. Van Meegeren found plenty of people who were delighted to play up the deliciousness of the story. Early biographers of van Meegeren made him out to be a misunderstood trickster, hurt by the unjust rejections of his own art, but happy to outsmart his country’s occupiers. One oft-reported story is that Göring, awaiting trial in Nuremberg, when told that he had been duped by van Meegeren, ‘looked as if for the first time he had discovered there was evil in the world’. When you hear that anecdote it’s almost impossible to resist repeating it. But like the pointillés on the bread, it’s a telling detail that is just as false.
If only Hitler’s personally inscribed copy of Teekeningen 1 had been discovered before van Meegeren’s trial, the story of the daring little forger would have dissolved. The truth about van Meegeren would have been obvious. Or would it?
The discomfiting truth about Teekeningen 1 is that the dedicated copy in Hitler’s library had been found almost immediately. De Waarheid, a Dutch resistance newspaper, had announced the discovery on 11 July 1945. It just didn’t matter; nobody wanted to know. Van Meegeren waved the truth away, claiming that he had signed hundreds of copies of the book and the dedication must have been added by someone else. In a modern setting he might have dismissed the newspaper report as ‘fake news’.
It was a ludicrous excuse, but van Meegeren had managed to hypnotise his prosecutors just as he had hypnotised Bredius, by distracting them with interesting details and selling them a story they wanted to believe.
In his closing statement to the court he claimed again that he hadn’t done it for the money, which had brought him nothing but trouble. It was a bold claim: we should remember that while wartime Amsterdam went hungry, van Meegeren liked to accessorise his mansions with prostitutes, jewels, and prostitutes draped with jewels. No matter: the newspapers and the public lapped up his story.
After being found guilty of forgery, van Meegeren was cheered as he left the courtroom. He had pulled off an even more audacious con – a fascist and a fraud successfully presented himself as a cheeky hero of the Dutch people. Abraham Bredius desperately wanted a Vermeer. The Dutch public desperately wanted symbols of resistance to the Nazis. Han van Meegeren knew how to give people what they wanted.
Before serving a day of his sentence, van Meegeren died, on 30 December 1947, of a heart attack. An opinion poll conducted a few weeks earlier had found him to be (except for the Prime Minister) the most popular man in the country.
If wishful thinking can turn a rotten fake into a Vermeer, or a sleazy Nazi into a national hero, then it can turn a dubious statistic into solid evidence, and solid evidence into fake news. But it doesn’t have to. There is hope. We’re about to go on a journey of discovery, finding out how numbers can make the world add up. The first step, then, is to stop and think when we are being presented with a new piece of information, to examine our emotions and to notice if we’re straining to reach a particular conclusion.
When we encounter a statistical claim about the world, and are thinking of sharing it on social media or typing a furious rebuttal, we should instead ask ourselves: ‘How does this make me feel?’*
We should do this not just for our own sake, but as a social duty. We’ve seen how powerful social pressure can be in influencing what we believe and how we think. When we slow down, control our emotions and our desire to signal partisan affiliation, and commit ourselves to calmly weighing the facts, we’re not just thinking more clearly – we are also modelling clear thinking for others. It is possible to take a stand not as a member of a political tribe but as someone who is willing to reflect and reason in a fair-minded manner. I want to set that sort of example. I hope that you do, too.
Van Meegeren understood all too well that how we feel shapes what we think. Yes, expertise and technical knowledge matter, but the technical side of dealing with numbers will come in the chapters that follow. If we don’t master our emotions, whether they are telling us to doubt or telling us to believe, we’re in danger of fooling ourselves.
___________
* In both cases it’s conceivable that people were swayed less by the modest financial incentive and more by the emotional power of the role they were being asked to adopt. Either way, taking a particular perspective on the situation proved to be a strong influence on the decisions they made.
* Political expertise in this experiment was measured by asking people questions about the workings of US government – for example, how many congressional votes are needed to override a presidential veto?
* I’m quite sure that I’m guilty, too. I just can’t see exactly how.
* The answers: five cents, and forty-seven days.
Perhaps the second question is less of a stumbling block than once it was. The lily patch is growing exponentially, and we have all received a hard lesson from the coronavirus in what exponential growth looks like.
* A follow-up question might also be worth asking: why does it make me feel that way?