Evidence-Based Smear Campaigns

Guardian, 1 May 2010

Elections are a time for smearing. But do smears work, and if so, what’s the best way to combat them? A new experiment published this month in the journal Political Behavior examines the impact of corrections. The findings are disturbing: far from changing people’s minds, a correction can actually reinforce a false belief in those who are deeply entrenched in their views.

The first experiment concerned the claim that Iraq had weapons of mass destruction immediately before the US invasion. One hundred and thirty participants were asked to read a mock news article, attributed to the Associated Press, reporting on a Bush campaign stop in Pennsylvania in October 2004. The article describes Bush’s appearance as ‘a rousing, no-retreat defense of the Iraq war’, and gives genuine Bush quotes about WMD: ‘There was a risk, a real risk, that Saddam Hussein would pass weapons or materials or information to terrorist networks, and in the world after September the 11th … that was a risk we could not afford to take.’ And so on.

The 130 participants were then randomly assigned to one of two conditions. For half of them, the article stopped there. For the other half, it continued and included a correction: a discussion of the release of the Duelfer Report, which documented the lack of Iraqi WMD stockpiles – and the lack of an active production programme – immediately prior to the US invasion.

After reading the article, subjects were asked to rate their agreement with the following statement: ‘Immediately before the US invasion, Iraq had an active weapons-of-mass-destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before US forces arrived.’ Their responses were measured on a five-point scale ranging from ‘strongly disagree’ to ‘strongly agree’.

As you would expect, those who self-identified as conservatives were more likely to agree with the statement. Meanwhile, more knowledgeable participants, regardless of political persuasion, were less likely to agree. But then the researchers looked at the effect of receiving the correction at the end of the article, and this is where things get interesting. They had expected the correction to be less effective for more conservative participants, and this was true, up to a point. For very liberal participants the correction worked as expected: compared with equally liberal participants who received no correction, they were more likely to disagree with the statement that Iraq had WMD. For those who described themselves as left of centre or centrist, the correction had no effect either way.

But for people who placed themselves ideologically to the right of centre, the correction wasn’t just ineffective, it actively backfired: conservatives who received a correction telling them that Iraq did not have WMD were more likely to believe that Iraq had WMD than those who were given no correction at all. You might have expected people simply to dismiss a correction that was incongruous with their pre-existing view, or to regard it as having no credibility: in fact, it seems the new information reinforced their false beliefs.

Maybe the cognitive effort of mounting a defence against the incongruous new facts entrenches you even further. Maybe you feel marginalised and motivated to dig in your heels. Who knows? But these experiments were then repeated, in various permutations, on the issue of tax cuts (or rather, the claim that tax cuts had increased national productivity so much that overall tax revenue rose) and on stem-cell research. All the studies found the same thing: if the original dodgy fact fits your prejudices, a correction only entrenches them further. If your goal is to move opinion, this depressing finding suggests that smears work; what’s more, corrections do little to challenge them, because for people who already disagree with you, a correction only makes them disagree even more.