In Oscar Wilde’s 1890 novel The Picture of Dorian Gray, Lord Henry Wotton utters the famous line: ‘There is only one thing in the world worse than being talked about, and that is not being talked about.’
I suspect Donald Trump would agree; there can be few politicians who have ever taken such a single-minded approach to the cultivation of fame.
When he announced his intention to run for president in 2015 it was on a stage in the basement of his Trump Tower building in Manhattan, in front of eight American flags.
‘Sadly the American dream is dead, but if I get elected president I will bring it back,’ he said, before turning his wrath on Mexico, which he accused of ‘bringing their worst people’, including criminals and ‘rapists’ to America, accusing China of taking ‘our jobs’, threatening to be ‘tough’ on ISIS and insulting fellow Republican candidate Jeb Bush – ‘How the hell can you vote for this guy?’ – guaranteeing that he led the news on all media outlets.
Trump was widely dismissed – including by notable Republicans and conservatives – in favour of the more ‘serious’ or conventional candidates. His attention-grabbing comments about building a wall along the US–Mexico border, banning Muslims from the USA and so on were widely pilloried in the media, but did nothing to halt his ascendancy.
Each new tweet, no matter how absurd, ill-conceived or false (a PolitiFact study found that only 5 per cent of the claims Trump made during the 2016 campaign were wholly true, and that 69 per cent were either mostly false, false or ‘pants on fire’), was widely reported on, even if only to be mocked.
As his campaign gathered momentum, and his rivals for the Republican nomination fell by the wayside, most of those in Trump’s own party maintained their distance from him, continuing to comment negatively on his outbursts. And, right up to the day of the election, most of the US (and world) media were united in their condemnation of this most unusual of candidates.
However, despite all this negative coverage, Trump was elected president.
Through it all Trump has consistently accused the media of bias. And he’s right, of course.
All media are biased in some way or another – implicitly or explicitly. Whatever your opinion, you can probably find a news source that will confirm your own beliefs (as we saw in Chapter 4), and ‘media bias’ is a fairly well-established phenomenon. And, it goes without saying that politicians and their spin doctors who try to shape the news cycle are inherently biased. In tandem with (or in opposition to) the media, they work hard to present their own versions of the world, and make their preferred facts and figures stick in the public’s mind.
Some news sources at least strive for impartiality. The BBC’s Charter and Agreement requires the corporation to be impartial and it trains its reporters, via its academy, to recognise their own bias: ‘Impartiality can mean challenging your own assumptions or those of your team or contributors.’ Similarly, Reuters has a policy of taking a ‘value-neutral approach’, and its editorial policy states: ‘We are committed to reporting the facts and in all situations avoid the use of emotive terms.’
However, much of the media is openly partisan, explicitly endorsing certain parties, candidates or points of view. Trump’s objection to partisan news almost certainly wasn’t based on principle. His frustration was that, apart from outliers such as TV pundit Sean Hannity and right-wing website Breitbart News, most of the media, including most conservative media, didn’t endorse him – it was that particular ‘bias’ he objected to.
But rather than focusing on Trump and how the media might be partial or ‘biased’, I’d like to focus on what goes on in our own heads that makes us resistant or receptive to what the media has to say. What cognitive biases come into play, and how do they affect what we make of the media?
Media coverage certainly affects how we vote, but not necessarily in the way you might expect. Exploring the nature of this influence can help us to understand some of the ways in which we form our political priorities and decide what we think is true. And it can reveal how subtle changes in the way an issue is framed in the media can shape our interpretation of it.
Some parts of the media make quite bold claims about the effect they have on our political decisions. In the UK the Sun newspaper, for example, always endorses a candidate in a general election and this used to be considered a very powerful message to the electorate. Following the 1992 election they ran with the headline ‘It’s The Sun Wot Won It’. Hillary Clinton racked up at least 186 endorsements running against Trump – including publications with a solid record of endorsing Republicans such as the Dallas Morning News and Columbus Dispatch, as well as the San Diego Union-Tribune and Arizona Republic, neither of which had endorsed a Democrat for over a century. And in Australia, concern about media influence led to the passing of a law in 1992 banning all election TV advertising from midnight on the Wednesday before polling day to the close of polls on polling day – always a Saturday – to allow voters a ‘cooling off’ period to consider the issues.
So, is the media good at telling us what to think?
In the USA, there had been so much propaganda during the Second World War to promote the war effort and national security that, after the conflict had ended, researchers started studying the effects of the mass media. They were keen to know if the propaganda was changing people’s perceptions, and creating a passive, easily manipulated public.
Unexpectedly, the answer was no – people weren’t easily led astray. In fact, the propaganda had done little to change people’s minds; it only managed to reinforce the views of those who already agreed.1
Humans can be a stubborn lot – for better or for worse we have a number of biases that can make us resistant to persuasion. As we saw in Chapter 4, our minds have some strategies for making us think we’re right – it can be hard for us to see the other side’s perspective, and we’ll tend to focus on evidence that reinforces our own point of view. Our psychology can make us stand firm when we’ve already made up our minds, especially in areas where we have a lot of knowledge or already have strong opinions. Crucially, though, that doesn’t just apply to propaganda. Even when the media is making us aware of facts backed up by science, which we’ll look at in Chapter 6, we can be less susceptible to the message, especially if it contradicts our deeply held beliefs and values.
In a way, those mid-century Americans were protected from propaganda by their biases. Although perhaps they also didn’t quite trust the sources of propaganda. Curiously, a study published in 2011 suggested that political endorsements that diverge from the publication’s normal stance – say, a conservative newspaper supporting a Democrat – have a greater impact on readers’ choices.2 Voters’ trust increases when they perceive a lower degree of bias, and they perceive less bias when an outlet defies its own conventions.
The media isn’t an all-powerful propaganda machine – as we’ve seen in the case of Trump, who was elected despite the media being widely against him. It isn’t necessarily successful in telling us exactly what to think – but we can be open to persuasion in other ways.
When you think about the issues that are most pressing in the world today, what comes to mind? Depending on who and where you are, they might include climate change, big banking, defence and security, immigration, women’s rights or poverty.
Why might you focus on particular topics like these? Why do they seem more important to you than other issues at this moment in time?
When you see a story about one of these subjects, you’re likely to be drawn to it. But perhaps you’re interested partly because you’ve seen so many stories in the media already.
In 1922 the American journalist Walter Lippmann wrote the influential book Public Opinion. In those early days of the mass media – newspapers, radio and filmed newsreels – he took a dim view of their ability to inform the public, famously writing that the press was ‘like the beam of a searchlight that moves restlessly about, bringing one episode and then another out of the darkness into vision’.
By selecting one issue rather than another, the media focuses our attention. It is setting the agenda, defining people’s priorities – the things they think are most important in the world right now. As the American political scientist Bernard Cohen claimed in 1963: ‘While the press may not be successful much of the time in telling people what to think, it is stunningly successful in telling its readers what to think about.’3
In 1982 a famous study by scientists at Yale University and the University of Michigan set out to discover just how much direct influence the media can have on our political priorities.
Shanto Iyengar, Mark D. Peters and Donald R. Kinder recruited a group of people and asked them which of the prominent topics of the day they most cared about, ranked in order of importance.4 They then split them into two groups: the ‘experimental’ group and the ‘control’ group. Over some days, the experimental group watched the news every evening, but the broadcast had been manipulated to focus more on defence issues. The control group watched undoctored news reports.
After a few evenings of watching this doctored news, participants were questioned on their views again, and the effects were clear. When first asked, the experimental group had ranked defence sixth out of eight problems (following inflation, pollution, unemployment, energy and civil rights). After watching the news, defence had leapt up to second place, with inflation remaining in first place. For those who saw the ordinary reports, the ranking of defence stayed the same.
In a second experiment, the researchers set out to explore whether the effect held across three different topics. One group watched news with extra defence stories, another group watched news with more on inflation, and a third saw more on pollution. The results were largely as expected: participants who saw more defence-related news increased their rating of the importance of defence, and those exposed to more pollution-related stories increased their rating of the importance of pollution. The ‘extra inflation news’ participants also rated inflation as more important than the other groups did, but the increase wasn’t significant (though that might have been because they already thought inflation was very important at the start of the experiment).
Crucially, though, the way in which the news was doctored also influenced how participants viewed the then US president, Jimmy Carter. The researchers asked people in both experiments how they rated President Carter on the subjects they’d been primed on. If the respondents had seen the doctored news stories they were more likely to be concerned about his performance in those areas. And that led them to be more concerned when evaluating Carter’s overall performance as well.
The evidence was clear: if a problem was prominently featured in the TV news, it became a much more important factor in how people judged the president’s performance.
Stories in the news affect our priorities and how we see the political world around us, and often we don’t even realise we’re being influenced. But that’s not because we passively absorb whatever information is presented to us.
It’s partly a result of the mind’s active attempt to make the best use of whatever information is most available to it.
The human brain is a remarkably powerful information processor. Sometimes, however, when faced with extremely complex decisions, rather than fully weighing up all the options, it employs mental shortcuts. Those decision-making shortcuts – sometimes known as ‘heuristics’ – can influence our judgements of what is important. In this case, we are biased by something called the ‘availability heuristic’.
This mental shortcut is one of the many heuristics and psychological tendencies made famous by the Nobel Prize-winning psychologist and behavioural economist Daniel Kahneman and his colleague Amos Tversky (their remarkable relationship was the subject of a book called The Undoing Project by bestselling author Michael Lewis). Wondering whether people really were the rational decision-makers they were believed to be (at least by economists), in the 1960s and 1970s they started investigating some of the mental shortcuts we use when we’re operating in uncertain conditions. In 1973 they published a seminal paper in which they identified this availability heuristic.5
The idea is that if something can be remembered or brought to mind easily, it becomes more important or relevant to us, and that affects our judgements.
Here’s a simple example.
Try to bring to mind one or two positive things that happened to you last week. They could be anything – a special family moment, a success at work, a fun night out with friends.
Now answer the question: ‘How happy were you last week?’
What will you base your answer on? It’s likely that it will be influenced by whatever memories are most easily available to you. In this case, if you’ve been thinking of positive things that happened to you, you’re more likely to say you were happier.
Obviously if you’ve had an exceptionally good (or bad) week, then thinking of a couple of things might not have a huge effect; but for an average week about which you don’t yet have a strong opinion, your mind will bias its answer based on the positive (or, conversely, negative) events you’ve just been thinking about, simply because they come more easily to mind.
There are many ways this bias can affect our judgement. If you have a lot of friends who’ve had heart attacks, you’re likely to think that heart attacks are more common than they actually are. Or, in business, people may judge an investment more or less favourably depending on what they’ve recently seen about that company, rather than looking at all the facts.
If something comes to mind more easily, it might be because you’ve seen it a few times, or because it was particularly dramatic or vivid (or emotionally wrenching – as many news items are), or perhaps you came across it particularly recently.
This means that, when thinking about big issues (or evaluating a president, as in the experiment with Carter mentioned above), we’re more likely to bring to mind recent examples from the news, rather than examining all the alternative sources of information – and those recent examples are likely to have a disproportionate influence on our judgements.
What really makes the difference here, according to other research, is the ease of recall.
In 2002 a British team set out to test this with a real-world example: they decided to find out how people perceived the then British prime minister, Tony Blair.6 They asked a number of people who weren’t very politically engaged to think back and list a specific number of positive things (either two or five) about Blair, or the same number of negative things. And then they asked the respondents their attitude towards him. Surprisingly, what affected the answers to this question wasn’t whether the respondents had been asked to think of positive things or negative things – it was whether they had found it easy to recall those positive or negative points in the first place.
If people had struggled to come up with a long list of negative things to say about Blair, they ended up liking him more – presumably because, if they couldn’t quickly think of something bad to say, then he couldn’t be that bad. They were basing their judgements of him on how easily these things came to mind, not on any further research, analysis or reflection.
Like most cognitive biases, the availability heuristic certainly has its uses. If you’re asked to name the capital of a foreign country and you’re not sure of the answer, going with whatever city first comes to mind is probably a decent strategy. But when making a complex decision that relies on an understanding of how the world works this bias can lead us astray.
The influential psychologist and linguist Steven Pinker made this point in his book The Better Angels of Our Nature. Pinker points out that the media frequently show us examples of conflicts, war zones and famine: those are the stories that they consider important, that they think we want or need to know about, and that is not necessarily a bad thing.
But the result is that people often think the level of war and violent death in the world is as high as ever (or even worse than it was in the past), when actually the number of violent deaths per head of population is at an all-time low. We often assume that global poverty is at an all-time high too – but, again, as a proportion of the world’s population the number of people living in poverty has dramatically reduced during the last fifty years.
Most of us don’t have a broader understanding of the statistical trends, and so, when we think about the state of the world, the examples depicted in the media are the ones that come most easily to mind, and we use them to make our judgements.
And don’t forget – the media and politicians can make use of this by pushing the political agenda onto issues that are naturally the territory of one party or another. The right is usually stronger on issues of defence, crime and law and order, for example, so in an election cycle news stories that emphasise those events naturally play to the right. The left is usually stronger on issues such as inequality and minority rights, so stories that emphasise exploitation or discrimination will naturally benefit the left.7
Trump’s 2016 campaign controlled the news agenda like no other. With a single provocative tweet, he could put his issues, his name and his brand at the top of the news cycle. And the coverage was as widespread as it was negative: one estimate suggested that, of all the times Trump was mentioned in the media, 96 per cent of the articles expressed a negative opinion.
However, Trump focused on making some simple, memorable messages available. Build a wall. Bring jobs back to America. ‘Crooked Hillary’. ‘Make America great again’. And a large portion of the US electorate was not influenced by the media’s negative coverage – they weren’t told what to think – but they certainly felt they knew what was important, what everyone was talking about . . . and that was Trump.
Trump knew that media exposure, even negative exposure, could be a gift to his campaign. Perhaps he is familiar with what psychologists call the ‘mere exposure effect’.
The idea here is that when something – or someone – is more familiar to us, we like it more (hence some psychologists regard it as part of a broader ‘familiarity bias’).
When we’re repeatedly exposed to something, its familiarity means we are more likely to be welcoming of it.
The Polish-American social psychologist Robert Zajonc first coined the term ‘mere exposure effect’ in 1968. He opened his paper with this intriguing example:8
‘On February 27, 1967, the Associated Press carried the following story from Corvallis, Oregon: “A mysterious student has been attending a class at Oregon State University for the past two months enveloped in a big black bag. Only his bare feet show. Each Monday, Wednesday and Friday at 11.00 a.m. the Black Bag sits on a small table near the back of the classroom. The class is Speech 113 – basic persuasion . . . Charles Goetzinger, professor of the class, knows the identity of the person inside. None of the 20 students in the class do. Goetzinger said the students’ attitude changed from hostility toward the Black Bag to curiosity and finally to friendship [italics added].”’
Zajonc went on to test the effect in a range of ways, showing that, on average, if we see a random arrangement of letters a number of times (‘ryiane’, for example), we will start to prefer it to other orders that we are not familiar with. So just by exposing a person to something, you can improve their attitude towards it (hence ‘mere’ exposure).
There have since been a huge number of studies showing how this effect can influence us in a wide range of situations. For example, you are more likely to be well disposed to someone you occasionally see in the street than someone you’ve never seen before. It might be because you haven’t had a reason to be frightened of that person, so every time you see them, that lack of threat is reinforced, even though you don’t actually know anything about them. It is a strange but reliable effect, and often we don’t even realise that it is happening. We don’t think we prefer things because we’ve seen them before – we like them more because, well, we just do.
A lot of advertising is effective for this reason – the overt message of the ad might not filter through, but just being exposed to the brand name makes the product more familiar, and more appealing, when we next see it.
The same effect also comes into play in politics. An American study showed that seeing a candidate more often – their media exposure – can affect their success in the polls by 5–10 per cent.9
So, the more we see something in the media, the more familiar it can seem to us and the more positive we feel about it – which means that just the repeated coverage of an idea or candidate in the media can be helpful over time. But this well-documented bias has another side, too.
Surprisingly, it also applies to whether we think things are true.10
In a simple experiment in the 1970s, three American psychologists called Lynn Hasher, David Goldstein and Thomas Toppino wanted to find out if this ‘familiarity effect’ extended not just to whether we liked something but also to whether we thought it was true or false. If you hear something often enough, do you start to believe it?
Take the statement: ‘The total population of Greenland is about 50,000.’ It sounds plausible enough, though most of us won’t know if it’s really true or not without checking. In the absence of certainty, how do we decide?
Hasher and her colleagues had a suspicion: the more often you hear that Greenland has 50,000 inhabitants, the more likely you are to think it’s true. And they were right.
In their experiment they asked people to rate whether they thought sixty statements (on politics, sports and the arts) were true or false. They ran three sessions, two weeks apart, and a small number of the statements were repeated across the sessions. What they found was that, with repetition, people were more and more likely to rate the repeated statements as true, whether the statements were in fact true (e.g. ‘Total U.S. defence spending has risen steadily since 1965’) or false (e.g. ‘The People’s Republic of China was founded in 1947’).
You might get a feel for this effect if you remember the research about monkeys from Chapter 1: the study showing that monkeys are normally happy to perform a task for a piece of celery, but will protest when they see another monkey being rewarded with grapes for the same task.
Does that research feel more familiar this time? Does that familiarity somehow feel reassuring? Perhaps you even feel more confident that it’s true?
I haven’t provided you with any more evidence that it’s true; I’ve simply repeated it.
In politics, this effect is only likely to have an influence in areas where voters know little about the issues or candidates, but in terms of the actual facts, that probably covers a lot of modern-day politics for most of us (something we will explore further in Chapter 8).
That’s why successful political campaigns focus on key ‘sound-bite’ figures. They can make sure these figures reach audiences via media outlets that are looking for easy material, and their repeated exposure makes them somehow more believable over time. US president Ronald Reagan’s ability to connect with voters earned him the nickname the ‘Great Communicator’, and his ability was greatly enhanced by his – or at least his team’s – understanding of how the news cycle operated, which they then kept fed with short, memorable phrases such as: ‘Government is not the solution to our problem; government is the problem’ and ‘Mr Gorbachev, tear down this wall!’
In the run-up to the EU referendum, the Leave campaign stated as regularly and often as it could the claim that the UK sent £350 million a week to the EU – a figure which they even emblazoned across their battle bus. The UK Statistics Authority said this figure was not just potentially misleading, but misleading plain and simple, and the respected Institute for Fiscal Studies called it ‘absurd’. But each time the Remain campaign challenged the figure of £350 million, it probably ended up reinforcing the memory of the figure in people’s minds. A simple number was easier to remember than any of the critical arguments (a lot of that money returns directly in the form of subsidies and investment, and the broader benefits of membership of the single market are hard to quantify), so the debate became anchored to that number.
In the US election of 2016, with the repetition of ‘lock her up’ at Trump rallies, and the constant coverage of his claims about Clinton’s supposed malfeasance, the idea of ‘crooked Hillary’ was more likely to seem true with every repetition. Of course we now know that Trump had no intention of locking her up, and said in the days after the election it was ‘no longer something he felt so strongly about’ and that he did not want to ‘hurt the family’.
Knowing about your own inbuilt tendency to believe and trust things that you’ve seen before, and to build arguments based on things you can most easily remember, might cause you to pause the next time you get into a heated political debate.
Most psychologists are fairly pessimistic about our ability to counteract these biases. But it can make for an interesting exercise to question some of the facts and figures or arguments that are so familiar we usually accept them without challenge. Are they as valid as you’ve always assumed? As we’ll see in the next chapter, sometimes when an incorrect notion gets lodged in our minds (Did Iraq have weapons of mass destruction? Can vaccines cause autism?) it can prove very hard to change.
When Trump said he wanted to ban all Muslims from the USA and build a wall along the US–Mexico border, he presented these policies as national security issues. His opponents presented them in terms of racism and human rights.
Similarly, when Angela Merkel opened up Germany’s borders, she talked about the measure as a compassionate decision on her part, calling it ‘our humanitarian duty’. For her opponents, it was a dilution of German culture and a risk to public safety; and she was particularly criticised for the policy in the wake of the Berlin attack, when a truck was driven into crowds at a Christmas market. The attack, carried out by an asylum seeker, left twelve people dead.
When it comes to decisions on issues such as human rights or public safety, the facts might be simple, but their interpretation rarely is. In 1981, in a hugely influential article called ‘The Framing of Decisions and the Psychology of Choice’, Kahneman and Tversky outlined yet another way in which the decisions we make are not always as rational as we think they are.11 How we perceive or make choices on a particular issue depends on how that choice is ‘framed’.
They tested their idea by presenting people with a choice that potentially had a life or death outcome. In the face of a deadly epidemic, respondents had to choose between two public-health programmes – both carried a chance of resulting in some deaths, as well as a chance of saving lives.
The two available options remained the same throughout the experiment but the researchers varied the way they were framed, sometimes emphasising the positives or negatives (lives saved or lost) of one option and sometimes of the other. What they found was that these variations of framing could dramatically affect the number of people who would choose either option (most would go for the positive framing).
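The equivalence of the two framings can be checked with a little arithmetic. Here is a minimal sketch using the numbers usually quoted from Kahneman and Tversky’s original ‘disease’ problem (600 people at risk; the function name and layout are my own illustration, not the researchers’ materials): in the ‘gain’ frame, Programme A saves 200 people for certain, while Programme B saves all 600 with probability 1/3 and nobody with probability 2/3; the ‘loss’ frame describes the very same outcomes as 400 deaths for certain versus a 1/3 chance of no deaths.

```python
# Toy illustration of the classic Kahneman–Tversky framing problem.
# The options in the 'gain' frame (lives saved) and the 'loss' frame
# (lives lost) describe identical outcomes, so their expected values match.

def expected_survivors(outcomes):
    """Expected number of survivors, given (probability, survivors) pairs."""
    return sum(p * saved for p, saved in outcomes)

POPULATION = 600

# 'Gain' frame: options described in terms of lives saved.
program_a = [(1.0, 200)]                # 200 people will be saved
program_b = [(1/3, 600), (2/3, 0)]      # 1/3 chance all saved, 2/3 chance none

# 'Loss' frame: the same options described in terms of lives lost.
program_c = [(1.0, POPULATION - 400)]   # 400 people will die
program_d = [(1/3, POPULATION - 0),     # 1/3 chance nobody dies
             (2/3, POPULATION - 600)]   # 2/3 chance all 600 die

for name, prog in [('A', program_a), ('B', program_b),
                   ('C', program_c), ('D', program_d)]:
    print(name, expected_survivors(prog))
# Every option has the same expected value: 200 survivors.
```

Despite this numerical equivalence, most respondents picked the certain option in the ‘lives saved’ frame and the risky option in the ‘lives lost’ frame – the wording alone reversed the majority preference.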
Since their pioneering work, this phenomenon has been investigated widely around the world. It’s a subject of particular relevance to politics. After all, the art of political spin is, to a large extent, a question of how news stories are framed, and there’s one very famous study that demonstrates how it can affect our political judgements.12
In 1997 researchers in the USA created two different news reports about a Ku Klux Klan rally, and showed them to two separate groups of people.
The first version framed the KKK rally as a free speech issue. The article included protestors’ signs saying ‘No free speech for racists’, quotes from supporters saying that people should be able to hear the Klan if they wanted to, and photographs of protestors and of Klan leaders speaking at the microphone. We might imagine a TV voiceover reading something along the lines of: ‘In a key test of the limits of free speech today the Ku Klux Klan made a controversial speech . . .’
The second version framed the rally as a public order issue. This article quoted observers and reporters who saw ‘real sparks in the crowd’ and said the ‘tension between Klan protestors and supporters came within seconds of violence’, alongside photos of police officers protecting Klan members from the protestors. For this version, we might imagine the TV voiceover to be: ‘There was tension in the centre of the city today as members of the Ku Klux Klan made a controversial speech . . .’
Seeing one article or the other could make a big difference to the way people judged the event, but it also went on to affect their general attitudes to the KKK. When the rally was presented as a free speech issue, people were much more likely to express tolerance of such rallies and speeches.
Our tendency to interpret events and information according to how they’re ‘framed’ is very powerful. This is partly because in framing issues it’s possible to stress specific values or beliefs that we might hold very deeply.
As we mentioned earlier, sometimes when the things we read conflict with our deeply held values that makes us more likely to resist them.
But the opposite is also true – if something is presented as being in line with those beliefs, we can be more open to interpreting the facts in a particular way.
The media inevitably have to make choices about how to frame a story. To take a less emotive contemporary example, imagine a news report running one headline: ‘Traffic was massively disrupted today as environmental protestors blockaded a road outside an airport to oppose expansion’, or another: ‘To make a stand on the environmental impact of a new runway, environmental protestors blockaded a road outside an airport today’. Both versions could be completely true, but whether the story is framed in terms of the disruption to traffic or as an environmental stand is likely to influence our opinion of it.
For any media wishing to offer a truly impartial version of events, this presents a challenging dilemma. How a journalist chooses to tell a story may reflect their own interpretation – and that could be a conscious decision, or not.
For those of us who consume the media, our own values, and how they complement or conflict with the framing of a story, will shape what we think of the ‘facts’. Think about that the next time you read a news story; if it had been framed in a different way, would you have interpreted those facts differently?
How we perceive and interpret political issues and events is influenced by the media in certain ways that might not be obvious at first. The media’s priorities shape our priorities; whether or not we notice changes in immigration in our local area might depend upon how salient the media has made that issue. When faced with complex, uncertain decisions, our minds will naturally look to whatever facts or stories are most available; and how those facts have been framed can change how we interpret them. The ease with which we can then recall those facts, and how familiar they feel to us, can also shape our perceptions and arguments.
So even though for most of us the media coverage of an election can’t directly change our vote, it can still certainly have an influence on our views and perceptions, upon which our vote ultimately depends. That may not sound overly alarming; perhaps you trust your news sources to provide you with what you judge as relevant and important information. But with the rise of fake news spreading false stories around the internet, perhaps we do need to be more aware of the subtle ways in which we can be swayed by the media.