Chapter Eight
The Science of Fox News
In June 2011, Jon Stewart went on air with Fox News’s Chris Wallace and started a major media controversy over the channel’s misinforming of its viewers. Sadly, the outcome only served to demonstrate how poorly our political culture handles the problem of systemic right-wing misinformation, and how much it ignores the root dynamics behind its existence.
Stewart had long sparred with and mocked Fox. But one statement that he made that day struck the most devastating blow, both because it was so definitive and because it was so damning. “Who are the most consistently misinformed media viewers?” Stewart asked Wallace. “The most consistently misinformed? Fox, Fox viewers, consistently, every poll.”
Stewart’s statement was accurate. The next day, however, the Pulitzer Prize-winning fact-checking site PolitiFact weighed in and rated it “false.” In claiming to check Stewart’s “facts,” PolitiFact ironically committed a serious error—and later, doubly ironically, failed to correct it.
PolitiFact’s erroneous rebuttal set off a tizzy at Fox News where—on The O’Reilly Factor, Fox Nation, and Fox News Sunday—Stewart was bashed and PolitiFact lauded for its good (i.e., bad) fact-checking work. Stewart then went on air and apologized, albeit only half-seriously: he proceeded to list a mountain of cases in which PolitiFact had caught Fox spewing misinformation.
Stewart called the exercise his ascent of “Mount Fib.”
Yuks aside, PolitiFact was wrong, and Stewart was initially right—but wrong to accept the site’s correction. Thus, once PolitiFact weighed in, we moved from a situation in which at least one person was getting it right (Stewart) to a situation in which three individuals or organizations were in error—Stewart, PolitiFact, and Fox News—even as all of them now considered the matter closed. How’s that for the power of fact checking?
There probably is a group of media consumers out there who are more misinformed, overall, than Fox News viewers. But if you only consider mainstream U.S. television news outlets with major audiences (i.e., numbering in the millions), it really is true that Fox viewers are the most misled based on the evidence before us—especially in areas of political controversy. This should come as little surprise by now, but precisely why it is the case remains under-explained at present.
I’ll get to the underlying causes shortly, drawing on what we already know about left-right differences, but also introducing a new concept—“selective exposure.” For Fox News, as we’ll see, represents the epitome of an environmental (“oven”) factor that has interacted powerfully with conservative psychology. The result has been a hurricane-like intensification of factual error, misinformation, and unsupportable but ideologically charged beliefs on the conservative side of the aisle.
First, though, let’s survey the evidence about how misinformed Fox viewers are.
Based upon my research, I have located seven separate studies that support Stewart’s claim about Fox, and none that undermine it. Six of these studies were available at the time that PolitiFact took on Stewart; one of them is newer. There may well be other studies out there beyond these; I can’t claim that my research is utterly exhaustive and that there are no black swans. However, given the large amount of attention paid to the Stewart-Fox-PolitiFact flap—and my calls at that time for citations to any other studies of relevance—it seems likely that most or all of the pertinent research came to light.
The studies all take a similar form: These are public opinion surveys that ask citizens about their beliefs on factual but contested issues, and also about their media habits. Inevitably, some significant percentage of citizens are found to be misinformed about the facts, and in a politicized way—but not only that. The surveys also find that those who watch Fox are more likely to be misinformed, their views of reality skewed in a right-wing direction. In some cases, the studies even show that watching more Fox makes the misinformation problem worse.
It’s important to note that not all of these studies are able to (or even attempt to) establish causation. In other words, they don’t necessarily prove that watching Fox makes people believe incorrect things. It could be that those who are already more likely to hold incorrect beliefs (in this case, Republicans and conservatives) are also more likely to watch Fox to begin with, or to seek it out. The causal arrow could go in the opposite direction or in both directions at once.
Let me also add one more caveat. I can imagine (and could probably even design) a study that might find Fox News viewers are better informed than viewers of other cable news channels about some contested topic where biases and misinformation are driven by left-wing impulses (e.g., the kinds of issues discussed in chapter 12). Why this study doesn’t appear to exist I don’t know; but I certainly didn’t come across it in my research. Conservatives ought to perform such a study, if they want to prove Stewart even a little bit wrong.
So with that, here are the studies.
In 2003, a survey by the Program on International Policy Attitudes (PIPA) at the University of Maryland found widespread public misperceptions about the Iraq war. For instance, many Americans believed the U.S. had evidence that Saddam Hussein’s Iraq had been collaborating in some way with Al Qaeda, or was involved in the 9/11 attacks; many also believed that the much-touted “weapons of mass destruction” had been found in the country after the U.S. invasion, when they hadn’t. But not everyone was equally misinformed: “The extent of Americans’ misperceptions vary significantly depending on their source of news,” PIPA reported. “Those who receive most of their news from Fox News are more likely than average to have misperceptions.” For instance, 80 percent of Fox viewers held at least one of three Iraq-related misperceptions, more than a variety of other types of news consumers, and especially NPR and PBS users. Most strikingly, Fox watchers who paid more attention to the channel were more likely to be misled.
At least two studies have documented that Fox News viewers are more misinformed about global warming.
In a late 2010 survey, Stanford University political scientist Jon Krosnick and visiting scholar Bo MacInnis found that “more exposure to Fox News was associated with more rejection of many mainstream scientists’ claims about global warming, with less trust in scientists, and with more belief that ameliorating global warming would hurt the U.S. economy.” Frequent Fox viewers were less likely to say the Earth’s temperature has been rising and less likely to attribute this temperature increase to human activities. In fact, there was a 25 percentage point gap between the most frequent Fox News watchers (60%) and those who watch no Fox News (85%) on whether they think global warming is “caused mostly by things people do or about equally by things people do and natural causes.” Strictly speaking, the correct answer is that global warming is caused mostly by things people do—but clearly, agreeing with this combined statement is far more accurate than disagreeing with it.
In a much more comprehensive study released in late 2011 (too late for Stewart or for PolitiFact), American University communications scholar Lauren Feldman and her colleagues reported on their analysis of a 2008 national survey, which found that “Fox News viewing manifests a significant, negative association with global warming acceptance.” Viewers of the station were less likely to agree that “most scientists think global warming is happening” and less likely to think global warming is mostly caused by human activities, among other measures. And no wonder: Through a content analysis of Fox coverage in 2007 and 2008, Feldman and her colleagues also demonstrated that Fox coverage is more dismissive about climate science, and features more global warming “skeptics.”
The Feldman study also contained an additional fascinating finding: Those Republicans who did watch CNN or MSNBC were more persuaded to accept global warming than Democratic viewers were. In other words, Republicans in the study seemed much more easily swayed by media framing than Democrats, in either direction. (This is something this book will return to.)
Once again, at least two studies have found that Fox News viewers are more misinformed about health care reform.
In 2009, an NBC survey found “rampant misinformation” about the health care reform bill before Congress—derided on the right as “Obamacare.” It also found that Fox News viewers were much more likely to believe this misinformation than average members of the general public. “72% of self-identified FOX News viewers believe the healthcare plan will give coverage to illegal immigrants, 79% of them say it will lead to a government takeover, 69% think that it will use taxpayer dollars to pay for abortions, and 75% believe that it will allow the government to make decisions about when to stop providing care for the elderly,” the survey found. By contrast, among CNN and MSNBC viewers, only 41 percent believed the illegal immigrant falsehood, 39 percent believed in the threat of a “government takeover” of healthcare (40 percentage points less), 40 percent believed the falsehood about abortion, and 30 percent believed the falsehood about “death panels” (a 45 percentage point difference!).
In early 2011, the Kaiser Family Foundation released another survey on public misperceptions about health care reform. The poll asked 10 questions about the newly passed healthcare law and compared the “high scorers”—those who answered 7 or more correct—based on their media habits. The result was that “higher shares of those who report CNN (35 percent) or MSNBC (39 percent) as their primary news source [got] 7 or more right, compared to those that report mainly watching Fox News (25 percent).” The questions posed had some overlaps with the 2009 NBC poll—for instance, about providing care to undocumented immigrants and cutting some benefits for those on Medicare—but also covered a variety of other factual matters that arose in the healthcare debate.
In late 2010, two scholars at Ohio State University studied public misperceptions about the so-called “Ground Zero Mosque”—and in particular, the prevalence of a series of rumors depicting those seeking to build this Islamic community center and mosque as terrorist sympathizers, anti-American, and so on. All of these rumors had, of course, been dutifully debunked by fact-checking organizations. The result? “People who use Fox News believe more of the rumors we asked about and they believe them more strongly than those who do not.” Respondents reporting a “low reliance” on Fox News believed 0.9 rumors on average (out of 4); for those reporting a “high reliance” on Fox News, the average increased to 1.5 out of 4.
In late 2010, the Program on International Policy Attitudes (PIPA) once again singled out Fox in a survey about misinformation during the 2010 election. Out of 11 false claims studied in the survey, PIPA found that “almost daily” Fox News viewers were “significantly more likely than those who never watched it” to believe 9 of them, including the misperceptions that “most scientists do not agree that climate change is occurring” (they do), that “it is not clear that President Obama was born in the United States” (he was), that “most economists estimate the stimulus caused job losses” (it either saved or created several million jobs), that “most economists have estimated the health care law will worsen the deficit” (they have not), and so on.
It is important to note that in this study—by far the most critiqued of the bunch—the examples of misinformation surveyed were all closely related to prominent issues in the 2010 midterm election, and indeed, were selected precisely because they involved issues that voters said were of greatest importance to them, like health care and the economy. That was the main criterion for inclusion, explains PIPA senior research scholar Clay Ramsay. “People said, here’s how I would rank that as an influence on my vote,” says Ramsay, “so everything tested is at least a 5 on a zero to 10 scale.”
So the argument that the poll’s topics were chosen so as to favor Democrats, and to punk Fox viewers, doesn’t hold water. Indeed, the poll question that was of least import to voters, and thus whose inclusion was most questionable, was one that provided a clear opportunity to trap Democrats in an incorrect belief—and succeeded in doing so. It was a rather tricky question: whether President Obama’s allegation—that the U.S. Chamber of Commerce had raised foreign money to run attack ads on Democratic candidates, and to support Republican candidates—had been “proven to be true.” Actually, PolitiFact had rated the claim as only “half-true,” so “proven to be true” was judged to be incorrect—but 57 percent of Democratic voters gave that wrong answer.
So much for attempts to challenge the topics chosen in this study. And even if you were to throw out the study entirely, the other six remain, and the weight of the evidence barely shifts.
On the subject of Fox News and misinformation, PolitiFact simply appeared out of its depth. The author of the article in question, Louis Jacobson, only cited two of the studies above—“Iraq War” and “2010 Election”—though six out of seven were available at the time he was writing. And then he suggested that the “2010 Election” study should “carry less weight” due to various methodological objections.
Meanwhile, Jacobson dug up three separate studies that, once we understand the political mind, we can dismiss as irrelevant. That’s because these studies did not concern misinformation, but rather, how informed news viewers are about basic political facts like the following: “who the vice president is, who the president of Russia is, whether the Chief Justice is conservative, which party controls the U.S. House of Representatives and whether the U.S. has a trade deficit.”
A long list of public opinion studies have shown that too few Americans know the answers to such basic questions. They are insufficiently informed about politics, just as they are about science, economics, and American history. That’s lamentable, but also off point at the moment. These are not politically contested issues, nor are they skewed by an active misinformation campaign. As a result, on such issues many Americans may be ill-informed but liberals and conservatives are nevertheless able to agree.
Jon Stewart was clearly talking about political misinformation. He used the word “misinformed.” And for good reason: Misinformation is by far the bigger torpedo to our national conversation, and to any hope of a functional politics. “It’s one thing to be not informed,” explains David Barker, a political scientist at the University of Pittsburgh who has studied conservative talk-radio listeners and Fox viewers. “It’s another thing to be misinformed, where you’re confident in your incorrectness. That’s the thing that’s really more problematic, democratically speaking—because if you’re confidently wrong, you’re influencing people.”
From the point of view of the political brain, the distinction between lacking information and believing misinformation is equally fundamental. Whether you know who the president of Russia is—that’s one type of question. It’s a question where there’s no political stake and someone who doesn’t know the answer can accept it when it is provided, because it doesn’t require any emotional sacrifice to do so. However, whether global warming is human caused is fundamentally different. It’s a question that is politicized, and thus engages emotions, identity, and classic pathways of biased reasoning. So to group together the lack of information with misinformation is, from this book’s perspective, the most flagrant of fouls.
And it gets even worse for PolitiFact: There are reasons to think that Fox News viewers are both more informed than the average bear, and yet, simultaneously more misinformed on key politicized issues. In other words, many of them are classic “smart idiots” engaging in motivated reasoning to support their beliefs. “They’re an active group, that actually knows a fair amount of political facts,” explains Barker. “They can tell you who the members of the Supreme Court are, and things like that. But when it comes to political information that has any kind of a partisan element to it, where a correct answer helps one side politically, or hurts one side politically, [being misinformed is] very typical of them.”
Thus PolitiFact’s approach was itself deeply uninformed, and underscores the ignorance about psychology that pervades mainstream politics. Indeed, after I refuted its analysis in a much-read blog post, PolitiFact failed to correct its error, or even to mention that it had missed four relevant studies in its analysis.
Almost entirely missing in the PolitiFact-Stewart flap was any weighing of the truly interesting and important question: Why are Fox News viewers so misinformed?
To answer it—thereby showing the interaction between media change on the one hand, and conservative psychology on the other—we’ll first need to travel once again back to the 1950s, and the pioneering work of the psychologist and Seekers infiltrator, Leon Festinger.
In his 1957 book A Theory of Cognitive Dissonance, Festinger built on his study of Mrs. Keech and the Seekers, and other research, to lay out many ramifications of his core idea about why we contort the evidence to fit to our beliefs, rather than conforming our beliefs to the evidence. That included a prediction about how those who are highly committed to a belief or view should go about seeking information that touches on that powerful conviction.
Festinger suggested that once we’ve settled on a core belief, this ought to shape how we gather information. More specifically, we are likely to try to avoid encountering claims and information that challenge that belief, because these will create cognitive dissonance. Instead, we should go looking for information that affirms the belief. The technical (and less-than-ideal) term for this phenomenon is “selective exposure”: we selectively choose to be exposed to information that is congenial to our beliefs, and avoid “inconvenient truths” that are uncongenial to them. Or as one group of early researchers put it, in language notable for its tone of wrecked idealism:
In recent years there has been a good deal of talk by men of good will about the desirability and necessity of guaranteeing the free exchange of ideas in the market place of public opinion. Such talk has centered upon the problem of keeping free the channels of expression and communication. Now we find that the consumers of ideas, if they have made a decision on the issue, themselves erect high tariff walls against alien notions.
Selective exposure is generally thought to occur on the individual level—e.g., one person chooses to watch Fox News. But when we think about conservative Christian homeschooling or the constant battles over the teaching of controversial issues in public schools—where authoritarian parents seek to skew curricula to prevent their children from hearing threatening things—a kind of selective exposure is also on full display. The only difference is that it’s selective exposure of information for someone else. It’s parents trying to control what their children are exposed to, actively seeking to blind the next generation rather than themselves.
Selective exposure theory grows out of the cognitive dissonance tradition, but the concept of erecting “tariff walls” against inconvenient truths gels with the theory of motivated reasoning as well. As Charles Taber of Stony Brook University explains, motivated reasoning makes us susceptible to all manner of confirmation biases—seeking or greatly emphasizing evidence that supports our views and predispositions—and disconfirmation biases—attacking information that threatens us. In this context, “selective exposure” might be considered a certain breed of “confirmation bias,” one involving our media choices in particular. (As we’ll see, the theory of motivated reasoning also implies that “selective exposure” may operate, at least in part, on a subconscious and emotional level that we’re not even aware of.)
If Festinger’s ideas about “selective exposure” are correct, then I was wise to be cautious, earlier, about whether the chief problem with Fox News is that it is actively causing its viewers to be misinformed. It’s very possible that Fox could be imparting misinformation even as politically conservative viewers are also seeking the station out—highly open to it and already convinced about many falsehoods that dovetail with their beliefs. Thus, they would come into the encounter with Fox not only misinformed and predisposed to become more so, but inclined to be very confident about their incorrect beliefs and to impart them to others. In this account, political misinformation on the right would be driven by a kind of feedback loop, with both Fox and its viewers making the problem worse.
Psychologists and political scientists have extensively studied selective exposure, and within the research literature, the findings are often described as mixed. But that’s not quite right. In truth, some early studies seeking to confirm Festinger’s speculation had problems with their designs and often failed—and as a result, explains University of Alabama psychologist William Hart, the field of selective exposure research “stagnated” for several decades. But it has since undergone a dramatic revival—driven, not surprisingly, by the modern explosion of media choices and growing political polarization in the U.S. And thanks to a new wave of better-designed and more rigorous studies, the concept has become well established.
“Selective exposure is the clearest way to look at how people create their own realities, based upon their views of the world,” says Hart. “Everybody knows this happens.”
The first wave of selective exposure research, much of it conducted during the 1960s, resulted in the drawing of one key distinction that we must keep in mind. Even in cases where the sorting of people into friendly information channels had been demonstrated, critics questioned whether the study subjects were actively and deliberately building “tariff walls” to protect their beliefs. Rather, they suggested that selective exposure might be de facto: People might encounter more information that supports their personal views not because they actively seek it, but because they live in communities or have lifestyle patterns that strongly tilt the odds in favor of such encounters happening in the first place.
Thus, if you live in a “red state,” Fox News is more likely to be on the TV in public places—bars, waiting rooms—than if you live in a “blue state.” And your peers and neighbors are much more likely to be watching it and talking about it. Anyone who travels around America will notice this, rendering the distinction between de facto and what we might call motivated selective exposure an important one.
However, more modern studies of selective exposure are explicitly designed to rule out the possibility of de facto explanations. As a result, by 2009, Hart and a team of researchers were able to perform a meta-analysis—a statistically rigorous overview of published studies on selective exposure—that deliberately omitted these problematic research papers. That still left 67 relevant studies, encompassing almost 8,000 individuals, and by pooling them together Hart found that people overall were nearly twice as likely to consume ideologically congenial information as to consume ideologically inconvenient information—and in certain circumstances, they were even more likely than that. That’s not to say nobody ever goes seeking what political science wonks sometimes call “counterattitudinal” information—often they do. But it’s rarer, overall, than seeking friendly information.
When are people most likely to seek out self-affirming information? Hart found that they’re most vulnerable to selective exposure if they have defensive goals—for instance, being highly committed to a preexisting view, and especially a view that is tied to a person’s core values. Just as Festinger predicted, then, defensive motivations increase the “risk,” so to speak, of engaging in selective exposure.
One defensive motivation identified in Hart’s study was closed-mindedness, which makes a great deal of sense. It is probably part of the definition of being closed-minded, or dogmatic, that you prefer to consume information that agrees with what you already believe.
Knowing that political conservatives tend to have a higher need for closure—especially right-wing authoritarians, who are increasingly prevalent in the Republican Party—this suggests they should also be more likely to select themselves into belief-affirming information streams, like Fox News or right-wing talk radio or the Drudge Report. Indeed, a number of research results support this idea.
In a study of selective exposure during the 2000 election, for instance, Stanford University’s Shanto Iyengar and his colleagues mailed a multimedia informational CD about the two candidates—Bush and Gore—to 600 registered voters and then tracked its use by a sample of 220 of them. They found that Bush partisans chose to consume more information about Bush than about Gore—but Democrats and liberals didn’t show the same bias toward their own candidate.
Selective exposure has also been directly tested several times in authoritarians. In one case, researchers at Stony Brook University primed more and less authoritarian subjects with thoughts of their own mortality. Afterwards, the authoritarians showed a much stronger preference than non-authoritarians for reading an article that supported their existing view on the death penalty, rather than an article presenting the opposing view or a “balanced” take on the issue. As the authors concluded: “highly authoritarian individuals, when threatened, attempt to reduce anxiety by selectively exposing themselves to attitude-validating information, which leads to ‘stronger’ opinions that are more resistant to attitude change.”
The aforementioned Robert Altemeyer of the University of Manitoba has also documented an above-average amount of selective exposure in right-wing authoritarians. In one case, he gave students a fake self-esteem test, in which they randomly received either above-average or below-average scores. Then, everyone—the receivers of both low and high scores—was given the opportunity to say whether he or she would like to read a summary of why the test was valid. The result was striking: Students who scored low on authoritarianism wanted to learn about the validity of the test regardless of how they did on it. There was virtually no difference between high and low test scorers. But among the authoritarian students, there was a big gap: 73 percent of those who got high self-esteem scores wanted to read about the test’s validity, while only 47 percent of those who got low self-esteem scores did.
Altemeyer did it again in another study. This time, as part of a series of larger studies on prejudice and ethnocentrism, he asked 493 students the following question:
Suppose, for the sake of argument, that you are less accepting, less tolerant and more prejudiced against minority groups than are most of the other students serving in this experiment. Would you want to find this out, say by having the Experimenter bring individual sheets to your class, showing each student privately his/her prejudice score compared with the rest of the class?
Right-wing authoritarians tend to be highly prejudiced and intolerant. But their response to this question also showed that compared with those who were less authoritarian, they didn’t want to learn this about themselves. Only 55 percent of Altemeyer’s authoritarians wanted to find out about their degree of prejudice, compared with 76 percent of his students rating low on authoritarianism. And the difference held up when the test was performed in a slightly different way, with an even larger group of students. When Altemeyer gave half of the students the opportunity to learn if they were more prejudiced, and the other half the opportunity to learn if they were less prejudiced, the authoritarians were much more likely to want to hear the good news about themselves, but not to hear the bad—selective exposure in action. The non-authoritarians, again, wanted the information either way.
Authoritarians, Altemeyer concludes, “maintain their beliefs against challenges by limiting their experiences, and surrounding themselves with sources of information that will tell them they are right.”
The evidence on selective exposure, as well as the clear links between closed-mindedness and authoritarianism, gives good grounds for believing that this phenomenon should be more common and more powerful on the political right. Lest we leap to the conclusion that Fox News is actively misinforming its viewers most of the time—rather than enabling them through its very existence—that’s something to bear in mind.
And if selective exposure is worse among authoritarians, it is probably worse still among authoritarians who are also political sophisticates—because, just like motivated reasoning, selective exposure appears to be stronger among sophisticates in general.
In a powerful motivated reasoning study that also examined selective exposure, Charles Taber and Milton Lodge of Stony Brook gave their test subjects—whose political views and basic political literacy had already been measured—the opportunity to read information that either supported or challenged their views on affirmative action and gun control. In the experiment, the participants had to actively choose which positions to read on the issues, and those positions were identified with a well known and clearly political source—the Republican Party and the National Rifle Association on gun control, the Democratic Party and the National Association for the Advancement of Colored People on affirmative action, and so on.
Overall, the test subjects showed a clear tendency to choose friendly information over unfriendly information. But the effect was much stronger among the “sophisticates,” those who had scored higher on a test of basic civic and political literacy. These Einsteins chose to read congenial and self-affirming arguments 70 to 75 percent of the time. In the gun control case, for instance, sophisticated opponents of gun control chose to read arguments from the NRA or Republican Party three times as often as they sought out arguments from the other side (the Democratic Party or Citizens Against Handguns).
In other words, if you know a lot about a topic—or, if you think you do—this suggests you’re more likely to only consume information that is friendly to your views. Insofar as Fox viewers are highly self-confident conservative sophisticates, we have another reason for thinking that selective exposure is a key factor in driving the Fox misinformation effect.
One intriguing question is whether the inclination to engage in selective exposure occurs automatically and subconsciously, the result of emotional responses that occur outside of our awareness.
There’s little doubt that a constantly repeated behavior—like watching a particular television station—can become more or less unconscious over time. “It can be just pure habit, like driving to work in the morning, you don’t even realize how you got there, you’re just there,” says Hart. “You grab your coffee and you turn on Fox, that’s what your thought process is.”
On a first or early encounter, Fox News could also be emotionally appealing at a level prior to conscious awareness. The attraction wouldn’t necessarily come from argumentative substance—although it might—but perhaps from imagery and tone. Thus, Fox’s constant displays of American flags, and the firm and confident bombast of its hosts, might strike a psychologically pleasing note for conservatives who are flipping through the channels. Slowly that may then bleed over into consciousness, as a person becomes aware of becoming a regular Fox viewer.
The end result, according to Stony Brook’s Charles Taber, is that selective exposure probably emerges from a blend of subconscious and conscious mental operations—and of course, even our seemingly “conscious” media choices are inseparable from our emotions. “To say that these processes are triggered automatically does not mean that we are not aware of the feelings, motivations, and beliefs that are so triggered,” Taber explains. “It is when we become aware of these things that we have the subjective sense of choosing to watch some media and avoid others, but in most cases I would claim these conscious decisions are rationalizations of inclinations that were set in motion outside of awareness.”
And then, when the phone rings for a survey, people can not only identify themselves as Fox viewers, but they may deliver some pretty colorful answers to the questions asked.
But there’s another crucial ingredient involved in selective exposure, a plainly environmental one. In order to actively decide to consume information from a congenial source, such a source must be readily available to you. You must have the choice. An extreme hypothetical can serve to illustrate the point: By definition, a political conservative living in an environment that only offers liberal media sources cannot engage in selective exposure.
According to the University of Alabama’s William Hart, the research suggests that the more information people are given, the greater the chance they will engage in selective exposure. What this means is that over time, the American political environment (the “oven”) has become far more conducive to selective exposure—because media choices have simply exploded since the 1980s. That’s especially so for choices offering political fare.
It’s not just cable, whose onset obliterated the old alphabet soup monopoly enjoyed by ABC, CBS, PBS, and NBC, and gave us CNN, Fox News, and MSNBC (and the Food Network!). And it’s not just the Internet—although that’s a particularly ripe environment for selective exposure, since it offers the most choices, as well as plenty of ideological ones.
But consider another case: Conservative talk radio, the emergence of which was probably sped along by the Reagan Federal Communications Commission’s 1987 decision to do away with the “Fairness Doctrine,” which had previously required broadcasters both to cover controversial issues, and also to air different perspectives on them. Whatever the logic behind killing this rule, it surely was not based on a modern understanding of the political brain and its biases.
During the 1990s, conservative talk radio flourished, offering a powerful mix of entertainment and explicitly ideological commentary. And as scholars began to study this medium, they unveiled results that will sound familiar.
First, conservative talk-radio listeners were found to be political sophisticates—more heavily focused on political issues, wealthier, more likely to read the newspaper. And yet at the same time, they were found to be highly misinformed. Indeed, in a study by David Barker, C. Richard Hofstetter of San Diego State University, and several colleagues, it was found that “exposure to conservative political talk shows was related to increased misinformation, while exposure to moderate political talk shows was related to decreased levels of political misinformation, after controlling for other variables.” For anyone who understands the “smart idiots” effect, that makes perfect sense.
What were conservative talk-radio listeners misinformed about in the 1990s? It’s a bit of a trip down memory lane, but one that illuminates a key transitional stage leading to our current misinformation environment. They wrongly believed that “Growth in the budget deficit has increased during the Clinton presidency” and that “Teaching about religious observations is illegal in public schools,” as well as that teen pregnancy was on the rise and that student test scores were declining—all part and parcel of a right-wing narrative about America, but not actually true. They also believed several myths about welfare reform, a top issue in the Clinton era: e.g., “Most people are on welfare because they do not want to work”; and “America spends more on welfare than on defense.”
Welfare in fact presents a very well documented case study of conservative misinformation during the 1990s, one that seems closely parallel to the health care and global warming debates today.
In an early study (published in the year 2000) on the prevalence of falsehoods in American politics—one that stressed the then-novel distinction between being uninformed and believing strongly in misinformation—political scientist James Kuklinski and his colleagues at the University of Illinois at Urbana-Champaign examined contrasting public views about the facts on this issue. Sure enough, they found that conservatives (or at any rate, those who held strong anti-welfare views) tended to be both more misinformed about welfare and more confident they were right in their (wrong) beliefs. In particular, welfare opponents tended to greatly exaggerate the cost of the program, the number of families on welfare, how many of them were African-American, and so on. For instance, only 7 percent of the public was on welfare at the time of the study, but those who exaggerated by answering 18 or even 25 percent in Kuklinski’s survey were highly confident they were right. Just 1 percent of the federal budget went to welfare, but those who dramatically exaggerated the number—answering 11 or even 15 percent—were highly confident they were right. And so on.
By the time Fox News came on the air in 1996, then, the trend of providing ideological fare to conservative sophisticates—both highly engaged and confident, and also more misinformed—was already well established. Indeed, Fox’s founder, the former Nixon adviser and television producer Roger Ailes, is a close friend of Rush Limbaugh’s. In the 1990s, Ailes produced a television show for talk radio’s most popular personality. Some Fox hosts, like Sean Hannity and Bill O’Reilly, are also talk-radio stars, or were at one time—and the audiences for the two media overlap heavily. “I think that by now especially, they’ve become the same people,” says the University of Pittsburgh’s David Barker.
None of which is to suggest that Fox isn’t also guilty of actively misinforming viewers. It certainly is.
The litany of misleading Fox segments and snippets is quite extensive—especially on global warming, where it seems that every winter snowstorm is an excuse for more doubt-mongering. No less a figure than Fox’s Washington managing editor Bill Sammon was found to have written, in a 2009 internal staff email exposed by Media Matters, that the network’s journalists should:
. . . refrain from asserting that the planet has warmed (or cooled) in any given period without IMMEDIATELY pointing out that such theories are based upon data that critics have called into question. It is not our place as journalists to assert such notions as facts, especially as this debate intensifies.
And global warming is hardly the only issue on which Fox actively misinforms its viewers. The polling data here, from the Program on International Policy Attitudes (PIPA), are very telling.
PIPA’s study of misinformation in the 2010 election didn’t just show that Fox News viewers were more misinformed than viewers of other channels. It also showed that watching more Fox made believing in nine separate political misperceptions more likely. And that was a unique effect, unlike any observed with the other news channels that were studied. “With all of the other media outlets, the more exposed you were, the less likely you were to have misinformation,” explains PIPA’s director, political psychologist Steven Kull. “While with Fox, the more exposure you had, in most cases, the more misinformation you had. And that is really, in a way, the most powerful factor, because it strongly suggests they were actually getting the information from Fox.”
Indeed, this effect was present even in non-Republicans—another indicator that Fox itself is probably the cause. As Kull explains, “even if you’re a liberal Democrat, you are affected by the station.” If you watched Fox, you were more likely to believe the nine falsehoods, regardless of your political party affiliation.
In summary, then, the “science” of Fox News clearly shows that its viewers are more misinformed than the viewers of other stations, and are this way for ideological reasons. But these are not necessarily the reasons that liberals may assume. Instead, the Fox “effect” probably occurs both because the station churns out falsehoods that conservatives readily accept—falsehoods that may even seem convincing to some liberals on occasion—and because conservatives are overwhelmingly inclined to choose to watch Fox in the first place.
At the same time, it’s important to note that they’re also disinclined to watch anything else. Fox constantly reinforces the idea that the rest of the media are “biased” against conservatives, and conservatives duly respond by dismissing other outlets as not worth watching—just a pack of lies. According to Public Policy Polling’s annual TV News Trust Poll (the 2011 run), 72 percent of conservatives say they trust Fox News, but they also say they strongly distrust NBC, ABC, CBS, and CNN. Liberals and moderates, in contrast, trust all of these outlets more than they distrust them (though they distrust Fox). This, too, suggests conservative selective exposure.
And there is an even more telling study of “Fox-only” behavior among conservatives, from Stanford’s Shanto Iyengar and Kyu Hahn of Yonsei University in Seoul, South Korea. They conducted a classic left-right selective exposure study, giving members of different ideological groups the chance to choose stories from a news stream that provided them with a headline and a news source logo—Fox, CNN, NPR, or the BBC—but nothing else. The experiment was manipulated so that the same headline and story were randomly attributed to different news sources. The result was that Democrats and liberals were less inclined to choose Fox than other sources, but spread their interest across the other outlets. Republicans and conservatives, by contrast, overwhelmingly chose Fox for hard news and even for soft news, and ignored the other sources. “The probability that a Republican would select a CNN or NPR report was around 10%,” wrote the authors.
In other words—to reiterate a point made earlier—Fox News is both deceiver and enabler simultaneously. Its existence creates the opportunity for conservatives to exercise their biases, by selecting into the Fox information stream, and also by imbibing Fox-style arguments and claims that can then fuel biased reasoning about politics, science, and whatever else comes up.
It’s also likely that conservatives, tending to be more closed-minded and more authoritarian, have a stronger emotional need for an outlet like Fox, where they can find affirmation and escape from the belief challenges constantly presented by the “liberal media.” Their psychological need for something affirmative is probably stronger than what’s encountered on the opposite side of the aisle—as is their revulsion toward allegedly liberal (but really centrist) media outlets.
And thus we once again find, at the root of our political dysfunction, a classic nurture-nature mélange. The penchant for selective exposure is rooted in our psychology and our brains. Closed-mindedness and authoritarianism—running stronger in some of us than in others—likely are as well.
But nevertheless, and just as with conservative think tanks and counterexpertise, it took the development of a broad array of media choices before these tendencies could be fully activated. The seed needed fertile soil in which to grow. Cast it on stony ground—say, the more homogeneous media environment of the 1960s and 1970s, when The New York Times and Washington Post were the “papers of record” and everybody watched the three network channels and PBS—and its growth will be stunted.
Perhaps the very fact that early studies of selective exposure sometimes failed—leading psychologists to largely discard a theory that has now been revived and is coming on strong—itself suggests the potency of this environmental change.
At this point in the book’s narrative, I have laid out three different bodies of evidence that help to build a case about American conservatives’ unique misalignment with reality—and how this misalignment has come to exist.
First, I’ve explored motivated reasoning, and how this emotional and automatic process leads many of us to do just about anything to defend our identities and beliefs—including clinging to wrong ideas and arguing fiercely on their behalf. And I’ve shown some evidence suggesting that this tendency may be more prevalent on the political right (although liberals are certainly not immune to it)—not just motivated reasoning in general, but selective exposure in particular.
Second, I’ve surveyed a large body of research on conservative psychology—finding that conservatives (especially authoritarians) appear to be less Open, less tolerant of uncertainty and ambiguity, less integratively complex, and to have a stronger need for closure.
Finally, I’ve shown how political, social, and technological change in the U.S.—factors like the mobilization of a conservative movement, the proliferation of supporting think tanks and “experts,” a leftward shift of academia in response, and the growth of sympathetic conservative media outlets—have added fuel to the fire. All of these new factors interact with conservative psychology, in such a way as to make the misinformation problem worse.
Now, then, it’s time for a very different kind of evidence. It’s time to look at how factually wrong conservatives actually are. I’ve offered many hints of this throughout the book, but now it’s time to examine the evidence systematically.
This is a critically important part of the story. It would be one thing to theorize that conservatives are likely to be more dogmatic about incorrect beliefs in a context where there weren’t many real-world cases of conservatives being incorrect. People would very understandably wonder why anyone came up with such a theory in the first place.
But that’s not where we find ourselves. The evidence, in this case, is the best support for the theory one could imagine.
Notes
147 “the most consistently misinformed” Fox News Sunday, June 19, 2011. Transcript available online at http://www.foxnews.com/on-air/fox-news-sunday/transcript/defense-secretary-robert-gates-exit-interview-jon-stewart-talks-politics-media-bias?page=6.
147 rated it “false” PolitiFact, “Jon Stewart says those who watch Fox News are the ‘most consistently misinformed media viewers,’” June 20, 2011. Available online at http://www.politifact.com/truth-o-meter/statements/2011/jun/20/jon-stewart/jon-stewart-says-those-who-watch-fox-news-are-most/.
147 tizzy at Fox News For an overview of some of the fallout, see Media Matters, “Jon Stewart Gets It Right About Fox News,” June 22, 2011. Available online at http://mediamatters.org/research/201106220022.
148 my calls at that time Chris Mooney, “When Facts Don’t Matter: Proving the Problem With Fox News,” DeSmogBlog, June 29, 2011. Available online at http://www.desmogblog.com/when-facts-don-t-matter-proving-problem-fox-news.
149 widespread public misperceptions about the Iraq war Project on International Policy Attitudes, “Misperceptions, the Media, and the Iraq War,” October 2003. Available online at http://www.pipa.org/OnlineReports/Iraq/IraqMedia_Oct03/IraqMedia_Oct03_rpt.pdf.
150 late 2010 survey Jon A. Krosnick and Bo MacInnis, “Frequent Viewers of Fox News Are Less Likely to Accept Scientists’ Views of Global Warming,” December 2010. Available online at http://woods.stanford.edu/docs/surveys/Global-Warming-Fox-News.pdf.
150 much more comprehensive study Lauren Feldman et al, “Climate On Cable: The Nature and Impact of Global Warming Coverage on Fox News, CNN, and MSNBC,” International Journal of Press/Politics, in press.
151 an NBC survey NBC News Health Care Survey, August 2009. Questions available online at http://msnbcmedia.msn.com/i/MSNBC/Sections/NEWS/NBC-WSJ_Poll.pdf. However, this does not break down the responses by media viewership. Instead, that interpretation can be found here from NBC: http://firstread.msnbc.msn.com/_news/2009/08/19/4431138-first-thoughts-obamas-good-bad-news.
151 another survey on public misperceptions about health care Kaiser Family Foundation, “Pop Quiz: Assessing Americans’ Familiarity With the New Health Care Law,” February 2011. Available online at http://www.kff.org/healthreform/upload/8148.pdf.
152 “Ground Zero Mosque” Erik Nisbet and Kelley Garrett, “Fox News Contributes to Spread of Rumors About Proposed NYC Mosque,” October 14, 2010. Available online at http://www.comm.ohio-state.edu/kgarrett/MediaMosqueRumors.pdf.
152 misinformation during the 2010 election Program on International Policy Attitudes, “Misinformation and the 2010 Election,” December 2010. Available online at http://www.worldpublicopinion.org/pipa/pdf/dec10/Misinformation_Dec10_rpt.pdf.
153 “People said, here’s how I would rank that as an influence on my vote” Interview with Steven Kull and Clay Ramsay of the Program on International Policy Attitudes, July 7, 2011.
153 “half-true” PolitiFact, “President Obama says foreign money coming in to the U.S. Chamber of Commerce may be helping to fund attack ads,” October 11, 2010. Available online at http://www.politifact.com/truth-o-meter/statements/2010/oct/11/barack-obama/president-barack-obama-says-foreign-money-coming-u/.
154 “It’s one thing not to be informed” Interview with David Barker, July 7, 2011.
154 “They can tell you who the members of the Supreme Court are” Interview with David Barker, July 7, 2011.
155 after I refuted its analysis Chris Mooney, “Jon Stewart 1, PolitiFact 0: Fox News Viewers Are the Most Misinformed,” DeSmogBlog, June 22, 2010. Available online at http://www.desmogblog.com/jon-stewart-1-politifact-0-fox-news-viewers-are-most-misinformed.
155 PolitiFact failed to correct its error In fairness, PolitiFact received overwhelming criticism on this matter—not surprisingly, since PolitiFact was clearly wrong—and ran a follow-up item acknowledging the criticism, but it did not change its rating. See Louis Jacobson, “Readers say we were uninformed about Jon Stewart’s claim,” June 21, 2011. Available online at http://www.politifact.com/truth-o-meter/article/2011/jun/21/readers-sound-about-our-false-jon-stewart/.
155 his 1957 book Leon Festinger, A Theory of Cognitive Dissonance, Stanford: Stanford University Press, 1957. See chapters 6 and 7, “Voluntary and Involuntary Exposure to Information: Theory,” and “Voluntary and Involuntary Exposure to Information: Data.”
156 “high tariff walls against alien notions” Paul F. Lazarsfeld, Bernard R. Berelson, and Hazel Gaudet, The People’s Choice, New York: Columbia University Press, 1948.
156 confirmation biases . . . and disconfirmation biases Charles Taber, email communication, July 7, 2011.
157 findings are often described as mixed Shanto Iyengar and Kyu S. Hahn, “Red Media, Blue Media: Evidence of Ideological Selectivity in Media Use,” Journal of Communication, Vol. 59, 2009, 19–39.
157 “Everybody knows this happens” Interview with William Hart, July 11, 2011.
157 selective exposure might be de facto Shanto Iyengar et al, “Selective Exposure to Campaign Communication: The Role of Anticipated Agreement and Issue Public Membership,” The Journal of Politics, Vol. 70, No. 1, 2008, pp. 186–200.
157 statistically rigorous overview of published studies on selective exposure William Hart et al, “Feeling Validated Versus Being Correct: A Meta-Analysis of Selective Exposure to Information,” Psychological Bulletin, 2009, Vol. 135, No. 4, 555–588.
158 often they do [seek out counter-attitudinal information] See R. Kelly Garrett and Paul Resnick, “Resisting Political Fragmentation on the Internet,” Daedalus, Fall 2011. Available online at http://www.comm.ohio-state.edu/kgarrett/Assets/GarrettResnick-ResistingPoliticalFragmentation-prepress.pdf.
158 Democrats and liberals didn’t show the same bias Shanto Iyengar et al, “Selective Exposure to Campaign Communication: The Role of Anticipated Agreement and Issue Public Membership,” The Journal of Politics, Vol. 70, No. 1, 2008, pp. 186–200.
159 “highly authoritarian individuals, when threatened . . .” Howard Lavine et al, “Threat, Authoritarianism, and Selective Exposure to Information,” Political Psychology, Vol. 26, No. 2, 2005.
159 an above average amount of selective exposure in right-wing authoritarians For these two studies, see Robert Altemeyer, The Authoritarian Specter, Cambridge, MA: Harvard University Press, 1996, pp. 139–142.
160 “maintain their beliefs against challenges by limiting their experiences” Ibid., p. 111.
160 powerful motivated reasoning study Charles S. Taber and Milton Lodge, “Motivated Skepticism in the Evaluation of Political Beliefs,” American Journal of Political Science, Vol. 50, No. 3, July 2006, pp. 755–769.
161 “You grab your coffee and you turn on Fox” Interview with William Hart, July 11, 2011.
162 “the subjective sense of choosing to watch some media and avoid others” Charles Taber, email communication, July 6, 2011.
162 “The more information people are given” Interview with William Hart, July 11, 2011.
162 a particularly ripe environment for selective exposure For a more optimistic take on selective exposure on the Internet, see Matthew Gentzkow and Jesse M. Shapiro, “Ideological Segregation Online and Offline,” available online at http://faculty.chicagobooth.edu/matthew.gentzkow/research/echo_chambers.pdf.
163 political sophisticates David C. Barker, Rushed to Judgment: Talk Radio, Persuasion, and American Political Behavior, New York: Columbia University Press, 2002. For this study see Chapter 7.
163 more misinformed about welfare, and also more confident they were right James Kuklinski et al, “Misinformation and the Currency of American Citizenship,” The Journal of Politics, Vol. 62, No. 3, August 2000, pp. 790–816.
164 “they’ve become the same people” Interview with David Barker, July 7, 2011.
164 “refrain from asserting that the planet has warmed” Media Matters, “FOXLEAKS: Fox boss ordered staff to cast doubt on climate science,” December 15, 2010. Available online at http://mediamatters.org/blog/201012150004.
165 “strongly suggests they were actually getting the information from Fox” Interview with Steven Kull and Clay Ramsay of the Program on International Policy Attitudes, July 7, 2011.
165 annual TV News Trust Poll Public Policy Polling, “PBS the most trusted name in news,” January 19, 2011. Available online at http://www.publicpolicypolling.com/pdf/PPP_Release_National_0119930.pdf.
166 “Fox-only” behavior among conservatives Shanto Iyengar and Kyu S. Hahn, “Red Media, Blue Media: Evidence of Ideological Selectivity in Media Use,” Journal of Communication, Vol. 59, 2009, 19–39. See also Shanto Iyengar and Richard Morin, “Red Media, Blue Media,” The Washington Post, May 3, 2006. Available online at http://www.washingtonpost.com/wp-dyn/content/article/2006/05/03/AR2006050300865_pf.html.