In the summer of 2019, President Donald J. Trump held a social media summit at the White House. The guests, however, were not representatives from the firms that own social media platforms. Instead, they were extremist, ultraconservative conspiracy theorists. The president nevertheless praised them for their work, calling them real journalists, vital opinion leaders, and assets to democracy. Let’s be clear, though: the White House had brought together some of the producers, distributors, and marketers of big political lies. The summit attendees had a long record of disseminating racism, sexism, purposeful disinformation, and subtle misinformation over social media. The president validated and celebrated their work, and by inviting them to the White House, he elevated the purveyors of big political lies into a central feature of modern democracy.
But this event was not just about celebrating the role of misinformation in public life. It was actually a form of campaign coordination. In preparing for the next big presidential campaign, the sitting president of the United States was rewarding his extended team with praise and validation, bringing them into the center of political power. He was acknowledging their role in his own political career, setting expectations for the campaign ahead, and sending clear signals on how he thought political news and information should be created.
I wish this book weren’t necessary. In New Media Campaigns and the Managed Citizen, I analyzed how political actors used the internet to manipulate voters in the United States. In The Digital Origins of Dictatorship and Democracy, I described how new information technologies were helping democracy advocates dismantle the tough structures of authoritarianism. In Pax Technica, I wrote about how behavioral data from networked devices was increasingly being used to make political inferences—sometimes for the collective good and sometimes for social control. This current book is about the teams of people, such as those invited to the White House and others encircling some of the world’s most prominent public figures, who do the work of embedding new information technologies into our political lives but use those technical systems to misinform us, distract us, and discourage sensible deliberation.
What connects these arguments is that, first and foremost, politics is a sociotechnical system constituted by both ideology and technology. Any framework for understanding politics that is simply about elections, political parties, and government, or that assigns technology a minor role in explaining current affairs, won’t produce compelling explanations for political outcomes. Second, any sensible definition of democracy—or authoritarianism—must include elements of its information infrastructure. This is partly because the defining features of a democracy can be abstract: broad generalizations and vague organizational arrangements that are rarely exercised or surprisingly fragile. Information policy, infrastructure standards, and cultures of technology use, by contrast, are the clear, specific, everyday examples of democracy in action and practice. Social media platforms in particular provide the structure for our political lives: they regularize our civic engagement, and they offer the most comprehensive evidence of whether we are living in a democracy or a dictatorship.
If ruling elites, lobbyists, and shady politicians can use new information technology for political redlining or creating astroturf movements, they will. As you’ll see in the chapters ahead, political redlining is the process of restricting our supply of political information based on assumptions about our demographics and our present or past opinions. It occurs when political consultants identify which populations are less likely to vote and then ignore them, spending their time serving only likely voters. Sometimes campaign managers dedicate resources to ensuring that a certain group won’t vote.
I’m hopeful, positivist, and unbending in this book: I work with an expansive and inclusive definition of what I mean by “big political lies,” and I have faith that exposing and correcting them with evidence can improve policy making, governance, and public life.
There are so many variations of political lies that I will not philosophize about their nuances. Yet I do believe that evidence and truth can disabuse us of bad ideas based on misinformation, disinformation, untruth, half-truth, distortion, or omission of facts. If pressed for a formal definition, I would say that a lie machine is the set of social and technical mechanisms for putting an untrue claim into the service of political ideology. It’s not PR, and it’s more than political campaign messaging. Computational propaganda, in turn, is the use of algorithms to produce, distribute, and market untruths that serve ideology. Some writers prefer disinformation, misinformation, propaganda, or other terms for these fibs. For me, however, understanding the nuanced distinctions among types of political lies is not as important as understanding the political economy and technological basis of their creation and success. So in the pages that follow, I am not rigid in using just one term; I use whichever term most sensibly describes the example at hand. In my mind, the examples in this book are all in the family of big political lies.
Authoritarian governments will look, almost by definition, for ways to use new information technologies for social control. Their strategy involves seeding multiple conflicting stories about events and using technologies to prevent the public from having sensible debates about those events. Democracy advocates will always look for creative ways to catch dictators off guard using information technology. What I didn’t expect, before I began research in this field, was that authoritarian regimes would use such techniques on voters in democracies. Nor did I expect politicians, lobbyists, and political parties in democracies to adapt those techniques for use on their own citizens. I assumed that electoral administrators, public officials, and courts would prevent the political communication techniques of dictatorships from diffusing into democracies.
In 2014, I wrote an opinion piece suggesting that bot use was on the rise and that politicians should pledge not to use them on voters. “Would political campaign managers in a democracy like the United States use bots?” I asked. I concluded, “The question is not whether political parties in democracies will start using bots on one another—and us—but when.”
I don’t believe that politicians or political campaign managers in democracies read that piece. In 2015, I wrote Pax Technica, a book about the shifting location of political power from the organizations of government to the technology firms of Silicon Valley. Much of the evidence in the book was from authoritarian regimes and failed states where technology platforms were providing governance goods that public agencies couldn’t provide. These were also countries where political elites were using information technology as a tool for social control.
This book continues my argument that politics is best understood as a sociotechnical system. There is a global economy of political lies, and in these pages I explain how that economy has evolved because of how we use our technology to organize data and how technology innovation uses our data. The biggest political lies are the result of complex interactions between people and our technologies, and any sensible causal explanation for modern politics must dig into that complexity. To tell this story you must have a global perspective, and you must feature people and technology as key actors. The causal story is a global one because the production, distribution, and marketing of political lies involve people and places all over the world, even if the targets for misinformation are the citizens of one country. The causes of democratic malaise, rising populism, poor political decision making, and declining voter sophistication are complex and conjoined: people produce and consume the lies, but the algorithms, data sets, and information infrastructure determine the impact of those lies.
I have an expansive definition of what makes a political lie. Several key aspects of political lies are covered in chapter 1. But there are so many variations of misleading political content that I don’t believe the nuances are more important than the consequences of their production, distribution, and marketing to citizens. Looking across the examples used in this book, there are incorrect opinions and clear untruths. There is misinformation and disinformation deliberately crafted to be so extremist, sensationalist, and conspiratorial that it effectively becomes a lie. The common ingredients include incendiary, polarizing commentary and points of fact so wrong that the main outcome of accepting or tolerating the inaccuracies is an incendiary, polarizing conclusion. Content that promotes undue skepticism, negative emotions, and contrarian views for the sake of “teaching the controversy,” or text and video messages that bring anxiety or aversion to dialogue and new evidence, also falls into the broad category of political lies.
Writing about all the sordid details of political lies, manipulations, and deception may seem a soul-crushing task. But I think that these lie machines have caused several of our democratic organizations and institutions to grind to a halt. Some elections and referenda generate incredible outcomes that are almost mistakes, in the sense that so many voters had such poor-quality information that their deliberations resulted in avoidable errors.
Many people believe things that are demonstrably false. Telling organized lies helps some politicians win and stay in office, where they use bad information to make poor decisions. They generate new conspiracies and deepen public distrust, and then voters go back to the polls on election day equipped with even more grievances and less information. We can’t save our democracies unless we understand these mechanisms.
People in several countries are now governed by political leaders who have rejected, not just specific scientific advice, but the overall notion that carefully collected evidence is the best way to understand the world. Such deliberate use of lie machines is deeply distressing. But we are threatened with more than emotional distress. Purposefully organized ignorance may end up destroying our ability to deliberate sensibly.
Many people wonder what’s new about the political lies and manipulations we’ve experienced in recent years. Certainly, the history of propaganda and political rumor mongering is long. The first broadsheets, newspapers, and, later, newswires were regularly accused of spreading made-up stories serving various political agendas. But compared to today, such print journalism was harder to produce, distribute, and market to vast numbers of people.
The complex political misinformation efforts of yesteryear took significant physical resources to orchestrate. For example, after World War II, the East German Stasi faked Nazi-era records and deposited them at the bottom of Černé jezero, a lake in what was then Czechoslovakia. In 1964, the records were “found” and shared at a dramatic press conference, thereby ending the careers of some West German politicians and impeding the prosecution of war criminals. Operation Neptune, as it was dubbed, was later revealed to be one of the Stasi’s most complex disinformation campaigns. In another example, in 1974 Greek Cypriots were accused of burning a Turkish mosque on the island nation. The news was broadcast all over Turkey, giving the Turkish president a ratings boost and national support for invading Cyprus. A Turkish general later admitted that Turkish troops had burned the mosque to foment conflict. There is certainly a long history of fake news production, but in the past it was mostly employed in times of war and crisis and was produced by major governments.
Things changed significantly with the creation of the internet, which quickly became a tool for automating political lies. One of the first known political bots was the Serdar Argic account, which flooded Usenet in the early 1990s. This early bot searched for the term “Turkey” and then posted denials of the Armenian Genocide. Like some of the latest political bots, its reach was somewhat accidental: the simple keyword trigger could not tell the country from the bird, so the political misinformation also landed in newsgroups dedicated to cooking and to Christmas and Thanksgiving celebrations. But most such early examples share similar features: there were some elements of truth to the storylines, and significant financial and personnel resources were dedicated to executing the plan for disseminating the lies. They were large, one-off projects that couldn’t be easily repeated and refined over time.
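To make the mechanics concrete, here is a minimal, hypothetical sketch in Python of the kind of keyword-trigger logic such an early bot relied on. It is not a reconstruction of the actual Serdar Argic code; the function names, the sample posts, and the placeholder reply are invented for illustration. The point is simply that a trigger on the word “turkey” cannot distinguish the country from the bird, which is how propaganda ended up in cooking and holiday discussions.

```python
# Hypothetical sketch of a naive keyword-trigger reply bot, in the spirit of
# early Usenet propaganda bots. Illustrative only; not the actual bot's code.

CANNED_REPLY = "[propaganda text would be inserted here]"

def should_reply(post_text: str, trigger: str = "turkey") -> bool:
    # A naive trigger: fire on any post containing the keyword,
    # with no sense of context or topic.
    return trigger in post_text.lower()

def run_bot(posts: list[dict]) -> list[dict]:
    # Scan every post and generate a canned reply whenever the trigger matches.
    replies = []
    for post in posts:
        if should_reply(post["text"]):
            replies.append({"in_reply_to": post["id"], "text": CANNED_REPLY})
    return replies

if __name__ == "__main__":
    sample_posts = [
        {"id": 1, "text": "Debate about Turkey's foreign policy"},                # intended target
        {"id": 2, "text": "How long should I roast a turkey for Thanksgiving?"},  # accidental hit
        {"id": 3, "text": "Christmas cookie recipes"},                            # no match
    ]
    for reply in run_bot(sample_posts):
        print(reply)
```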
What makes today’s lie machines different is their low cost of production, the great speed of dissemination over social media, and the expanding industry of marketing agencies to help place and amplify computational propaganda. In 2014, the Columbian Chemicals Plant explosion hoax scared US voters with the story of a factory in Louisiana that had been attacked by ISIS (the Islamic State in Iraq and Syria). All the content was faked: images were doctored to appear to have come from CNN, falsified pages were placed on Wikipedia, and fake user accounts on multiple platforms spread the junk news. Many more examples of such lie machines are covered in the pages ahead—examples of campaigns that ran for months, taking advantage of the affordances of social media advertising technologies. For now, let’s just state that what makes modern lie machines special is that they are much less expensive to run, are based on quantitative models about information circulation, and allow constant experimentation and testing to perfect, refine, and reproduce messaging. They are produced at an industrial scale. This has made them a pernicious threat to public life.
Along with some big data analysis, this book makes use of interviews with people involved in the creation of lie machines. As is the norm in contemporary ethnography, interviewees were given pseudonyms, and some observations and experiences have been compiled and combined to create archetypes of individuals working in the production, distribution, and marketing of political lies. Company names have been changed. Although I worked with many of the interview subjects, as a security measure the files linking real people and real companies with their pseudonyms were not shared across country research teams or centralized. For example, fieldwork with firms in Poland and Brazil involved interviewing staff and experts, visiting several workplaces, and tracking their activities. But in this book I have aggregated those observations and created corporate pseudonyms using the Polish and Brazilian Portuguese words for imitation. Because these firms create fake social media profiles that emulate real citizens, I call the Polish company Imitacja Consulting and the Brazilian company Imitação Services.
Lie machines are a global problem, with a pipeline of production, distribution, and marketing that crosses international borders. Political actors are getting very good at producing big lies, social media algorithms provide an effective way of distributing those lies, and the science of marketing lies to the right audience is improving. I believe that such successful marketing actually cultivates the audience—even makes the audience grow—resulting in sustained support for the dictators, lobbyists, and dishonest politicians who are good at building and maintaining lie machines.
I hope that understanding how lie machines work can help us take them apart. The big lie machines are a mix of human and technical practices, and understanding this gives us a wider surface for intervention, with both social and technical solutions. Treating the problem this way makes it easier to craft legislation and regulation for particular industries, hold social media firms to account, identify practices that should be criminalized, and seek technical fixes to the challenges besetting democracy.