Index

1980 US presidential campaign, 103

2008 US presidential election, 5

2016 US presidential election: bot use during, 69–73

debates, 50, 70–71

Democratic Party hacked, 8–9, 50

junk news circulation, 12, 55, 68, 72–73, 100, 109–10

Pizzagate, 25, 84

primaries, 50, 74

Russian efforts to influence, 11–12, 24, 49, 50, 65–66

voter fraud allegations, 84, 98. See also Clinton, Hillary; Trump, Donald J.

2017 UK elections, 54, 57

2018 US elections, 24, 94. See also political polarization

2020 US presidential elections, 77

A/B testing, 124, 171 (defined). See also message testing

advertising: AI and, 145, 151

algorithms and ad distribution, 67–68

during Brexit campaign, 118, 119–28, 128t, 130–32, 133–34

conversion rate, 120, 123–24, 127, 133, 134–35, 172

data mining and, 113

Facebook’s pricing, 132

false advertising vs., 15

and marketing of political lies, 17

proposed guidelines, 155

Russian ads on social media, 45–49, 65–66

tithing, 165

TV/radio/print ads, 6

and voter activation, 110, 123–24

affinity campaigns, 128–29, 131, 171 (defined). See also BeLeave campaign; Vote Leave campaign

affordances, xvi, 17–18, 19, 20, 67, 95, 171 (defined)

Afghanistan, 49

African Americans, 49, 51, 140, 147–48

AggregateIQ (AIQ), 119–20, 128–30. See also Brexit campaign

Albania, 122

Algeria, 34

algorithms: AI and, 144, 147, 173 (defined) (see also artificial intelligence)

audits, 165, 166

computational propaganda and, 34, 67–68, 69, 91, 108, 141

and the distribution of misinformation, 2, 3, 17, 52, 67–68, 79, 104, 108, 142–43

Hong Kong protest posts removed, 55

and message testing, 122 (see also message testing)

power of, 56, 73, 141, 157

and speed of junk news distribution, 100. See also artificial intelligence; bots and botnets; computational propaganda; social media platforms

All Out War: The Full Story of How Brexit Sank Britain’s Political Class (Shipman), 118, 120–22, 124

American Association of Public Opinion Researchers, 58

Andrejevic, Mark, 65, 154

Antisocial Media: How Facebook Disconnects Us and Undermines Democracy (Vaidhyanathan), 155

Arab Spring, 5–6, 10, 34–38, 55

Argentina, 76

Armenian Genocide, xvi

artificial intelligence (AI; machine learning), 23, 26–27, 144–54, 155, 171 (defined)

Assad, Bashar al-, 61. See also Syria

Assignments for Savushkin 55 (document), 39–40

astroturfing, 90, 171 (defined)

Australia, 138

authoritarian governments: censorship and surveillance, 1–2, 32, 36–37, 63–64

citizen data controlled, 159

defined, 171

lessons from Arab Spring, 34–38

media strategies (historically), 37

Morozov’s warning on internet use, 10–11

social media used for social control, xii, 1–2, 6, 11, 33, 36–38, 75–78

strategic use of information technologies, lie machines, xii, xiii, 1–2, 6, 36–38, 44, 52, 60, 77, 139, 140, 146. See also specific countries

automated social media accounts/activity: during 2016 presidential campaign, 69–73

and fake news, 23

and Hong Kong protests, 55

identifying, 58–59, 146

power of, 14, 17, 56–57, 168

within Russia, 55

in Turkey, Philippines, 2. See also bots and botnets; fake social media accounts

automation. See algorithms; artificial intelligence; automated social media accounts/activity; bots and botnets

Azerbaijan, 77

Bahrain, 36, 63, 77

Bakshy, Eytan, 104

behavioral data: AI and, 27, 151, 153

as political information, x, 3, 113–16, 156–57, 159–60

sources, 3, 4 (see also specific sources). See also data

BeLeave campaign, 119, 124–25, 128–31, 134, 182(n30). See also Brexit campaign

belief in lies/misinformation, xiv, 98, 135

Ben Ali, Zine El Abidine, 34, 35

bias. See cognitive bias

big data: AI and, 23, 27, 144–47

big-data analytics, 23, 65, 143

and computational propaganda, 21 (see also computational propaganda)

defined, 171–72

democracy and, 18, 158–66

sharing access, 158–60, 167. See also data

Bolsonaro, Jair, 91

bots and botnets: and 2016 presidential elections, 11–12, 25, 65–66

advantages, 60, 77, 79, 142–43

in Brazilian politics, 91–95

chatbots, 26–27, 145, 146, 152

computational propaganda and, 22, 67–68, 142–43

conversation choked off, 56, 62–63, 68, 76

defined and explained, 56–57, 62, 70, 143, 172

first use of political bot, xv–xvi

harmful content amplified, 34

identifying, 55, 57, 58–59, 62, 77, 143, 145, 146

Imitacja Consulting’s use of, 90–91

legends vs., 89, 90

market for, 78–81

as part of lie machine infrastructure, 17, 25, 54, 56–57, 85

political bots around the world, 74–78

political campaigns’ and parties’ use of, xii, 73–74, 76–77 (see also 2016 US presidential election)

political campaign solutions firms and, 64–65

positive uses/impacts, 79, 80

power and influence, 56–61, 79, 85, 168

purpose (function), 22, 76

and the Syrian civil war, 61–64

tracing bot networks, 68–74

on Twitter, 17, 42, 55, 57, 60, 61–63, 77, 78–79, 91, 92–94. See also artificial intelligence; computational propaganda

Brasil Liker, 92

Brazil, 12, 55, 91–95, 98. See also Imitação Services

Brexit campaign: canvassing, 133

digital strategy, 11, 67–68, 118–24, 129–32

impact and influence, 125–36

lies and misinformation, 11, 85, 118, 122–23, 129, 135

spending, 118, 121–22, 124–28, 130–32, 133, 181–82(n30), 182(n31)

Brunton, Finn, 52

Cambridge Analytica, 9, 64, 65, 114, 160

Canada, 12, 77, 138

cars, 4

Catalan independence, 47

causal factors, conjoined, 19–20

cell phones, 4, 63–64, 67, 121

censorship, 32, 36, 37, 143, 149, 161, 171 (defined), 172. See also authoritarian governments

Chaffee, Steve, 103

chatbots, 26–27, 145, 146, 152. See also artificial intelligence; bots and botnets

China: and Assad, 61

citizen data controlled, 159

and Hong Kong protests, 13, 55 (see also Hong Kong protests)

large-scale media experimentation, 26

media strategy (historically), 32

as misinformation superpower, 139

social media strategy, 1, 13, 33, 38, 75, 77

Citizen Lab, 63

clickbait content, 34

Clinton, Hillary, 11–12, 25, 50, 69–73, 84

CNN, xvi

cognitive bias, 11, 139. See also elective affinity; selective exposure

cognitive dissonance, 102–3

Columbian Chemicals Plant hoax, xvi

computational propaganda: algorithms and, 34, 67–68, 69, 91, 108, 141

in Brazil, 91–95

combatting or mitigating impact, 141, 168

defined and explained, xi, 17, 67–68, 142–43, 172

disseminated by troll armies, 33–34

future of AI-generated propaganda, 144–54

generated by lie machines, 16

hallmarks and advantages, 21–22, 142–43

message testing, 112 (see also message testing)

preservation of values/order through, 37

Russian polarization campaigns using, 43–51 (see also Internet Research Agency; Russia)

social media data and personalization, 104 (see also personalization)

social media platforms exploited, 17–18, 34, 67–68, 91

as threat to democracy, 97–98, 142–43, 146

tracing bot networks, 68–74. See also algorithms; bots and botnets; lie machines; political lies

Computational Propaganda Project, analysis by, 45–51

ComScore, 92

conservatives, 13, 24, 49, 51, 74, 114, 115, 117. See also Brexit campaign; far-right groups; Republican Party; Trump supporters

conversion rate, 120, 123–24, 127, 133, 134–35, 172 (defined)

core data, 9, 172 (defined). See also data

credit card companies: behavioral data, 3, 159

data and ad targeting, 152–53

data breaches, 116

data shared/sold, 9, 114, 115, 159–60

political use of credit card data, 6, 8, 9–10, 14, 156

proprietary data, 117

Crivella, Marcelo, 94–95

Cruz, Ted, 65, 74

Cummings, Dominic, 119–20, 121–22, 125–26, 127, 132

customization. See microtargeting; personalization

cyberattack (defined), 172

cyber troops, 2, 75–78. See also Internet Research Agency; trolls and troll armies

cyberwar, 63–64, 76, 172 (defined). See also Syria

Cyprus, xv

data: AI and, 23, 27, 144–47, 151, 153

buying/selling, 2, 114–15, 116, 153, 159–60, 163–64, 165

and canvassing, 133

citizens’ knowledge of/control over, 113–14, 115–16, 117, 156, 160–64, 166–67, 169

core data, 9–10, 172 (defined)

data breaches, 116

as delivery mechanism for misinformation, 7

democracy and, 18, 158–66

device-level data, 4, 10, 11, 27, 67, 150–51

donation of, 160, 163–64

fake archives, 150

historical recordkeeping, 153–54, 159–60

illegal harvesting, 34

management and protection, 8–9, 115–17, 155, 159–60, 166–67

and personalization of ads/content, 16, 104, 111, 115, 151, 153

political campaigns’ use/management, 6–10, 111–18

as political information, x, 3, 14, 113–18, 156, 159–60, 162, 163

privately held, 117, 144–45, 153, 155, 158–64, 167, 168, 169

proposed reforms, 155, 158–66

public data, 159–60, 165, 169

quantity, 4

raw data, 8, 114–15, 116, 117, 174

and redlining, 6–7 (see also political redlining)

shared access, 158–60, 165, 168

social media data (see social media platforms)

sources, 3, 4, 9–10, 23, 114, 121 (see also specific sources). See also behavioral data; big data

debates, political, 50, 70–71, 93

democracies: balance of power, 158

bots’ negative impacts, 59

challenges facing, 27

data flow within, 159–60, 161–65 (see also data)

election regulators, 8

foreign political intervention against (see foreign intervention in politics)

indictments, complaints against Russia’s troll armies, 48

propaganda and censorship in, 32

similarities between, 35

social media as political tool, xii, 11–12, 55–56, 75–78, 143

social media’s negative impact, 2, 18, 51, 66–67

technology firms’ response to junk news, fake accounts, 66, 73, 79, 138

vulnerability to junk news, 138–39. See also democracy; elections (generally); and specific countries, parties, and elections

democracy: data and, 158–66

democratic government defined, 172

democratic movements (see Arab Spring; Hong Kong protests)

lie machines as threat, xiv, 53, 97–98, 110, 135, 140, 142–54 (see also artificial intelligence)

redesigning internet to support norms, 22, 158–69

social media’s impact/threat, 2, 11–12, 18, 51, 66–67, 143, 146–50. See also democracies; elections (generally)

Democratic Party (US), 8–9, 50, 103. See also liberals; and specific politicians

Democratic Unionist Party (Northern Ireland), 131

deniability, 96–97

device tithe, 164–65, 172 (defined)

dictators. See authoritarian governments; and specific individuals

Dilnot, Sir Andrew, 122

disinformation: defined, 15, 172–73

future of disinformation campaigns, 137–40, 143

historical disinformation campaigns, xv–xvi

about Nemtsov’s death, 40–42

use of term, xi. See also junk news; misinformation; political lies

Disinformation and “Fake News”: Interim Report (UK House of Commons Digital Culture, Media and Sport Committee), 119, 130, 181–82(n30)

distribution systems for lies, 54–81

about, 3, 14, 16–17, 25

bots as, 54, 85 (see also bots and botnets)

social media (platforms) as, 3, 14–15, 16–17, 22, 51–53, 55, 66, 67–68, 108, 142–43, 149. See also algorithms; social media; social media platforms (companies); and specific platforms

drones, 80, 150

Duterte, Rodrigo, 19

Economist, 84

Ecuador, 76

Eghna Development and Support, 63

Egypt, 5–6, 11, 34–35, 36, 100. See also Arab Spring

elections (generally): better guidelines, 157

checks on fairness of, 8, 158

electoral laws, 93, 97, 125

impact of political lies, xiv, 12, 18, 98, 109–10, 142

selective exposure and voter loyalty, 100–101 (see also selective exposure)

social media as threat to, 11, 18

social media campaigns extended past, 97–98

surprising results, 27

voter decision making, in final days, 118, 122, 132

voter fraud allegations, 84, 98. See also democracy; and specific countries and elections

elective affinity, 100, 119, 154, 173 (defined). See also selective exposure

email, 4, 84, 88, 92, 116, 121

encrypted chat platforms, 150. See also WhatsApp

Equifax, 116

European Research Council, 69, 164

European Union: Brexit campaign lies about, 11, 85, 118, 122–23, 129

data protection law, 166–67

Russian criticism of, 42

UK referendum, 118, 133

vulnerability to junk news, 138. See also Brexit campaign

Facebook: and 2016 presidential election, 11–12, 49–50, 65–66, 72–73, 152

ad prices and revenue, 132, 134, 135

authoritarian governments’ strategic use of, 38

Brazilian bots, 92, 93–94

Brexit campaign ads, 67–68, 119–22, 123–24, 126–27, 128t, 129–32, 133–34

Cambridge Analytica scandal, 9

as distribution system for political lies, 3, 22, 66, 108

enforcement efforts by, 66, 107

fake accounts, 17, 107

impressions, 126, 127, 128t, 130–32, 134

journalism transformed, 138

models and data not shared, 60, 108

movement away from, 95

in Myanmar, 108

newssharing, selective exposure, and algorithmic ranking, 104

and North African protests, 10

in Poland, 87 (see also Imitacja Consulting)

political campaigns’ communication with voters through, 3–4

popularity and dominance, 92

quantity of content, 4

and Russian influence in US politics, 11–12, 45–51, 65–66, 152

as source of news, 100

user statistics, 46, 55

voter turnout study, 133

fact-checking, 99, 109, 140

fake news (defined), 86. See also junk news

fake social media accounts: AI-driven, 146–47

and Brazilian politics, 92, 94–95 (see also Imitação Services)

computational propaganda and, 22, 45

efforts to eliminate, 66, 73

on Facebook, 17, 107

identifying, 58–59, 146–47

impact, 34

legends, 87–90, 92, 173

managed by US firms, 107

multiple platforms and credibility of, 48

and Russian interference in foreign politics, 45–51 (see also Russia). See also automated social media accounts/activity; bots and botnets; Imitacja Consulting; Internet Research Agency; trolls and troll armies

family and friends. See social networks

far-left groups, 11. See also liberals

far-right groups, 11, 13, 15, 24, 49, 51, 86, 140. See also conservatives

Federal Communications Commission (FCC), 74

foreign intervention in politics: AI and, 147–48

future of, 140, 141, 152

generally, xii, 2, 13, 16, 74, 148–49, 168. See also China; Russia; and specific elections and organizations

Freixo, Marcelo, 94–95

Gab, 152

Gaddafi, Muammar, 34, 35

German Democratic Republic, xv

Germany, xv, 12, 100

Gladwell, Malcolm, 10

global economy of political lies. See under political lies

Google, 4, 45, 60, 88, 108, 111–12, 138. See also YouTube

Groundgame app, 133

hacking, 8–9, 50, 116

hashtags, 68, 69, 70, 78

health misinformation, 82–83

Hispanic voters, 49

Hong Kong protests (2019), 13, 25, 55, 100

ideology: defined, 173

identifying voter affiliation through data, 114, 115

junk news and, 96, 99

lies and lie machines in service of, 2, 15–16, 25, 30, 52, 141

and politics as sociotechnical system, x–xi

selective exposure and, 101–3, 104–5

social media as tool to advance, 11, 15, 33, 90

social movements and, 37–38. See also conservatives; far-right groups; liberals; political lies; political polarization; propaganda; and specific parties and political groups

Imitação Services (pseud.; Brazilian corporation), xvii, 91–95, 96, 98, 106–7, 108–9

Imitacja Consulting (pseud.; Polish corporation), xvii, 87–91, 96–97, 98, 106–7, 108–9

immigrants and immigration, 43–44, 51, 98, 122, 140, 148

impressions, 126, 127, 128t, 130–32, 134, 173 (defined)

India, 13, 33, 82, 100

Infoglut (Andrejevic), 65, 154

information bubbles, 10. See also selective exposure

information infrastructure: authoritarian governments’ control over, 36–37

causal role, xiii, 19–20

and definitions of democracy, x

interference with, 116

political conversation flow directed by, 73

and political participation, 156

power of control over, 79

profitability of data gathered through, 139

tithing, 164

as tool for political actors, 18. See also information technologies; internet; internet of things; social media platforms

information technologies: future of, 150–51

and politics as sociotechnical system, x–xi

power resulting from control of, 79

public trust in, 51, 55, 57, 58

strategic use by authoritarian governments, xii, xiii, 1–2, 6, 36–38, 44, 52, 139, 140 (see also social media; trolls and troll armies; and specific countries and technologies)

in warfare, 63–64. See also algorithms; artificial intelligence; bots and botnets; computational propaganda; information infrastructure; internet; internet of things; social media; social media platforms; technology firms

information warfare, 61–65

Instagram: as distribution system for political lies, 3, 22

popularity, 92

Russia’s use of, 12, 38, 45, 47, 48–50, 51, 66

trust levels on, 95

institutions (defined), 173

Intelligence Committees. See US House Permanent Select Committee on Intelligence; US Senate Select Committee on Intelligence

internet: changes in communication content/tone, 4–7

debate over role in public life, 10–11, 18

and the history of propaganda and political lies, xv–xvi

political actors’ use of, 97–98

and social movements, 10, 36–37 (see also specific movements and countries)

social science research and the societal role of, 19–21

user statistics, 55. See also bots and botnets; information infrastructure; information technologies; internet of things; social media; social media platforms; and specific platforms

internet of things, 4, 23, 26, 139, 151, 162, 173 (defined)

Internet Research Agency (IRA; Russia), 24–25, 31–33, 38–51, 66, 83–84, 96, 147–48. See also junk news; Russia

Iran, 13, 33, 36, 77, 139

ISIS, xvi, 49

Israel. See Psy-Group

Jack, Caroline, 14

Joint Threat Research Intelligence Group, 77–78

Jordan, 34

journalism. See news media

junk news (fake news): during 2016 Brazilian elections, 94–95

during 2016 US elections, 12–13, 55, 72–73, 100, 109–10

during 2018 US elections, 94

algorithms and distribution, 67–68

appeal, 23

belief in, xiv, 98, 135

during Brexit campaign, 122–23

computational propaganda and, 22 (see also computational propaganda)

creation, dissemination, and organization, 13, 33–34, 49, 83–87, 99–100 (see also trolls and troll armies; and specific producers)

defined, 15, 86, 96, 173

faked videos, 77, 150

growth, 138

identifying, 139

impact, 2, 23, 98, 109–10, 154

junk news organizations, 96–98, 105–7, 144 (see also Imitação Services; Imitacja Consulting)

multiple, contradictory explanations offered, 32, 40–42

public’s susceptibility, 100–105, 138–39

role/function, 22

speed of distribution, 100. See also disinformation; junk science; misinformation; specific incidents and campaigns

junk science, 27, 86, 152, 154

Kremlin. See Putin, Vladimir; Russia

Labour (UK), 54. See also Remain campaign

laws and regulations: data reforms, 162–66

electoral laws, 93, 97, 125

Lebanon, 34

legends, 87–90, 92, 173 (defined). See also fake social media accounts

liberals, 11, 43–44, 51, 114, 115, 117. See also Democratic Party; Labour (UK); Remain campaign; and specific politicians

Libya, 5–6, 34–35, 36

lie machines: appearance, 18–19

bots’ role in, 80, 142–43 (see also bots and botnets)

breaking (disassembling), 28, 154–69

characteristics (strengths), xvi

cultivation of audience by, xvii

defined and explained, xi, 1–4, 13–18, 137, 141, 173

distrust sown by, 22, 110, 142, 149–50

future of, 137–41, 144–54

global political economy of, 24, 26, 111, 140–44, 153, 160, 167

main components, 2–3, 16, 108–9, 142 (see also distribution systems for lies; marketing of lies; production of lies)

message testing, 112 (see also message testing)

money flow, 25, 26

motives for building, 37–38, 141

profitability, 97, 144

quality of lies, 85

researching, analyzing and understanding, 22–28

rise and growth, 32–34, 139, 140–41

Russian origins, 32–33, 52 (see also Internet Research Agency; trolls and troll armies)

as sociotechnical systems, 19–22, 155

as threat to democracy, xiv, 53, 97–98, 110, 135, 140, 142–54, 169 (see also democracies; democracy). See also junk news; political lies; trolls and troll armies; and specific firms and campaigns

lies (defined), 135. See also junk news; lie machines; political lies

LiveJournal, 32

lobbyists: chatbots used, 26, 152

consulting firms used, 88, 91, 117

data use and purchasing, 2, 3, 7–8, 117, 155, 162–63 (see also data)

direct communication with voters via social media, 3–4, 5, 109–12

future use of lie machines, 141, 145, 149, 152

and marketing of lies, xvii, 3, 17

as producers of lies and lie machines, 14, 137, 139–40, 141, 155, 168, 169

strategic use of information technologies, xi, xii, 6, 18, 76, 152

trolling techniques used, 19, 52

location targeting, 67–68

Macedonia, 122

machine learning. See artificial intelligence (AI)

Malaysia Airlines Flight 17, 41, 83–84

marketing of lies, 82–107

about, 3, 17, 25, 108

by Brazilian firm, 91–95, 106–7 (see also Imitação Services)

corporate media and, 106

legends, 87–90, 92, 173 (see also fake social media accounts)

market for robots, 78–81

organizing of junk news, 17, 83–87

personalization and, 104

by Polish firm, 87–91, 106–7 (see also Imitacja Consulting)

quality of lies, 85

repetition, 104–5

selective exposure and, 100–105

sowing doubt and uncertainty, 83–84, 107. See also advertising; algorithms; junk news: junk news organizations; lobbyists; personalization; political campaign support industry; political consultants

mass media, 106. See also news media

Melnikov, Ivan, 41

message testing, xvii, 112, 120–22, 124, 134, 171 (defined)

Mexico, 1, 77, 100

microtargeting, 6, 104, 142, 173–74 (defined). See also personalization; and specific campaigns and targeted groups

misinformation: algorithms and distribution of, 3, 16, 52, 67–68, 79, 104, 142–43 (see also algorithms)

counteracting, 164–65

data donation and, 164

defined, and term use, xi, 14–15, 174

false advertising vs., 15

formal structures, 22 (see also specific structures)

future impact, 155

health misinformation, 82–83

identifying, 139 (see also fact-checking)

operations around the world, 74–78

repetition of, 104–5

Russian campaigns, 40–51, 83–84 (see also Internet Research Agency; Russia)

social organization of, 51–53

structured campaigns, 85–86 (see also Brexit campaign). See also disinformation; foreign intervention in politics; junk news; lie machines; political lies; and specific misinformation campaigns

Miyo, Yuko, 103

Montenegro, 122

Moro, Sérgio, 98

Morocco, 34

Morozov, Evgeny, 10–11

Mubarak, Hosni, 34, 35

Muslim Americans, 140, 147

Myanmar, 108

nanosats, 150

National Science Foundation, 69, 164

Nazis, xv

negative campaigning, 57–58, 64, 67, 74, 107, 110–11, 156, 174

Nemtsov, Boris, 29–32, 38–39, 40–42, 47

Neves, Aécio, 92

New Knowledge (consulting firm), 74

news media: access to data, 158, 164

history of propaganda in, xv

and junk news firms/strategies, 98, 106, 138

role of editors, journalists, 109, 138–39, 158

Russia and foreign news media, 39–40

and social unrest, 36. See also junk news

North African protests. See Arab Spring

Obama, Barack, 5

Olshansky, Dmitry, 41

onboarding, 120, 174 (defined)

Operation Neptune, xv

opinion leaders, 90

Organization for Economic Cooperation and Development (OECD), 151

Oxford Internet Institute, 45. See also Computational Propaganda Project

Pakistan, 11, 13, 33

Pariser, Eli, 10

partisanship, 80, 99, 101–3. See also ideology; political polarization

pax technica, 79, 174 (defined)

Pax Technica (Howard), xii–xiii, 57

personal data. See data

personalization: of ads, messaging, and content, 6, 16, 104, 111, 152, 156, 173

and automation and computational propaganda, 4, 21–22, 153

data and, 16, 104, 111, 115, 151, 152 (see also data)

of our media consumption, 55

Peskov, Dmitry, 40

Philippines, 2, 12, 19

Phillips, Whitney, 106

Pizzagate, 25, 84

Poland. See Imitacja Consulting

polarization. See political polarization

political campaigns: ads and voter activation, 110–11, 123–24

and AI-generated propaganda, 144–47, 152–53

authoritarian tactics adopted, 80

bot use, xii, 64–65, 73–74, 76–77 (see also 2016 US presidential election)

broad vs. microtargeting by, 6 (see also personalization)

canvassing, 133

direct communication with voters via social media, 3–4, 5–6, 109–12, 118–23

future use of lie machines, 141, 145, 148–49

information infrastructure an asset to, 18

misinformation campaigns increasing, 139–40

personal data use/management, 6–10, 111–12, 113–18 (see also data)

redlining, xi, 6–7, 111–12, 131, 132, 144–45, 174

social media strategies now used, 97–98

voter suppression, 49, 148–49, 164. See also advertising; negative campaigning; political consultants; political parties; redlining; and specific campaigns and elections

political consultants, 63, 64–65

and AI, 145–46

and data, 2, 7, 8, 9, 113, 114, 115–17, 162, 163

new tactics quickly adopted, 73, 93

political affinities, 117

as producers and marketers of lies, 3, 14, 17, 25, 27, 86, 105–6, 112, 141, 142, 155

and redlining, xi, 111

social media exploited, 56, 94. See also Imitação Services; Imitacja Consulting

political lies: belief in, xiv, 98, 135

during Brexit campaign, 118, 122–23, 129, 135 (see also Brexit campaign)

creating effective lies, 80–81

defined, xiii–xiv

exposing and correcting, xi

faked videos, 77, 150

future of disinformation campaigns, 137–40, 144

global political economy of, xiii, 24, 26, 111, 140–44, 153, 160, 167 (see also specific countries and organizations)

impact on voters, elections, xiv, 12, 18, 98, 109–10, 142 (see also specific campaigns and elections)

marketing, 3, 17, 25 (see also marketing of lies)

producers and production of, 2–3, 14 (see also production of lies)

quality, 85

Russian campaigns, domestic/foreign (see Russia)

types, xi–xii, xiv, 14–15. See also disinformation; lie machines; misinformation; propaganda; and specific producers, distributors, marketers, and events

political parties: consultants used, 100, 105–6

data use, management, security, 6–7, 8–9, 157, 158, 159–60

donor records, 9

fact-checking, 109

as producers of lies, lie machines, 2, 6, 16, 105–6, 117, 137, 139–40

selective exposure and voter loyalty, 101–3

social media exploited, 11, 75, 144, 149, 168, 169

technology tools used, xii, 16, 19, 75–78, 145, 149–50. See also political campaigns; political polarization; politicians; and specific parties and countries

political polarization: AI and, 148

blame for, 24

in Brazil, 94–95, 98

as result of junk news (generally), 98, 110

Russian troll armies and, 24, 43–51, 66, 147–48 (see also Russia)

selective exposure and, 101

swing states targeted, 56, 68, 72–73, 100

political redlining, xi, 6–7, 111–12, 131, 132, 144–45, 174 (defined). See also Vote Leave campaign

politicians: communication with voters via social media, 3–4, 5, 109–12

consulting firms used, 91, 117

data use and purchasing, 2, 3, 7–8, 117, 154 (see also data)

foreign-backed promotion/discreditation of, 148–49

as producers of lies, lie machines, 2, 117, 141, 155, 168, 169

strategic use of information technologies, xi, xii, 52, 152, 153. See also political campaigns; political consultants; political parties; and specific individuals

politics: AI and the future of, 144–54

assigning blame, 24

rebuilt social media’s promise for, 168

social media’s impact, 53, 109–13

as sociotechnical system, x–xiii. See also democracies; foreign intervention in politics; political campaigns; political lies; political parties; politicians; and specific countries

polls and polling data, 3, 8, 120, 158

push polling, 57–58, 59, 74, 174 (defined)

primary elections (US), 50, 74

privacy. See data

production of lies, 29–53

about, 2–3, 30, 108

creating effective material, 80–81

dictators’ lessons from Arab Spring, 34–38

junk/fake news production, 85–86 (see also junk news)

by politicians, parties, and lobbyists, 2, 6, 14, 16, 105–6, 117, 137, 139–41, 155, 168, 169

social organization of misinformation and, 51–53. See also authoritarian governments; political campaign support industry; political consultants; trolls and troll armies; and specific countries and organizations

propaganda, xi, xv–xvi, 21, 32, 156. See also computational propaganda; disinformation; lie machines; misinformation; political lies; and specific countries

Psy-Group, 64–65

public trust: abused by bots, 57–59

AI and, 146, 149

in authoritarian countries, 37

distrust sown by lie machines, 22, 110, 142, 149

in technology and social media, 51, 55, 57, 58

push polling, 57–58, 59, 74, 174 (defined)

Putin, Vladimir, 24, 29, 30, 41, 83. See also Russia

al-Qaeda, 64

Qtiesh, Anas, 78

raw data, 8, 114–15, 116, 117, 174 (defined). See also data

Reddit, 17

redlining. See political redlining

Remain campaign, 131–32

repetition, 104–5

Republican Party, 50, 65, 74, 103. See also conservatives; Trump supporters; and specific politicians

robocalls, 9, 74

Rousseff, Dilma, 92, 93–94, 98

RT (formerly Russia Today), 31, 86

Ruffini, Patrick, 80

Russia: citizen data controlled, 159

Democratic Party hacked, 8–9, 50

efforts to influence US politics, 11–12, 13, 24, 43–51, 56, 65–66, 100, 147–48, 152

health misinformation spread, 82–83

lie machines’ origins in, 32–33, 52

as misinformation superpower, 139

and Myanmar, 108

political murders, 29–32 (see also Nemstov, Boris)

and Syria, 61, 64

troll armies and social media strategy, 18–19, 24, 30–33, 38–51, 65–66, 77, 82–84

Twitter conversations in, 44, 55

Ukraine invaded, 29, 30

US botnet masquerading as Russian, 74. See also Internet Research Agency

Saleh, Ali Abdullah, 34

Saudi Arabia, 13, 33

schemata, 102

science, 149, 164

junk science, 27, 82–83, 86, 152, 154

search engines, 14, 82, 105. See also Google

selective exposure, 100–105, 111–12, 154, 161, 174 (defined). See also cognitive bias; information bubbles

Serbia, 122

Serdar Argic bot, xv–xvi

Shipman, Tim, 118, 120–22, 124

Shirky, Clay, 10

smart devices, 4, 27, 150, 152–53. See also cell phones; internet of things

social control, 6, 11, 38, 75–78. See also authoritarian governments; censorship; surveillance; and specific countries

social media: authoritarian governments’ strategic use, xii, 1–2, 6, 11, 33, 36–38, 75–78 (see also specific countries)

automated tools, 4

Brexit campaign and, 11, 67–68, 118–24, 129–32 (see also Brexit campaign)

democracy impacted/threatened, 2, 11–12, 18, 51, 66–67, 143, 146–50, 169

direct communication with voters via, 3–4, 5–6, 109–12, 118–23

as distribution system for political lies, 3, 14–15, 16–17, 22, 51–53, 66, 67–68, 108, 142–43, 149 (see also trolls and troll armies)

donation of social network accounts, 94

far-right groups’ use, 24, 86

foreign intervention in politics via (see foreign intervention in politics)

needed changes, 156, 163–69

networked structure, 66

as news source, 1, 72, 100

organized manipulation, around the world, 74–78

political role (generally), x–xi, 10–11

and public opinion, 109–13

and selective exposure, 100–105, 154, 161

social control through, 6, 11, 38, 75–78, 143

social movements and, 5–6, 10, 34–35, 36, 55 (see also Arab Spring)

speed of dissemination, xvi, 100

ubiquity and popularity, 51, 54–55. See also algorithms; automated social media accounts/activity; bots and botnets; fake social media accounts; internet; social media platforms; social networks

social media platforms (companies): accountability and regulation, 138, 141, 143, 162, 165–66, 166–67

affordances, xvi, 17–18, 19, 20, 67, 95, 171 (defined)

algorithms and distribution of misinformation, 67–68, 79, 91, 104, 108 (see also algorithms)

authoritarian governments’ manipulation enabled by, 38

data collection and personalization, 4, 104

data management and sharing, 9, 115–16, 154–56, 160, 167, 168, 169 (see also data; data: shared access)

as distribution system for misinformation, 3, 14–15, 16–17, 22, 51–53, 55–57, 67–68, 79–80, 142–43, 149

encrypted chat platforms, 150

enforcement efforts by, 66, 73, 79, 143

exploited by computational propaganda, 17–18, 34, 67–68, 91 (see also trolls and troll armies)

as fundamental infrastructure, 166–67

identifying legend networks, 89

internal research, 112–13, 133, 181(n14)

manipulation of public opinion, dangers of, 154–60

movement from Facebook to other platforms, 95

proposed reforms, 163–69

public trust in, 51, 55, 57

research using data from, 60, 103–4, 112–13

Russian account data shared with Congress, 45

social networks used to connect, 97

surveillance by, 2

terms of service agreements, 9, 17–18, 67, 73, 89, 115–16

on their own influence, 109. See also social media; technology firms; and specific platforms

social movements, 5–6, 10, 34–35, 36–37. See also Arab Spring

social networks (family, friends): and the Arab Spring, 34

bots and, 59 (see also bots and botnets)

distribution of misinformation across, 13–14, 48–49, 53, 57, 84, 99–100, 156

and political communication on social media, 5, 14–15, 22

and selective exposure, 100–101, 104–5

trust in, 1

social unrest, phases of, 36–37

sociotechnical systems, x–xiii, 19–22, 155, 174 (defined). See also lie machines

Spain, 47

Spam (Brunton), 52

spamming, 91

Sputnik (Russian news agency), 31

Stasi (East German secret police), xv

Stimson, Rebecca, 130

surveillance, 2, 32, 61, 63–64, 172

swing states, 56, 68, 72–73, 100

Syria, 36, 49, 61–64, 77, 78

Syria Civil Defence (White Helmets), 64

Syrian Electronic Army, 78

Taiwan, 77

technology. See information technologies

technology firms: accountability and regulation, 138, 141, 143, 162, 166, 167

data ownership, use, and management, 117, 154, 158–68 (see also data)

response to junk news and fake accounts, 66, 73, 79, 138

selective exposure and elective affinity exploited, 104–5, 154, 161. See also social media platforms; and specific firms

Telecom Project, 63

Telegram, 95, 150

terms of service agreements, 9, 17–18, 67, 73, 89, 115–16, 117

Terms of Service; Didn’t Read (website), 115–16

Thailand, 2, 78

Tibet, 77

TikTok, 25

Tinder, 17, 54, 57, 95

tithes, 164–65, 172

transparency, 163, 166–67

trolls and troll armies, 29–53

AI as replacement for, 147–48 (see also artificial intelligence)

combined with bots, 80

criminal indictments and diplomatic complaints against, 48

defined and explained, 30, 174

Imitacja Consulting (Poland), xvii, 87–91 (see also Imitacja Consulting)

importance of understanding, 51–52

in Myanmar, 108

professionalization, 33, 43

rise and growth, 18–19, 24, 33, 42–44, 52

role/function, 22, 23, 33–34

Russian troll armies, 18–19, 24–25, 30–31, 38–51, 147–48 (see also Internet Research Agency; Russia). See also fake social media accounts; Internet Research Agency

Trump, Donald J., ix–x, 13, 49, 50, 84, 98. See also 2016 US presidential election; Trump campaign; Trump supporters

Trump campaign (2016), 64–66, 69–73, 74, 133. See also 2016 US presidential election

Trump supporters, 12, 84. See also conservatives; far-right groups; Republican Party

trust, public. See public trust

trust levels, on social media platforms, 95

truth machines, blueprints for, 163–66

Tunisia, 5–6, 34–35, 36. See also Arab Spring

Turkey, xv–xvi, 2, 11, 78, 122, 139

Twitter: and 2016 presidential election, 69–73

bots (automated accounts) on, 17, 25, 38, 42, 55, 57, 60, 61–63, 69–73, 77, 78–79, 91, 92–94 (see also bots and botnets)

direct communication with voters via, 3–4

as distribution system for political lies, 3, 22, 66, 108

enforcement efforts by, 66, 79

foreign misinformation campaigns on (generally), 152

junk/professional news ratio on, 12

models, data not shared, 111–12

models and data not shared, 60

as news source, 100

Russian activity on, 11–12, 33, 42, 45, 47, 48–49, 51, 66

Syrian “eggs,” 61–63

tweet statistics, 4

USAID’s “Cuban Twitter” program, 78. See also social media; social media platforms

Ukraine, 29–30, 40, 42, 83–84. See also Nemtsov, Boris

United Kingdom, 75, 77–78, 118, 122, 133. See also Brexit campaign

United States: AI-generated propaganda in, 146 (see also artificial intelligence)

authoritarian social media tactics imported, 80

and Brazilian protests, 98

Columbian Chemicals Plant hoax, xvi

internet and Facebook use in, 55

junk news distribution and quantity, 99–100 (see also junk news)

minorities, 49, 51, 140, 147–48

Russian efforts to influence politics, 11–12, 13, 24, 43–51, 56, 65–66, 100, 147–48 (see also Internet Research Agency; Russia; and specific elections by year)

swing states, 56, 68, 72–73, 100

vulnerability to junk news, 138. See also specific elections, companies, minority groups, and individuals

US Agency for International Development (USAID), 78

Usenet, xv–xvi

US House Permanent Select Committee on Intelligence, 46

US Senate Select Committee on Intelligence, 45, 48, 50. See also political polarization: Russian troll armies and

Vaidhyanathan, Siva, 155

Venezuela, 13, 33, 75, 77, 139

veterans, 13, 43–44, 100, 148

Veterans for Britain, 131

videos, faked, 77, 150

Vietnam, 1–2, 75

virtual private networks (VPNs), 89

Vote Leave campaign, 85, 118–28, 130, 132, 134, 181–82(n30), 182(n31). See also Brexit campaign

voters: AI and voter participation, 145

data generation, 156, 157 (see also behavioral data; data)

decision making, in final days of campaigns, 118, 122, 132

Facebook and turnout, 133

redlining, xi, 6–7, 111–12, 131, 132, 144–45, 174

selective exposure and voter loyalty, 101–3 (see also selective exposure)

voter fraud allegations, 84, 98

voter suppression, 49, 148–49, 164. See also conservatives; elections; liberals; political polarization; swing states; and specific countries and elections

WeChat, 150

WhatsApp, 3, 4, 90, 92–95, 97, 150

White Helmets (Syria Civil Defence), 64

Wikipedia, xvi

Yemen, 5–6, 34–35. See also Arab Spring

young people, 103

YouTube, 12, 17, 45, 48, 49, 66, 92