What’s on the menu? When it comes to everyone’s social media diet, there’s one constant you can count on: misinformation.
It’s built into the system. Sociologist Zeynep Tufekci explained it succinctly in her TED talk:1
So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.
Well, you might be thinking, this is politics, but it’s not. This isn’t about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It’s like you’re never hardcore enough for YouTube.
So what’s going on? Now, YouTube’s algorithm is proprietary, but here’s what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they’re more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads. Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads.
They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too. Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn’t even expensive. The ProPublica reporter spent about $30 to target this category.2
As Internet companies like YouTube and Facebook have struggled with the deluge of far-right extremism, racial bigotry, and conspiracy theories that have filled their platforms, it’s becoming increasingly clear that there’s one very simple and yet insurmountable reason they haven’t been able to get it under control: their revenue streams are built around attracting such content.3
YouTube executives, as a Bloomberg News piece exposed in early 2019, have remained lackadaisical about the problem over the years as it accumulated. “Scores of people inside YouTube and Google, its owner, raised concerns about the mass of false, incendiary and toxic content that the world’s largest video site surfaced and spread,” Mark Bergen reported. “Each time they got the same basic response: Don’t rock the boat.”4
Getting these platforms to clamp down on speech that helps fuel racial violence—notably including conspiracy-theorist content that scapegoats targeted minorities—is made more difficult both by the traffic-boosting incentives that allow such content to continue and by the ease with which targeted offenders can evade enforcement and keep posting.
The conspiratorial mindset is threaded throughout the social fabric of YouTube. It’s part of the warp and weft of its production economy. “YouTube offers infinite opportunities to create a closed ecosystem, an opaque algorithm, and the chance for a very small number of people to make a very large amount of money,” observes Alexis Madrigal, deputy editor of the Atlantic. “While these conditions of production—which incentivize content creation at a very low cost to YouTube—exist on other modern social platforms, YouTube’s particular constellation of them is special. It’s why conspiracy videos get purchase on the site, and why they will be very hard to uproot.”5
No one is more emblematic of that problem than conspiracy-meister Alex Jones of Infowars, who was officially banned from YouTube and Facebook in August 2018. Jones had long had a horrific track record with his videos, but it was the lawsuit filed by parents of Sandy Hook victims, who had been hounded by Infowars followers, that made clear the potential liability faced by every platform that hosted his work.6
Jones has not disappeared easily, however. His Infowars content has been reposted by a number of mirror sites that were eventually removed—one of them just after the attacks in Christchurch. In spite of this, Media Matters notes: “Channels that violate YouTube’s rules by exclusively sharing Infowars content are easily found on YouTube, but the video platform doesn’t appear to be devoting many resources to enforcing its own rules.”7
Indeed, in 2017 YouTube very nearly instituted a remuneration system for its video creators that would have made Jones the site’s highest-paid contributor, changing course only after its platform was linked to various acts of violence.
The top priority at YouTube is “engagement”: getting people to come to the site and stay there, measured in data as views, time spent viewing, and interactions. Moderating extremist content is often devalued when it interferes with that main goal. The key metric for gauging engagement is what the algorithm’s designers call “watch time”—that is, the amount of time you spend consuming media on the site.
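YouTube’s actual ranking system is proprietary, so no outsider can describe its real mechanics. Purely as an illustration of what it means to optimize recommendations for watch time rather than for accuracy or quality, here is a minimal, hypothetical sketch; every name and number in it is invented for the example.

```python
# Toy illustration only: YouTube's real ranking system is proprietary and far
# more complex. This sketch just shows, mechanically, what it means to rank
# recommendations by a predicted "watch time" signal. All names and numbers
# are invented for the example.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    predicted_watch_minutes: float   # model's guess at how long this user would watch
    click_probability: float         # model's guess that the user clicks at all


def expected_watch_time(video: Video) -> float:
    """Expected minutes of viewing = P(click) * predicted watch duration."""
    return video.click_probability * video.predicted_watch_minutes


def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    """Return the k candidates that maximize expected watch time.

    Note what is *not* in this objective: truthfulness, harm, or extremity.
    Anything that keeps the user watching longer floats to the top.
    """
    return sorted(candidates, key=expected_watch_time, reverse=True)[:k]


if __name__ == "__main__":
    candidates = [
        Video("Calm explainer", predicted_watch_minutes=4.0, click_probability=0.20),
        Video("Outrage-bait take", predicted_watch_minutes=11.0, click_probability=0.35),
        Video("Conspiracy deep dive", predicted_watch_minutes=25.0, click_probability=0.15),
    ]
    for v in recommend(candidates):
        print(f"{v.title}: {expected_watch_time(v):.1f} expected minutes")
```

The point of the sketch is only that an objective defined solely in minutes watched is indifferent to what, exactly, is doing the watching-for.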
Becca Lewis, a researcher with the technology research nonprofit Data & Society, warns that this fixation on watch time can be either banal or dangerous. “In terms of YouTube’s business model and attempts to keep users engaged on their content, it makes sense what we’re seeing the algorithms do,” Lewis said. “That algorithmic behavior is great if you’re looking for makeup artists and you watch one person’s content and want a bunch of other people’s advice on how to do your eye shadow. But it becomes a lot more problematic when you’re talking about political and extremist content.”8
The company announced early in 2019 that it intended to crack down on the conspiracism. Part of its problem, however, is that YouTube itself created a huge market for these crackpot and often harmful theories, unleashing an unprecedented boom in conspiracism; that same market is now where it makes its living.
The formula for success that emerged over time at YouTube is simple: “Outrage equals attention.” Brittan Heller, a fellow at Harvard University’s Carr Center, observed that it’s also ripe for exploitation by political extremists and hucksters. “They don’t know how the algorithm works,” she said. “But they do know that the more outrageous the content is, the more views.”9
And the more views, the more money these platforms—not just YouTube, but Google and Facebook and Twitter and Instagram—roll in. Hate and division become the fuel for profit in this system.
Information scientist Safiya Umoja Noble observes that “the neoliberal political and economic environment has profited tremendously from misinformation and mischaracterization of communities, with a range of consequences for the most disenfranchised and marginalized among us.” She has particularly zeroed in on the way that search engines have bolstered white nationalist and similarly extremist ideas through their “algorithms of oppression”: “Search results, in the context of commercial advertising companies, lay the groundwork . . . for implicit bias: bias that is buttressed by advertising profits.”10
Algorithms are not only designed to reinforce racial stereotypes and narratives; they actually help encourage people to be radicalized by extremist political ideologies—particularly white nationalism.
________
It may be the most notorious Google search in history: “black on white crime.” That was the search that sent Dylann Roof down the path that led him to murder nine black parishioners inside a Charleston, South Carolina, church one evening in June 2015.
He described this path in a post on the white supremacist website he had created.
The event that truly awakened me was the Trayvon Martin case. I kept hearing and seeing his name, and eventually I decided to look him up. I read the Wikipedia article and right away I was unable to understand what the big deal was. It was obvious that Zimmerman was in the right. But more importantly this prompted me to type in the words “black on White crime” into Google, and I have never been the same since that day. The first website I came to was the Council of Conservative Citizens. There were pages upon pages of these brutal black on White murders. I was in disbelief. At this moment I realized that something was very wrong. How could the news be blowing up the Trayvon Martin case while hundreds of these black on White murders got ignored?
From this point I researched deeper and found out what was happening in Europe. I saw that the same things were happening in England and France, and in all the other Western European countries. Again I found myself in disbelief. As an American we are taught to accept living in the melting pot, and black and other minorities have just as much right to be here as we do, since we are all immigrants. But Europe is the homeland of White people, and in many ways the situation is even worse there. From here I found out about the Jewish problem and other issues facing our race, and I can say today that I am completely racially aware.11
Roof has in fact provided a kind of map of radicalization online. As information scientist Michael Caulfield explains, the whole process is driven by a kind of self-contained data spiral based to a large extent on “curation”—that is, the way we collect web materials into our own spaces and annotate them.12 That curation creates a data feedback loop for the algorithm, which then directly affects what you see. Curations, he warns, can warp reality because of the resulting feedback loop: they “don’t protect us from opposing views, but often bring us to more radical views.”
Caulfield observes that “black on white crime” is a data void—that is, it’s not a term used by social scientists or reputable news organizations, “which is why the white nationalist site Council of Conservative Citizens came up in those results. That site has since gone away, but what it was was a running catalog of cases where black men had murdered (usually) white women. In other words, it’s yet another curation, even more radical and toxic than the one that got you there. And then the process begins again.”
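Caulfield’s spiral can be pictured as a simple feedback loop: each click becomes data, the data nudges the next round of suggestions, and a small, repeated pull toward “more engaging” material compounds over time. The toy simulation below is not a description of any real platform’s recommender; it is a sketch, with invented parameters, of how such a loop can drift.

```python
# Toy model only: neither Google's nor YouTube's systems work this way in any
# literal sense. The sketch illustrates the feedback-loop idea Caulfield
# describes: clicks feed the profile, the profile feeds the next suggestion,
# and a small bias toward more extreme (more "engaging") items compounds.
# All parameters are invented for illustration.
import random

random.seed(1)


def recommend(profile: float, pull: float = 0.15) -> float:
    """Suggest an item near the user's current profile, nudged slightly
    toward higher 'extremity' (0 = mainstream, 1 = extreme)."""
    return min(1.0, max(0.0, profile + pull + random.gauss(0, 0.05)))


def update_profile(profile: float, clicked_item: float, rate: float = 0.5) -> float:
    """The click is logged and folded back into the profile (the feedback)."""
    return profile + rate * (clicked_item - profile)


profile = 0.1  # starts near mainstream content
for step in range(10):
    item = recommend(profile)
    profile = update_profile(profile, item)
    print(f"step {step}: recommended extremity {item:.2f}, profile now {profile:.2f}")
```

Run it and the profile climbs step by step toward the extreme end of the scale, even though no single recommendation looks like a dramatic jump from the one before it. That gradualness is the point.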
Noble explains that the framing a person brings to his or her Internet experience shapes what kinds of results they see on a search engine, or a video recommendation, or a social media news feed. “In the case of Dylann Roof’s alleged Google searches,” she writes, “his very framing of the problems of race relations in the U.S. through an inquiry such as ‘black on white crime’ reveals how search results belie any ability to intercede in the framing of a question itself. In this case, answers from conservative organizations and cloaked websites that present news from a right-wing, anti-Black, and anti-Jewish perspective are nothing more than propaganda to foment racial hatred.”13
The key to this process of radicalization is its incremental nature: people undergoing it don’t recognize what is happening to them, since each step feels normal at first. That is precisely the design of the organizations and ideologues trying to recruit people into their conspiracy theories—theories that are ultimately vehicles for belief systems and political movements.
A onetime “red-pilled” conspiracy theorist named Matt described to Kelly Weill of the Daily Beast how he became trapped in just such a curated spiral. It began when he innocently watched a video of Bill Maher and Ben Affleck discussing Islam; when it ended, the algorithm recommended several far more extreme videos attacking Islam, including some produced by Infowars conspiracy theorist Paul Joseph Watson. One video led to the next and the next.14
“Delve into [Watson’s] channel and start finding his anti-immigration stuff which often in turn leads people to become more sympathetic to ethno-nationalist politics,” Matt said. “This sort of indirectly sent me down a path to moving way more to the right politically as it led me to discover other people with similar far-right views.”
Now twenty, Matt has since exited the ideology and built an anonymous Internet presence where he argues with his ex-brethren on the right.
“I think YouTube certainly played a role in my shift to the right because through the recommendations I got,” he said, “it led me to discover other content that was very much right of center, and this only got progressively worse over time, leading me to discover more sinister content.”
“The thing to remember about this algorithmic–human grooming hybrid is that the gradualness of it—the step-by-step nature of it—is a feature for the groomers, not a bug,” says Caulfield. “I imagine if the first page Roof had encountered on the CCC page had sported a Nazi flag and a big banner saying ‘Kill All Jews,’ he’d have hit the back button, and maybe the world might be different. (Maybe.) But the curation/search spiral brings you to that point step by step. In the center of the spiral you probably still have enough good sense to not read stuff by Nazis, at least knowingly. By the time you get to the edges, not so much.”15
Peter Neumann, of the International Centre for the Study of Radicalisation in the United Kingdom, identifies six steps on the ladder of extremist belief. “The first two of these processes deal with the consequences of being exposed to extremist content,” he writes. “No single item of extremist propaganda is guaranteed to transform people into terrorists. Rather, in most cases, online radicalization results from individuals being immersed in extremist content for extended periods of time, the amplified effects of graphic images and video, and the resulting emotional desensitization.”16
Beheading videos, photos of corpses, suicides, mass murders: all of these are part of the first two steps in the immersion process. Neumann calls this mortality salience—material intended to create an overpowering sense of one’s own vulnerability to death, as well as to heighten the viewer’s moral outrage.
The next two steps are also key to the process—namely, immersion in extremist forums, where deviant and extremist views are normalized, and online disinhibition, wherein people lose their normal inhibitions about violence because of their relative anonymity online. “Some of the participants get so worked up that they declare themselves ready to be terrorists,” notes psychologist Marc Sageman. “Since this process takes place at home, often in the parental home, it facilitates the emergence of homegrown radicalization, worldwide.”17
The final stages involve online role-playing—the kind in which new recruits more or less practice their ideology in gaming situations, often in the context of modern video games. The participants project themselves into their gaming avatars, giving themselves traits they usually do not possess in real life. After a while, this divide becomes noticeable and drives further radicalization: “[A]fter recognizing the gap between their avatar’s mobilization and their own physical mobilization, many online participants begin taking steps to reconcile the gap,” observe researchers Jarret Brachman and Alix Levine.18 This is when they take the last step: using the Internet to connect directly to terrorist infrastructures that then begin to mobilize them.
Caulfield believes one of the keys to preventing this kind of radicalization lies in establishing “digital literacy” programs wherein young people new to the Internet can learn how to confront, cope with, and overcome the challenges they will be forced to navigate there. And it all begins with the curation process, how we accumulate the materials for our personal spaces.
“So, the idea here is that you might start in a relatively benign space with some kind of ideological meaning, and then someone uses this term ‘black on white crime,’” Caulfield says. “It’s probably a stretch to call the Google search results a curation, but you can think of it along the same lines. You put in a term, and Google is going to show you the most relevant, not necessarily the best, but the most relevant results for that term. And now you have a set of things that are in front of you. Now, on each of those pages, because you picked ‘black on white crime,’ if you click into that page that has ‘black on white crime,’ there are going to be other phrases on there.”19
Even people with normal levels of skepticism can find themselves drawn inside. “So you go, and you do the Google search, and you’re like, ‘You know what? I can’t trust this page. I’m going to be a good info-literacy person, and what I’m going to do is, I’m going to just check that these crimes really happened.’ OK, so what do you do? You pull these crimes and what you find is that these crimes did happen, and the pages they’re going to are more white supremacists talking about how these are actually black on white hate crimes. And now they’re mentioning more things, and they’re mentioning more terms, and they’re mentioning changes in law that now make it easier for black people to kill white people.
“So you’re like ‘Oh, well, I’ve got to Google this change in the law.’ But who’s talking about this thing that’s broadly made up, or it’s a heavy misinterpretation of something? Well, again it’s white. . . . So you keep going deeper and deeper, and every time you’re pulling out something to investigate on that page, it’s pulling you into another site, and that other site of course is covering a bunch of other events and terms and so forth. And you end up going deeper and deeper into it.”20
Caulfield argues that educators need to help their students develop better informational literacy, including learning how to recognize when they are being recruited into a radical belief system or cult, or being manipulated for financial or political ends.
“My contention is that the students are practicing info-literacy as they’ve learned it,” he says. “And as a matter of fact, this approach to researching online is what they have learned from a fairly early age in terms of how to approach sources and information on the web.
“I think a lot of academics and teachers would say, ‘but that’s not what we’re teaching,’” he adds. “Let’s just put that whole argument aside because it doesn’t matter. Whatever we’re teaching, these are the lessons they take away from it. So they are practicing info-literacy as learned, and through either chance or through engineering or through fate, whatever it is, these techniques plug really well into radicalization.”21
________
Sometimes the people who fall down the rabbit holes and are recruited into communities organized around conspiracy theories would have ended up in a similar situation regardless. But people are also being actively recruited for a combination of political, ideological, and financial/economic motivations. And they are being actively deceived.
“We are all targets of disinformation, meant to erode our trust in democracy and divide us,” warns University of Washington information scientist Kate Starbird.22
She came to this stark conclusion while conducting a study at the University of Washington of how the discussion of the Black Lives Matter movement evolved on social media. There she walked into an unexpected realization, supported both by data and by a raft of real-world evidence: the whole discussion was being manipulated, and not for the better. The more the team examined the evidence, the clearer it became that this manipulation was intended to fuel internal social strife among the American public.23
The study quickly morphed into a scientific examination of disinformation—that is, information that’s intended to confuse and distort, whether accurate or not—which exists on all sides of the political spectrum.24 One of their key studies focused on Twitter, examining how bad information spreads in the immediate aftermath of major crisis events such as mass shootings, how the resulting rumors “muddy the waters” around the event even for people who were physically present, and, in particular, how such rumors can permanently alter the public’s perception of the event itself and its causes.
Consider exhibit A: the nearly instantaneous claims by Alex Jones and other conspiracy theorists that the Las Vegas mass shooting of October 1, 2017, was a false flag event. The ensuing swirl of confusion permanently obscured the public’s understanding that the man who perpetrated it was unhinged and at least partially motivated by far-right conspiracy theories about guns. Police investigators, too, avoided the evidence that this had been the case.
Whether we perceive stories, real or not, as “true” depends in large part on our unconscious cognitive biases, Starbird says—that is, on whether our preexisting beliefs are confirmed along the way. We’ve seen how these biases can be targeted by technology companies. Well-equipped political organizations can manipulate disinformation in much the same way.
“If it makes you feel outraged against the other side, probably someone is manipulating you,” she warns.25
The main wellspring of the disinformation Starbird dealt with in her study was Russia and its “troll farms,” which introduced industrial-strength data pollution into the American discourse via social media during the 2016 election campaign and afterward. However, she says that disinformation campaigns can be, and often are, run by anyone sophisticated enough to understand their essential principles. These actors include white nationalists, a number of campaigns built around vaccine and other health-related conspiracy theories, and, in recent years, QAnon.
The strategy, she says, is not just consistent, but frighteningly sophisticated and nuanced. “One of these goals is to ‘sow division,’ to put pressure on the fault lines in our society,” she explained in her findings. “A divided society that turns against itself, that cannot come together and find common ground, is one that is easily manipulated. . . . Russian agents did not create political division in the United States, but they were working to encourage it.”26
These outside entities make full use of a preexisting media ecosystem of “news” outlets that claim to be “fair” and “independent” but are in fact propaganda organizations, nearly all of them right-wing. As Starbird explained in one of her studies:
This alternative media ecosystem has challenged the traditional authority of journalists, both directly and indirectly. . . . Its development has been accompanied by a decreased reliance on and an increased distrust of mainstream media, with the latter partially motivated by a perception of widespread ethical violations and corruption within mainstream media. . . . Indeed, many view these alternative news sites as more authentic and truthful than mainstream media, and these effects are compounding—as research has found that exposure to online media correlates with distrust of mainstream media.27
False information renders democratic discourse, which relies on factual accuracy, impossible, and as Starbird notes, “with the loss of commonly-held standards regarding information mediation and the absence of easily decipherable credibility cues, this ecosystem has become vulnerable to the spread of misinformation and propaganda.”28
Because it is a fairly closed, self-contained, and narrow ecosystem, it becomes a genuine echo chamber, with stories repeated among the various “independent” news sites even when they never appear on the major networks (Fox being the most common exception). After a while, the repetition acts as a kind of confirmation: if people keep seeing different versions of the same headlines, they start to think the information has been confirmed by a “variety” of sources.
“The tactics of disinformation can be used by anyone,” Starbird says. “The Internet seems to facilitate access towards targets of different kinds, and we are definitely seeing disinformation from multiple sets of actors, including from the U.S., including foreign actors and domestic actors as well. There’s a certain flavor of Russian disinformation that is perhaps different from some others, but the tactics are known and they are easily learned and portable.”29
Unwinding the relationship between authoritarian governments like Russia—which has been promoting far-right political movements around the world, particularly in Europe—and white nationalists trying to “red-pill” vulnerable young people is complicated. “There are some movements, particularly these far-right movements, whose disinformation aligns neatly with Russian disinformation as well,” Starbird observes.
“It’s a chicken-and-egg problem. The current manifestation of the far right or alt-right or whatever we want to call it, the information systems and some of the mechanisms for information flow all seem to have Russian disinformation integrated into them. It’s hard to know what’s cause and what’s effect, but they seem to be intertwined. In a similar way, we can see far left ecosystems around things like Syria and Venezuela are integrated with Russian disinformation as well. . . . We don’t know how causal that is versus opportunistic.”30
________
Among the most radical—and, simultaneously, the most active—movements organizing and recruiting online, particularly on social media, are the white nationalists and misogynists who make up the alt-right. Their homes are websites like the overtly neo-Nazi Daily Stormer and a handful of similar places, whence they spread out to a multitude of other forums to find potentially vulnerable recruits, most of them young.
The radical right itself has little compunction about identifying its target demographic for red-pilling. Andrew Anglin, publisher and founder of the Stormer, wrote: “My site is mainly designed to target children.”31 At the annual white nationalist American Renaissance conference in Tennessee in April 2018, longtime supremacists bragged about their demographic support: “American Renaissance attendees are now younger and more evenly divided among the sexes than in the past,” one speaker noted before gushing over the white nationalist college campus group Identity Evropa.32
When authorities, both in the United States and abroad, have talked about online radicalization in the recent past, most of us have tended to think of it in terms of radical Islamists from groups such as Islamic State, who have been known to leverage the technology to their advantage, particularly social media. A study by terrorism expert J. M. Berger published in 2016 found that white nationalists were far outstripping their Islamist counterparts, however: “On Twitter, ISIS’s preferred social platform, American white nationalist movements have seen their followers grow by more than 600 percent since 2012. Today, they outperform ISIS in nearly every social metric, from follower counts to tweets per day.”33
“Online radicalization seems to be speeding up, with young men, particularly white men, diving into extremist ideologies quicker and quicker,” Berger said, adding that “the result seems to be more violence, as these examples indicate. It is a serious problem and we don’t seem to have any real solutions for it. These cases also show that an era of violence brought on by the internet is indeed upon us, with no end in sight.”
The radicalization process itself often begins with seemingly benign activity, such as spending hours in chat rooms or playing computer games, and these activities provide a kind of cover for the process as it accelerates. Eventually, two things happen: The fresh recruits become eager to take their ideology to the streets, to make it manifest in real life and not merely online. At the same time, the ideology itself becomes much more radical, often swiftly, ending in an embrace of explicit fascism and white supremacy.
That is how Patriot Front came into being: best known for its members’ plastering of overtly racist and xenophobic fliers on campuses and in various locales around the nation, it nonetheless recruits and organizes almost entirely online.34
The origins of Patriot Front lie in neo-Nazi organizing that began in 2015 at the message board IronMarch.org, itself an outgrowth of the community of dedicated fascists who commented at online forums such as 4chan and Stormfront, and allegedly founded by Russian nationalist Alexander Slavros. IronMarch in turn spun off the activist group Atomwaffen (German for “nuclear weapons”) Division, whose members engaged in various far-right actions in both 2018 and 2019.35 Atomwaffen activists favored plastering flyers advertising their organization, and their reach included the University of Washington campus in Seattle.36
Although Atomwaffen Division was explicit in its embrace of German-style Nazism, other fascists at IronMarch began discussing ways to broaden their reach in order to compete with alt-right and identitarian groups such as Identity Evropa for young recruits. Out of these discussions they created a new group in 2015, first named Reaction America, then renamed in 2016 as American Vanguard. When one of that group’s leaders was exposed for offering up information to an antifascist group and IronMarch users and administrators began “doxxing” American Vanguard members, the group broke away from IronMarch. In early 2017, the organization once again rebranded as Vanguard America. After an Atomwaffen Division member in Florida shot and killed two other members in May 2017, telling authorities the group was planning to blow up a nuclear plant, a number of Atomwaffen participants joined ranks with Vanguard America.37
The leader of Vanguard America (VA), a Marine Corps veteran from New Mexico named Dillon Irizarry (better known by his nom de guerre, Dillon Hopper), began organizing rallies at which members openly carried firearms. On its website, VA claimed that America had been built on the foundation of white Europeans and demanded that the nation recapture the glory of the Aryan nation, free of the influence of international Jewry.38
Vanguard America had a significant presence in Charlottesville, Virginia, for the August 11–12, 2017, “Unite the Right” rally, as several of its members joined in the Friday-night torch-bearing march onto the University of Virginia campus. The next day, a phalanx of VA marchers chanting “Blood and soil!” advanced toward the protest at a city park and was recorded forming a shield wall meant to protect the park and its monument to Robert E. Lee, the feared imminent removal of which was, symbolically at least, the nominal focus of the protest.
Among those VA marchers was James Alex Fields, the twenty-year-old Ohio man who, later that afternoon, drove his Dodge Challenger into a crowd of counterprotesters and maimed twenty people, killing one, thirty-two-year-old Heather Heyer. VA later issued a statement claiming that Fields was not actually a member of the organization.39
Another marcher that Saturday in Charlottesville—indeed, photographed only two marchers away from Fields—was Thomas Rousseau, who not only was a VA member but had taken a prominent leadership role in the group online. Based in Texas, Rousseau noted in chats that VA’s statement “never said that [Fields] did anything wrong.” Soon he and other participants were recommending yet another name change.40
On August 30, Rousseau announced, in a major split with Irizarry/Hopper, that “we are rebranding and reorganizing as a new entity,” henceforth to be known as “Patriots Front” (the “s” was dropped in short order). “The new name was carefully chosen, as it serves several purposes. It can help inspire sympathy among those more inclined to fence-sitting, and can easily be used to justify our worldview.”
The mention of “fence-sitting” was a reference to the ongoing discussion within the online neo-Nazi community about engaging and recruiting young men sympathetic to the underlying cause but not yet fully radicalized. There had been similar discussions about drawing in “patriots” from the far-right militia movement, who have traditionally insisted on drawing the line at participating in outright white supremacist activity.
Rousseau also made it clear that the plan was to translate online discussion into real-world actions, concrete activism: “You will be expected to work, and work hard to meet the bar rising,” he wrote. “Inactivity will get you expelled, unwillingness to work and contribute in any capacity will as well.”41
As Patriot Front’s organizing has played out in real life, that “work” has primarily comprised making their presence felt at rallies and protests, spreading the word with freeway banners, and plastering flyers in public locations, where they are often summarily removed.
That ethos has shaped how Patriot Front’s organizing has generally played out on the ground. The group first made its presence felt in Houston in September 2017, when about a dozen members appeared outside a book fair and demanded a fight with antifascist organizers who reportedly were inside giving a talk. (Rousseau later led a similar protest outside an Austin bookstore.)
Other members around the country began taking their activism public by hanging banners promoting the Patriot Front website on freeway overpasses, frequently for just short periods, taking them down before authorities could arrive to remove them. In Seattle’s Fremont neighborhood, a group of masked neo-Nazis briefly unfurled a swastika-laden banner advertising IronMarch.org; two months later, in suburban Bellevue, a similar group put up a banner advertising bloodandsoil.org on an Interstate 90 overpass, where it was shortly removed by Department of Transportation workers. In October, someone hung a Patriot Front “Resurrection through Insurrection” banner over a freeway near Los Angeles, California. And in November, Patriot Front activists put up a banner in San Antonio, Texas, on the University of Texas–San Antonio campus.
The most widespread manifestation of Patriot Front’s organizing efforts, however, has been the appearance of its flyers in public spaces around the country. Its stark black-and-white posters feature a variety of slogans, including “We Have a Right to Exist,” “Fascism: The Next Step for America,” and “Will Your Speech Be Hate Speech?” as well as screeds urging “patriots” to “reconquer your birthright” and exhorting “all white Americans” to “report any and all illegal aliens.” They have been taped or glued to lampposts, telephone poles, windows, doors, bulletin boards, and anywhere else they can be seen by the public. College campuses have been an especial target.
Patriot Front is notable, however, for its utterly undisguised and unrepentant fascism. It is also largely devoid of the often juvenile transgressive humor, pop-culture references, and irony that are core to much of the alt-right’s online appeal. Instead, its dead-serious advocacy of white supremacist ideology is intended to appeal to a more militant mindset, an important byproduct of its origins in the IronMarch.org community. As the manifesto on its website explains in depth:
An African may have lived, worked, and even been classed as a citizen in America for centuries, yet he is not American. He is, as he likely prefers to be labelled, an African in America. The same rule applies to others who are not of the founding stock of our people, or do not share the common unconscious that permeates throughout our greater civilization, and the European diaspora. The American identity was something uniquely forged in the struggle that our ancestors waged to survive in this new continent. America is truly unique in this pan-European identity which forms the roots of our nationhood. To be an American is to realize this identity and take up the national struggle upon one’s shoulders. Not simply by birth is one granted this title, but by the degree to which he works and fulfills the potential of his birth. No man is complete simply to live, but to do more than that, to strive to create a path onward for his people, and to connect with the heritage he is undeniably a part of. That is what completes a man. Only then is he truly deserving of the title and a place among his people.
To date, Patriot Front appears mainly to be composed of small clusters of dedicated neo-Nazis intent on spreading their fascist gospel to other right-wing extremists, especially “fence-sitting” alt-righters potentially attracted to violent street action. It is noteworthy mainly for the ease and rapidity with which it has spread to nearly every corner of the country, and for its open appeals to the young white males who are the focus of its recruitment.42
Atomwaffen Division, however, became another matter altogether. It came to national prominence after a series of arrests in the summer of 2017, when one of its members murdered his two roommates, who also belonged to the overtly neo-Nazi training organization; a third roommate, who wasn’t present when the killings occurred, was shortly caught with a carful of homemade explosives, arrested, and convicted. A few months later, another Atomwaffen member murdered a gay Jewish student from California.
A ProPublica exposé of Atomwaffen Division published in February 2018 revealed that its members had been undergoing arms training in the woods, while plotting to target key members of the public for assassination and ultimately a “race war.” They also discussed domestic terrorism, including poisoning city water supplies, bombing natural gas lines, and destroying electric infrastructure.
“We’re only going to inspire more ‘copycat crimes’ in the name of AWD. All we have to do is spread our image and our propaganda,” one of its leaders exhorted his cohorts.43
________
Getting red-pilled actually means a lot of different things to the people who claim it, though it generally refers to embracing any of a number of conspiracy theories and absorbing the conspiracist worldview—which can often morph into something even more radical very quickly.
A large collection of group chats among explicitly fascist ideologues and organizers was analyzed by the open-source journalism site Bellingcat, which examined the process by which recruits became increasingly radicalized and absorbed into the belief system.44
It found that most agree that the key is acknowledgment of the “Jewish question,” or JQ: that is, acceptance of the claim that Jewish people are at the center of a vast global conspiracy, the end goal of which is usually “white genocide.” The participants in the chats described red-pilling as a gradual process, but the end point seemed to be almost uniformly alt-right white nationalism.
“Individual people can be red-pilled on certain issues and not others,” the report noted. “Stefan Molyneux, a popular author and far-right YouTube personality, is seen as being red-pilled on race and ‘the future of the west’ even though he is not considered as a fascist. Prominent YouTuber PewDiePie is also often considered red-pilled. It is accepted that media personalities need to hide their outright fascist beliefs, or ‘power level,’ in order to have a chance at redpilling the general population.”45
Recruitment techniques, in fact, tend to dominate the discussions, and disagreements often erupt over which are most effective, though everyone concurs that people who harbor an animus toward “social justice warriors” (known more often by their acronym, SJWs) and “political correctness” are prime targets. They also agree that Donald Trump has been the source of red-pilling for many Americans.
Males made up the vast majority of these fascist activists—some, in fact, doubted that women can be red-pilled at all. When women did appear on the scene, they made their mark by being even more extreme than the typical conspiracy theorist.
Most of them, though not all, were radicalized online: the report found that thirty-nine of the seventy-five fascists whose chats it studied credited the Internet as their red-pilling source, with YouTube the website most frequently referenced. However, the report notes that “when indoctrination begins offline new converts inevitably go online to deepen their beliefs.”46
A user named barD described his red-pilling process:
Get redpilled on Feminism after reading some crazy SJW posts about MLP [My Little Pony] being racist and sexist and anti-lesbian, get redpilled on islam after getting intruiged by some islamisists taking in a youtube comments section. Get redpilled on GG [Gamergate] from sargon.47
The spiral that barD described continued ratcheting up, ranging from comment-section disputes to consuming videos from far-right YouTube personalities to participating in the comments at “the_donald” subreddit and 4chan’s infamous white nationalist–dominated /pol/ board, eventually concluding at fascist Discord servers.
Radicalized recruits are fond of claiming that they actually never used to be racist at all but that an argument with an “SJW” online made them so angry they turned to white nationalist ideology. They insist that racist remarks they make are meant only “ironically,” rather like the OK sign. Bellingcat found a user named FucknOathMate who, when asked if he was “only doing it ironically at first,” replied, “Well sort of”—then added that, before he was red-pilled, he knew Jewish people were “weird” and “ran everything,” but he hadn’t yet become a Holocaust denier or a fascist as he was now.48
Conspiracy theories, particularly those peddled by Alex Jones, Paul Joseph Watson, and their multimedia Infowars operation, also played a key role in the “red-pilling” process for many of the people Bellingcat identified as dedicated fascists.
“Conspiracy theories appear to be one of the more well-trodden roads into fascist nationalism,” it reported. A key example was provided by a Discord user writing under the nom de plume Harleen Kekzel, who claimed to have identified at the age of sixteen as a “polyamorous genderqueer masculine leaning pansexual” and said that Alex Jones started her on the journey to becoming “red-pilled”—or rather, that she “was conspiracy pilled” along with her husband.49
However, for all their usefulness, Jones and Infowars are actually viewed with considerable skepticism by many serious fascists, who dismiss them as “controlled opposition.” Jones, who denounced David Duke after having him on his program, is generally viewed as too compromised and too milquetoast for serious National Socialists—as are other right-wing pundits with a conspiracist bent, such as Michael Savage and David Horowitz, both of whom are Jewish.
The most striking and powerful pathway to radicalization for these young fascists, however, was YouTube.
“Fascists who become red-pilled through YouTube often start with comparatively less extreme right-wing personalities, like Ben Shapiro or Milo Yiannopolous,” Bellingcat reported. “One user explained that he was a ‘moderate republican’ before ‘Steven Crowder, Paul Joseph Watson, Milo Yiannopolous, Black Pidgeon Speaks,’ and other far-right YouTubers slowly red-pilled him. Over time he ‘moved further and further right until [he] could no longer stand them. That’s why [he likes] those groups even still, because if we just had the Fascists, we’d never convert anyone.’”50
The serious fascists, however, view the alt-right as having something of an image problem, particularly in how it appropriates mainstream cartoon and humor imagery, like Pepe the Frog, who is widely recognized as the alt-right’s chief mascot. “Fascist activists view the alt-right as silly, but also as a crucial recruiting ground,” noted Bellingcat.51
And it doesn’t get much sillier—or stranger and ultimately disturbingly toxic—than the Church of Kek.
________
You may have seen the name bandied about on social media, especially in political circles where alt-right activists and avid Donald Trump supporters lurk. Usually it is brandished as a kind of epithet, seemingly to ward off the effects of liberal arguments, and it often is conveyed in memes that use the image of the alt-right mascot, Pepe the Frog: “Kek!”52
Kek, in the alt-right’s telling, is the “deity” of the semi-ironic “religion” the white nationalist movement has created for itself online—partly for amusement, as a way to troll liberals and self-righteous conservatives both, and partly to make a kind of political point. He is a god of chaos and darkness, with the head of a frog, and the source of the alt-right’s memetic “magic,” to whom white nationalists and Donald Trump alike owe their success, according to their own explanations.
In many ways, Kek is the apotheosis of the bizarre alternative reality of the alt-right: at once absurdly juvenile, transgressive, and racist, Kek also reflects a deeper, pseudo-intellectual purpose that appeals to young ideologues who fancy themselves deep thinkers. It dwells in that murky area they often occupy, between satire, irony, mockery, and serious ideology; Kek can be both a big joke on liberals and a reflection of the alt-right’s own self-image as serious agents of chaos in modern society.
Most of all, Kek has become a kind of tribal marker of the alt-right: with its meaning obscure and unavailable to “normies,” referencing Kek is most often a way of signaling to fellow conversants online that the writer embraces the principles of chaos and destruction that are central to alt-right thinking. Many of them like to think of it as a harmless 4chan meme—though in the end, there really is nothing harmless about it.
The name, its usage, and ultimately the ideas around it originated in gaming culture, particularly on chat boards devoted to the online computer game World of Warcraft, according to Know Your Meme.53 In that game, participants can chat only with members of their own faction in the “war” (either Alliance or Horde fighters), while opposing players’ chats are rendered in a cryptic form based on Korean; thus, the common chat phrase “LOL” (laugh out loud) was read by opposing players as “KEK.” The phrase caught on as a variation on “LOL” in game chat rooms, as well as at open forums dedicated to gaming, animation, and popular culture, places such as 4chan and Reddit, which are also dens of the alt-right and where the Pepe the Frog meme (originally an apolitical cartoon frog created by a liberal named Matt Furie) has its origins and was similarly hijacked as a symbol of white nationalism.
At some point, someone at 4chan happened to seize on a coincidence: there was, in fact, an Egyptian god named Kek. An androgynous god who could take either male or female form, Kek originally was depicted in female form as possessing the head of a frog or a cat and, in male form, that of a serpent, though during the Greco-Roman period the male form was depicted as a frog-headed man.
More importantly, Kek was portrayed as a bringer of chaos and darkness, which happened to fit perfectly with the alt-right’s self-image of being primarily devoted to destroying the existing world order.
In the fertile imaginations at play on 4chan’s image boards and other alt-right gathering spaces, this coincidence took on a life of its own, leading to wide-ranging speculation that Pepe—who, by then, had not only become closely associated with the alt-right, but also with the candidacy of Donald Trump—was actually the living embodiment of Kek. And so the Cult of Kek was born.
Constructed to reflect alt-right politics, the “religion” in short order acquired from its online acolytes a whole panoply of artifacts of the satirical church, including a detailed theology, discussions about creating “meme magick,” books and audiotapes, and even a common prayer:
Our Kek who art in memetics
Hallowed by thy memes
Thy Trumpdom come
Thy will be done
In real life as it is on /pol/
Give us this day our daily dubs
And forgive us of our baiting
As we forgive those who bait against us
And lead us not into cuckoldry
But deliver us from shills
For thine is the memetic kingdom, and the shitposting, and the winning, for ever and ever.
Praise KEK54
Kek “adherents” created a cultural mythology around the idea, describing an ancient kingdom called “Kekistan” that was eventually overwhelmed by “Normistan” and “Cuckistan.” They created not only a logo representing Kek—four Ks surrounding an E—but promptly deployed it in a green and black banner, which they call the “national flag of Kekistan.”
The banner’s design, in fact, perfectly mimics a German Nazi war flag, with the Kek logo replacing the swastika and the green replacing the infamous German red. Alt-righters are particularly fond of the way that the banner trolls liberals who recognize its origins. It was all a goof. But it also wasn’t.
Alt-right marchers have appeared carrying Kekistan banners at public events planned to create violent scenes with leftist, antifascist counterprotesters. Others have worn patches adorned with the Kek logo.
Besides its entertainment value, the “religion” is mainly useful to the alt-right as a trolling device for making fun of liberals and “political correctness.” A 2017 alt-right rally in support of adviser Stephen Bannon in front of the White House, posted on YouTube by alt-right maven Cassandra Fairbanks, featured a Kekistan banner and a man announcing to the crowd a “free Kekistan” campaign.
One of the leaders of the group offered a satirical speech: “The Kekistani people are here; they stand with the oppressed minorities, the oppressed people of Kekistan. They will be heard; they will be set free. Reparations for Kekistan now! Reparations for Kekistan right now!”55
“We have lived under normie oppression for too long!” chimed in a cohort.
“The oppression will end!” declared the speaker.
The main point of the whole exercise was to mock “political correctness,” an alt-right shibboleth, and it was deeply reflective of the ironic, often deadpan style of online trolling in general and alt-right “troll storms” especially. Certainly, if any “normies” were to make the mistake of taking the “religion” seriously and suggesting that its “deity” was something they actually worshipped, they would receive the usual mocking treatment reserved for anyone foolish enough to take their words at face value.
Yet at the same time, lurking behind all the clownery is an idea that alt-righters actually seem to take seriously: namely, that by spreading their often cryptic memes far and wide on social media and every other corner of the Internet, they are infecting the popular discourse with their ideas. For the alt-right, those core ideas all revolve around white males, the patriarchy, nationalism, and race, especially the underlying belief that white males and masculinity are under siege—from feminists, from liberals, from racial, ethnic, and sexual/gender minorities.
In such alt-right haunts as Andrew Anglin’s neo-Nazi website the Daily Stormer, references to the Kek “religion” have become commonplace; besides electing Trump, Kek as the “god of chaos” has been credited at the site with killing more than thirty people in a fire at an Oakland artists’ collective. A very early Stormer disquisition on Kek by “Atlantic Centurion,” published in August 2015, explores the many dimensions of the Kek phenomenon in extensive theological detail, connecting the belief system to Buddhism and other religions.
It is the Kek the Bodhisattva who can teach our people these truths, if we are willing to listen and to commit ourselves to the generation of meme magick through karmic morality and through the mantra of memes. By refusing to cuck and by rejecting the foul mindsets of our invaders and terrorizers, we will move the nation away from its suffering under the pains of hostile occupation, and closer and closer to its final rebirth. If instead, our people cuck and adopt the foul mindsets, they will generate not Aryan karma but further mosaic samsara.
The trve power of skillful memes is to meme the karmic nation into reality, the process of meme magick. By spreading and repeating the meme mantra, it is possible to generate the karma needed for the rebirth of the nation.56
Anglin himself frequently references Kek, making clear that he too subscribes to the underlying meme-spreading strategy that the “religion” represents. Describing a black artist’s piece showing a crucified frog—which appeared to Anglin to be a kind of blasphemy of the Kek deity—he declared that “there’s some cosmic-tier stuff going on out there.” Another post, published in March, was headlined: “Meme Magic: White House Boy Summoned Spirit of Kek to Protect His Prophet Donald Trump.”
Anglin devoted the post to explaining a teenager’s use of an alt-right hand signal while meeting Trump, concluding that “the only possibility here is that this is an example of Carl Jung’s synchronicity—seemingly acausal factors culminating to create an event based on its meaning. But it is not really acausal—it merely appears that way to the nonbeliever. It is our spiritual energies, channeled through the internet, that caused this event to manifest,” he wrote. “It is meme magic.”57
Kekistan, according to Bellingcat, is a frequent topic of discussion on the fascist Discord servers whose chats it examined. “Opinions vary from calling it a ‘forced meme’ to expressing serious devotion to the idea. Kekistan flags and other regalia are often seen at Patriot Prayer rallies and other far-right protests. Some fascists lament that many people who fly the flag don’t understand the Nazi origins of its design. But many know exactly what they are signaling when they put one on a flagpole, or their helmet.”58
Whether or not they really believe any of this, the thrust of the entire enterprise is to mock everything “politically correct” so loudly, obtusely, and divertingly that the legitimate issues about the vicious core of white male nationalism they embrace never need to be confronted directly. The alt-right’s “meme war” is ultimately another name for far-right propaganda, polished and rewired for twenty-first-century consumers. The ironic pose that Kek represents, and the accompanying claims that the racism its adherents promote is meant only innocently, to provoke, are in the end a façade fronting a very old and very ugly enterprise: hatemongering of the xenophobic and misogynistic kind.
And the current running through all of this is also very ancient, very human, and very worrisome: authoritarianism.