Chapter 9

A PILL FOR HEALING

When we consider further the social and psychological roots of the collective urge to kill the world, we are likely to see more of ourselves in it and to begin to think of such groups as something of a dark underside or “cultural underground” of our own society. We are also likely to discover that whatever renders our society more decent and more inclusive in its benefits is likely to undermine the totalistic impulse to destroy everything. But since that impulse will not disappear, we had better continue to bring our imaginations to bear on confronting and exploring it, on finding the means to resist its lure and oppose its destructive projects.

Robert Jay Lifton, Destroying the World to Save It1

BuzzFeed reporter Joseph Bernstein befriended Lane Davis well before he stabbed his father to death on Samish Island—“Seattle4Truth” was one of his key sources for an exposé of Milo Yiannopoulos that Bernstein published in 2016, along with several other “researchers” who specialized in Gamergate and Pizzagate conspiracy theories.

He was shocked by the murder and traveled to the remote, generally soggy little western Washington town to compose an insightful piece about its aftermath. Like me—I attended several of Lane’s hearings, which culminated in him copping a guilty plea that put him behind bars for the next seventeen years—Bernstein found Davis’s family unwilling to talk to the press. Considering how much they lost, no one can really blame them.2

He also kept exploring the important subtext of Lane Davis’s story: namely, how conspiracism and extremist ideas can unhinge people to such a remarkable extent that they will kill other people, even their own parents—and, more important, whether there is anything we can do about it, both collectively as a society and individually as ordinary people.

The very difficult truth, as Bernstein found, is that many hundreds of thousands of people absorb and participate in conspiracy theories without ever becoming so unhinged that they act out violently and commit mass murder. Indeed, although an extraordinarily high percentage of mass killers hold conspiracist worldviews, which play a significant role in the disconnectedness and isolation that fuel these actors, in reality such killers constitute only a small percentage of the total number of people who participate in this alternative universe.3

That tells us there is no singular, one-size-fits-all, easy-to-swallow answer, no “blue pill” that could prevent Lane Davis from stabbing his father or Stephen Paddock from opening fire on a country music festival. And there’s no easy way of predicting or mapping exactly what beliefs or ideas eventually will result in tragedy.

“The truth is, as researchers of violent extremism like James and Horgan will tell you, the vast majority of people who use Gab and Stormfront will never commit a violent crime,” he wrote. “That’s not to absolve online communities of the beliefs of their members. It’s not to say digital spaces can’t play a major role in ushering people toward violence. As the lasting influence of Anwar al-Awlaki and Dylann Roof show, they can. It’s not even to say such spaces have a right to exist on private hosting services. They don’t—at least not as far as the First Amendment is concerned. But there isn’t an easy answer when it comes to finding the small number of people who will commit extremist violence.”4

Bernstein’s retrospective explained:

Researchers have long known that the common risk factors for violent extremism have no real predictive power. A February 2018 literature review prepared for the Department of Homeland Security by the research giant RTI International found that “it is unrealistic to expect that the presence of any risk factor—or even the combination of several risk factors—can or will predict ‘with any accuracy’ whether an individual will engage in violent extremism.” Indeed, the risk factors are all backwards-looking. Applying them to the general population makes that obvious.

Take a few of the big ones: a feeling of moral outrage, identification with an in-group perceived to be under threat, and underemployment. That describes a lot of violent extremists. It also describes a lot of journalists. Even if you add more specific risk factors, like mental health issues, having a criminal history, and a recent triggering event like a death or a divorce, well, that could still describe a lot of people who will never even consider violence. It’s impossible to come up with a checklist of risks and pop out a violent extremist.5

It would be a mistake for any program intended to prevent the downward spiral into violence to try basing its prescription on a psychological map, mainly because the human variables involved are so numerous and so varied. “The permutation of risk factors raises an uncomfortable possibility: that the route to violence is so complex and unique that there is no meaningful way to map it,” says Bernstein. “As much as we want to talk about structural forces like socioeconomics and racism in determining who becomes violent and who doesn’t, an impossibly huge number of individual variations appear to play a significant role as well.”6

One of the insidious aspects of conspiracism is that it has a built-in immunity to what might otherwise be the normative means of drawing people out of a belief system—namely, a rational and reasoned application of factual evidence and logic. Anyone who tries to persuade someone who’s been “red-pilled” with such methods quickly finds themselves dismissed as either a willing dupe or an active coconspirator. Like most right-wing authoritarians, the red-pilled engage in compartmentalized thinking that enables them to believe two completely contradictory things at once and to dismiss factual evidence as “tainted” when it is presented.

The narratives that bind people to conspiracy theories are often large-scope mythologies that make them feel heroic as well as part of something bigger. These are normal human drives, but they have been twisted by the disinformation to which believers have been exposed. These mythologies are also highly personalized and often gut-level in nature. This is why there are no immediate, large-scale solutions to the problem, no fix that can be delivered through mass media or other broad appeals.

So there really is no “blue pill,” as it were, no universally applicable remedy that would cure people of their conspiracy-bred delusions. The only way to actually draw people out of the conspiracist universe and back into the sometimes murky light of the real world is on a one-to-one basis, one at a time, very slowly, and with an emphasis on empathy—as well as an understanding that even the best efforts in these areas often fail.

It’s a slow, often painful, and only sometimes rewarding process. But for people determined to not simply cope with this phenomenon in their lives, but to confront and overcome it, there’s not really any other option.

________

What is nearly universal about conspiracy theorists and the “red-pilled” ideologues who are sucked into the darkest corners of authoritarian ideologies by them is that their personal lives are nearly always in shambles. They alienate family members, old friends, new friends, neighbors, business and work associates, customers, you name it—they churn through relationships like raccoons through a trash pile, picking up and discarding with startling ease.7

For family and old friends, this ongoing churn and otherworldly alienation is extremely disorienting and frequently heartbreaking. But as it happens, these are the people with the greatest likelihood of changing the trajectory of a red-pilled person and perhaps drawing them back into the normative world and the warmth of healthy human relationships.

It’s a very difficult proposition, however, and carries many risks. The first is that many such attempts actually drive the red-pilled true believers even deeper into their rabbit holes. You may think you’re doing the right thing by plopping facts in front of them, but doing so is more likely to make things worse. Multiple studies have demonstrated that the most common response to this approach is for conspiracy theorists to see your facts as further proof of the conspiracy—as well as clear evidence of your gullibility, at least, if not your active complicity in the plot.

Radicalization expert Peter Neumann acknowledges that there are innate limitations to this approach. “There is a big debate around to what extent counter-messaging is possible and to what extent that can ever work,” he says. “I think in most cases, when you’re talking about extremists rather than conspiracy theories, in most cases it was a face-to-face interaction that pulled people out, and I don’t think it’s easy to convince people on the Internet that they are wrong about these things.

“And if you do it badly, it may actually have the opposite effect. This is what psychologists call reactance, which is the idea that if you confront someone who is very entrenched in their views and you’re trying to convince them of the opposite, you’re actually provoking resistance, and with the result that that person becomes even more entrenched in their views.”

Second, your relationship with the person will further fray and destabilize along the way. If they are prone to anger—and most conspiracy theorists seem to be—you could well bear the brunt of it and all that entails.

Neumann warns: “Of course, one of the first things that is important to understand is that not everyone can be deradicalized. So, if you have someone that is very deep in those conspiracies, is very convinced about them, and feels very happy and comfortable with that identity, it’s very hard to pull them out. It’s like if you have someone who’s a hardened neo-Nazi and who is moving within that milieu who is perhaps also having, let’s say, a family relationship or relationships within that group of people, it’s very hard to pull him or her out.

“Deradicalization can work, in two situations. The first arises when people are still exploring and before their extremist identities are being settled, so if you get them at an early enough stage while they’re still having a degree of openness about the different questions that they are exploring, you can still engage them. And the other situation where deradicalization can work is at the very end of the process when people start having doubts, start becoming disillusioned, or for other reasons really are looking for a change.

“I used to be a smoker myself. It’s like quitting smoking. . . . [P]eople will tell you that it’s not possible to quit smoking unless you really want it, unless you are convinced that you want to change your lifestyle. And I think the same is true with extremists or people who are very deep in these weird theories. Unless they themselves have questions that you can leverage and unless they themselves are having doubts that you can work with, it’s basically not possible.”8

So the first question any person trying to tackle the problem should ask themselves is, will it be worth it? Be honest with yourself and about the other person, because that will be crucial.

However, if you decide to proceed and try to help someone who has become enmeshed in conspiracy theories and the extremist ideologies that accompany them, I’ve assembled a number of steps for doing so, compiled from the best advice available from experts—psychologists, sociologists, researchers, and activists with real-life experience drawing people out of the rabbit holes and back into the bright light of normative reality.

Again, the key in all of this is the human variables: every person is different, every path down into the rabbit holes is personal, and every force keeping the red-pilled trapped inside their alternative universe varies from case to case. Always keep in mind that success is difficult and rare. Every choice you make depends on the circumstances, personality, and beliefs of the person involved.

Whether this one-at-a-time, interpersonal, and typically slow approach ever gains enough momentum to solve the problem on a large scale is anyone’s guess. “Everyone is looking for the silver bullet, and it doesn’t exist,” says Neumann. “It’s something that requires education. It’s something that will not happen within the next couple of months, but maybe the next couple of decades if people really get behind it, but it’s a really hard, long-term project.”

At the same time, most of us see no other options—we just know we must try.

So I engaged some of the best minds on this subject in extended conversation and asked them: if you could devise a toolkit that ordinary people could use to bring their friends and loved ones out of the dark world of conspiracy theories and back into the sunlight of reality, what would it look like?

After listening to their advice, I compiled an incremental, step-by-step set of conceptual approaches to the problem—tools, as it were—that ordinary people can use. There are fifteen of them.

Fifteen Steps

1. First, an ounce of prevention is worth a pound of cure: the most effective way of overcoming the effects of “red-pilling” is immunizing people beforehand.

Psychologist Stephan Lewandowsky recommends working incrementally with people who have had little exposure to conspiracy theories: “Now we do know that what tends to work is to inoculate people against conspiracy theories by exposing people to the theory ahead of time and saying, well you know, if you ever come across this, be aware of the fact that it’s complete nonsense. And that can be done; a colleague of mine, Karen Douglas, has done that with anti-vaccination conspiracy theories.

“She finds that if you expose people to a small dose, just like a vaccine, of the conspiracy theory up front, then it finds less traction when people are actually exposed to it. So you can educate, sort of protect people against conspiracy theories, but of course the crucial [thing] is you got to get to them first, because if you try to do it after they’ve already been exposed to it then it’s far less effective. So you have to get people educated about the existence of theories ahead of time.

“And it has to be proactive. Now of course, there’s a risk in that as well. Some people might never encounter conspiracy theory and if you had never mentioned it, they would have never gone on the Internet to look it up. So you may have in fact protected some people, but inadvertently tip others into the rabbit hole.

“Nothing is perfect and that is an obvious risk: that if you’re exposing people to something to protect them against it, then that may backfire conceivably. Now the other thing is that what colleagues and I have done repeatedly is to look at the reasoning flaws that are part of conspiracy theories—one of them being the pure chance of it, and the other one being the unrealized implications of some of those claims.”9

Information scientist Michael Caulfield believes that the most effective form of immunization comes from teaching young people better methods of consuming digital information. That’s especially the case, he says, in the new age of conspiracism, when everything connects like an elaborate Tinkertoy set. It adds a level of complexity that most people are unprepared to deal with.

“The newer thing is the absolute connection between all of these conspiracies,” Caulfield says. “The truth is, almost everybody believes in one conspiracy. Almost everybody. This is true going back a long time. You can go back to the ’70s, ’80s, or whatever. You’ll find that people believe in one conspiracy or another.

“I think what is happening increasingly now is that people get into the one conspiracy that everybody is going to get into anyway. They’re going to choose a conspiracy and be into it. But the conspiracy communities are very connected—the viral mechanics across these communities are very connected—and then the bad actors in this space are very aware of how to pull people in from relatively harmless conspiracies, deeper into other conspiracies.

“I saw this horrifying YouTube video that was basically an anti-Semitic, Jews-control-the-world video, but of course none of it said that explicitly. It was an animated thing with a voiceover, and you’ve seen these sorts of things. But a big piece of it was, ‘Oh, society is trying to medicate you, and there’s a big conspiracy to try to keep you medicated, or from seeing the truth,’ and so forth.

“I just thought about how compelling this would be for someone who had been on Ritalin all their life, or something like that. Looking at this saying, ‘Yeah, what’s that all about?’ and having some actual fear and concern that, what if there is a pharmaceutical conspiracy to keep us all medicated?

“Of course the piece of it that was really smart, in this horrific way, about what this person was doing, was as you looked at the imagery, if you were literate in this, you saw, oh what they’re looking up to is actually this Jewish-controlled, banker-funded conspiracy to keep everybody down. But of course that doesn’t pop out to the person first watching the video. What pops out is, ‘Oh, this is validation for what I feel.’ So they’re very smart in how they go into these things.”

Caulfield believes that an understanding of context is essential for effective digital literacy. “I do really believe that if on day one you’re looking at that video, and you realize, ‘Hey, wow, this is a video put out by a group of literal neo-Nazis,’ I think you would back off of it.

“But through this process of grooming, whether it’s algorithmic grooming, which just sort of happens via YouTube’s algorithms or whether it’s grooming by bad actors explicitly, this process relies on you not fully understanding your destination until you’re in it.

“My belief is that the process works because you don’t understand what the destination is of this stuff, before you’re already so deep into it that you’re living in a different epistemology.”10

Most people outside the epistemic bubble, however, don’t see that there’s a destination. All they see is a conversation with someone in a chat room or a forum, a conversation that becomes a mechanism for recruitment across all kinds of digital platforms. Video gamers, for instance, will go into a video chat room, hanging out with other players; pretty soon someone starts bitching about how “social justice warriors” want to ruin their first-person shooter games, how it’s a feminist conspiracy, and so on. Then someone pipes up: “Well, you know why they’re doing that, right? It’s all about cultural Marxism.” And soon the whole chat room is getting filled in on the nefarious work of those Jews in the Frankfurt School, and down the rabbit hole they go.

Caulfield explains that this is heightened by how we teach kids to research information on the Web, exemplified by the bizarre radicalization spiral of biased curation that helped Dylann Roof dive off the deep end. “You’ve got this loaded search-term problem, right? So when you see this cycle, . . . someone says, ‘Oh, it’s all about cultural Marxism,’ and then you go and read the stuff: you plug ‘cultural Marxism’ into the search engine. But because of the curation loop, you’re not going to get a relatively academic treatment of how the Frankfurt School changed the application of Marxist thought to contemporary cultural issues.

“You’re going to get all this stuff, which is going to have more terms, and each thing you read is going to have more terms and more examples. They’ll pull out one example of this that’s particularly advantageous to what they’re arguing, and you google that example. And of course who is talking about that example?

“Is it NBC News? No, because the example usually is this one-off event where some unknown feminist said something to someone. It’s a nothing thing. The only place it’s a something thing is in this misogynistic world where you’re actually dealing with a white supremacist community. So you get pulled deeper and deeper into it, step by step by this.”

This, he says, is the model we teach young people for researching information on the Internet. It was built on older models, developed when published information was more carefully vetted, and it is inadequate for an age in which deliberate disinformation is not merely common but, depending on the platform, pervasive.

“It’s a model of reacting and going deeper,” Caulfield says. “Interestingly, it’s a model that involves a lot of reading. It involves a lot of deep reading of all these texts, and my qualms with the media literacy and info-literacy we give students is that what we teach students to do is to deeply read texts and look at the internal logic of them and to think about the arguments.”

This is a bad thing when it comes to white nationalist propaganda and disinformation. “This is not what you actually want them to do with this stuff,” he explains. “What you want them to do with this stuff is: ‘Please dear God, stop reading until you know where this came from, what its agenda is, and how it fits into broader expert, or reporting, or any broader consensus,’ right?

“So we teach students to dive in and read deeply and think deeply about these things—but this is just really bad advice.”

Caulfield has constructed a model with the acronym SIFT: stop, investigate, find, trace. It emphasizes teaching young would-be researchers to contextualize their information before they absorb it. Once they’re at a website, they should stop and look at what else that site does, how reliable its information appears to be, who runs it, and what, if any, agenda it has. Then they should keep expanding the context.

“The first move in SIFT is stop. Stop reading, just don’t do anything. Stop. Before you go deeper into this stuff, before you read it, before you react, before you get angry, before you start going into search term after search term, and these sorts of things: stop.

“I is investigate the source. F is find better coverage. T is trace claims, quotes, and images to the original context and see if they’re being misportrayed.

“The idea is, if we can get people to, in a sense, read less and react less, we can maybe break that cycle, which gains such a momentum. And you just see it. I saw it with a family member of my own. Not deep into the white supremacist territory, thankfully, but into some other things that were extremist, the Seth Rich sort of stuff.

“It happens too quickly, but it happens because these people are actually being voracious readers and voracious ‘researchers,’ but they’re not actually orienting themselves to the discourse that they’ve landed themselves in.”

2. Immunization, of course, is not always possible. Most of the time conspiracism shows up at our dinner tables unannounced. Expect to be blindsided, and arm yourself with knowledge.

Stephan Lewandowsky was drawn into the world of conspiracism and disinformation when he composed a scientific paper examining theories denying that global climate change is a reality. “What we did in the paper that we wrote about was to explore the implications of that claim,” he says. “What that actually means and what would have had to be true for that claim to be true. When you look at that, then you find that the claim just falls over in a heap because the moment you start thinking about what the implications are, it just becomes ridiculous.”

He found that an early application of scientific rigor was helpful, but it only helped when people promoting denialist climate theories had to face consequences within their scientific peer communities for their own frequent lack of rigor. “One thing that might be helpful is to expose people to the consequences of their own faults and it becomes obvious very often to anyone who is capable of standard cognition that the claim just makes no sense because the implications don’t materialize,” he says.

“Then the other thing we know about conspiracy theories is that they tend to come together in clusters so people tend to believe more than one conspiracy. If you believe one, then chances are you’ll believe more than one. It’s not just an isolated thing.

“People tend to believe in multiple conspiracies and that also identifies them as being a particular type of person, because it’s not just that they have information about Pizzagate that no one else has, but that they actually have a personality or cognitive style that makes them susceptible to believing a lot of weird things. So, first of all, that’s an important thing to tell others who you might protect against conspiracy theories by just noticing that, hang on, people who believe one conspiracy also believe in another. Do you really want to believe in all that stuff?”

That, however, assumes they haven’t already become ensnared.

“Reaching the people who have gone down the rabbit hole is extremely difficult because one of the problems now is that on the Internet they will find support in their views that seems strong,” Lewandowsky says. “This is one of the really pernicious aspects of the Internet that is completely underappreciated in my view: the social signals you get on the Internet are completely miscalibrated. I can believe anything I want no matter how absurd. I’ll find a thousand people on Facebook who share my views, whatever they may be. I can say the Earth is flat or Obama is a lizard person, and somebody will share that view.

“At the moment people have a community, they become entrenched in those views. And that is what makes it more difficult to break than it used to be, because at least thirty years ago, people knew they were cranks because no one else believed anything that they said, but that’s no longer true. Now people have the Internet and they get backup from a thousand others. Which is a trivially small number given how many people are on Facebook, but it’s sufficient to create an illusion of support.

“At the moment you find a community, that’s it. You’re stuck in the rabbit hole and you’re no longer feeling isolated because there are these other people egging you on.”

Lewandowsky cautions that “you’re talking about a certain type of person who falls into that rabbit hole”—namely, authoritarian personalities. “That is triggered by their personal circumstances and their feeling of alienation from society, and there’s some evidence to suggest that,” he says. “For example, the people who believe everything Donald Trump says are of a certain type, right? Not everybody falls for Trump, quite the contrary.

“So the difficulty is that a lot of the fertile ground for this nonsense—for Trump and conspiracy theories and everything else—is political in nature. And it’s the politics of the last twenty or thirty years that have just disempowered a lot of people, or they feel that they have lost out, regardless whether that feeling is real or not.

“All of this is unfolding against a political structure that is extremely difficult to change, and finding the antidote is almost impossible without changing the politics or changing society at large.”

3. Your relationship with the person is what will guide them out of the rabbit hole.

The most powerfully toxic effect of conspiracism is that it isolates people—cuts them off from the rest of society, encourages them to walk away from it, from democratic institutions, from their own political franchise. Friends and coworkers peel away gradually as they grow uncomfortable with the conspiracist’s growing obsession. Family members become alienated as the red-pilled begin insisting that they convert to their belief system, usually claiming they’re doing so for everyone’s protection.

Eventually their only social contact is with other true believers. That becomes their community—and after a while, they can’t even fully trust those people. In a very powerful way, when people dive down the conspiracist rabbit holes, they lose touch with their own humanity.

So any sincere attempt at empathetic and—more importantly—non-judgmental personal contact with a conspiracy theorist stands a reasonable chance of success, because such people are often starved for a sympathetic ear. During these initial stages, however, they’ll remain highly suspicious, typically paranoid, and on the lookout for any sign of disrespect or any judgment that they are crazy or stupid—all things they have heard many times before.

Dishonesty in these situations is always disastrous, but so is too much transparency. The ideal approach is a clinical one: think of yourself as someone conducting an interview in a professional manner, one in which the questions have already been laid out and the nature of your query is constrained by them.

Sociologist Samantha Kutner, who has researched both the organizing principles of the far-right Proud Boys and methods of deradicalization, says she often thinks of Kenneth Burke’s rhetorical concept of identification when approaching these situations—that is, in order for any kind of persuasion to take place, one or both parties involved must “identify” with the other. So when starting out—whether with an established relationship or with a new one—it’s essential to find common ground over which you can bond, at least a little.

“I like seeing how something resonates with an individual,” Kutner says, most particularly, “whether when you see someone else’s plight, whether you empathize with them and want to support them.” She says that, as with any human exchange, it’s essential to play it by ear, roll with the punches, and maintain your own integrity in the relationship. It’s even possible to gently mock them, she says.

“There has to be a gentle kind of pro-social way to respond to these people that’s like, ‘Hey, you’re wrong. But we still care about you,’” Kutner says. “The tone is hard to replicate, but the times that I’ve been able to do it, it’s been successful. I’m sure there are other people that are into the pro-social trolling where, yes, you’re mocking them, but you’re not doing it in a way that’s showing that you dehumanize them. It’s just nudging them in a certain direction or hoping they see something.”

C. V. Vitolo, a professor of debate at the University of Wisconsin–Madison and one of the activists directly engaged in deradicalization work, explains that “recognizing that radicalization is a process and not a state of being and that it is worthwhile to intervene at any point in that process” is fundamental to any kind of success.

“And so just because you are not good at dealing with full-blown 14/88 Nazis doesn’t mean that you don’t have a role to play in stopping radicalization. Don’t sell yourself short or think your interventions are small if you’re intervening at the beginning or the middle of the radicalization process.”

The next fundamental, in Vitolo’s view, is the human connection. “Relationships are the key,” she says. “There are people who, it took me two, sometimes even three years to get through to, who started very strong in one direction and had to work piece by piece from outright to civic nationalists to libertarian, and eventually becoming a little bit more properly left leaning, in my opinion. And that’s something that’s not possible if someone doesn’t have some kind of trust-based relationship with you.

“That process of repetition is so important. Relationships don’t just afford you trust, they also afford you the ability to say things over and over and over to people. And repetition has a big impact. If you only hear someone make an argument once, no matter how good it is, it’s not necessarily going to sit with you. That’s not how our brains work. We don’t take a piece of information and process it through every single thing we know, you know?”

4. Therefore, the key to all of your actions is empathy—not blind, unquestioning empathy, but the mindful kind, attuned to the faults of the other person and maintaining normal moral boundaries, while also being willing to simply listen.

If your goal is to help another person emerge from the conspiracist cocoon through the gravitational pull of your relationship, then large doses of empathy and forbearance will be required on your part. This is always much easier said than done, especially because red-pilled people are so often angry, contentious, suspicious, and generally cantankerous.

Samantha Kutner says that avoiding a really confrontational approach is key. She describes her sessions as “more of a respectful listening. Being willing to listen to them.” She says she studied psychologist Carl Rogers’s “nondirective approaches” to interviewing subjects, which helped her tremendously.

Maintaining this empathetic approach, however, should never mean sacrificing the boundaries of fundamental human behavior—violence and threats are always intolerable, whether directed at you or anyone else. And you should never sacrifice those boundaries for the sake of maintaining the relationship.

As journalist Noah Berlatsky has observed, empathy as a simple principle is a two-edged sword: it can make the world better, but it can also be used to make things worse. “The problems with empathy bias are compounded by the fact that it’s possible to weaponize empathy to create support for ugly political programs, and even for violence,” he writes. “Trump leverages our empathy bias by presenting himself as the spokesperson for good, normal, white people against untrustworthy, dangerous, racial others. When he talks about immigrants, for example, Trump constantly aligns himself with those he claims are victims of immigrants. He focuses on the people who will supposedly suffer when immigrants take jobs or commit crimes.”11

Paul Bloom, a psychology professor at Yale University, argues for a different model of empathetic behavior he calls “rational compassion,” which mitigates and redirects the claims of empathy. Trump’s calls for empathy for victims of crimes committed by undocumented immigrants, he suggests, should elicit a search for data. “You don’t have to be a cold-blooded utilitarian,” Bloom says, “but people should appreciate that facts matter. If Trump says or implies that an extraordinary number of illegal immigrants are murderers and rapists, before you start putting yourself in people’s shoes and feeling their pain, you should ask, is it true?”

C. V. Vitolo says she has grappled with how different people’s grasps of the world can really be. She emphasizes “just the importance of holding another person’s epistemology and of understanding a lot of this as epistemological problems, where it’s not just that they don’t understand the data or they don’t understand what the science says.

“Sometimes it’s not a question of, ‘Could you, in theory, separate people into races biologically?’ Sure, you can separate people into any number of things biologically if you wanted to. You could do it twenty different ways. You can do whatever you want. But it’s essential to understand: why sort people this way in the first place, right? They often invert the scientific process. What is driving the mode of scientific inquiry that you’re participating in?”

5. After you listen, ask the right questions.

All of this underscores, perhaps, the difficulty of attempting to deradicalize someone when the attempt is made by an ordinary person without training in these techniques, and it reminds us that making the attempt probably means learning new interpersonal skills, and that even then success is not guaranteed. Maintaining a calm professionalism in the face of a raging conspiracy theorist is not for the faint of heart.

Kutner recommends using Carl Rogers’s listening strategies to defuse tense situations.12 These involve two key steps: first, seeking to understand the other person’s ideas, then “reflecting back to them what it is you think that they’re saying.”

“There’s a lot of, ‘This is what I think you’re saying. Is that accurate?’” she explains. “And then going back and forth and having them identify.”

As the conversation proceeds, she says, it’s essential to respond to what they’re saying with questions of your own, and to help them elucidate rationally what they otherwise simply have a strong feeling about. Having an actual back-and-forth exchange, not a purely clinical one, works best.

Vitolo says it’s essential that those who undertake this process comprehend that they’re confronting an “evil that is worth dedicating a ton of time and resources to pushing back against,” but also that they “do not believe that the people attached to those ideologies are also just not worth our time or are irredeemable in any way. Persuasion hinges on your ability to be likable.

“And it’s not really likability, because usually it’s not anything that would pass as civility in normie culture. But it is about adopting the kind of social cues of that audience and saying things and making arguments in ways that you know are reaching out to them on really human levels and holding their perspectives and taking them seriously, taking their pain seriously, taking their own causes of liberation, even if they may seem trivial in comparison, seriously. It’s hard to find people who have that balance of empathy and also are resilient enough to not fall for it.”

6. Don’t fall down the rabbit hole yourself while attempting to help someone else out of it.

One of the first things that those who are attempting to draw someone out of the red-pill rabbit hole have to do is immunize themselves against the attractions of conspiracism. So at every step in your process of listening to and discussing a red-pilled person’s beliefs, it’s essential to keep in mind the distinction made in chapter 1 between conspiracies and conspiracy theories—namely, the primary limitations of real conspiracies (duration, number of actors, scope, and breadth) and the known attributes of conspiracy theories (the long periods of time over which they occur, their large numbers of participants, and the global reach of their plots).

Stephan Lewandowsky says he’s frequently criticized by people asking how he can dismiss the theories when in fact there have been real conspiracies. His answer: “If you look at true conspiracies that we now acknowledge have happened—like the Volkswagen diesel scandal [in which the carmaker was caught falsifying emissions data on its cars], which is classic conspiracy, right? That was uncovered by very conventional means. That wasn’t cranks who invented this and then found it confirmed. No, these were professionals that noticed some weird stuff and they investigated that. And the same is true for Iran Contra, true for Watergate, COINTELPRO.”

As he observes, there’s never been a real conspiracy uncovered by a conspiracy theorist.

“The important thing . . . is [that] the cognition of the people who uncovered these true conspiracies is usually totally straightforward,” he adds. “I mean, they’re not cranks—they are investigative journalists or they’re whistleblowers or they’re academics or journalists. They’re completely mainstream people whose job it is to go after the evidence. And in contrast to that, the people who believe that NASA faked the moon landing, if you look at how they considered the evidence, you can just identify all these cognitive flaws. So true conspiracies exist, yes—but if you think like a crank, you’ll never find a true conspiracy.”

A shorthand version of this, he acknowledges, is that many of the conspiracy theories he sees can be readily discarded because the people pitching them have extensive records of promulgating misinformation, falsehoods, and nonsense: “Often, I’m very skeptical simply because the people involved make no sense.”

Those are the most essential and simple guides. It’s also wise never to become cavalier about conspiracy theories or to treat them as mere mental games, diversions, or entertainments.

“In many ways there is nothing harmless about conspiracy theories,” observes Lewandowsky. “I think it used to be the case that people cavalierly just dismiss them as being a fringe phenomenon, and they were some weird people out there who believe that Elvis was still alive in North Korea. People were amused by it and would go, ‘Aha, yeah, isn’t that funny?’

“And perhaps that was appropriate at the time before the Internet came along and gave them a mushrooming platform and before they were recognized. I think what we’re experiencing now is that conspiracy theory has been increasingly conducive to political extremism, and of course in a sense that is nothing new, because the history of anti-Semitism is basically just one big weaponized conspiracy theory. So in that sense, there is nothing new under the sun; it is simply that we now have broader dissemination of this. It’s much, much easier to spread this stuff, and I don’t think they’re harmless.”

Kate Starbird warns that the newer conspiratorial appeals are increasingly sophisticated and capable of ensnaring even well-educated people. “I can see the rhetoric of critical thinking is used constantly within the conspiracy theory ecosystems that we study,” she says. “They’ve manipulated the tools of critical thinking, and they use it to take apart reality, and in ways that are not healthy for society.

“So how do we give people the tools not to take everything apart, but to learn how to build back some sense of what we can trust, and not to be skeptical of everything, but to figure out what we should assign credibility to? In terms of the next generation of digital literacy, or just information literacy, I think that’s going to be really important for us, to figure out how to trust things.”

Lewandowsky observes that for conspiracists, “the targets are almost arbitrary. It’s just a matter of finding somebody you can hate. Doesn’t matter if they’re gay or Jewish or Muslim or whatever, it’s just fulfilling the same function, which is to give people somebody to hate, because they’re being disenfranchised or disgruntled with their own lives and then somebody comes along and tells them that it’s all the fault of the Jews or Muslims or whatever and then, aha! All of a sudden they no longer have to take responsibility for their own actions or they can blame somebody for whatever misery they are experiencing and then some people will just go boom.”

7. As you ask questions, search mainly for the conspiracist’s deeper motives.

When you’re asking all these questions, you’re doing more than just getting them to explain themselves in a way that perhaps you can understand, even if it seems absurd and unlikely. What you’re really after is their underlying reasons for believing these conspiracy theories to be true. Because those reasons are the key to changing the course of their trajectory.

Although throwing facts at them in rebuttal is not merely pointless but counterproductive, getting them to explain their thinking coherently and honestly has its own effect.

“It’s virtually impossible to convince them, certainly with facts, but it becomes possible when they start having doubts about their own theory or when they have been disillusioned for other reasons,” says Lewandowsky.

“And that’s the important thing, that it’s not . . . always about the conspiracy theory. No, the conspiracy theory is the symptom, in many cases, of other problems that these people are having in their own life.”

A lot of times the search leads from one buried motive to another, deeper and farther down the path that brought them to this point in the first place. This is especially the case when it comes to complicated hatreds like misogyny, which usually is entangled with the person’s emotional development and health or lack thereof. But even with simpler, more visceral and less explicable hatreds such as racial or ethnic bigotry, a more complex and often deeply personal experience or trauma may form its real core.

“Someone said to me—again, someone who has a lot of practical experience in working with neo-Nazis—she said she’s never met a happy racist,” says Lewandowsky. “All these people who turn toward these ideologies, toward these hateful ideologies, are often people who’ve made a lot of wrong choices and they’re not very happy with their lives, and the embracing of that ideology is the consequence, it’s not the cause of their problems. So, it’s also about trying to identify what are the problems or underlying things that may have led them to embrace that.”

C. V. Vitolo believes it’s essential to “make it intelligible to them that there’s another world out there where you get the things that you want. The ethnostate—it’s not an end in and of itself. It’s a means to your feelings of security, your feelings of safety, your feelings of, honestly, futurity. . . . I can certainly tell you that there’s a process involved that you can be an active participant in which your needs will be taken seriously. And I think making sure they know that the uncertainty you feel is the uncertainty everyone feels. And you don’t have to control it all to know that you’ll be given a fair shake.”

8. Recognize that these underlying motivations and needs have other ways of being met—so find them.

The very personal needs and motives that fuel most conspiracists’ attraction to these theories are not easily generalized: sometimes they arise from individual traumas, sometimes from idiosyncratic upbringings or a person’s own wiring, and they are often tied to the nuances of the widely varying material realities of their daily lives.

However, there are threads that run through conspiracism and the authoritarian personalities drawn to it that often surface with deeper questioning. A common theme is the heroic ideal: many if not most red-pilled conspiracists see themselves as part of a heroic effort to save whatever is the focus of their motives: their families, their communities, their region, their nation, their entire race (“my people” is how they usually express the latter), sometimes all of the above.

This heroic self-conception is common among right-wing extremists in part because they often have personal stories that are not so heroic, and announcing their heroism is a means of overcoming that past. It also becomes a useful rationalization for a wide range of behaviors, from believing palpable nonsense to committing hate crimes.

Indeed, it’s something they share with hate crime perpetrators. A number of psychological studies of such criminals have found that nearly every one of them believed they were committing a “message” crime on behalf of their community and in its defense, believing that the mere presence of the target minority posed a threat of some kind to it. Hate criminals and domestic terrorists both cast themselves in a heroic light. It’s central to their self-rationalization for their acts.

Lewandowsky recalls the case of Timothy McVeigh’s accomplice in the 1995 Oklahoma City bombing, Terry Nichols. “He used to be a guy in his mid-thirties who had not accomplished anything: he had not been able to maintain a relationship, he had broken off his education, he wasn’t able to maintain any jobs. By any standards, he was a bit of a loser, and he realized it. He was very frustrated and becoming depressed about his situation,” he recounts.

“And then at some point, he gets in touch with members in Michigan of the Aryan Nations, and they basically changed him. He starts hanging out with them, they embrace him, but they also give him a new philosophy, which says to him, ‘None of this is actually your fault, it is the government’s fault.’ And suddenly he feels very empowered and he basically said, ‘Wow, I’ve actually been a victim. None of this has been my fault. It’s been the government’s fault. There’s this vast conspiracy operating against people like me. And now I’m no longer a loser; in fact, I’m becoming a freedom fighter, I’m embracing this cause.’ And adopting this ideology was incredibly powerful for someone like him to make sense of his personal troubles, but also to find something that he could embrace and work for, for the first time in his life.”

“Look at the various reasons that people are engaged in conspiracy ideation and conspiratorial cultures,” Michael Caulfield explains. “One of the big ones, of course, is it’s really easy to earn intellectual respect in these communities, right? I mean, compared to everywhere else, where it’s actually quite expensive to earn intellectual respect.

“If you go into the flat Earth conspiracy, you can be Stephen Hawking in five months, the Stephen Hawking of that community. So getting intellectual respect in those communities is actually quite cheap. You suddenly don’t have to deal with the fact that ‘Oh, I have these opinions, but there are people smarter than me that have other opinions, so maybe my opinions aren’t the center of the discourse universe.’”

Caulfield was struck by a tale of deradicalization in the Washingtonian describing a mother’s struggle when her thirteen-year-old son joined the alt-right. Angry and resentful over a false accusation of sexual harassment, the boy had gravitated into online circles (particularly on Reddit) that discussed the plot against young white men and, eventually, “cultural Marxism.” He tried attending a Proud Boys rally.

Eventually, meeting some of his online alt-right heroes turned out to be disillusioning, and he peeled away from the scene. Some weeks later, he told his mother what was going through his head.

“I’d always had my doubts,” he said. “I knew liking them was wrong. But I wanted to like them because everyone else hated them.”

She asked him why he had liked them.

“I liked them because they were adults and they thought I was an adult. I was one of them,” he answered. “I was participating in a conversation. They took me seriously. No one ever took me seriously—not you, not my teachers, no one. If I expressed an opinion, you thought I was just a dumbass kid trying to find my voice. I already had my voice.”13

Caulfield recognized this scenario: “It’s almost traditional grooming in the sense that the big thing they had with the son is that the community, he felt, took him seriously. And he felt that his other communities, including his parents, didn’t. And even though that’s about a thirteen-year-old, I think it’s also very similar to why a lot of adults gravitate to these communities.

“It’s not all just racism and threats and so forth. A lot of it is they have a desire for some kind of stature, and they’re not going to make that stature by saying something that everyone else is saying. And they’re not going to get that stature in a community that consists of millions of people versus a smaller community where they can be a quicker superstar.”

The motivating dynamics of conspiracy theorists vary widely, but these core motivations tend to arise out of commonly held beliefs and views that have been distorted and misshapen by the false information and sociopathic worldview innate to conspiracism. There are, of course, many other ways to feel heroic and appreciated than to join extremist organizations, and getting people to change course may require being imaginative about finding other ways of making them feel valued.

The narratives that are the grist of these buried mythologies are usually gut level in nature: we buy into the myth of heroism on a visceral level. But those who do so find themselves caught in an endless dynamic in which generating, identifying, defeating, and then eliminating the enemy—which is the essence of the heroic narrative—becomes their sole preoccupation. As we have seen, this can have toxic and tragic consequences.

9. Avoid challenging the person’s core underlying beliefs that motivate his or her embrace of conspiracy—at least initially.

A number of studies of the deradicalization process have found that any attempt to pull a person out of a rabbit hole immediately goes sideways if the interlocutor challenges any of the cherished core beliefs of the conspiracist’s worldview.

Sometimes these beliefs reside in the range of normative society—such as, say, the belief that abortion is murder—and become malignant only when a range of toxic conspiracy theories is adopted to support them. At other times, however, these beliefs originate wholly within the realm of conspiracists’ alternative universe, such as the belief that Hillary Clinton presides over a global child slavery and pedophilia ring.

And throughout most of the process of persuading people to climb back out of their rabbit holes, they clutch these core belief “jewels” tightly and jealously. Facts that challenge those beliefs, in particular, are not helpful at this point, because for true believers, reality itself becomes the precipice.

“It’s who you are,” says Kutner. “This really fragile sense of self you’ve constructed in the movement, it falls apart when it’s challenged like that. And then you’re back to feeling lost and alone and, ‘My girlfriend left me,’ and ‘Who the fuck am I?’ Most people don’t want to deal with that.”

When she encounters these moments, Kutner says she sees it as an opportunity to explore those core motives: “For me, I think it’s more, ‘What are your thoughts on this?’”

Often she can advance the conversation by tossing out published material from other sources: “Just articles that have a different perspective,” she says. “‘I was reading this and I haven’t had a chance to talk to many people who’ve had your experience. What do you think about this?’ And kind of playing up to their ego, like they’re qualified to speak on the topic. Especially if they’re guys, because ego plays a central role in so much of this.”

This world is dominated—though not solely populated—by men. And so many of the deradicalization efforts have a strong pheromonal scent.

“They’re very aggressively performing masculinity,” Kutner says. “I think some of them have severe depression that comes with the package.

“With that is that lack of agency,” she adds, referencing one of the key traits of people drawn to conspiracism. “I talked to a Proud Boy, and I said, ‘Is it kind of like putting on a new suit and seeing if it fits? And if you wear it long enough, you may not feel as depressed as you were?’

“We agreed that was a big thing—that masking depression through becoming overly violent or aggressively posturing was a really big deal. I think that men have a hard time approaching what some healthy level of masculinity looks like because no one really knows, so they go to everything that’s been hardwired into society. They take it to the most extreme route because it’s like, ‘If I feel like a weakling, then if I’m an alpha male maybe that will compensate for my deficiencies.’”

10. Keep it real and make it genuinely about helping your friend or loved one. If the person you are attempting to rescue suspects ulterior motives, you will be sunk.

Samantha Kutner says she works hard to make her interchanges with radicalized Proud Boys “empowering” but in a healthy direction.

“It would come from a source of empowerment with them,” she says, adding that groups like the Proud Boys recruit with the same idea in mind: “I think they’re initially sold on this pitch: ‘You get a chance to be blah, blah in this organization.’”

She offers them a different, similarly empowering narrative: “I think that selling them on, ‘You’re an expert in yourself,’ and encouraging self-reflexivity is key,” she says.

It’s essential, Michael Caulfield says, to gradually move the conversations into friendlier territory without playing into the process that led them to conspiracist beliefs in the first place. Making your relationship with them distinct from that part of their lives is also helpful—and when drawing them back to the real world, begin asking their advice on matters and questions that lie wholly outside that realm.

“I think in the process of that, when it’s spotted, and when people both understand how to not play into the whole process, I think we can have some effect,” Caulfield says. “And the way that we have that effect is just a simple conceptual question I would have them ask: ‘I’m not sure this is what you think it is, but here’s something that you might be interested in.’”

Caulfield says he avoids asking: “Oh, is this trustworthy or not?” “I actually don’t think that that’s the thing, because when you get into these things, it’s trustworthy because it supports your thing.”

He says it’s good to challenge them on a rational level on non-core issues, and best to challenge them on core issues at the first signs of deepening radicalization: “When someone starts to drift this way, they’ll post something, maybe their initial post about QAnon,” he says, “or just a little further along,” he will attempt to steer them away, “saying, ‘Okay, my source is better than your source, and your source sucks.’

“I think you could say, ‘Hey, I’m not sure if you realize this, but that’s actually a known white supremacist site’—but then really quickly validate their concern, find a piece of their concern that you can validate, and say: ‘However, your post did get me interested in some of these issues, and I found this article from The Atlantic that makes some related points.’

“And that gives that person an out, and they are more likely to withdraw and say, ‘Okay, some of my feelings and tension about this are not completely invalidated, but yeah, it looks like I’ve messed up here. And I didn’t mean to tweet out a white supremacist source.’”

Still, it’s important in the process to leave room for their beliefs, even as you’re getting them to think and behave more rationally.

“I do think that the universes there become just so encased,” he says, “the way that it is sort of a bubble—not a bubble like it protects us from things, but like in science fiction: a bubble universe, a little universe that is internally consistent but exists outside of Earth Prime. They live in a bubble universe, everything is internally consistent in there, and you can’t pull out the Seth Rich conspiracy without it unraveling into a million different issues.”

The sheer mass of this universe can be daunting for anyone dealing with it. “That’s just harder because their belief system has become so plugged into these things and so densely interconnected,” Caulfield says. “There’s such a density to conspiratorial belief. Even more than just belief. You would think the world is the most densely linked thing, but conspiratorial belief is just everything links to everything.

“It’s like a game of Jenga at that point. What can you actually pull out that’s not going to threaten this person’s central identity at this point? That’s the much more difficult thing.”

11. Take the time to establish real-world activities with the other person that have nothing to do with politics or conspiracies. Do things, don’t just talk.

The experience of being on the Internet itself, and its limitations as a form of human interaction, both have a great deal to do with the spread of conspiracism, well beyond merely giving the theories a platform and a means to circulate widely. Much of it—particularly the ugly trolling behavior and ultimately the extremist ideologies and violent radicalism that emerge in the worst corners—reflects the disembodied nature of that experience and the easy dehumanization that comes along with it.

“Disembodiment and dematerialization are generally assumed to be intrinsic consequences of digital media because of the implicit immateriality of digital information and because the user’s interaction with the media is estranged, or alienated, from instinctual corporeal authenticity,” observed one study, which then explained that these were mostly misconceptions.14

However, the otherworldly aspect of being online is part of why it’s easy to dehumanize other people there: in our online exchanges, it’s just bits on a screen—there’s no body nearby, no vocal intonation, no eye contact or expression, no hand gestures, all normal parts of full human communication. It’s easy to troll someone else. There’s even a thrill involved.

This environment also enables conspiracism to run unchecked. The cottage industry that produces most of the world’s conspiracy theories competes constantly to “push the envelope” of outrage and hysteria, which are the meat and potatoes of its existence. This means that even vile racial bigotry comes into play—along with the whole raft of white nationalist and other forms of extremism.

Personal relationships have difficulty swimming in this environment. So it’s a great idea to spend shared time away from it. When undertaking the task of deradicalizing someone, it’s ideal to meet up at least occasionally for in-real-life activities: coffee, lunch, beers, golf, tennis, movies, pizza, whatever. Putting a real face and person to all these bits on the screen alters the dynamic substantially.

If you’re doing this with someone entirely online, this is much more complicated but still possible. It mostly means agreeing to do things away from the computer monitors, out in the real world, and then sharing those things with the other person. You can go see movies, read books, watch TV shows, whatever works.

Samantha Kutner likes to have a “literary component” to her exchanges with far-right extremists. Sometimes this involves reading the same book and discussing it. Sometimes it involves both sides keeping journals and then discussing them afterward.

“I’ve journaled since I was eleven, and I think that that has allowed me to constantly be willing to evaluate my perspective and really know myself in that way,” Kutner says. “I think that encouraging self-reflexivity and having them read, where you’re just exposed to a multitude of different themes, is essential to this.”

Sometimes the reading would simply entail “articles or summaries of articles that talk about hate residuals,” she says. “It’s how your brain is wired in a certain way after being in these organizations. Your neurons repeatedly fire together, wire together. That’s the statement that they say. You may be wanting to leave the group, but you may still be in that mindset where your program or your pattern of thinking has been shaped in this way that it’s going to take a really long time to unlearn these things.”

She focuses on helping them understand “it’s normal to experience what they’re experiencing, and they’re not alone in it, reducing shame and stigma, so that it’s not a process where you have to go it alone, and you can become an expert in yourself. You can heal from your experience and whatever it is that can take you out of it.”

12. As the bond builds, opportunities will arise in which to undermine some of their conspiracism with factuality—though again, this has to be done gradually and sensitively.

Time is the ally of anyone working to draw conspiracists out of their rabbit holes. The more contact they have with someone they previously dismissed as a hapless pawn, a corporate sellout, a potential plotter, or some other stereotype used to wave away doubters, the more they see that this person actually values them and what they think. Eventually, this may lead them to rethink those perceptions, though not always.

Samantha Kutner was an egghead academic when most of her subjects met her. Initially they were hostile and skeptical but, over time, came to see her as a friend. What worked? “I think prolonged contact with them and showing them that I’m not an angry feminist, antifa, SJW individual, and I’m listening to them,” she answers. “I don’t think these men were listened to like that in their lives. I think that was a really powerful thing.

“I wasn’t passive. If they were wrong in something, I’d say, ‘Here’s how I understand it, but here’s how most people see this.’ I was always trying to get them to kind of reconcile the two different things. I think for those two Proud Boys, that was really important because they were confronting all of these discrepancies between what they thought the group was and what the group actually is. They were doing it in a kind of safe environment where they know that I’m not going to dox them, and I’m not out to get them, and they could confide in me. I think that helped a lot with those two members that left.”

What never works, as Michael Caulfield advises, is shaming them—though he believes public shaming of high-profile figures or participants in violence can be very effective in changing people’s behavior. “But for your friends and family, what’s the alternative?” he asks rhetorically. “You have your relationship with them, and it’s the only bargaining chip you have, it’s the only thing that you’ve got. And if you make them make a choice between that and you, I’m just not sure what that does for the relationship.”

A rhetorical strategy that can be effective at this stage is to find narratives that neutralize, and potentially replace, the older narratives around which they have constructed their worldview—particularly the visceral, gut-level narratives that are so appealing to them. Drawing someone enmeshed in QAnon conspiracy theories away from those beliefs, for instance, can be helped along by nudging reminders, delivered nonthreateningly, that the billionaire president really did not have the best interests of ordinary people in mind.

“We have a lot of data showing that if you want people to give up on misinformation, you have to provide them with an alternative that explains the noise,” says Stephan Lewandowsky. “‘Why is it that you shouldn’t believe climate deniers?’ ‘Well, they’re funded by Exxon,’ is what they’re going to say. That is a thing that is clearly doable and works for some people.

“Most of all, people do not want to fall victim to a con man, so if you can convince them that they have been taken advantage of or victimized, then that might turn them against the con man. However, in practice, that doesn’t always work, because people, once they become invested in something, are reluctant to admit that they’ve been fooled. They don’t want to be fooled. So you have to somehow provide that alternative narrative without emphasizing the fact that they’ve been fooled themselves. It should be more like, ‘Look at all these people out there who are exploiting vulnerable people with their nonsense—surely you wouldn’t fall for that type of stuff.’ Or, ‘This guy is victimizing you, I want to help you get out of this hole that he’s pulled you into.’”

13. Over time, your relationship with the other person will become important enough to them that they will become more open to your perspective and more willing to reconsider theirs.

Peter Neumann has seen how difficult it can be to deradicalize someone. “We know this from a lot of European far-right extremists—you have to get them out in a way that, first of all, understands that if you’ve been involved in a particular extremist milieu or in conspiracy—like if you’ve been a 9/11 truther for the past ten years—a lot of your social relationships, both online and offline, will be based on that theory. Maybe you even got married to someone because you met him or her in the context of these ideas. So this is about deconstructing social relationships as much as it is about convincing them that they were wrong.

“And then, of course, . . . if you’ve dedicated ten years of your life towards promoting these ideas, you become very invested, and it’s hard for anyone—nevermind an extremist, it’s hard for anyone—to admit that they basically wasted ten years of their life on something that is completely untrue. So, in providing them with a new framework, you also have to make sure that, as absurd as that sounds, you respect what they’ve done and that they don’t feel like their entire life has been completely wasted.

“So, you have to allow them to embrace something new without necessarily making them feel bad or without basically telling them that they are complete losers who’ve been hooked to a lie, and you have to channel that into something else. That’s why quite a few extremists who deradicalize then become very vigorous advocates of deradicalizing other extremists, because this is a new project that they can channel their energies into and where their own involvement and extremism is actually something that’s valuable because they understand it, so they can say, ‘Well, I’ve not wasted these five years. They’re actually useful for me, because I now understand how these people work.’”

14. Keep multiplying. Once they begin emerging from the rabbit hole, help them to reconnect with their personal world—restoring old relationships that have been ruptured—as well as to expand their world by getting new experiences that may undermine some of their bigoted or paranoid beliefs.

Samantha Kutner has found that helping her interlocutors actually come face to face with some of the people they have demonized in their conspiracist imaginations has a powerful effect.

“I would really focus on having them come into contact with the groups and the people that they’ve demonized in a controlled, safe setting,” she says. “And having them see that the idea of whoever it is, whether it’s a Jewish person or a black person or a Hispanic person, the idea that they’ve been sold about these groups of people is not true; these stereotypes are lies.

“I think the contact hypothesis would be a really good approach to the reflective, literary component. There would be self-work. There would be outreach for people who were former extremists, and then there would be people who don’t fit the mold that they’ve been conditioned to believe in. It would involve speaking to them and just saying, ‘Look, this is my life. Spend a few hours with me, and this is what I do.’” People, after all, tend to shed their horns and pitchforks after a cup of coffee together.

15. Help them to heal. That must be the abiding principle behind every one of these steps.

The baseline of this kind of undertaking is fixing something that has been broken. Conspiracy theories break people. They break relationships, families, communities, nations. They can be overcome only by human healing and compassion backed by moral and ethical clarity, the antithesis of the sociopathic world they engender.

The success of the process may hinge on the extent to which you and the person you’re helping can interact in the real world. That’s Peter Neumann’s advice, at least.

“They say of course the Internet offers opportunities to find people that perhaps you wouldn’t interact with normally, but I think the aim should then be to have an off-ramp where you pull it back into face-to-face conversation,” he says. “I mean, a conversation can start on the Internet, and if someone is willing to engage with you, that’s great, but I think ultimately the solution needs to be face-to-face.

“And that’s why a lot of people are currently talking about this combination, on and off, looking for people online, trying to figure out who is potentially open to engaging, who has doubts and questions, but the endgame is always to engage with them face-to-face and to have a facility that basically allows them to get real help and solve the problem that they have.”

________

The work of saving the world from a descent into the madness of conspiracism need not, of course, fall entirely on the shoulders of the friends and family members of the people who fall into its snares. They may turn out to be the most effective tool, but government, business, and society at large have critical roles to play as well.

Peter Neumann’s International Centre for the Study of Radicalisation at King’s College, London, studied the problem of American online radicalization in depth as early as 2013 and published a comprehensive overview of how authorities can tackle it.

The first step: reducing the supply of extremist content. But that comes with a major caveat.

“First comes the recognition that—for constitutional, political, and practical reasons—it is impossible to remove all violent extremist material from the Internet and that most efforts aimed at reducing the supply of violent extremist content on the Internet are costly and counterproductive,” it explains at the outset.15

Next: reducing the demand. These measures would work, “for example, by discrediting, countering, and confronting extremist narratives or by educating young people to question the messages they see online.”

Finally: exploiting the Internet. Making practical use of online content and interactions for the purpose of gathering information, gaining intelligence, and pursuing investigations is essential for preventing violence and terrorism.

Neumann and his team explain that the options for reducing the supply of extremist content are limited, particularly in the United States, where constitutional free speech protections leave room for only extremely circumscribed censorship. In Europe, nationwide filters and legal restrictions may curb the spread of extremist material in nations whose laws prohibit it, but that won’t affect American consumers to any appreciable extent. Indeed, “most of the traditional means for reducing the supply of violent extremist content would be entirely ineffective or of very limited use in the U.S. context,” the study explains.

The most viable option in the United States would involve commercial takedowns of extremist material from the platforms where they fester, such as YouTube, Facebook, and Twitter. Those companies have, since early 2019, begun making serious efforts at removing extremist content from their platforms, though with mixed results.

“One practical option could be for government agencies to create and, where appropriate, strengthen informal partnerships with Internet companies whose platforms have been used by violent extremists,” the study suggests. “The objective would be to assist their takedown teams—through training, monthly updates, and briefings—in understanding national security threats as well as trends and patterns in terrorist propaganda and communication. As a result, online platforms such as Facebook and Google would become more conscious of emerging threats, key individuals, and organizations and could align their takedown efforts with national security priorities.”

Reducing demand for extremist and terrorist material online is a much more complicated proposition. Neumann’s study focuses on “activating the marketplace of ideas”—that is, conducting outreach in the very spaces where the radicalism is growing. It notes that chief among the drawbacks to engaging people online is the decided “enthusiasm gap”: “Instead of having extremist views drowned out by opposing views, the Internet has amplified extremists’ voices.” The strategy is also hampered by significant gaps in the pluralism of ideas, as well as major gaps in skill level.

More promising, it suggests, would be measures aimed at “creating awareness” in a way that was effective and “building capacity in order to assure that alternative voices are heard.” Countermessaging—which would expose people to messages that are specifically designed to counter the appeal of extremism—can also work, but it too has limitations when coming from officials or authorities. For such campaigns to really work, they have to be fueled and propagated at the grassroots level by ordinary people.

The study also discusses ways that the Internet can be an effective tool for gathering intelligence on political extremists, because so many of them organize online. It can also be used by investigators in collecting evidence of crimes afterward.

However, the centerpiece of the study was its finding that promoting digital-media literacy was the “most long-term—yet potentially most important—means of reducing the demand for online extremism.”

“In recent years, educators and policymakers have recognized the unique risks and challenges posed by the Internet,” the study notes. “Most efforts have focused on protecting children from predators and pedophiles, with the result that—in practically every school—kids are now being taught to avoid giving out personal details and to be suspicious of people in chat rooms. Little, however, has been done to educate young people about violent extremist and terrorist propaganda.”

Neumann says that families and friends still hold the keys to preventing radicalization, because they see its effects in real life even when the radicalization itself is unfolding online. Often, red-pilled young men hide those online activities, but their alienation and rising anger nonetheless become manifest in their daily work and family lives.

“We know that in terrorism cases, for example, when people radicalize, we know that a lot of extremists who may be very deep in their extremist worlds still have so-called bystanders—that is, they have friends, family, people around them, colleagues at school, but most importantly family who still have an influence on them.”

Many of these people express shock at the person’s radicalization afterward, but many also acknowledge that they saw warning signs. And even after they had fully immersed themselves in their extremist world and embraced Nazism or Islamic State or whatever end the radicalization led to, they would maintain their ties to these bystanders.

“So, even people who went to Syria to join ISIS, we’ve observed a lot of them, they would still WhatsApp with their mom, because they were missing them, and because for them, their father or their mother was still a figure of authority, not in all cases, but in some cases they were,” Neumann said. “So, that’s why a lot of these programs, these countering violent extremism programs are about empowering mothers, empowering parents, etcetera.

“So, I do think that people should take an interest and should be cognizant when suddenly one of their friends or their kids starts saying weird stuff and gets into something that is really problematic, because they all talk about it; we know about this. It’s not like they’re keeping quiet about it, especially in the early phase when you discover a new paradigm that explains a lot of things that you thought the government was keeping secret. You’re very missionary, you’re talking about it all the time, and I do think that so-called bystanders, family, friends, people at school should pay attention to that.

“Because it’s exactly at that point that you can still intervene and that you still have influence and you can still talk someone out of it or you can engage with them, which at a later point may be much, much more difficult. So, it is really important to pay attention to this and to be ready to engage in a discussion about it for people who are connected to them, at an early point.”

________

The question any investigative reporter worth his or her salt would ask when looking at the twisted world of conspiracy theories and its globally toxic effects on every level of society is: who benefits?

Who stands to gain, monetarily and in power, from a phenomenon in which large numbers of people form communities built around non-facts and nonsense that inevitably crumble in fits of extremist rancor and fiduciary miscreancy and, in the process, cut themselves off from their communities and from the political process? Who gains when millions of people dismiss democracy as a delusional joke and abandon what political franchise they possess? Who gains when democracy is debilitated, undermined, hollowed out from within?

Authoritarians do. Certainly authoritarians in government, who prefer that every act reflect the instincts of the revered leader atop the heap because they believe that will produce a well-ordered society that also makes them wealthy. But especially authoritarians who run corporations, who prefer operating without the constraints imposed by democratic government, particularly health, labor, and employment regulations, as well as environmental and workplace safety rules. These are the people who stand to benefit when a democratic society defenestrates itself, unlinks its arms, and retreats into survivalist bomb shelters to await the apocalypse or the civil war, whichever comes first.

The world, in truth, has always had a kind of unapologetic conspiracy operating in the open in every nation, under every form of government: namely, the conspiracy of established authority, entrenched wealth, and traditional cultural centers to maintain their positions and enhance and expand them if possible. “He who has the gold, rules” as a cynical but realistic version of the Golden Rule is not a new joke.

Conspiracy theories, sociologist Chip Berlet has long argued, are a kind of wedge driven between ordinary people and reality, one whose whole purpose is to distract the public’s attention from the very real conspiracy happening before their faces.

“Conspiracism is neither a healthy expression of skepticism nor a valid form of criticism; rather it is a belief system that refuses to obey the rules of logic,” he explains. “These theories operate from a pre-existing premise of a conspiracy based upon careless collection of facts and flawed assumptions. What constitutes ‘proof’ for a conspiracist is often more accurately described as circumstance, rumor, and hearsay; and the allegations often use the tools of fear—dualism, demonization, scapegoating, and aggressively apocalyptic stories—which all too often are commandeered by demagogues.

“Thus conspiracism must be confronted as a flawed analytical model, rather than a legitimate mode of criticism of inequitable systems, structures, and institutions of power. Conspiracism is nearly always a distraction from the work of uprooting hierarchies of unfair power and privilege.”16

“Conspiracy is really a tool of power,” observes Michael Caulfield, who notes that in recent years, academic discourse about conspiracy theories has wandered into discussions of “who gets labeled a conspiracy theorist and who doesn’t.”

“From my point of view, it’s been interesting to see that hit the wall of Internet reality, where we’re looking at this massification of conspiracy theory,” he says. “Some of that starts to look a little precious in terms of what we’re looking at, precisely because of the sorts of violent events which we’re talking about, but I think also because more and more people are having this experience of watching people drift, bit by bit, deeper and deeper into these communities.”

Kate Starbird first stumbled upon the machinations of foreign authoritarians tampering with the world’s media ecosystems when she scientifically examined the spread of misinformation about Black Lives Matter and found clear evidence that Russian intelligence operatives were deliberately sowing racial disharmony on social media. Now she’s finding a variety of strands of authoritarianism woven throughout a number of disinformation campaigns whose data she and her team at the University of Washington have gathered.

She believes these campaigns are effective in large part because of the environment in which they’re occurring, one where more and more people are adopting strong political identities.

“So if that political identity is strong and we’re very polarized, we’re more susceptible to disinformation,” she says. “Disinformation targets political division, and that’s also a place where we seem to be psychologically susceptible to it as well.”

The work led her to examine the Cold War propaganda techniques in which much of today’s sponsored disinformation has its roots. “There are some really interesting old books on Russian disinformation, particularly the Ladislav Bittman books,” she says. “In one of them, he talks about how really partisan political identities are ones that are easily manipulated in different ways, and they said at the time, the KGB was really focused on targeting the political left, because that aligned historically with their political ideologies.

“But they’re saying that there’s no reason that these techniques wouldn’t also work on the far right, and they’d already seen some instances of this as well, because they said the vulnerabilities themselves aren’t about ideology. They’re about being so committed to a political identity that you become unable to differentiate truth versus familiarity and what you want to believe.”

More current outbreaks of authoritarianism, such as the far-right regime of Rodrigo Duterte in the Philippines, provide even more vivid insights into how the disinformation functions. A recent paper on networked disinformation grew out of a study in Manila in which researchers from the University of Leeds interviewed people participating in a government disinformation campaign built on social media troll accounts that blamed “drug dealers” and liberal elites for all the nation’s ills. The result, it found, was a “public sphere filled with information pollution and, consequently, with toxic incivility and polarization.”17

“We’ve been trying to figure out, what’s the intersection between these populist movements and this disinformation infrastructure,” Starbird says. “They intersect, but it’s not causal. It seems to be something else. And the rise of these populist movements all over the world, why do they seem to intersect?

“They’re authoritarian, because that’s their nature,” she adds, noting that the populism all seems to be of the right-wing variant. “What we’ve been seeing in a lot of these recent movements is that the leaders that are able to take advantage of this are ones that are willing to reflect back at the movements what they want to hear, which I guess is the basis of how populism works. It’s grievance based, anti-elite based, which feeds into conspiracism.

“But they’re able to reflect back some of the nastier parts of mob behavior, which the Internet enables in all sorts of different ways. Then these leaders have been able to take advantage of that in different ways. And you can see the mediascape, the people that have risen in the new media especially, becoming a media player out of doing the same kind of thing, echoing back these kinds of things to the crowd. Finally, political leaders are able to use social media directly to sense what people are talking about and then reflect it right back at them. So we’re seeing that en masse.”

Formulating an effective response in defense of democracy itself is, in fact, challenging, in large part because so many people living in those democracies haven’t recognized that the ground is shifting around them and the landscape with it.

“The hard thing is, there’s a really complex set of multiple things happening in concert that are resonating,” Starbird says. “It’s great to unpack them and understand them, but any one explanation is going to fall short of being complete, because it’s such a complex system.”

The most essential step in tackling the spread of disinformation and conspiracism on social media, she says, is “to build things that are trustworthy. Not just social media, but all sorts of information ecosystems. Journalism has to figure it out; social media has to figure it out. How do we make trustworthy information-participation systems? There’s not an easy answer to that.

“It feels like we’re being overwhelmed by something that we don’t necessarily have the tools of society to grapple with. That seems to be resonating in ways very quickly and unexpectedly with political conditions and information systems and all of these things are resonating in ways that are just putting us in a really bad place.”

The scenario, she says, reminds her of old film footage from 1940 of the Tacoma Narrows Bridge in southern Puget Sound, a suspension bridge that was renowned among drivers for wobbling unsteadily on windy days, earning it the sobriquet “Galloping Gertie.” Opening on July 1 that year, it remained up only until November, when a powerful windstorm created such wild oscillations that the bridge came apart and fell into the Sound. Film footage showing the bridge structure warping wildly before its collapse became iconic in the Pacific Northwest.

“Galloping Gertie—I feel like that’s what’s happening,” she says. “It’s just hitting us one way and something else is hitting us a different way. It’s hitting this resonance, and it’s out of control.

“What is it going to take to cut it? Maybe that keeps going until everything just gets shattered to pieces, or maybe we just figure out how to get out of that loop, and we can resolve some of it.”

________

We know what taking the so-called “red pill” looks like. If we could devise a “blue pill,” what would happen? What would it do to you? To what would you “awaken”?

It’s what we all need right now.