CHAPTER 4

LIFE AFTER Q

“The Storm” blew over. January 6, 2021, came and went without any of the cabal being arrested as Q had prophesied. There were no televised trials or executions of powerful pedophiles. No high-flying cannibals were taken to Guantanamo prison. On January 20th, Joe Biden was sworn in as president of the United States, instead of Trump beginning his second term as “the plan” had promised.

Disappointment and disbelief rippled across QAnon Internet forums. Predictably, reactions fell into one of three categories. The Diehards doubled down on their belief system. The Doubters tried to reconcile the reality they couldn’t deny with the ideas they couldn’t abandon. And the Defectors dropped out, disillusioned with QAnon.

The Diehards

Faced with the failure of Q prophecies, the Diehards dug deeper into the folQlore, claiming everything—the certification of the election, the failed insurrection, and the inauguration—was part of “the plan.” On Gab, a social network known for its far-right userbase, a QAnon account with 130,000 followers posted, “Does it make sense that Trump would ‘give up’ like this? What if it had to be this way, what if this actually ends up being the best way? . . . Call me crazy, but I don’t think this movie is done.”1 Another Diehard wrote, “Like many of you, I am in shock by today’s [events] and then I realized why it had to happen and that Q told us it would happen and, why this NEEDED to happen.”2

Joe Biden didn’t actually become president, Diehards said. Rather, the televised inauguration was another hoax, a ruse to distract the uninitiated. Some claimed that Biden’s inauguration was actually taped 11 hours earlier, while on January 20th, in a secret ceremony, Trump was sworn in as president. Others rolled with the idea that the inauguration was indeed held in Washington, DC, and Biden was sworn in, but that it was part of an entrapment plan set up by Trump to incriminate Biden and the deep state. “Things have just started,” explained Tiffany, a Diehard QAnon: “They had to ‘commit’ the crime to fully lock the deal.”3

Among the “clues” Diehards counted as evidence for their freshly baked conspiracy theory was the number of flags on the dais at Trump’s farewell address before he departed Washington, DC, for Florida. There were 17 flags, they observed. Guess what the 17th letter of the alphabet is? It’s Q! Clearly, this was a hidden message from Trump that “the plan” was still working. Another clue was that the Bible on which Biden swore was wrapped in a leather binding, which meant that he didn’t actually swear on the Bible and so wasn’t a legitimate president.

“Be prepared, and stay cool,” Diehard Valerie Gilbert wrote to her Facebook friends. “Slow and steady wins the race. We’re in the home stretch now.”4

The Doubters

For many QAnons, the public failure of Q prophecies stirred cognitive dissonance, an anxiety-ridden state that arises when facts can’t be easily reconciled with one’s self-image.5 Q was no help, having gone eerily quiet: since November 2020, Q had posted only four times, leaving followers to deal with their disappointment on their own. Some feared they had been misled or betrayed.

“I am so scared right now, I really feel nothing is going to happen now,” wrote one Doubter on a Telegram channel frequented by QAnons. “I’m just devastated.”6

The Doubters weren’t ready to concoct another conspiracy theory to explain the glaring mismatch between reality and Q’s predictions. Something had gone wrong, that much was clear to them. At the same time, however, they weren’t ready to give up on the ideas of saving abused children or finding and punishing the pedophiles and cannibals responsible for their suffering. “I will continue speaking truth. I have not given up. I still have faith. I still know that God Wins,” a Gab account posted.7

After Twitter banned Donald Trump, and Facebook shut down accounts that were posting disinformation, QAnon traffic on these social media platforms declined precipitously.8 Some of the most radical hashtags related to the “stolen election” conspiracy theory (#FightForTrump, #HoldTheLine, and #MarchForTrump) declined by 95 percent across Twitter, Facebook, and Instagram.9

Tens of thousands of QAnons migrated to unmoderated social media, such as Gab, 4chan, and 8chan. These more permissive Internet forums already hosted hate groups and radical movements that had been kicked off Twitter, Facebook, Instagram, and Reddit for violating their terms of service. They cheered the influx of QAnon Doubters as a unique recruitment opportunity. On 4chan, an anonymous account posted,

This would be the perfect time to start posting Nat Soc [Nazism] propaganda in Q anon groups. Clearly, this is a very low point for Q believers, and once people have been broken, they will look for ways to cling back to hope again.10

The Proud Boys were similarly abuzz on Telegram,

Parler being shut down has sent tens of thousands (or more) of people to telegram. All of them are seeking refuge and looking for answers since their Q-bullshit lied to them. Now is our opportunity to grab them by the hand and lead them toward ideological truth. Join their normie chats and show them love and unity.11

Strategies for bringing QAnon Doubters into alt-right proper included pushing on them “the most extreme talking points that they already have in their head thanks to Trump.”12

The Defectors

Some QAnons had seen enough—Trump’s failure to lead them in the insurrection that he allegedly incited, Q’s failure to guide them after the November election, and the eventual transfer of power to President Biden. They connected the dots into a bigger picture: QAnon was full of lies.

“Power has changed hands and that is the end. In the time we needed Trump and Q the most . . . [they] both shut up and left,”13 someone posted on a QAnon-related forum.

In the aftermath of the January 6 insurrection, law enforcement began identifying people who stormed the Capitol building—and rolling out arrest warrants. This was another jarring reality check—not only because they didn’t expect any negative consequences of their actions, but also because the community they’d come to see as their family abandoned them when they needed it most. “Not one patriot is standing up for me,” said Jenna Ryan, a real estate agent who became infamous for flying to the January 6 insurrection by private jet. “I’m a complete villain. I was down there based on what my president said. ‘Stop the steal.’ Now I see that it was all over nothing. He was just having us down there for an ego boost. I was there for him.”14

While QAnons were being indicted for trying to deliver the presidency to Trump, their hero and leader did not seem to care about them. In the final days of his presidency, Trump was issuing dozens of pardons. But not a single presidential pardon named a QAnon follower who stormed the Capitol building for him.

“So just to recap: Trump will pardon Lil Wayne, Kodak Black, high profile Jewish fraudsters . . . No pardons for middle class whites who risked their livelihoods by going to ‘war’ for Trump,”15 summarized one post on a white supremacist channel on Telegram. Gut-wrenching feelings of abandonment and betrayal led many QAnons to leave.

For others, defection came from the head, not the heart. Jitarth Jadeja was a true believer in QAnon, until one Q clue, designed to cement followers’ belief in Q’s legitimacy, did the opposite for him. The Q-drop said President Trump would use the phrase “tip top” in one of his speeches to send a coded signal of support for QAnon. Amazingly, soon thereafter, Trump did use that phrase. But when Jadeja did some online research, he found many other instances in the past when Trump had said “tip top.” It was clear to Jadeja that Q had used the same trick that is often used by horoscopes and fortunetellers: pointing out something that seems unique but is in fact quite common. Q’s “tip top” drop was a version of “Your heart has been broken by someone you deeply trusted” or “There’s someone important in your life whose name begins with a J.”

Jadeja was crushed by his discovery. “It was the worst feeling I had in my life,” he said. He discovered a subreddit called r/Qult_Headquarters, “dedicated to documenting, critiquing, and debunking the chan poster known as ‘Q’ and his devotees.” On the subreddit, Jadeja posted a 659-word essay that began, “Q fooled me.”16 To Jadeja’s great surprise, instead of the ridicule he feared, he found a welcoming and supportive community on the forum. This newfound connection with other Defectors helped him to leave QAnon for good.

Melissa Rein Lively’s defection came not from the heart or the head, but from her husband calling the police on her. The Scottsdale, Arizona, businesswoman’s life had become consumed by her addiction to QAnon. Her husband staged an intervention and gave her an ultimatum: the family or QAnon.

Lively, as we discussed in Chapter 2, is the QAnon follower who became infamous for live streaming herself as she destroyed a facemask display at Target and yelled at store employees. As a result of her public outburst, the PR business she had built from the ground up was destroyed: Her clients didn’t want to have anything to do with her. Lively, her husband, and even his business partner were getting death threats. She didn’t feel safe anymore. “I was all consumed with doom-scrolling on the Internet. I was living in these conspiracy theories. All of this fear porn that I was consuming online was just feeding my depression and anxiety.”17 Lively’s husband tried to get through to her, to convince her to leave QAnon, but failed. That’s when he called the police.

Lively live streamed the intervention, too. She told the police that she was a “QAnon spokesperson” and that she was on the phone with President Trump “all the time.” After the police detained her to take her to a nearby mental health facility, she screamed, “You’re doing this to me because I’m Jewish!”

In retrospect, Lively sees her husband’s intervention as an act of kindness to someone whose psychological disorder took over her life. At the mental health facility, she was diagnosed with bipolar disorder and with posttraumatic stress disorder stemming from losing both her parents at a young age. Intensive therapy she has been receiving helped her to leave the world of QAnon behind. As a PR professional, she has made it her business to publicly apologize to Target and the employees she had mistreated. She has signed new clients and has reconciled with her husband. Like Jadeja, Lively was moved to find that people were willing to give her a second chance.

Diehards, Doubters, and Defectors present unique challenges to policymakers and to security services. In this chapter, we will consider what could aid QAnon Defectors in reintegrating into “normal” life, what could draw QAnon Doubters out of the conspiracy theory world, and what could minimize the danger from QAnon Diehards.

But QAnon’s damage is not limited to its followers. A recent poll found that 6 out of 10 Americans couldn’t correctly identify four or more conspiracy theories as false.18 That number represents about 125 million Americans whose beliefs were upended by QAnon disinformation. This group includes QAnon Doubters, but it also includes a vastly larger group of people who have not yet fallen down the rabbit hole of conspiracy theories but who teeter on its edge. These millions of Americans are uncertain about their reality, and they likely experience anxiety and fear because of that uncertainty. The interventions proposed in the following discussion would benefit them as well as former QAnons.

The same NPR/Ipsos poll found that one in five Americans believed the following statement is true: “A group of Satan-worshipping elites who run a child sex ring are trying to control our politics and media.”19 One in five Americans translates to roughly 36 million adult Diehards. This staggering number suggests that most of us know someone—a family member, a friend, a colleague, or a neighbor—who is deep into the QAnon rabbit hole. In the weeks following the January 6 insurrection, news stories featured devastating firsthand accounts of adult children losing their parents to QAnon20 and of parents whose adult children became QAnonized and severed contact with them.21 Siblings stopped speaking, and long-term romantic partnerships broke up when one person in the relationship followed the white rabbit into QAnon.22

QAnon leaves behind a trail of broken families and friendships, with millions of people left to pick up the pieces. They, too, need help. The interventions proposed here are designed to address the needs of those who themselves have never believed any of Q’s lies, but whose lives have nonetheless been shaken by them.

Four-Pronged Approach

QAnon arose from a perfect storm of social norms crumbling amidst the fear and loneliness of a global pandemic. As a rising tide lifts all boats, this perfect storm pushed different people into QAnon for different reasons. Getting them out will require similarly diverse methods. Here, we present four approaches to de-QAnonization.

The first set of counter-QAnon measures would increase psychological resilience to conspiracy-theory phenomena like QAnon. We imagine these measures as building immunity to disinformation, fear mongering, and false prophecies. They would include teaching critical thinking and social media literacy and offering “psychological inoculation” against conspiratorial thinking. This intervention prong would especially benefit Doubters.

The second set of measures aims to decrease the danger of QAnon-type disinformation to the general public by reducing the virulence of conspiracy theories. Limiting exposure to dangerous rhetoric online, by enforcing social media terms of service and de-platforming or otherwise restricting accounts that spew disinformation, would curb radicalization. Here, we suggest improving social media filter algorithms, airing out troll-infested forums, and flooding the unregulated online space with narratives that undermine conspiracy theories. Limiting exposure would benefit Diehards, Doubters, and Defectors, as well as other non-QAnon Internet users.

A third set of measures proposes professional treatments for mental health issues that brought people into QAnon, or that arose from following it. Included in this intervention prong would be individual cognitive psychotherapy, support groups, and volunteering. Offering treatment would be of greatest use to those Defectors who are ready to admit they have a psychological problem and are willing to seek help for it. It would also benefit non-QAnons whose psychological well-being has suffered because of QAnonized loved ones.

Finally, the fourth approach recommends adjusting expectations. Those of us outside of QAnon may feel scared, confused, or angry at people falling for blatant lies. We may want the craziness to stop, immediately. Adjusting our expectations would moderate our stress levels. For the Defectors who have left QAnon, it’s equally important to set realistic goals about re-entering “normal” life and rebuilding broken relationships. Adjusting expectations would help former QAnons and non-QAnons alike. Here, we propose comprehensive mindfulness training. Figure 4.1 illustrates the measures that can be helpful in saving people from the impact of QAnon.

Building Immunity

Critical Thinking and Social Media Literacy

Q may be one person or a few different people, but they had a lot of help getting the message out. Less than two weeks after the first Q post appeared on 4chan, Russian Internet trolls began amplifying it. In 2019, accounts suspected of being controlled by Russia’s government-backed Internet Research Agency sent out a large number of tweets tagged with #QAnon and #WWG1WGA.23 After Twitter removed the offending accounts, the Kremlin-controlled news agency Russia Today (RT) devoted more and more coverage to QAnon, condemning Twitter’s “censorship” and posting Q-conspiracy theories. RT issued a similar slew of QAnon-supporting content after Facebook removed accounts that spread disinformation. Heartened by RT’s support for their cause, QAnon followers began sharing more content from Russian outlets such as RT.24

Figure 4.1. Counter-Q measures. Source: Sophia Moskalenko and Kristian Warpinski.

Russian propaganda has targeted not just QAnon supporters but all Americans. In the months before the 2016 presidential election, an estimated 126 million U.S. Facebook users had seen posts, stories, or other content created by accounts backed by the Russian government,25 and an additional 20 million had been exposed to this Russian content on Facebook-owned Instagram.26 A similar picture emerges from Twitter and YouTube. Russia has weaponized social media to spread rumors, conspiracy theories, and emotion-stirring images in a coordinated effort to radicalize the United States from within.27

Other countries were targeted with Russian disinformation as well: former Soviet republics (Ukraine, Georgia, Estonia, Lithuania), as well as countries of the EU (Poland, France, Germany). Russian trolls used the same tactics in different languages: planting and amplifying divisive messages, posting conspiracy theories to stoke fear (vaccinations are disguised tracking devices) and anger (powerful pedophiles are abusing children).

But Americans fell for Russian lies at a much higher rate. Researchers found an average of 1.73 likes, retweets, or replies for Russian troll posts in any language other than English. For English-language posts, the rate was nearly nine times as high (15.25).28 Americans, it turned out, were easy targets for Russian propaganda, including Q-conspiracy theories.

One cause of this American vulnerability is that U.S. education lags behind Europe’s in teaching critical thinking and social media literacy. The First Amendment to the U.S. Constitution guarantees freedom of speech, no matter how vile. Perhaps because of this foundational principle, Americans don’t appreciate the danger of some speech in quite the same way as do Germans, who have experienced the power of Nazi propaganda, or citizens of former Soviet countries, who lived through propaganda-driven communist regimes.

QAnon’s rise shows how important it is to build up psychological defenses against malevolent actors who use false narratives to radicalize. A recent study found that believing in COVID-19 conspiracy theories correlates with an inability to follow scientific reasoning.29 Another study discovered that people who had trouble performing complex mental tasks were more likely to hold radical views.30 These two findings point to the same conclusion: An inability to think critically predicts a radical and conspiratorial mindset. Americans should follow the European example in teaching critical thinking and social media literacy from an early age and through adulthood.

European countries teach critical thinking as part of their school curriculum, paying special attention to disinformation online. For example, Finland’s schoolchildren learn to identify fake news by studying examples of fake stories, altered photos, and divisive content designed to foment group conflict.31 From elementary school onward, Finnish children practice sorting online content into truth or fiction, assessing media bias, and deciphering how clickbait content preys on users’ emotions. This approach has a track record of success: A variety of media literacy courses and initiatives implemented across European schools have been shown to be effective at building students’ ability to defend against disinformation.32

Similar courses should be integrated into U.S. school curriculums. Many American parents worry about their children’s online exposure to bad influences of all kinds, including to sexual predators. Introducing educational initiatives to teach children to identify and resist online dangers would do wonders not only for the children’s psychological resilience, but also for their parents’ peace of mind. Knowing children are learning how to protect themselves against online predators would relieve the pressure on parents to seek out and fight a global pedophilic cabal.

In addition, to build immunity against online disinformation in adults, media literacy programs should be available in public libraries. Many stay-at-home moms do not have the knowledge to discern one source of Internet information from another and may have only a vague idea about Internet trolls or bots. Free courses on basic concepts of social media’s power and pitfalls would help these women to become better-informed consumers of information. This would make them less vulnerable to online radicalization. As a bonus, community-based media literacy programs would open a new social milieu and possibilities for new real-life friendships.

Finally, social media companies would do well to integrate social media literacy exercises into individual user experiences. A person opening their Facebook or Instagram feed should occasionally be nudged to take a quick test of their ability to “tell a fake profile from a real one” or to “spot seven signs a post is fake news.” Through interactive and engaging content like this, users would improve their media literacy and boost their psychological resistance to QAnon-style disinformation.

Inoculation

It’s easy to get inoculation wrong, with biological viruses as well as with viral ideas. If someone tells you “Don’t think about a pink elephant,” this attempt at thought inoculation is likely to do the opposite. Even if you weren’t inclined to think of a pink elephant before, you will after the warning.

This may seem like common sense. However, politicians often get this truism wrong. For example, Nancy Reagan’s inoculation advice against drug use (“Just say no”) was adopted as the official slogan of D.A.R.E.—a drug-prevention program that spent millions of government dollars popularizing it. Following in her footsteps, Bob Dole, a one-time Republican presidential candidate, offered a new variation in the 1990s: “Just don’t do it.” Research showed that these anti-drug slogans just didn’t do it for the at-risk teenagers they primarily targeted.33 Those exposed to the anti-drug slogans were no less likely to try drugs than those who weren’t. While “Just say no” didn’t inoculate against drug use, at least it didn’t harm people’s chances of staying away from drugs.

Real danger lies with ham-handed attempts at inoculation that have the opposite of the desired effect. Presenting people with weak arguments against their beliefs can do exactly that. For instance, telling teenagers to exclusively practice abstinence because it’s morally superior to sex (a weak argument) actually results in more teenagers having sex and more teens becoming pregnant.34 When an individual can mentally or verbally defend their position against weak arguments, they feel more certain that they were right. Having won a fight against an alternative view, they are less likely to entertain it. Thus, instead of undermining original beliefs, weak arguments can make them stronger.35

For this reason, it is a bad idea to argue with QAnon followers. Conspiracy theorists are adept at dismissing or explaining away any evidence that contradicts their views. Arguing with nonbelievers is likely to entrench QAnons even deeper in their beliefs.

By the same token, simply labeling disinformation, as Twitter did to Donald Trump’s tweets before banning him, is likely to be counterproductive. The labels affirm conspiracy theory supporters’ paranoia about the “deep state” censoring their leaders. At the same time, the labels effectively signal to them which content to pay attention to, “pink elephant” style.

One type of inoculation, however, offers great promise in fighting the spread of conspiracy theories. This technique makes people’s defensiveness work to their benefit, a little like jujitsu. It goes something like this: Imagine you’re about to meet someone. But before they walk up to you, I whisper into your ear, “This is a scam artist, and you are their ideal target. Just FYI.” I haven’t given you any facts, but I have made you suspicious. Anything this new person says to you after my warning is bound to be scrutinized. If they do something bad, they confirm your suspicions. If they do something nice, you suspect there’s a catch. If they are, in fact, a scam artist, it would be mighty hard for them to fool you now.

Researchers used this cunning technique to inoculate teenage boys against the appeal of junk food.36 Some boys were told about the dangers of junk food—a straightforward message similar to the “Just say no” of anti-drug campaigns. Other boys, however, were told about the devious ways the food industry manipulates adolescents into buying junk food. The second message emphasized how adults were controlling teenagers—a sentiment that resonates with teens, triggering their defenses. In the three months that followed, researchers tracked what food the boys chose at their school cafeteria. For the first group, which had been told to stay away from unhealthy food, there was no difference: They consumed as much junk food after the researchers’ intervention as before. But in the second group, researchers saw a strong and lasting effect of their message. Teens who were forewarned about deception aimed at them made better food choices. The inoculation worked.

Expanding on this finding, a recent study tested attitude inoculation against political messaging.37 Participants in the study were presented with a slightly reworded manifesto from the Weather Underground, a 1960s terrorist group. But some of them were inoculated in advance: They were told that they might encounter a political message from an extremist group that recruits “people just like you.” These inoculated participants were much more likely to argue against the radical political message and to be suspicious of the group behind it than participants who didn’t receive an inoculation.

An effective counter-conspiracy theory campaign would include inoculating messages on social media. These would micro-target people vulnerable to disinformation, warning them about nefarious groups seeking to recruit “people just like them.” For example, a white, suburban woman browsing pastel-colored Instagram posts might see a banner alerting her that “some of this content was created by Russian trolls to manipulate opinions of American white, suburban women.” Or a mother, watching YouTube videos about child abuse, might see one reporting on the high volume of fake online content about pedophiles that seeks to exploit mothers’ protective instincts in order to milk them for donations to fake charities. We need more research to calibrate both the micro-targeting and the messaging. What is already clear is that attitude inoculation would be a great step toward building psychological resilience to QAnon.

Limiting Exposure

Social Media Filters

The tech giants—Twitter, Facebook, Instagram, and YouTube—began clamping down on disinformation after the January 6 insurrection brought QAnon into focus for the U.S. public. A number of conspiracy-spewing accounts were shut down, including those of Donald J. Trump, General Michael Flynn, and Alex Jones, as well as over 70,000 Twitter QAnon accounts. This was a long-overdue and much-needed action. The results were immediate and impressive. In just one week, the amount of misinformation on these platforms went down by 73 percent.38 With the super spreader accounts gone, the momentum to rally support and recruit new people into the QAnon movement also died down. Mentions of election fraud and the stolen election dropped from 2.5 million to just 688,000. Likewise, the use of the hashtags #FightForTrump, #HoldTheLine, and #MarchForTrump declined by 95 percent. Facebook users also became a lot less likely to click on or “like” right-leaning content in the days after Facebook banned Trump. It turns out that social distancing works not only with biological viruses but also with viral information.

Although successful, this effort was reactive rather than proactive. By the time QAnon followers stormed the Capitol, social media had already allowed conspiracy theories to proliferate. Influential individuals—super spreaders—had lent them credibility by reposting, liking, or engaging with their content. As a result, the number of people exposed to the radical rhetoric ballooned. Even when the social media companies eventually shut down conspiracy-posting accounts, there was no apparent coordination among the platforms. Users kicked off one platform transitioned to another, taking followers with them.

To curtail the spread of conspiracy theories, there must be a centralized and coordinated contingency plan for all social media platforms. Social media companies must take responsibility for addressing problematic content in real time—preventatively—before it has a chance to spread.

Adjust Algorithms

Part of what made QAnon into a giant social movement was the social media algorithms that recommended content to users. A woman worried about COVID-19 infecting her family would go online seeking information about the virus, and the algorithms would lead her down a “garden path” toward videos and posts about a Chinese bioweapons laboratory designing COVID-19 to control the world or about COVID-19 vaccines turning her children into homosexuals. Searching for information about Trump’s re-election, another woman would see recommendations about a secret cabal of pedophiles that Trump was fighting. As with the 6 o’clock news on TV, fear mongering and cliffhangers ensured that users stayed glued to their screens. Post by post, click by click, they ended up in the rabbit hole without even trying.

The algorithms are proprietary: Internet companies keep secret how they decide which recommendations users see. Researchers can’t get access to the user data that Facebook, YouTube, Twitter, and others collect. But these restrictions are not applied equally. Cambridge Analytica, a private political consulting company, was implicated in a scandalous 2018 data breach in which it had reportedly obtained private data from over 87 million Facebook users. Cambridge Analytica then used these highly detailed personal data to micro-target vulnerable users—for example, by discouraging them from voting.39

Going forward, social media companies must implement more oversight and offer more transparency. They should also consistently enforce their own terms of service, banning violators. This would cripple the capacity of malicious actors (such as Cambridge Analytica or Russian bots and trolls) to spread conspiracy theories.

But that’s not enough.

Another problem QAnon exploited was that “alternative” social media platforms—such as Parler, Telegram, 4chan, and 8chan—do not abide by even the limited regulatory guidelines currently enforced by Twitter and Facebook. These alternative platforms remain a digital Wild West, providing a refuge for the most radical, and thus most dangerous, individuals and groups. Lawmakers must consider the implications of this unregulated Internet space and design some system of accountability. In the meantime, the following strategies work on these fringe platforms just as well as on mainstream ones.

Air Out Troll Caves

Russian psyops (psychological operations) have systematized and weaponized online trolling. Reportedly, thousands of Russian citizens are employed full-time as Internet trolls, clicking and commenting day and night to champion the Motherland’s interests.40 A post with thousands of likes tends to be recommended by social media algorithms more often than one with just a few likes. Trolls trick the algorithms into prioritizing the content they rally behind. And when the algorithms bring troll-ridden posts up on users’ timelines, users more often click on them and believe their content, because they think many other people liked them. Thus, trolls have perfected the art of tricking both computer programs and people on social media. But what if this art could be used for good instead of evil?

What if, instead of trolls (or perhaps in addition to them), there were “elves”—individuals whose job would be to add reality checks and counterarguments to discussion boards and forums? Elves would challenge anyone who posts disinformation to “bring receipts” or admit they were full of hot, stinky troll air. With elves airing out troll caves, an average user would get a clearer perspective than they would with only trolls skewing their view.

Psychologists studying group discussions discovered that the more arguments for a radical position a person encounters, the more probable it is that they shift their opinion, becoming more radical themselves.41 If you hear two people mention a crazy rumor, you’re more likely to question it than if a hundred people mention it. And if you see thousands of people agreeing with the crazy rumor, you are more apt to think that maybe it’s not so crazy after all. You’re a lot more likely to believe the crazy rumor yourself. A discussion board with active trolls running rampant is a radicalization minefield.

This is especially so with the kind of content that characterizes QAnon posts. In a classic series of experiments, Solomon Asch demonstrated that a unanimous majority stating an obvious falsehood can make a person accept and repeat the falsehood to avoid looking (and feeling) ridiculous.42 When asked to compare the lengths of three lines, for example, three quarters of participants would disregard the evidence of their own eyes and say that the short line was the long one, because the majority said so. (The majority, as you may have guessed, were the experimenter’s accomplices, instructed to lie.) Asch called this tendency conformity. The larger the group, the less likely a person was to disagree with the group’s clearly wrong opinions, and the more likely they were to conform to the majority. But Asch discovered a cure for conformity. When one member of the group (also an accomplice of the experimenter) voiced disagreement with the majority, participants’ conformity evaporated. As in Hans Christian Andersen’s fairy tale, when a little kid cried out, “The emperor is naked!” others chimed in and supported this view. So, too, in online discussion: a dissenting voice is a powerful reality check that can break the spell of conformity.

An elf entering a forum buzzing about vaccines that turn children into homosexuals might ask the most active users for evidence. “Can you give me a link to a study that shows that?” They can also challenge the assertion that the story is true: “I googled, but I can’t find any facts about this. Sounds like fake news to me.” If sources are posted, an elf might question the sources’ legitimacy: “Dude, that’s not, like, a real report. It’s just some guy speaking into his camera phone.”

Planting elves would undermine the influence of the majority on troll-driven forums. Seeing trolls’ ridiculous or factually wrong statements challenged would empower users to resist trolls’ disinformation. Thus, introducing elves to post counterarguments against radical opinions would make users less likely to embrace radical rhetoric.

Flush Conspiratorial Space

QAnon offered users the joy of connecting the dots and figuring out for themselves “the Truth” and “the plan.” This activity seemed like creative discovery to them, when in fact the “dots” were arranged with a particular sinister purpose, and most QAnons would end up with the same conspiracy theories after connecting them. This is akin to what psychologists call groupthink—a tendency of individuals in groups to fall in line, adopting the same ideas, no matter how flawed those ideas might be.43 Groupthink is going along to get along, at the expense of facts and sometimes common sense.

One way to disrupt groupthink in the digital space is to flood social interactions with appealing alternatives. With alternatives in view, people are more likely to question, and less likely to follow, the garden path prepared for them.

One study demonstrated how this could be accomplished with the help of bots—computer programs posing as human Internet users. Instead of connecting the dots into conspiracy theories, the study focused on inkblot interpretations, the kind used in the Rorschach personality test.44 As with conspiracy theories, when Internet forum users interpreted inkblots, they tended to converge on a single, most popular version. “It’s a butterfly,” the group would decide, ignoring other suggestions (crab, bunny, couch). And the larger the group, the more likely it was to embrace a single interpretation. Groupthink made people suppress their own dissenting opinions and embrace the most popular opinion instead.

But then the study’s authors did something to disrupt this process. Knowing that “butterfly” was the most likely interpretation to emerge, they introduced bots into the group discussion and made them suggest an unlikely and unpopular vision of the inkblot: “sumo wrestler.” Sure enough, as bots flooded the field, their interpretation gained followers. Fewer people were now agreeing that the inkblot looked like a butterfly. Once the bots comprised over 37 percent of the group, sumo wrestler became the dominant interpretation in the group. What’s more, this interpretation stuck with participants. Even after they were no longer in the same forum, even when they were shown a new inkblot that looked like a crab, not a butterfly, participants “brainwashed” by the bots were likely to see a sumo wrestler in it.
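The tipping-point dynamic the study reports can be mimicked with a toy simulation. This is a minimal sketch under simplified assumptions, not the study’s actual model; the agent counts, sample size, and update rule are invented for illustration.

```python
import random

def simulate(n_humans, n_bots, rounds=2000, sample=10, seed=42):
    """Toy majority-influence model. Humans start at the 'butterfly'
    consensus; bots always answer 'sumo wrestler'. Each round, one
    human polls a random sample of the whole group and adopts
    whichever answer dominates the sample. Returns the final
    fraction of humans saying 'sumo wrestler'."""
    rng = random.Random(seed)
    humans = ["butterfly"] * n_humans
    bots = ["sumo wrestler"] * n_bots
    for _ in range(rounds):
        i = rng.randrange(n_humans)
        poll = rng.sample(humans + bots, sample)
        if poll.count("sumo wrestler") > poll.count("butterfly"):
            humans[i] = "sumo wrestler"
        else:
            humans[i] = "butterfly"
    return humans.count("sumo wrestler") / n_humans

# A small bot minority barely dents the consensus...
print(simulate(n_humans=60, n_bots=5))
# ...but a large bot contingent flips it almost entirely.
print(simulate(n_humans=60, n_bots=40))
```

Below some bot share, the human consensus holds; above it, the bots’ answer cascades through the human majority, echoing the threshold effect the study observed.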

Translated into the world of conspiracy theories, these findings suggest that by flooding Internet discussions with alternatives to conspiracy theories, bots can make people less likely to connect the dots into the pattern pushed on them by malicious actors or by groups like QAnon. This technique would work on mainstream platforms as well as on the dark web. Flooding the conspiratorial space would limit exposure to dangerous rhetoric for Diehards and Doubters, as well as for those teetering on the edge of the rabbit hole.

To disrupt the spread of conspiracy theories, flooding would require two steps. In the first step, focus groups would explore conspiratorial “crumbs.” These focus groups would then be instructed to connect the dots in a way that differed from the prevailing interpretation. In step two, bots would inject these alternative interpretations into the conspiratorial space on the dark web or on mainstream social media. With bots flooding forums, users would be pulled away from conspiracy theories.

Offering Treatment

QAnon’s rise coincided with the onset of the COVID-19 pandemic and the lockdowns that followed. This was also the time of a developing mental health crisis in the United States: The rates of depression and anxiety among Americans went up by a factor of 4, from 1 in 10 people reporting symptoms before the pandemic to 4 in 10 reporting them after.45 QAnon appealed to people already feeling anxious and depressed, and following QAnon exacerbated these feelings.

People who dove into the depths of conspiracy theory world often lost social connections outside of it. Some lost their businesses because of their beliefs. Isolation and financial troubles added strain on mental health, causing people to seek comfort in the QAnon community and pushing them even deeper into the radical belief system. The vicious circle continually chipped away at psychological well-being.

In this book, we presented several cases of ex-QAnon women. Even within this thin slice of the movement, there was a high prevalence and a great diversity of psychological disorders: Munchausen syndrome by proxy, substance abuse, depression, anxiety, PTSD, and bipolar disorder. Each of these is a devastating condition that can render those suffering from it incapable of handling the demands of daily life.

Consider the severity of psychopathology it would take for a mother to say, “If Biden wins, the world is over, basically . . . I would probably take my children and sit in the garage and turn my car on, and it would be over.”46 Tina Arthur, owner of a small business, made this statement in the context of sharing her belief that Democrats are part of the world-ruling cabal and that they rape children and drink their blood. But the context is less important than the fact that Arthur’s statement is a credible threat, with a detailed and accessible plan to harm herself and her minor children. Conspiracy theories aside, someone like Tina Arthur would benefit from psychological help.

A report mining the criminal records of QAnon followers who were arrested for storming the U.S. Capitol on January 6 found an exorbitant proportion of them (68%) with documented psychiatric problems. These mental disorders ranged from depression and anxiety to paranoid schizophrenia, bipolar disorder, and PTSD.47 For comparison, only 21 percent of Americans had a diagnosed psychological disorder in 2019,48 a rate less than one third of that among QAnon insurrectionists. The same report found that seven out of eight women who committed violence at the Capitol riot had experienced a psychological trauma that led to their radicalization. The trauma was typically around their children being physically and/or sexually abused by their partners or by family members. The rates of psychopathology and reported psychological trauma among QAnons charged with crimes related to the January 6 insurrection are thus far higher than average. The overwhelming majority of these people are psychologically unwell and in dire need of comprehensive mental health treatment.

In fact, QAnon’s rise can be viewed as a symptom of a mass mental health crisis. Individuals who embrace Q-conspiracy theories may be inadvertently expressing deeper psychological problems, in the same way that self-harming behaviors or psychosomatic complaints flag deeper psychological issues. The American Psychological Association must direct resources toward studying beliefs in conspiracy theories as a symptom of other mental health problems, or as a separate diagnosis.

Presently, mental health services are out of reach for many Americans who don’t have health insurance, or whose health insurance doesn’t cover therapy. Even primary healthcare is in short supply in the United States, which means many people who suffer from psychological problems never receive a diagnosis or treatment. A comprehensive approach to rehabilitating former QAnon members would include psychological assessment, and, where needed, professional psychological services such as individual or group therapy.

Cognitive Therapy

Cognitive therapy, designed by Aaron Beck, is a highly successful treatment for a variety of psychological disorders.49 Its methods include guiding patients through the process of questioning their implicit assumptions (“nobody cares about me” or “doctors are only ever looking to get rich off of people’s suffering”) and recognizing their knee-jerk emotional reactions (shutting down at the prospect of a social interaction, or anger at medical professionals). Once patients learn to recognize their problematic patterns of thought and emotion, they can begin to counter them.

Through repeated practice of this discernment, in therapy sessions and in “homework” assignments completed between appointments, patients develop a more mindful, less reactive cognitive style. This cognitive style helps to guard against falling into faulty thought patterns that lead to harmful behaviors. Before, a person might watch an Internet video about abused children and become overwhelmed with anxiety and guilt. Escaping these emotions, they might get onto Internet forums discussing the video, where their guilt would be redirected toward scapegoats like Soros and Clinton, and their anxiety might be repurposed into anger and plans for retaliation. After cognitive therapy, however, a patient would learn to recognize their emotions, including unpleasant ones. Instead of seeking to escape them in online forums, they would follow psychological techniques to reduce anxiety and guilt. Instead of adopting problematic beliefs (pedophiles and cannibals are running the world), they would question them.

For QAnon followers, cognitive therapy would address not only their beliefs in conspiracy theories, but also the cognitive and emotional needs that brought them to QAnon in the first place. Maybe the thought of COVID-19 made a woman feel like a bad mother for not being able to protect her children from the virus. Or maybe it was the thought of pedophile priests and politicians preying on young kids. This anxiety then pushed her into the rabbit hole of QAnon conspiracies. Cognitive therapy would help such a patient work through her cognition: What does it mean to be a “good mother”? Can any mother protect her children from EVERY danger? If not, how could one still be a good mother? What are some things that make a good mother? A homework assignment for that week might be to observe and note the actions the patient performs routinely that she would recognize as good parenting (e.g., feeding, helping, or hugging the child) and maybe to add one new “good mother” behavior that she wouldn’t normally do (e.g., reading to the child before bed). In subsequent sessions, the patient would reassess her beliefs about parenting in general, and about her own parenting.

For those whose QAnon descent was precipitated by loneliness and isolation, cognitive therapy could address their social needs. In this case, therapy goals could include learning and practicing the social skills required to shore up a social network outside of QAnon. Homework assignments could be making small talk with a neighbor, saying hello to a stranger, or making a phone call to a friend or a family member. With the therapist’s help, the patient would review these experiences, noting problematic cognitions as well as what helped to overcome them.

To summarize, cognitive therapy can address the needs that give rise to conspiratorial beliefs. It can also offer the tools to rebuild the patient’s social, emotional, and personal life.

Support Groups

Loneliness, exacerbated by the COVID-19 pandemic shutdowns, was one of the major forces that drove people into QAnon. Once they were deep in QAnon, social isolation only got worse. This is how QAnon Defector Jitarth Jadeja described it:

“I think superficially it did seem like [QAnon] gave me comfort,” Jadeja said. “I didn’t realize the nefarious kind of impact it was having on me because it was very insidious how it slowly disconnected me from reality.”

“No one believes you. No one wants to talk to you about it. . . . You get all angsty and crabby and whatnot. [S]uch shouting, irrational, you sound like the homeless guy on the street yelling about Judgment Day.”50

The online QAnon community provided an illusion of instant, omnipresent connection that is hard to find in the real world. Former QAnons especially, who have often lost contact with family and friends over their beliefs, find that leaving QAnon puts them right back where they started: feeling lonely. Therefore, decreasing social exclusion must be a priority to prevent former QAnons from relapsing into their old ways. Support groups can help.

There is already a growing number of online forums that offer a safe space for Defectors to commiserate and to support one another. Popular subreddits such as r/QAnonCasualties and r/ReQovery invite both former QAnons and people affected by QAnonized loved ones to share their experiences and tips for life after Q.

Additionally, former cult members, like Steven Hassan, founder of the Freedom of Mind Resource Center, are spearheading efforts to help Q followers re-enter the world outside of QAnon.51 Hassan himself was recruited into the Reverend Moon cult at the age of 19. After years in the cult, he left, and has since turned his defection into a career of helping other victims of mind control. Hassan says that the process of indoctrination and radicalization into a cult is identical whether it’s the Moonies, Scientology, or QAnon,52 and the process of de-radicalization is also similar. Former members of cults (like Scientology) or of radical groups (like neo-Nazis) might be invaluable to recovering QAnons, Doubters and Defectors alike, by sharing the challenges they faced after defection, as well as success stories in overcoming those challenges.

Another useful model of social support for those trying to break a harmful habit has been perfected by Alcoholics Anonymous. Regular meetings with others (QAnons Anonymous, perhaps) who wouldn’t judge because they have been through the same lows would provide the face-to-face contact so many QAnons crave. Former QAnons might also benefit from having a sponsor they could count on when they need support and guidance, someone who would answer their calls and not turn their back on them no matter what their beliefs. By having a specific person to turn to in moments of crisis, who would support without judging, QAnon Defectors would fulfill both social and emotional needs.

Volunteering

Volunteering is a somewhat counterintuitive source of psychological and physical well-being for women and men, young and old.53 People who volunteer, whether with official charities or simply by helping their friends and neighbors without expecting anything in return, enjoy a variety of benefits: more happiness, higher self-esteem, and more self-confidence, as well as greater physical health and longer lifespans. One explanation for this surprising effect is that volunteering allows people to interact with less fortunate others, which makes volunteers feel more grateful for what they have and happier as a result. Another is that volunteering channels stress and anxiety into productive action, relieving the negative effects that stress biochemicals have on the body and on the mind. Both of these mechanisms would be especially beneficial to former QAnon members.

Instead of surfing the Internet, sick with worry for imaginary abused children, why not spend time and energy volunteering at a local children’s hospital, or at the local library’s story hour? Seeing actual children, and actually helping them, would bring incomparable moral and emotional rewards, rendering the appeal of QAnon moot. Women can help other women care for children, or volunteer to take over cooking one night a week, so a struggling single mother next door can have a break. Women whose gender identity feels threatened can reaffirm it through these traditionally feminine actions, while at the same time building social connections. Volunteering is a feel-good fest where everybody wins, like on an Oprah show: You get a benefit, you get a benefit, everybody gets a benefit.

Volunteering can also focus on helping other former QAnon followers. Defectors are best equipped to help Doubters get out of QAnon’s grip. With the wealth of their own experiences, Defectors can serve as sponsors/mentors in the AA-type support programs. They can also become elves in online forums, fighting trolls and disinformation. They can volunteer to serve on focus groups to work out alternatives to conspiracy theories that would then be used by bots to flood the conspiratorial space. Finally, former QAnon members would be of great help in programs to educate people on media literacy and teach them critical thinking. Outreach programs at schools and among vulnerable populations could use Defectors’ expertise and firsthand knowledge to educate the general public on the dangers of disinformation.

These volunteering activities would help people transcend the trauma and disappointment of their QAnon membership. By giving their lives purpose and meaning, volunteering could serve their personal needs far better than QAnon ever could. As a bonus, former QAnon volunteers would reap all the science-backed rewards of meaningful sacrifice: better quality of life, better health, and more happiness.

Through rebuilding social connections and volunteering, former QAnon followers would be less susceptible to being manipulated into feeling outrage, fear, or anger. They would therefore be more resilient to attempts at weaponizing their reactions through social media. And investing personal time and effort into fellow Americans would help bridge the social divisions that carry the potential for QAnon-style mass radicalization.

Adjusting Expectations

With so many people in the United States believing conspiracy theories, those of us who do not are not immune to QAnon’s damage. Whereas the three previous counter-QAnon approaches (building immunity, limiting exposure, and offering treatment) primarily targeted QAnons, the fourth prong, adjusting expectations, focuses on those of us outside QAnon.

A well-established finding of radicalization research is that radicalization on one side begets radicalization on the opposing side.54 When someone attacks us, we want to attack back. When someone questions our way of life and our social norms, we feel anger toward them that can boil into aggression. The terrorist attacks of 9/11, for example, resulted in the American public’s support for unnecessary wars abroad and for violations of constitutional rights at home. We became mass radicalized. The costs of that mass radicalization are still haunting us 20 years later. Adjusting expectations about conspiracy theories can help us avoid similarly costly mass radicalization in response to the threat of QAnon.

Conspiracy theories have been part of human civilization since its beginning. Even the major tropes of QAnon conspiracy lore have remained consistent over hundreds of years: evil Jews, child abuse, harmful substances that the powerful use to their advantage, a secret cabal that runs the world. These falsehoods have survived world wars, pandemics, natural disasters, and regime changes. Their popularity waxes and wanes—up in times of uncertainty and shifting social norms, and down when society is functioning like a well-oiled mechanism, with the government and social institutions fulfilling people’s expectations. It would be nice if conspiracy theories disappeared forever and everyone were rational and reasonable. But it’s not very likely. Conspiracy theories will probably stick around, whatever becomes of QAnon.

With this in mind, measures to counter QAnon’s messaging and images are no more important than measures to counter our own psychological reactions to QAnon. Here, we recommend comprehensive mindfulness training.

Mindfulness Training

Mindfulness-based stress reduction (MBSR) is a clinical program that was developed and systematized by Jon Kabat-Zinn,55 a molecular biologist whose interest in Buddhism inspired him to study how some Buddhist practices affect the physiological stress response. MBSR combines meditation, breathing exercises, and mindful movement (yoga, tai chi, or chi gong) with practicing awareness. Clinical studies have documented beneficial effects of MBSR not only on clinical disorders, including depression and anxiety, but also on general well-being. Blood chemistry analyses confirm MBSR’s positive effects on stress hormones. MRI scans of individuals trained in MBSR show changes in their brain structures consistent with a reduced stress response and a less reactive psychological style. In short, MBSR helps fortify psychological defenses and reduce stress by training people to connect with the present moment rather than worry about the future or regret the past.

For those of us whose loved ones have been swept up in QAnon conspiracies, MBSR could offer a much-needed sanctuary from the stress caused by frustration and helplessness. As with airlines’ instructions to put on your own oxygen mask first, the first priority in dealing with a crisis should be taking care of oneself. Only after our own psychological needs are met can we be of help to others. Especially if your loved one is a Diehard QAnon, mindfulness would help to adjust expectations and guard against soul-crushing disappointments.

A typical MBSR course brings together a group of people once a week, for eight weeks. Unlike therapy patients, MBSR students are not expected to talk about themselves. Unlike a therapist, an MBSR teacher won’t ask, “What brings you here today?” The assumption is that everyone’s life has stress and struggle, and everyone can use mindfulness to help deal with it. Whatever brings people into MBSR, they learn to be fully present in the moment. By focusing on immediate physical sensations, emotional experiences, and arising thoughts, MBSR students learn to recognize that all of these are transient. Sensations, feelings, and thoughts come and go—none are permanent. Learning to let go of persistent thoughts that can otherwise become obsessions or ruminations reduces stress. Mindful movement connects people with bodily sensations, helping them to recognize the physiological markers of stress. Meditation and breathing exercises offer an avenue to distance oneself from nagging worries and self-defeating patterns.

Without mindfulness practice, we are reactive to the vagaries of being. A bad phone call or a negative online post can push us into a downward spiral. Our shoulders hunch, breath becomes shallow, we grind our teeth. All of this feeds the body’s stress response, sending stress hormones throughout tissues and organs, and releasing anxiety-producing chemicals in the brain. An hour later, we find ourselves still obsessing, only now our entire life—past, present, and future—is somehow tied in with that damned phone call or stupid post. We feel not good enough, defective, defeated.

With mindfulness, life’s vagaries would still be there, but we would learn to catch our reaction to them before they become overwhelming. We would notice our posture—and straighten the spine. We would take a deep breath—and then another. And another one after that. We would get up and stretch or take a walk outside. The negative thoughts would still come, but we would know not to engage with them. Like clouds, they would pass over our mental horizon. Instead of being at the mercy of life’s challenges, mindfulness allows us to practice being in charge of our reactions to them.

The positive effects of MBSR last long after the eight-week course is over. For many, the mindfulness they learn in an MBSR course becomes an important part of their lives. With the positive effects of mindfulness well established in research findings, many health insurance plans now cover the cost of the eight-week course.

MBSR is a systematized and well-studied way to learn and practice mindfulness, but it is not the only way. Books that teach mindfulness techniques can be picked up at your local library for free. There are online videos on YouTube that can demonstrate yoga and breathing exercises. Many apps available for smartphones and tablets teach and help you practice meditation, awareness, and mindful movement.

While QAnon followers may be on the extreme end, the Internet and social media can be addictive for all of us. Content that stirs fear, anxiety, and anger is the most engaging, and therefore the most prevalent. The illusion is that we are controlling it, by clicking on the app icon, by reading when and what we choose. But in fact, the Internet can be a medium for controlling us: our attention, our emotions, our opinions and behaviors. What’s worse, Internet companies are only too eager to encourage our unhealthy dependence on them. Awareness is the first step toward maintaining agency and avoiding manipulation. Cultivating mindfulness about our online exposure would do wonders for our psychological health.