The Google walkout had shown cracks in the diehard faith many Googlers placed in their corporation. Something similar was happening on YouTube’s platform.
The people who made and populated the site in its earliest days believed in YouTube, as a shared project and shared community. As YouTube grew and splintered, the faith of its loyal creators and fans gradually eroded. By now, plenty had given up on the institution entirely, a widespread contagion in the Trump era. They were prepared to channel their fury.
It started with Will Smith.
YouTube: “YouTube Rewind 2018: Everyone Controls Rewind.” December 6, 2018. 8:13.
The former Fresh Prince kicks off YouTube’s annual year-in-review clip, the one YouTubers and fans pick apart, mining for faces, trends, and phenoms. Images whiz by. A cast of dozens of YouTubers celebrate K-pop, Fortnite, ASMR, “Baby Shark,” charity, drag performers, “all women in 2018 for finding their voices.”
To a YouTubeland outsider, this video might seem harmless.
Not so. After two years of economic turbulence and rapid upheavals, YouTubeland picked this moment to let loose. The video showed old-media dudes (Will Smith, John Oliver, Trevor Noah), an affront to many native YouTubers. It featured many creators who didn’t produce videos in English—mega-YouTubers from Korea, Brazil, and India—who were unfamiliar to boisterous American fans. And it glossed over big, indecorous moments from the year: beauty guru feuds and Logan Paul’s newfound, hyped-up boxing career. The 2018 Rewind felt alien, ad-friendly. Corporate.
The mob spoke. Within a week more than ten million people clicked a thumbs-down icon on the footage, making it the most disliked video ever. Naturally, YouTubers made videos about the Rewind video. In one of his, PewDiePie told viewers he found YouTube’s marketing reel “so disconnected from its community and creators.” Although, he added, there were now too many YouTube stars to manage; some two thousand channels had more than one million subscribers: “To really please everyone, it’s going to be impossible.”
Since the “Death to All Jews” disaster, Felix Kjellberg had gone wilder on-screen. He started a new format, “Pew News,” riffing on media critics and fellow YouTubers, raging like Network’s Howard Beale. He grew his beard out to Tolkien-dwarf length. He dropped the n-word in a gleeful moment on a video game livestream off YouTube, prompting an apology on YouTube (“I’m an idiot”) and another critical news cycle. In one “Pew News” clip, he dissected Logan Paul’s apology tour: after his Japanese forest stunt, the star went on daytime TV and made a doe-eyed video on suicidality. People had advised Kjellberg to do the same, but that felt “very disingenuous,” he told viewers. “I would rather just show people that I’ve changed through my videos and time.”
Skipping an apology tour may have hit his earnings, but not his audience size. By the fall of 2018 PewDiePie had more than sixty million subscribers whose loyalties had cemented during his publicity woes. Still, his audience wasn’t growing fast enough for him to keep his crown.
That August, Social Blade, a sort of sabermetrics for YouTube, published a chart showing PewDiePie on track to lose his title as most-subscribed YouTuber. His challenger: T-Series, an enormous Indian record label, Bollywood studio, and entertainment juggernaut. T-Series started posting frequently on YouTube as cheap smartphones spread across India, introducing the internet to tens of millions of Bollywood devotees. Much of this audience had never owned computers or even TVs. T-Series was the music hitmaker, box-office mogul, and pop-culture factory that YouTube’s Head had forever courted, all wrapped into one. Being an Indian hitmaker also dovetailed nicely with Google’s fervor for its “next billion users.”
And yet, for much of YouTubeland, T-Series was an invader: big, corporate media that pumped out dozens of polished YouTube videos a month. In truth, few Americans had heard of the channel or cared about it until it began to encroach on PewDiePie, the embodiment of YouTube’s freewheeling culture. Somehow PewDiePie, a Swede whose entire career depended on advertisements sold by one of the largest global corporations, became an antiestablishment figurehead. Kjellberg rose to the occasion. In October he made a diss track video, “bitch lasagna,” addressed to T-Series, rapping with an Eminem inflection and edgelord lyrics. (“I’m a blue-eyes white dragon while you’re just a dark magician.”) It was classic PewDiePie—ludicrous, laden with internet in-jokes (“bitch lasagna” referenced a meme of an Indian man clumsily courting a woman online), a caricature of something, though it was hard for outsiders to tell what.
The rallying cry formed: “Subscribe to PewDiePie!” Its force shocked YouTube, which had by now grown all too accustomed to shock. An impish YouTuber, part of a Paul brother posse, purchased a Times Square ad for the cause, and Logan Paul himself commanded his Logang to support PewDiePie. Jimmy Donaldson, a.k.a. MrBeast, a rising YouTube phenom known for on-screen stunts of charity and excess, purchased billboards in his hometown of Greenville, North Carolina, that read, calling all bros! you can save youtube. subscribe to pewdiepie. Others conveyed that message by hacking printers, streaming devices, and (poetically) smart cameras from Nest, which Google owned. A self-propelling internet meme took flight. PewDiePie added millions more subscribers.
In December, after the botched YouTube Rewind, the company decided it needed to appear hip to the criticism coming its way. Wojcicki told staff even her kids found that Rewind video cringeworthy. As an act of self-awareness, YouTube’s marketing team prepared a playlist of top videos reacting to Rewind, a task that fell to Claire Stapleton. There was much consternation about including PewDiePie’s piece, which obviously was popular. Kjellberg had a YouTube senior partner manager, Ina Fuchs, a German executive, who kept in touch with him, but since the 2017 fiasco the company had severed public ties with its biggest star. (YouTube continued to run ads on his suitable videos, of course.) To YouTube brass this felt increasingly untenable, particularly given the groundswell of support for him, which often landed as condemnation of the company. Fuchs and other colleagues argued that Kjellberg had been misunderstood and deserved more corporate support. Such decisions were not made lightly.
Stapleton was interrupted during a massage on Google’s campus with missive after missive from her boss about including PewDiePie in the Rewind playlist and getting the messaging just right. Email debates ensued over whether YouTube’s Twitter account should click the ♥ below one of his tweets. Stapleton believed it shouldn’t. The YouTuber, she argued, was “irresponsible with his influence.” She refused to put his video on the playlist.
But it appeared anyway. Her manager had asked another marketer to add it, going around her entirely.
Stapleton should have seen it coming. After the walkout, when she appeared in the Times and on TV broadcasts, a colleague warned her that such visible actions, all planned and unspooled inside the company, invited a response. The colleague quoted the civil rights activist and writer Audre Lorde: “For the master’s tools will never dismantle the master’s house.”
Stapleton later recalled the lesson this way: “If you become inconvenient, then your days are numbered.”
She had become the face of the walkout, along with Meredith Whittaker, a Google researcher and ringleader of protests against its Pentagon contract. Whittaker argued fervently that her company was taking dangerous ethical missteps with its artificial intelligence. She had been at Google since 2006, a veteran, like Stapleton, which made them useful advocates. They were also both white. A YouTube colleague who wasn’t white once told Stapleton that they agreed with her on issues like PewDiePie but lacked the privilege to go nuclear on the bosses.
Google’s C-suite support for the walkout did not last. Its organizers hadn’t stopped at a march but presented five demands, which included an end to pay inequity and an employee seat on the corporate board. Shortly after the walkout Stapleton and a few other women from YouTube held a private meeting with their CEO. Wojcicki had confided to some staff that she knew nothing of charges against Andy Rubin and felt disgusted by them. In this meeting her employees raised concerns about YouTube’s gender pay gaps and its scarcity of Black leadership. Wojcicki indicated ignorance of these gaps and told them YouTube would right these disparities. After the meeting disbanded, a colleague turned to Stapleton and said, “She’s completely lying.” Wojcicki knew the data, they concluded, but was deflecting responsibility. “It was just lip service,” Stapleton recalled.
That would be her last sit-down with the CEO. In January, Stapleton’s manager informed her that her role was being “restructured.” Officially, this was an employee “reorg,” a regular occurrence at Google, but Stapleton lost several responsibilities and half the staff she managed, so she suspected other motives. She took her frustrations up the corporate chain, where she was advised to “rebuild trust” with her manager, maybe take some days off. The writing appeared on the wall.
March came, and she was invited to fly to California for a Well-Being Code Red retreat. “Oh this will be fun,” she wrote to a colleague, in email deadpan.
On Thursday, March 14, Jennie O’Connor arrived at YouTube’s “intelligence desk.” The company had formed this division to mitigate risk in early 2018, after the Elsagate crisis. As its leader, O’Connor was responsible for looking past YouTube’s Long Tail to spot troubling threats on the horizon and anticipate messes so YouTube’s moderators and machines could adequately address them. O’Connor recruited former intelligence officials and creator managers to keep a finger more firmly on the site’s pulse. A twelve-year company veteran, she could “speak Google,” said an old colleague. Most critically, she had worked on product as a deputy for Neal Mohan. “Unless you’re in product or you’re writing code,” explained Hong Qu, a former YouTube designer, “you can’t influence anything.”
In her new job O’Connor got up to speed on ISIS. A former high school math teacher, she also got acquainted with the wild things kids were now up to. Like the “condom challenge” (dropping a prophylactic filled with water over a person’s head to form a fishbowl helmet, good internet fodder). Sometimes O’Connor’s unit was caught off guard, as in February, when a YouTuber exposed how pedophiles used coded links and phrases in comments below clips of children, sparking another wave of advertiser boycotts. O’Connor’s team moved fast. Within two weeks it stripped comments from millions of videos, released a more efficient AI comment sorter, and set harsher penalties. O’Connor had set up a rotating global team of “incident commanders” to be on call for such immediate disasters. War rooms, intelligence, incident commanders—the militant language made everyone feel as if YouTube were battling adversaries.
That Thursday, U.S. senators rejected the president’s emergency measure to build his border wall, prompting Trump to tweet, “VETO!” A Googler in Japan broke the Guinness World Record by calculating pi to thirty-one trillion digits. O’Connor’s workday ended without incident. She left YouTube’s office and began to settle in at home when the emails about New Zealand arrived.
The terrorist was Australian and twenty-eight years old. He grew up in a city north of Sydney, playing video games and perusing message boards like the backwater forum 8chan. His parents divorced, and his mother entered an abusive relationship. Before the terrorist reached twenty, his father had died of cancer following an asbestos exposure, which resulted in a settlement and left his son significant money. The terrorist traveled often, usually alone, and, according to a later government report, “he did not form enduring relationships with others.” He was white and considered himself European, believing both to be signs of superiority and identities under dire threat from rising immigration levels, particularly of Muslim migrants, his version of the Great Replacement theory that had spread online.
He watched YouTube and subscribed to channels. He posted in the Lads Society, a far-right internet clubhouse relegated to a private Facebook forum after the social network swept its public groups clean. This was likely what his final line before committing mass murder referred to: “Remember, lads, subscribe to PewDiePie.” There was no evidence he watched PewDiePie’s material or felt inspired by it in any way. This line was spoken to get attention.
In early 2017 the young man did make donations to a U.S. white nationalist think tank and to Freedomain Radio, the network from Stefan Molyneux. (In a statement, Molyneux said he “immediately condemned the New Zealand terrorist.”) On a trip through France that spring, the terrorist saw migrants walking in a shopping mall. “Invaders,” he called them, later writing online that this epiphany led him to violence. But even before, there had been signs. His family would tell authorities that in late 2016 he returned from a trip “a changed person”—hardened, extreme, often discussing how Muslim migration heralded the demise of the West and world. His mother worried about his mental health. “Patriots and nationalists triumphant,” he posted online after Trump’s election. Later, in his Lads group, he wrote, “Our greatest threat is the non-violent, high fertility, high social cohesion immigrants.” He read and absorbed material on the “Great Replacement.” Books, forums, 4chan, Facebook groups.
Above all else, one service had undue influence, according to a report from the New Zealand government, which interviewed the terrorist after his act. This report concluded, “The individual claimed that he was not a frequent commenter on extreme right-wing sites and that YouTube was, for him, a far more significant source of information and inspiration.”
Two days before his attack, he posted on his Facebook page several dozen links, including fertility rate figures and a British tabloid on Asian gang violence. He linked to many YouTube videos: speeches from a 1930s British fascist; news footage of Europe in disarray; Russian bombers over Syria; a biped robot marching to a German military soundtrack. Next to a Latvian folk song video, he wrote, “This is what they want to destroy.” He later told investigators he used YouTube tutorials to assemble firearms for his attack.
He had moved to Dunedin, in southern New Zealand, in 2017 and lived there without incident. After his crimes New Zealanders scrambled to make sense of the mass murderer. “He was a total nothing,” recalled Kirsty Johnston, a reporter who investigated his life. “A garden-variety racist. He had money and time.”
Haji-Daoud Nabi had an adoring family and an admiring community in Christchurch. Nabi was seventy-one, a grandfather who had moved from Afghanistan in the 1970s and still wore traditional pakol caps. He fixed old cars and liked driving guests visiting the city to his mosque. He mentored fellow migrants in New Zealand but had also embraced his adopted land. His funeral featured a caravan of Harley-Davidson motorcycles, which he loved. “He was as Kiwi as he was Afghan,” a friend remembered. Nabi called everyone brother.
On that fateful afternoon of Friday, March 15, Nabi stood at the door of his mosque, Al Noor, to welcome fellow faithful. Shortly after 1:40 p.m., he encountered a man carrying an AR-15 and a body camera set on record who would take Nabi’s life and those of fifty other souls at Masjid Al Noor and another Islamic site. Nabi greeted him warmly, “Hello, brother. Welcome.”
Jennie O’Connor opened her work laptop on her kitchen counter soon after, still Thursday evening in California. Colleagues had notified her of a mass shooting in Christchurch, where the attacker had streamed his murders live on Facebook. And that the stream had arrived on YouTube.
A protocol was in place. YouTube would quickly categorize the footage on its violence gradient and write corresponding rules for moderators and machines. O’Connor decided the video should come down, set those wheels in motion, and, as the night wore on, eventually tried to get some rest.
Tala Bardan, the YouTube violent extremism specialist, awoke early to see news of the shooting on her Instagram. She immediately wept. She wiped her tears, opened her computer, and began watching the terrorist’s footage. Even harder to watch were the bystanders—clips of worshippers and neighbors crying in disbelief. Bardan helped write the guidance for moderators: scrub any re-uploads or clips praising the violence, but be careful not to delete news coverage. She took a taxi to the office to work uninterrupted; she would spend all weekend reviewing violent footage from home, her husband bringing plates of food to her desk. YouTube was bombarded with tributes to the Christchurch death reel; hatemongers or devious trolls had spliced the footage in ways to outsmart machine detectors. Bardan and her co-workers in Europe and Asia frantically tried to put out fires while California slept.
O’Connor awoke Friday morning to learn the protocols weren’t working. At first she had thought YouTube needed more screeners. “We don’t have enough reviewers,” she pleaded overnight. But by morning even that solution wasn’t adequate. At one point a new copy of the shooting video was uploaded to YouTube every second. An executive later said that the alarming speed of the re-uploads led some inside the company to suspect that a state actor was involved. Mohan, O’Connor’s boss, would describe it as “a tragedy that was almost designed for the purpose of going viral.”
Virality had been a gift to YouTube so many times throughout its history. YouTube was designed as the internet’s bottomless repository—videos first broadcast elsewhere, like the Christchurch livestream, could easily jump to YouTube and take off. To remain relevant, YouTube had rewired its algorithms to promote more breaking news footage so people who once flipped on the TV after a mass shooting would open YouTube instead, which they did. Even oddities of YouTubeland, like the “Subscribe to PewDiePie” cry, were now major news events. Unlike social networks, which had clunky search features, YouTube offered remarkably easy ways to find anything. All those mechanisms that let YouTube flourish as a business, tools created with little regard for unintended disasters they might bring, had combined into a nightmare fuel that the company couldn’t turn off.
O’Connor was on a call en route to work when the decision was made. YouTube would remove any footage showing the Christchurch shooting, not just exact re-uploads. And it would cut the ability for viewers to search for the tragedy at all, removing an entire category from its searches for the first time. The company called off human reviewers, who couldn’t move fast enough. YouTube turned its filter dials up and handed over control to its machines.