Epilogue

From an insurgent underdog in entertainment, a money pit, something of a joke, YouTube had become one of the most dominant, influential, untamed, and successful media businesses on the planet. In less than two decades. At times Steve Chen could hardly believe it.

Back when Chen wrote YouTube’s first lines of code, he struggled to get the audio and video files to sync. When he left the company five years later, an astounding one hundred hours of footage were uploaded every minute; that figure, by 2020, was well over five hundred hours. In the intervening years Chen’s health improved, and he had teamed up again with Chad Hurley on a digital media start-up. When that didn’t pan out, Chen settled into a role as a wizened industry sage, reminiscing about the era when playing video on desktop computers required great engineering feats. He had moved to Taipei, his birthplace, with his young family, and watched with awe as taxi drivers streamed YouTube on their phones. In his son’s elementary school class, all but two kids said they wanted to be YouTubers someday.

YouTube’s troubles with bad actors, conspiracies, political speech, and irate politicians, its sheer operational bulk—all that was beyond anything Chen ever imagined. “To be honest,” he admitted, “I kind of congratulate myself that I’m no longer with the company, because I wouldn’t know how to deal with it.” Jawed Karim, YouTube’s third co-founder, had become an investor and only commented on his old company when it made changes that irritated him, like removing the public counts from the video “dislike” button.

During the pandemic, Chad Hurley, like many accomplished, restless men, took to Twitter. He posted inane jokes and slung insults at Trumpies and tech bros with the gleeful abandon of someone who no longer had a corporate job. Hurley financed companies and basked in the glow of being a father of the creator economy when creators were all the rage.

As the Trump era began to fade in 2021, suddenly every company wanted in on the industry YouTube had built. TikTok, the red-hot app, began paying select video makers, triggering a deluge of young fame seekers. Rivals Twitter and Snapchat nervously followed suit. Facebook, yet again, made a push to recruit influencers, pledging to spend more than $1 billion on creators and vowing not to take commissions for multiple years. Spotify paid hundreds of millions to recruit podcasters like Joe Rogan, who had used YouTube to build a media powerhouse entirely outside the mainstream. Venture capitalists went wild for “web3,” an internet model based on cryptocurrencies that imagined regular people owning and profiting from their online activity; this was YouTube’s creator economy taken to its next extreme. Sequoia, YouTube’s first investor, minted its 2005 YouTube investment memo as a “non-fungible token,” which a crypto enthusiast purchased for $863,000 in digital coin.

There was business logic behind this rush to embrace creators. The pandemic turbocharged online entertainment and commerce. At the same time, the Web 2.0 model of targeted advertising was being dismantled by regulators; companies had a harder time marketing things online. Creators were great marketers and salespeople. But perhaps the internet businesses, facing continued political scrutiny, had also decided that paying people who produced the content that made them so rich might look good, optics-wise. Or perhaps the pandemic’s upheaval—like the financial crisis, which had helped jump-start YouTube’s economy a decade before—had convinced enough people that working for these platforms, even without security, benefits, or guarantees, beat a day job.

And so the world that YouTube ushered in—of abundant content and creativity, of influencers and online hustlers, of information overload and endless culture wars—became more of our own.


All this renewed competition only underscored YouTube’s unrivaled power. The company’s battle scars from copyright fights, ad boycotts, and years of creator turmoil had produced a compensation system that worked like nothing else. No other platform distributed video and money as effectively. Creators tinkered on TikTok and Instagram, sometimes cashing in handsomely, but they made reliable money on YouTube. Other companies trying to replicate YouTube’s creator economy now had to deal with YouTube’s old firestorms. TikTok stars appeared in the tabloids. Spotify saw weeks of outrage over Joe Rogan’s comments on COVID-19, while YouTube, Rogan’s home base for years, skated by untouched. U.S. senators tore into a Facebook official over Instagram’s damage to teenagers on the same morning that Kyncl, YouTube’s chief business officer, cheerily briefed reporters on a study touting how much his business contributed to the economy.

Throughout its history, YouTube tried to position its platform as something that it wasn’t—a premium service, a destination for Hollywood, a manageable and sanitary place with only a few “bad actors,” a great equalizing force. That tension between what the company wants and what it has will never end. But YouTube has learned to live with it, or at least run a prosperous business from it. In the summer of 2021, YouTube posted its greatest quarterly ad haul ever, more than $7 billion, on par with Netflix’s sales. YouTube announced it had paid broadcasters more than $30 billion in a three-year span. (Although it didn’t specify how much of that bounty went to creators rather than media companies and record labels.) For the first time YouTube even began running ads on channels that didn’t qualify for its partner program, confident it could open up its Long Tail without brand safety disasters. YouTube kept every dollar from these commercials.

After some rocky years YouTube’s C-suite drew closer to its stars. Big-name YouTubers praised Kyncl, who lavished attention on a creator class long neglected by the company; Casey Neistat called him “wildly proactive.” Matthew Patrick lauded Ariel Bardin, the executive he once chided for not understanding YouTube, as “an incredible advocate for creators.” (Bardin would leave YouTube in late 2020.) YouTube gave creators more ways to make money beyond ads, like fan sponsorship and merchandising. Company managers asked creators for advice on reducing burnout and commissioned a therapist to post videos on the topic. YouTube even managed to improve its comments section. “It’s gone from the lowest level of hell on the internet to a fairly pleasant experience,” said Natalie Wynn.

Many creators got enough steady viewers and cash that they no longer felt pulled to Hollywood. Lucas Cruikshank acted in three movies as Fred Figglehorn, his squealing YouTube persona, but after growing exhausted from performing in front of a crew, a director, a whole to-do, he went back to posting solo on YouTube, where, he said, “there’s no pressure at all.” Justine Ezarik, a.k.a. iJustine, marked her seventeenth straight year on YouTube, still scripting, producing, and starring in each video herself. Movies and TV never offered that degree of creative control. “You don’t own that,” she said.

Wojcicki began referring to creators as “the heart of YouTube.” And it seemed her company had begun to appreciate their non-pecuniary value. YouTubers spotted each corrosive trend—the troubling kids’ stuff, the bullies, hucksters, con men, and extremists—before the company did. “You have to watch your platform,” Patrick once admonished YouTube in a private meeting. It seemed, at last, like the company was.

And yet, YouTube still followed the Google playbook. Graham Bennett, a senior partner manager who worked closely with stars, described his role as “the least Googley part of YouTube”; his job couldn’t scale. YouTube’s multichannel networks once fulfilled some of these managerial duties, but they had all withered or collapsed. Bennett wished YouTube could do more for creators, although from Wojcicki’s perspective, he said, deciding between hiring one more version of him or hiring another engineer “is tough.” (YouTube doesn’t share how many engineers and senior partner managers it employs.) Efforts to organize YouTubers, like Hank Green’s Internet Creators Guild, had died. And to some, as long as YouTube continued to be an ads business that demanded mass audiences and expanded like a universe, the company’s pledge that it was a stool standing on three equal legs—viewers, advertisers, and creators—just wasn’t true. Creators always got the short end. “It’s like in Animal Farm,” said Andy Stack, a YouTube manager who left in 2015. “Some are more equal than others.”

By 2022 YouTube had revamped its content strategy again, dropping its program to fund subscriber-only shows starring creators—let Netflix, Disney, and Amazon duke it out for paid streamers. Instead YouTube shifted resources into Shorts, a feature for bite-sized videos. It was an obvious TikTok clone, an attempt to fend off the threat that app posed. Old-school YouTubers likened TikTok’s playful canvas to early YouTube, that long-gone era when creative types could experiment and flourish. (“It’s just come out of nowhere,” Wojcicki admitted about TikTok in 2020, even though Google had previously tried to buy Musical.ly, the company that would become TikTok.) YouTube launched Shorts in India, where TikTok was banned, and started a $100 million fund bankrolling creators of these brief clips. It would sort out the business model later. Nearly a decade after tilting its system toward longer videos, YouTube was now paying for shorter ones. Of course, the main algorithmic metric for Shorts, like that for all of YouTube, remained watch time.

Most signs indicated that TikTok did chip away at YouTube’s dominance. A 2021 report revealed that for the first time Americans watched more TikTok than YouTube on their phones. But thanks to its smart-TV app and streaming service, YouTube was growing enormously on television screens. YouTube’s sales team still focused on eating into TV’s market share, not TikTok’s, and its product team tinkered with ways for TV viewers to like, comment, and subscribe, making TV even more like YouTube. Besides, TikTok didn’t have stockpiles of yoga videos, bread-baking tutorials, “Let’s Play” gamers, beauty gurus, and billions of hours of toddler fodder. Only YouTube did.

Other tech platforms (namely, Facebook) panicked about the TikTok generation and about disaffected users quitting. Viewers complained about YouTube’s frequent or annoying ads, but they rarely stopped watching. Throughout all its years of tumult, YouTube never fretted about people fleeing.

As one employee put it, “How do you boycott electricity?”


Kids certainly remained loyal to YouTube.

Harry and Sona Jho, the veteran nursery rhyme showrunners, braced for a painful shock as 2020 began. YouTube’s settlement with the FTC meant they couldn’t run higher-priced ads on “child-directed” videos anymore. And, once the pandemic hit, marketers paused commercials everywhere, unsure how consumers would behave. The Jhos watched ad rates crater. But quarantines, it turned out, were very good for their viewership. Kids stuck at home watched like crazy. By the end of 2020 the five most-viewed channels on all of YouTube were preschooler fare. A year into the pandemic Harry Jho cautiously admitted the audience surge had helped his business. “It’s not rosy, but we’re not laying people off,” he said.

And he thought YouTube’s machines had become far more attuned to quality. After the FTC case, YouTube stopped treating its Kids app like an algorithmic free-for-all and assigned staff to curate the selection, like the coolhunters once did on YouTube’s home page. The company started a fund for kids’ YouTubers and told creators it would finance videos that “drive outcomes” associated with subjective traits like humility, curiosity, and self-control. YouTube said its system would reward clips that encouraged young viewers to go do things offline.

“This is about as healthy of an algorithm environment that I’ve ever seen,” said Jho. It felt as if YouTube had relinquished some of its blind faith in machines. It felt as if human beings there were actually involved.

During the pandemic, kids’ YouTube also became ground zero for new-media moguldom. Moonbug Entertainment, a digital studio, purchased three behemoth YouTube channels, assembling an arsenal (seven billion views a month) rivaling anything on cable. By 2020 little Ryan Kaji was a nine-year-old seasoned pro. He had mostly left toy unboxing, the phenomenon that skyrocketed him to fame, for videos on science experiments, “challenge” gimmicks (“Edible Candy vs. Real!!!”), and exercise tips. He got into video games. Early in the pandemic Ryan and his parents posted footage of their chat with a health official about COVID-19. Ryan, a tireless performer, displayed the exaggerated emotional reactions of someone who had spent most of his life in front of a camera.

Still, Ryan’s back catalog continued to put him and YouTube under scrutiny. A 2020 New York Times headline asked, “Are ‘Kidfluencers’ Making Our Kids Fat?” and printed a still from an old video of Ryan playing a McDonald’s cashier. An advocacy group accused him of breaking laws against deceptive advertising to kids. Chris Williams, a former Maker Studios executive, had started PocketWatch, an entertainment company that worked with Ryan and other YouTube child performers. Williams found such critics of his star misguided, comparing them to the scolds who raised a moral panic about video games and rap in the 1990s. Critics, he believed, failed to see the benefit for young audiences of having a relatable figure on-screen—even one as famous as Ryan. “Really, what they mean is, ‘This isn’t Sesame Street,’ ” said Williams. “If that were the bar, kids would never watch anything.”

When Ryan first exploded on YouTube’s charts, his parents set up a production studio to capitalize on his success. They sold Ryan-branded toys, clothes, and bedding in Walmart and Target. They made an animated Ryan character to continue his legacy if, one day, he stopped YouTubing. That character appeared in the Macy’s Thanksgiving Day Parade. Forbes magazine annually listed the richest YouTubers and, starting in 2018, Ryan topped the list. His estimated earnings in 2020 neared $30 million, a sum that shocked practically everyone who heard it. A nine-year-old made how much?

The world still couldn’t grasp how media worked in the age of YouTube. From Williams’s vantage point, Ryan wasn’t just a nine-year-old YouTuber but the centerpiece of a business juggernaut. “I’m from Disney,” Williams said. “The delta between $30 million and Mickey Mouse is pretty big.” There was plenty of room to grow, so long as kids kept watching.


With such an enormous captive audience and payment system, YouTube could count on keeping a firm lead in its industry for years to come. It could count on another advantage, too: being inside Google, the undisputed leader in artificial intelligence. That capability let YouTube build a machine system that, by 2020, could detect obvious red flags—a Nazi symbol or a sexual comment about kids—as quickly as it could spot a copyrighted song. This, YouTube said, meant most “violative” footage came down without a human involved at all.

But Google’s superhuman AI couldn’t solve a messier issue—the endless quagmire about truth and misinformation online.

YouTube tried throwing computer science and its rulebook at this problem. Like other tech platforms, YouTube outlawed certain topics when they became politically untenable. A month before the 2020 election, YouTube banned videos promoting QAnon, the extremist pro-Trump movement. When COVID-19 vaccines arrived, YouTube removed footage that questioned official scientific guidance. (Several Trump videos came down for this reason.) After Russia invaded Ukraine, YouTube zapped Russian state media channels for violating rules against “trivializing well-documented violent events.” (Russia blocked Facebook after the invasion but not YouTube, which was massively popular in the country.) YouTube scrubbed more than a million videos for containing “dangerous coronavirus information.” Under a system called “Golden Set,” staff gave machine detectors thousands of examples marking clear lines in the sand—this video on COVID-19 contains lies; this one does not.

Yet YouTube knew this process wasn’t perfect. “People might think we have this great AI that can drive cars and everything,” said Goodrow, the veteran YouTube engineering leader. “But I don’t think right now we could even simply identify what specific claims are being made within a video.”

Even if it could, few outside the company agreed on the precise definitions of misinformation or disinformation, so debates on the matter typically devolved into political stalemates. And, by and large, YouTube stayed out of the fray. Outrage in the press and halls of power usually focused on the social networks, not the video site. In 2021 President Joe Biden chastised tech platforms for promoting vaccine hesitancy, specifically accusing Facebook of “killing” people with lies. Ire on the right was largely aimed at Twitter, which had booted Trump off the service after the January 6 insurrection. Mark Zuckerberg and Jack Dorsey, the chiefs of Facebook and Twitter, testified multiple times before Congress; Susan Wojcicki never did. YouTube was the sleeping giant of social media.

There were many reasons for this. YouTube was better situated to avoid information warfare. You might see your cranky uncle rant about vaccines on Facebook or Twitter, but probably not YouTube. Political content repeatedly topped Facebook’s popularity charts. YouTube was still dominated by music, gaming, and kids’ videos. YouTube, like Facebook, had indefinitely banned Trump without clear plans for his reinstatement, but Trump wasn’t as big on YouTube, and his absence there drew less attention. And Facebook remained an easier punching bag; in the fall of 2021 a whistleblower released a series of damning documents, including evidence Facebook hadn’t acted swiftly enough to combat lies about COVID-19 vaccines. Some believed YouTube was simply a better managed company.

Or it could be that YouTube was more difficult to see into. It’s fairly simple to identify when someone touts dubious claims about vaccines in a tweet or Facebook post, in text; it’s much harder to do in a long video. YouTube shared relatively little data with outsiders, which helped the company evade scrutiny. After 2020 YouTube started revealing more about its algorithms and disclosing metrics that showed progress on its goals: views of borderline footage and of videos that broke rules, before they were deleted, were low and falling. But YouTube was grading its own homework. No external group audited the results. Take Biden’s vaccine crusade against social media, which was based on findings from an advocacy group that examined statistics Facebook and Twitter shared; the findings excluded YouTube simply because YouTube did not share comparable data. The Facebook whistleblower revealed that Instagram ignored internal research on the damage its app did to the mental health of teenage girls, spurring waves of criticism of Facebook. Afterward, multiple people who worked at YouTube said their company either didn’t share this type of research widely or simply didn’t conduct it.

“YouTube is really opaque,” said Evelyn Douek, a Stanford Law assistant professor who studies content moderation. “It’s much more fun for me to lob stones from the outside. This stuff is hard,” she added. “That doesn’t mean that they don’t have responsibility.”

Privately, people at YouTube, like their peers at Facebook, complained they were being scapegoated for the collapse in democratic norms brought about by cable news, inequality, and God knows what else. One longtime YouTube executive put it bluntly: “Don’t blame the mirror.” It was a common refrain in Silicon Valley, that platforms merely reflected the society that used them.

But YouTube never reflected everything in society. And as regulation dragged on, it started reflecting less and less. In the fall of 2021, YouTube banned false claims about any vaccine and stripped ads from videos denying the reality of climate change. Some praised the moves; some saw them as overreach. Others asked the obvious: What took so long?

Under Biden and the Democrats, who asked that kind of question, YouTube leadership started pushing back. Wojcicki authored an op-ed comparing overly intrusive online moderation to censorship in the U.S.S.R., where her grandparents had lived. Neal Mohan claimed YouTube had seen “disturbing new momentum” in government takedown requests for political reasons. YouTube, he wrote, could aggressively attack COVID-19 falsehoods because health agencies gave official guidance, but it needed to tread carefully on other topics. “One person’s misinfo is often another person’s deeply held belief,” he blogged. He took for granted the assumption, less than two decades old, that any person was entitled to broadcast their deeply held beliefs over mass media.

Still, Mohan was right that YouTube saw disturbing trends. Officials in Russia and India began using the terms “fake news” and “extremism” to demand YouTube remove videos from critics and opposition figures—in effect, forcing the company to choose between its stated values and its desire to be everywhere. Other countries seemed likely to follow suit. In nations rife with conflict and contested elections, YouTube, like other social media companies, hired few people who understood the language or political terrain.

YouTube leaders often discussed how their world-class AI software, while imperfect, was the only system capable of handling the immensity YouTube had created. This was presented as an inevitability, but it was a choice. YouTube had once placed its coolhunter community managers and partners like Storyful, the Arab Spring newsroom, as editors on the front lines to vet, verify, and make sense of the information flood. “We kept trying new things, which the culture of YouTube encouraged,” recalled Steve Grove, its former head of news and politics. YouTube chose to stop many of those experiments. It chose to pursue scale instead. This trade-off kept humans, as flawed as they are, away from one of humanity’s biggest problems—creating a shared set of facts and truth. “Misinformation online remains a threat to democracy,” said Grove, who left Google to become a state official in Minnesota. To address the threat, he added, “curation, in various forms, will always have to play a key part.”

That’s a solution without a clear mathematical formula or ability to scale. Not very Googley.


A few people at YouTube were consumed with these unending debates. But most were busy running the world’s largest online video business.

Claire Stapleton had thought about this more since leaving. She had returned to newsletters, starting an advice column for disgruntled Silicon Valley employees called “Tech Support,” where she ripped into her old employer and still linked to marvelous YouTube videos she liked. She had changed her mind about the PewDiePie affair, seeing now how those drawn-out debates about ♥-ing his tweets were a distraction when a rot festered underneath. “It was a futile fight,” she recalled. “We got so caught up on the aesthetics of the brand when there was a refusal to discuss the real issues around YouTube.” At work they rarely asked the sorts of questions she now did. How had her company, so devoted to organizing the world’s information, built a megaphone and payment system for conspiracists, cranks, and hatemongers? What did it mean that YouTube enticed teen moms to place their entire lives on-screen? Does everyone really need to broadcast themselves? And why couldn’t she look away? For Stapleton, this raised a deeper question: “Is YouTube net negative or net positive for society?”

Everything she put in the positive column—the site’s treasured communities and delightful brilliance—didn’t come from the company. “YouTube doesn’t foster creativity,” she concluded. “People do!”


Entering her eighth year as CEO in 2022, Susan Wojcicki stepped back from the stage. She gave few public interviews. Advertising and media partners who met her frequently during YouTube’s crisis years said they saw her less often, now that those crises were over. Most YouTube viewers probably couldn’t name her. “She is not charismatic,” said Kim Scott, a former Google colleague. “But to her credit, that’s not a bad thing at all. Especially in her role, I think a charismatic leader of YouTube might be a disaster. Charismatic leaders kill these businesses because it’s all about them.”

Wojcicki and her husband, another Googler, ran a foundation that gave grants to several Jewish and interfaith groups, including the ADL, and environmental nonprofits like Earthjustice and Environmental Defense Fund. She was careful not to speak of her personal opinions. When she did appear publicly, she deployed a new talking point: YouTube’s “responsibility” efforts were good for business. Indeed, YouTube’s ad sales nearly doubled in the two years since its major overhauls in 2019. That year Google’s founders, Page and Brin, retired in their mid-forties, leaving Sundar Pichai in charge of Google and Alphabet. The Justice Department proceeded with its antitrust case against Google, and Congress introduced bills to regulate tech competition and “malicious algorithms,” which included YouTube recommendations. Google’s gravest threat, though, seemed increasingly unlikely: the company wouldn’t be broken up. In response to political heat Pichai positioned Google as being “helpful,” a utility people loved. He pushed to shift its ads business, which faced serious backlash, into an e-commerce powerhouse, an underdog to Amazon. YouTube, a place overflowing with how-tos and commercial influencers, became central to both strategies. When Google insiders speculated about who might take over if Pichai left, Wojcicki’s name always appeared on the shortlist.

Some observers believed Wojcicki’s sober disposition and careful management explained why YouTube hadn’t come under the same scrutiny as Facebook. “She cares,” said Jim Steyer, the founder of Common Sense Media, an influential advocacy group. Steyer has hit tech platforms hard on children’s issues, lobbying for regulation to curb tech’s addictive power and business practices. He no longer trusts Facebook. Where YouTube is concerned, “the jury is out,” he said. Though he added, “When Susan took over, it changed my attitude.”

Few in Silicon Valley or Hollywood would describe Wojcicki as a visionary or today’s YouTube as a hotbed for innovation. It’s a tanker, an enormous business and institution steered with small, careful turns. Even if she wanted to, Wojcicki probably couldn’t steer it entirely in a chosen direction. She is a steward of a platform with a life of its own. Running YouTube means dealing with an entity that is “inherently indefinable and ungovernable,” explained one company veteran. “You’re holding on to the reins.” Even so, Wojcicki has managed to contain certain parts of YouTube that had spun out of control. And she still runs a global mass media and economic giant with little transparency and accountability.

“If I was going to make a list of people to have the amount of power that Susan Wojcicki has, she might be on it,” said Hank Green. “But I would just rather not see someone with that much power, especially someone who is unelected.”


When Green sat down with Wojcicki in the early days of the pandemic, the Nerdfighter vlogger managed a rare feat: he got YouTube’s leader to reveal something. Wojcicki was explaining how YouTube bucketed broadcasters into three groups—creators, music labels, and traditional media—when she shared that YouTubers made up “about half” of all viewership.

Green tossed his hands up at the revelation. “Wow! That’s huge!” He beamed.

He had less luck getting somewhere on another front. During their conversation, Green pressed Wojcicki on YouTube’s strategy of financing channels. Why, after failing to get Hollywood on board, had the company given grants to only the biggest YouTube stars to make shows like TV? Why not fund smaller creators to run their own media businesses? “There’s a potential for a large middle class of YouTubers,” Green argued. “I’ve always been of the opinion that YouTube should lean into what YouTube is and—”

Wojcicki interrupted, “We agree with that!”

Green laughed. He knew YouTube better than anyone who didn’t work there, and probably many who did, and he was speaking with the person who had run YouTube longer than anyone else. And yet, it seemed the pair was talking past each other. “We may not agree on what exactly YouTube is,” he replied. “Shocker! Not that anybody does.”

In the end, their conversation lasted nearly an hour. Not long ago no one would have published a video that long on the internet, let alone expected an audience or financial reward for doing so. Green simply uploaded his clip, adding another hour to the billions already accumulating on YouTube that day.