Sixteen

image

AMERICA, DISRUPTED

image
Firefighters searched Ground Zero long after the collapse of both towers.

THE FIRST PLANE CRASHED INTO THE TOP FLOORS OF the north tower of the World Trade Center in New York at 8:46 a.m. on September 11, 2001. CNN broke into a commercial to show live footage of the tower, gray smoke billowing out of a black gash in the steel and glass against a nearly cloudless blue sky. On an ordinary day, some fifty thousand people worked in the Twin Towers, more than one hundred stories high; by a quarter to nine on that particular Tuesday, nearly twenty thousand people had already shown up, wearing hats and pumps, carrying laptops and briefcases. Orders to evacuate or not to evacuate, and whether to go up or go down, conflicted; most people decided to leave, and headed down. As more than a thousand firefighters, EMTs, and police officers raced to the scene and began rescue efforts, some people trapped on the upper floors, facing insufferable heat and unable to breathe, leapt to their deaths rather than be burned alive. One couple held hands as they fell. From far away, they looked like paper dolls.

At 8:52 a.m., Peter Hanson, a passenger on another plane, United Airlines Flight 175, was able to call his father. He asked him to report to authorities that his flight had been hijacked. Hanson, thirty-two, was flying with his wife and their two-and-a-half-year-old daughter: they were going to Disneyland. “I think they’ve taken over the cockpit,” Hanson whispered to his father. The passengers were thinking about trying to gain control of the plane from the terrorists, who’d used knives and Mace and said they had a bomb and appeared to have killed the pilots. At 9:00, Hanson called his father again. “I think we are going down,” he said. “My God, my God,” he gasped. Three minutes later, United 175 crashed into the south tower of the World Trade Center.

Television stations had been covering the fire in the north tower live; announcers and reporters watched in horror as the plane hit the south tower and burst into a fireball. It looked impossible, something out of a 1950s Hollywood disaster film, props, models, wires, and tin, something that could not happen, King Kong swinging from the Empire State Building, Godzilla climbing the Statue of Liberty. “My God, my God,” said a host on ABC News. “Oh Lord.” Sirens shrieked, and from the streets there came a wailing.

At 9:37 a.m., in Washington, DC, a third hijacked plane, traveling at 530 miles per hour, crashed into the Pentagon. The hijackers had intended to crash a fourth plane, United Flight 93, into the Capitol or the White House. This flight, unlike the first three, all of which had departed on time, was running more than half an hour late; it took off at 8:42 a.m. At 9:23, a United flight dispatcher sent out a message: “Beware any cockpit intrusion.” At 9:26, the pilot on Flight 93 responded with seeming disbelief: “Confirm latest mssg plz.” Two minutes later, the hijackers stormed the cockpit. In the moments that followed, ten of the flight’s thirty-three passengers and the two surviving members of the crew managed to make phone calls. They learned about the attacks on the World Trade Center; they decided to fight back. At 9:47, CeeCee Lyles, a flight attendant and mother of four, called her husband and left him a message. “I hope to be able to see your face again, baby,” she said, her voice breaking. “I love you.” Ten minutes later, the passengers and crew, having taken a vote about what to do, charged the cockpit. The plane began to roll. At 10:03, United Flight 93 plowed into a field in Shanksville, Pennsylvania, twenty minutes outside of Washington. Everyone on all four planes died.

In New York, emergency workers had entered the towers, evacuating thousands of people, but the burning jet fuel, at over a thousand degrees, was weakening the skyscrapers’ steel girders. At 9:58 a.m., the south tower collapsed into itself, falling straight to the ground like an elevator shaft, crushing everyone inside. CNN, which had been covering the crash at the Pentagon, cut back to New York, the TV screen showing nothing but cloud upon cloud; for a moment, watching the screen felt like looking out the window of a plane, flying through the white. The north tower fell at 10:28. CNN: “There are no words.”

It seemed altogether possible that there were more attacks to come. “We have some planes,” one of the hijackers had said. Poor communication between the civilian aviation authority and the military’s aerospace command—and the lack of any experience with or protocol for a suicide hijacking—meant that the U.S. military had been unable to mount a defense. About 10:15, the vice president authorized the air force to shoot down United Flight 93, unaware that it had already met its horrible end. By noon, all flights from all U.S. airports were grounded, federal buildings were evacuated, embassies were shuttered, and millions of prayers were whispered. The vice president was moved from the White House to an underground bunker, and the president, who had been visiting an elementary school in Florida, was flown to a secure location in Omaha, Nebraska. Nearly three thousand people had been killed.1

“America Under Attack,” ran the headline on CNN.com, whose coverage that day included videos, a photo gallery, a timeline, statements from leaders around the world, and emergency resources.2 NYTimes.com posted a slideshow, maps, a flight tracker, and a list of places to donate blood.3 The Drudge Report’s homepage displayed a pair of police sirens and the question “Who Did This?!”4 And Foxnews.com began an ongoing special report, “Terrorism Hits America.”5

That night, a resolute president delivered a televised address. “A great people has been moved to defend a great nation,” George W. Bush said. Even before night had fallen, he committed the United States to waging a “war against terrorism.”6

Nineteen men, trained by al Qaeda, an Islamic terrorist organization led by Saudi millionaire Osama bin Laden, had conducted the attacks. Bush’s rhetoric and that of the neoconservatives in his administration characterized the “war on terror” as an inevitable conflict that was part of a “clash of civilizations,” predicted by political scientist Samuel P. Huntington in a 1993 article in Foreign Affairs. Once, there had been wars between kings, then wars between peoples, then wars between ideologies, Huntington argued, but those ages had passed, and the future would be characterized by clashes between the world’s great civilizations, first along the fault line between Western civilization and the Islamic world. Western dependence on Arab oil and the rise of Islamic fundamentalism had already led to the 1979 U.S. hostage crisis in Iran and the Soviet invasion of Afghanistan, and, in 1990, to the First Persian Gulf War.7

“America was targeted for attack because we’re the brightest beacon for freedom and opportunity in the world,” Bush said.8 Barack Obama, an Illinois state senator and constitutional law professor, offered a different interpretation in a Chicago newspaper. “The essence of this tragedy,” he said, “derives from a fundamental absence of empathy on the part of the attackers,” a deformation that “grows out of a climate of poverty and ignorance, helplessness and despair.”9

It became something of a national myth, later, to describe the American people, long divided, as newly united after 9/11. More accurate would be to say that, in those first days, politicians and writers who expressed views that strayed far from the mournful stoicism that characterized the response of both Bush, on an international stage, and Obama, in a neighborhood newspaper, were loudly denounced. These included Susan Sontag, who traced the origins of the attack to U.S. foreign policy in the Middle East—the propping up of tyrants, the CIA toppling of Middle Eastern leaders, and the ongoing bombing of Iraq. “Where is the acknowledgment that this was not a ‘cowardly’ attack on ‘civilization’ or ‘liberty’ or ‘humanity’ or ‘the free world’ but an attack on the world’s self-proclaimed superpower, undertaken as a consequence of specific American alliances and actions?” Sontag asked in The New Yorker. “In the matter of courage (a morally neutral virtue): whatever may be said of the perpetrators of Tuesday’s slaughter, they were not cowards.”10 In the Washington Post, Charles Krauthammer accused Sontag of “moral obtuseness.”11 From the right, Ann Coulter, a columnist who’d earlier worked for Paula Jones’s legal team, wrote in the National Review, in an article posted online on September 13, that drawing any distinctions between anyone in the Arab world was unnecessary, as was any investigation into the attacks. “This is no time to be precious about locating the exact individuals directly involved in this particular terrorist attack,” Coulter wrote. “We don’t need long investigations of the forensic evidence to determine with scientific accuracy the person or persons who ordered this specific attack. . . . We should invade their countries, kill their leaders and convert them to Christianity.”12 Two weeks later, the editors of the National Review announced that the magazine regretted publishing Coulter’s piece, and stopped running her column.13 “I really believe the pagans and abortionists, and the feminists and the gays and the lesbians who are actively trying to make that an alternative lifestyle, the ACLU, People for the American Way, all of them who have tried to secularize America, I point the finger in their face and say, ‘You helped this happen,’” Jerry Falwell said immediately after the attacks.14 But he, too, was condemned, including by the president.15

Alex Jones, cluster-bomb radio host, flew in under the radar of this opprobrium. On the afternoon of the attacks, he broadcast across the country, live from Austin, to nearly a hundred affiliated stations, for five hours. He began, not with sympathy, not with grief, not with horror, but with gleeful self-congratulation: “Well, I’ve been warning you about it for at least five years, all terrorism that we’ve looked at from the World Trade Center and Oklahoma City to Waco, has been government actions,” Jones crowed. “They need this as a pretext to bring you and your family martial law. They’re either using provocateur Arabs and allowing them to do it or this is full complicity with the federal government: the evidence is overwhelming.” (Earlier that summer, Jones had issued a warning. “Please!” he’d screamed. “Call Congress. Tell ’em we know the government is planning terrorism.”) On September 11, he reported the morning’s events as if reading from an incidents log, adding details of his own—“dead bodies up to six blocks away, arms, legs, you name it”—interrupting with updates, and cutting to eyewitnesses, in coverage that sounded straight out of Orson Welles’s The War of the Worlds. Like Welles, Jones asserted his own credibility by frequently sounding notes of caution—“we don’t know how many of these reports are accurate”—while making singularly outrageous and vicious claims, even as surgeons in New York were amputating limbs and nurses were cleaning burned skin and firefighters, falling down from exhaustion, were digging through rubble, looking for survivors. “I’ll tell you the bottom-line,” Jones growled. “Ninety-eight percent chance this was a government-orchestrated, controlled bombing.”16

Between 2001 and 2016, the demise of the daily newspaper, following the spiraling decline of broadcast television, contributed to a dizzying political disequilibrium, as if the world of news were suddenly revealed to be contained within a bouncy castle at an amusement park. New sources of news and opinion appeared like so many whirling, vertiginous rides, neon-bright, with screams of fright and delight, from blogs and digital newspapers to news aggregators and social media, roller coasters and water slides and tea-cup-and-saucer spinners. Facebook launched in 2004, YouTube in 2005, Twitter in 2006, the iPhone in 2007. By 2008, Twitter had a million users, and one in six Americans had a smartphone. Six years later, those numbers had climbed teeteringly high: Twitter had 284 million users, and two out of three Americans owned smartphones. They clutched them in their hands as they rode and rolled, thrilled by the G-force drop and the eardrum-popping rise and the sound of their own shrieking.

New sources of news tended to be unedited, their facts unverified, their politics unhinged. “Alternative” political communities took the 1990s culture wars online: Tumblr on the left and 4chan on the right, trafficking in hysteria and irony, hatred and contempt, Tumblr performing the denunciation of white privilege with pious call-outs and demanding trigger warnings and safe spaces, 4chan pronouncing white supremacy and antifeminism by way of ironic memes and murderous trolls.17 In a throwback to the political intrigues of the Cold War, Russia-sponsored hackers and trolls, posing as Americans, created fake Twitter and Facebook accounts whose purpose was to undermine the authority of the mainstream news, widen the American partisan divide, stoke racial and religious animosity, and incite civil strife. Under these circumstances, the fevered rants of deranged conspiracy theorists reached a new and newly receptive audience but, in a much broader and deeper sense, in an age of ceaseless online spectacle and massive corporate and government surveillance, nearly all political thinking became conspiratorial.

Jones, in retrospect, was the least but also the worst of this, the amusement park’s deadly but absurd Stephen King clown. After 9/11, he briefly lost some of his affiliates, but he didn’t especially need a radio network. In 1999, he’d launched a website called Infowars, where he presented himself to the world as a citizen journalist, a fighter for the truth by way of the new, no-holds-barred medium of the Internet. On September 11, Infowars warned, of the federal government, “They Are Preparing to Radically Re-engineer Our Society Forever.” That day, Jones inaugurated what came to be called the truther movement, a faction of conspiracy theorists who believed that the United States government was behind the 9/11 attacks. The vice president, Jones would later elaborate, had been disappointed by the passenger revolt on United Flight 93. “If it would have hit its target,” Jones said, “the government would have been completely decapitated and the president could have declared total martial law.”18

Jones, wild with malice, cut through the American political imagination with a chainsaw rigged to a broom handle, flailing and gnashing. In 2008, when Barack Obama sought the Democratic nomination for president in a close competition with Hillary Clinton, Jones and other truthers became birthers: they argued that Obama, who was born in Hawaii—an event reported in two Hawaiian newspapers and recorded on his birth certificate—had been born in Kenya. The truthers were on the far fringes, but even the broader American public raised an eyebrow at Obama’s name, Barack Hussein Obama, at a time when the United States’s declared enemies were Osama bin Laden and Saddam Hussein. Urged to change his name, Obama refused. Instead, he joked about it. “People call me ‘Alabama,’” he’d say on the campaign trail. “They call me ‘Yo Mama.’ And that’s my supporters!”19

So far from changing his name, Obama made his story his signature. His 2008 campaign for “Hope” and “Change” was lifted by soaring storytelling about the nation’s long march to freedom and equality in which he used his own life as an allegory for American history, in the tradition of Benjamin Franklin, Andrew Jackson, and Frederick Douglass. But Obama’s story was new. “I am the son of a black man from Kenya and a white woman from Kansas,” he said. “These people are a part of me. And they are a part of America.” Obama’s American family was every color, and part of a very big world. “I have brothers, sisters, nieces, nephews, uncles and cousins, of every race and every hue, scattered across three continents, and for as long as I live, I will never forget that in no other country on Earth is my story even possible.”20

Obama’s election as the United States’ first black president was made possible by centuries of black struggle, by runaways and rebellions, by war and exile, by marches and court cases, by staggering sacrifices. “Barack Obama is what comes at the end of that bridge in Selma,” said the much-admired man who had marched at Selma, John Lewis.21 His victory seemed to usher in a new era in American history, a casting off of the nation’s agonizing legacy of racial violence, the realizing, at long last, of the promises made in the nation’s founding documents. Yet as he took office in 2009, Obama inherited a democracy in disarray. The United States was engaged in two distant wars with little popular support and few achievable objectives, fought by a military drawn disproportionately from the poor—as if they were drones operated by richer men. The economy had collapsed in one of the worst stock market crashes in American history. The working class had seen no increase in wages for more than a generation. One in three black men between the ages of twenty and twenty-nine was in prison or on probation.22 Both parties had grown hollow—hard and partisan on the outside, empty on the inside—while political debate, newly waged almost entirely online, had become frantic, desperate, and paranoid. Between 1958 and 2015, the proportion of Americans who told pollsters that they “basically trust the government” fell from 73 percent to 19 percent.23 Forty years of a relentless conservative attack on the government and the press had produced a public that trusted neither. Forty years of identity politics had shattered Rooseveltian liberalism; Obama walked on shards of glass.

Even as Obama embraced a family of cousins scattered across continents, nationalism and even white supremacy were growing in both the United States and Europe in the form of populist movements that called for immigration restriction, trade barriers, and, in some cases, the abandonment of international climate accords. New movements emerged from the right—the Tea Party in 2009 and the alt-right in 2010—and from the left: Occupy in 2011, Black Lives Matter in 2013. Activists on the left, including those aligned with an antifascist resistance known as antifa, self-consciously cast their campaigns as international movements, but the new American populism and a resurgent white nationalism had their counterparts in other countries, too. Whatever their political differences, they shared a political style. In a time of accelerating change, both the Far Left and the Far Right came to understand history itself as a plot, an understanding advanced by the very formlessness of the Internet, anonymous and impatient. Online, the universe appeared to be nothing so much as an array of patterns in search of an explanation, provided to people unwilling to trust to any authority but that of their own fevered, reckless, and thrill-seeking political imaginations.

In 2011, during Obama’s first term, the aging New York businessman, television star, and on-again, off-again presidential candidate Donald Trump aligned himself with the truthers and the birthers by questioning the president’s citizenship. In a country where the Supreme Court had ruled, in Dred Scott, that no person of African descent could ever be an American citizen, to say that Obama was not a citizen was to call upon centuries of racial hatred. Like 9/11 conspiracy theorists, Obama conspiracy theorists (who were in many cases the same people) were forever adding details to their story: the president was born in Nairobi; he was educated at a madrasa in Jakarta; he was secretly a Muslim; he was, still more secretly, an anti-imperialist African nationalist, like his father; he was on a mission to make America African.24 “The most powerful country in the world,” right-wing pundit Dinesh D’Souza warned, “is being governed according to the dreams of a Luo tribesman of the 1950s.”25

Trump, bypassing newspapers and television and broadcasting directly to his supporters, waged this campaign online, through his Twitter account. “An ‘extremely credible source’ has called my office and told me that @BarackObama’s birth certificate is a fraud,” he tweeted in 2012.26 Trump did not back off this claim as he pursued the Republican nomination in 2015.27 The backbone of his campaign was a promise to build a wall along the U.S.-Mexican border. After 9/11, a white nationalist movement that had foundered for decades had begun to revive, in pursuit of two goals: preserving the icons of the Confederacy, and ending the immigration of dark-skinned peoples.28 Trump, announcing his candidacy from New York’s Trump Tower, gave a speech in which he called Mexicans trying to enter the United States “rapists,” borrowing from a book by Ann Coulter called ¡Adios, America!29 (On immigration and much else, Coulter promoted herself as a courageous teller of truths in a world of lies. “Every single elite group in America is aligned against the public—the media, ethnic activists, big campaign donors, Wall Street, multimillionaire farmers, and liberal ‘churches,’” Coulter wrote. “The media lie about everything, but immigration constitutes their finest hour of collective lying.”)30 Obama had promised hope and change. Trump promised to Make America Great Again.

Hillary Clinton, having lost the Democratic nomination to Obama in 2008, won it in 2016 and hoped to become the first female president. Her campaign misjudged Trump and not only failed to address the suffering of blue-collar voters but also insulted Trump’s supporters, dismissing half of them as a “basket of deplorables.” Mitt Romney had done much the same thing as the Republican nominee in 2012, when, with seething contempt, he dismissed the “47 percent” of the U.S. population—Obama’s supporters—as people “who believe they are victims.”31 Party politics had so far abandoned any sense of a national purpose that, within the space of four years, each of the two parties’ presidential nominees declared large portions of the population of the United States unworthy of their attention and beneath their contempt.

Trump, having secured the nomination, campaigned against Clinton, aided by the UK data firm Cambridge Analytica, by arguing that she belonged in jail. “She is an abject, psychopathic, demon from Hell that as soon as she gets into power is going to try to destroy the planet,” said Jones, who sold “Hillary for Prison” T-shirts. “Lock Her Up,” Trump’s supporters said at his rallies.32

American history became, in those years, a wound that bled, and bled again. Gains made toward realizing the promise of the Constitution were lost. Time seemed to be moving both backward and forward. Americans fought over matters of justice, rights, freedom, and America’s place in the world with a bitter viciousness, and not only online. Each of the truths on which the nation was founded and for which so many people had fought was questioned. The idea of truth itself was challenged. The only agreed-upon truth appeared to be a belief in the ubiquity of deception. The 2012 Obama campaign assembled a Truth Team.33 “You lie!” a South Carolina congressman called out to President Obama, during a joint session of Congress in 2009. “You are fake news!” Trump said to a CNN reporter at an event at the White House.34

“Let facts be submitted to a candid world,” Jefferson had written in the Declaration of Independence, founding a nation by appealing to truth. But whatever had been left of a politics of reasoned debate, of inquiry and curiosity, of evidence and fair-mindedness, seemed to have been eradicated when, on December 2, 2015, Trump appeared on Infowars by Skype from Trump Tower. In an earlier campaign rally, Trump had said that on 9/11 he’d been watching television from his penthouse, and had seen footage of “thousands and thousands of people,” Muslims, cheering from rooftops in New Jersey.35 Jones began by congratulating Trump on being vindicated on this point. (Trump had not, in fact, been vindicated, and no such footage has ever been found.) Jones, sputtering, gushed about the historic nature of Trump’s campaign.

“What you’re doing is epic,” Jones told Trump. “It’s George Washington level.”

“Your reputation’s amazing,” Trump told Jones, promising, “I will not let you down.”36

Five days later, Trump called for a “total and complete shutdown of Muslims entering the United States.”37 In place of towers, there would be walls.

Between the attacks on September 11, 2001, and the election of Donald Trump fifteen years later, on November 8, 2016, the United States lost its way in a cloud of smoke. The party system crashed, the press crumbled, and all three branches of government imploded. There was real fear that the American political process was being run by Russians, as if, somehow, the Soviets had won the Cold War after all. To observers who included the authors of books like How Democracy Ends, Why Liberalism Failed, How the Right Lost Its Mind, and How Democracies Die, it seemed, as Trump took office, as if the nation might break out in a civil war, as if the American experiment had failed, as if democracy itself were in danger of dying.38

I.

IT BEGAN, in the year 1999, with a panic. Computer programmers predicted that at one second after midnight on January 1, 2000, all the world’s computers, unable to accommodate a year that did not begin with “19,” would crash. Even before the twenty-first century began, even before no small number of political dystopians forecast a thousand-year clash of civilizations or the imminent death of democracy, Americans were subjected to breathless warnings of millennial doom, a ticking clock catastrophe, not the global annihilation timed by the atomic age’s Doomsday Clock but a disaster, a “Y2K bug,” embedded into the programs written to run on the microprocessor tucked into the motherboard of the hulking computer perched on every desktop. After much rending of garments and gnashing of teeth, this bug was quietly and entirely fixed. The end of the world averted, digital prophets next undertook to predict the exact date of the arrival of an Aquarian age of peace, unity, and harmony, ushered in by, of all things, the Internet.

image
Wired magazine began appearing in 1993 and by 2000 announced that the Internet had ushered in “One Nation, Interconnected.”

In the spring of 2000, Wired, the slick, punk, Day-Glo magazine of the dot-com era, announced that the Internet had, in fact, already healed a divided America: “We are, as a nation, better educated, more tolerant, and more connected because of—not in spite of—the convergence of the Internet and public life. Partisanship, religion, geography, race, gender, and other traditional political divisions are giving way to a new standard—wiredness—as an organizing principle for political and social attitudes.”39 In all the wide-eyed technological boosterism in American history, from the telegraph to the radio, few pronouncements had risen to such dizzying rhetorical heights.

Over the course of the twentieth century, the United States had assumed an unrivaled position in the world as the defender of liberal states, democratic values, and the rule of law. From NATO to NAFTA, relations between states had been regulated by pacts, free trade agreements, and restraint. But, beginning in 2001, with the war on terror, the United States undermined and even abdicated the very rules it had helped to establish, including prohibitions on torture and wars of aggression.40 By 2016, a “by any means necessary” disregard for restraints on conduct had come to characterize American domestic politics as well. “If you see somebody getting ready to throw a tomato,” Trump told supporters at a campaign rally in Iowa, “knock the crap out of them, would you?”41 Countless factors contributed to these changes. But the crisis of American moral authority that began with the war on terror at the start of the twenty-first century cannot be understood outside of the rise of the Internet, which is everything a rule-based order is not: lawless, unregulated, and unaccountable.

What became the Internet had begun in the late 1960s, with ARPANET. By the mid-1970s, the Department of Defense’s Advanced Research Projects Agency’s network had grown to an international network of networks: an “internet,” for short. In 1989, in Geneva, Tim Berners-Lee, an English computer scientist, proposed a protocol to link pages on what he called the World Wide Web. The first web page in the United States was created in 1991, at Stanford. Berners-Lee’s elegant protocol spread fast, first across universities and then to the public. The first widely available web browser, Mosaic, was launched in 1993, making it possible for anyone with a personal computer wired to the Internet to navigate web pages around the world, click by astonishing click.42

Wired, launched in March 1993, flaunted cyberculture’s countercultural origins. Its early contributors included Stewart Brand and John Perry Barlow, a gold-necklace- and scarf-wearing bearded mystic who for many years wrote lyrics for the Grateful Dead. In Wired, the counterculture’s dream of a nonhierarchical, nonorganizational world of harmony found expression in a new digital utopianism, as if every Internet cable were a string of love beads. Brand, in a Time essay titled “We Owe It All to the Hippies,” announced that “the real legacy of the sixties generation is the computer revolution.”43

But between the 1960s and the 1990s, the revolution had moved from the far left to the far right. Wired was edited by Louis Rossetto, a libertarian and former anarchist known to lament the influence of the “mainstream media.” In the magazine’s inaugural issue, Rossetto predicted that the Internet would bring about “social changes so profound their only parallel is probably the discovery of fire.” The Internet would create a new, new world order, except it wouldn’t be an order; it would be an open market, free of all government interference, a frontier, a Wild West. In 1990, Barlow had helped found the Electronic Frontier Foundation, to promote this vision. (The EFF later became chiefly concerned with matters of intellectual property, free speech, and privacy.) In 1993, Wired announced that “life in cyberspace seems to be shaping up exactly like Thomas Jefferson would have wanted: founded on the primacy of individual liberty and a commitment to pluralism, diversity and community.”44

The digital utopians’ think tank was Newt Gingrich’s Progress and Freedom Foundation, established in 1993 (and later the subject of an ethics inquiry); its key thinker was an irrepressible George Gilder, resurrected. Gingrich appeared on the cover of Wired in 1995, Gilder in 1996. Gingrich was battling in Congress for a new Telecommunications Act, the first major revision of the Communications Act of 1934, which had created the FCC and was itself a revision of the 1927 Radio Act; his objective was to ensure that, unlike radio or television, the new medium would lie beyond the realm of government regulation. At a 1994 meeting of Gingrich’s Progress and Freedom Foundation in Aspen, Gilder, along with futurists Alvin Toffler and Esther Dyson and the physicist George Keyworth, Reagan’s former science adviser, drafted a “Magna Carta for the Knowledge Age.”45 It established the framework of the act Gingrich hoped to pass. Announcing that “cyberspace is the latest American frontier,” the writers of the new Magna Carta contended that while the industrial age might have required government regulation, the knowledge age did not. “If there is to be an ‘industrial policy for the knowledge age,’” their Magna Carta proclaimed, “it should focus on removing barriers to competition and massively deregulating the fast-growing telecommunications and computing industries.”46

Gingrich got his wish. On February 8, 1996, in an event broadcast live and over the Internet, Bill Clinton signed the Telecommunications Act in the reading room of the Library of Congress; he signed on paper and he also signed online, at a computer terminal.47 If little noticed at the time, Clinton’s approval of this startling piece of legislation would prove a lasting and terrible legacy of his presidency: it deregulated the communications industry, lifting virtually all of its New Deal antimonopoly provisions, allowing for the subsequent consolidation of media companies and prohibiting regulation of the Internet with catastrophic consequences.

Nevertheless, that the U.S. government would even presume to legislate the Internet—even if only to promise not to regulate it—alarmed the Internet libertarians. On the day Clinton signed the bill, Barlow, ex-hippie become the darling of world bankers and billionaires, watching from the World Economic Forum in Davos, Switzerland, wrote a Declaration of Independence of Cyberspace:

Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather. . . . Governments derive their just powers from the consent of the governed. You have neither solicited nor received ours. We did not invite you. You do not know us, nor do you know our world. Cyberspace does not lie within your borders.48

He posted this statement on the web, where it became one of the very first posts to spread, as was said, like a virus, an infection.

Cyberutopians who had no use for government ignored the altogether inconvenient fact that of course not only the Internet itself but also nearly all the tools used to navigate it, along with the elemental inventions of the digital age, had been built or subsidized by taxpayer-funded, government-sponsored research. The iPhone, taking only one example, depended on the unquestionable and extraordinary ingenuity of Apple, but it also depended on U.S. government–funded research that had earlier resulted in several key technological developments, including GPS, multi-touch screens, LCD displays, lithium-ion batteries, and cellular networks. Nevertheless, Barlow and his followers believed that the Internet existed entirely outside of government, as if it had sprung up, mirabile dictu, out of nowhere, before and outside of civil society and the rule of law, in the borderless psychedelic fantasy world of cyberspace. “I ask you of the past to leave us alone,” Barlow pleaded. But if the futurists were uninterested in the past, they seemed also strangely incautious about the future. With rare exception, early Internet boosters, who fought for deregulation and against antitrust enforcement even as they benefited from the munificence of the federal government, evidenced little concern about the possible consequences of those positions for income inequality and political division in the United States and around the world.49

The Internet, a bottomless sea of information and ideas, had profound effects on the diffusion of knowledge, and especially on its speed and reach, both of which were accelerated by smartphones. If not so significant to human history as the taming of fire, it was at least as significant as the invention of the printing press. It accelerated scholarship, science, medicine, and education; it aided commerce and business. But in its first two decades, its unintended economic and political consequences were often dire. Stability, in American politics, had depended not on the wealth of the few but on the comfort of the many, not on affluence but on security, and a commitment to the notion of a commonwealth. The Internet did not destroy the American middle class, but it did play a role in its decline. It fueled economic growth and generated vast fortunes for a tiny clutch of people at a time when the poor were becoming poorer and the middle class disappearing. It turned out that the antimonopoly regulations of the industrial era, far from being obsolete, were sorely needed in the information age. And the vaunted promise of Internet connection, the gauzy fantasy of libertarians and anarchists who imagined a world without government, produced nothing so much as a world disconnected and distraught.

Silicon Valley, as it grew, earned a reputation as a liberal enclave, but it also drew a younger generation of libertarians, who had come not from the counterculture but from the New Right. Peter Thiel, born in Germany in 1967, had gone to Stanford, where in 1987 he had founded the Stanford Review with funding from Irving Kristol, and then to Stanford Law School. The Review aimed to counter campus multiculturalism, feminism, and political correctness, whose rise at Stanford Thiel had lamented in The Diversity Myth, a 1990s update and dilution of God and Man at Yale. George Gilder and Robert Bork were among Thiel’s heroes. (Bork’s writings on the error of antitrust laws informed much Silicon Valley libertarianism.) After a brief career as a lawyer and a stock trader, Thiel had returned to California in 1996, just in time for the dot-com boom, which followed the lifting of restrictions on commercial traffic on the Internet. Ten thousand websites were launched every day, poppies in a field. In 1996, Bob Dole, an unlikely but bold pioneer, became the first presidential candidate to have a website. Amazon was founded in 1994, Yahoo! in 1995, Google in 1998. In 1998, Thiel cofounded PayPal, hoping that it would free the citizens of the world from government-managed currency. “PayPal will give citizens worldwide more direct control over their currencies than they ever had before,” he promised.50

The Silicon Valley entrepreneur—almost always a man—became the unrivaled hero of the Second Gilded Age. He was a rescued man, the male breadwinner defended by George Gilder in the 1970s against the forces of feminism, saved, and newly seen as saving the nation itself. Multibillion-dollar Internet deals were made every day. In four years, the value of dot-coms, many of which had not earned a profit, rose by as much as 3,000 percent. By 1999, Bill Gates, at forty-three, had become the richest man in the world, and Microsoft the first corporation in history valued at more than half a trillion dollars.51

Inventors from Benjamin Franklin to Thomas Edison had been called “men of progress.” Silicon Valley had “disruptive innovators.” The language was laden, freighted with the weight of centuries. Historically, the idea of innovation has been opposed to the idea of progress. From the Reformation through the Enlightenment, progress, even in its secular usage, connoted moral improvement, a journey from sin to salvation, from error to truth. Innovation, on the other hand, meant imprudent and rash change. Eighteenth-century conservatives had called Jacobinism “an innovation in politics,” Edmund Burke had derided the French Revolution as a “revolt of innovation,” and Federalists, opposing Jefferson, had declared themselves to be “enemies to innovation.”52 Over the nineteenth century, the meaning of progress narrowed, coming, more often, to mean merely technological improvement. In the twentieth century, innovation began to replace progress, when used in this sense, but it also meant something different, and more strictly commercial. In 1939 the economist Joseph Schumpeter, in a landmark study of business cycles, used “innovation” to mean bringing new products to market, a usage that spread only slowly, and only in the specialized scholarly literatures of economics and business. In 1942, Schumpeter theorized about “creative destruction,” language that, after Hiroshima, had virtually no appeal.53 Progress, too, accreted critics; in the age of the atom bomb, the idea of progress seemed, to many people, obscene: salvation had not, in fact, been found in machines; to the contrary. Innovation gradually emerged as an all-purpose replacement, progress without goodness. Innovation might make the world a better place, or it might not; the point was, innovation was not concerned with goodness; it was concerned with novelty, speed, and profit.

“Disruption” entered the argot in the 1990s. To disrupt something is to take it apart. The chief proselytizer of “disruptive innovation” (a rebranding of “creative destruction”) was Clayton M. Christensen, a professor at Harvard Business School. In 1997, Christensen published The Innovator’s Dilemma, a business bible for entrepreneurs, in which he argued that companies that make only “sustaining innovations” (careful, small, gradual refinements) are often overrun by companies that make “disruptive innovations”: big changes that allow them to produce a cheaper, poorer-quality product for a much larger market. IBM made sustaining innovations in its mainframe computers, a big, expensive product marketed to big businesses; Apple, selling a personal computer that ordinary people could afford, made a disruptive innovation.54

After 9/11, disruptive innovation, a theory that rested on weak empirical evidence, became gospel, a system of belief, a way of reckoning with uncertainty in an age of rapid change, an age of terror. Terrorism was itself a kind of disruptive innovation, cheaper and faster than conventional war. The gospel of disruptive innovation applauded recklessness and heedlessness. Mark Zuckerberg founded Facebook in 2004, when he was not yet twenty, partly with funding from Thiel. “Unless you are breaking stuff, you aren’t moving fast enough,” he said, embracing the heedlessness of disruptive innovation. “Don’t be evil” was Google’s motto, though how to steer clear of iniquity appears to have been left to market forces. Companies and whole industries that failed were meant to fail; disruptive innovation aligned itself with social Darwinism. Above all, the government was to play no role in restraining corporate behavior: that had been a solution for the industrial age, and this was an age of knowledge.55

One of the first casualties of disruptive innovation, from the vantage of American democracy, was the paper newspaper, which had supplied the electorate with information about politics and the world and a sense of political community since before the American Revolution. “Printers are educated in the Belief, that when Men differ in Opinion,” Benjamin Franklin had once written, “both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”56 There had been great newspapers and there had been lousy newspapers. But the Republic had never known a time without newspapers, and it was by no means clear that the Republic could survive without them, or at least without the freedom of the press on which they were established, the floor on which civil society stands. Nevertheless, neither that history nor that freedom—nor any manner of editorial judgment whatsoever—informed decisions made by the disruptive innovators who declared the newspaper dead.

The deregulation of the communications industry had allowed for massive mergers: General Electric bought RCA and NBC; Time merged with Warner, and then with AOL. Newspapers housed within these giant corporations became less accountable to their readers than to stockholders. (The New York Times, the Washington Post, and National Public Radio were among a handful of exceptions.) Fast-growing dot-coms had been a chief source of newspaper advertising revenue; during the dot-com bust, those companies either slashed their advertising budgets or eliminated them; they also turned to advertising online instead. Papers began laying off a generation of experienced editors and reporters, then whole bureaus, and then the papers began closing their doors.57

“The Internet is the most democratizing innovation we’ve ever seen,” Democratic presidential candidate Howard Dean’s campaign manager said in 2004, “more so even than the printing press.” At the time, many journalists agreed. Tom Brokaw talked about the “democratization of news,” and conservative journalists, in particular, celebrated the shattering of the “power of elites” to determine what is news and what is not.58

Compared to newspapers and broadcast television news, the information available on the Internet was breathtakingly vast and thrilling; it was also uneven, unreliable, and, except in certain cases, unrestrained by standards of reporting, editing, and fact-checking. The Internet didn’t leave seekers of news “free.” It left them brutally constrained. It accelerated the transmission of information, but the selection of that information—the engine that searched for it—was controlled by the biggest unregulated monopoly in the history of American business. Google went public in 2004. By 2016, it controlled nearly 90 percent of the search market.59

The Internet transformed the public sphere, blurring the line between what political scientists had for decades called the “political elite” and the “mass public,” but it did not democratize politics. Instead, the Internet hastened political changes that were already under way. A model of citizenship that involved debate and deliberation had long since yielded to a model of citizenship that involved consumption and persuasion. With the Internet, that model yielded to a model of citizenship driven by the hyperindividualism of blogging, posting, and tweeting, artifacts of a new culture of narcissism, and by the hyperaggregation of the analysis of data, tools of a new authoritarianism. Data collected online allowed websites and search engines and eventually social media companies to profile “users” and—acting as companies selling products rather than as news organizations concerned with the public interest—to feed them only the news and views with which they agreed, and then to radicalize them. Public opinion polling by telephone was replaced by the collection and analysis of data. Social media, beginning with Facebook, moving fast and breaking things, exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right, automating identity politics, and contributing, at the same time, to a distant, vague, and impotent model of political engagement.60 In a wireless world, the mystic chords of memory, the ties to timeless truths that held the nation together, faded to ethereal invisibility.

“OUR WAR ON TERROR begins with al Qaeda, but it does not end there,” Bush said when he addressed Congress and a shaken nation on September 20, 2001. “It will not end until every terrorist group of global reach has been found, stopped, and defeated.” Bush pledged to destroy not only the perpetrators of the attacks on 9/11 but terrorism itself. This was not merely the saber rattling of a moment. By 2006, the stated objective of the National Security Strategy of the United States was to “end tyranny.” Like a war on poverty, a war on crime, and a war on drugs, a war on terror could imagine no end.61

Terrorism respected no borders and recognized no laws. Fighting it risked doing the same. In 1980, twenty-three-year-old Osama bin Laden had joined a resistance movement against the Soviet occupation of Afghanistan, supplying funds and building a network of supporters. In 1988, when the mujahideen triumphed and the Soviet Union agreed to withdraw from Afghanistan, bin Laden formed al Qaeda as a base for future jihads, or holy wars. Bin Laden was not a cleric and did not in any way speak for the religion of Islam. But he did describe his movement in religious terms, as a form of political incitement. At a time of economic decline, political unrest, and violent sectarianism throughout the Arab world, he called for a jihad against Americans, whom he described as a godless, materialist people. Bin Laden argued that Americans had defiled the Islamic world and undermined the Muslim faith by causing wars between Muslims in Europe, Asia, Africa, and the Middle East. “It is saddening to tell you that you are the worst civilization witnessed by the history of mankind,” he wrote in a letter to America. In 1990, after Saddam Hussein’s Iraq invaded Kuwait, bin Laden urged the Saudi monarchy to support a jihad to retake it; instead, the Saudis welcomed U.S. forces into Saudi Arabia. Bin Laden denounced the American “occupation” and recruited and trained forces for terrorist acts that included suicide bombings. The CIA formed a special unit to work against al Qaeda and bin Laden in 1996, by which time bin Laden had declared war on the United States and found refuge with the Taliban, radical Islamic fundamentalists who had taken over Afghanistan and remade it as a religious state. In 1998, bin Laden issued a fatwa declaring the murder of Americans an “individual duty for every Muslim who can do it in any country,” in the name of a “World Islamic Front.”62

After 9/11, the Bush administration demanded that the Taliban hand over bin Laden. The Taliban refused. On October 7, 2001, the United States began a war in Afghanistan. The immediate aim of the war, fought with coalition partners, was to defeat al Qaeda; its more distant aim was to replace the Taliban with a democratically elected, pro-Western government.63 It became the longest war in American history.

The Bush administration conceived of the war on terror as an opportunity to strike against hostile regimes all over the world, on the grounds that they harbored and funded terrorists. Between 1998 and 2011, military spending nearly doubled, reaching more than $700 billion a year—more, in adjusted dollars, than at any time since the Allies were fighting the Axis. In his 2002 State of the Union address, Bush described Iraq, Iran, and North Korea as another axis. “States like these, and their terrorist allies, constitute an axis of evil, arming to threaten the peace of the world,” he said. “By seeking weapons of mass destruction, these regimes pose a grave and growing danger. They could provide these arms to terrorists, giving them the means to match their hatred.” For all his fierce rhetoric, Bush took great pains and care not to denounce Islam itself, steering clear of inciting still more hatred. “All Americans must recognize that the face of terror is not the true face of Islam,” he said later that year. “Islam is a faith that brings comfort to a billion people around the world. It’s a faith that has made brothers and sisters of every race. It’s a faith based upon love, not hate.”64

The Bush administration soon opened a second front in the war on terror. In 2003, another U.S.-led coalition invaded Iraq, with the aim of eradicating both Saddam Hussein and his weapons of mass destruction. The architects of this war were neoconservatives who regretted what they saw as George H. W. Bush’s premature withdrawal from the Middle East, his failing to occupy Iraq and topple Hussein after pushing him out of Kuwait. With few exceptions, Democrats and Republicans alike supported the wars in Afghanistan and Iraq, but support for the Iraq war was, from the start, more limited, and dwindled further after it became clear that Hussein in fact had no weapons of mass destruction. “In 2003, the United States invaded a country that did not threaten us, did not attack us, and did not want war with us, to disarm it of weapons we have since discovered it did not have,” wrote Pat Buchanan, placing the blame for the war on the neocons’ hijacking of the conservative movement, whose influence he greatly regretted. He complained, “Neoconservatives captured the foundations, think tanks, and opinion journals of the Right and were allowed to redefine conservatism.”65

The war on terror differed from every earlier American war. It was led, from Washington, largely by men and women who had never served in the military, and it was fought, in the Middle East, by an all-volunteer force whose sacrifices American civilians did not share or know or even, finally, consider. In both Afghanistan and Iraq, the United States’ regime-building efforts failed. Vietnam had been a bad war, and a distant war, and its sacrifices had been unevenly borne, but they had been shared—and protested. Far distant from the United States, in parts of the world that few Americans had ever visited, the wars in Afghanistan and Iraq were fought by a tiny slice of the American population; between 2001 and 2011, less than one-half of 1 percent of Americans saw active duty. Hardly any members of Congress had ever seen combat, or had family members who had. “God help this country when someone sits in this chair who doesn’t know the military as well as I do,” Eisenhower once said. George H. W. Bush was the last president of the United States to have seen combat, to fear and loathe war because of knowing war.66 His successors lacked that knowledge. During the Vietnam War, George W. Bush had avoided combat by serving in the Texas Air National Guard. Bill Clinton and Donald Trump had dodged the draft. Obama came of age after that war was over. None of these men had sons or daughters who served in the military.67

image
The Iraq War mired U.S. soldiers in counterinsurgency campaigns.

The war on terror had its dissenters: among them were those who fought it. A 2011 Pew study reported that half of veterans of Afghanistan and Iraq thought the war in Afghanistan wasn’t worth fighting, nearly 60 percent thought the war in Iraq wasn’t worth it, and a third thought neither war was worth what it cost.68 One of the war on terror’s severest critics was Andrew J. Bacevich, a West Point graduate and career army officer who, after fighting in Vietnam in 1970 and 1971, had risen to the rank of colonel and become a history professor. Bacevich’s only son was killed in Iraq. A Catholic and a conservative, Bacevich argued that while few Americans served in the military, Americans and the American government had “fallen prey to militarism, manifesting itself in a romanticized view of soldiers, a tendency to see military power as the truest measure of national greatness, and outsized expectations regarding the efficacy of force.” Somehow, Bacevich wrote, Americans accepted that it was the fate of the United States to engage in permanent war, without dissent: “The citizens of the United States have essentially forfeited any capacity to ask first-order questions about the fundamentals of national security policy.”69

By no means had the wars in Afghanistan and Iraq gone unquestioned, but one reason there had been relatively little debate had to do not only with a widening gap between the civilian and the military populations but also with the consequences of disruptive innovation. In many parts of the country, the daily paper, with its side-by-side op-ed essays, had vanished. Voters had been sorted into parties, the parties had been sorted, ideologically, and a new political establishment, the conservative media, having labeled and derided the “mainstream media” as biased, abdicated dispassionate debate. Rigorous studies of newspapers had not, up to that point, been able to discern a partisan bias. Nevertheless, the conservative establishment insisted that such bias existed, warned their audiences away from nonconservative media outlets, and insulated their audience from the possibility of persuasion by nonconservative outlets by insisting that anything except the conservative media was the “liberal media.”70 This critique applied not only to the news but to all manner of knowledge. “Science has been corrupted,” Rush Limbaugh said on the radio in 2009. “We know the media has been corrupted for a long time. Academia has been corrupted. None of what they do is real. It’s all lies!”71

Limbaugh, who came of age during the Vietnam War but did not serve in the military (apparently due to a cyst), strenuously supported the war on terror.72 Roger Ailes, who, like Limbaugh, had neither seen combat in Vietnam nor served in the military (Ailes suffered from hemophilia), strongly supported U.S. military action in both Afghanistan and Iraq. And his network, Fox News, did more than report the wars; it promoted them. After 9/11, when Fox News anchors and reporters began wearing flag pins, some journalists, including CBS’s Morley Safer, condemned the practice. Ailes brushed him off: “I’m a little bit squishy on killing babies, but when it comes to flag pins I’m pro-choice.” When the United States invaded Iraq, Fox News adopted an on-air chyron: “The War on Terror.” John Moody, Fox’s vice president for news, circulated morning memos with directives for the day’s coverage. On June 3, 2003, he wrote, “The president is doing something that few of his predecessors dared undertake: putting the US case for Mideast peace to an Arab summit. It’s a distinctly skeptical crowd that Bush faces. His political courage and tactical cunning are worth noting in our reporting through the day.” On March 23, 2004, following early reports that the 9/11 commission was investigating the degree of negligence involved in the Bush administration leading up to the attacks, Moody wrote: “Do not turn this into Watergate. Remember the fleeting sense of national unity that emerged from this tragedy. Let’s not desecrate that.” Moody’s editorial directives included prohibitions on certain words. On April 28, 2004, he wrote: “Let’s refer to the US marines we see in the foreground as ‘sharpshooters,’ not snipers, which carries a negative connotation.” Walter Cronkite said of the memos, after they were leaked: “I’ve never heard of any other network nor any other legitimate news organization doing that, newspaper or broadcast.”73

The conservative media establishment broadcast from a bunker, garrisoned against dissenters. Those who listened to Rush Limbaugh, and who only years before had also gotten news from their local newspapers and from network television, were now far more likely to watch only Fox News and, if they read a newspaper, to read only the Wall Street Journal, which, like Fox, was owned, as of 2007, by Rupert Murdoch. The conservative websites to which search engines directed listeners of Limbaugh, watchers of Fox News, and readers of the Wall Street Journal only reinforced this view. “It’s a great way to have your cake and eat it too,” wrote Matt Labash in the Weekly Standard in 2003. “Criticize other people for not being objective. Be as subjective as you want. It’s a great little racket. I’m glad we found it actually.”74

Other administrations, of course, had lied, as the Pentagon Papers had abundantly demonstrated. But in pursuing regime change in the Middle East, the Bush administration dismissed the advice of experts and took the radically postmodern view that all knowledge is relative, a matter of dueling political claims rather than of objective truth. That view had characterized not only its decision to go to war in Iraq but also the campaign’s argument against the recount in 2000, and Bush’s withdrawal from the Kyoto Protocol, a climate change agreement, in 2001.75 In 2002, a senior Bush adviser told a reporter for the New York Times that journalists “believe that solutions emerge from your judicious study of discernible reality” but that “that’s not the way the world works anymore. We’re an empire now, and when we act, we create our own reality.”76 The culture and structure of the Internet made it possible for citizens to live in their own realities, too.

Jaundiced journalists began to found online political fact-checking sites like PolitiFact, which rated the statements of politicians on a Truth-O-Meter. “I’m no fan of dictionaries or reference books: they’re elitist,” the satirist Stephen Colbert said in 2005, when he coined “truthiness” while lampooning George W. Bush. “I don’t trust books. They’re all fact, no heart. And that’s exactly what’s pulling our country apart today.”77 But eventually liberals would respond to the conservative media by imitating them—two squirrels, chasing each other down a tree.

WHAT DID HE know and when did he know it? had been the pressing question of the Watergate investigation. What does anyone know anymore, and what is knowledge, anyway? became the question of the Bush era.

The United States’ position as the leader of a liberal world order based on the rule of law entered a period of crisis when, pursuing its war on terror, the country defied its founding principles and flouted the Geneva Conventions, international law, and human rights through the torture of suspected terrorists and their imprisonment without trial.

On October 26, 2001, Bush signed the Patriot Act, granting the federal government new powers to conduct surveillance and collect intelligence to prevent and investigate terrorist acts. It passed both houses less than two months after the 9/11 attacks, in a frenzied climate in which legislators who dared to break ranks were labeled unpatriotic. Outside the Capitol, the ACLU and the Electronic Frontier Foundation were among the many vocal opponents of the act, citing violations of civil liberties, especially as established under the Fourth Amendment, and of civil rights, especially the due process provision of the Fourteenth Amendment. John Ashcroft, Bush’s attorney general, defended the Patriot Act, citing the war on drugs as a precedent for the war on terror. “Most Americans expect that law enforcement tools used for decades to fight organized crime and drugs be available to protect lives and liberties from terrorists,” Ashcroft said.78

In November 2001, Bush signed a military order concerning the “Detention, Treatment, and Trial of Certain Non-Citizens in the War Against Terrorism.” Suspected terrorists who were not citizens of the United States were to be “detained at an appropriate location designated by the Secretary of Defense.” If brought to trial, they were to be tried and sentenced by military commissions. The ordinary rules of military law would not apply. Nor would the laws of war, nor the laws of the United States.79

The conduct of war will always challenge a nation founded on a commitment to justice. It will call back the nation’s history, its earlier struggles, its triumphs and failures. There were shades, during the war on terror, of the Alien and Sedition Acts passed in 1798 during the Quasi-War with France, of the Espionage Act of the First World War, and of FDR’s Japanese internment order during the Second World War. But with Bush’s November 2001 military order, the war on terror became, itself, like another airplane, attacking the edifice of American law, down to its very footings, the ancient, medieval foundations of trial by jury and the battle for truth.

“You’ve got to be kidding me,” Ashcroft said when he read a draft of the order. He’d expected the prosecution of people involved in planning the attacks on 9/11 to be handled criminally, by his department—as had been done successfully with earlier terrorism cases, with due process. National security adviser Condoleezza Rice and Secretary of State Colin Powell only learned that Bush had signed the order when they saw it on television. In the final draft, the Department of Justice was left out of the prosecutions altogether: suspected terrorists were to be imprisoned without charge, denied knowledge of the evidence against them, and, if tried, sentenced by courts following no established rules. The order deemed “the principles of law and the rules of evidence generally recognized in the trial of criminal cases in the United States district courts” to be impractical. The means by which truth was to be established and justice secured, traditions established and refined over centuries, were deemed inconvenient. “Now, some people say, ‘Well, gee, that’s a dramatic departure from traditional jurisprudence in the United States,’” Vice President Cheney said, but “we think it guarantees that we’ll have the kind of treatment of these individuals that we believe they deserve.”80

The Bush administration’s course of action with the wars in Afghanistan and Iraq and with the military tribunals and with the Patriot Act rested on an expansive theory of presidential power. The party in control of the White House tends to like presidential power, only to change its mind when it loses the White House. From Woodrow Wilson through FDR and Lyndon Johnson, Democrats had liked presidential power, and had tried to extend it, while Republicans had tried to limit it. Beginning with the presidency of Richard Nixon, Democrats and Republicans switched places, Republicans extending presidential power with Nixon and Reagan. But the conservative effort to expand the powers of the presidency reached a height in the George W. Bush administration, in powers seized while the nation reeled from an unprecedented attack.81

Beginning in the fall of 2001, the U.S. military dropped flyers over Afghanistan offering bounties of between $5,000 and $25,000 for the names of men with ties to al Qaeda and the Taliban. “This is enough money to take care of your family, your village, your tribe, for the rest of your life,” one flyer read. (The average annual income in Afghanistan at the time was less than $300.) The flyers fell, Secretary of Defense Donald Rumsfeld said, “like snowflakes in December in Chicago.” (Unlike many in Bush’s inner circle, Rumsfeld was a veteran; he served as a navy pilot in the 1950s.)82 As hundreds of men were rounded up abroad, the Bush administration considered where to put them. Taking over the federal penitentiary at Leavenworth, Kansas, and reopening Alcatraz, closed since 1963, were both considered but rejected because, from Kansas or California, suspected terrorists would be able to appeal to American courts under U.S. state and federal law. Diego Garcia, an island in the Indian Ocean, was rejected because it happened to be a British territory, and therefore subject to British law. In the end, the administration chose Guantánamo, a U.S. naval base on the southeastern end of Cuba. Part of neither the United States nor Cuba, Guantánamo was one of the known world’s last no-man’s-lands. Bush administration lawyer John Yoo called it the “legal equivalent of outer space.”83

On January 9, 2002, Yoo and a colleague submitted to the Department of Defense the first of what came to be called the torture memos, in which they concluded that international treaties, including the Geneva Conventions, “do not apply to the Taliban militia” because, although Afghanistan had been part of the Geneva Conventions since 1956, it was a “failed state.” International treaties, the memo maintained, “do not protect members of the al Qaeda organization, which as a non-State actor cannot be a party to the international agreements governing war.” Two days later, the first twenty prisoners, shackled, hooded, and blindfolded, arrived at Guantánamo. More camps were soon built to house more prisoners, eventually 779, from 48 countries. They weren’t called criminals, because criminals have to be charged with a crime; they weren’t called prisoners, because prisoners of war have rights. They were “unlawful combatants” who were being “detained” in what White House counsel Alberto Gonzales called “a new kind of war,” although it was as ancient as torture itself.84

The White House answered terrorism, an abandonment of the law of war, with torture, an abandonment of the rule of law. Aside from the weight of history, centuries of political philosophy and of international law, and, not least, its futility as a means for obtaining evidence, another obstacle to torture remained: the Convention against Torture and other Cruel, Inhuman or Degrading Treatment or Punishment, a treaty the United States had signed in 1988. This objection was addressed in a fifty-page August 2002 memo to Gonzales that attempted to codify a distinction between acts that are “cruel, inhuman, or degrading” and acts that constitute torture. “Severe pain,” for instance, was defined as pain that caused “death, organ failure, or permanent damage resulting in the loss of significant bodily functions.” (“If the detainee dies, you’re doing it wrong,” the chief counsel for the CIA’s counterterrorism center advised, according to meeting minutes later released by the Senate Armed Services Committee.) Methods described in the torture memos included stripping, shackling, exposure to extremes of temperature and light, sexual humiliation, threats to family members, near-drowning, and the use of dogs. Many of these forms of torment, including sleep deprivation and semi-starvation, came from a 1957 U.S. Air Force study called “Communist Attempts to Elicit False Confessions From Air Force Prisoners of War,” an investigation of methods used by the Chinese Communists who tortured American prisoners during the Korean War. Top security advisers, including Colin Powell, objected to what the White House called “enhanced interrogation techniques.” Others, including Ashcroft, urged discretion. “Why are we talking about this in the White House?” he is said to have asked at one meeting, warning, “History will not judge this kindly.” But the position of the secretary of defense prevailed. On a list of interrogation techniques approved for use by the U.S. military, Rumsfeld wrote: “I stand for 8–10 hours a day. Why is standing limited to 4 hours? D.R.”85

Torture wasn’t confined to Guantánamo. In Iraq, American forces inflicted torture at Abu Ghraib, and in Afghanistan, in a CIA prison in Kabul and at Bagram Air Base, where, in 2002, two men died while chained to the ceiling of their cells. Within the legal academy and among civil liberties organizations, opposition both to provisions of the Patriot Act and to the treatment of suspected terrorists had been ongoing. During his 2003 Senate bid, Barack Obama called the Patriot Act “a good example of fundamental principles being violated,” and objected to the lack of due process in the arrest and trials of suspected terrorists. Glimpses of what was happening only reached the American public in 2004, after The New Yorker and 60 Minutes reported on abuses at Abu Ghraib and the ACLU published the torture memos. In June 2006, in Hamdan v. Rumsfeld, the Supreme Court ruled that, without congressional authorization, the president lacked the power to establish the military commissions. A few months later, Congress authorized the commissions, but in 2008, in Boumediene v. Bush, the court found key provisions of that act unconstitutional as well.86 Still, something crucial about the fundamental institutions on which the nation had been founded had been very badly shaken.

The Supreme Court’s ruling had neither righted the Republic nor healed its divisions. During Bush’s two terms in office, income inequality widened and polarization worsened, as they had during the Clinton years and the Reagan years, and as they would under Obama and Trump. A Bush-era tax cut granted 45 percent of its savings to the top 1 percent of income earners, and 13 percent to the poorest 60 percent. In 2004 and again in 2008, the percentage of voters who did things like post campaign yard signs in front of their houses or paste bumper stickers onto their cars was higher than it had been at any time since people had been counting those things, in 1952. Members of Congress no longer regretted hyperpartisanship but instead celebrated it, outgoing Republican House majority leader Tom DeLay insisting in his 2006 farewell address that “the common lament over the recent rise in political partisanship is often nothing more than a veiled complaint instead about the recent rise of political conservatism.”87

DeLay had been indicted for money laundering and had also been tied to all manner of other political grubbiness in connection with the Russian government and with lobbyists. Political insiders like DeLay had a financial stake in heightened partisanship: the more partisan the country, the more money they could raise for reelection, and the more money they could make after they left office. Before the 1990s, “change elections,” when a new party took over Congress or the White House or both, meant that politicians who were thrown out of office left town, along with their staff. That stopped happening. Instead, politicians stayed in Washington and became pundits, or political consultants, or management consultants, or, most likely, lobbyists, or—for those with the least scruples—all of the above. They made gargantuan sums of money, through speaking fees, or selling their memoirs, or hawking their connections, or appearing on television: the cable stations, compelled to fill twenty-four hours of airtime, needed talking heads at all hours of every day, the angrier and more adversarial the talk, the higher the ratings. “Insiders have always been here,” the New York Times’s Mark Leibovich observed in 2013. “But they are more of a swarm now: bigger, shinier, online, and working it all that much harder.”88

Bush’s presidency ended with a global economic collapse, the explosion of a time bomb that had begun ticking during the Reagan administration. Clinton’s administration had not managed to defuse that bomb; instead, it had contributed to the deregulation of the financial services industry by repealing parts of the New Deal’s Glass-Steagall Act. Like all financial collapses in the long course of American history, starting with the Panic of 1792, it seemed to come suddenly, but, looking back, it appeared inevitable.

Wall Street totters from the top. Most of the suffering happens at the bottom. The first to fall were financial services giants Bear Stearns, Lehman Brothers, and Merrill Lynch, which had been wildly leveraged in high-risk subprime mortgages. The Dow Jones Industrial Average, 14,164 in October 2007, had fallen to 8,776 by the end of 2008. Unemployment rose by nearly 5 percentage points. Home values fell by 20 percent. In the last years of Bush’s administration, nearly 900,000 properties were repossessed. Millions of Americans lost their homes.89

In yards once festooned with campaign placards, Bush/Cheney ’04 or “Kerry/Edwards: A Stronger America,” Foreclosure and For Sale signs waved in front of doors boarded with plywood. Here and there the tails of yellow ribbons fluttered from trees, in remembrance of soldiers. Here and there were staked flags, and signs painted red, white, and blue: Bring Our Troops Home. And still, in the faraway and troubled lands of Afghanistan and Iraq, the wars dragged on, seen occasionally on Americans’ flickering, hand-held screens in fleeting footage of ruin and rubble.

II.

BARACK OBAMA HAD a narrow face and big ears and copper-colored skin, and sometimes he spoke like a preacher and sometimes he spoke like a professor, but he always spoke with a studied equanimity and a determined forbearance. “We, the people, have remained faithful to the ideals of our forebears, and true to our founding documents,” he said in his inaugural address in 2009, speaking to the largest crowd ever recorded in the nation’s capital, more than one and a half million people, on a terribly cold Tuesday in January. The day of hope and change was a day of hats and mittens.

His voice rose and fell with the cadences of Martin Luther King Jr. and held fast with the resolve of Franklin Delano Roosevelt. People had driven for hours, for tens of hours, to see him sworn in. “I just feel like if you had the opportunity to be there for the Gettysburg Address or when Hank Aaron hit his historic home run, would you take it?” Dennis Madsen, a thirty-nine-year-old urban planner from Atlanta, told CNN. Eight-year-old Bethany Dockery, from Memphis, wore a pink hat and coat, and jumped up and down when Obama took the oath of office. “It makes us feel good,” her mother said, crying, “because we have a chance.”90

The time had come, Obama said, “to choose our better history.”91 For Obama, that better history meant the long struggle against adversity and inequality, the work that generations of Americans had done for prosperity and justice. His inauguration marked a turn in American history, but just around that bend lay a hairpin.

He’d wanted to be a writer. He’d written his first book, Dreams from My Father: A Story of Race and Inheritance, when he was thirty-three, long before running for office. “His life is an open book,” his wife, Michelle, later said. “He wrote it and you can read it.” He’d been reckoning with race and inheritance since he was a little boy. “To some extent,” he once told a reporter, “I’m a symbolic stand-in for a lot of the changes that have been made.”92 But he’d also made himself that stand-in, by writing about it.

Obama’s mother’s father, Stanley Dunham, born in Wichita, Kansas, in 1918, was named after the explorer Henry Morton Stanley, whose books included In Darkest Africa, which was published right around the time that Obama’s father’s father, Hussein Obama, was born in Kanyadhiang, Kenya. During the Second World War, Hussein Obama worked as a cook for the British Army in Burma, and Stanley Dunham enlisted in the U.S. Army and went to Europe while, in Wichita, his wife, Madelyn, helped build B-29s for Boeing. Obama’s father, Barack Hussein Obama, was born in 1936; his mother, Stanley Ann Dunham, in 1942. On September 26, 1960, the day Richard M. Nixon first debated John F. Kennedy, seventeen-year-old Stanley Ann Dunham met twenty-three-year-old Barack Hussein Obama in an elementary Russian class at the University of Hawaii. By Election Day, she was pregnant. They married on February 2, 1961, two weeks after Kennedy’s inauguration, in the county courthouse in Wailuku, on Maui. In twenty-one states, that marriage would have been illegal, as a violation of miscegenation laws that were not overturned by the Supreme Court until 1967, in Loving v. Virginia. Neither family approved of the marriage. As recorded on his birth certificate, Barack Hussein Obama II was born at the Kapi’olani Maternity and Gynecological Hospital, in Honolulu, on August 4, 1961, at 7:24 p.m.93

image
Barack Obama’s inauguration in 2009 drew the largest crowd ever assembled on the Mall.

As a boy, living with his grandparents in Hawaii—his parents had divorced—young Barack Obama became a reader. He soaked up James Baldwin and W. E. B. Du Bois. “At night I would close the door to my room,” he later wrote, and “there I would sit and wrestle with words, locked in suddenly desperate argument, trying to reconcile the world as I’d found it with the terms of my birth.” After graduating from Columbia, he worked as a community organizer on the South Side of Chicago, planting roots in a city that had just elected its first black mayor. He joined a black church, Trinity United Church of Christ, and began dating an ambitious young lawyer named Michelle Robinson, descended from men and women who had been held in slavery. At Harvard Law School, he worked as a research assistant for Laurence Tribe, who’d been looking for common ground between what appeared to be incommensurable arguments; this would become Obama’s signature move, too: reconciling seemingly irreconcilable differences.94

Not since Woodrow Wilson had Americans elected a scholar as president. At the University of Chicago Law School, Obama taught a seminar on race and law that amounted to a history of the United States itself, from Andrew Jackson and Indian removal through Reconstruction and Jim Crow, from civil rights to Ronald Reagan and affirmative action. Later, during the campaign, when the course syllabus was posted online, constitutional scholars from both the right and the left applauded its evenhandedness. Obama, as a professor, cultivated the values of engaged, open-minded debate: students were to be graded for their ability “to draw out the full spectrum of views,” for their display of “a thorough examination of the diversity of opinion” and for evidence of having broken “some sweat trying to figure out the problem in all its wonderful complexity.”95 By no means was it clear that what worked in a law school seminar room would work in Washington.

In 1996, the professor sought a seat in the state senate and offered this bridge across the American divide: the Right had “hijacked the higher moral ground with this language of family values and moral responsibility,” and the Left had ceded that ground and needed to gain it back, because a language of moral responsibility was what the whole nation needed, together. “We have to take this same language—these same values that are encouraged within our families—of looking out for one another, of sharing, of sacrificing for each other—and apply them to a larger society,” he said.96

Obama brought together the language of the nation’s founding with the language of its religious traditions. Running for the U.S. Senate, where he would become its only black member, he was tapped to deliver the keynote address at the 2004 Democratic National Convention. He wrote a speech that drew as much on the Bible—“I am my brother’s keeper”—as on the Declaration of Independence: “We hold these truths to be self-evident”; and he recited both as prayers. (Like William Jennings Bryan before him, Obama had worked with a Shakespearean speech coach.) Part preacher, part courtroom lawyer, he electrified the crowd. “There are those who are preparing to divide us, the spin masters and negative ad peddlers who embrace the politics of anything goes,” he said. “Well, I say to them tonight, there is not a liberal America and a conservative America; there is a United States of America.”97

Obama-mania began that night. He was young and handsome and glamorous; his rhetoric soared. Reporters, especially, swooned. Before he’d even taken his seat in the Senate, Obama was asked if he intended to run for president, a question he waved away. He did not enjoy his time in the Senate. If, after the end of his term, he stayed in Washington, he told a friend, “Shoot me.”98 He found bloody-minded partisanship maddening. Liberals were fools if they thought they could defeat conservatives by treating them like enemies. The American people, he insisted, “don’t think George Bush is mean-spirited or prejudiced.” Instead, he went on, “they are angry that the case to invade Iraq was exaggerated, are worried that we have unnecessarily alienated existing and potential allies around the world, and are ashamed by events like those at Abu Ghraib, which violate our ideals as a country.”99

Obama ran for the Democratic nomination in 2008 with a slogan adapted from the 1972 United Farm Workers campaign of Cesar Chavez and Dolores Huerta, “Sí, se puede”: Yes we can. His résumé for the job was thin. He ran on his talent, his character, and his story. Some people said he was too black, some people said he wasn’t black enough. In a heated and very close primary race against sixty-year-old Hillary Clinton, he benefited from having opposed the Iraq War, which Clinton, then in the Senate, had voted to authorize. And while Clinton began with deep support from African American voters and leaders, that support was squandered by her husband. Threatened by Obama’s poise and charm—a cooler, blacker, and more upright version of himself—Bill Clinton alienated black voters by accusing Obama and his supporters of deviousness: “I think they played the race card on me,” the former president complained.100

In an age of extremes, Obama projected reasonableness and equanimity in politics and candor about religion. His faith, he said, “admits doubt, and uncertainty, and mystery.” His belief in the United States—“a faith in simple dreams, an insistence on small miracles”—admitted no doubt.101 In a time of war and of economic decline, he projected the optimism of Reagan and held the political commitments, it appeared, of FDR.

Obama’s candidacy stirred an apathetic electorate. It also changed the nature of campaigning. Turnout in 2008 was the highest since 1968. Against the much-admired longtime Republican senator from Arizona, John McCain, who had been a prisoner of war in Vietnam, Obama won by more than nine million votes. He also defeated McCain on social media. McCain, seventy-two, a man of his generation, hadn’t yet grasped the power of new forms of political communication. Obama’s campaign had four times as many followers as McCain’s on Facebook, the social media juggernaut, and an astounding twenty-three times as many on Twitter. His digital team registered voters at a website called Vote for Change. His supporters, who texted “HOPE” to join his list, received three texts on Election Day alone. When he won, more than a million Americans received a text that read “All of this happened because of you. Thanks, Barack.”102

Obama had promised hope and change. He seemed, at first, poised to deliver both. He swept into office with majorities in both the House and the Senate and the wind of history at his back. It proved a fickle wind.

To address the global financial collapse that had torqued the markets during Bush’s last months in office, he asked Congress to approve a stimulus program of $800 billion that reporters dubbed the New New Deal. The Economist announced “Roosevelt-mania.” But Obama was no FDR. His administration did not prosecute the people whose wrongdoing had led to the financial disaster. His economic program rescued the banks, but it didn’t rescue people who’d lost their savings. During Obama’s first year in office, while ordinary Americans lost their jobs, their houses, and their retirement money, executives at Wall Street’s thirty-eight largest companies earned $140 billion and the nation’s top twenty-five hedge fund managers earned an average of $464 million.103

Obama’s biggest initiative was the Affordable Care Act, which passed the Senate at the end of 2009 and the House in early 2010 in a razor-thin, party-line vote, 219 to 212. It had been a century since American Progressives first proposed national health care. Hillary Clinton’s own proposal had failed, badly, in 1994. (Obama, inspired by a biography of Lincoln, who put his political rivals in his cabinet, had named Clinton his secretary of state.) But the win was diminished by the fury of the campaign to repeal it, a campaign begun even before the legislation passed.

The day before Obama’s inauguration, Fox News launched a new program hosted by a radio talk show celebrity named Glenn Beck. Beck compared Obama to Mussolini. He turned his television studio into an old-fashioned one-room schoolhouse, with chalk and a blackboard, and oak desks, and lectured his viewers about American history, and how everything Obama stood for was a betrayal of the founding fathers. If Beck’s campaign was different from Alex Jones and the truthers, it drew on the same animus and exploited the same history of racial hatred. In March, Beck launched a movement called 9/12, whose purpose was to restore the unity Americans had supposedly felt the day after the attacks on the Twin Towers. Opponents of Obama’s economic plan and of health care reform called for a new Tea Party, to resist the tyranny of the federal government. In the spring of 2009, Tea Partiers across the country held rallies on town commons and city streets, waving copies of the Constitution. They dressed up as George Washington, Thomas Jefferson, and Benjamin Franklin, in tricornered hats and powdered wigs, knee breeches and buttoned waistcoats. They believed American history was on their side. They wanted, in words that would later become Donald Trump’s slogan, to make America great again.

With the Tea Party, the conservative media and the conservative movement merged: the Tea Party was, in some ways, a political product manufactured by Fox News. Former Alaska governor Sarah Palin, who’d gained a place in the national spotlight when McCain named her as his running mate in 2008, signed a one-million-dollar-per-year contract with Fox, and then began speaking at Tea Party rallies. Glenn Beck began holding Founders’ Fridays. Fox News host Sean Hannity began invoking the Liberty Tree.104

But the Tea Party was much more than a product of Fox News; it was also an earnest, grassroots movement. Some Tea Partiers cherished the NRA’s interpretation of the Second Amendment, or cared deeply about prayer in schools, or were opposed to same-sex marriage. Some held grievances against globalization, about immigration and trade deals, echoes of fears from the isolationist and nativist 1920s. Most had plenty of longstanding populist grievances, about taxes, in particular, and their objections to a federally run health care program, like the plans for such a program, dated back more than a century.

image
Tax Day protests held on April 15, 2009, marked the birth of the Tea Party movement, which countered Obama’s call for change with a call for a return to the principles of the founding fathers.

In the twenty-first century, the Tea Party married nineteenth-century populism to twentieth-century originalism. As populists, they blamed a conspiracy of federal government policymakers and Wall Street fat cats for their suffering. As originalists, they sought a remedy for what ailed them in a return to the original meaning of the Constitution.

Not irrelevantly, the movement was overwhelmingly white and it imagined a history that was overwhelmingly white, too. This is not to say that Tea Partiers were racists—though many liberals did say this, often without the least foundation—but, instead, that the story of American history had been impoverished by not being told either fully or well. Whole parts, too, had been rejected. “The American soil is full of corpses of my ancestors, through 400 years and at least three wars,” James Baldwin had written in 1965. Wrote Baldwin, “What one begs American people to do, for all sakes, is simply to accept our history.”105 That acceptance had not come.

If most Tea Partiers were mainly worried about their taxes, a few really did object to the changing nature of the Republic, on the ground that it was becoming less white. They objected to the very idea of a black president. It was as if they had resurrected Roger Taney’s argument from Dred Scott in 1857, when he ruled that no person of African descent could ever become an American citizen. “Impeach Obama,” their signs read. “He’s unconstitutional.”106

IN DECEMBER 2010, sixty-nine-year-old Vermont senator Bernie Sanders delivered an eight-and-a-half-hour speech on the floor of the Senate—without eating or drinking, or sitting down, or taking a bathroom break. He had no audience but the cameras. Sanders wasn’t speaking to his fellow senators; he was trying to reach the public directly, through social media. “My speech was the most Twittered event in the world on that day,” Sanders said later.

Sanders, born in Brooklyn in 1941, had been a civil rights and antiwar activist at the University of Chicago, leading sit-ins against segregated housing on campus and working for SNCC. After Chicago, he moved to Vermont, where he ran for mayor of Burlington. He took office the same year Reagan was inaugurated. Ten years later, he went to Washington as Vermont’s only member of Congress. There were perks to being the only socialist in Congress, he told the New York Times. “I can’t get punished,” he said. “What are they going to do? Kick me out of the party?”107 Sanders’s career in the Senate began in 2007—Obama had campaigned for him in 2006—and had been undistinguished. But during the recession, he emerged as one of the few prominent people in Washington, a city flooded with money, willing to speak about poverty.

The numbers were staggering. In 1928, the top 1 percent of American families earned 24 percent of all income; in 1944, they earned 11 percent, a rate that remained flat for several decades but began to rise in the 1970s. By 2011, the top 1 percent of American families was once again earning 24 percent of the nation’s income. In 2013, the U.S. Census Bureau reported a Gini index of .476, the highest ever recorded in any affluent democracy. Nations with income inequality similar to that in the United States at the time included Uganda, at .447, and China, at .474.108

Sanders was a socialist; his hero was Eugene Debs. He’d once made a recording of Debs’s most famous speech, delivered during the First World War: “I am opposed to every war but one,” Debs had said then. “I am for that war, with heart and soul, and that is the worldwide war of the social revolution. In that war, I am prepared to fight in any way the ruling class may make necessary, even to the barricades.” Sanders, nearly a century later, offered his echo, as if history were a reel of tape, winding and rewinding and winding again: “There is a war going on in this country,” Sanders said. “I am not referring to the war in Iraq or the war in Afghanistan. I am talking about a war being waged by some of the wealthiest and most powerful people against working families, against the disappearing and shrinking middle class of our country.”109

In 2010, in a series of deals that made possible the passage of health care reform, Democrats agreed to extend the Bush-era tax cuts, and Sanders was one of the few members of Congress to object. “President Obama has said he fought as hard as he could against the Republican tax breaks for the wealthy and for an extension in unemployment,” he said during his eight-hour speech. “Well, maybe. But the reality is that fight cannot simply be waged inside the Beltway. Our job is to appeal to the vast majority of the American people and to stand up and to say: Wait a minute.”110

By 2011, Sanders was no longer a lone voice in the wilderness. Protests against the bailout and against tuition hikes and budget cuts had started at the University of California in 2009, where students occupying a campus building carried signs that read “Occupy Everything, Demand Nothing.” The Occupy movement spread on social media, adopting the slogan, “We are the 99%.” Occupy Wall Street, an encampment in Zuccotti Park in downtown New York, begun in September 2011, drew thousands. Within months, Occupy protests had been staged in more than six hundred American communities and in hundreds more cities around the world. “We desperately need a coming together of working people to stand up to Wall Street, corporate America, and say enough is enough,” Sanders said during Occupy Wall Street. “We need to rebuild the middle class in this country.”111

Occupy, for all its rhetoric, was not a coming together of a representative array of working people. It was overwhelmingly and notably urban and white, and most protesters were students or people with jobs. It also had no real leadership, favoring a model of direct democracy, and lacked particular, achievable policy goals, preferring loftier objectives, like reinventing politics. Demand nothing. But it did propel Sanders to national prominence and establish the foundations for a movement that would carry him to one of the most remarkable progressive presidential campaigns since Theodore Roosevelt’s in 1912.

If the Tea Party married populism to originalism, Occupy married populism to socialism. The Tea Party on the right and Occupy on the left together offered an assault on Washington, sharing the conviction that the federal government had grown indifferent to the lives of ordinary Americans. Neither Republicans nor Democrats were able to unseat that conviction.

Obama’s team had gone to Washington disdainful of “insider Washington,” with its moneymakers and its dealmakers and its partisanship-for-hire. This piety did not last. David Plouffe, Obama’s 2008 campaign manager, called the GOP “a party led by people who foment anger and controversy to make a name for themselves and to make a buck.” In 2010, Plouffe earned $1.5 million; his income included management consulting work for Boeing and GE and speaking gigs booked through the Washington Speakers Bureau. Nor did the press, on the whole, hold politicians to account. Reporters had become “embedded” journalists in Iraq; more were embedded in Washington. So breezily did the press socialize with White House and congressional staff that a politician’s wife issuing an invitation to a child’s birthday party might take pains to announce that it would be “off the record.” But, in truth, hardly anything was off the record, and the record was blaring. The race-car pace of online news—the daily email newsletters, the blogs, and then Twitter—made for frantic, absurd fixations and postures, both grand and petty. “Never before has the so-called permanent establishment of Washington included so many people in the media,” reported Mark Leibovich. “They are, by and large, a cohort that is predominantly white and male.” They held iPhones in their hands and wore wireless receivers in their ears. They reported in breathless bursts. “They are aggressive, technology-savvy, and preoccupied by the quick bottom lines,” wrote Leibovich. “Who’s winning? Who’s losing? Who gaffed?”112

The mantras of Obama’s University of Chicago Law School syllabus were not the watchwords of a jacked-up, Bluetoothed, wallet-stuffed Washington. “Draw out the full spectrum of views on the issue you’re dealing with,” he’d instructed his students. “Display a thorough examination of the diversity of opinion that exists on the issue or theme.” House members raising money for reelection and booking their next television appearance didn’t think that way. Obama’s administration, unsurprisingly, found it difficult to gain traction with Congress, and the new president’s commitment to calm, reasoned deliberation proved untenable in a madcap capital.

The president’s aloofness kept him from the fray. Then, too, his signature health care act was a complicated piece of legislation, a feast for people who could make money by mocking it or explaining it, or both. Sarah Palin said that Obama’s health care plan would lead to “death panels,” which, while both absurd and untrue, was simply put. This, and the Democratic response, was the sort of outrageous assertion that generated a lot of web traffic, which had become a kind of virtual currency. Madness meant money. “We get paid to get Republicans pissed off at Democrats, which they rightfully are,” one Republican lobbyist told the Huffington Post. “It’s the easiest thing in the world. It’s like getting paid to get you to love your mother.” The intricacies of reforming health care insurance, which constituted a fifth of the American economy, chiefly served the interests of lobbyists. “Complication and uncertainty is good for us,” said Democratic lobbyist Tony Podesta, the brother of Bill Clinton’s former chief of staff, John Podesta.113 It meant more clients.

More money was made by more people interested in profiting from political decay after the Supreme Court ruled, in a 2010 case called Citizens United v. Federal Election Commission, that restrictions on spending by political action committees and other groups were unconstitutional. Roscoe Conkling’s fateful maneuver of 1882—telling the Supreme Court that when he’d helped draft the Fourteenth Amendment, the committee had changed the word “citizens” to “persons” in order to protect the rights of corporations—would make judicial history, time and time again. Where earlier rulings had granted corporations, as “persons,” certain liberties (especially the Lochner-era liberty of contract), Citizens United granted corporations a First Amendment right to free speech. By 2014, the court would grant corporations First Amendment rights to freedom of religious expression. In a landmark case, corporations owned by people who objected to contraception on religious grounds were allowed to refuse to provide insurance coverage for birth control to their employees, citing their corporation’s First Amendment rights.114

And yet on college and university campuses, students continued to protest not for but against free speech. Every hate speech code instituted since the 1990s that had been challenged in court had been found unconstitutional.115 Some had been lifted, others disavowed. In 2014, the University of Chicago issued a report on freedom of expression: “The University’s fundamental commitment is to the principle that debate or deliberation may not be suppressed because the ideas put forth are thought by some or even by most members of the University community to be offensive, unwise, immoral, or wrong-headed.”116 Nevertheless, a generation of younger Americans who had been raised with hate speech codes rejected debate itself. They attempted to silence visiting speakers, including not only half-mad provocateurs but scholars and serious if controversial public figures, from Condoleezza Rice to longtime political columnist George Will to former FBI director James Comey.

While campus protesters squashed the free speech rights of people, the Supreme Court protected the free speech rights of corporations. When Citizens United demolished the constitutional dam, money flooded the vast plains of American politics, from east to west. The Tea Party movement was soon overwhelmed by political grifters. Within five years of the movement’s founding, its leading organizations, including the Tea Party Express and the Tea Party Patriots, were spending less than 5 percent of their funds on campaigns and elections.117

All that money bought nothing so much as yet more rage. Liberal columnist E. J. Dionne detected a pattern: candidates and parties made big promises, and when they gained power and failed to make good on those promises, they blamed some kind of conspiracy—any sort of conspiracy: a conspiracy of the press, a conspiracy of the rich, a conspiracy of the “deep state” (including, during Trump’s first term, a conspiracy of the FBI). Then they found media organizations willing to present readers with evidence of such a conspiracy, however concocted. Conservative commentator David Frum offered a not dissimilar diagnosis: “The media culture of the U.S. has been reshaped to become a bespoke purveyor of desired facts.”118 Under these circumstances, it was difficult for either party to hold a majority for long. Democrats lost the House in 2010, the Senate in 2014, and the White House in 2016.

image

WHEN DONALD TRUMP was out of the White House, he railed at the government. When he was in the White House, he railed at the press. He railed at Congress. He railed at immigrants. He railed at North Korea. He railed at his staff. He grew red in the face with railing.

Well known in the world of professional wrestling, Trump brought to politics the tactics of the arena, which borrowed its conventions of melodrama from reality television, another genre with which Trump was well acquainted, having starred, beginning in 2004, in a reality program called The Apprentice. On The Apprentice, Trump’s signature line was “You’re fired.” In professional wrestling, a hero known as a face battles his exact opposite, a villain known as a heel; every time they meet, they act out another chapter of their story together. They say their lines, they take their bows.

Not long into Obama’s presidency, Trump began staging bouts, as if he were the face and the president his heel. He taunted. He smirked. He swaggered. He wanted Obama to be fired. Early in 2011, he called for Obama to release to the public his “long-form” birth certificate, intimating that the president had something to hide. “He doesn’t have a birth certificate, or if he does, there’s something on that certificate that is very bad for him,” Trump said. “Now, somebody told me—and I have no idea if this is bad for him or not, but perhaps it would be—that where it says ‘religion’ it might have ‘Muslim.’”119

These performances reached a ready-made audience. If the polls could be trusted, a dubious proposition, even before Trump began his imaginary bout with Obama, more than two in five Republicans believed that the president was either definitely or probably born in another country. Another difficult-to-credit poll reported that more than one in three Americans believed, about that time, that it was either “somewhat likely” or “very likely” that “federal officials either participated in the attacks on the World Trade Center and the Pentagon, or took no action to stop them.”120

Both the truther conspiracy theory and the birther conspiracy theory had long been peddled by Alex Jones. By 2011, when the Drudge Report had begun linking to Infowars, Jones’s audience was bigger than the audiences of Rush Limbaugh and Glenn Beck put together. (Jones had no use for either man. “What a whore Limbaugh is,” he said.) “Our investigation of the purported Obama birth certificate released by Hawaiian authorities today reveals the document is a shoddily contrived hoax,” Jones wrote after the White House released the long-form certificate at the end of April 2011. The Drudge Report linked to the story. After the release, another Gallup poll reported—again, dubiously—that nearly one in four Republicans still believed that Obama was definitely or probably born outside of the United States.121

On February 26, 2012, in a national atmosphere of racial incitement, a twenty-eight-year-old man named George Zimmerman, prowling around his neighborhood in Sanford, Florida, outside Orlando, called 911 to report seeing “a real suspicious guy.” He’d seen seventeen-year-old Trayvon Martin, who was walking back from a nearby store. Zimmerman got out of his car and shot Martin, who was unarmed, with a 9mm handgun. Zimmerman told the police that Martin attacked him. Zimmerman weighed 250 pounds; Martin weighed 140. Martin’s family said that the boy, heard over a cellphone, had begged for his life. Martin did not survive. Zimmerman was not charged for six weeks. On March 8, Trayvon Martin’s father, Tracy Martin, held a press conference in Orlando and demanded the release of recordings of calls to 911. “We feel justice has not been served,” he said.122

Martin’s death might not have gained national attention if it had not been for yet another shooting. The day after George Zimmerman killed Trayvon Martin, a seventeen-year-old boy named T. J. Lane walked into the cafeteria at Chardon High School, about thirty miles outside of Cleveland, pulled out a .22-caliber pistol, and fired, killing three students and badly injuring two more.123

By then, the United States had the highest rate of private gun ownership in the world, twice that of the country with the second highest rate, which was Yemen. The United States also had the highest homicide rate of any affluent democracy, nearly four times higher than France or Germany, six times higher than the United Kingdom. In the United States at the start of the twenty-first century, guns were involved in two-thirds of all murders.124 None of these facts had dissuaded the Supreme Court from ruling, in 2008, in District of Columbia v. Heller, that DC’s 1975 Firearms Control Regulations Act was unconstitutional, Justice Scalia writing, “The Second Amendment protects an individual right to possess a firearm unconnected with service in a militia.” Anticipating openings on the court, the new head of the NRA told American Rifleman that the 2012 presidential election was “perhaps the most crucial election, from a Second Amendment standpoint, in our lifetimes.”125

There were shootings on street corners, in shopping malls, in hospitals, in movie theaters, and in churches. The nation had been mourning shootings in schools since 1999, when two seniors at a high school in Columbine, Colorado, shot and killed twelve students, a teacher, and themselves. In 2007, twenty-three-year-old Seung-Hui Cho, a senior at Virginia Tech, shot fifty people in Blacksburg, killing thirty-two before he killed himself.126 The shooting in an Ohio high school, the day after Martin was killed in Florida, was, by comparison with Virginia Tech, a lesser tragedy, but it cast in a very dark light the claims coming out of Florida that George Zimmerman had a right to shoot Trayvon Martin.

Between 1980 and 2012, forty-nine states had passed laws allowing gun owners to carry concealed weapons outside their homes for personal protection. (Illinois was the sole holdout.) In 2004, Bush had allowed the 1994 federal ban on the possession, transfer, or manufacture of semiautomatic assault weapons to expire. In 2005, Florida passed a “stand your ground” law, exonerating from prosecution citizens who use deadly force when confronted by an assailant, even if they could have safely retreated. More states followed.127 Carrying a concealed weapon for self-defense came to be understood not as a failure of civil society, to be mourned, but as an act of citizenship, to be vaunted, law and order, man by man.

Obama refused to cede this argument. “If I had a son,” the president said at a press conference on March 23, visibly shaken, “he’d look like Trayvon.”128 Later that day, Rick Santorum, a Republican presidential aspirant, spoke outside at a firing range in West Monroe, Louisiana, where he shot fourteen rounds from a Colt .45. He told the crowd, “What I was able to exercise was one of those fundamental freedoms that’s guaranteed in our Constitution, the right to bear arms.” A woman called out, “Pretend it’s Obama.”129

On April 2, thousands of students rallied in Atlanta, carrying signs that read “I am Trayvon Martin” and “Don’t Shoot!”130 Even as they were rallying, a forty-three-year-old man named One Goh walked into a classroom in a small Christian college in Oakland, took out a .45-caliber semiautomatic pistol, lined the students against the wall, said, “I’m going to kill you all,” and fired. That same morning, in Tulsa, five people were shot on the street. An investigation called “Operation Random Shooter” led the Tulsa police to Jake England, nineteen, whose father had been shot to death two years before. By Easter Sunday, two college students had been shot to death in Mississippi.131

On March 20, the U.S. Justice Department announced that it would conduct an investigation into the death of Trayvon Martin. On April 7, Martin’s parents appeared on Good Morning America. Five days later, Newt Gingrich, seeking the 2012 Republican nomination, called the Second Amendment a “universal human right.” Trump found this a suitable moment to cast doubt, once more, on the president’s birth certificate. “A lot of people do not think it was an authentic certificate,” Trump said that May, as he campaigned for Mitt Romney, the GOP’s presumptive nominee.132

Obama won reelection in 2012, even as Republicans kept control of the House. Weeks later, on a woeful day in December in the snow-dusted New England town of Newtown, Connecticut, a mentally ill twenty-year-old shot his mother and then went to his former elementary school, fully armed. He shot and killed six teachers and staff and twenty very young children, six and seven years old, a massacre of first graders.

“I know there’s not a parent in America who doesn’t feel the same overwhelming grief that I do,” Obama said at the White House. He could not stop himself from weeping. “Our hearts are broken.”133 And yet the Obama administration had no success getting gun safety measures through a Republican Congress, which staunchly defended the right to bear arms at all costs, calling the massacre of little children the price of freedom.

OBAMA’S SECOND TERM was marked by battles over budgets and the mire of the Middle East. In 2011, U.S. forces had found and killed Osama bin Laden, and Obama withdrew the last American troops from Iraq. Yet Obama’s foreign policy looked aimless and haphazard and tentative, which diminished both his stature and that of his secretary of state, Hillary Clinton. While war in Afghanistan wore on, Islamic militants attacked U.S. government facilities in Libya in 2012, and by 2014 a new terrorist group, calling itself the Islamic State, had gained control of territory in Iraq. America’s nation-building project in the Middle East had failed. Obama, who had been an early critic of the Patriot Act, of the prison at Guantánamo, and of the Iraq War, led an administration that stepped up surveillance through a secret program run by the National Security Agency, prosecuted whistle-blowers who leaked documents that revealed U.S. abuses in the Middle East, and used drones to commit assassinations. Critics argued that the war on terror had been an unmitigated disaster, that occupying Arab countries had only produced more terrorists, and that the very idea of a war on terror was an error. Terrorism is a criminal act, historian Andrew Bacevich argued, and requires police action and diplomacy, not military action.134

With a massive defense budget, the federal government proved unmovable on tax policy and all but unable even to discuss its spending priorities. House Budget Committee chairman Paul Ryan, a Wisconsin Republican, proposed capping the top income tax rate at 25 percent, a rate not seen in the United States since the days of Andrew Mellon. Of 248 Republican members of the House and 47 Republican senators, all but 13 signed a pledge swearing to oppose any income tax increase. The Obama administration wanted to raise the top rate to 39 percent, a recommendation supported by the nonpartisan Congressional Research Service. But Senate Republicans objected to the CRS’s report (finding, for instance, the phrase “tax cuts for the rich” to be biased) and, in a move without precedent in the century-long history of the Congressional Research Service, forced the report’s withdrawal.135

While Congress fought over the implications of the phrase “tax cuts for the rich,” political scientists raised a distressing question: how much inequality of wealth and income can a democracy bear? In 2004, a task force of the American Political Science Association had concluded that growing economic inequality was threatening fundamental American political institutions. Four years later, a 700-page collection of scholarly essays presented its argument as its title, The Unsustainable American State. A 2013 report by the United Nations reached the conclusion that growing income inequality was responsible not only for political instability around the world but also for the slowing of economic growth worldwide. The next year, when the Pew Research Center conducted its annual survey about which of five dangers people in forty-four countries considered to be the “greatest threat to the world,” most countries polled put religious and ethnic hatred at the top of their lists, but Americans chose inequality.

As the 2016 election neared, inequality seemed poised to gain the candidates’ full attention. Bernie Sanders, seeking the Democratic nomination, would make inequality the centerpiece of his campaign, leading a movement that called for Progressive-style economic reform. But Hillary Clinton, the eventual Democratic nominee, would fail to gain any real traction on the problem. And the unlikely Republican nominee, Donald Trump, would blame immigrants.

A movement to fight gun violence began during Obama’s second term, but it wasn’t a gun control movement; it was a movement for racial justice. In 2013, after a jury in Florida acquitted George Zimmerman of all charges related to the death of Trayvon Martin, organizers began tweeting under the hashtag #BlackLivesMatter. African Americans had been fighting against domestic terrorism, state violence, and police brutality since before the days of Ida B. Wells’s anti-lynching crusade. Black Lives Matter was Black Power, with disruptively innovative technologies: smartphones and apps that could capture and stream candid footage live over the Internet. If stand your ground laws encouraged vigilantism, data service providers encouraged do-it-yourself reporting. Newt Gingrich insisted that the Second Amendment was a human right, but data plans promoted the idea that all users of the Internet were reporters, every man his own muckraker, and that uploading data was itself a human right. “A billion roaming photojournalists uploading the human experience, and it is spectacular,” said the voice-over in an ad for a data plan, over images of a vast mosaic of photographs. “My iPhone 5 can see every point of view, every panorama, the entire gallery of humanity. I need, no, I have the right, to be unlimited.”137

Black Lives Matter made visible, through photography, the experience of African Americans, maybe even in the very way that Frederick Douglass had predicted a century and a half earlier. With photography, witnesses and even the victims themselves captured the experiences of young black men who for generations had been singled out by police, pulled over in cars, stopped on street corners, pushed, frisked, punched, kicked, and even killed. In 2014, police in Ferguson, Missouri, not far from St. Louis, shot and killed eighteen-year-old Michael Brown in the middle of the street. Witnesses captured the shooting on their smartphones. All over the country, witnesses captured one police shooting after another. Police shot and killed Tamir Rice, age twelve, in a city park in Cleveland; he was carrying a toy gun. Minnesota police shot and killed Philando Castile in his car; he had a licensed handgun in his glove compartment and was trying to tell them about it. Castile’s girlfriend livestreamed the shooting. “Social media helps Black Lives Matter fight the power,” announced Wired. Yet legal victories eluded the movement. One killing after another was captured on film and posted on the Internet, but in nearly all cases where officers were charged with wrongdoing, they were acquitted.138

Black Lives Matter called urgent attention to state-sanctioned violence against African Americans, in forms that included police brutality, racially discriminatory sentencing laws, and mass incarceration. Unsurprisingly, perhaps, the movement did not make gun control legislation a priority, not least because its forebears included the Black Panthers, who had argued that black men had to arm themselves, and advanced that argument by interpreting the Second Amendment in a way that would later be adopted by the NRA. Meanwhile, hair-trigger fights along the battle lines first drawn in the 1970s, over guns and abortion, continued to be waged on the streets and at the ballot box, but especially in the courts. A pattern emerged. Second Amendment rights—a de facto rights fight led for and by white men—gathered strength. Civil rights for black people, women, and immigrants stalled and even fell back. And gay rights advanced.

In the early years of the twenty-first century, while other civil rights claims failed, the gay rights movement, newly styled the LGBT movement, won signal victories, chiefly by appropriating the pro-family rhetoric that had carried conservatives to victories since Phyllis Schlafly’s STOP ERA campaign. In 2003, in Lawrence v. Texas, the Supreme Court overruled its 1986 decision in Bowers by declaring a Texas sodomy law unconstitutional. In a concurring opinion, Justice Sandra Day O’Connor said she based her decision on a Fourteenth Amendment equal protection argument, asserting that the Texas law constituted sex discrimination: a man could not be prosecuted for engaging in a particular activity with a woman but could be prosecuted for engaging in that same activity with a man. O’Connor’s reasoning marked the way forward for LGBT litigation, which turned, increasingly, to marriage equality. Less than a year after the ruling in Lawrence, the Massachusetts Supreme Judicial Court made the commonwealth the first state to guarantee same-sex marriage as a constitutional right.139

The Brown v. Board for same-sex marriage came in the spring of 2015, fifty years after the court’s landmark decision on contraception in Griswold v. Connecticut. The case, Obergefell v. Hodges, consolidated the petitions of four couples who had sought relief from state same-sex marriage bans in Kentucky, Michigan, Ohio and Tennessee. In 2004, Ohio had passed a law stating that “Only a union between one man and one woman may be a marriage valid in or recognized by this state.” Ohioans James Obergefell and John Arthur had been together for nearly twenty years when Arthur was diagnosed with ALS, a wrenching and terminal illness, in 2011. In 2013, they flew to Maryland, a state without a same-sex marriage ban, and were married on the tarmac at the airport. Arthur died four months later, at the age of forty-eight. To his widower he was, under Ohio law, a stranger.140

In its ruling in Obergefell v. Hodges, the Supreme Court declared state bans on same-sex marriage unconstitutional. At New York’s Stonewall Inn, the movement’s holy site, people gathered by candlelight, hugged one another, and wept. It had been a long and dire struggle and yet the victory, when it came, felt as unexpected and as sudden as the fall of the Berlin Wall. One minute there was a wall; the next, sky.

A triumph for a half century of litigation over reproductive and gay rights, Obergefell marked, for conservative Christians, a landmark defeat in a culture war that had begun with the sexual revolution in the 1960s. Between Griswold and Obergefell, Christians had joined and transformed the Republican Party and yet had not succeeded in stopping a tectonic cultural shift. Many felt betrayed, and even abandoned, by a secular world hostile to the basic tenets of their faith. Conservative Christians had long identified Hollywood, for celebrating sex and violence on film and television, as an agent of that change. But as entertainment, including pornography, moved online, conservative Christians, like everyone else, began to wonder about the effects of the Internet on belief, tradition, and community. In a book outlining “a strategy for Christians in a post-Christian nation,” Rod Dreher, an editor at the American Conservative, wrote, “To use technology is to participate in a cultural liturgy that, if we aren’t mindful, trains us to accept the core truth claim of modernity: that the only meaning there is in the world is what we choose to assign it in our endless quest to master nature.”141

Exactly what role the Internet had played in the political upheaval of the first decades of the twenty-first century remained uncertain, but in the aftermath of the 9/11 attacks, Americans believed in conspiracies and feared invasions. Different people feared different conspiracies, but their fears took the same form: intruders had snuck into American life and undermined American democracy. Wasn’t that invader the Internet itself? Cyberutopians said no, and pointed to Obama’s 2008 campaign, the Tea Party, the Occupy movement, Black Lives Matter, the Arab Spring, and political hackers from Anonymous to WikiLeaks as evidence that the long-predicted democratization of politics had at last arrived. “A new Information Enlightenment is dawning,” Heather Brooke wrote, in The Revolution Will Be Digitised. “Technology is breaking down traditional social barriers of status, class, power, wealth and geography, replacing them with an ethos of collaboration and transparency.”142

Instead, social media had provided a breeding ground for fanaticism, authoritarianism, and nihilism. It had also proved to be easily manipulated, not least by foreign agents. On the Internet, everything looked new, even though most of it was very old. The alt-right, a term coined in 2008 by Richard Spencer, was nothing so much as the old right, with roots in the anti–civil rights Ku Klux Klan of the 1860s and the anti-immigration Klan of the 1920s. It stole its style—edgy and pornographic—from the counterculture of the 1960s. The alt-right, less influenced by conservatism than by the sexual revolution, considered itself to be transgressive, a counterculture that had abandoned the moralism of the Moral Majority—or any kind of moralism—and deemed the security state erected by neoconservatives to be insufficient to the clash of civilizations; instead, it favored authoritarianism.143

Spencer had been a history PhD student at Duke before leaving in 2007 to become an editor and a leader of what he described as “a movement of consciousness and identity for European people in the 21st century.” The alt-right, fueled by the ideology of white supremacy and by disgust with “establishment conservatism,” turned misogyny into a rhetorical style and made opposition to immigration its chief policy issue. In 2011, Spencer became the president of the National Policy Institute, whose website announced in 2014, “Immigration is a kind of proxy war—and maybe a last stand—for White Americans who are undergoing a painful recognition that, unless dramatic action is taken, their grandchildren will live in a country that is alien and hostile.”144

About the only thing that was new about the alt-right was the home it found online, on forums like Reddit and especially 4chan, where users, mostly younger white men, mocked PC culture, bemoaned the decline of Western civilization, attacked feminism, trolled women, used neo-Nazi memes, and posted pornography, and also on new, disruptive media sites, especially Breitbart, which was started in 2007 and was for a time one of the most popular websites in the United States.145

image
Supporters of Bernie Sanders, many of them also affiliated with the Occupy movement, insisted on their right to protest outside the 2016 Democratic National Convention in Philadelphia.

The alt-right’s online counterpart, sometimes called the alt-left, had one foot in the online subculture of Tumblr and other platforms, and the other foot in the campus politics of endless pieties over smaller and smaller stakes. If the favored modes of the alt-right were the women-hating troll and the neo-Nazi meme, the favored modes of the alt-left were clickbait and the call-out, sentimental, meaningless outrage—“8 Signs Your Yoga Practice Is Culturally Appropriated”—and sanctimonious accusations of racism, sexism, homophobia, and transphobia. In 2014, Facebook offered users more than fifty different genders to choose from in registering their identities; people who were baffled by this were accused online of prejudice: public shaming as a mode of political discourse was every bit as much a part of the online Far Left as it was of the online Far Right, if not more. After forty-nine people were killed in a 2016 terrorist attack on a gay nightclub in Orlando, Florida, the alt-left spent its energies in the aftermath of this tragedy attacking one another for breaches of the rules of “intersectionality,” which involve intricate, identity-based hierarchies of suffering and virtue. “One Twitter-famous intersectionalist admonished those who had called it the worst mass shooting in US history by reminding them that ‘the worst was wounded knee,’” the writer Angela Nagle reported. “Other similar tweeters raged against the use of the term Latina/o instead of Latinx in the reporting, while still others made sure to clarify that it was the shooter’s mental illness, not his allegiance to ISIS and the caliphate, that caused the shooting. Not to be outdone, others then tweeted back angrily about the ableism of those who said the shooter had a mental illness.”146

Millennials, a generation of Americans who grew up online, found their political style on the Internet. At the time of the 2016 election, a majority of younger eligible voters got their news from Facebook’s News Feed, which had been launched in 2006. Not many of them—fewer than in any generation before them—believed in political parties, or churches, or public service. The mantra of the counterculture, “question authority,” had lost its meaning; few institutions any longer wielded authority. Sellers of data plans suggested that people could upload all of themselves onto the Internet, a self of selfies and posts, an abdication of community and of inquiry. Sellers of search engines suggested that anything anyone needed to know could be found out with a click. “Eventually you’ll have an implant,” Google cofounder Larry Page promised, “where if you think about a fact, it will just tell you the answer.”147 But online, where everyone was, in the end, utterly alone, it had become terribly difficult to know much of anything with any certainty, except how to like and be liked, and, especially, how to hate and be hated.

III.

“I’VE SAT AROUND these tables with some of these other guys before,” Jeb Bush’s campaign manager said. In a room about the size of a tennis court, its walls painted martini-olive green, the campaign managers of the candidates for president of the United States in 2016 sat around a broad conference table to debrief after the election. They were warriors, after the war, standing atop a mountain of dead, remorseless. They had gathered at Harvard’s Kennedy School, as campaign managers had done after every presidential election since 1972, for a two-day tell-all. Most of what they said was shop talk, some of it was loose talk. No one said a word about the United States or its government or the common good. Sitting in that room, watching, was like being a pig at a butchers’ convention: there was much talk of the latest technology in knives and the best and tastiest cuts of meat, but no one pretended to bear any love for the pig.

image
Not long after his 2017 inauguration, President Trump greeted visitors to the White House in front of a portrait of Hillary Clinton.

The election of 2016 was a product of technological disruption: the most significant form of political communication during the campaign was Donald Trump’s Twitter account. It involved a crisis in the press, whose standards of evidence and accountability were challenged by unnamed sources and leaks, some of which turned out to have been part of a campaign of political interference waged by the Russian government through what came to be called troll factories. The election dredged from the depths of American politics the rank muck of ancient hatreds. It revealed the dire consequences of a dwindling middle class. It suggested the cost, to the Republic’s political stability, of the unequal constitutional status of women. It marked the end of the conservative Christian Coalition. And it exposed the bleak emptiness of both major political parties.

Seventeen candidates had vied for the Republican nomination. At the debrief, the campaign managers talked about their candidates and the campaign the way jockeys talk about their horses, and the conditions on the race track. “Our strategy was to keep our head down,” said Florida senator Marco Rubio’s manager. Wisconsin governor Scott Walker’s manager said, “The path was going to be the long game.” Ted Cruz’s manager talked about what lane his horse was racing in. Trump’s former campaign manager, CNN analyst Corey Lewandowski, spoke the longest. His horse was the best, the prettiest, the fastest, and ran “the most unconventional race in the history of the presidency.” He told a story, likely apocryphal, about how in 2012 Mitt Romney had been driven in a limo to campaign events but then, at the last minute, he’d jump into a Chevy. Not Trump. Trump went everywhere in his jet. “Our goal was to make sure we were going to run as the populist, to run on our wealth and not run from it, and to monopolize the media attention by using social media unlike anybody else,” Lewandowski gloated. “What we know is that when Donald Trump put out a tweet, Fox News would cover it live.” Field organizing was over, he said. Newspapers, newspaper advertisements? Irrelevant, he said. “Donald Trump buys ink by the television station,” he said. Trump hadn’t run in any lane. Trump had run from a plane.148

South Carolina senator Lindsey Graham’s manager pointed out that much had turned on Fox News’s decision to use polls to determine who participated in the primary debates, and where each candidate would stand on the stage, and how much camera time each candidate would get. In the 2016 election, the polls had been a scandal of near Dewey-Beats-Truman proportion, a scandal that people in the industry had seen coming. During the 2012 presidential election, twelve hundred polling organizations had conducted thirty-seven thousand polls by making more than three billion phone calls. Most Americans—more than 90 percent—had refused to speak to them. Mitt Romney’s pollsters had believed, even on the morning of the election, that Romney would win. A 2013 study—a poll—found that three out of four Americans distrusted polls. But nine of ten people, presumably, distrusted the polls so much that they had refused to answer the question, which meant that the results of that poll meant nothing at all.149

“Election polling is in near crisis,” a past president of the American Association for Public Opinion Research had written just months before the 2016 election. When George Gallup founded the polling industry in the 1930s, the response rate—the number of people who answer a pollster as a percentage of those who are asked—was well above 90 percent. By the 1980s, it had fallen to about 60 percent. By the election of 2016, the response rate had dwindled to the single digits. Time and again, predictions failed. In 2015, polls failed to predict Benjamin Netanyahu’s victory in Israel, the Labour Party’s loss in the United Kingdom, and a referendum in Greece. In 2016, polls failed to predict Brexit, the vote to withdraw Great Britain and Northern Ireland from the European Union.150

The more unreliable the polls became, the more the press and the parties relied on them, which only made them less reliable. In 2015, during the primary season, Fox News announced that, in order to participate in its first prime-time debate, Republican candidates had to “place in the top 10 of an average of the five most recent national polls,” and that where the candidates would be placed on the debate stage would be determined by their polling numbers. (Standing in the polls had earlier been used to exclude third-party candidates from debates—a practice that had led to a raft of complaints filed with the Federal Election Commission—but not major-party candidates.) The Republican National Committee didn’t object, but the decision had alarmed reputable polling organizations. The Marist Institute for Public Opinion called the Fox News plan “a bad use of public polls.” Scott Keeter, Pew’s director of Survey Research, said, “I just don’t think polling is really up to the task of deciding the field for the headliner debate.” Pew, Gallup, and the Wall Street Journal/NBC pollsters refused to participate.151

Polls admitted Trump into the GOP debates, polls placed him at center stage, and polls declared him the winner. “Donald J. Trump Dominates Time Poll,” the Trump campaign posted on its website following the first debate, referring to a story in which Time reported that 47 percent of respondents to a survey it had conducted said that Trump had won. Time’s “poll” was conducted by PlayBuzz, a viral content provider that embedded “quizzes, polls, lists and other playful formats” onto websites to attract traffic. PlayBuzz collected about seventy thousand “votes” from visitors to Time’s website in its instant opt-in Internet poll. Time posted this warning: “The results of this poll are not scientific.”152 Less reputable websites did not bother with disclaimers.

Efforts to call attention to the weakness of the polls, or to make distinctions between one kind of poll and another, were both unsuccessful and halfhearted. The New York Times ran a story called “Presidential Polls: How to Avoid Getting Fooled.” Polls drove polls. Good polls drove polls, bad polls drove polls, and when bad polls drove good polls, they weren’t so good anymore. Then, too, warning their readers, listeners, or viewers about the problems with polls hadn’t prevented news organizations from compounding them. In August 2015, the day after the first GOP debate, Slate published a column called “Did Trump Actually Win the Debate? How to Understand All Those Instant Polls That Say Yes,” even as Slate conducted its own instant poll: “Now that the first Republican presidential debate is over, pundits and politicos will be gabbing about what it all means for each candidate’s campaign. Who triumphed? Who floundered? Who will ride the debate to electoral glory, and who is fated to fizzle?” They made the same populist promises Gallup had made in the 1930s. “TV talking heads won’t decide this election,” promised Slate’s pollster (whose title was “Interactives Editor”). “The American people will.”153

Every major polling organization miscalled the 2016 election, predicting a win for Hillary Clinton. It had been a narrow contest. Clinton won the popular vote; Trump won in the Electoral College. The Kennedy School post-election debrief served as one of the earliest formal reckonings with what, exactly, had happened.

After the Republican campaign managers finished taking stock, the Democrats spoke. “Hillary, a lot of people don’t recall, came to electoral politics late in her career,” her campaign manager, Robby Mook, said. “She got her start with the Children’s Defense Fund . . .” Clinton’s campaign had failed to say much of anything new about Hillary Clinton, a candidate Americans knew only too well. Mook apparently had little to add. Bernie Sanders’s manager looked wan. He shook his head. “We almost did it.”154

The more obvious explanations for Clinton’s loss went, on the whole, unstated. Obama had failed to raise up a new generation of political talent. The Democratic National Committee, believing Clinton’s nomination and even her victory to be inevitable, had suppressed competition. Clinton, dedicating her time to fund-raising with wealthy coastal liberals from Hollywood to the Hamptons, failed to campaign in swing states and hardly bothered to speak to blue-collar white voters. After Trump won the nomination, she failed to do much of anything except to call out his flaws of character, even though Trump’s most vocal supporters had pointed out, from the very beginning, that a call-out approach would fail.

The Clinton campaign believed Trump’s political career had come to an end when an audio recording was leaked in which he said that the best way to approach women was to “grab ’em by the pussy.” But even this hadn’t stopped conservative Christians from supporting him. “Although the media tried to portray Trump’s personality as a cult of personality, ironically, the one thing voters weren’t wild about was his personality,” wrote Ann Coulter, in In Trump We Trust, a hastily written campaign polemic that, like her earlier work, waved aside even the vaguest interest in evidence: “I’m too busy to footnote.” As for charges of Trump’s depravity and deceit, Coulter rightly predicted that his supporters would be untroubled: “There’s nothing Trump can do that won’t be forgiven,” she wrote. “Except change his immigration policies.”155

Phyllis Schlafly, the grande dame of American conservatism, had provided Trump with one of his earliest and most important endorsements, at a rally in St. Louis in March of 2016. At ninety-one, her voice quavered but her powers were undiminished. In a pink blazer, her blond bouffant as flawless as ever, she told the crowd that Trump was a “true conservative.” Trump, to Schlafly, represented the culmination of a movement she had led for so long, from the anticommunist crusade of the 1950s and the Goldwater campaign of the 1960s to STOP ERA in the 1970s and the Reagan Revolution in the 1980s. Since 9/11, Schlafly had been calling for an end to immigration, and for a fence along the border, and Trump’s call for a wall had won her loyalty. “Donald Trump is the one who has made immigration the big issue that it really is,” Schlafly said. “Because Obama wants to change the character of our country.”156

That summer, Schlafly had attended the Republican National Convention to celebrate Trump’s historic nomination. In a wheelchair, she looked weak and pale and yet she spoke with her trademark determination. She said she wanted to be remembered for “showing that the grassroots can rise up and defeat the establishment, because that’s what we did with the Equal Rights Amendment, and I think that’s what we’re going to do with electing Donald Trump.” Schlafly died only weeks later, on September 5, 2016. Her endorsement, The Conservative Case for Trump, published the day after her death, called on conservative Christians to support Trump because of his positions on immigration and abortion: “Christianity is under attack around the world—most dramatically from Islamists, but also insidiously here at home with attacks on religious freedom.”157

Only weeks before the election, Trump delivered the opening remarks at Schlafly’s funeral, at a gothic cathedral in St. Louis. “With Phyllis, it was America first,” said Trump from the altar. He raised a finger, as if making a vow: “We will never, ever let you down.” On Election Day, at least according to exit polls, 52 percent of Catholics and 81 percent of evangelicals voted for Trump.158

Trump’s election marked a last and abiding victory for the woman who stopped the ERA. Yet dissenting conservative Christians argued that it also marked the end of Christian conservatism. “Though Donald Trump won the presidency in part with the strong support of Catholics and evangelicals, the idea that someone as robustly vulgar, fiercely combative, and morally compromised as Trump will be an avatar for the restoration of Christian morality and social unity is beyond delusional,” wrote Rod Dreher after the election. “He is not a solution to the problem of America’s cultural decline, but a symptom of it.”159

Dreher called on Christians to engage in “digital fasting as an ascetic practice.” Other conservatives who had not supported Trump wrestled with the consequences of the right-wing attack on traditional sources of authority and knowledge but especially the press. “We had succeeded in convincing our audiences to ignore and discount any information whatsoever from the mainstream media,” former conservative talk radio host Charlie Sykes reported after the election, in an act of apostasy called How the Right Lost Its Mind.160

The Left placed blame elsewhere. Hillary Clinton mainly attributed her defeat to a scandal over her email, for which she blamed the FBI, though she and her supporters also blamed Bernie Sanders, for dividing the Democratic Party.161 At the Kennedy School post-election conference, neither the Clinton campaign nor the mainstream media was ready to reckon with its role in the election. At an after-dinner discussion about the role of the media in the election, Jeff Zucker, the president of CNN, rebuffed every suggestion that CNN might have made mistakes in its coverage—for instance, in the amount of airtime it gave to Trump, including long stretches when, waiting for the candidate to appear somewhere, the network broadcast footage of an empty stage. “Frankly, respectfully, I think that’s bullshit,” Zucker said of the complaints. “Donald Trump was on CNN a lot. That’s because we asked him to do interviews and he agreed to do them. We continuously asked the other candidates to come on and do interviews and they declined.”162

“You showed empty podiums!” someone hollered from the audience. Zucker refused to back down. “Donald Trump was asked to come on, and he agreed to come on, and he took the questions. These other candidates were asked—”

“That’s not true!” screamed another campaign manager.

Zucker: “I understand that emotions continue to run high. . . .”163

The moderator, Bloomberg Politics writer Sasha Issenberg, called for calm. “Let’s move to a less contentious subject—fake news.”164

During the campaign, voters who got their news online had been reading a great many stories that were patently untrue, pure fictions, some of them written by Russian propagandists. Russian president Vladimir Putin disliked Clinton; Trump admired Putin. During Trump’s first year in office, Congress would investigate whether the Trump campaign had colluded with the Russian government, and even whether the meddling affected the outcome of the election, but the meddling, which appeared to consist of stoking partisan fires and igniting racial and religious animosity, had a larger aim: to destroy Americans’ faith in one another and in their system of government.165

In any event, not all writers of fake news were Russians. Paul Horner, a thirty-seven-year-old aspiring comedian from Phoenix, wrote fake pro-Trump news for profit, and was amazed to find that Trump staffers like Lewandowski reposted his stories on social media. “His campaign manager posted my story about a protester getting paid $3,500 as fact,” Horner told the Washington Post. “Like, I made that up. I posted a fake ad on Craigslist.” Horner, who did not support Trump, later said, “All the stories I wrote were to make Trump’s supporters look like idiots for sharing my stories.” (Horner died not long after the election, possibly of a drug overdose.)166

Horner may have been surprised that people reposted his hoaxes as news, but a great deal of reposting was done not by people but by robots. In the months before the election, Twitter had as many as 48 million fake accounts, bots that tweeted and retweeted fake news. On Facebook, a fake news story was as likely as a real news story to be posted in Facebook’s News Feed.167

At the Kennedy School forum, moderator Issenberg turned to Elliot Schrage, Facebook’s vice president of global communications, marketing, and public policy.

“At what point did you recognize there was a problem with fake news?” Issenberg asked.

“The issue of our role as a news dissemination organization is something that really surfaced over the course of the past year,” Schrage said.168

Congress would subsequently conduct an investigation into what Facebook knew, and when it knew it, and why it didn’t do more about it.169 Mark Zuckerberg, who appeared to be exploring the possibility of someday running for president of the United States, had at first dismissed the notion that Facebook played any role in the election as “crazy.” During a subsequent congressional investigation, Facebook would reluctantly admit that a Kremlin-linked misinformation organization, the Internet Research Agency, whose objective was to divide Americans and interfere with the election, had bought inflammatory political ads and posted content on Facebook that reached more than 126 million Americans.170 It later came out that Facebook had provided the private data of more than 87 million of its users to Cambridge Analytica, a data firm retained by the Trump campaign.

Schrage, however, didn’t speak to any of that. Facebook had only very recently begun to wonder whether it ought to think of itself as a “news organization”—“I’d say probably in the last three or six months,” he explained—and it showed. Schrage, a corporate lawyer who specialized in acquisitions and mergers, displayed little evidence of any particular understanding of news, reporting, editing, editorial judgment, or the public interest. When he dithered about photographs with nipples that Facebook’s algorithms had classified as pornography, but which might really be legitimate news stories, the Associated Press’s Kathleen Carroll interjected witheringly.171

“Can I just say that news judgment is a lot more complicated than nipples?”172

Schrage shrank in his chair.

At the start of Trump’s second year in office, the Justice Department would indict thirteen Russian nationals involved with the Internet Research Agency, charging them with “posing as U.S. persons and creating false U.S. persons,” as well as using “the stolen identities of real U.S. persons” to operate and post on social media accounts “for purposes of interfering with the U.S. political system,” a strategy that included “supporting the presidential campaign of then-candidate Donald J. Trump . . . and disparaging Hillary Clinton.” They were also charged with undermining the campaigns of Republican candidates Ted Cruz and Marco Rubio, supporting the campaigns of Bernie Sanders and Green Party candidate Jill Stein, using Facebook and Twitter to sow political dissent in forms that included fake Black Lives Matter and American Muslim social media accounts, and organizing pro-Trump, anti-Clinton rallies, posting under hashtags that included #Trump2016, #TrumpTrain, #IWontProtectHillary and #Hillary4Prison.173 More revelations would follow.

At the post-election panel, Issenberg asked Marty Baron, esteemed editor of the Washington Post, whether he had considered not publishing the content of Democratic National Committee emails released by WikiLeaks, an anonymous-source site established in 2006. WikiLeaks founder Julian Assange, an Australian computer programmer, styled himself after Daniel Ellsberg, the military analyst who had leaked the Pentagon Papers, but Assange, living in the Ecuadorian embassy in London, bore not the remotest resemblance to Ellsberg. Russian hackers had broken into the DNC’s servers, Assange had released the hacked emails on WikiLeaks, and the Post was among the media outlets that decided to quote from emails that would turn out to have been hacked by a sovereign nation-state.174

Baron, otherwise serene and oracular, grew testy, evading Issenberg’s question and pointing out, irrelevantly, that the Post had not hesitated to release the contents of the emails because “the Clinton campaign never said that they had been falsified.”175

Issenberg asked Schrage why Facebook hadn’t fact-checked purported news stories before promoting them in its News Feed rankings. Schrage talked about Facebook’s “learning curve.” Mainly, he dodged. “It is not clear to me that with 1.8 billion people in the world in lots of different countries with lots of different languages, that the smart strategy is to start hiring editors,” he said.176 As congressional hearings would subsequently confirm, Facebook had hardly any strategy at all, smart or otherwise, except to maximize its number of users and the time they spent on Facebook.

“Where’s news judgment?” called out someone from the audience, directing the question at the entire panel.

Zucker shrugged. “At the end of the day, it is up to the viewer.”177 He was answered by groans.

Carroll, a longtime eminence in the profession of journalism and a member of the Pulitzer board, summed up the discussion. “I know that there are some organizations or some journalists or some observers who feel like the media ought to put on a hair shirt,” she said. “I think that’s crap.”178 And the evening ended, with no one from any of the campaigns, or from cable news or social media or the wire services, having expressed even an ounce of regret, for anything.

The election had nearly rent the nation in two. It had stoked fears, incited hatreds, and sown doubts about American leadership in the world, and about the future of democracy itself. But remorse would wait for another day. And so would a remedy.