I will never forget June 23, 2016, the date of the Brexit vote. I stayed up most of the night watching the BBC, absorbing the mounting shock of political commentators and elected officials as the returns trickled in. Not even Nigel Farage, the father of the United Kingdom Independence Party, who had campaigned for decades against Britain’s membership in the European Union, could believe the results. Early in the evening he had issued a statement of concession, which he happily retracted some hours later.
In the wee hours of the morning, a thought flashed through my mind: with different accents and a change of proper nouns, the worthies of the BBC could have been talking about the American presidential campaign. The issues were similar, as were the grievances and demographic divides. Donald Trump began calling himself “Mr. Brexit.” I took him seriously; I should have taken him literally as well.
Postelection studies of the British vote have clarified its principal dynamics. A synthesis of the research shows that “education levels appear to have been the single biggest driver of the decision to either Leave or Remain.”1 While 73 percent of voters with college and advanced degrees voted to remain, 75 percent of voters who left the British equivalent of high school without passing standard exit exams voted to leave.2 Not only does higher education expand one’s opportunities, it also shapes one’s outlook.
All else being equal, individuals with higher education tend to favor openness, variety, and innovation. They are more open to demographic change and internationalism, and they tend to value creativity and curiosity over order and discipline. Highly educated individuals are more likely to believe that they have options in life and that they retain a measure of control over their own fate. Less educated people are more likely to see themselves as lacking control over their own lives, a sentiment that political psychologists have linked to the desire for order and authority.
Other key drivers of the Brexit vote included income, age, place, immigration, and economic sector. Older, lower-income voters from smaller cities and rural areas were more likely to favor leaving the European Union, as were those from manufacturing regions and areas that had seen a rapid surge in immigration during the past decade. A majority of citizens who considered themselves middle class voted to remain in the Union, while a majority of self-identified working-class citizens voted to leave it.3
It is a mistake, researchers agreed, to overemphasize the role of economics in the outcome. Rather, voters viewed the events of the past two decades through a multifaceted prism of culture, values, and sentiments. As the political scientists Matthew Goodwin and Oliver Heath put it, the Brexit vote was “anchored predominantly, albeit not exclusively, in areas of the country that are filled with pensioners, low-skilled and less well-educated blue-collar workers and citizens who have been pushed to the margins not only by the economic transformation of the country, but by the values that have come to dominate a more socially liberal media and political class.”4
Immigration too was more than an economic issue: for many voters, the previous decade’s rapid pace of immigration—especially from Eastern and Central Europe—posed unwelcome challenges to both cultural stability and national sovereignty.5 If economics had been dominant, as the leaders of the Remain campaign assumed, the United Kingdom might not have opted to leave the European Union. When the Leave campaign focused more sharply on immigration and sovereignty, the tide turned in its direction.
Geography also mattered in surprising ways. It turns out that the size and diversity of social networks had a significant impact on attitudes toward Brexit. Individuals who had socialized with people from another country, another part of Britain, or even another town were more likely to favor remaining in the European Union, while those whose social relations were confined to their own communities were more likely to vote Leave.6
Leave and Remain voters differed greatly in their attitudes toward the past and future of the United Kingdom. Leave voters believed that their children’s generation would do worse than they themselves had done, while Remain voters were more optimistic. Fifty-eight percent of Leave supporters felt that life in Britain was worse than it was thirty years before, while 73 percent of Remain supporters thought it was better.7
Brexit voters reported a marked sense of political disenfranchisement. Many felt politicians had neglected their local areas and that the national government did not listen to their concerns. Brexit was popular among those who did not typically vote; many Leave voters had long withdrawn trust from government and elected officials and had not participated in the 2015 general election.8 In this atmosphere of mistrust, these voters were drawn to conspiracy theories about collaboration between British intelligence services, the government, and the European Union to prevent a Brexit vote. This mistrust extended to the vote itself. One survey found that half of Leave voters believed the election might be rigged, versus 11 percent of Remain supporters.9
There was, finally, an international dimension to the public attitudes shaping the outcome. Those who favored cooperation with other countries were 52 percentage points more likely to favor remaining in Europe than were those who thought Britain was better off when it put its own interests first “without worrying what other countries think.” Not surprisingly, those who opposed compromise were also more likely to favor leaders they regarded as strong and principled rather than consensual and conciliatory.10
I need not dwell, I suspect, on the multiple resonances between the Brexit vote and Donald Trump’s remarkable rise to the presidency of the United States. A detailed analysis of U.S. exit polls as well as postelection survey research reinforces most people’s qualitative first impressions. Still, each country is different. The response of Americans to their country’s economic, social, and political dysfunction has set the stage for a distinctively American populism.
The poor performance of the economy, at least as average Americans have experienced it, has framed the politics of the past generation. After seven consecutive years of growth following the recession of the early 1990s, median household income peaked in 1999.11 Since then there has been no growth whatsoever. In mid-2017, eight years after the official end of the Great Recession, median household income barely exceeded the 2007 Bush-era peak and roughly equaled the level of the late 1990s.
After the previous peak in 1989, household income declined for four years, bottoming out in 1993. By 1996 it had regained all the lost ground, and it continued to surge for years afterward. In the sixteen years from 1983 to 1999, median household income rose by nearly nine thousand dollars—more than 18 percent.12 There is no postwar parallel for the stagnation Americans have experienced during the past generation.
Making matters worse, the economic pain has been unevenly divided. By virtually every measure, metropolitan areas have done much better than small towns and rural areas. For example, aggregate employment in metropolitan America is 5 percent above its peak prior to the Great Recession, while nonmetropolitan employment remains substantially lower than at the end of 2007, right before the bottom fell out.13 The sharp decline in U.S. manufacturing employment since the beginning of the century has been concentrated in the country’s heartland, while the postindustrial coastal economies have suffered much less damage.
A glance at recent economic history underscores the magnitude of this shift. During the first five years of recovery from the recession of 1990–91, rural and small-town counties were responsible for 63 percent of newly generated jobs. In the five years after the 2001 recession, the results were comparable, with 59 percent of new jobs located in these less populated counties. But in the first five years after the Great Recession, only 35 percent of job gains were in these counties, versus 64 percent in counties with populations of five hundred thousand or more. During these two decades, the share of new jobs in counties dominated by large cities more than doubled, from only 16 percent in the early and middle 1990s to 41 percent between 2010 and 2014. This massive shift of jobs, income, and wealth to urban centers has not gone unnoticed, and it has fed rural and small-town Americans’ sense of being left out and ignored.14
To a greater extent than in other Western democracies, trade enters into the American narrative of economic decline. Americans blame the North American Free Trade Agreement for the development of continental supply chains that shifted manufacturing production to Mexico. The entry of China into the World Trade Organization accelerated the growth of its exports to the United States, and the regions most exposed to Chinese import penetration experienced the largest losses of manufacturing jobs and wages.15 In this context, Donald Trump’s denunciation of the entire postwar trade regime found a receptive audience.
The past generation’s economic performance has underlined longer-term changes in opportunity and mobility in the United States. Children born into middle-income households in 1940 had a better than nine-in-ten chance of outperforming their parents by the time they reached the age of thirty. But fewer than half the children born in the 1980s were doing better than their parents thirty years later.16 Little changes if incomes are compared at the age of forty rather than thirty.
This multigenerational economic change has profoundly affected public attitudes. The heart of the “American Dream” is progress—the expectation by parents that their children will do better than they have. But a 2015 Pew Research Center survey found that only 32 percent of Americans expressed such optimism about the next generation, compared to 60 percent who thought the next generation would be worse off.17
The public outlook was even bleaker in nearly every country in Europe. Only 15 percent of Italians and 14 percent of French respondents thought the next generation would enjoy a better future.18 But optimism has never been as central to European societies as it has been in the United States, whose citizens have experienced a profound shock to long-held expectations.
Ever since the countercultural eruption that began in the late 1960s, American society has been divided about issues such as abortion, illegal drugs, the role of religion in politics, and—most recently—the proper legal status for sexual orientations and acts outside the boundaries of heterosexuality. Frequently these divisions have figured centrally in national political contests, but while they have by no means disappeared, their impact on political debate in the past two years has diminished, overlaid by rising concerns about the impact of immigration on the U.S. population.
These concerns fall into three categories. Many Americans with lower levels of education and skills believe that poorly educated immigrants, especially from Mexico and Central America, are competing for increasingly scarce low-skilled jobs and are driving down working-class wages. Higher-than-average unemployment rates among lower-skilled workers and a decades-long reduction in their incomes have reinforced this belief.
Next come demographic concerns, which a brief history can frame. The surge of immigration around the turn of the twentieth century raised the share of first-generation immigrants to 15 percent of the population, triggering a nativist reaction that culminated in the restrictive immigration legislation of 1924. Over the next four decades, the first-generation share declined by two-thirds, bottoming out at 4.7 percent in the early 1960s. The political salience of ethnic differences within the white majority faded.19
In 1965, the landmark Hart-Celler Act reopened the gates, admitting large numbers of immigrants from long-excluded areas such as East Asia and the Indian subcontinent, as well as from the Spanish-speaking countries of the Americas. The consequence over the past five decades has been a demographic revolution. Millions of nonwhite, non-European immigrants have entered U.S. society. Latinos and Asians are the fastest-growing groups, while the white share of the population is shrinking steadily. Three states (including California and Texas, the two largest) already have majority-minority populations, and many more will join them in the coming decades. By 2044, if current trends continue, the United States as a whole will no longer have a white majority, as “white” is now defined.20
To be sure, previous generations of immigrants from central and southern Europe—Poles, Hungarians, Czechs, and Italians, among others—gradually blended into, and identified with, the overall population. But this process occurred during an extended period when new immigration had slowed to a trickle and first-generation immigrants represented a steadily declining share of the population—the reverse of the situation that now prevails.
This ongoing demographic shift has triggered palpable anxiety among many native-born Americans, especially those outside the metropolitan areas that have always served as immigration gateways. These Americans have a sense, understandable in light of their experience, that they are the rightful owners of the country and that new entrants threaten their control. Although they express their anxiety most often as anger against the roughly eleven million immigrants who are present in the United States illegally, many also believe that current levels of legal immigration are too high and should be reduced.
Finally, security concerns weigh heavily on many voters’ minds. During the 2016 presidential campaign, Donald Trump asserted that immigrants from Mexico increase the U.S. crime rate and that immigrants from Muslim-majority countries constitute a terrorist threat. Although policies such as mass deportation and a ban on Muslim immigration never received majority support, a substantial minority of Americans regarded them as justified.
The threat of crime and terrorism creates a pervasive sense of insecurity. In surveys taken in June 2016, 86 percent of Americans expressed concern about so-called lone wolf terrorist attacks, and only 31 percent had confidence in the government’s ability to prevent them. It is easy to understand why the desired balance between security and civil liberty is shifting. The same surveys revealed 54 percent of Americans worried that the government would not do enough to monitor the activities of “potential terrorists,” compared to 39 percent who feared that the government would go too far. Seventy-two percent favored increased surveillance of people suspected of possible links to terrorism, even if it would intrude on privacy rights.21
The dysfunction of the American political system is well enough known to require only brief remarks. Suffice it to say that over the past quarter century the two major political parties have become more polarized—that is, both more internally homogeneous and more ideologically distant from each other. As this process has proceeded, the adherents of the respective parties have tended to cluster geographically, a phenomenon the journalist Bill Bishop has dubbed the “Big Sort.”22 Combined with the decline of transpartisan broadcasting and the rise of politically inflected media, this sorting has produced the social equivalent of echo chambers in which partisans are increasingly likely to hear only the opinions with which they agree and to encounter only the evidence consistent with these opinions.
Polarization is affective as well as cognitive. For the first time in the history of modern survey research, majorities of partisans have not merely an unfavorable but a deeply unfavorable view of the other party. In a 2016 survey, 49 percent of Republicans reported that the Democratic Party made them afraid, and 46 percent that it made them angry. The sentiments of Democrats were even more intense: 55 percent said the Republican Party made them afraid, and 47 percent that it made them angry. Forty-seven percent of Republicans see Democrats as more “immoral” than other Americans; 70 percent of Democrats see Republicans as more “closed-minded.” Forty-five percent of Republicans view Democratic policies as not only misguided but also a “threat,” up from 37 percent in 2014, while 41 percent of Democrats see Republican policies as threatening, up from 31 percent in 2014. Among both sides’ most engaged and active partisans, these figures are even higher.23
In a remarkable inversion of the feminist dictum that the personal is political, it now seems that the political has become personal. Large numbers of Americans are troubled that their child might marry someone of the opposite political persuasion. A fiftieth-anniversary remake of Stanley Kramer’s Guess Who’s Coming to Dinner would feature a Trump-supporting boyfriend at the table of an upscale liberal family, or vice versa.
In a parliamentary system, these polarities, though troubling, would at least be manageable. In the U.S. constitutional system, which allows for divided control of different national institutions, they are much more problematic. Partisan polarization makes compromise difficult, and so the typical consequence of divided government is gridlock. In contemporary circumstances, the national government can act effectively only when all its powers are in the hands of a single party. But then the dominant party is likely to go it alone and implement its preferred program, whatever the minority thinks. Few single-party governments resist the temptation to overreach. Winston Churchill’s injunction—in victory, magnanimity—is ignored. So the cycle continues indefinitely: gridlock yields public dissatisfaction, which produces unified government; unified government gives way to partisan overreach; overreach provokes a public reaction that restores divided government and renewed gridlock.
Although unified government can produce unbalanced and unsustainable public policy, gridlock is a greater threat to the democratic order. In their efforts to govern effectively, presidents are tempted to extend their powers beyond constitutional bounds. Worse, an impatient populace becomes more willing to set aside the restraints inherent in the rule of law. In a June 2016 survey conducted by the Public Religion Research Institute, 49 percent of voters agreed with the statement “Because things have gotten so far off track in this country, we need a leader who is willing to break some rules if that’s what it takes to set things right.” This figure included 57 percent of Republicans, 60 percent of white working-class voters, 72 percent of Trump supporters, and—tellingly—59 percent of those who felt that the American way of life needs protection from foreign influences.24
Many ordinary citizens hold American elites (often of both political parties) responsible for what has gone wrong over the past generation, and there is some basis for their view. While experts enjoyed a rare period of deference between the end of World War II and the mid-1960s, policy failures since then, both at home and abroad, have weakened their claims. The “best and the brightest” led the United States into Vietnam. The intelligence community’s consensus that Saddam Hussein possessed weapons of mass destruction smoothed the path to war in Iraq. Financial experts engineered new forms of investment that helped bring on the Great Recession.
At the same time that trust in expertise has declined, meritocratic norms and practices have propelled highly educated Americans to the highest reaches of the economy, media, and politics. This group has benefited from the transition to a knowledge-based economy as well as from freer flows of goods, people, and capital. On the other hand, leaders have made at best half-hearted efforts to insulate average Americans from the negative consequences of these trends or to compensate them for their losses. Worse, many leaders have appeared oblivious to the travails of their fellow citizens, and this blindness is often tinged with meritocratic snobbery toward those with less education and status.
The phrase “flyover country” perfectly captures the outlook of bicoastal elites, and the citizens of flyover country took their revenge in 2016. Who were these voters, and why did Trump’s message resonate?
Rather than securing victory through the uncompromising support of a homogeneous voting bloc, Trump won the general election by assembling a coalition of voters with diverse political ideologies and personal circumstances. Yet one group stands out for its early and sustained enthusiasm. These core supporters of Trump made up a fifth of his general election voters and were responsible for his initial unexpected rise (82 percent of them voted for him in the GOP primaries). Members of this group are economically progressive and overwhelmingly support increasing the tax rate for the wealthy, believe the system is rigged against people like them, and hold strong anti-immigrant and ethnocentric views.25
They are the least educated and earn the lowest incomes among Trump’s general election voters. They are the most likely to receive Medicaid benefits, to report a disability that interferes with employment opportunities, to spend many hours watching TV every day, and to be categorized as politically uninformed. Perhaps surprising to some, they are not staunch social conservatives. Only 33 percent describe themselves as pro-life, and few are gun owners or NRA members. They are not even passionate partisans. Fifty-three percent want to see their member of Congress legislate on a bipartisan basis to get things done. Of the members of the Trump coalition, these voters were the most likely to have voted for Barack Obama in 2012.26
While his populism upended some established right-left political divisions, Trump also benefited from a shift in the party coalitions that began before his candidacy. Between 2012 and 2016, this group of core Trump supporters shifted thirteen points toward the Republican Party. While white voters without a college education had split their votes almost evenly during the 1990s between the Democratic and Republican parties, by 2015, 57 percent of these voters identified as Republican and only 33 percent as Democrat. This “diploma divide” marks one of the most significant changes in today’s party landscape.27
An examination of white vote switchers (those who voted for Obama in 2012 and for Trump in 2016) finds that attitudes on immigration and racial and religious minorities dominated voters’ decision-making. For example, 33 percent of white Obama voters characterized illegal immigrants as a “drain,” and 34 percent favored making it harder for foreigners to come to the United States. Staunch opposition to a pathway to citizenship for illegal immigrants motivated many to switch to Trump in 2016. Similarly, the 37 percent of white Obama voters holding less favorable attitudes toward Muslims and reporting negative perceptions of African Americans were also likely to favor Trump in 2016. Compared to 2012, 2016 saw a much tighter relationship between the attitudes of voters on identity issues and their choice of candidate.28
Throughout the 2016 campaign, a debate persisted over whether economic anxiety or cultural backlash explained support for Trump. Some argued that populist disaffection fundamentally stems from economic insecurity and therefore marks an effort to ameliorate woes caused by deindustrialization and globalization. Others argued the election demonstrated a powerful resurgence of overtly racist and xenophobic attitudes.
Postelection survey research demonstrates that voters’ economic anxiety does not offer an adequate explanation of the 2016 election.29 But there does appear to be some relation between financial insecurity and anti-immigrant or ethnocentric views. One comprehensive study shows that white survey respondents who thought the economy was getting worse in 2012 were more likely in 2016 to believe it should be more difficult for foreigners to immigrate, to characterize immigration as a drain on the country, and to hold negative views toward Muslims, regardless of their responses to the same survey questions in 2012. This general relationship persisted when considering perceptions voters had of their personal financial situation in 2012: voters who reported they were struggling financially were more likely in 2016 to hold negative views of immigrants and Muslims, regardless of their responses in 2012.30
While today’s populism poses a challenge to the classic Marxist framework in which economic structures and relations determine a society’s political and cultural life, it would be wrong, in analyzing the causes of Trump’s rise, to dismiss the views that voters had of the economy and the economic well-being of their families and communities. There is a complex interaction among economic, cultural, and security factors, each of which independently affects public attitudes.
As I argue at length in this book’s concluding chapter, an element of tribal thinking is inherent in the human condition. When economic times are good and citizens feel personally secure, tribal sentiments remain muted: there are no urgent problems for which the Other need be blamed. But either a sharp downturn or a pressing security threat can activate these sentiments. When both occur simultaneously, tribalism surges. When demographic change is added to the mix, We/They thinking can dominate public consciousness. And as we have seen, this sharp dichotomy is the breeding ground of populism.
The public sentiments behind the populist explosion have been building for many years. Since the 1970s—with a few temporary interruptions during the economic boom of the late 1990s and again in the wake of the 9/11 attacks—public trust and confidence in national governmental institutions have hovered between 20 and 30 percent.31 The most recent survey found that only 20 percent of Americans, close to the historic low, trust the federal government to do what’s right all or most of the time.32 Half a century ago, this figure stood near 75 percent.
At the same time, Americans’ views concerning the motives of elected officials have darkened. Half a century ago, nearly two-thirds of Americans believed the federal government was run for the benefit of all the people. By the end of 2015, only 19 percent of Americans held this view.33
More recently, other major institutions such as banks, large corporations, and the news media have lost the public’s trust. Today, surveys find that the public regards only a handful of institutions—the U.S. military, colleges and universities, churches and religious organizations, technology companies, and small businesses—as making a positive contribution to the country.
For decades, researchers have monitored Americans’ overall assessment of their country’s trajectory. One survey asks respondents to assess whether the nation is generally headed in the right direction or if things have gone off on the wrong track; another version asks whether respondents are satisfied or dissatisfied with the way things are going in the United States. Despite differences of methodology, the results are remarkably consistent. In the final five years of the twentieth century, solid majorities of Americans were positive about the direction of the country. But since 2004, despite multiple changes in party control of Congress and the White House, majorities have been consistently negative.34
As this book goes to press, the election of Donald Trump has not disrupted these trends. Much will depend on his ability to make good on his promises to the white working-class voters whose overwhelming support largely gave him his victory. Repealing Obamacare, whose Medicaid expansion disproportionately benefited working-class voters in Republican-dominated states, would not reopen shuttered coal mines and steel mills. A classic Republican tax cut will not revitalize declining Rust Belt communities. It remains to be seen whether the small-government Republicans who dominate Congress will be willing to fund the president’s $1 trillion infrastructure buildup, the most broadly popular policy component of his populist appeal.
There are signs of impatience with liberal democratic restraints even in the United States, where constitutionalism and the rule of law are more deeply entrenched than in the newer European democracies. In two pathbreaking articles, Roberto Foa and Yascha Mounk have presented survey research suggesting declining support in America for liberal democracy and a rising willingness to consider alternatives.35
Still, the connection between public attitudes and policy outcomes is loose. In times of intense concern about national or personal security, Americans have often expressed doubts about the scope of individual liberty. In the aftermath of 9/11, for example, a 49 percent plurality agreed that the freedoms guaranteed by the First Amendment went “too far.” These reservations were never translated into permanent reductions in personal liberty, and by 2006 the share of Americans who believed First Amendment liberties were too expansive had fallen to 18 percent.36 American institutions have served as bulwarks against inconstant public attitudes, and when institutions fail—as the Supreme Court did when it upheld the internment of Japanese Americans during World War II—second thoughts by elites and eventually the public have usually reversed the damage.
If modern survey research had been available during the 1930s, it would probably have shown support for liberal democracy at a low ebb and substantial levels of sympathy for both communism and fascism. In his First Inaugural Address, President Franklin Roosevelt made it clear that the national economic emergency might require a “temporary departure from [the] normal balance of . . . executive and legislative authority.” If halfway measures proved insufficient, he said, he would not hesitate to ask Congress for “broad executive power” to wage war against the emergency “as if we were in fact invaded by a foreign foe.”37 He did not say what he would do if Congress refused to go along. Fortunately, new policies and institutions within the constitutional framework proved equal to the task of preserving liberal democracy. No doubt FDR’s assessment of the American people’s underlying devotion to the constitutional order, whatever their temporary doubts, strengthened his own commitment.
The question is whether U.S. institutions and norms will prove strong enough to outlast, and if necessary resist, today’s challenge to liberal democracy.
A moment of testing comes when the judiciary hands down a ruling that prevents the president from doing what he wants or orders him to do something he does not want to do. When the Supreme Court told President Truman that he could not seize the steel mills, he backed down. When it told President Nixon to hand over the Oval Office tapes, he complied. Notably, when the federal courts told President Trump that his executive order on immigrants and refugees did not meet constitutional standards, he accepted their verdict and drafted a revised order that he hoped would fare better.
Tensions between the executive and judicial branches often escalate when steps taken to enhance national security restrict individual liberty. In the aftermath of the 9/11 attacks, George W. Bush’s administration dealt with detained terrorist suspects in ways that the Supreme Court ruled violated constitutional rights. The administration accepted these judgments. Democracy in the United States would enter new and dangerous territory if a president did not.
Another moment of testing for liberal democracy would come if an administration infringed on freedom of the press. Since the Supreme Court permitted the publication of the Pentagon Papers, it has been taken for granted that the executive branch cannot invoke claims of national security to prevent the media from making classified information public. Still, an administration could threaten other means—such as tax audits and regulatory crackdowns—to pursue the same end. Relations between presidents and the press almost always turn adversarial, and an all-out assault on the press led by the president could do lasting damage to American democracy. While the attack by President Trump on the media as “enemies of the people” was hardly helpful, his words are not enough to create a crisis for democracy.
Most Americans have a hard time believing that their democracy is at risk of what Foa and Mounk call deconsolidation, and they have centuries of history on their side. The constitutional order has survived the no-holds-barred battle between the Federalists and the Jeffersonians, the Civil War, the Great Depression of the 1930s, the assassinations and cultural upheavals of the 1960s, and the security panic that swept the country after the 9/11 attacks. During the two world wars of the twentieth century, both of which evoked national mobilizations, liberal restraints on government were weakened only temporarily. Freedom of the press survived the Alien and Sedition Acts of the 1790s, the Espionage and Sedition Acts of 1917–18, and the clashes of the Nixon era. The ethos of individual liberty has always been a powerful countervailing force. Besides, the greatest challenges to constitutional democracy have always come during wars or national emergencies, and current circumstances, however distressing, do not rise to this level.
Developments since President Trump’s election suggest that America’s constitutional institutions once again are proving equal to the task. The judiciary has acted with characteristic independence. The press is doing its job in the face of unrelenting pressure from the administration and its supporters. The long-supine Congress is showing signs of standing up and strengthening its backbone. Ideological and geographical differences within the Republican Party have impeded its efforts to put unified government in the service of a conservative agenda that lacks majority support. Madison’s nightmare of a tyrannical concentration of power seems as distant as ever.
Nevertheless, events at home and abroad have been a salutary warning against progressivist complacency. History does not have an end; nor does it arc toward justice. Liberal democracy is not self-sustaining. It is a human achievement, not a historical inevitability. Like every human creation, it can be weakened from within when those who support it do not rally to its cause.
That an event has never happened is no guarantee that it will not happen. Eternal vigilance is indeed the price of liberty, and liberal democracy will endure as long as citizens believe it is worth fighting for. Despite some troubling signs, most Americans still think it is, and support for the key institutions that protect the country from tyranny remains strong. Newer democracies, where liberal norms are less firmly entrenched, could be a different story.38
As this book went to press, the ANO Party, founded and led by billionaire businessman Andrej Babiš, won the Czech election with almost 30 percent of the vote. Babiš campaigned against immigration, Brussels, and government corruption. He favors constitutional changes that would reduce checks on executive power, and he has stated, “The [ANO] party is connected to my person. The party is me.” There is a distinct possibility that he could end up leading his country’s hard-won democracy down the illiberal trail that Hungary and Poland have blazed.