THE ROAD FROM SERFDOM

by Danielle Allen

[DECEMBER 2019]

In her 2019 essay “The Road From Serfdom,” which appeared as part of the Atlantic special issue “How to Stop a Civil War,” the political philosopher Danielle Allen returned to George Washington’s farewell address as a modern touchstone. Washington, she explained, had more than one reason for warning Americans away from the allure of “faction”—what today we would call “factionalism.” Washington wrote, “The disorders and miseries which result gradually incline the minds of men to seek security and repose in the absolute power of an individual.” But the problem goes deeper. Factionalism erodes unity, and a sense of unity—we’re all in this together—is a prerequisite of compromise. Compromise is what allows a diverse people to organize and get important things done—which is what our Constitution was designed to enable us to do. In her essay, excerpted here, Allen argued that we have lost sight of these basic truths—and must reclaim them.

Danielle Allen is the James Bryant Conant University Professor at Harvard University and the director of Harvard’s Edmond J. Safra Center for Ethics. She is the author of Talking to Strangers: Anxieties of Citizenship since Brown v. Board of Education (2004), Our Declaration: A Reading of the Declaration of Independence in Defense of Equality (2014), and Cuz: The Life and Times of Michael A. (2017).

As I think about our national paralysis, and the reasons for it, I find myself recalling George Washington’s farewell address—the letter he wrote to the American people in 1796 after deciding not to run for a third presidential term. Washington wanted to issue a warning about the dangers of factionalism: what happens when a nation forgets that it is one country—that its citizens and values and processes, whatever the inevitable disagreements, are indivisible. Unity, he believed, had a moral dimension. It was also the best defense against bullying and tyranny, on the one hand, and gridlock, on the other. In recent decades, the fact that Washington owned slaves has obliterated his words from many minds. Hamilton notwithstanding, quoting slaveholders is out of fashion. But here is some of what Washington said:

The alternate domination of one faction over another, sharpened by the spirit of revenge, natural to party dissension, which in different ages and countries has perpetrated the most horrid enormities, is itself a frightful despotism. But this leads at length to a more formal and permanent despotism. The disorders and miseries which result gradually incline the minds of men to seek security and repose in the absolute power of an individual; and sooner or later the chief of some prevailing faction, more able or more fortunate than his competitors, turns this disposition to the purposes of his own elevation on the ruins of public liberty.

This slave owner knew something about liberty and its preservation. It is a paradox that the embrace and maintenance of slavery at the nation’s founding—its original sin—schooled early Americans in the lessons of freedom and equality. They didn’t share freedom and equality with everyone, but they learned those lessons. Washington himself used the language of “freemen” and “slaves” to define the stakes in the contest with Britain. His audience knew firsthand what he meant.

Washington worried deeply about the prospects for the early American republic—and especially that factionalism might destroy it. Faction—an old word, but better than tribalism—captures the idea not just of political parties but of parties ready to fight existentially, as if unto death. Washington spoke about the “artificial and extraordinary force” of faction. In particular, he cited its capacity

to put in the place of the delegated will of the nation the will of a party, often a small but artful and enterprising minority of the community; and, according to the alternate triumphs of different parties, to make the public administration the mirror of the ill concerted and incongruous projects of faction, rather than the organ of consistent and wholesome plans digested by common councils and modified by mutual interests.

Washington’s phrases sometimes creak like the old phaeton that carried him about in Philadelphia, but his meaning remains clear. In Washington’s view, public liberty depended on a process of mutual consultation—adjusting the interests of various parties in relation to one another—with the aim of achieving “consistent and wholesome plans” that could provide stability of direction over the long haul. Our very political institutions, born of compromise and sketched in the Constitution, were this country’s first plan. Washington believed that the business of government—of “public administration”—was to get important things done, that getting things done depended on compromise, that compromise was enabled by a commitment to unity, and that deciding what needed to be done required a long view of the public interest.

Sensible ideas. How did we ever misplace them?


Centuries from now, historians looking back at contemporary America will identify the 1970s as a moment when one era gave way to another. We are accustomed to focusing on how the advent of the digital age disrupted the world as we knew it. But for all of technology’s impact, other changes have been even more disruptive: the changes to our values themselves. The ’70s brought major shifts in the social, political, and economic domains, and brought them all at once. Now, in 2019, we sit on the far side of that transformation.

Consider, first, the social changes. The draft ended in 1973. The introduction of a professional, all-volunteer military put an end to a culture in which most adult men had a shared experience of service. The civil-rights movement, the widespread use of the pill, and the legalization of abortion overturned generations of old social hierarchies. These hierarchies had always served as the informal constitution for the country. They underpinned the legal Constitution by determining who could wield power through political institutions. The wielders of power had primarily been white men. In the final decades of the 20th century, large parts of the country, but not all, embraced the project of replacing the old informal constitution with a new, egalitarian one.

The social changes of the ’70s have sometimes been described as bringing a decline in social capital and social activity. The political scientist Robert Putnam famously made this argument in a book pointedly titled Bowling Alone. He attributed shrinking membership in clubs such as the Rotary and the Jaycees to new commuting patterns, new forms of entertainment (such as television), the influx of women into the workforce, and “generational change.” Yet what actually occurred from 1970 to 1990 was bigger than that. The U.S. rewrote the law of association, mainly through state action and Supreme Court decisions; for instance, new legislation and jurisprudence made the gender-exclusive membership policies of the Rotary, the Jaycees, and other groups illegal. In a society seeking to do away with hierarchies based on race and gender, some organizations had trouble adapting. Many faded away and were never replaced.

The 1970s widened divisions: between rich and poor, experts and nonexperts, veterans and nonveterans, young and old. An age of burden sharing gave way to an age of burden evasion, as those who reaped the benefits of our social arrangements and those who bore their costs separated into distinguishable subpopulations. Warring tribes faced off along yet another dividing line—one group seeking to establish an American social order on the basis of egalitarian norms, the other less sure that this new order was worth creating and in some cases actively working to retain the old social order. A battle was joined over how to define “Americanness.” It is raging now.

Meanwhile—owing in part to the Vietnam War, in part to the Watergate scandal, in part to other factors—the country’s respect for political institutions began a precipitous decline. There were attempts at reform and greater transparency. Political parties made their nominating process more democratic: thumbs-up for open primaries, thumbs-down for smoke-filled rooms. None of this restored the legitimacy or capability of our politics—unexpectedly, it eroded them further. As Jonathan Rauch argued in “What’s Ailing American Politics?” [excerpted in Section II], political reforms in the name of democracy and transparency, combined with the advent of nonstop forms of new media, have done away with the power of political gatekeepers, the pragmatic brokers who organized the political system around relatively moderate, deal-cutting politicians. The result, he argued, has been “chaos syndrome,” where politics is dominated by politicians who do not care about what other politicians think of them and who pander to their base of voters. Senator Ted Cruz of Texas is despised by members of both parties, but that is of little concern to him. He happily read Green Eggs and Ham on the Senate floor as part of a fake filibuster because he knew it would energize his supporters. The disappearance of gatekeepers has undermined the ability of politicians and citizens to “organize”: to convene, to set a direction at the level of broad principle, to negotiate, and ultimately to come to a result that moves everyone—imperfectly and with some noses out of joint—toward an incrementally more desirable outcome. You can write whole tomes describing the basic work of democracy, but the single word organize will save you a lot of ink.

The Declaration of Independence makes use of this vocabulary. With respect to government, the Declaration charges citizens with the task of “organizing its powers in such form, as to them shall seem most likely to effect their safety and happiness.” Citizens of democracies must know how to organize—not merely to secure power but to convert power into usable plans. This was the point made by George Washington, who understood that the only alternative would be “the ill concerted and incongruous projects of faction.” But the democratic work of organizing is something that many Americans are no longer capable of doing. The Tea Party gave us the House Freedom Caucus, whose purpose is obstruction. Progressive activism is giving rise to the Justice Democrats, who hinder governance by insisting on perfection, as if that’s an option. For most people, texts such as the Declaration of Independence and Washington’s farewell address don’t pulse with vitality anymore. We don’t really know them, and we certainly don’t use them as the handbooks they were meant to be—the owner’s manuals that came with our new country.

All of this has left Americans to fight over the issues that are now most politicized—namely, social and cultural issues. Sometimes we do this brutally, when one faction believes it has the power to force its will—imposing draconian measures by referendum, confirming judges by partisan fiat. We have shed the burden of compromise because politics has become factional. This state of affairs was epitomized by a statement from Senate Majority Leader Mitch McConnell: “Winners make policy, and losers go home.”

When we’re not fighting, we slink away, sorting ourselves into ideological and residential enclaves. The Civil War, it turns out, didn’t settle the question of secession; it outsourced the job to moving vans. If you know a zip code, you know whether you’ll find an Applebee’s or a gastropub in the neighborhood; you know whether the waiting list at the library will be for John Grisham or Jesmyn Ward; you know the odds on whether the local confectioner will bake a cake for a same-sex wedding.


The collapse of the old informal constitution, which has not yet been fully replaced, and the hollowing-out of our political institutions have left society disunited, disorganized, and raw. They have also left it defenseless against the consequences of a third major shift—in economic thinking and economic reality, and in who is at the table when decisions are made.

In the late 20th century, economics established itself firmly as the queen of the policy-making sciences. Until then, before the emergence of digital computing power and the spread of numbers-based social science, policy making had been dominated by people trained as lawyers, not as economists. The shift is documented in recent research by the sociologist Elizabeth Popp Berman. The difference in outlook between economists and lawyers is immense. Whereas economists seek out rules that are in theory universal—mathematical principles that apply everywhere, and are blind to context—legal thinking is fundamentally about the institutions of specific societies and about how institutions actually work in specific situations. This is not to say that we can always count on lawyers to see real people or that lawyers went away. The point is that a different way of thinking—emerging first in economics—has ascended across a wide range of professions.

In the utilitarian model that dominates economics, the goal of policy, in an abstract, mathematical sort of way, is to maximize happiness—or, to use the jargon you won’t hear at most dinner tables, “utility”—for members of a society in the aggregate. In its crudest form, the effort to maximize aggregate utility relies on cost-benefit analyses linked not to the conditions of actual communities—small-town Nebraska, working-class Ohio, rural Mississippi—but to broad national measures of expenditure, income, and wealth. This way of thinking, detached from popular debate, has spread worldwide. It is evident in the behavior of central banks and in the demands made of developing countries in return for aid. It is linked to policies intended to enhance the size and efficiency of markets and create an integrated, “frictionless” global economy. The policies have done that, and helped many. But they have also disrupted the world’s labor markets. They helped sink the Rust Belt and contributed to unprecedented levels of mass migration. At the same time, the unregulated behavior of the powerful financial sector brought on the Great Recession of 2008, which devastated ordinary Americans and for which virtually no one was held to account.

The kinds of economists involved most intimately with government and financial institutions by and large don’t notice real people in real places—people who may be losing jobs and falling into despondency, addiction, and suicide. They tend not to see as relevant to their domains of expertise the millions of people on the move and the impact of mass migration on cultural cohesion. In recent years, they overlooked the warning signs indicating limits to the acceptance of their worldview, notably in the very communities suffering because of their economic policies. Elites on both the left and the right, with their well-thumbed passports and multicultural outlook, were no less blind. They did not see the pressures rising. In the immediate aftermath of the 2016 presidential election, I more than once heard an economist friend say something like the following: “We knew globalization would force transformations, but we never thought they would be localized in a specific subset of communities.” And: “We knew that globalization would cause disruption over a 20-year period, but I never thought about what 20 years is like in the life of a specific person or community.” The very language conveys remoteness—the sheer size of the chasm between the World Economic Forum and the actual world. This is what happens when the messy, mediating business of popular politics no longer functions properly—when it no longer serves as the membrane through which ideas must pass before they turn into action.

No one wants to feel buffeted in this way—subject to, and at the mercy of, the will of powerful others, to whom they are invisible. There’s a word we can use to describe a condition in which people feel helpless, whipsawed, and disconnected from the levers of personal and economic autonomy; in which people feel trapped in a particular place and circumstance; in which decisions about one’s life and work and mode of cultural existence seem to rest in the hands of others; in which even personal property seems to be evanescent, or nonexistent, or on loan. It’s an extreme word, but let’s put it on the table. The word is serfdom.


Freighted though it is, serfdom has a modern intellectual history. How did the present state of affairs come about? How were questions of political economy hived off from political debate?

The first part of the explanation lies in the work of Friedrich A. Hayek, an Austrian economist who spent most of his career after 1931 at the London School of Economics and the University of Chicago. In 1944, Hayek published a book called The Road to Serfdom—serfdom being a condition he saw arising through a very different process from the one I’ve described. He had left continental Europe at a time when socialist movements of various kinds were growing in influence, and now believed he saw the same dynamic emerging in the English-speaking world. Since the onset of the Great Depression, American economists had been advocating for a managed-market economy in which the government played an essential regulatory role. They were following in the footsteps of the British political economist John Maynard Keynes, who bent monetary policy and government spending toward a target of full employment. This approach brought strength to labor unions and to legislatures, which played a role in both the short- and long-term management of the economy.

In the rise of Keynesian-style intervention, Hayek heard echoes of his past. He felt that, in some sense, by moving to Britain and then to the U.S., he was moving back through time. These countries, he believed, were heading in the same direction as socialist Europe, but were not too far down the path. Like a time traveler, he would use his experience of later stages of that process—his tussles with the fearsome socialist Morlocks—to warn his new compatriots of what he saw as a dangerous drift toward disaster.

At its core, the argument in The Road to Serfdom was an argument against “planning”—against the very idea that George Washington had celebrated as the essence of public administration. In particular, Hayek sought to ward off any sort of top-down decision making about the economy. His basic worry was that a planned economy, where a government nationalizes the means of production and makes centralized decisions about how to allocate resources, violates human freedom, distorts human activity, and damages economic productivity. The term serfdom captured this concern. Hayek, who won a Nobel Prize in 1974, was right to worry about the dangers of preemptive state control, as the record of communist countries has shown. And he was not some blinkered zealot. His economic theory rested on a rich and evocative account of human beings and human society, and on how to bring about the best outcomes for both on a foundation of freedom. That said, he made a fundamental mistake in thinking that free economic activity within a market system would, by itself, inevitably be good for society as a whole. He disregarded the clear evidence that human beings commonly use economic activity to pursue narrow interests, including domination over one another.

Hayek’s anti-planning position was profoundly influential. It links him, in many minds, with another Nobel laureate, Milton Friedman. Friedman’s work provides the second part of the explanation. Friedman disagreed with much of Hayek’s economic theory, but he considered The Road to Serfdom to be of crucial political importance and adopted its anti-planning gospel wholeheartedly. It dovetailed with his vision of the purpose of monetary policy: to establish a stable frame for free-market transactions. He championed price stability—keeping inflation low—as the primary managerial target, not jobs, wages, equality, or anything else. Friedman’s advocacy of a market economy, monetarism, and stable prices that secure the efficient functioning of markets rested on a set of propositions about human beings that have made their way, often in caricatured form, deep into the heart of our culture.

Friedman wrote with forceful confidence. He envisioned society as “a number of independent households—a collection of Robinson Crusoes, as it were,” united only by relations of free exchange. There are, in his view, no larger national or social goals beyond the aggregate of “the goals that the citizens severally serve.” His faith in the ultimately benign power of the free market—rational individuals pursuing their own ends in a ferment of competition—ran deep. Because competition worked so efficiently—bringing productivity, wealth, and social equilibrium—the job of government must be “to foster competitive markets.” Further, this limited role for government had an immense corollary benefit: “By removing the organization of economic activity from the control of political authority, the market eliminates this source of coercive power.” In conclusion: “A society that puts equality before freedom will get neither. A society that puts freedom before equality will get a high degree of both.”

In time, propositions like these produced a broad policy framework that saw markets as the solution to every problem. The idea was that individual self-interest and wealth creation produce beneficial outcomes for everyone. Markets therefore need to be protected from the kind of political control that was built into Keynesian fiscal policy. Economic policy became a matter for economists and bankers. In the 1980s, the task of achieving price stability came to dominate the work of ever more powerful independent central banks. Institutions such as the Federal Reserve, which have formal independence from politics and considerable insulation from democratic accountability, grew stronger. Legislatures grew weaker. Republican and Democratic administrations alike fell into line. The liberation of markets from politics was understood to be a policy that put an end to planning—even though, of course, “putting an end to planning” is itself a kind of plan. It sets the rules of the game.

The final part of the explanation is the rise of a class of people whose job it is to make decisions for us. When it comes to government by technocrats, the center of gravity was once mainly in the economic sphere. And it must be said: Technocrats—or, better, democratically accountable experts—have a role to play. Central banks are necessary and can’t be operated by plebiscite (though they could indeed be more accountable). But technocrats are not just setting interest rates. The technocratic way of thinking has affected everything from homeownership to the quality of schools, from income distribution to the rights of workers, from insurance rates to the legal system. All of these issues get talked about as if they were still the central domain of politics, and as if elected officials actually dealt with them, but in fact they are being addressed (or left unaddressed) by the technocratic class—the people sometimes derided as policy wonks. The sprawling nature of the modern state may have its roots in political decisions made, willy-nilly, at various points in the past, but its evolution and management are largely detached from politics. Indeed, in America, the modern state is even becoming detached from government, its functions outsourced to private-sector contractors, whose ranks have swelled by more than 1 million since 1996. (The federal payroll has held more or less constant.) Think about defense strategy, environmental regulation, privacy laws, prison building, and on and on: How much of the directional policy making about any of these, much less the implementation of policy, is any longer the outcome of an open, democratic process, as opposed to a few knowledgeable people deciding what to do?

The result of all this has been the erosion of our collective understanding of the work of public administration. We are descending again into a form of servitude, though not the kind that Hayek feared: assignment to this or that role in an excessively planned economy. This time our servitude is to those who have siphoned away the power of ordinary citizens, transferring major decisions about our future from a political to a technocratic realm.


By now, Americans pretty much take for granted modes of decision making that do not involve electoral politics. Looking back at the past two decades, the only major pieces of actual legislation that will seem significant 50 years from now are the Affordable Care Act, if it survives, and the post-9/11 legislation that gave the executive branch enhanced surveillance powers and a blank check to deploy the military—these last two representing a further separation of vital decisions from democratic will. We have come a long way from George Washington’s “wholesome plans digested by common councils.”

I keep coming back to Washington because his emphasis on collective accomplishment is the forgotten half of America’s constitutional ethos. We all remember what the Founders said about electoral procedures, about checks and balances, about the basic rights of citizens. We forget that all these elements were part of a plan. We forget that they were supposed to be tools to help us create something. And we forget that politics and compromise are essential to the act of creation.

Americans must learn how to plan again—to plan in the way George Washington intended. This means recovering knowledge of how to create and operate democratic institutions, and putting experts back in their proper place as advisers to a decision-making people. Our dignity, our freedom, and our public liberty are at stake. Think about what planning in its visionary sense has done for us. The Northwest Ordinance. The canals and railroads. The land-grant college system. The Progressive-era reforms. Social Security and Medicare. The GI Bill. The highway system. The civil-rights revolution. None of them perfect. None of them easy. All of them achieved through democratic negotiation. All of them hard to imagine getting through Congress in today’s climate.

A failure to understand the value of plans grows out of a failure to understand actual human beings—the one thing that true democratic politics, for all its flaws, really does take into account. People are not so much rational actors as purpose-driven ones. Human goals and values cannot always be represented in financial terms. Can you put a price on family? Empowerment? Self-sacrifice? Love? You cannot. As purpose-driven actors, we develop our values and learn to justify them within the context of communities that give our lives meaning and worth. Human moral equality flows from the human need to be the author of one’s own life.

As a measure of human flourishing, empowerment is more important than wealth. Wealth is merely one possible source of empowerment. It cannot buy what makes nations flourish: social cohesion, freedom, and healthy institutions. Social cohesion is created by cooperation, and cooperation occurs only if individuals have equal standing. The role of government is not to stay out of the way of markets. It is to secure the rights that undergird empowerment, cohesion, and participation. Securing these rights requires combatting monopolies. We understand what monopoly power means in the economic sense. But the issue of monopoly power applies to the political and social domains, too. Gerrymandered districts create monopolistic political power. Our current approach to education funding, which tightly links it to property taxes, has allowed the socioeconomically advantaged to establish a near monopoly on genuine educational opportunities. People with money enjoy a position of privilege in the legal system. Corporations enjoy one when it comes to the quiet tweaking of bureaucracy and regulation. A proper role of government—nearly forgotten today, but the overriding concern of the Founders—is finding ways to prevent undue concentrations of power wherever they occur. Power tends toward self-perpetuation; where it is left undisturbed, it will draw further advantages to itself, shut out rivals, and mete out ever-bolder forms of injustice.

Undue concentrations of power sow division and factionalism. When Washington described public liberty as depending on the citizenry’s ability to ward off the despotism of faction, he was offering a profound insight: The precondition of democratic decision making is unity. If a political system that relies heavily on majority rule cannot keep minorities affixed to it through loyalty, then every fresh, durable minority faction that comes into being will bring with it the threat of breakup. A first secession will provide grounds for a second, and on and on; the polity will face a threat to its very existence. The United Kingdom is living through a version of this nightmare right now.

From antiquity through the formation of the American republic and beyond, those who have looked closely at the question of what is required to maintain free institutions—from Livy to Machiavelli to Washington to Lincoln—repeated one lesson over and over again: Choose unity. A commitment to unity—an unswerving insistence on unity—induces citizens to seek out ways of adapting their purposes so as to get something done. Because if unity is not negotiable, then there is no other choice. Technocrats are oblivious to this. If you accept Hayek or Friedman, unity recedes as an essential factor—in fact, disunity is inevitable. But if you emphasize unity—and stipulate that it cannot be sacrificed—then it becomes a democratic tool. It encourages all sides to compromise. It is the opposite of executive decision making fueled by the self-interest or anger of one part of the electorate.

Compromise is what allows us to stay together in the space we share; discard it, and we’re all condemned to our own private Bosnias. The goal is not unanimity; that is neither achievable nor desirable. Compromise entails embracing not Mitch McConnell’s outlook—the winners make policy and the losers go home—but rather the view that the winners deserve a leadership role in steering the conversation toward the “wholesome plans” that Washington spoke of. The Old English root of the word whole means “healthy.” That is what we seek—to be a healthy people.


In America, we have drifted far from that vision—to the point where many have no experience of its promise, much less its necessity. The findings of Yascha Mounk and Roberto Stefan Foa, published in the Journal of Democracy, are startling. Ask people born in the 1930s whether they believe it is “essential” to live in a democracy, and 72 percent will answer yes. Ask that same question of people born in the 1980s, and only 30 percent will say yes. Nearly half of Millennials in a recent Pew survey said they would rather be governed by “experts” than by elected officials. This is more than a yellow warning light. It is a sign of catastrophic breakdown.

Americans today, across the political spectrum, have plenty of ideas about how to address our own great national challenges—health care and immigration, for example. But we have arrived at a point where no issue is as important as restoring the institutions of democratic participation—enabling those institutions to recognize the people’s will and channel it toward a common purpose. Yes, one faction or another might strong-arm a “solution” to some grave problem, but the solution will never be seen as legitimate—and will never prove durable—unless the decision-making process itself is seen as legitimate. I have strong views about what the nation should be doing when it comes to education, inequality, and economic development. But reviving political participation is by far the most urgent priority. If we do not address the corrosion of our democracy itself, we will have lost the essence of the American experiment. Nothing else will matter.

The challenges of participation and justice won’t be met by markets working independently of politics, and they won’t be met by the triumph of one faction over another. No great challenge can be met that way. As a nation, we have been called to be our best and most united selves by inspirational goals. The salvation of the democratic experiment must become such a goal. In the first half of the 20th century, reciting the Pledge of Allegiance on thousands of days, Americans in a segregated country spoke of being “one nation, indivisible.” The language papered over a different reality. In 1954, Congress split the phrase by adding the words under God and divided the country along yet another line. The simple fact is we have lost the shared vocabulary that should bind us all as Americans. We fight over words like patriotism, solidarity, loyalty. Yet there is a word that defines our relationship. Lincoln knew what it was. The word is union. In a political sense, the word points to something concrete. It means talking honestly, fighting fairly, and planning together. It means “Choose unity.”

It’s time for all of us to become citizens again.