And God said, “Let there be light,” and there was light. And God saw that the light was good; and God separated the light from the darkness.
—Gen 1:3–4
Light is the beginning of creation. Light grows our food. Light helps us see. Light warms our faces in the summer.
Dark is a very different matter. In Scripture, the “outer darkness” is a place we want to avoid. The dark, even when it’s thick with silence and scents and romance, can also be thick with carnivores—a fact that imprinted itself very early on our species’ memory. Put simply: Light is good. Dark is not. Thus, moderns speak of the Dark Ages and the Enlightenment, the light of reason and the darkness of ignorance. It’s hard to imagine anyone confusing the two. But given the right circumstances, odd things can happen.
Here’s an example. It’s night. We’re walking down a corridor. The lights are intense, so we squint. Then they suddenly go black—all of them. For a few moments, we’re blind. After all, it’s dark. But just as our eyes adjust to the dark, the lights snap back on. And now we’re squinting again. The jolt from bright to pitch black to bright again is a vividly conscious experience.
Now imagine we’re walking down the same hallway. This time, the lights dim slowly. This time, we hardly notice. By corridor’s end, the dark is so deep we can barely see the signage on the walls. But the change has been subtle, the fading of the light gradual, and we humans are adaptable. We can adjust. In fact, we can survive in the dark a long time if we need to or choose to. We can even learn to prefer it. Rather like the Morlocks.
That, arguably, is the story of our past three hundred years—not a history of the whole world, but a history of our world, the singular Western world that shaped the modern era, the world we call our own. Consider the following:
In his masterwork After Virtue, the philosopher Alasdair MacIntyre starts with a thought experiment. It goes like this. Picture a world where a massive, unforeseen disaster has wrecked the environment. The survivors blame science. Mobs lynch physicists. Books are burned. Instruments are trashed. Labs and computer networks are torn to pieces. Teaching and practicing science are banned. Even the memory of science is attacked, and where possible, stamped out.
Time passes. Eventually, enlightened minds try to resurrect science. But all they have to work with are pieces: the half-memories, partial theories, and fragments of a much larger (but lost) organic body of knowledge. The new scientists forge ahead anyway. They speak confidently and get some fine results. They use many of the same words crucial to the old science—neutrino, mass, velocity, and so on—but without fully understanding them. They also push competing and incompatible ideas about what the old science was and meant, and how science should be conducted. They share a common vocabulary. But the same words mean different things. The result is disorder—a disorder no one can heal or even adequately recognize, because no one possesses the older body of knowledge that has been lost.
For MacIntyre, the point of the fairy tale is simple. Today, “in the actual world we inhabit,” he writes, “the language of morality is in the same state of grave disorder as the language of natural science” in the fantasy world he describes. “We continue to use many of the key expressions” of traditional morality, but “we have—very largely, if not entirely—lost our comprehension, both theoretical and practical, of morality.”1 In effect, “the language and the appearances of morality persist, even though the integral substance of morality has to a large degree been fragmented and then in part destroyed.”2
What that means is this: The moral conflicts that permeate our public policy debates are endless and irresolvable because our culture no longer has a rational, mutually accepted way of getting to moral agreement. The answer of the liberal state (including our own) to these stubborn disputes is to remove morality to the private sphere. But that course of government is itself a value judgment, a morally loaded act disguised as neutrality. All law and all public policy embody someone’s idea of what we ought to do, including the notion that we “ought” to keep personal moral beliefs out of public debates.
The underlying assumption of our public discourse today is that facts and values are radically distinct. “The plane crashed” is a statement of fact, and therefore “real.” Crash evidence is tangible. Nobody can argue with debris. On the other hand, “Don’t kill the disabled” is a statement of value. It’s an expression of opinion and sentiment—so the logic goes—and therefore not “real” or “true” in the same solid sense. To be sure, the importance of protecting disabled persons is an admirable and widely shared view; surely that’s obvious. But some people might disagree. Some people might argue quite sincerely that disabled persons are a waste of precious resources, and we’d be better off without them. Some people did argue that way in Germany in the last century, with great effect.
Of course, for most of us, murdering the disabled, starving the poor, or deliberately targeting innocent civilians in war is an appalling idea, a crime against humanity. But apparently sucking the brains out of unborn children, or trading in their body parts, is not so appalling. It may even be “good,” because we already do it. We not only do it, but we also build a fortress of pious-sounding chatter about reproductive rights to surround and bless it.
This is the kind of obscenity that comes from reducing a nation’s politics to a clash of allegedly equal values. What it masks is a transfer of power from proven traditions of moral wisdom to whoever can best lobby the media, the courts, Congress, and the White House. It’s the reason MacIntyre warned that today’s barbarians “are not waiting beyond the frontiers; they have already been governing us for quite some time. And it is our lack of consciousness of this that constitutes part of our predicament.”3
* * *
AFTER VIRTUE IS A challenging work. It’s not for the casual reader. But if we want to understand ourselves as a nation, some of its key ideas are worth noting. As we’ve already seen, America is a child of both biblical and Enlightenment spirits. Its roots are therefore tangled. And they go back a long way.
The medieval Europe that preceded modern times was the product of classical and Christian thought. Aristotle, the ancient Greek philosopher, played a large role in shaping it. For Aristotle, everything, including man, has an inherent nature or purpose. A man lives a good life when he acts in accord with that nature. In Aristotle, “the relationship of ‘man’ to ‘living well’ is analogous to that of ‘harpist’ to ‘playing the harp.’”4 As the harpist disciplines herself through practice to play the harp more beautifully, so also man cultivates the virtues—courage, justice, mercy, humility, and so on—to become more truly human.
Moreover, in the classical tradition, to be a human being involves fulfilling certain roles, each with its own distinct purpose: husband, wife, father, mother, soldier, philosopher, citizen, servant of God. And it “is only when ‘man’ is thought of as an individual, prior to and apart from all roles,” that the idea of “man” ceases to be a meaningful, purpose-filled concept (emphasis added).5 In other words, for Aristotle, what it means to be human is not a matter of self-invention; it depends on our network of human connections and responsibilities.
Aristotle gave Thomas Aquinas the tools for articulating a medieval Christian civilization that combined both reason and biblical faith. For Christians, man does indeed have a purpose. Scripture reveals that purpose and provides the foundation on which human reason builds. We were made to know, love, and serve God in this world. We’re also meant to be happy with him in the next, to show love to others, and to care for the world placed in our keeping. God is the Author and sustainer of creation. Thus all things in nature are a gift. They have a God-given meaning prior to any human involvement.
Man is part of creation but endowed with special dignity. Every man is a free moral agent, responsible for his personal choices and actions. But no man exists in isolation. Every man is also an actor in a much larger divine story, and he’s shaped by his social relationships and duties to others. Thus the purpose of knowledge is to understand, revere, and steward the world, and to ennoble the people who share it—and thereby to glorify God.
As After Virtue notes, the Enlightenment thinkers of the eighteenth century were diverse. Generalizing about their beliefs can be dangerous. But most wanted to keep a Christian-like morality, purified of “superstition” and based on reason. They also wanted to discard any approach to nature based on Aristotle or Aquinas. For the Enlightenment, nature is simply raw material. It has no higher purpose. Man alone gives it meaning by using it for human improvement. Thus the goal of knowledge is to get practical results. And man is not a bit player in some divine Larger Story. He’s a sovereign individual who creates his own story.
As MacIntyre shows, the Enlightenment tried to keep the moral content of Christianity while eliminating its religious base. But it doesn’t work. The biblical grounding can’t be cut away without undermining the whole moral system. Every attempt to build a substitute system has suffered from incoherence, no matter how reasonable sounding. And bad ideas have consequences. The resulting moral confusion has trickled into every corner of our daily life.
Simply put, once a higher purpose and standard of human behavior are lost, moral judgments are nothing but personal opinions. In a nation of sovereign individuals, nobody’s opinion is inherently better than anyone else’s. All moral disagreements become rationally irresolvable because no commonly held first principles exist.
This post-Christian confusion—MacIntyre calls it “emotivism”—now shapes American public life. In such an environment, the purpose of moral discourse, he writes, “[becomes] the attempt of one will to align the attitudes, feelings, preferences and choices of another with its own.” Other people become instruments to be dominated and used. They’re means to achieve our ends, not ends in themselves.6 As a result, most of our moral debates about public policy never get near the truth of an issue. They’re exercises in manipulation. In MacIntyre’s words:
[Each] of us is taught to see himself or herself as an autonomous moral agent; but each of us also becomes engaged by modes of practice, aesthetic or bureaucratic, which involve us in manipulative relationships. Seeking to protect the autonomy that we have learned to prize, we aspire ourselves not to be manipulated by others; seeking to incarnate our own principles and standpoint in the world of practice, we find no way open to us to do so except by directing toward others those very manipulative modes of relationship which each of us aspires to resist in our own case. The incoherence of our attitudes and our experiences arises from the incoherent [Enlightenment-born] conceptual scheme which we have inherited.7
For MacIntyre, this incoherence explains three chronic patterns in our public life: the appeal to rights, the eagerness to protest, and the appetite for unmasking. Aggrieved parties demand their rights, which are allegedly self-evident (despite the absence of any agreed-upon grounding for rights). They protest the attack on those rights by oppressive structures and rival parties. And they seek to unmask the wicked designs of their opponents. All of which feeds a spirit of indignation and victimhood across the culture.
In a world of bickering individuals, the job of government becomes managing conflict. And since, in a seemingly “value-neutral” state, no higher moral authority can be appealed to, government becomes the ultimate referee of personal appetites and liberties and justifies itself by its effectiveness. Effectiveness demands a managerial class of experts, as MacIntyre notes:
Government insists more and more that its civil servants themselves have the kind of education that will qualify them as experts. It more and more recruits those who claim to be experts into its civil service … Government becomes a hierarchy of bureaucratic managers, and the major justification advanced for the intervention of government in society is the contention that government has resources of competence which most citizens do not possess.8
Never mind that many of the government’s expert managers are in practice incompetent. By its labyrinthine size, bureaucracy interferes with its own accountability. The politics of modern societies swings between extremes of personal license and “forms of collectivist control designed only to limit the anarchy of self-interest … Thus the society in which [Americans, among others] live is one in which bureaucracy and individualism are partners as well as antagonists.” They’re locked in a permanent embrace. “And it is in the cultural climate of this bureaucratic individualism that the emotivist self is naturally at home.”9
Running a society of warring, emotivist selves, of course, requires two things from political leaders: the claim of value neutrality and the reality of manipulative skill.10
* * *
THE WORLD OF AFTER VIRTUE shows itself in at least five major features of current American culture. They reinforce one another. And they have a big impact on the course of our shared life.
The first feature is the role of the social sciences (demography, anthropology, political science, and similar disciplines). Parents often complain that America’s education establishment abuses the classroom and misuses their children by preaching new moral orthodoxies on a whole range of issues like gender identity. The courts and legal profession then enforce those new orthodoxies. But it’s the social sciences that actually help create them. Among scientists, social scientists tend to be the least religious. Exceptions do exist. Christian Smith and Mark Regnerus, both of them distinguished social researchers, are committed Christians. So are many others. But as a group of disciplines, the social sciences tend to reflect agnostic or atheist thought committed to a particular brand of social reform.
This should surprise no one. By their nature, the social sciences objectify the human person. They make man a specimen of study and desacralize him in the process. They seek to identify and apply scientific laws to human behavior, the better to understand (and control) it, and they assume that such laws exist. It’s no accident that the gambling industry has used behavioral psychology to sharply increase its profits. Casinos and a new generation of electronic gaming machines are deliberately crafted to produce “addiction by design” in the gamblers who use them—an economic model now spreading to other industries.11
The social sciences also tend to dismiss religion as a man-made illusion, a projection of purely human hopes and needs. This is clearly the case with Freudian psychology. The same is true in the work of Auguste Comte, the father of sociology.
Sociology is a useful example. As Christian Smith writes, “Sociologists today are disproportionately not religious compared to all Americans … And a great deal of sociology is devoted to showing that the ordinary world of everyday life as it seems to most people is not really what is going on—in short, to debunking appearances” (emphasis in original).12
In the words of Smith,
American sociology as a collective enterprise is at heart committed to the visionary project of realizing the emancipation, equality and moral affirmation of all human beings as autonomous, self-directing, individual agents [who should be] out to live their lives as they personally so desire, by constructing their own favored identities, entering and exiting relationships as they choose, and equally enjoying the gratification of experiential material and bodily pleasures (emphasis in original).13
In practice, many different forces shape American sociology. They range from Enlightenment liberalism to the Marxist tradition. They include the progressive social reform movement of the early twentieth century and John Dewey’s pragmatism. They also include the lessons of therapeutic culture, the civil rights struggle, community organizing, the 1960s sexual revolution, and LGBT activism.14
But why does any of this matter? It matters because the social sciences, especially psychology and sociology, work in secular society as a new kind of priesthood.15 As critics like to argue, the social sciences are really a disguised form of storytelling, not science. They lack the rigor and credibility of natural sciences such as physics and chemistry. But they do bring a set of tools to social problems that produce useful statistics. And their data, in skilled hands, can seem to have moral weight and prove some otherwise dubious things.
In their effect, as Alasdair MacIntyre saw, the social sciences serve as a means of social control “to sustain bureaucratic authority.”16 And as a result of their influence, as the historian Christopher Lasch said, “Today the state controls not merely the individual’s body, but as much of his spirit as it can preempt; not merely his outer but his inner life as well; not merely the public realm but the darkest corners of private life, formerly inaccessible to political domination.”17
A second key feature of current American culture is the role of education. A friend of mine is a well-known economist at a leading American university. He’s also the gatekeeper for an elite doctoral program in his field. Asked once what he valued most in candidates for his program, he said, “an undergraduate degree in Classics.” Homer and Virgil, of course, have very little to do with things like debt-deflation theory. But my friend’s reasoning is, in fact, quite shrewd.
Since economics is a human (i.e., social) science, its practitioners should first know how to be actual human beings before learning their specialized skills. A formation in the classics or any of the other humanities is an immersion in beauty and knowledge. It has no utility other than enlarging the soul. But that achievement—the ennobling of a soul, the enlarging of the human spirit to revere the heritage of human excellence and to love things outside itself—is something no technical skill can accomplish.
As Leo Strauss once wrote, “liberal education is concerned with the souls of men, and therefore has little or no use for machines … [it] consists in learning to listen to still and small voices and therefore in becoming deaf to loudspeakers.”18 A liberal education—a balanced experience of the humanities, art, music, mathematics, and the natural sciences—is designed to form a mature “liberal” adult; liberal in the original sense, meaning free as opposed to slave. Thus for Strauss, “liberal education is the counter-poison … to the corroding effects of mass culture, to its inherent tendency to produce nothing” but specialists without vision or heart.19
Scholars like Anthony Esolen, Allan Bloom, Neil Postman, Matthew Crawford, and Alasdair MacIntyre, each in his own way and for different reasons, have all said similar things. For all of them, the point of a truly good education, from pre-K to graduate school, is to form students to think and act as fully rounded, mature, and engaged human beings. In other words, as adult persons of character.
As Matthew Crawford puts it, “Education requires a certain capacity for asceticism, but more fundamentally it is erotic. Only beautiful things lead us out [of our addictive self-focus] to join the world beyond our heads.”20 But the dilemma of postmodern life is that we can’t agree on what a fully rounded, mature “human being” is—or should be. The fragmentation in American culture runs too deep. Recent battles over imposing gender ideology in school curricula and rewriting and politicizing civics and American history textbooks simply prove the point. So does the “progressive” intellectual conformism in so many of our university faculties.
Meanwhile, as American student skills decline in global comparisons, more and more stress is placed on developing STEM (science, technology, engineering, and math) competence at earlier student ages. There’s nothing wrong with this in principle. Technical skills are an important part of modern life. But as we’ve already seen, American trust in the promise of technology is robust and naive to the point of being a character flaw. And a real education involves more profound life lessons than training workers and managers to be cogs in an advanced economy.
We tend to forget that “everything that human beings are doing to make it easier to operate computer networks is at the same time, but for different reasons, making it easier for computer networks to operate human beings.”21 We also tend to forget that our political system, including its liberties, requires a particular kind of literate, engaged citizen—a kind that predates the computer keyboard.
A third feature of our common life is the role of the law, the courts, and the legal profession. As we’ve already seen, America is an invented nation, a legal contract. Americans are bound together by neither blood nor ancestry. We’re a “people” only in the sense that we share allegiance to a fundamental law—the Constitution—and the public institutions and procedures it prescribes. This makes our lawmakers, courts, and the legal profession arbiters of justice. It also makes our political allegiance more tentative and less organic. And in the world of After Virtue, it also makes all of us warriors in an ongoing guerrilla conflict over what “justice” is, and who gets it, and when and how.
“The strangeness of our day,” notes the political scientist Robert Kraynak, “consists in a strong moral passion for the virtue of justice sitting alongside a loss of confidence in the very foundations of justice, and even an eagerness to undermine them” by leading secular thinkers.22 Kraynak goes on:
[T]he crucial requirement for human equality is a conception of human dignity, which views human beings as having a special moral status in the universe, and individuals as having unique moral worth entailing claims of justice.
What is so strange about our age is that demands for justice are increasing even as the foundations for those demands are disappearing. In particular, beliefs in man as a creature made in the image of God, or an animal with a rational soul, are being replaced by a scientific materialism that undermines what is noble and special about man, and by doctrines of relativism that deny the objective morality required to undergird human dignity.23
In effect, secular thinkers live off the capital of religious beliefs they reject. “The ‘faith’ of these modern thinkers in human dignity,” says Kraynak, is actually a convenient, parasitic laziness. It “enables them to have respectable moral commitments while avoiding the hard work of actually establishing foundations for them, whether in the moral order of nature or the revealed knowledge of God.”24
The problem with a justice-related idea like equality is that, divorced from other public virtues and networks of obligation, it acts like a bulldozer. It flattens everything in its path. When it infects Supreme Court decisions, it licenses an official bigotry like Justice Anthony Kennedy’s opinion in United States v. Windsor. In striking down the federal Defense of Marriage Act (DOMA), Kennedy argued in Windsor that by passing a law rooted in centuries of moral tradition, Congress had acted with a “bare desire to harm” motivated by malice.
So much for the will of the people and their elected representatives.
But disregard for “the will of the people and their elected representatives” is hardly limited to the courts. Administrative law and especially executive orders were the preferred tools of the Obama administration and others before it. And they embody a kind of government absolutism with roots in European monarchy. In the words of Philip Hamburger, the distinguished Columbia Law School scholar,
Administrative adjudication evades almost all of the procedural rights guaranteed under the Constitution. It subjects Americans to adjudication without real judges, without juries, without grand juries, without full protection against self-incrimination, and so forth … [A]dministrative courts substitute inquisitorial process for the due process of law—and that’s not just an abstract accusation; much early administrative procedure appears to have been modeled on civilian-derived inquisitorial process. Administrative adjudication thus becomes an open avenue for evasion of the Bill of Rights …
Rather than speak of administrative law, we should speak of administrative power—indeed, of absolute power or more concretely extra-legal … power. Then we at least can begin to recognize the danger.25
A fourth feature of our current culture is the character of our emerging young adult leadership. Writing in April 2001, David Brooks offered a portrait of Ivy League students as America’s “meritocratic elite.” He focused on young people at Princeton, but he saw the same pattern at other leading universities. These elite millennials were very different from their boomer parents. They were disciplined, hardworking, and courteous. They had little interest in politics. They respected authority. They were generous in their community service, upbeat about the future, and intensely driven to achieve.26
Children of privilege from the start, they lived at a unique moment. They had no memory of Vietnam. No memory of deep social conflict. The Cold War was over. We had won it. Prosperity was growing. The United States was the world’s only great power. The possibilities for personal success were bright. Brooks found these students open, sensible, and deeply likable. It’s what he didn’t find that bothered him.
The students he met lacked the basics of what earlier Ivy League generations would have learned as a matter of course. Things more vital even than material success. Things like a sense of moral gravity and duty, a distrust of luxury, “a concrete and articulated moral system,” and some understanding that “life is a noble mission and a perpetual war against sin, that the choices we make have consequences not just in getting a job or a law school admission but in some grand battle between lightness and dark.”27
Five months after the Brooks article, the Twin Towers were attacked. Wars in Iraq and Afghanistan followed. In 2008, so did a global financial meltdown. The world changed. And harsher times led to a very different environment.
What we have now at elite universities, according to one former Yale scholar, is a “system [that] manufactures students who are smart and talented and driven, yes, but also anxious, timid and lost, with little intellectual curiosity and a stunted sense of purpose: trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it.”28
And what about students outside the Ivy elite? Comparing his own talented students to the ones David Brooks had encountered thirteen years before, the Villanova scholar Mark Shiffman was struck not by their lack of curiosity, but by their fear:
When the kid at the next desk might out-compete me, edging me out of the path to economic security, then the hope that we may prevail together gives way to the fear that I will be the one who fails. When the specter of shrinking prosperity increases competition for scarce opportunities and engenders doubt that I will do as well as my parents, that fear intensifies … Our fear has become a pathological condition, a desperate need to bring the future under control. And we seek therapy from colleges and universities, the therapy of cumulative achievement along clearly marked pathways to success.29
College isn’t cheap. As costs rise and the college success-therapy model fails, the fear of many young adults goes deeper. That fear, fused with the lack of a demanding, morally coherent vision for university life, helps drive the confusion, anger, hypersensitivity, and spirit of entitlement that now too commonly mark student life.
And that leads to a fifth and final feature of our current national life. But first some background.
* * *
WRITING ABOUT THE PREVALENCE of scientific fraud in 2016, William Wilson, a Bay Area software engineer, noted that “When a formerly ascetic discipline suddenly attains a measure of [policymaking] influence, it is bound to be flooded by opportunists and charlatans, whether it’s the National Academy of Science or the monastery of Cluny.”
In practice, earnest-sounding junk thought from opportunists can infect every element of society, not just science. And junk thought—from rewriting history, to inventing new narratives of oppression, to sex and gender studies that claim to prove the implausible—is the human intellect weaponized to serve political goals. Especially the goal of silencing different views.
Culture warriors come in all shades of opinion, including the most ostentatiously tolerant, progressive, and forward-thinking. In fact, to borrow a warning from the great French writer Charles Péguy, we may never fully know the acts of deceit and cowardice that “have been motivated by the fear of looking insufficiently progressive.”30
So this is the fifth key feature of our common national life today: malice wrapped in the language of tolerance, sensitivity, and rights. It consists in an appetite to use power not simply to prevail in political debate, but to humiliate and erase dissent, and even its memory, in reworking the cell structure of society.
Examples range from the strange to the bitter. They include efforts to scrub Confederate public monuments from the South, and “progressive” academic attacks on American Founder Alexander Hamilton (of Hamilton Broadway musical fame). They include companies like Apple and Salesforce.com that attack religious liberty legislation in states nationwide and “use economic threats to exercise more power over public policy than the voters who use the democratic process.” And they include selective shaping of the Advanced Placement U.S. history framework by the College Board, controversial efforts to “fix” democracy by major philanthropies, and bitter posthumous attacks on “unprogressive” public leaders.31
There’s more. In 2011, the Obama administration stripped funds from the U.S. bishops’ Migrant and Refugee Services anti-human-trafficking program. MRS was a highly regarded leader in its help to victims of sex trafficking. But it was defunded because the bishops declined to provide abortion and contraceptives as part of MRS services. The money was reassigned to groups ranked lower in quality, but more ideologically compliant, by the same White House.
The same intolerance marked the administration’s fight to coerce abortion and contraceptive services as part of national health care. It tenaciously refused reasonable compromise on exemptions for religiously affiliated providers and organizations and deliberately sought to break any opposition—prompting a sardonic Wall Street Journal editorial renaming the Little Sisters of the Poor “the Little Sisters of the Government.”32
The pattern of a White House calling for compromise and national unity, and then making both impossible, is hardly the sin of any one party. Nor is it new. But the scope and the nature of the damage today are. Because unlike fifty years ago, and many other serious moments of national confusion, the country’s character is now fundamentally different.
For many Americans today, the life of the nation has become something remote. It offers too little sense of a shared history or national purpose; too little reasoned debate; too little civility (despite endless calls for it); too little moral gravity or continuity with the past; too little value to warrant personal engagement; and too few reasons for anyone to risk his or her life in defending through military service what many see as the public-square equivalent of a big-box discount store.
It takes an extraordinary faith in democracy and the integrity of higher education to share in America’s tradition of optimism when, day after day, the most distinguished national newspaper of record offers stories like “Colleges Show Their Lobbying Might” in the same pages as “The Rise of the College Crybullies,” “A Campus Mayhem Syllabus,” “I Was Disinvited on Campus,” “Speechless on Campus,” “Tolerance, Free Speech Collide on Campus,” “A Campus Crusade Against the Constitution,” “Bonfire of the Academy,” “Whatever Happened to Religious Freedom,” “The First Amendment Needs Your Prayers,” “When the College Madness Came to My Campus,” “Radical Parents, Despotic Children,” “The Climate Police Escalate,” and “Yale’s Little Robespierres.”33
As Greg Lukianoff and Jonathan Haidt wrote in 2015, “A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas and subjects that might cause discomfort or give offense.” Faculties and administrators—adults who should know better but, lacking a moral center of gravity, don’t—have often buckled under the pressure. And how does that fit with the genuine greatness of the American experiment? Thomas Jefferson answered it best at the founding of the University of Virginia: “This institution will be based on the illimitable freedom of the human mind. For here we are not afraid to follow truth wherever it may lead, nor to tolerate error so long as reason is left free to combat it.”34
We’ve come to the end of a long, dimming corridor. And as we do, we might remember some lessons from the last century.
In 1935, Sinclair Lewis wrote It Can’t Happen Here, the tale of a populist candidate who runs for president on a platform of patriotism, deep economic and political reform, and traditional values. Once elected, he imposes a homegrown American fascism.
Five years later, in 1940, Arthur Koestler told the story of Rubashov, one of the original “Old Bolsheviks,” a man arrested and condemned to death by the communist revolution he spent his life fighting for. Koestler’s Darkness at Noon is one of the great novels of the twentieth century. And it has one of the era’s greatest themes:
Rubashov wandered through his cell. It was quiet and nearly dark. It could not be long before they came to fetch him. There was an error somewhere in the equation … It was a mistake in the system; perhaps it lay in the precept that until now he had held to be incontestable, in whose name he had sacrificed others and was himself being sacrificed, in the precept that the end justifies the means. It was this sentence that had killed the great fraternity of the Revolution and made them all run amuck. What had he once written in his diary? “We have thrown overboard all conventions; our sole guiding principle is that of consequent logic; we are sailing without ethical ballast.”
Perhaps the heart of the evil lay there. Perhaps it did not suit mankind to sail without ballast. And perhaps reason alone was a defective compass, which led one on such a winding, twisted course that the goal finally disappeared in the mist. Perhaps now would come the time of great darkness.35
Fascism, communism, complicated ideologies of every shape and sort: Today in pragmatic, postmodern America, these clumsy, antique systems of thought are museum pieces. Misshapen children of the Enlightenment. Words from a dark time, long past. They clearly don’t belong in any sane discussion of advanced liberal democracies. They can’t happen here. And it’s quite true: They really can’t happen here.
But there’s more than one way to skin a cat, as Tocqueville saw long ago. The peculiar despotism innate to democracy
extends its arms over society as a whole; it covers its surface with a network of small, complicated, painstaking, uniform rules through which the most original minds and the most vigorous souls cannot clear a way to surpass the crowd; it does not break wills but it softens them, bends them, and directs them … it does not tyrannize, it hinders; compromises, enervates, extinguishes, dazes, and finally reduces each nation to being nothing more than a herd of timid and industrious animals, of which the government is shepherd.36
To put it another way: The Enlightenment’s crippled children have a gentler, more successful, and far more congenial sibling. Us.