10

TRIBALISM TODAY

Nationalism, Populism, and Identity Politics

The old American ideal is that all men are created equal and are masters of their fate, captains of their souls. It was, in the words of Barack Obama, “a creed written into the founding documents that declared the destiny of a nation.”1 It took a good deal of time for that concept to include women, blacks, and other marginalized groups. But one of the reasons why women and blacks succeeded in changing the Constitution and American attitudes is that they were appealing to that ideal, not rejecting it. It is always easier to win an argument when you can truthfully tell your adversary he’s right in what he believes, just wrong in how he’s applying the principle.

But America has other ideals too. And sometimes these ideals can be in conflict. The other side of the coin to the conviction that the individual is the master of his fate is the idea that every American should be an American. There is a healthy tension between these two principles. The German-Americans of the eighteenth and nineteenth centuries wanted to be Americans, but they did not want to abandon their culture and language wholesale either. It is the difference between enforced ghettos and free communities. No one should be barred from fully participating in the American experiment on account of ethnicity, race, or religion, but no one should be forced to forsake their heritage either.

The key to resolving this tension was twofold: liberty and time. Giving individuals the freedom to make these trade-offs on their own terms and giving society time to let the melting pot work its magic. That magic depended on many things, but none was more important than simple good manners. America has a culture as deep and rich as any other society, but Americans tend to think otherwise. When they travel abroad, they rub against the other cultures without realizing that the friction comes from the fact they brought their cultural expectations with them. In America, it is simply good manners to take individuals as you find them and not as representatives of some abstract group or classification. Accepting that is part of becoming an American. In other parts of the world, and for most of human history, it has been natural to treat individuals as a member of their tribe. In America, you’re supposed to judge people by their character.

This cultural norm is as much a product of the Enlightenment as our Constitution, and arguably just as essential. The whole Enlightenment-derived idea behind the American founding is that America can turn Frenchmen, Italians, Germans, Chinese, Arabs, etc., into Americans—that is to say, a new people dedicated to the principles of the Founding and the culture of liberty it birthed. The appeal of this vision attracted millions of people from around the world eager to escape the deadweight of history, class, and caste in their native countries. My brilliant friend, the late Peter Schramm, liked to tell the story of his family’s escape from Hungary in the aftermath of its failed revolution against Russian Communists:

“But where are we going?” I asked.

“We are going to America,” my father said.

“Why America?” I prodded.

“Because, son. We were born Americans, but in the wrong place,” he replied.2

We were once taught that, when America deviated from this ideal, it was a shameful betrayal of our best selves. When, for instance, the Supreme Court upheld the Chinese Exclusion Act in 1889, it agreed with the government that the Chinese “remained strangers in the land, residing apart by themselves,” and that they were unlikely “to assimilate with our own people or to make any change in their habits.”3 I was always taught this was a dark moment in American history.

To be sure, American students are still taught that. We still believe that the government shouldn’t exclude some groups based upon arbitrary prejudices. But the rest of the melting-pot formula is breaking down in three ways. First, we are now taught that the government should give special preferences to some groups. Second, as a cultural imperative, we are increasingly told that we should judge people based upon the group they belong to. Assimilation is now considered a dirty word. And last, we are taught that there is no escaping from our group identity.

The ideologies of multiculturalism and identity politics contain myriad contradictions and inconsistencies, but as a broad generalization it is impossible to deny that our culture is shot through with an obsession with race, gender, and ethnic essentialism. “At a very young age, our children are being encouraged to talk about their individual identities, even before they have them,” writes the political theorist Mark Lilla. “By the time they reach college many assume that diversity discourse exhausts political discourse, and have shockingly little to say about such perennial questions as class, war, the economy and the common good.”4

If you can’t see this, you are a rare bird, given that the current debate about the explosion of identity politics isn’t whether or not it exists but whether it is good or bad. For that reason I will not drown the reader with page after page of horrible or hilarious stories from American campuses and left-wing media outlets (though the curious reader will find in the endnotes a micro-fractional list of examples that is at least somewhat illustrative of the point).5 But I will offer a few examples that advance my argument that this turn to the tribalism of identity politics is poisonous to the American miracle.

Before I begin, I should recap the argument of this book: First, the rust of human nature is eating away at the Miracle of Western civilization and the American experiment. Second, this corruption is nothing new; nature is always trying to reclaim what is hers. But this corruption expresses itself in new ways in different times as the romantic spirit takes whatever form it must to creep back in. Third, the corruption can only succeed when we willfully, and ungratefully, turn our backs on the principles that brought us out of the muck of human history in the first place. The last point, which is the subject of the next chapter, is that the corruption has now spread, disastrously, to the right, not just in America but throughout the West.

But for more than a generation now, the best principles of the West have been under assault. Intellectuals are recasting the virtues of our system and making them vices. “Merit,” the essence of the Jeffersonian ideal of an anti-aristocratic society, is now code for racism. “Whenever you hear someone (White or Black) oppose affirmative action with the ‘merit plea,’ you are listening to racism,” explains Ibram H. Rogers, author of The Black Campus Movement: Black Students and the Racial Reconstruction of Higher Education, 1965–1972.6 CNN commentator Van Jones has said that Republicans who desire a color-blind meritocracy have a racial “blind spot.”7 His colleague Ana Navarro—a liberal Republican—insists that a merit-based immigration points system is “absolutely racist.”8 Which would mean that Canada and Australia rank high on the list of racist nations.

Color-blindness is in fact a facet of not just meritocracy but also of the principle of universal equality. Perhaps Martin Luther King Jr.’s most famous line was that he dreamed of a world where people would be judged on the content of their character, not the color of their skin. The moral clarity and power of this appeal is what fueled the success of the civil rights movement. But the forces of identity have been trying to topple the idol of color-blindness for decades. “Colorblindness is the New Racism,”9 proclaims one headline. “Color-Blindness Is Counterproductive,” insists another.10 A third: “When you say you ‘don’t see race’, you’re ignoring racism, not helping to solve it.”11 Ta-Nehisi Coates, the most celebrated author on the issue of race in decades, writes that “the [American] Dream thrives on generalizations, on limiting the number of possible questions, on privileging immediate answers. The Dream is the enemy of all art, courageous thinking, and honest writing.”12 The American Dream, he continues, is a “specious hope”13 constructed out of “the progress of those Americans who believe that they are white.”14 That white progress is exploitation and violence, based, he says, in “plunder.”15 “ ‘White America’ is a syndicate arrayed to protect its exclusive power to dominate and control our bodies.”16 Coates’s indictment is primarily of white people, not the Constitution or notions of merit, but his indictment of white people is more than broad enough to include a host of American institutions.

Feminists have more diverse and often convoluted arguments about merit, vacillating between appeals to merit and equality and claims of beneficial female uniqueness when convenience dictates. Before joining the Supreme Court, Justice Sonia Sotomayor repeatedly suggested that a “wise Latina” would reach “a better conclusion than a white male.”17 Long before debates about transgenderism became mainstream, female identity became severed from biology. It is perfectly fine to criticize former Alaska governor Sarah Palin as a flawed politician, but one would think it would be fairly easy to form a consensus around the claim that she is a woman. And yet, when John McCain picked her as his running mate in 2008, the response from feminists was to insist that her lack of ideological conformity negated her womanhood. A spokeswoman for the National Organization for Women proclaimed Palin more of a man than a woman. Wendy Doniger, a feminist academic at the University of Chicago, wrote of Palin: “Her greatest hypocrisy is in her pretense that she is a woman.”18

Behind every double standard lurks an unstated single standard, and in virtually every identity politics campaign that standard is power. Whatever accrues to the net benefit of my group or to allied groups advances social justice. Thus, for example, in arguments over equal pay, feminists insist that statistical disparities are prima facie evidence of institutional prejudice against women. The principle they invoke is correct, but the disparities they cite don’t make their case.19 The claims hinge on statistical light shows that use aggregate disparities between male and female compensation to prove discrimination. There’s a similar problem with the argument over women in science. Women do not go into the STEM fields in large numbers. A 2016 study found that only 18 percent of computer science majors were female.20 This disparity, according to many feminists and diversity activists, can only be explained by systemic biases. It’s certainly possible that such biases exist. But women are overrepresented in many other fields. Some 60 percent of biology postdocs and 75 percent of psychology degrees go to women. Is there some plausible reason to believe those fields successfully purged their ranks of sexism but computer engineering remained a stubborn hotbed of patriarchal bigotry? As psychiatrist and science blogger Scott Alexander writes:

As the feminist movement gradually took hold, women conquered one of these fields after another. 51% of law students are now female. So are 49.8% of medical students, 45% of math majors, 60% of linguistics majors, 60% of journalism majors, 75% of psychology majors, and 60% of biology postdocs. Yet for some reason, engineering remains only about 20% female. And everyone says “Aha! I bet it’s because of negative stereotypes!”21

As Christine Rosen of The New Atlantis puts it:

On the one hand, the argument goes, if there were no discrimination, women and minorities would be perfectly represented in every field proportionate to their numbers in the general population because there are no substantive differences between these groups and the white men who have long dominated certain fields (such as technology and engineering). At the same time, however, diversity ideology insists that women and minorities bring a special viewpoint and unique experiences to their work, and companies need this in order to thrive. In other words, they are especially valuable because they are different, and therefore favoring them in hiring is justifiable.22

For our purposes, the question of whether those choices have some grounding in biology or culture or both is a distraction. The simpler answer is this: Individual women made individual choices to pursue careers that appealed to them. When large numbers of free people make choices, expecting the aggregate results of those choices to be perfectly representative by gender (or race or ethnicity) is not only ridiculous but also sexist (or racist) because it assumes a uniformity of talent, interest, and drive for whole categories of people.

Unless, that is, you are someone who makes a living from exploiting these disparities. Few feminists complain about the comparative dearth of female sanitation workers, but they are happy to cite disparities at Google or in corporate boardrooms as proof of sexism. And the technique of their argument is consistent with the real aim: power, not policy. As a prominent feminist textbook explains, feminists measure gender equality by “the degree to which men and women have similar kinds or degrees of power, status, autonomy, and authority.”23 Jessica Neuwirth, founder and director of the ERA Coalition, insists that “the entrenched historical inequality between the sexes cannot be erased by the creation of a level playing field because the players themselves are at two different levels.”24 In other words, the state must intervene on behalf of women because merit is an unworkable standard. The intent of the employer or policy maker and the qualifications or character of the individual job seeker are irrelevant. It is the “system” itself that is corrupt and racist (or sexist). And the proposed remedy is almost always to bend the rules, to discard objective standards in favor of selective ones that arbitrarily designate some group to be entitled to special treatment. This is the logic of the state as an instrument of divine justice manifesting itself yet again.


One need not reject the entirety of arguments made by the prosecutors to see the problems with this approach. Freed slaves certainly did deserve forty acres and a mule (at least!), as many post-Civil War Radical Republicans proposed. Similarly, the early affirmative action programs targeted specifically at blacks in the wake of the Civil Rights Acts have intellectual and moral merit. Of course, notions of merit and color-blindness can serve to mask conscious or unconscious biases on the part of employers, managers, and others. There are indeed structural problems in American law and culture that are worth addressing or discussing. The embryonic left-right consensus on criminal justice reform has a lot of promise, for example. But the argument being made by countless tenured radicals goes much further than a call for practical reforms. They seek to overturn the status of merit and color-blindness as ideals.

Stanley Fish, one of the pioneers of this project, is honest. The literary and legal scholar has made it plain that he considers objective and neutral standards, fair rules of the game, to be a mirage concealing the will to power of whites or the system or the European mind. Even reason is a con. According to Fish, there really is nothing called reason; there is simply argument and other contests of power. Whoever wins the argument gets to claim that reason validates his position. He writes that “like ‘fairness,’ ‘merit,’ and ‘free speech,’ Reason is a political entity,” an “ideologically charged” product of “a decidedly political agenda.”25 University of Virginia law professor Alex M. Johnson contends that “the presumed norm of neutrality actually masks the reality that the Euro-American male’s perspective is the background norm or heuristic governing in the normal evaluative context.”26

Power politics is as old as politics. Coalitions of interests have vied with each other for power in every political system ever created. It would be easy to dismiss the identity politics of race, gender, and ethnicity as simple reinventions of the sort of coalitional squabbles that defined American politics—and politics generally—forever. Germans versus Anglos, farmers versus city dwellers, Catholics versus Protestants, everyone versus the Jews. And to be sure, some hucksters, like Al Sharpton, are less devotees of Stanley Fish and more devotees of the ward-heeling rabble-rousers common to big-city politics in the nineteenth and early twentieth centuries. But some differences of degree grow large enough to become differences in kind. Racial and gender identity have been abstracted, converted into a permanent and immutable ideological category that claims there is no common ground between groups save perhaps the common effort to overthrow “white male privilege.” Anything associated with the system that white men created is discredited. Argument, grounded in reason, is itself now a tool of oppression. And the unshakable faith that those on the side of “social justice” are right has itself gelled into a kind of tribal ideology.

The legendary French liberal theorist Raymond Aron commented in 1957 that “the essentials of liberalism—the respect for individual liberty and moderate government—are no longer the property of a single party: they have become the property of all.”27 That is no longer the case. On the left, and increasingly on the right, large swaths of tribalists have forfeited their ownership stake in the liberal project.


This effort to delegitimize classical liberal standards manifests itself every day on college campuses. When Swarthmore invited left-wing philosopher Cornel West and conservative philosopher Robert P. George—close friends and colleagues at Princeton—to speak, many students were outraged. “What really bothered me is, the whole idea is that at a liberal arts college, we need to be hearing a diversity of opinion,” Erin Ching told the Daily Gazette, the school’s newspaper. “I don’t think we should be tolerating conservative views because that dominant culture embeds these deep inequalities in our society.”28 A student writing in the Harvard Crimson bemoaned the ideals of “free speech” and “academic freedom” as systems of oppression. “When an academic community observes research promoting or justifying oppression, it should ensure that this research does not continue.”29

When my National Review colleague Kevin D. Williamson and free speech activist Greg Lukianoff spoke on a panel at Yale on the virtues of free speech, many students were livid. The panel was interrupted by a student who shouted: “Stand with your sisters of color. Now, here. Always, everywhere.” Some of the participants were spat on.30

Again, one could go on not just for pages but at book length documenting these bonfires of asininity at various elite universities.31 And while it would be too generous to credit many of these individual students with an intellectually sophisticated or thought-through ideology, it’s important to recognize that they didn’t invent these ideas: They were taught them.

Again, this effort to dethrone liberal ideals is inseparable from a desire for power—power for professors, students, activist groups, Democrats, etc. Some of it is just conventional guild protection stuff: As we’ve seen, groups of any kind, once organized and established, guard their status jealously. Professors who specialize, to the exclusion of almost everything else, in the study of race and gender—along with diversity consultants, administrators, and outside activist groups—have a vested interest in heightening racial and sexual grievances for the simple reason that they make a living from such things. Women’s studies departments are not particularly popular, which is one reason women’s studies faculty members are eager to create or exploit controversies that make their disciplines relevant. If you are a journalist who only knows how to churn out articles explaining why something is racist, the last thing you want to hear is that racism isn’t as big a problem as you claim it is. The Southern Poverty Law Center once did important work identifying bigoted groups and policies around the country. Now it invents new categories of “hate”—so as to sweep conventional conservatives into its demonology—to justify its fund-raising and relevance.32

But the pursuit of power isn’t merely reducible to careerism and profit. The more important dynamic, the one that makes this such an appealing ideology, is the desire to have authority over others, to control the terms of debate, and to establish yourself as the new authority on what is or is not legitimate. Every society since the agricultural revolution has created a priestly class that defines the scope of right thinking and right action. For millennia that role was played by actual priests. In modern society the new clerisy is increasingly to be found among the self-anointed class of academics, activists, writers, and artists who claim a monopoly on political virtue. They unilaterally get to decide who is to be anathematized or excommunicated for wrong thinking. And college campuses serve as their most formidable monasteries and citadels.

Indeed, free speech isn’t merely emotionally painful (“triggering”); it is a threat to ideological hegemony. Identity politics has always been about the politics and psychology of power. By insisting that some questions cannot be asked, some ideas not entertained, the new clerisy is wielding power. The whole notion of creating “safe spaces” should be understood as an effort to control certain battle spaces in the culture war.

The clerisy changes the rules of what is permissible to say—or how to say it—in the same way Mao’s Red Guards terrified their elders. The stakes may be lower on the Yale campus, but does anyone doubt that some students would love to march ideologically wayward professors around the quad in dunce caps? According to the Anti-Defamation League, Ben Shapiro, an Orthodox Jewish conservative, came in first on its list of targets of anti-Semitic social media attacks from the alt-right. (I came in sixth.)33 Nonetheless, when he spoke at Berkeley in 2017, he was widely attacked for being a white supremacist. Tariq Nasheed, a self-described “Anti-Racism Strategist,” announced on Twitter: “Suspected white supremacist Ben Shapiro, who tries to mask his racist rhetoric by claiming to be Jewish, is in Berkeley now.”34 Ayaan Hirsi Ali, an atheist classical liberal who was mutilated in her native Somalia, and Maajid Usman Nawaz, a former Liberal Democrat politician in England, are committed opponents of Islamic extremism. But, according to the SPLC, they are now anti-Muslim bigots.35 They are routinely banned, protested, and disinvited from speaking at college campuses on the grounds that students cannot be exposed to their injurious and dangerous “hate speech.”

I was once invited to speak at Williams College by a group that called itself Uncomfortable Learning. The group chose that name because they knew that, if they tipped off students that they might hear conservative or libertarian views, the students would boycott the event. But because “Uncomfortable Learning” sounds so rebellious and transgressive, students assumed they’d be hearing things they already agreed with. The reaction from the audience when I spoke reminded me of my dogs’ reaction when they think we’re driving to the park, only to discover we’re heading to the vet.


The great irony of all this is that identity politics wins not by making compelling arguments but by exploiting the inherent decency of the American people, including, most ironically, liberal college professors who are terrified of being called racist, even when the accuser is a cynical opportunist, poltroon, or emotionally immature waif.

One needn’t be absolutist about such things. The essence of serious thinking is the ability to make meaningful distinctions even when facile analogies can deceive us. Calling someone a “nigger” or “kike” is grotesque, and a campus administrator should have the power to discipline students who do so even if it limits their speech. But using an epithet is not the same thing as “punching someone in the face,” as many students increasingly argue.36 And it is different from making an argument that someone doesn’t want to hear. One Yale student, somewhat infamously, argued that the “Master” of her dorm at Yale had oppressed her because he wanted to debate a (ridiculous) controversy about Halloween costumes. “He doesn’t get it,” she wrote. “And I don’t want to debate. I want to talk about my pain.”37 The Master has since lost his job, and Yale has banned the use of the word “Master” to spare students further pain. (After all, slaves had “masters” too.)38

To listen to the activists at Yale, you might think the school was a hotbed of white oppression, willy-nilly excluding minorities from participation in campus life. When this “Master” controversy erupted in the fall of 2015, I took a look at the course offerings at Yale that year. By rough count, Yale offered at least twenty-six courses on African-American studies, sixty-four courses on “Ethnicity, Race and Migration,” and forty-one courses under the heading “Women’s, Gender, and Sexuality Studies.” These are conservative estimates and do not include independent study. Meanwhile, I found two courses on the Constitution. A single professor teaches all of the courses on the Founding era: three. As for safe spaces outside the classroom and the dorm, I tallied an Afro-American Cultural Center, a Native American Cultural Center, an Asian American Cultural Center, La Casa Latino Cultural Center, and the Office of LGBTQ Resources. Plus there were nearly eighty organizations dedicated to specific identity groups in one way or another.39 The same pattern holds at most elite colleges. The moral of the story: Appeasing identity politics demands, like all appeasement, simply leads to more and more demands.


Of course, campuses are simply one front in the larger war. For decades, representatives of various identity groups have asserted authority over how to deal with, or even talk about, certain issues. This effort is ideological, but it is also cynical. “Diversity consultants” and similar specialists have a class interest in perpetuating a constant state of uncertainty about what constitutes racism, because such priestcraft gives them power, status, and income. For instance, it is a settled fact of social science that bilingual education hampers English learning and assimilation.40 But for a politician to say so is to invite charges of racism or “insensitivity” from the anointed representatives of the “Hispanic community.” What better way to prevent assimilation than to foreclose debate on the matter by simply declaring assimilation bigoted? No doubt many advocates believe it, but it’s no coincidence that the bureaucrats and educators invested in the business of bilingual education benefit from censoring any competing point of view.

The most redeeming aspect of political correctness stems from the legitimate effort to create a code of good manners for a diverse society. We have a tendency to concentrate on the forms good manners take rather than their purpose. From prehistoric times until today, manners—ceremony, custom, etiquette, etc.—have simply been mechanisms for reducing unwanted conflict by showing respect, particularly to strangers. Some believe that the handshake was born of the need to show that one didn’t have a weapon in hand. At its best, PC is a way to show respect to people. If black people don’t want to be called “Negroes,” it is only right and proper to respect that desire. If Asians object to “Oriental,” lexicological arguments can’t change the fact that it is rude not to oblige them.

The problem is that the ambitions of political correctness go much deeper than that, which is why activists are constantly changing the acceptable vocabulary. The clerisy doesn’t own anything other than its monopoly over acceptable words. Clear, universal rules about acceptable terminology—i.e., what constitutes good manners—are a threat to that monopoly. And so the rhetorical ground underneath us is constantly shifting. When I was a trustee of my alma mater, a diversity consultant explained to the board that “tolerance” was no longer kosher, because it implied a certain kind of condescension. “Acceptance” was the new word of the moment. These days, “celebration” seems to be the new “acceptance.” But there are enormous differences between “tolerance,” “acceptance,” and “celebration.” “Tolerance” and “acceptance” acknowledge disagreement to one extent or another. The requirement to celebrate, however, is ultimately a form of psychological bullying. It says, “You must abandon your convictions and agree with mine.” It is one thing to argue that a free society should accept gay marriage or allow people to define their gender in terms utterly unrecognizable to science. It is another thing to demand that individuals rejoice—or pretend to rejoice—in the lifestyles or decisions of others. But that is precisely what the jihad against “hate speech” demands. Dissent from the orthodoxy is now the equivalent of violence or complicity in it. The war on tolerance has become an effort to make room for a new intolerance.

Even democracy is now seen as a threat to tribal power politics. Support for democracy is eroding across the West, particularly among young people. Much of this has to do with the worldwide populist reaction to “globalism,” as we’ll see. But the tribal attack on democracy has been under way for a very long time. Consider Lani Guinier, the Harvard professor who briefly achieved celebrity status for her failed bid to run the Civil Rights Division of the Clinton Justice Department. Guinier argues in her book The Tyranny of the Majority: Fundamental Fairness in Representative Democracy and in various law review articles that the doctrine of “One man, one vote” needs to be jettisoned in favor of a more “authentic” form of democracy. She proposes an idea inspired by her then four-year-old son, Nikolas: “Taking turns.”41 When Nikolas couldn’t get a consensus among his friends about what the kids should play, they decided they should take turns deciding. Similarly, “authentic minorities” should have a “turn” at representation even if their “authentic leaders” cannot win a majority of the vote.

Guinier places enormous emphasis on the term “authentic”; merely being black is not enough. One must represent the authentic spirit—what the Germans call the Volksgeist—of the black community, as determined by those with the deepest investment in a specific definition of authenticity, like Ms. Guinier. “Authenticity reflects the group consciousness, group history, and group perspective”42 of a specific “social group.” “Authentic leadership” is not merely “electorally supported by a majority of black voters.” The leader must be “politically, psychologically, and culturally black.”43

“Authenticity refers to community-based and culturally rooted leadership. The concept also distinguishes between minority-sponsored and white-sponsored black candidates.” She clarifies: “Basically, authentic representation describes the psychological value of black representation. The term is suggestive of the essentialist impulse in black political participation.” She rejects the principle of color-blindness because it “abstracts the black experience from its historical context” and “ignores the existence of group identity within the black community.”44

The upshot, as she makes clear at great length in unambiguous prose, is that blacks who are elected with significant shares of the white vote—like Douglas Wilder, then the Democratic governor of Virginia—may not, and often do not, count as authentically black. This is where racial essentialism and political leftism intersect. According to many on the identity politics left, only left-radical politics are authentically black. This is why Justice Clarence Thomas doesn’t count as black among so many black activists. Blacks are supposed to think a certain way, and if they do not, they are essentially inauthentic, or “Uncle Toms.”

There are countless ominous echoes in these fundamentally romantic, tribalist ideas. Karl Marx believed that the Jew (and the Negro) had authentic natures rooted in psychology, history, and culture (and, in the case of blacks, biology). So did Joseph de Maistre. Needless to say, German nationalists had strong opinions about the essentialism of various groups. German nationalist intellectuals like Johann Gottfried Herder and Johann Fichte wrote at epic length about the essential psychological and cultural uniqueness of the German Volk. But perhaps the most interesting parallel is to the great champion of Southern slavery, John C. Calhoun. He argued that a “mere numerical majority” could not overrule a minority if the majority’s decision conflicted with the core interests of the minority, i.e., slaveholding whites. Guinier even invokes Calhoun’s theory of concurrent majorities as one possible remedy to the problem of “One man, one vote.”45

Whatever parallel you want to draw, the conclusion is the same: This is not liberalism, rightly understood.

Guinier’s views are actually relatively moderate compared to those of other champions of identity politics on the left. These days, there are whole academic departments dedicated to “Whiteness Studies.” But this discipline is not analogous to “Black Studies” or “Hispanic Studies” or “Women’s Studies.” Those schools of thought are dedicated to the project of building up an identity, celebrating its uniqueness, and cultivating, essentially, a sense of nationhood. Whiteness Studies is dedicated to cataloging the illegitimacy and even the evil of whiteness. The syllabus at one university describes Critical Whiteness Studies as a field “concerned with dismantling white supremacy in part by understanding how whiteness is socially constructed and experienced.”46

This sort of thinking has spilled out into the mainstream culture. Essentialism for Maistre was all about nationality. Now it is about ethnic or gender categories. As one black journalist recently put it on Twitter: “Yes, ALL white people are racist. Yes, ALL men are sexist. Yes, ALL cis people are transphobic. We have to unpack that. That’s the work!”47

Again, one need not be categorically opposed to ethnic groups or other minorities flexing their muscles in a diverse society. That’s a story as old as the country and is unavoidable in any society. The key distinction, once more, is that some within these groups are not merely fighting for their piece of the pie or for recognition of their legitimate interests. They are seeking to overthrow the ideals that made this country so successful in the first place. They are not merely arguing that the system needs to live up to its own ideals, which was the argument of the suffragettes and the civil rights movement. They are arguing that the ideals themselves are illegitimate.

The tragedy here is that liberalism—in the classic Enlightenment sense—is the only system ever created to help people break out of the oppression of identity politics. For thousands of years, nearly every society on earth divided people up into permanent categories of caste, class, peasant and noble, and, of course, male and female. The Lockean principle of treating every human as equal in the eyes of God and government, heedless of who their parents or ancestors were, broke the chains of tyranny more profoundly and lastingly than any other idea.

Does America fail to live up to that ideal? Of course. Every human and human institution fails to live up to its ideals. That is why we call them ideals. They are something to strive for. Every wife and husband who ever repeated a marriage vow has fallen short of their promise at one point or another. But that is not an argument for not trying to stay true to their oath. The devout Christian is the first to admit that he or she fails to live up to the injunction to be Christlike (1 Corinthians 11:1). But that entirely human failure is not an indictment of the Christian ideal. Even the greatest philanthropists will readily concede that they could be even more charitable. Does that discredit the good they do? Oskar Schindler, the man made famous by Steven Spielberg’s Schindler’s List, was overwhelmed with remorse that he didn’t do more to save more Jewish lives during the Holocaust. But he did save more than a thousand of them, at great personal risk. Shall we declare him a villain in that chapter of humanity for doing good while falling short of perfection?

The original argument for diversity was a thoroughly liberal one, in the Lockean sense. Elite universities once discriminated against Jews, blacks, Asians, and women on the grounds that such institutions were a privilege reserved for white Anglo-Saxon Christians. The argument for diversifying universities was purely an appeal to classic American principles of inclusion and meritocracy. Today, many universities as a matter of core policy and conviction discriminate in admissions against Asians, Jews, and whites on the grounds that the principle of diversity trumps any considerations about merit. When the University of California system was forced, against strenuous objections, to abandon racial preferences, the number of Asians admitted skyrocketed. They now make up a plurality of students, roughly one-third, even though Asians constitute only 15 percent of the state’s population. Asian students made up 40 percent of the student population at UC Berkeley in 2012 and 43 percent of the student population at the California Institute of Technology.48 Meanwhile, at elite universities outside of California, Asians need 140 more points (out of 1600) on the SAT than whites to be admitted (while blacks need 310 points fewer).49

In order to defend this institutional discrimination, the clerisy must embrace doctrines of racial essentialism and authenticity. Lee Bollinger, then the president of Columbia University, famously stated:

Diversity is not merely a desirable addition to a well-rounded education. It is as essential as the study of the Middle Ages, of international politics and of Shakespeare. For our students to better understand the diverse country and world they inhabit, they must be immersed in a campus culture that allows them to study with, argue with and become friends with students who may be different from them. It broadens the mind, and the intellect—essential goals of education.50

This is a fine sentiment. But it glosses over the fact that universities subscribe to a very narrow definition of diversity. Intellectual, ideological, and religious diversity take a backseat—sometimes a very distant backseat—to a very specific kind of bean counting. Besides the practical educational problems with racial quotas—promoting students above their ability, making it more likely they will drop out of college, for instance—there is the philosophical and moral problem. It makes racial essentialism into a permanent standard. The original justification for affirmative action policies was that they were a necessary bending of an ideal for special circumstances. “You do not take a person who, for years, has been hobbled by chains and liberate him, bring him up to the starting line of a race and then say, ‘you are free to compete with all the others,’ and still justly believe that you have been completely fair,” Lyndon Johnson famously explained in his 1965 commencement address at Howard University.51 The argument for bending the ideal of individual merit in 1965 was defensible, given the special history and conditions of African-Americans at the time. The doctrine of “diversity” for its own sake goes past bending to breaking the old ideal.

This highlights the power of words and how the new class intellectuals use them to change or undermine institutions. New class bureaucrats have not only expanded the definition of diversity to include other racial groups that had never been slaves or subjected to Jim Crow but have also arrogated to themselves the arbitrary power to decide what counts as “good” diversity, just as they assert authority over what constitutes acceptable language. When agents of the state and other officials have unilateral authority to change the ideal based upon their own political, aesthetic, or cultural preferences, they are substituting their own arbitrary power, their own priestcraft, for objective standards.


All political parties are coalitional to one extent or another, but the Democratic Party has always been more coalitional than the modern Republican Party, which, since the rise of Goldwater-style conservatism, has been more ideological. To the outside observer, it might seem odd that the FDR coalition contained Klansmen, blacks, and socialist Jews. Even in recent years, it was not intuitively obvious why the party of same-sex marriage should also be the default party of the Teamsters.

Psychologically and ideologically, these alliances are often rationalized by progressive elites on the grounds that they are defending the defenseless victims of social prejudice. This makes some sense if you start from the romantic premise that traditional civilization is retrograde and oppressive and therefore those who want no part of it are oppressed. This kind of argument is routinely voiced in Europe by human rights activists, who have no problem attacking traditional customs and institutions but insist that non-Western religious or cultural minorities must be given the widest possible latitude: “Who are we to judge?” for the practitioners of Sharia, but judgmentalism as far as the eye can see for traditional Christians.

As a broad generalization, the practitioners of identity politics and their coalitional allies have leeched off the inherent decency of this country and the constitutional order to press their advantages. To the extent that they have respected the rules even while trying to undermine them, they have done so while living off borrowed capital. You can get away with a lot of illiberal theatrics and demands in a liberal society, and one of the great, eternal challenges for democratic governments is to figure out how much one can tolerate before the forces of illiberalism corrode the liberal order.

But tolerance is a two-way street. In a decent society, the majority owes respect to the minority. And the minority owes the majority respect as well. That bargain has fallen apart, most acutely in Europe, but America is not far behind, as the champions of identity have grown in power. The story has been embellished to the point where the majority are not cast as tolerant and decent citizens trying to figure out how we should live with one another; the majority are now simply villains.

In August of 2017, two law professors, University of Pennsylvania’s Amy Wax and University of San Diego’s Larry Alexander, penned an op-ed arguing that the breakdown in bourgeois values has led to much of the social discord and dysfunction of contemporary society. The bourgeois culture of the 1940s to 1960s laid out “the script we all were supposed to follow”:

Get married before you have children and strive to stay married for their sake. Get the education you need for gainful employment, work hard, and avoid idleness. Go the extra mile for your employer or client. Be a patriot, ready to serve the country. Be neighborly, civic-minded, and charitable. Avoid coarse language in public. Be respectful of authority. Eschew substance abuse and crime.

Wax and Alexander acknowledged the downsides of that era, but they also noted that bourgeois norms help the disadvantaged more than they help the wealthy, because the wealthy can afford their deviations. But, they note correctly, “all cultures are not equal” and bourgeois culture has benefits others do not.52 A coalition of students and alumni responded to the essay in predictable fashion. Wax and Alexander were peddling the “malignant logic of hetero-patriarchal, class-based, white supremacy that plagues our country today. These cultural values and logics are steeped in anti-blackness and white hetero-patriarchal respectability…”53 It goes on in that vein for a while.

It’s all such nonsense. One has to wonder: If the Judeo-Christian and bourgeois norms of the 1940s to 1960s were so malignantly racist and sexist, how is it that the civil rights movement and feminism ever succeeded in the first place? America was far whiter, the government and leading institutions far more dominated by white men, and the society as a whole was far more religious in the 1960s than it is today. And yet the Civil Rights Acts passed (almost exclusively thanks to the votes of white males in Congress, with a larger share of Republicans than Democrats voting in favor), universities became coed, and society became more tolerant and welcoming. Martin Luther King Jr. didn’t demonize whites or the Founding; he appealed to the very ideals that are now declared illegitimate. He didn’t vilify bourgeois values; he modeled them in public. He didn’t denounce the Judeo-Christian tradition; the Reverend extolled it from his heart. And, by the way, why did the struggle for gay marriage succeed? Because it appealed not to radicalism but to bourgeois values about family formation.

It must be pointed out that this is not simply about rhetoric. The rhetoric yields reality. Anywhere these religious or bourgeois values come into conflict with the agenda of the new class, they must give way. The architects of Obamacare insisted that nuns—nuns!—must pay for birth control and abortion coverage. In Massachusetts, Boston’s Catholic Charities closed down its adoption services because the state told it that if it wanted to find homes for orphans, it needed to place them with same-sex couples.54 I’ll spare the reader all of the controversies over transgender bathrooms, bakers being forced to make cakes for same-sex weddings, forcing the Marines to accept women in combat roles, and the like.

Whatever one thinks about the merits of these individual policies, the larger point still stands. Under the progressive view of the state, tolerance only has one meaning: bending to a single vision of the culture. When activists say, “If you’re not part of the solution, you’re part of the problem,” they are saying there are no safe harbors in the culture, no rights of exit from the agenda of “social justice.” The Nazis borrowed the term Gleichschaltung from engineering to describe a doctrine whereby every institution, every Burkean “little platoon,” must coordinate with the state or be crushed. My point here is not to single out the role of the state but to emphasize the larger climate of power politics.

As Alexis de Tocqueville most famously argued, our liberal order depends upon mediating institutions, or what he called “associations,” that create and enrich the space between the individual and the state. These institutions—families, churches, businesses, schools, sports teams, charities, the Boy Scouts and Girl Scouts, etc.—are the microcosms that provide meaning for individuals in the larger macrocosm of the nation. By their nature, they must be culturally distinct in some meaningful way if they are to be “sticky.” It is the cultural distinctiveness—the quirks of theology, custom, and mission that appeal to some people and leave others cold—that provides members a sense of community, belonging, and meaning.

African-Americans understand this implicitly and often express it with great eloquence when it comes to their own historic institutions, both physical and cultural. Historically black colleges have a rich and laudable history in America. The black church has been a heroic spiritual, cultural, and political bulwark and safe haven. Jews, likewise, have an incredibly rich collective consciousness of not just the role their religion plays but also the myriad customs that give their lives meaning and that have kept them culturally and religiously intact over millennia. The same goes for virtually every ethnic minority and identity group, from gays to the Amish to the deaf. Within broad parameters, there is nothing wrong with any of this, and very much that is right. The key to a thriving civil society is a multiplicity of institutions where diverse groups of people can find a home.

The one hitch is that you must have the “right to exit.” Individuals must have the ability to leave communities and other institutions that do not serve their interests. The miserable, abused wife must be allowed to leave the marriage. The non-believer must be able to walk out of his church, mosque, or temple. The worker must be allowed to leave her job. The right to exit is not absolute. The soldier cannot desert his unit without paying a price. Divorce laws can be written in a way that allows for “cooling-off periods.” Children cannot wander off in a huff. Employees can be held to account for contracts they voluntarily signed. But individuals must have the ultimate authority to say “this is not for me,” and institutions must be allowed to have some cultural “stickiness” if they are going to be able to do their job. And that stickiness can only come from a certain degree of cultural distinctiveness that runs somewhat counter to the mainstream culture. What is true of the hippie commune and gay chorus is also true of the Catholic nunnery and the Boy Scout troop.

There was a time when the right to exit wasn’t the problem, but the right to enter was. Jim Crow laws and sex and religious discrimination policies were immorally exclusive. The country had a series of big, democratic arguments about these barriers, and those arguments were consistent with the Western tradition, not in defiance of it.

Unfortunately, progressives could not take yes for an answer. The failure of ubiquitous and total equality to materialize overnight was seen as proof that classically liberal, color-blind policies were not enough, particularly among a whole class of activists who made a career of exaggerating the nature of the problems so as to justify their own status and power. Psychologically, the romantic desire to fight oppression, to be a person of radical commitment, was unfazed by success after success. Social justice has become an industry unto itself. Progressivism now lacks a limiting principle for governmental and social action. There’s always more work to be done, more injustice to be identified—or imagined—and then rectified. As Democratic senator Chris Murphy said in a moment of jubilation when the effort to repeal the Affordable Care Act failed: “There is no anxiety or sadness or fear you feel right now that cannot be cured by political action.”55 This is a description not of politics but of religion.

Social justice warriors do not seek simply to destroy existing traditional Western culture (or what’s left of it); they seek to create a new culture, or what Hillary Clinton called a “new politics of meaning.” On its best terms, this can be a defensible vision of social democracy, multiculturalism, and secularism. But that vision is almost entirely theoretical. It has simply never been tried at anything like the scale its proponents are attempting, save in places like the Soviet Union. And as that catastrophic experiment demonstrated, whenever you try to replace well-established cultural norms and traditions with an abstract new system, you do not open the door to a new utopia; you open the door to human nature’s darker impulses.

Among the greatest benefits of old institutions is that they are old. Old trees can weather a storm that uproots saplings. Any institution that has been around for a long time has, through a kind of evolutionary adaptation, learned how to cope with crises. The Catholic Church has endured for some 2,000 years, and in that time it has learned a few things. Judaism has been around for roughly twice as long, which at minimum speaks to the resources Jews have developed to survive.

The Japanese monarchy, the oldest continuous monarchy, dates back to 660 B.C. There is a reason the current Japanese constitution describes the emperor as “the symbol of the State and of the unity of the people.”56 In the literal ashes of World War II, the Japanese could still look to the emperor as a reassuring symbol of communal meaning.

What is true of nations is also true of institutions. Anyone who has relied on church or family during a personal storm understands how rooted institutions provide us not only physical shelter but, even more important, emotional, psychological, or spiritual shelter. We lash ourselves to these oaks of the culture. Chopping them down with the aim of building a perfect society is the perfect recipe for destroying a good society. Because when you destroy existing cultural habitats, you do not instantly convert the people who live in them to your worldview. You radicalize them. This is a point many on the left understand very well when it comes to American foreign policy. They are among the first to argue that nation building or “imperialism” invites a backlash. The war in Iraq did not deliver democracy, they argue; it delivered ISIS.

But when it comes to domestic cultural imperialism, many of the same people have a blind spot. They see nothing wrong with forcing Catholic institutions to embrace gay marriage or abortion. They think the state should force small business owners to celebrate views they do not hold. They brand as bigoted any parent or institution that resists allowing men to use women’s bathrooms. They constantly change the rules of our language to root out disbelievers so they can hold them up for mockery. In June of 2017, Senator Bernie Sanders voted against the confirmation of Russell Vought, President Trump’s nominee for deputy director of the Office of Management and Budget. Vought had written that Muslims were not “saved” because they do not accept Jesus Christ.57 This is not a radical interpretation of Christianity. It is Christianity. “I would simply say, Mr. Chairman, that this nominee is really not someone who is what this country is supposed to be about,” Sanders said. “I will vote no.” In other words, a faithful Christian cannot serve in government, according to Sanders. He has no such policy for Muslims who hold a very similar view toward Christians.58

Sanders’s office issued a statement clarifying his position: “In a democratic society, founded on the principle of religious freedom, we can all disagree over issues, but racism and bigotry—condemning an entire group of people because of their faith—cannot be part of any public policy.” This is correct on its face. No public policy can discriminate against someone on the basis of faith. But there was no evidence whatsoever that Vought would discriminate against Muslims at the OMB. Meanwhile, Sanders’s own position is that no one who actually believes in Christian doctrine has a right to make policy.

Later that same summer, Senators Dianne Feinstein and Dick Durbin interrogated a judicial nominee, Amy Coney Barrett, about her Catholic faith, insinuating time and again that one cannot be a devout Catholic and a judge. “Dogma and law are two different things,” Feinstein said. “And I think whatever a religion is, it has its own dogma. The law is totally different. And I think in your case, professor, when you read your speeches, the conclusion one draws is that the dogma lives loudly within you, and that’s of concern…”59

You can agree with Sanders, Feinstein, and Durbin if you like, but ask yourself, How do you expect believing Christians to respond to this? Will they instantly embrace this radical reinterpretation of our Constitution—which would have barred every president we’ve ever had from the office (to the extent that they were all truthful when they said they were Christians)—or will they feel that Sanders is trying to take their country away from them? No doubt there is a diversity of responses among even the most orthodox Christians to Sanders’s views, but can anyone doubt that many would take offense?


It is a cliché of the left to say that “perception is reality.” Well, the perceived reality for millions of white, Christian Americans is that their institutional shelters, personal and national, are being razed one by one. They do not like the alternatives they are being offered. Some fraction may indeed be racists, homophobes, or Islamophobes, but most simply reject what they are being offered because they do not know it, or because they do know it and prefer what they perceive to be theirs. And yet people like Sanders insist that resistance to their program is not just wrong but evil.

The grave danger, already materializing, is that whites and Christians respond to this bigotry and create their own tribal identity politics. I don’t think the average white American is nearly as obsessed with race, never mind invested in “white supremacy,” as the left claims. But the more you demonize them, the more you say that “whiteness” defines white people, the more likely it is that white people will start to think of themselves defensively in those terms. Some liberals will—and do—embrace a self-hating creed. Recall that Robert Frost said a liberal is a man so broad-minded he won’t take his own side in an argument. But most white people will respond differently. They will take the identity peddlers’ word for it and accept that whiteness is an immutable category. White working-class voters who said that they felt like “strangers in their own land” were 3.5 times more likely to vote for Trump than those who did not feel that way.60 In 2016, the more aggressively a person embraced a white identity, the more likely that person was to vote for Trump.61

Now, one last essential point needs to be made. Neither the left nor the state is entirely, or even in some cases primarily, to blame. Capitalism itself is a big part of the problem. The creative destruction of capitalism is constantly sweeping away traditional arrangements and institutions. The thriving communities that grew up around the steel and coal industries, only to be denuded by market forces, are obvious examples of how capitalistic innovation unsettles the status quo.

Whenever misfortune befalls us, we are instinctually inclined to assume there was agency behind it. Someone must be responsible! The ruling classes! The industrialists! The Globalists! The New Class! Immigrants! (And for generations of bigots: the Jews!) And while some of these actors may deserve some blame, in some sense the real demon lurking in the shadows is change itself. Populist demagogues promise not only that they have the answer to ease the pains of change (“Free silver! Tariffs! Share the wealth! Build a wall!”), but that they will punish the culprits responsible. Such promises are a thick miasma of snake oil containing healthy portions of nostalgia, demonization, and scapegoating.

Such siren songs—whether from technocrats or demagogues—are inevitable by-products of capitalism. That’s because innovation and efficiency maximization are at eternal war with “the way we’ve always done it.” Capitalism arouses in us feelings of nostalgia for an imagined—and, in some cases, actual—better past when people knew their place in the universe, and their work and their identity were inextricably intertwined.

Here lies the eternal tension inherent in Enlightenment-based societies. The extra-rational institutions of family, faith, and community in all their forms are in constant battle with the force of change and the sovereignty of the individual. Our inner Rousseauians crave community and group meaning. Our inner Lockeans demand that we be given the tiller in finding our own fate. Because capitalism is unnatural and government (broadly understood) is natural, we constantly look to the state to fix the very real problems and anxieties that inevitably emerge from capitalistic destruction.

No one wants to be replaced by a machine or told that her work is no longer valued. It is here where the left often has the better part of the argument, for they at least recognize the havoc that the market can wreak on those left behind. Donald Trump was not the first to appeal to the “Forgotten Man.” He appropriated the phrase—without credit, of course—from Franklin Roosevelt, who argued that “better the occasional faults of a government that lives in a spirit of charity than the consistent omissions of a government frozen in the ice of its own indifference.”62

The Luddites had a point. The Industrial Revolution destroyed whole ways of life for English communities. And while we should be grateful for the Industrial Revolution, one can understand why its immediate victims were not inclined to say “Thank you.” Their rage as they watched this new system raze their villages like a tornado was wholly understandable.

But while we can concede the obvious merits of Roosevelt’s views, there remains an inherent defect in this thinking. And it is at once a practical and a philosophical objection. When can you know? When can the neo-Luddites or the technocratic liberals know that the forces of creative destruction are not to the ultimate benefit of mankind or the nation? So far, the evidence is overwhelmingly on the side of innovation. When do you think we should have frozen technological progress? Would we have been better off if Rousseau had gone back in time and stopped the first man who put a fence around a piece of land and called it his own? Perhaps not. What about in Byron’s age? When life expectancy in England was forty years,63 and when, as late as 1851, more than a third of all boys aged ten to fourteen worked, as did about a fifth of all girls?64 Should we have frozen the economy during the 1950s? The wages were good, but life expectancy was sixty-five65 and countless diseases were a death sentence.

The larger problem is that any attempt by the state, or an outraged populist movement, to suppress innovation and more humanely or rationally plan the economy inevitably leads to restrictions on our liberties. No doubt some are easy to tolerate and even welcome. (For instance, it bothers me not a whit that the state makes it difficult for consumers to find child pornography.) But economic liberty is ultimately inseparable from liberty. Socialist society, as Robert Nozick famously put it, must “forbid capitalist acts between consenting adults.”66

The rising tide of protectionism in this country and across the West is merely the most obvious symptom of the larger malady. We live in a moment of ingratitude. Thankfulness is wanting, not just in regard to capitalism, but in regard to democracy itself. In our romantic rage against the machine, we do not differentiate between causes. The state gets blamed for the faults of capitalism. Capitalism gets blamed for the faults of the state. And everywhere we are told that it doesn’t have to be like this and that some other tribe is responsible for our ills. And so we build coalitions of tribes determined to dethrone the authors of our misfortune.

This is the prologue to the story of Donald Trump’s victory and the rise of the “alt-right.” It is also the context for the ascent of Marine Le Pen, the victory of Brexit, and the new global crusade against “globalism.” In the face of the staggering rebuke to the progressive project, we see progressives on their hands and knees, searching amidst the wreckage they created for the ideals they were all too happy to smash when they were in power.