5

The Contemporary Culture Wars

IN THE LORE OF the Religious Right, the first shots in today’s culture wars were fired from the left. “Who started this? Who is on the offensive?” asked conservative political commentator Pat Buchanan in 2004. “The answer is obvious. A radical Left aided by a cultural elite that detests Christianity and finds Christian moral tenets reactionary and repressive is hell-bent on pushing its amoral values and imposing its ideology on our nation.”1

There is some truth to this claim. The Supreme Court did escort God out of public school classrooms in the early 1960s; the New Left targeted racism, sexism, and warmongering later that same decade; an increasingly multicultural academy revised college reading lists in the 1970s and 1980s; and San Francisco mayor Gavin Newsom issued marriage licenses to gay and lesbian couples in 2004. But the view that liberals started these culture wars is more conservative spin than careful history. There were radicals in the antiwar, feminist, black power, and gay liberation movements who wanted to turn American society upside down, perhaps “by any means necessary.” But the Supreme Court was not girding for battle, cultural or otherwise. The browning of faculty and curriculum in higher education was viewed by most on the left as a social reform rather than a religious crusade. At least a thousand bras were burned in the imaginations of cultural conservatives to every one burned in real time. Even the gay rights movement was not radical so much as reactionary—a response to long-standing bigotry against homosexuals.

It was cultural conservatives who injected into these debates the rhetoric of war, precisely because they saw the issues as matters of absolute morality and eternal truth. They started the contemporary culture wars because once again they saw American society drifting away from them, erasing forms of culture they held dear. Channeling Cold War rhetoric into domestic politics, they blasted “liberal elites” as socialists undermining the country from within. If liberals in this era were wide-eyed parents-to-be, hoping for a better future for their children, conservatives were anxious parents, clutching their daughters and sons close and willing to fight not to lose them in an increasingly bewildering world.

In other words, the contemporary culture wars began in the fight over the sixties, not in the 1960s themselves. And that fight did not begin until the Right started to protest. Reagan declared war on the “bad sixties” as early as 1966, when he launched his political career by denouncing student protesters and sympathetic faculty at the University of California, Berkeley.2 Neoconservatives, who broke with the liberal left around the time the Democratic Party selected George McGovern as its standard-bearer in 1972, declared war on the counterculture. But these fights did not go national until the late 1970s and early 1980s, when Reagan ran for (and won) the presidency and the Religious Right broadcast its critique of the sixties far and wide. That decade was not about peace, love, and understanding, conservatives insisted. It was about drug overdoses, race riots, flag burnings, promiscuity, angry feminists, and even angrier blacks—all quite predictable in a society that had lost its moral compass, its respect for authority, and its unifying faith. Undergirding and accompanying this attack on the “bad sixties” was nostalgia for the “good fifties,” a golden age for cultural conservatives even today.

Liberals defended themselves (and the sixties) from this conservative attack, criticizing the 1950s in turn. But like the culture wars of the 1920s and 1930s, the culture wars of the 1970s and beyond played out relatively quickly. While the culture wars cycle accelerated in the Roaring Twenties in part because of the rise of national media, it sped up even more in recent decades because of the emergence of the Internet and the twenty-four-hour news cycle. Gay marriage, all but unthinkable in the late 1980s, was legal in nearly three out of every four states when the Supreme Court declared it the law of the land in 2015.

With this acceleration came increasing divisiveness, as decades of bitterness were compressed, in some cases, into months. In the contemporary culture wars, the maw of cultural politics opened wide and nearly swallowed civil society whole. Culture warriors continued to fight over religion, family, sexuality, race, education, and evolution, but now they fought as well over issues that had previously been considered nonpartisan. As the modus operandi of cultural warfare became the MO of politics writ large, Americans were drawn into a Culture War of Everything, which saw liberals and conservatives taking up sides on the arts, sports, and foreign policy.

Like other family feuds, today’s culture wars began not with a single shot but with an accumulation of grievances that finally bubbled up and boiled over. In 1954, in Brown v. Board of Education, the Supreme Court mandated public school integration. In 1962 and 1963, it banned official prayer and devotional Bible reading in public schools. In 1971, it upheld busing as a means to desegregate those schools. In 1972, Congress sent the Equal Rights Amendment, which guaranteed equal rights for women, to the states for ratification. In 1973, in Roe v. Wade, the Supreme Court affirmed abortion rights. In 1976, the IRS stripped Bob Jones University of its tax-exempt status because it prohibited interracial dating. And then there were the 1960s themselves and their less talented stepchild the 1970s—that bacchanalia of “sex, drugs, and rock and roll” followed by the hangover of Farrah Fawcett posters, Atari, and disco. As these decades crested and crashed, many on the right came to fear that American culture had come unmoored and was drifting onto the rocks of secularity and moral relativism. Our indivisible country was breaking apart and sinking.

But this anxiety did not yet catalyze those who would come to be known as the Religious Right. Evangelicals and fundamentalists initially ignored Brown. Protests against Engel v. Vitale (which outlawed official prayers in public schools in 1962) and Abington School District v. Schempp (which outlawed devotional Bible reading in public schools in 1963) were muted. “They put the Negroes in the schools and now they’ve driven God out,” Rep. George Andrews (D-AL) said of the Engel decision, which also “shocked and disappointed” evangelist Billy Graham.3 But Graham’s Southern Baptist Convention praised both decisions for upholding church–state separation. And fundamentalist Carl McIntire refused to fuss over Engel because the vague prayer the Court deemed unconstitutional in that case—“a pagan prayer,” in his view—did not mention Jesus.4 Protestants, meanwhile, left the protesting of Roe v. Wade to Catholics for most of the 1970s. In fact, Baptist Press praised Roe for advancing “the cause of religious liberty, human equality and justice,” and as late as 1980 the evangelical magazine Moody Monthly was observing that “evangelicalism as a whole has uttered no real outcry” against abortion.5

Historians have long observed that evangelicals retreated from the public square after the embarrassments of the Scopes “Monkey” Trial of 1925 and prohibition repeal in 1933 only to emerge, Rip Van Winkle–like, from their long slumber in the 1970s. Whereas their spiritual ancestors worked to save souls and reform society, mid-twentieth-century evangelicals effected a “great reversal,” separating themselves from “worldly things.” That is not quite right. “Plain-folk” evangelicals dug deep into politics in the Sunbelt as early as the 1930s and 1940s. The Family, a secretive group founded in 1935, vigorously opposed FDR’s New Deal in Christ’s name. The National Association of Evangelicals was established in 1942 and the National Religious Broadcasters in 1944. Billy Graham, no shrinking violet in the field of politics, befriended President Eisenhower and made the cover of Time magazine in 1954. And the evangelical magazine Christianity Today published its first issue in 1956.6

However, none of the above wielded much political power (Graham was more pawn than king in the Great Game of national politics) and rank-and-file evangelicals did devote less time to reforming America in the half century after the Scopes Trial. Many separated themselves from sinful society, withdrawing into their own private Idahos, worshiping in their own churches, forming their own denominations, attending their own secondary schools and colleges, patronizing their own camps, reading books from their own publishers, listening to their own radio stations, and watching their own television programs.

As they observed their fellow citizens grinding their hips like James Brown, evangelicals were able to imagine that their subculture would never be seduced by the counterculture’s “sex machine.” Their daughters would not engage in premarital sex. Their sons would not succumb to the “homosexual lifestyle.” But the rulings of the Supreme Court in the 1960s and 1970s called this faith into question. Evangelicals began to wonder whether they could continue to hold off the barbarians at the gate. Then came Jimmy Carter and the IRS.

“Segregation Academies” and the IRS

THIS TIME THE threat was not to public school prayer but to private school segregation, more specifically to the tax-exempt status of the hundreds of “segregation academies” that had sprung up post-Brown. The Supreme Court’s effort to desegregate public schools had been met by “massive resistance” in the South. Many segregated school districts had ignored the decision. Others had shut down rather than admit blacks, funneling their students into new, all-white private schools (which were not subject to the Court’s desegregation mandate). After the Civil Rights Act (1964) prompted stricter federal enforcement of desegregation, enrollment at these “schools that fear built” exploded, creating “a new dual system of schools—one, white and ‘private’; the other, disproportionately black and ‘public.’”7

In May 1969, in Green v. Kennedy (later Green v. Connally), black parents in Holmes County, Mississippi, had sued to end tax exemptions for three new “white flight” academies. The Nixon administration initially sided with the academies, but reversed course in July 1970, announcing that it could “no longer legally justify allowing tax-exempt status to private schools which practice racial discrimination.” In 1971, the federal district court in DC ruled for Green’s plaintiffs, finding that segregated private schools were not charitable and therefore not tax-exempt.8 In the early 1970s, the IRS issued a series of ineffective guidelines that required private schools to adopt nondiscriminatory policies but did not require them to put those policies into practice.9 Under this “farcical” policy, critics said, “a private school [is] nondiscriminatory just because it says it is.”10

In 1977, Jimmy Carter carried his born-again piety from rural Georgia to the White House, thanks to strong support among evangelicals (who now constituted perhaps one-third of the U.S. population), including 59 percent of Southern Baptists.11 There was hope in the heartland that Carter would use his bully pulpit to denounce the nation’s moral bankruptcy, and perhaps even to do something about it. Then Carter’s IRS commissioner Jerome Kurtz cracked the whip. Under stricter IRS rules published in August 1978, the burden of proof shifted to these private schools, which were presumed to be racially discriminatory if they opened or expanded in the Brown era and if their minority enrollments were under one-fifth of the minority school-age population in their communities.

At this juncture, what came to be known as the Religious Right found its voice and its power. It also found common cause with political conservatives. “There was an overnight conversion,” recalled Paul Weyrich—the conservative strategist who coined the term “moral majority” and would go on to become a kingmaker in the Religious Right—as conservative Christians realized that “big government was coming after them as well.”12 Observes historian Grace Hale, “Distance had been created to promote godliness. Maybe distance would have to be violated to save it.”13

According to Ralph Reed, who would later lead the Christian Coalition, many white evangelicals decried the IRS decision as “a declaration of war on their schools, their churches, and their children.”14 But it was the Religious Right that declared that war, reintroducing to American politics the rhetoric of the cultural battlefield and a mind-set of spiritual struggle: no compromise, no negotiation, no surrender.

With evangelicals mobilized as they had not been in decades, the IRS and Congress fielded close to half a million protest letters. Some pastors and parents simply wanted the feds to butt out of their schools (segregated or otherwise). Others insisted these schools were Christian academies, not segregation academies, and had been unfairly and illegally targeted.

This mass protest united evangelicals and Catholics on education long before they found common ground on abortion. It prompted a series of hearings in 1979 at the IRS and in Congress, which effectively vetoed the new policy by refusing to appropriate funds to carry it out. The controversy also caught the attention of GOP leaders, including former California governor Ronald Reagan, who said that the IRS rule “threatens the destruction of religious freedom itself.”15 Rather than perpetrators of racial discrimination, parents and administrators at these Christian schools were, in his view, victims of religious discrimination at the hands of an overreaching federal government. This sentiment made its way into the 1980 GOP platform, which vowed to “halt the unconstitutional regulatory vendetta launched by Mr. Carter’s IRS commissioner against independent schools.”16

In this way, the Religious Right did more than start another culture war. It began a radical reevaluation of the role of Christianity in public life. At times its members cast themselves, like evangelicals from the 1830s to the 1930s, as insiders charged with bending American society toward God and the Good. In this morality play, they were benevolent overseers of American politics—part of Nixon’s “silent majority.” More often they cast themselves as the oppressed—rebels who, like folk singers and Beat poets, drank deep of “the romance of the outsider” in American life.17 Expanding this victim mentality to the breaking point, Robert Billings, who would later serve as executive director of the Moral Majority, compared white evangelicals to slaves under the yoke of DC masters:

The cost of political negligence is slavery! As our government increases its crippling pressure on the Christian home, school and church, the need for Christian action becomes increasingly critical. If Christians do not master politics, we will, most certainly, be mastered by those who do.18

Bowing to this newly awakened political giant, the IRS scrapped its 1978 guidelines, instructing local IRS officials to review segregated private schools on a vague (and ineffectual) case-by-case basis. For many evangelicals, this mass protest was their first taste of political power. They indulged that taste by serving up a wide-ranging culture war targeting such matters as abortion, busing, feminism, pornography, art, history, affirmative action, rock music, divorce, violence on television, drug use, gambling, secular humanism, liberalism, high school textbooks, the literary canon, school prayer, Vietnam, and the legacy of the 1960s. All of these subjects were in their view moral and religious. None was subject to negotiation or compromise.

Jerry Falwell and the Moral Majority

BACK IN 1965, in a sermon delivered the day Rev. Martin Luther King Jr. led civil rights marchers from Selma to Montgomery, Alabama, Jerry Falwell, a fundamentalist preacher at Thomas Road Baptist Church in Lynchburg, Virginia, and the host of The Old-Time Gospel Hour television program, suggested that King was a communist. He also criticized “left-wing” leaders of the “so-called freedom movement” for stirring up hatred and violence. “Preachers are not called to be politicians but to be soul winners,” Falwell said.19

In 1976—Newsweek’s “The Year of the Evangelical”—Falwell seemed to be having a change of heart. In this bicentennial year, he organized I Love America rallies in 141 cities in an effort to bring “America Back to God.” On July 4, 1976, he told his followers that “this idea of ‘religion and politics don’t mix’ was invented by the devil to keep Christians from running their own country.” In 1977, he worked with actress Anita Bryant and her Save Our Children campaign to repeal a gay rights ordinance in Dade County, Florida. In 1978, he helped to defeat a Virginia referendum to legalize pari-mutuel betting. In 1979 in Lynchburg, he joined forces with Weyrich and others to form the Moral Majority—a “pro-life, pro-family, pro-moral, and pro-American” organization. So it was no surprise when in 1980 he repudiated as “false prophecy” his earlier stance against mixing religion and politics. “All the moral issues that matter today are in the political arena,” Falwell said. “There’s no way to fight these battles except in that arena.”20

In addition to saving individual souls, Falwell was now committed to saving the soul of the nation, stopping “the growing tide of permissiveness and moral decay” from “crushing our society.” But Falwell was not just beckoning America “back to biblical morality” and “back to patriotism.” He was urging its citizens both to oppose the SALT II treaty, teachers unions, and the Occupational Safety and Health Administration, and to support capitalism, Israel, nuclear weapons, a balanced budget amendment, tax cuts, and massive increases in military spending. “The free-enterprise system is clearly outlined in the Book of Proverbs of the Bible,” he said.21

The marriage Falwell officiated between religious and political conservatives worked because he and his flock came to see big government as morally wrong. The federal government was not just telling him how to run his schools and his family. It was using its coercive power to spread secular humanism. So members of the Moral Majority did not stop at agitating for “family values.” They also agitated for limited government. When Reagan said “government is not the solution to our problem; government is the problem,” they said “Amen,” and when Pat Robertson called government “public enemy number one,” they joined in the chorus.22

Falwell was not the only conservative Christian to convert in the Carter years to the gospel of conservative political activism. In 1977, psychologist James Dobson founded Focus on the Family and Donald Wildmon founded the American Family Association. Two years later, Beverly LaHaye formed Concerned Women for America with the goal of “reversing the decline in moral values in our nation” and televangelist James Robison organized the Religious Roundtable.23 On August 22, 1980, Robison delivered a rousing wake-up call in front of fifteen thousand enthusiastic evangelicals at a Dallas rally for Reagan. “I’m sick and tired of hearing about all of the radicals and the perverts and the liberals and the leftists and the Communists coming out of the closet,” he said. “It’s time for God’s people to come out of the closet.”24 At this same rally, Reagan offered his famous endorsement of the Religious Right: “I know you can’t endorse me. But I want you to know that I endorse you.”25

Many of these leaders were inspired by the popular books, lectures, and films of the goateed evangelical theologian Francis Schaeffer. Schaeffer, who opposed on biblical grounds the “noncompassionate use of accumulated wealth,” was to the left of the Moral Majority on many economic and environmental issues.26 But his argument that the God of the Bible called Christians to involve themselves in every aspect of human life influenced many born-again Christians to pop their spiritual bubbles and get into politics. “If it hadn’t been for Francis Schaeffer, I would have been a pastor in Lynchburg, Virginia, but otherwise never heard of,” Falwell said. “He was the one who pushed me out of the ring and told me to put on the gloves.”27

As they looked back on the birth of the Religious Right and told their own creation stories, other evangelical leaders credited Schaeffer. But more often they pointed to Roe v. Wade. According to this “abortion myth,” as historian Randall Balmer calls it, the leaders of the Religious Right portrayed themselves as “new abolitionists” fighting for the unborn just as abolitionists had fought against slavery.28 This is a good story, but it is not how it happened. Falwell did not preach his first anti-abortion sermon until 1978, and the Southern Baptist Convention did not oppose abortion until 1980.29

“The Religious New Right did not start because of a concern about abortion,” says Ed Dobson, a founding member of the Moral Majority. “I sat in the non-smoke-filled back room with the Moral Majority, and I frankly do not remember abortion ever being mentioned as a reason why we ought to do something.”30 Weyrich’s recollection is similar:

What galvanized the Christian community was not abortion, school prayer, or the ERA. . . . I am living witness to that because I was trying to get those people interested in those issues and I utterly failed. What changed their mind was Jimmy Carter’s intervention against the Christian schools, trying to deny them tax-exempt status on the basis of so-called de facto segregation.31

The IRS rule “kicked the sleeping dog,” says conservative icon Richard Viguerie. “It was the spark that ignited the religious right’s involvement in real politics.”32

Since the early 1960s Weyrich (a Catholic) had tried to bring white evangelicals and political conservatives together in order to raise money and stir up votes on such issues as school prayer, abortion, and pornography. But well into the 1970s he was like the man in the New Testament parable casting seed upon rocky ground. The IRS’s Kurtz, who according to the Moral Majority’s executive director did “more to bring Christians together than any man since the Apostle Paul,” unwittingly replenished that soil, in no small part because so many in the Religious Right were tied so closely to “white flight” academies.33

In Lynchburg in 1967, the same year public schools were finally desegregated in Virginia, Falwell had established a private school called Lynchburg Christian Academy. At its opening it had zero black students—which was also the number of black members at Falwell’s massive Thomas Road Baptist Church. Although he would later repent of his segregationism (and admit blacks to his school and his church), Falwell was at the time a white supremacist.34

Ronald Reagan

TO WAGE THEIR “civil war of values,”35 the conservative Christians who formed the Moral Majority needed soldiers on the ground, in state capitals, and in Washington, DC. They drafted plenty—into “policy institutes, political action committees, fund-raising and direct mail organizations, publishing networks, hundreds of state and local groups, and dozens of national umbrella political organizations.”36 But they also needed a general. They recruited him not from the Deep South but from the western reaches of the Sunbelt in Southern California.

Unlike Falwell, who seemed to have been born to lead troops in this war, Reagan looked at first to be miscast. While endorsing the GOP presidential candidate Barry Goldwater in 1964 in his nationally televised speech, “A Time for Choosing,” Reagan had endeared himself to conservatives by calling on U.S. citizens to choose between free enterprise and big government, individual liberty and totalitarian collectivism, the freedom of capitalism and the slavery of the welfare state. Invoking Moses and Jesus, he spoke for what is “morally right.”37 Reagan knew that conservatism was not just about economics. It was also driven by cultural concerns and shot through with nostalgia for a golden age when men were men, morality was absolute, and America was exceptional. But Reagan was a B-movie actor, rather than an A-list preacher, and the former head of a labor union (the Screen Actors Guild). For much of his adulthood he had been a Democrat, and throughout his life he praised FDR. As California’s governor, Reagan signed a bill allowing women to end their pregnancies in order to safeguard their own health, and he later opposed a campaign to prohibit gays and lesbians from serving as public school teachers. Equally important, Reagan never showed much interest in Christianity before the late 1970s. He was an irregular churchgoer. He was divorced. And his second wife, Nancy Reagan, had been pregnant when they married. For a man who would come to be identified with “family values,” he had horribly strained relationships with his four children. His son Ron called him “distant and inattentive”; he didn’t meet his grandchildren until the 1980 presidential campaign necessitated it; and when his daughter Maureen ran for the Senate in 1981 he refused to endorse her.38

Like Federalist ministers who ignored John Adams’s heterodoxy in the election of 1800, conservative Christians were pragmatic enough to overlook these shortcomings. So they endorsed Reagan just as enthusiastically as he had endorsed them. Thanks in part to the Moral Majority’s voter mobilization efforts (the Iranian hostage crisis was also key), Reagan defeated Carter in 1980, and during his presidency, which ran from 1981 to 1989, Reagan thanked the Religious Right repeatedly for its support.

One “thank you” came in January 1982, when the Reagan administration intervened in litigation between the IRS and Bob Jones University. BJU had not enrolled African American students until 1971; from 1971 to 1975 it had enrolled only married blacks; and in 1975 it had adopted a series of rules against interracial dating and interracial marriage (and against advocating either). BJU had appealed on religious liberty grounds the IRS decision to revoke its tax-exempt status, arguing that its racial policies were based on the religious conviction that “the Bible forbids interracial dating and marriage” and were therefore protected under the First Amendment.39 The courts agreed first with BJU and later with the IRS. In 1982, as the Supreme Court was about to hear oral arguments, Reagan sided with the university, ordering Treasury and Justice officials to drop the case. This action produced an outcry on the left nearly as fierce as the Religious Right’s reaction to the 1978 IRS guidelines. Why would a president, nearly three decades after Brown, support such a blatantly racist institution? In 1983, the Court ruled against BJU, ordering it to pay a million dollars in back taxes.

As these IRS disputes demonstrate, the culture wars of the 1970s and beyond were to some extent (and sometimes to a great extent) about race. In 1983, culture warrior Sen. Jesse Helms spent sixteen days filibustering a bill to create a federal holiday in honor of Martin Luther King Jr. Reagan himself was not above playing the “race card” by denouncing Cadillac-driving “welfare queens” and “strapping young bucks” buying T-bone steaks with food stamps.40 (He also called the popular television miniseries Roots “rather destructive” because of “the bias of all the good people being one color and all the bad people being another.”41) The Religious Right first flexed its muscles and felt its power in the fight against IRS “segregation academy” rules. And the rhetoric on the right about small government, states’ rights, and welfare cheats was always also about race. But race was not the central concern in the culture wars. The racial anxiety that got the movement going in the South also resonated in the heartland. But it could not have sustained a nationwide movement without two key pivots from the Religious Right.

Pivoting from Race to Religion

THE FIRST WAS a pivot from race to religion, which Falwell and his friends pulled off in the IRS protest. Conservative culture warriors have a long history of co-opting liberal rhetoric—of “talking left while walking right.”42 In the nineteenth century they opposed religious liberty for Catholics and Mormons in the name of preserving religious liberty for Protestants. In the contemporary culture wars they described abortion as a “holocaust,” the ERA as a violation of women’s rights, and affirmative action as “reverse discrimination.” But nowhere was this rhetorical reversal as inventive as in the debate over segregation academies.

Although Falwell obviously supported segregation academies (he was, after all, the founder of one), he did so in the name of religious liberty, not white supremacy. The 1978 IRS rule was not about integrating private schools, he argued. It was about controlling Christian ones. In its 1962 and 1963 rulings, the Supreme Court had launched a war on God in the public schools. Now the federal government was bringing that fight to private schools.

The genius of this first pivot was that it redefined the Religious Right as victims of bigotry rather than bigots themselves. The schools they were defending were Christian academies, not segregation academies, they argued. Their aim was to protect schoolchildren from secularization, not integration. As the Religious Right put it, those who wanted to close their schools were motivated not by racial equality but by anti-Christian bias. Unbelievers wanted to impose their own religion—“secular humanism”—on everyone else, insisting, as Lutheran minister Richard John Neuhaus would later frame it, on a “naked public square” denuded of religious reasoning.43 “In one fell swoop,” writes political scientist Corey Robin, “the heirs of slaveholders became the descendants of persecuted Baptists, and Jim Crow a heresy the First Amendment was meant to protect.”44

As the 1980s arrived and the Reagan Revolution advanced, culture warriors leapt gleefully over Jefferson’s famous wall of church–state separation. Reviving imagery from Gov. Winthrop’s Arbella sermon (imagery that itself was derived from Matthew 5:14), Reagan spoke of the United States as a “shining city on a hill.” And what was shimmering was not just political or economic freedom but also Christian liberty. In an effort to reintroduce God to the classroom, Reagan supported school prayer and the teaching of creationism. He adopted the rhetoric of sin and evil, he spoke of Americans as God’s chosen people, and he turned “God bless America” into a coda for all future presidential proclamations. “The First Amendment was not written to protect the people and their laws from religious values,” he told the National Religious Broadcasters in 1983. “It was written to protect those values from government tyranny.”45

Pivoting from Race to Family

THE SECOND PIVOT of Falwell and the Religious Right was from race (where the Bible was not particularly helpful) to family (where it was).46 Like Pope Paul VI, who referred to the family in his 1968 encyclical Humanae Vitae as “the primary unit in the state,”47 conservative Christians rejected the classical liberal view (of John Locke and others) that society is a compact of individuals. The family, Falwell argued, is “the fundamental institution of society, an immutable structure established by our Creator.”48 That institution was now in danger of being lost at the hands of an overreaching federal government.

Yes, abortion was murder and homosexuality was unnatural. But each also undermined family life. Similarly, feminism was dangerous because (among other things) it confused the distinct roles men and women and boys and girls were to play in the “traditional family,” which Falwell and his followers understood to be of a singular sort: one male breadwinner and one female homemaker, married, with children, living under one roof and the patriarchal authority of the man of the house. The ERA, which the Religious Right sent down to defeat in 1982, confused these responsibilities, luring women out of their roles as housewives and challenging the role of the husband as “head of the wife.” “Breadwinner conservatism thus largely replaced, or incorporated in subtle ways, white supremacy as the public face of southern conservatism,” writes historian Robert Self.49

As they turned their attention to abortion, homosexuality, feminism, and “family values,” conservative Christians joined the perennial American battle over the one and the many—the battle between those who believe the alternative to a unified society is chaos and those who embrace pluralism as a positive good. “No one has a monopoly on the family,” said Bella Abzug, giving voice at the 1977 National Women’s Conference to the feminist position that family, like abortion, should be a matter of choice.50 Women didn’t have to be homemakers. They didn’t have to marry or have children. And they didn’t have to restrict their sexual lives to marriage or even to men. When it came to family, liberals said, let a thousand flowers bloom.

Conservative culture warriors rejected this multiplication of family types, insisting that there is one eternal and unchanging design for the family. (They were livid when the Carter administration reimagined its 1980 “Conference on the Family” as the “Conference on Families.”) Deviations from this godly design had created a social crisis visible in skyrocketing divorce rates, abortion on demand, urban crime, rampant pornography, government-funded child care, two-worker households, and homosexuality. Again this issue helped to fuse political and cultural conservatives, by providing a moral and religious justification for antipathy to government. The reforms of FDR’s New Deal and Johnson’s Great Society had turned the powers of the federal government against the “traditional family.” Now culture warriors were determined to redirect those same powers toward supporting it.

Christian America

AFTER MAKING THESE two pivots—from race to religion and family—conservatives needed a master narrative to bring everything together. They found it in the story of the rise and demise of “Christian America.” Instead of defending white superiority and the “Southern way of life,” they defended “Christian America.”

According to this narrative of loss and recovery, the United States is not an experiment in Enlightenment virtue undertaken by Deists and Unitarians who wrote a godless constitution. It is a Christian project initiated by men who followed Jesus Christ as their Lord and Savior. The Declaration of Independence spoke of a “Creator” who “endowed” U.S. citizens with “certain unalienable rights.” And the First Amendment’s establishment clause was designed to protect the church from state interference, not the state from church interference. Sixties liberals spat in the face of this legacy, as the Supreme Court banished God and the Bible from the public square and the sexual revolution replaced sexual restraint with sexual license. From this perspective, only Eisenhower-era Christians (or, in some versions of the story, Judeo-Christians) were true Americans. The secular sorts in nonclerical robes who ruled in Engel, Schempp, and Roe were not defenders of the American faith but enemies of it. “Secular humanism,” too, was un-American. And those who proselytized for it were guilty of treason as well as apostasy.

The broad outline of this quintessentially conservative tale derived from the narratives of loss and recovery of other culture wars, but the most intimate antecedent was what Southern historian Charles Reagan Wilson has referred to as “the religion of the lost cause.”51 After the Civil War, he argues, Southerners embraced a civic theology that cast them as martyrs: God’s chosen people even in the jaws of defeat. More than a century later, a cherished way of life was again under attack, and the fact that the righteous were losing only made them all the more righteous. As of the late 1970s, however, this way of life was no longer confined to one region. It drew its sustenance from “Christian America” and the “traditional family”—two symbols that acquired an air of sacrality before melting into each other under the pressure of the Religious Right: one nation under the providence of God and the family patriarch.

Yes, the Religious Right was born of anxieties over racial mixing and the demise of white supremacy. But if it had continued to speak as Falwell spoke in his pro-segregation sermons, it would not have traveled much farther than the five Southern states Alabama governor George Wallace carried against Richard Nixon and Hubert Humphrey in the 1968 presidential election. What gave the Religious Right reach was its leaders’ decision to harness the anger and frustration conjured up by the demise of legal segregation to causes that traveled better. Instead of rushing to the barricades of white superiority and the “Southern way of life,” they rushed to the defense of “Christian America” and the “traditional family.” Instead of denouncing the federal government’s interference in racial segregation, they denounced its intrusion into family and religious life—except, of course, when they were demanding its intrusion into individual choices about the religions Americans would affirm and what sorts of families they would form.

Liberal Counterattack

THESE CONTEMPORARY CULTURE wars were again asymmetrical; the Right fired far more shots than the Left returned. But the Left did respond. Historian Henry Steele Commager criticized the Religious Right for its preoccupation with “private sin” at the expense of “social evil.” “They have much to say about the wickedness of limiting posterity, whether by birth control or abortion,” he wrote, “but very little if anything to say about the kind of world children will be born into or about the systematic destruction of a rightful inheritance of natural resources.”52

Bart Giamatti, a classicist turned Yale president who would later serve as Major League Baseball’s commissioner, attacked the Moral Majority as a threat to liberty, liberal arts education, and religious pluralism in a blistering August 1981 address to incoming Yale freshmen. These “peddlers of coercion” are “angry at change, rigid in the application of chauvinistic slogans, absolutistic in morality,” he said. They “presume to know what justice for all is . . . which books are fit to read, which television programs are fit to watch, which textbooks will serve for all the young” and “when human life begins.” For them, “there is no debate, no discussion, no dissent. . . . There is only one set of overarching political and spiritual and social beliefs; whatever view does not conform to these views is by definition relativistic, negative, secular, immoral, against the family, anti-free enterprise, un-American.” This isn’t just “nonsense,” Giamatti continued. It is “dangerous, malicious nonsense.” It rejects “the very pluralism—of peoples, political beliefs, values, forms of merit, and systems of religion—our country was founded to welcome and foster.”53 There is not just one way to love one’s family, one’s country, and one’s God.

Canon Wars

DESPITE GOP ELECTORAL gains in the Reagan era, conservatives did feel “dispossessed by change.”54 As the counterculture went mainstream, they experienced a litany of losses: the loss of the patriarchal family in which mothers and fathers, daughters and sons, all played their appointed roles and demonstrated the appropriate deference; the loss of a Christian America, where the authority of the biblical witness was unchallenged by Hindus, Buddhists, or nonbelievers; the loss of a white society, in which it seemed either natural or divinely ordained that those who occupied the White House and corporate boardrooms would be sturdy stock of European descent; and the loss at the end of the Vietnam War of faith in Americans as God’s chosen people. These losses triggered anxieties in many, and their anxieties found expression in “canon wars” about history and literature, and “art wars” about ethics and aesthetics.

Since the emergence of public schooling in the 1820s, Americans had fought over what should be taught in public schools, colleges, and universities. The deadly “Bible wars” of the 1840s were about which Bible would be read in public schools, and the Scopes Trial of the 1920s turned on whether those schools could teach evolution. In the twentieth century, Southern conservatives denounced textbooks that glorified Lincoln’s fight against the Confederacy, and laissez-faire capitalists denounced those that lauded FDR’s New Deal. Irish Americans and German Americans demanded to be seen and heard in textbooks, and African Americans and American Indians did the same, protesting racial slurs and negative stereotypes and demanding not only recognition of their contributions to U.S. history but also acknowledgment of the evils visited upon them. The Civil Rights Act (1964) and the Voting Rights Act (1965) prepared the way for many African Americans to enter high school and college teaching jobs. And as 1960s activists moved from the streets into tenured positions, high school textbooks and college curricula started to reflect what abolitionist Frederick Douglass once described as our “composite nation.”55 In secondary schools, textbooks championed Douglass, W. E. B. Du Bois, and Eleanor Roosevelt alongside Washington, Franklin, and Jefferson.56 In higher education, departments of African American Studies and Women’s Studies proliferated.

Bennett’s Legacy

WILLIAM BENNETT, WHO served under Reagan as National Endowment for the Humanities chairman and secretary of education, aimed to reverse these trends with a controversial manifesto. Based on consultations with a study group of thirty-one educators, To Reclaim a Legacy: A Report on the Humanities in Higher Education (1984) fired the first shot across the bow of the new pluralism in U.S. colleges and universities.

Like any good conservative, Bennett started with a lament. Most students who graduate from U.S. colleges, he argued, are “lacking even the most rudimentary knowledge” of Western civilization because professors are either ignoring the humanities or teaching them as “one subject among many.” The humanities, which Bennett defined (following Matthew Arnold) as “the best that has been said, thought, written,” had once fostered community by transmitting “a common culture” to students. But educators no longer agreed on a common core, because they had succumbed to the false view that “all meaning is subjective and relative to one’s own perspective.” The humanities were also suffering from the intrusion of “special interest politics” and a shift from delivering content to cultivating skills.57

We need “a common curriculum with humanities at the core,” he said. This curriculum would include

         an understanding of the history of Western civilization;

         “a careful reading of several masterworks of English, American, and European literature”;

         “an understanding of the most significant ideas and debates in the history of philosophy”; and

         “demonstrable proficiency in a foreign language.”58

Bennett knew that American colleges and universities could not turn the clock back to a “classical curriculum” taught to “only a privileged few.” He also knew that his “common curriculum”—“the glue that binds together our pluralistic nation”—would need to be “sensitive to the long-overlooked cultural achievements of many groups.” So he included in it “some familiarity with the history, literature, religion, and philosophy of at least one non-Western culture or civilization.”59 All that came with a caveat, however, and a big caveat it was:

But the core of the American college curriculum—its heart and soul—should be the civilization of the West, source of the most powerful and pervasive influences on America and all of its people. It is simply not possible for students to understand their society without studying its intellectual legacy. If their past is hidden from them, they will become aliens in their own culture, strangers in their own land.60

Hirsch’s Cultural Literacy

IN HIS MANIFESTO, Bennett had offered a list of a few dozen canonical authors and titles—from classical antiquity, medieval and modern Europe, and American history and literature—including Homer, Plato, Dante, Shakespeare, Locke, Twain, and the Bible. In his 1987 bestseller, Cultural Literacy: What Every American Needs to Know, University of Virginia English professor E. D. Hirsch Jr. offered a list of over four thousand building blocks of “cultural literacy.” Whereas Bennett’s list was prescriptive (what we should know), Hirsch’s was descriptive (what literate Americans actually know). Moreover, Hirsch set a lower bar, insisting only that his “common reader” know the terms “Huckleberry Finn” and “I Have a Dream,” not that she actually read them.61

Like To Reclaim a Legacy, Hirsch’s Cultural Literacy was driven by a perceived “decline of literate knowledge.” Three-quarters of American seventeen-year-olds could not identify Henry Thoreau or Walt Whitman, Hirsch warned, and two-thirds could not place the Civil War in the second half of the nineteenth century. Hirsch, a Democrat, understood cultural literacy as an “avenue of opportunity for disadvantaged children.” His book’s central insight was this: In order to succeed in society, you need to be able to communicate effectively, and in order to communicate effectively you need to possess the “shared culture” upon which communication depends. Cultural literacy was that “shared culture”—“the network of information that all competent readers possess”—and Hirsch was determined to deliver it to underprivileged children. Taking possession of this prize would enable American youth of all races and classes to make sense of the Declaration of Independence, but it also would enable them to read the Black Panther Party platform of 1972, which referred to “the Declaration of Independence, the Pledge of Allegiance to the Flag, the Gettysburg Address, and the Bible.” You need “traditional culture,” Hirsch wrote, to understand “progressive ideas.”62

Bloom’s Closing of the American Mind

IN CULTURAL LITERACY, Hirsch focused on elementary education. In The Closing of the American Mind (1987), University of Chicago political philosopher Allan Bloom focused on higher education. “The truth is the one thing most needful,” he wrote in the book that came to define the canon wars, and that most needful thing was under attack in the very citadel built to defend it.63 This book was not, as cultural critic Camille Paglia once claimed, “the first shot in the culture wars,” but it definitely hit the bull’s-eye.64 Reviewer after reviewer scratched heads trying to figure out how a nerdy volume that included extensive meditations on Plato, Machiavelli, Heidegger, and Nietzsche could become a bestseller and transform its author from an unknown academic into a celebrity scold.

Bloom thrilled conservatives by taking on feminism, deconstructionism, historicism, rock music, the New Left, black power, the cult of progress, secularization, divorce, equality, radicalism, democracy, affirmative action, the sexual revolution, immodesty, modernity, nihilism, the Soviet Union, the feminization of American culture, and the “decomposition of the university.” These woes he traced to the “bad sixties,” when radical youth spurned the powers that be, and to ancient Athens, when those same powers put Socrates to death. But his central target was relativism. “There is one thing a professor can be absolutely certain of: almost every student entering the university believes, or says he believes, that truth is relative,” he wrote. And his book was devoted to demonstrating that relativism is for the faint of heart. Bloom attacked relativism because he believed in discriminating between truth and falsehood, good and evil, the profound and the superficial, the educated and the uneducated. All cultures were not equal, and no book was trapped in the historical moment in which it was written. But he also attacked relativism because he was convinced that it sapped students of passion.65

Bloom saw in his students nothing of the intellectual eros—the longing for truth, beauty, and the good—that had characterized the “Great Books” education he had received at Chicago in the glory years of the 1940s and 1950s. He wanted to bring longing back to the student body, and to stir students’ souls in the process. (Souls Without Longing was the book’s original title.)

Economist Albert Hirschman has written of a “rhetoric of reaction” that “delights in almost automatic inversion, whereby increased liberty leads to slavery” and “open-mindedness is a form of fanaticism.”66 Bloom drew on this conservative rhetoric, fretting about “the great democratic danger” that, by liberating the individual from tradition, only enslaved him to the banality of the here and now. While the university once shaped students into lovers of ancient wisdom, it now contorted them into parrots of their peers. As a result, young people were trapped inside the illogic of a democratic society that fetishized the useful, sneered at the theoretical, and lauded tolerance more than truth. Ironically—and this is the point of the title—all this “open-mindedness” closed the minds of American youth to the thinkers and ideas that might actually liberate them. These soulless students exhibited none of the “humanizing doubts” that infuse the dialogues of Plato. Among them, the “great-souled man, who loves beautiful and useless things” had gone missing.67

Much of the passion in this diatribe seems to have derived from Bloom’s experience of a Cornell student rebellion in the late 1960s and early 1970s—a period he recalled as “an unmitigated disaster” for higher education. In this rebellion, as he tells it, reason collapsed under the force of emotion as students, faculty, and administrators conspired to politicize and trivialize the university. Although student radicals saw themselves as enemies of authority and partisans of nonconformity, they were in Bloom’s view conformists to the core and their age a “period of dogmatic answers and trivial facts.”68

Bloom knew that the Reagan era had witnessed a revival of core curriculum requirements, but this revival failed to stir his soul, since universities still refused to put philosophy and the humanities atop the disciplinary pecking order. In fact, they refused to recognize any pecking order at all. Neither would they weigh in on what made for an educated person. Bloom’s solution was to return to “the good old Great Books approach.” This meant “trying to read [the classics] as their authors wished them to be read” (i.e., closely and slowly) and wrestling in the process with the perennial questions they posed: “Is there a God? Is there freedom? Is there punishment for evil deeds? Is there certain knowledge? What is a good society?”69

Backlash

AS BENNETT WORKED Washington, DC, and Bloom and Hirsch charged up the bestseller lists (at one point occupying the top two nonfiction slots), liberals launched a canon wars counterattack.

University of Chicago professor Jonathan Z. Smith compared the authorial voice in To Reclaim a Legacy to British administrators at the twilight of the Indian Raj who spoke “with the smug, yet wistful voices of dying colonialism.” Bennett’s imagined past “appears to lack women, Jews, Byzantines, and Muslims,” Smith observed. How can he speak so blithely of “the western mind” as a singular entity? Doesn’t he know that even inside his list of a few dozen canonical texts “there are differences and dissonances”—that the work of the humanities always involves the complexities of “translation, interpretation, and historical consciousness?” “The nostalgic tone of his Report is apt,” Smith wrote, for what Bennett awaits is “the coming of last year.”70

African American literature scholar Henry Louis Gates Jr. said that Bennett’s study offered a “nostalgic return” to “the ‘antebellum aesthetic position,’ when men were men, and men were white, when scholar-critics were white men, and when women and persons of color were voiceless, faceless servants and laborers, pouring tea and filling brandy snifters in the boardrooms of old boys’ clubs.” To return to this canon, Gates continued, is to return to “an old order in which my people were the subjugated, the voiceless, the invisible, the unrepresented, and the unrepresentable. Who would return us to that medieval never-neverland?”71

Literary critic Robert Scholes attacked Hirsch for claiming that memorizing lists of key terms would raise living standards, improve social justice, extend democracy, and make the United States more competitive in global markets. (“Voodoo education,” he called it, echoing George H. W. Bush’s critique of “voodoo economics.”72) The authors of Multi-Cultural Literacy (1988) deemed Hirsch’s list “alarmingly deficient in its male and European bias.” They then offered a list of their own, which included Cinco de Mayo, Betty Friedan, Toni Morrison, Wounded Knee, beatnik, ashram, condom, karma, Little Red Book, rap music, and Swahili.73

Still, Bloom got the lion’s share of the criticism (and the book sales). When neoconservative Norman Podhoretz called his bestseller “the most devastating assault on the liberal culture that anyone has produced in our time,” he meant it as a compliment.74 Liberals and radicals were less kind. Writer and editor David Rieff called The Closing “a book decent people would be ashamed of having written.”75 Philosopher Robert Paul Wolff wrote a tongue-in-cheek review in which he pretended that Saul Bellow had written a “funny novel in the form of a pettish, bookish, grumpy, reactionary complaint against the last two decades.”76 Other reviewers called Bloom’s book sexist and racist. They criticized his esoteric writing and shoddy scholarship. They lamented his misanthropy. They called him to task for scapegoating his students and for blindly following the Straussian orthodoxy of his teacher, political philosopher Leo Strauss. In a book called The Opening of the American Mind (1996), historian Lawrence Levine insisted on making room in the canon for non-Western sources.

The core complaint was that The Closing was elitist and, therefore, anti-American. Political scientist Benjamin Barber referred to Bloom as a “Philosopher Despot” and to his bestseller as “one of the most profoundly antidemocratic books ever written for a popular audience.” Why, he asked, are American readers so enamored of this “raging assault on liberal tolerance and democratic education”—a book that exalts “the few who embody philosophy” and sneers at “the many who embody America”? Barber knew that lots of conservative readers were “anxious about the loss of fixed points, wishing for simpler, more orderly times,” and he acknowledged that these “anxious ones” were wary about liberty lapsing into anarchy and equality into mediocrity. But Barber cast his vote for American values nonetheless.77

Unyielding, Bloom wore his elitism with glee. In a 1988 Harvard speech, he greeted his audience as “Fellow Elitists!” Hadn’t he observed in The Closing that real education was not for everyone? Philosophy is a way of life, he wrote, and only a “tiny band of men” are equipped to do it.78

You won’t get this from reviews, but neither Cultural Literacy nor The Closing was lacking in nuance. Praising “the virtues of multicultural education,” Hirsch wrote that “tolerance of diversity is at the root of our tradition.”79 And Bloom, who sneered at “impoverishing certitudes,” seemed to disdain blind conservatism nearly as much as blind liberalism.80 In fact, Bloom repeatedly refused to label himself a conservative, noting that “any superficial reading of my book will show that I differ from both theoretical and practical conservative positions.”81 If he were alive today, Bloom would be no friend of the anti-intellectual populism of Rush Limbaugh, Ann Coulter, and Bill O’Reilly. The point of philosophy, in his view, was not to arrive at the truth; it was to deliver one from dogma. Still, both authors were drafted into the “right” side of the culture wars, where (as in all warfare) nuances go to die.82

“The Closing of the Stanford Mind”

AMID THIS FUROR, Stanford University opened up another battlefield in the canon wars when it reexamined its Western Culture requirement—a yearlong core curriculum course that required all freshmen to read fifteen classics in Western literature, from the Bible, Homer, and Plato to Dante, Galileo, Marx, and Freud. In the spring of 1986, the Stanford Black Student Union denounced the course’s “core list” as racist. The message of Western Culture, said former Black Student Union president Amanda Kemp, was “nigger go home.”83 Soon Hispanic, Asian, and feminist groups were piling on, arguing that Western civilization was nothing to celebrate and calling Western Culture sexist and Eurocentric.

In January 1987, on Martin Luther King Jr.’s birthday, Democratic presidential candidate Jesse Jackson marched in solidarity with Stanford student protesters, who chanted, “Hey, hey, ho, ho, Western Culture’s got to go.” That spring, minority students occupied the offices of Stanford president Donald Kennedy, demanding curricular changes. In March 1988, the Stanford Faculty Senate voted 39 to 4 to replace its old requirement with a new, more pluralistic sequence, Cultures, Ideas, and Values (CIV), which would include, in addition to Western classics, “works by women, minorities and persons of color,” attention to at least one non-Western culture, and at least one work focusing on race, class, or gender.84 Days after that vote, William Bennett came to Palo Alto to denounce “the closing of the Stanford mind.” The decision to axe Western Culture, he said, was political rather than educational—the result of “pressure politics and intimidation,” not “enlightened debate.” “Those who attack Western values and accomplishments do not see an America that—despite its imperfections, its weaknesses, its sins—has served and continues to serve as a beacon to the world,” he said. “Instead, theirs is an America hopelessly tainted—tainted by racism, imperialism, sexism, capitalism, ethnocentrism, elitism.”85

Journalist Colman McCarthy accused Bennett of “academic nationalism” and of treating Stanford students like “invading Huns,” but a chorus of conservative applause drowned out the liberals.86 Harking back to the corruption of Boss Tweed’s Tammany Hall, columnist George Will denounced Stanford’s “academic spoils system,” which was designed not to promote the truth but “to satisfy racial, sexual and ethnic groups.”87 “Affirmative action for great books is an embarrassment,” wrote conservative columnist Charles Krauthammer.88 “If Jane Austen and Virginia Woolf appear in the curriculum,” added City University of New York (CUNY) professor Gertrude Himmelfarb, “there should be no ambiguity about the grounds of their inclusion—not as women novelists but as first-class novelists.”89

The canon wars did not end at Stanford, of course. Conservatives reacted with horror as universities across the country tried to address “multicultural illiteracy” with requirements that emphasized non-Western cultures and the contributions of women and minorities to U.S. history.90 Local fights concerned whether The Adventures of Huckleberry Finn was an appropriate book for high school classrooms or hometown libraries. Is Twain’s novel “racist trash,” as one educator argued, or is it “a matchless satire on racism, bigotry, and property rights in human beings”?91

Similar questions bedeviled the overseers of state and national standards for secondary education. As New York State revised its social studies standards in the direction of multiculturalism, CUNY professor Arthur Schlesinger Jr. gave voice to the anxiety of many traditionalists that America’s pluribus was overrunning its unum. In his bestseller The Disuniting of America (1991), Schlesinger acknowledged the importance of women’s history and Latin American, Asian, and African history but described “Europhobia” and “the new ethnic gospel” as dangers to the long-standing American ideals of assimilation and integration, and to our “common ideals” and “common culture.” “The bonds of national cohesion in the republic are sufficiently fragile already,” he wrote. “Public education should aim to strengthen those bonds, not to weaken them. If separatist tendencies go on unchecked, the result can only be the fragmentation, resegregation, and tribalization of American life.”92 Schlesinger also made an important historical observation: that universalist arguments for unity used to be regarded as liberal, but with the rise of identity politics, they had come to be seen as conservative. In a similar turnabout, many Republicans today oppose the Common Core State Standards.

Piss Christ

IN 1989, THE culture wars visited the art world, as cultural conservatives found in the museum a new site for airing fears about the rise of homosexuality and the decline of the West. Again combatants debated the one and the many, objectivity and subjectivity. Again they politicized a venue that was supposed to be free from political influence. In these fights, culture warriors grew increasingly attuned not simply to symbols such as the flag and the cross but also to images—visual representations of naked black bodies and aborted fetuses.

Back in 1981, pundits and politicians had debated the winning design for a proposed Vietnam War memorial on the National Mall. Critics of that design, by Yale College undergraduate Maya Lin, objected to its minimalism and to its refusal to celebrate the war as (in Reagan’s terms) a “noble cause.”93 Noting that its two sunken walls were to be black granite instead of the white granite favored elsewhere in the capital, one critic termed the proposed memorial “a black gash of shame and sorrow.” Journalist Tom Wolfe called it “a tribute to Jane Fonda.” Others judged it inane, elitist, depressing, perverse, insulting, antiwar, and nihilistic. Lin’s defenders charged that these critics were motivated less by artistic or even political calculations than by sexism, racism, and anti-intellectualism. (Lin was a woman, an Asian American, and a Yalie.) Look at her simple design without prejudice, they said, and you will see an eloquent piece of art—a perfect expression of a nation deeply divided over the Vietnam War.94

In the late 1980s, debates over art turned from memorials to photographs, and from patriotism to family values. Donald Wildmon of the American Family Association stirred the pot with an April 5, 1989, public letter denouncing Andres Serrano’s Piss Christ—a photograph of a wood and plastic crucifix awash in the artist’s urine. Previously, Wildmon had attacked the indecencies of popular culture, pressuring 7-Eleven to stop selling Playboy and Penthouse, and CBS to remove a scene from a cartoon supposedly depicting Mighty Mouse sniffing cocaine. Now he turned to high art. “The bias and bigotry against Christians, which has dominated television and movies for the past decade or more,” he contended, “has now moved over to the art museums.” First came The Last Temptation of Christ depicting “Jesus as a tormented, deranged, human-only sinner,” then Madonna’s “Like a Prayer” video depicting “Christ having sex with a priest,” and now Serrano’s “desecration of Christ,” Wildmon wrote. “Maybe, before the physical persecution of Christians begins, we will gain the courage to stand against such bigotry.”95

Piss Christ had been shown without fuss or fury in Manhattan, Los Angeles, Pittsburgh, Richmond, Virginia, and Winston-Salem, North Carolina. According to Serrano, it was “absolutely not calculated to offend.”96 But offend it did. Members of the AFA’s massive mailing list barraged Congress and the National Endowment for the Arts (NEA) with protest letters and phone calls. So did viewers of Pat Robertson’s 700 Club television program and members of Phyllis Schlafly’s Eagle Forum. When Pat Buchanan looked at Piss Christ he saw an “anti-Christian, anti-American, nihilistic” society in which liberals were repeatedly coming out on top. “While the right has been busy winning primaries and elections, cutting taxes and funding anti-communist guerrillas abroad, the left has been quietly seizing all the commanding heights of American art and culture,” he said, before calling for “a cultural revolution in the ’90s as sweeping as its political revolution in the ’80s.”97

On May 18, 1989, senators Alfonse D’Amato (R-NY) and Jesse Helms (R-NC) denounced “this piece of trash” and the “jerk” who had produced it.98 And on June 8, Rep. Dick Armey (R-TX) sent NEA director Hugh Southern a letter threatening NEA budget cuts signed by over one hundred members of Congress. Stop paying for “morally reprehensible trash” and develop new grant-making guidelines that respect “public standards of taste and decency,” Armey demanded. Otherwise, the House would “blow their budgets out of the water.”99

Since Congress created the NEA in 1965, its funds had been allocated through peer-review panels meant to insulate it from political interference and the artists it supported from infringements on artistic freedom. Serrano had received fifteen thousand dollars in NEA funds via the Southeastern Center for Contemporary Art in Winston-Salem, North Carolina. That drove D’Amato and Helms to distraction. In “a ritual counter-desecration,” D’Amato tore up a copy of Piss Christ on the Senate floor.100 “This is not a question of free speech,” he said. “This is a question of abuse of taxpayers’ money.” Taking a more pious tack, Helms called Serrano’s “so-called artwork” a blasphemous effort to “dishonor our Lord.” But Helms, too, followed the money. The Constitution may allow Serrano to produce and display his photographs, he argued, but “it certainly does not require the American taxpayers or the Federal Government to fund, promote, honor, approve or condone it.” The remedy? An amendment to appropriations legislation that would prevent the NEA from funding “obscene or indecent materials” (including homoeroticism); “materials which denigrate the objects or beliefs of the adherents of a particular religion or non-religion”; and “material which denigrates, debases, or reviles a person, group, or class of citizens on the basis of race, creed, sex, handicap, age, or national origin.”101

Robert Mapplethorpe and the NEA

ON JUNE 12, 1989, Christina Orr-Cahall, director of the Corcoran Gallery of Art, located one block from the White House, abruptly canceled an exhibition of Robert Mapplethorpe photographs slated to open July 1. Robert Mapplethorpe: The Perfect Moment had been organized by the Institute of Contemporary Art at the University of Pennsylvania, which had received thirty thousand dollars in NEA funding for it. Orr-Cahall reportedly feared that the exhibition, which included homosexually explicit images, could provoke Congress to cut NEA funding (or eliminate the agency altogether).

The conservative Washington Times called this turn of events “the best thing to happen in American arts since Philip Roth lost the manuscript of a novel in a New York men’s room.” Other traditionalists judged Mapplethorpe’s photos to be pornography, not art. Real art ennobles, they argued, expressing and reinforcing eternal values such as “the good, the true, and the beautiful.” Mapplethorpe’s “snapshots of a tourist in hell” sneered at those values and at the ordinary Americans who held them dear.102

Repeatedly critics referred to Mapplethorpe as a “homosexual artist” whose death by AIDS in 1989 was a just punishment for his sins. In a bizarre op-ed piece, Washington Times columnist Richard Grenier fantasized six separate times about locating Mapplethorpe’s corpse, dousing it with kerosene, and setting it on fire. As for the “shocking and depraved” photographs, many could not even bring themselves to describe them. Others seemed to relish the opportunity, focusing on images that (by their gaze) featured “a little girl with her dress up, one man about to urinate in the mouth of another, an elephantine penis hanging out of a pair of pants, and the ‘artist’s’ nude photo of himself, with a bullwhip sticking out of his rectum.”103 In short, Mapplethorpe seemed to epitomize everything that had gone wrong with music, literature, and the arts since the salacious 1960s. He, too, glorified the “libertinism and polymorphous perversity” that led through the drug-induced haze of Woodstock to the violence of Altamont.104

Echoing parallel disputes over education, art warriors accused the Left of seeking to destroy, in the name of “primitivism, feminism, racialism, multiculturalism, and homophilia . . . every traditional social institution, beginning with the church . . . and ending with the family.” In the name of uniformity, hierarchy, and deference, The New Criterion publisher Samuel Lipman took aim at multiculturalism, which he defined as “a widespread assault on what is variously called Western, or European, or white-dominated or male-dominated civilization.” According to Lipman, the “multicultural agenda” functioned in the arts as a sort of aesthetic relativism, asserting “that the art of all peoples is equally worthy of preservation and presentation.” In the absence of a clear commitment to a hierarchy of values, the “imperishable masterpieces” of the European tradition were reduced to “no more than one kind of ethnic manifestation.”105

The Right also accused the Left of putting advocacy before art, but the conservatives in this battle were advocates too—for pre-modern art over the avant-garde, and for heterosexuality over the “florid and variant sexualities” of Mapplethorpe’s pictures.106 Every dollar given to an African American artist or a lesbian was a dollar snatched from a straight, white male.

In this era, which historian Daniel Rodgers has characterized as an “age of fracture,” Lipman held on to the dream of one “common civilization” and an NEA committed to preserving “our common cultural and artistic inheritance.”107 But liberals, unafraid of fracture, spoke of civilizations in the plural, criticizing any appeal to universality as a political attempt to valorize one artistic vision and to render invisible the rest.

Mapplethorpe’s opponents kept returning to taxpayer funding, however. Censorship is not the issue, they said. Neither is artistic freedom. The issue is federal funding. Artists cannot expect to receive grants without some congressional oversight, especially if they are going to produce “pornography and anti-Christian bigotry.” “If art and religion are to be free of state influence, then they must indeed be free of state influence,” reasoned Sen. Slade Gorton (R-WA). “If they are to be free of censure, they cannot depend on subsidy.” Or, as Helms put it, “People who want to scrawl dirty words on the men’s room wall should furnish their own walls and their own crayons.” It was high time for this “so-called art” to come to grips with the laws of supply and demand. “Let the NEA and the artists they support meet the same tests as other artists in our society—the demand of the marketplace,” said the American Family Association.108

Defending the NEA

AS LIBERALS WERE quick to point out, this was precisely what the NEA was created not to do. It was designed to insulate artists from the vagaries of the marketplace—to support through public funds art that would not be supported privately. What the nation was now witnessing was the erosion of that buffer—the blatant interference of conservative lawmakers in aesthetic matters quite beyond their ken.

Echoing Garrison Keillor, who thanked the NEA for funding his radio show A Prairie Home Companion in its infancy, Arthur Levitt Jr., former chairman of the American Stock Exchange, observed in The Wall Street Journal in 1990 that “all four Pulitzer Prize winners this year received NEA funds at important junctures in their careers.”109 Sen. Ted Kennedy (D-MA) characterized the controversial grants as “aberrations,” adding that “Congress must not put itself in the position of serving as a board of censors for the arts.” Others drew a straight line from Helms and his “cultural Mafia” to Nazi Germany, Soviet Russia, Red China, and other totalitarian haters of free speech and artistic freedom. “Once we allow lawmakers to become art critics,” warned Robert Brustein of the Cambridge-based American Repertory Theater, “we take the first step into the world of Ayatollah Khomeini, whose murderous review of The Satanic Verses still chills the heart of everyone committed to free expression.”110

An arts coalition representing symphony orchestras, dance companies, opera companies, theater groups, museums, and state and local arts agencies vigorously defended federal funding for the arts, noting how, since the NEA’s founding in 1965, “37 professional dance companies have grown to 250 dance companies . . . 60 professional orchestras have grown to 210 professional orchestras . . . [and] 27 professional opera companies have grown to 113 professional opera companies.” And all this for a cost of just 68 cents per taxpayer per year!111 Other NEA supporters said that the Helms amendment was so vague and far reaching that, if passed, it would prohibit the NEA from funding almost anything. Sen. John Danforth (R-MO) noted that the Helms amendment would probably preclude NEA funding for The Color Purple (for denigrating men), Huckleberry Finn (for denigrating blacks), and The Merchant of Venice (for denigrating Jews).112

In the canon wars, Bloom and his supporters had defended “Truth” against “truths.” At the heart of this arts controversy was another debate about the one and the many. Is there a single standard of beauty by which all art can be objectively judged? Is beauty in the eye of the beholder? And what of morality? Is it one or many?

Casting their lot with objectivity and uniformity, conservatives claimed that there was one American public committed to one universal standard of beauty and one universal standard of morality (which also turned out to be their own). Liberals saw a society made up of different publics with competing ethical and aesthetic standards. Helms wanted to punish the NEA whenever “it strays from what he fancies to be the center line of American ethical belief,” wrote Brustein. Yet “no such line exists—not in a society as vast, various and eclectic as the real America.”113

Sen. Daniel Patrick Moynihan (D-NY), who saw in this art war a return to the 1920s moralism of the party of the past, doubted that Mapplethorpe’s critics were really speaking on behalf of universal beauty or a unified public:

Do we really want it to be recorded that the Senate of the United States, in the 101st Congress of this Republic, is so insensible to the traditions of liberty in our land, so fearful of what is different and new and intentionally disturbing, so anxious to record our timidity that we would sanction institutions for acting precisely as they are meant to act? Which is to say, art institutions supporting artists and exhibiting their work?

More. Are we so little mindful of the diversity of our Nation, and the centrality of censorship and persecution in the experience of not just a few but I would almost say every religious and ethnic inheritance in this land?114

Magazine editor Steven Durland argued that the censors were clinging to an obsolete view of their country as “white, Christian, heterosexual and male.” The arts should “embody our differences,” said Rep. Pat Williams (D-MT). “A free society,” added Harvard Law professor Kathleen Sullivan, “can have no official orthodoxy in art any more than in religion or politics.”115

But conservatives again flipped the script, casting themselves as Inquisitors’ victims, naturalized citizens in our “nation of outsiders.”116 Homosexuality, feminism, and multiculturalism were the new repressive orthodoxy. The real bigots were “Hollywood, homosexuals, abortionists, family planners, the sexually promiscuous, failed spouses, failed parents, failed kids.”117

Conservatives believed that Piss Christ and Mapplethorpe’s images spoke for themselves. Just as evangelicals saw the Bible as self-interpreting, they could not imagine that different viewers could see these photographs differently. So while Serrano, in a letter to the NEA, contended that his most controversial photograph “has multiple meanings and can be interpreted in various ways,”118 blasphemy was, by their lights, its one and only meaning.

In a New York Times interview, Serrano described Piss Christ not as a desecration of religion but as a critique of its desecration, “a protest against the commercialization of sacred imagery.”119 Elsewhere he said that “the picture is meant as a criticism of the billion dollar Christ-for-profit industry and the commercialization of spiritual values that permeates our society . . . a condemnation of those who abuse the teachings of Christ for their own ignoble ends.”120

Frank Schaeffer, prodigal son of evangelical icon Francis Schaeffer, argued that Piss Christ “is no more blasphemous than most religious television that so-called Christian organizations support. The idea of sending Oral Roberts ten dollars because he has told me God will bless me if I donate money to him is certainly just as much heresy and blasphemy as putting a crucifix in a jar of urine.”121 John Buchanan, president of People for the American Way, a nonprofit founded in 1981 to undercut the claim of the Religious Right to speak for the “real America,” told senators that Piss Christ was, indeed, “shocking,” “outrageous,” and “offensive,” but insisted it was “good theology” nonetheless—“a faithful portrayal of the shocking, outrageous, offensive reality of our sinfulness heaped upon Him.”122 One art critic discerned in Serrano’s work “the victory of the spirit in the humiliation of the flesh.”123 Another wrote that Piss Christ “pays idiosyncratic homage to ideas that Christ originally stood for.”124

Piss Helms

AS THIS ARTS drama moved into its last act, the Washington Project for the Arts agreed to host the Mapplethorpe exhibition that the Corcoran had canceled. The show then traveled to the Contemporary Arts Center in Cincinnati, where museum director Dennis Barrie was arrested on (and eventually acquitted of) obscenity charges.

Some conservative legislators, led by Rep. Dana Rohrabacher (R-CA), tried to eliminate all NEA funding. Others backed the content restrictions of the Helms amendment. In October 1989, Congress rejected that amendment and decided to fully fund the NEA, minus a symbolic forty-five thousand dollars (the amount previously granted Serrano and Mapplethorpe). But Congress did adopt language prohibiting the use of NEA funds for materials that “may be considered obscene, including but not limited to, depictions of sadomasochism, homoeroticism, the sexual exploitation of children, or individuals engaged in sex acts and which, taken as a whole, do not have serious literary, artistic, political, or scientific value.”125

Many in the arts community feared that these content restrictions—the first in NEA history—would produce a chilling effect. The chill came quickly. One month after the compromise bill passed, NEA chairman John Frohnmayer took back a ten-thousand-dollar grant his agency had awarded to Artists Space in New York City for an AIDS-themed exhibition called Witnesses: Against Our Vanishing. In his view, the catalog crossed the line separating art from politics when artist David Wojnarowicz described Cardinal O’Connor as a “fat cannibal from that house of walking swastikas up on Fifth Avenue” and fantasized about dousing Helms with gasoline and setting him on fire.126 Under intense pressure from the arts community, Frohnmayer restored the grant, but pressure continued to build from both sides. A group known as the NEA Four sued after Frohnmayer overturned decisions by NEA peer-review panels to grant them fellowships. And when the NEA announced it would require all its grant recipients to adhere to the new content restrictions, many artists turned down their grants, refusing to sign what they saw as a loyalty oath.

Meanwhile, People for the American Way ran a series of apocalyptic radio spots focused on the NEA controversy: “Imagine a world in which millions of people are at the mercy of a small band of extremists. In which works of art are subject to government censorship. And freedom of expression is a crime. Now stop imagining. Welcome to America, 1990.”127 In a song called “Jesse Don’t Like It,” Loudon Wainwright III criticized “Mr. Censorship”:

       If Jesse thinks it’s dirty, it don’t get any funds.

       They use that taxpayer’s money on tobacco and guns.

       Your freedom of expression is being denied.

       But if you’re not sure what you like, then just let Jesse decide.128

Jos Sances’s protest was more visceral: a serigraph—Piss Helms—depicting a naked black man urinating on Senator Helms. In another piece by the same name, Phoenix artist Cactus Jack immersed a photograph of Helms in a container of beer.

The Christian Coalition

IN 1989, AS the art wars were peaking, Jerry Falwell shut down the Moral Majority. It had played almost no role in the 1988 election of Reagan’s successor, Republican George H. W. Bush. Its ability to raise money had been undercut by sexual and financial scandals plaguing Jimmy Swaggart, Jim and Tammy Faye Bakker, and other televangelists. And Falwell had thrown away what little political capital he had left by defending South Africa’s apartheid regime. THE MORAL MAJORITY IS NEITHER, read one bumper sticker, and by the time Falwell pulled the plug, his organization was hugely unpopular, with unfavorables topping favorables even among white evangelicals.

Learning from the Moral Majority’s mistakes, Pat Robertson used the political capital he had earned in his bid for the 1988 Republican presidential nomination to form, in 1989, a pressure group closely tied to the GOP and focused on local and state politics. The Christian Coalition, which under former College Republicans leader Ralph Reed would come to dominate the Religious Right in the 1990s, got much of its early momentum from direct-mail appeals on the art wars.

“Last week I began a new organization to fight for our freedoms—The Christian Coalition,” Robertson wrote in a direct-mail appeal dated October 25, 1989. That appeal began and ended not with segregation academies or abortion but with “TAX-PAYER FUNDED Photographs Too Vulgar to Print.”129 In 1990, Reed ran a series of newspaper, radio, and television ads calling on legislators to abolish the NEA: “Do you want to face the voters in your district with the charge that you are wasting their hard-earned money to promote sodomy, child pornography, and attacks on Jesus Christ?”130 On the strength of this fusion of antigovernment conservatism and evangelical anxiety, the Christian Coalition quickly emerged as the political organization of the Religious Right in the 1990s.

In 1992, Buchanan famously brought the culture wars to the Republican primaries, attacking President Bush for allowing the NEA to fund Tongues Untied, a PBS documentary about gay black men, and pledging that, if elected, he would “shut down, padlock and fumigate” the NEA.131 Buchanan lost the Republican nomination to Bush, who would go on to lose the general election to Arkansas’s Democratic governor, Bill Clinton, but on the floor of the Republican National Convention Buchanan delivered his now famous “culture wars” speech. “There is a religious war going on in our country for the soul of America,” Buchanan said. “It is a cultural war, as critical to the kind of nation we will one day be as was the Cold War itself.”132 In this speech (it “probably sounded better in the original German,” quipped columnist Molly Ivins133), Buchanan praised Bush as “a defender of right-to-life, and lifelong champion of the Judeo-Christian values and beliefs upon which this nation was built.” He then blasted “Clinton & Clinton” for promoting “abortion on demand, a litmus test for the Supreme Court, homosexual rights, discrimination against religious schools, women in combat.” “That’s change, all right,” he said, but “it is not the kind of change we can tolerate in a nation that we still call God’s country.”134

Two years later, in the 1994 midterm elections, the Christian Coalition played the role the Moral Majority had played in the 1980 elections, helping to give control of both houses of Congress to Republicans, who had campaigned on the “Contract with America” crafted by Newt Gingrich (R-GA), soon to become Speaker of the House. One year later, congressional Republicans slashed the NEA budget by 39 percent. In late 1995 and early 1996, House Republicans twice shut down the federal government in a miscalculated show of strength that would eventually help force Gingrich to step down as Speaker and propel President Clinton to reelection in 1996.

Homosexual Art

THE ART WARS were about art, of course, and about federal spending and Christianity. But antihomosexuality was the lost cause in this debate. At a time when “God’s people” had, in James Robison’s words, “come out of the closet,” were gays and lesbians supposed to stay in?135

The object of conservative anxieties was to some extent displaced in these art wars—from homosexuality to “homosexual photographs” to public funding for the same. But the NEA’s critics repeatedly blurred the categories of “obscenity” and “homosexuality,” most plainly in 1989 NEA appropriations legislation, whose laundry list of potentially “obscene” art included “homoeroticism” alongside “the sexual exploitation of children.”136

After the NEA reversed itself and provided funding for the AIDS-themed exhibition Witnesses: Against Our Vanishing, Pat Buchanan decided that both the gay rights community and the arts community were behaving like children. He reserved most of his paternal scolding for homosexuals, however, who in his view were getting precisely what they deserved in the AIDS epidemic. “The gays yearly die by the thousands of AIDS, crying out in rage for what they cannot have: respect for a lifestyle Americans simply do not respect; billions for medical research to save them from the consequences of their own suicidal self-indulgence,” he wrote. “Truly these are lost souls, fighting a war against the Author of human nature, a war that no man can win.”137 During the 1992 presidential campaign, Buchanan referred to gay men as “sodomites” and called AIDS “nature’s retribution” against homosexuals.138

Pat Robertson, in a 700 Club interview with Frohnmayer, said he and other U.S. taxpayers were “fed up to here with using our money for homo-erotic art and anti-Christian diatribes by way-out fringe artists.”139 Helms, too, invoked an ostensibly uniform “American people,” who, he said, were “disgusted with the idea of giving the taxpayers’ money to artists who promote homosexuality insidiously and deliberately.” But Helms was also disgusted by homosexuality itself, and by Mapplethorpe’s effort “to gain wider exposure of, and acceptance for,” it.140

Impeaching the Sixties

AS THE REAGAN era yielded to the presidencies of George H. W. Bush and Bill Clinton, some sort of détente might have been expected. Bush played the culture warrior in the 1988 presidential campaign, which he won in part because his team cast his Democratic opponent, Massachusetts governor Michael Dukakis, as a sixties-style liberal who was soft on crime and hard on the military. But Bush, a moderate Republican and former U.N. ambassador, was a pragmatist at heart, a New England Episcopalian as allergic to cultural warfare as he was to speaking in tongues. In his acceptance speech at the 1988 Republican National Convention in New Orleans, Bush called for “a kinder, gentler nation.”141 His two nominees to the Supreme Court—a liberal, David Souter, and a conservative, Clarence Thomas—preserved the status quo on Roe v. Wade.

“New Democrat” Bill Clinton, who took the White House from Bush in 1992 with the mantra “It’s the economy, stupid,” also tried to turn down the temperature on hot-button social issues. Though he would be criticized as a tax-and-spend liberal, he, too, was a moderate who did more to complete the Reagan Revolution than overturn it, signing on to welfare reform and famously declaring in his 1996 State of the Union address that “the era of big government is over.” Clinton supported Roe v. Wade but spoke often about his desire to make abortion “safe, legal, and rare.” As homosexuality became a cultural battlefield, Clinton tilted right. In 1993, he codified a Don’t Ask, Don’t Tell policy that allowed gays and lesbians to serve in the military as long as they kept their homosexuality secret. In 1996, he signed the Defense of Marriage Act (DOMA), which defined marriage for federal purposes as a “union between one man and one woman.” On church–state questions, Clinton refused the strict separation long preached by Democrats. The faith-based initiative often associated with George W. Bush was actually created in 1996 by Clinton, who also affirmed the rights of students to pray in public schools. “I have never believed the Constitution required our schools to be religion-free zones,” he said, “or that our children must check their faiths at the schoolhouse door.”142

Despite the moderating influences of these two presidents, the culture wars of the Reagan era persisted throughout the 1990s, not least because there were so many constituencies—think tanks, political action committees, talk show hosts, fund-raisers, legislators, and partisan newspapers and websites—that benefited from them. As it became clear that Roe was not in imminent danger of being overturned, Randall Terry and his Operation Rescue took the abortion fight to the streets, borrowing tactics from the civil rights movement in order to stop a “holocaust” at the hands of “baby killers.” (“To vote for Bill Clinton,” Terry told ministers in 1992, “is to sin against God.”)143 When this movement stalled, some turned to violence, murdering doctors, receptionists, and other abortion clinic employees in a spate of anti-killing killings between 1993 and 1998.

During the midterm elections of 1994, the realignment of the GOP begun by Goldwater, pursued by Nixon, and consolidated under Reagan, was finally complete. As Southern Democrats converted en masse to Republicanism, the GOP won control of both houses of Congress for the first time since 1954. Newt Gingrich’s “Contract with America” steered clear of abortion, homosexuality, and school prayer, but culture warriors would not be sidelined for long. After Clinton vetoed legislation banning “partial-birth abortion” in 1996, the Watergate criminal turned evangelical activist Charles Colson compared the United States to Nazi Germany.144 That same year, in a review of court cases on abortion, gay rights, and euthanasia, Richard John Neuhaus of the “theoconservative” magazine First Things did the same. Convinced that the judiciary had declared its independence from the American people and morality itself, he called for radical action—“from noncompliance to resistance to civil disobedience to morally justified revolution”—against “judicial tyranny.”145

Culture warriors also used the 1998 sex scandal regarding Clinton’s “not appropriate” relationship with intern Monica Lewinsky—the political event of the 1990s—to revisit the 1960s. Falwell railed against “the radical homosexuals, anti-family feminists, Godless atheists,” and “the liberal media,”146 but the Religious Right’s central target was the draft dodger, adulterer, and marijuana inhaler Bill Clinton, who epitomized in its view the sexual immorality and moral relativism of the “bad sixties.” In The Death of Outrage (1998), William Bennett denounced not only Clinton’s “moral bankruptcy” but also the refusal of the nation to get as worked up about it as he and Falwell plainly were.147

In a case Harvard Law professor Alan Dershowitz denounced as “sexual McCarthyism,” Clinton was impeached by the House only to be acquitted by the Senate.148 Clinton’s political survival prompted serious soul-searching among conservative Christians, who had worked (and prayed) so hard to oust “Slick Willie.” The premise underlying the Moral Majority and the Christian Coalition had been that most Americans shared their values. Yet during the Lewinsky scandal Clinton’s approval rose sharply—to 73 percent—and in the 1998 midterm election voters sent many Republican legislators packing.149

Looking back on the impeachment debacle after that punishing election, conservative strategist Paul Weyrich concluded that the liberals had won the contemporary culture wars. “I no longer believe that there is a moral majority. I do not believe that a majority of Americans actually shares our values,” he wrote in 1999. “This is why, even when we win in politics, our victories fail to translate into the kind of policies we believe are important.” His recommendation? Evangelicals should “separate [themselves] from the institutions that have been captured by the ideology of Political Correctness.”150 That same year, in a book Randall Terry called “more dangerous than the child-pornography at Barnes and Noble,”151 columnist Cal Thomas and pastor Ed Dobson (both evangelicals) argued that the Religious Right had failed because it had focused too narrowly on elections and legislation. You can’t use government to force an immoral citizenry to become moral, they reasoned, before calling on the Religious Right to abandon the culture wars.152 Weyrich was less apocalyptic. “We need,” he concluded, “some sort of quarantine.”153

The Islam Wars

THAT QUARANTINE WAS short-lived. During a Republican caucus debate in Iowa in December 1999, Texas governor George W. Bush was asked to name his favorite political philosopher. “Christ,” he answered, “because he changed my heart.”154 When Bush won the presidency in 2000—in the closest presidential contest since the 1800 tie—white evangelicals were overjoyed to have one of their own in the Oval Office. When he won a second term, defeating Democrat John Kerry in 2004, pollsters and pundits credited “values voters” with turning the election his way and returning both houses of Congress to Republicans. Like his father, the second President Bush was a reluctant cultural warrior—a “compassionate conservative” who would rather break up a fight than instigate one. Nonetheless, he, too, was quickly drawn in.

Previous American presidents had invoked God, but in recognition of the population’s religious diversity, they typically did so in vague generalities. Jimmy Carter tested this unwritten rule, but as a Southern Baptist believer in the separation of church and state, he always respected it. George W. Bush was different. In the first official act of his “faith-based presidency,” he declared his inauguration day a National Day of Prayer and Thanksgiving. His cabinet meetings began with prayer.155 And when journalist Bob Woodward asked him whether he consulted his father before invading Iraq in 2003, Bush said, “He is the wrong father to appeal to. . . . There is a higher father that I appeal to.”156 Bush’s refusal to hide his light under a bushel exasperated “New Atheists,” who attacked religion as a poison and a pox on American politics.

After Islamic extremists crashed hijacked jets into the World Trade Center, the Pentagon, and a Pennsylvania field early in his first term, Bush worked hard to live up to his pledge to serve as “a uniter, not a divider.”157 Putting on the hat of a cultural liberal (if only on this issue), he resisted the temptation to demonize Muslims the way prior presidents had demonized Catholics and Mormons. The nation is at war with terrorists, not with Muslims, he said repeatedly: “Islam is a religion of love, not hate.”158 Bush had no interest in returning to the frisson of prior religion wars, with Islam as the new religion you had to hate. He had even less interest in starting a twenty-first-century crusade with the world’s 1.5 billion Muslims.

In a televised conversation with Pat Robertson, Falwell blamed 9/11 on those who had “mocked” God: “the pagans, and the abortionists, and the feminists, and the gays and lesbians who are actively trying to make that an alternative lifestyle.”159 But many conservative Christians blamed Islam. Shortly after 9/11, Rev. Franklin Graham, son of evangelist Billy Graham, blasted Islam on NBC Nightly News as “a very evil and wicked religion.”160 Pastor Jerry Vines, ex-president of the Southern Baptist Convention, called Muhammad a “demon-possessed pedophile.”161 In November 2001, Rep. Saxby Chambliss (R-GA), chairman of the House Subcommittee on Terrorism and Homeland Security, fantasized about empowering a Georgia sheriff to “arrest every Muslim that crosses the state line.”162 Provocateur Ann Coulter’s plan was more ambitious: “We should invade their countries, kill their leaders and convert them to Christianity.”163 Others called for a ban on Muslim immigration, and for the expulsion of Muslims who were not U.S. citizens. In a move straight out of earlier anti-Catholicism and anti-Mormonism, Pat Robertson claimed on his 700 Club that Muslims did not enjoy First Amendment religious liberty protections because “Islam is not a religion. It is a worldwide political movement meant on domination.”164

When Keith Ellison (D-MN), the first Muslim elected to the U.S. House of Representatives, announced in 2006 that he would use a Quran at his swearing-in ceremony, Rep. Virgil Goode (R-VA) blasted that choice as a threat to “the values and beliefs traditional to the United States of America.”165 FOX News personality Sean Hannity compared Ellison’s plan to using the “Nazi Bible”: Hitler’s Mein Kampf.166 Ellison deftly responded to the controversy by announcing that he would be sworn in on Jefferson’s own “Alcoran of Mohammed,” acquired by Congress in 1815. In this way, he explicitly linked today’s Islam wars to earlier clashes over Jefferson’s heresies.

During Reagan’s two presidential terms, conservatives had cut taxes, raised military spending, slashed regulations, and brought the Cold War to a close, but Reagan frustrated conservative Christians by doing little more than empathizing with their plight. He talked a good game, affirming in his first inaugural address that “we are a nation under God” and denouncing the Soviet Union as an “evil empire” in a 1983 speech to the National Association of Evangelicals. He complained about the rise of abortion and pornography and the decline of the traditional family and Christian morality. He also called for constitutional amendments to restore prayer to the public schools and to prohibit abortion.167 But he never made these issues a priority. And while he appointed Antonin Scalia, a pro-life Catholic, to the Supreme Court, he also appointed two swing voters, Sandra Day O’Connor and Anthony Kennedy, who continued to uphold abortion rights. “I knew conservatives would get the short end of the stick,” Richard Viguerie said. “I just didn’t know the stick would be this short.”168

This may have been a case of taking the man out of Hollywood without taking the Hollywood out of the man. Perhaps Reagan never intended to do much about cultural issues. Perhaps his strategy was always to string white evangelicals along, “taking advantage of voters’ anxiety about moral decline for partisan political advantage.”169 Then again, maybe there was nothing to be done. Perhaps the cultural tide was flowing so hard to the left that there was no rowing against it. In any event, many on the Religious Right felt disappointed, even betrayed. Some came to see themselves as victims of not only secular humanists but also Reagan himself.

In the end, George W. Bush disappointed conservative Christians even more than Reagan had, since he was so plainly one of their own. Bush, too, proved unable to hold back the liberal tide that continued to wash over American culture. In 2008, as Bush was finishing out his second term, Richard Cizik of the National Association of Evangelicals lamented the “unholy alliance” he had helped to cement between conservative Christians and the GOP. “Evangelicals have given everything and gotten nothing in return,” he said.170 Even in an era defined by Republican dominance in tax-and-spend policy—an era of plummeting marginal tax rates and rising defense budgets—liberals were winning the culture wars.

The Obama Wars

LIKE THE THREE presidents before him, Barack Obama came into office with little interest in serving as commander in chief in a culture war. He had made a name for himself at the 2004 Democratic National Convention in Boston, where he pledged his allegiance not to “red states” or “blue states” but to the “United States.”171 As America’s first black president, he was determined to smooth over the racial divide rather than deepen it. Convinced it made no sense to be the standard-bearer of an anti-God party in a country with more Christians than any other in human history, he invited megachurch pastor Rick Warren to pray at his inauguration, spoke regularly of his faith in Jesus as “my Lord and Savior,” sang “Amazing Grace” during a eulogy, and continued to fund faith-based organizations. To the exasperation of progressives, he tacked hard to the center on taxing and spending, and he refused to pick a fight over gun control, even after mass shootings at Fort Hood, Aurora, and Newtown. When it came to vicious personal attacks (rocker Ted Nugent called him a “subhuman mongrel” and Rev. Steven Anderson prayed for his death172), Obama turned the other cheek so fast and so often that he sometimes seemed to be spinning on his own axis.

When it comes to the culture wars, however, it only takes one to tango. And here again the conservatives got things going. During the 2008 presidential campaign, Alaska governor and GOP vice-presidential candidate Sarah Palin referred to the Democratic nominee, Illinois senator Barack Obama, as Barack Hussein Obama, and many on the right suggested that Obama was a secret Muslim. During his presidency, conservatives attacked Obama from day one, most angrily via the Tea Party, which emerged in 2009 as a visceral substitute for an eviscerated Religious Right. Viewed initially as a purely political group intent on cutting taxes and shrinking government, the Tea Party quickly proved to be a culture wars organization even less amenable to compromise than the Moral Majority or the Christian Coalition.173

In fact, Obama (“Enemy Number One” according to one North Carolina congressman174) endured during his two terms a barrage of personal attacks unrivaled even by Jefferson. Refusing to accept the legitimacy of America’s forty-fourth president, “birthers” questioned whether Obama really was born in the United States (a presidential requirement) and demanded to see his birth certificate. At a rally of veterans on the National Mall, Larry Klayman of the right-wing Freedom Watch blasted him as “a president who bows down to Allah”—“not a president of ‘we the people’” but “a president of his people.”175 According to conservative talk show host Glenn Beck, Obama was “a racist” with “a deep-seated hatred for white people.”176 No wonder 71 percent of Tea Party conservatives came to believe that Obama would “ruin the country” and 30 percent of Republicans thought he was a Muslim.177

Perhaps the angry white men in the GOP base could not stomach a black man in the White House. Perhaps conservatives were sick and tired of electing Republicans who did nothing to roll back deficit spending and liberal creep. Perhaps the institutional power of culture wars television networks, culture wars PACs, culture wars websites, and culture war billionaires was too much to resist. The nation grew increasingly polarized during Obama’s two terms, as moderates became as rare as a snowstorm on Capitol Hill and phrases such as “liberal Democrat” and “conservative Republican” became redundant. While Rep. Maxine Waters (D-CA) was calling Speaker of the House John Boehner (R-OH) a “demon” and Rep. Randy Weber (R-TX) was calling Obama a “Socialist dictator,” Rep. Allen West (R-FL) cut to the chase, ordering the president to “get the hell out of the United States of America.”178

Such spittle was not confined to cable news channels or Capitol Hill, however. As Americans were increasingly living in areas with like-minded voters and associating on social media with like-minded “friends,” political antipathy increased. Study after study demonstrated that political partisanship was becoming the new normal, with nearly two-thirds of Americans describing themselves as “strong” partisans of either the Republicans or the Democrats.179

True to form, these Obama wars were asymmetrical. Most of the thunder came from the Right—from talk show hosts and elected officials beholden to the Tea Party who accused the president of launching a “war on religion” and even a “war on whites.”180 The titles of books by bestselling author Ann Coulter—Treason: Liberal Treachery from the Cold War to the War on Terrorism; Godless: The Church of Liberalism; Demonic: How the Liberal Mob Is Endangering America—show not only how “liberal” had become a fighting word but also how conservatives felt mugged by the traitors and demons in their midst.

Liberals did fight back, however, in a counterattack that included Keith Olbermann, the decade’s iconic angry liberal. From his perch at MSNBC, Olbermann blasted Bush for “urinating on the Constitution” and the Tea Party for working to “march this nation as far backward as they can get, backward to Jim Crow, or backward to the breadlines of the ’30s, or backward to hanging union organizers, or backward to the Trusts and the Robber Barons.”181 In one of his most notorious rants, Olbermann called Republican Scott Brown (who was running for Ted Kennedy’s Massachusetts Senate seat in 2010) “an irresponsible, homophobic, racist, reactionary, ex-nude model, teabagging supporter of violence against women.” In the Obama years, you weren’t anybody until somebody labeled you a “fascist,” a “socialist,” or a “Nazi.”182

Gay Marriage

THERE WERE ADDITIONAL cultural clashes in the early twenty-first century over public displays of the Ten Commandments, creationism in public schools, handguns, Mel Gibson’s blockbuster movie The Passion of the Christ, immigration, stem-cell research, partial-birth abortions, hip-hop, the War on Christmas, and efforts to remove Terri Schiavo (a comatose woman) from a feeding tube. All were designed, according to Thomas Frank’s What’s the Matter with Kansas? (2004), not to reverse the course of American culture but to gin up the GOP’s white evangelical base and return pro-business Republicans to Congress, where they could continue to pass pro-business bills hostile to the economic interests of heartland voters. “The trick never ages; the illusion never wears off,” Frank writes. “Vote to stop abortion; receive a rollback in capital gains taxes. . . . Vote to strike a blow against elitism; receive a social order in which wealth is more concentrated than ever before in our lifetimes.”183 It is true that the contemporary culture wars have often been waged by politicians who have no intention of overturning Roe v. Wade. But Americans are not just economic animals. Here beliefs matter. Here moral, religious, and cultural concerns run deep. So when the culture wars during the Bush and Obama presidencies fixated on homosexuality—the cultural issue of the 2000s—they did so not only because Republican politicians stood to gain from complaining about “the gay lifestyle” but also because many heartland voters believed that homosexuality was contrary to natural law and divine commandments.

Same-sex marriage had been on the table since 1989, when, in the midst of the art wars, Andrew Sullivan made “the case for gay marriage” in a New Republic cover story.184 But that case was not really heard until the turn of the millennium. The Right won early ballot initiatives defining marriage as the union of one man and one woman, and in 2000 the Supreme Court ruled that the Boy Scouts had a “constitutional right of expressive association” to bar gay troop leaders.185 That same year, however, Vermont became the first state to recognize same-sex civil unions. In 2003, the House and Senate took up the Federal Marriage Amendment, a failed effort to declare same-sex marriage unconstitutional, which James Dobson, in a classic example of culture war overreach, described as “our D-Day, our Gettysburg, our Stalingrad.”186 In June 2003, the Supreme Court struck down a Texas antisodomy law and (much to Justice Scalia’s annoyance) enshrined in the process a right to sexual privacy. Then came a November 2003 decision by the Massachusetts Supreme Judicial Court—“the culture war equivalent of the sinking of the Lusitania”187—which, in a legal boon to the Left and a fund-raising bonanza for the Right, declared same-sex marriage legal in the commonwealth.

In 2010, a federal judge ruled that Proposition 8—a Mormon-backed voter initiative in California that defined marriage as the union of a man and a woman—was unconstitutional. In 2012, Obama became the first U.S. chief executive to endorse same-sex marriage. In 2013, the Supreme Court struck down the Defense of Marriage Act as a violation of the due process clause of the Fifth Amendment. After the Supreme Court dealt another blow to opponents of same-sex marriage in 2014, blogger and election prognosticator Nate Silver joked that “gay marriage is on pace to be legal in 73 states by next Tuesday.”188 The following year, in Obergefell v. Hodges, the Supreme Court made marriage equality the law of the land. “The right to marry is a fundamental right inherent in the liberty of the person,” wrote Associate Justice Anthony Kennedy for the majority, “and under the Due Process and Equal Protection Clauses of the Fourteenth Amendment couples of the same-sex may not be deprived of that right and that liberty.”189 On a matter as fundamental as matrimony, this was an extraordinarily quick turnabout. And another victory for the cultural Left.