3. CLASH

There is something somehow liberating about approaching the 1960s from the point of view that much of what our aging Boomer population takes credit for—retrospecting themselves regularly at five-year intervals—was already in the works before the tie-dye was dry.

“Most of the political/cultural business that late-sixties youth in America deluded themselves they were mobilizing around had already happened,” notes writer Christopher Caldwell.

Timothy Leary was ejected from Harvard for his LSD demonstrations in 1963. The Civil Rights Act passed Congress in 1964. Medicaid and other Great Society legislation became law in 1965. Even the vaunted “end of hope”—which the young radicals flatter themselves they bore so bravely in the wake of assassinations—was old hat. The “Port Huron Statement” bemoaned a loss of ideals in 1962, even before President Kennedy was killed.1

The list of pre-1960s Sixtiesiana goes on, both cultural and political. In 1948, the first Volkswagen Beetle rolled into the U.S.: That was the year the Baby Boom turned two. In 1953, Playboy magazine, porno-flagship of the sexual revolution, sailed into the mainstream; that same year, Sugar Smacks cereal became available to, among other tots, members of the classes of ’68, ’69, and ’70.2 The year after Brown v. Board of Education, Rosa Parks boarded a bus in 1955, even as ten million Boomers donned Davy Crockett caps.3 Sputnik put the Cold War into a deep-freeze funk in 1957. Lady Chatterley’s Lover made its unexpurgated American debut in 1959. The Pill was introduced in 1960. In 1961, Lenny Bruce, born 1925, started beating obscenity raps. The next year, James Meredith enrolled at the University of Mississippi.

All of which suggests that a lot of the “liberation” we tend to remember as being pushed, if not driven, by masses of impassioned students—everything from racial and sexual freedom to expletive-undeleted license—was already rolling along before the youth movement got in gear. Not even one lousy campus building was taken over before the very end of 1964, when the so-called Free Speech Movement began at Berkeley. Indeed, the dispassionate logic of the timeline makes a compelling, if politically incorrect, case for a certain measure of historical continuity: namely, that the brave, new, rebel world of the 1960s was actually an extension of, and not a break from, the boring old 1950s. In other words, the Ike-ian era was really where it was at. The 1950s “represented a period of far-reaching social transformations whose significance would become apparent during the 1960s,” writes historian Steven Mintz, adding: “It was during the 1950s that teen culture assumed its modern form and that the civil rights movement’s activist assault on school segregation got underway.”4 With this in mind, the revolutions of the 1960s begin to look like a mopping-up operation, a rearguard action to eradicate an already doomed ancien régime: the adults. Indeed, what other target of traditional authority, of social restraint, of corrupt repression was even left?

There was that matter of the Vietnam War, opposition to which directly and indirectly ignited student-stoked disruptions, violence, strikes, and shutdowns at hundreds of American college campuses. But when all is said and done—when all the boat people are counted, when all the victims of the communist bloodbath in Indochina are buried—we have to wonder whether this was about selfless politics, or maybe just self-politics. The synchronicity, post-1970, between a fizzling-out protest movement and a winding-down draft—even as the most intensive bombing campaigns of the war raged on—cannot be explained away as unrelated coincidence. In other words, it’s tough to dismiss the fact that interest in the war as a political movement waned as self-interest in the draft became a nonissue. Concern about Southeast Asian “victims” of American imperialism vanished as those same Southeast Asians became targets of North Vietnamese and Khmer Rouge aggression. This reality ultimately struck David Horowitz, a famed thinker of “second thoughts” about both the antiwar movement and his own antiwar activism as editor of the New Left magazine Ramparts. Comparing two Washington antiwar protests that fell on either side of President Nixon’s decision to “Vietnamize” the conflict and end the draft—one in June 1970 that drew close to one million people, the other on May Day, 1971, which only thirty thousand attended—Horowitz realized “the rationale for most people to protest was gone.” He continued: “When this fact registered on me, the effect was devastating. The driving force behind the massive antiwar movement on America’s campuses had been the desire to avoid military service.”5

Between 1959 and 1975, the first and last years that there were U.S. casualties in Vietnam, about sixty million Americans turned eighteen.6 Which is a lot of Americans turning eighteen. But the point here is not so much to second-guess the self-preservation instincts of a generation whose elders, after all, had tempted them with an arcane system of draft laws that could be beaten with little effort. Rather, it is to trade the myth of moral exceptionalism for some factual context. If history tells us that the antiwar movement was, as Horowitz has called it, an “antidraft movement,” then history should be allowed to strip away the cloak of sanctimony so many of our ex-revolutionaries still like to wear. That alone should relieve the rest of us from that obligatory obeisance to Baby Boomer claims of moral purity. So what are we left with? What was it all about?

New Left leader Todd Gitlin found such questions perplexing as far back as the mid-1960s, when he was asked “to write a statement of purpose for a New Republic series called ‘Thoughts of Young Radicals.’” In his 1987 memoir, The Sixties, Gitlin wrote: “I agonized for weeks about what it was, in fact, I wanted.” This is a startling admission. Shouldn’t he have thought about all this before? He continued: “The movement’s all-purpose answer to ‘What do you want?’ and ‘How do you intend to get it?’ was: ‘Build the movement.’ By contrast, much of the counterculture’s appeal was its earthy answer: ‘We want to live like this, voilà!’”7

Build the movement et voilà? Pretty thin stuff, as things turned out. “When I look back, I can see that what the students were doing wasn’t a movement that had much philosophical grounding, in terms of what lasted,” J. Anthony Lukas said in Coming Apart: A Memoir of the Harvard Wars of 1969 by Roger Rosenblatt. Such turmoil culminated in the 1969 SDS (Students for a Democratic Society) takeover of the administration building, University Hall, which then-president Nathan Pusey ended by calling in four hundred state and local policemen. The students’ lack of “philosophical grounding”—the 1960s equivalent of “the emperor has no clothes,” which Lukas noticed in hindsight—might have had something to do with what Seymour Martin Lipset pointed out in his landmark study of campus protest: In another break from tradition, the youth movement of the 1960s had no links to adult political parties.8 But there was more—or maybe less—to it than that. As Lukas, once an admirer of the students and their protest agenda, explained: “The kids were driven by anxiety about the draft. They wanted to be heroes outside a war, so they made Harvard Square a battlefield. But,” he added, flashing his liberal bona fides as if to ward off too many second thoughts, “I’m disappointed the New Left didn’t have more legs.”9

Philosopher and government professor Harvey Mansfield, one of a paltry few conservative professors at Harvard—in fact, maybe he’s it—has pinpointed precisely why it didn’t. In a 1997 essay called “The Legacy of the Late Sixties,” Mansfield deftly put the tumult of the decade in its place.

The sixties revolution was more a rebellion of children against parents than of citizens against the government. Its assertiveness was more in style than in substance, more longhair and working-class duds than a new form of government. The New Left, rejecting the establishment, did not have an alternative; it lacked the organizational skills, the staying power, and the ruthlessness of the Old Left. It hoped to change politics by transforming culture rather than through the dictatorship of the proletariat.10

Lacking a bona fide “alternative,” no wonder the New Left’s Gitlin agonized for weeks. Empty pockets, theoretically and practically speaking, were as much a part of the movement as dirty jeans. These junior radicals were “Rebels Without a Program”—the title of George F. Kennan’s January 1968 article about the movement. First appearing in The New York Times Magazine before being republished as a book, it was a sober critique of the “virtually meaningless slogans” of student radicalism by a sober critic of the Vietnam War.

When we are confronted only with violence for violence’s sake, and with attempts to frighten or intimidate an administration into doing things for which it can itself see neither the rationale nor the electoral mandate; when we are offered, as the only argument for change, the fact that a number of people are themselves very angry and excited; and when we are presented with a violent objection to what exists, unaccompanied by any constructive concept of what, ideally, ought to exist in its place—then we of my generation can only recognize that such behavior bears a disconcerting resemblance to phenomena we have witnessed within our own time in the origins of totalitarianism in other countries.… People should bear in mind that if this—namely, noise, violence and lawlessness—is the way they are going to put their case, then many of us who are no happier than [the student radicals] are about some of the policies that arouse their indignation will have no choice but to place ourselves on the other side of the barricades.11

Frankly, “the other side of the playpen” would have been more appropriate. If the youth movement was an all-style, no-substance rebellion of children versus parents, it was also the biggest temper tantrum in the history of the world. Failing to overturn the political system, however, the 1960s youth revolution did accelerate and intensify the transformation of the culture, and to an extent beyond the scope of change projected by the political reforms and cultural shifts of the 1950s. But even this legacy doesn’t belong to the kiddies alone. The fact is, the cultural transformation we associate with the 1960s could never have taken place without the acquiescence and complicity of their parents, along with every other adult in their lives. This dysfunctional symbiosis is the “strange conspiracy” that Roger Rosenblatt watched destroy Harvard.

The odd thing is that none of the destruction would have occurred had there not emerged a strange conspiracy between those who wanted power and those who readily ceded it to them. The fact that student radicals wanted to take over Harvard, or all of America, for that matter, did not condemn them. However naïve much of their revolution was, for the majority of them it was sincere. Even most of those who for personal reasons protested Vietnam to avoid fighting there were sincere in their objective opposition.

Yet they never could have created so much chaos at Harvard had the administration and most of the faculty not allowed them to.… There were certain critical moments in those two months when professors had the opportunity to instruct their students usefully merely by voting the right vote or by saying the right things—things in which they supposedly believed. Yet, for the most part, they offered no opposition to what they disagreed with, as if to tell the students: “If you want it, take it.” Liberalism rolled over on its back like a turtle awaiting the end. I do not know why, but there was an impulse running under the events of that spring to let things go to hell, and it was acted on by young and old alike. [Emphasis added.]12

Rosenblatt’s mystery impulse may well have been acted on by young and old alike, but “young” wasn’t the party in charge. “Old” was still responsible for “young,” or supposed to be, along with liberalism. All of which is to say that it wasn’t “liberalism” that rolled over in the path of a lawless and antidemocratic youth movement, it was “old”—all the adults who were scared, cowed, or even thrilled by the juggernaut of spoiled young radicalism. It was the faculty at Berkeley who ignored the clarion warning of Seymour Martin Lipset: “Civil disobedience is only justified in the absence of democratic rights.”13 (As faculty adviser to the Young People’s Socialist League, Lipset was hardly an “Establishment pig” plant.) It was all of academia that reinvented the term “political crime” in order to justify, not define, a crime by its political motivation; this, as Professor John Silber has written, has left society with a “contempt for law through inappropriate appeal to political motivation.”14 It was the president of Cornell, James Perkins, who, abjectly and irredeemably humiliated by student radicals, declared—addressing over eight thousand angry students after having been mocked and kept sitting cross-legged on a stage—that the campus insurrection was “probably one of the most constructive, positive forces that have been set in motion in the history of Cornell.”15 It was all the parents and teachers, administrators and government officials, who laid down their arms—their grade books and their transcripts, their allowances, their tuition, their scholarships, and their fellowships, their campus disciplinary codes and their local statutes, their democratic principles, their laws, and their dignity—at the first frisson of a Movement tambourine, at the first scream of “motherf———,” at the approaching footsteps of this twentieth-century Children’s Crusade to stop the war, stop bureaucracy, stop corruption, stop the “pigs,” stop racism, stop imperialism, stop poverty, stop ROTC, stop laws against illegal drugs, stop “hang-ups” about sex, and start black studies on American campuses and a communist paradise in Vietnam. Writing in 1970, then-antiwar activist and sociology professor Peter L. Berger commented on this mosaic of seemingly unrelated causes underlying student unrest.

“The Movement” seems to be about specific social and political issues of the time. But “The Movement” also seems to be about sexual emancipation, about the “generation gap,” about a new style of life, which, ranging from its religious interests to its tastes in music, stands in self-conscious tension with prevailing middle-class culture.… This is all more than a little confusing—even to individuals who regard themselves as being within “The Movement.” After all, it is not immediately clear how the demand to “Stop the War in Vietnam!” relates to such other demands as “Legalize Pot!” or “Student Power!”16

At about the same time, a contributor to the SDS publication New Left Notes shed some light of his own on the kaleidoscopic spread of issues.

You have to realize that the issue didn’t matter. The issues were never the issues. You could have been involved with the Panthers, the Weatherpeople, SLATE, SNCC, SDS. It didn’t really matter what. It was the revolution that was everything.… That’s why dope was good. Anything that undermined the system contributed to the revolution and was therefore good.17

Thirty years later, Roger Kimball put all the pieces together.

Students may have marched to protest the presence of the ROTC on campus, university rules governing political activism, or U.S. policy in Southeast Asia. But in the end such issues were mere rallying points for a revolution in sensibility, a revolution that brought together radical politics, drug abuse, sexual libertinage, an obsession with rock music, exotic forms of spiritual titillation, a generalized anti-bourgeois animus, and an attack on the intellectual and moral foundations of the entire humanistic enterprise.18

That about covers it. But there was something else: Virtually every list of student demands included one that always looks a little shrunken next to those living-large calls for destroying the fascist-pig-capitalist-complex. It is a demand that comes across more green eyeshade and bean counter than red eyes and iconoclasm. This demand, no less urgent than any other, was the demand for “amnesty”: namely, that student protestors be exempt from punishment for their disobedience, whether civil or criminal.

At Columbia, for example, SDS leader Mark Rudd, ensconced in the student-held administration building, pronounced any mediating effort that didn’t include amnesty for student demonstrators to be “bullshit.”19 (He and his followers would occupy the administration building a second time to protest the university’s mild disciplinary measures against four SDS leaders involved in the initial assault.) At Cornell, after nearly a week of violence and vandalism in the spring of 1969 that included the armed occupation of Willard Straight Hall (ejecting thirty campus-visiting parents and forty campus employees in the initial 6:00 A.M. takeover) and radio-broadcast death threats against seven faculty members and administrators, a faculty voice vote passed “nullification” of all penalties against the student storm troopers by an estimated seven hundred to three hundred.20 At Kent State—two years before the notorious 1970 riots during which National Guardsmen killed four students—charges against 250 black and white student protestors were dropped after several hundred black students left campus demanding amnesty.21 It was as though all the revolutionary actions—the occasional hostage-takings, the frequent arson and building takeovers, the violence, extortion and intimidation, shutdowns, shout-downs, sit-ins, destruction, and strikes—were really just so many insurrectionary internships that should be written off by campus elders; or, adding insult to injury, even accepted for credit.

Ronald Radosh, erstwhile leftist and former antiwar activist, was a faculty member at Queensborough Community College in New York during the student protest years. This period included 1970, when, after Kent State, many protest-paralyzed campuses across the country shut down their spring semesters a month early. In his memoir, Commies: A Journey Through the Old Left, the New Left and the Leftover Left, Radosh recalls an exchange over grades for revolutionaries with a philosophy professor and genuine grown-up named Katherine Stabile, who “argued that it was cheating the students to let them graduate or be advanced without having done any of the work.”

She said she intended to keep her classes going, give final exams, and not cave in to left-wing tyrants who claimed to have all the virtue on their side. I, of course, had a sharp and nasty answer, which I gave in the best New Left fashion. “Some kill students with National Guard bullets while others do it with grades.” It was the worst of analogies, but my side carried the argument, and no grades were required that year at Queensborough. [Emphasis added.]22

No students, of course, were ever killed by grades, but neither, it seems, were they even flunked by them. Campus authorities—“fathers in an anxiety dream,” as writer and faculty wife Diana Trilling called them—rarely meted out appropriate punishments even as they frequently handed out passing marks, even high ones, to student provocateurs who themselves had either stopped attending classes or even shut them down.23 Worse, academy elders did what they could to shield their parent-subsidized charges from all real-world consequences of their quadrangle rampages. At Columbia, for example, the new acting university president asked the New York district attorney to grant clemency to hundreds of students who had been arrested in the spring of 1968.24

More than anything else, it is this surrender of adults that accounts for the surrender of the American academy. It was a surrender as shocking to behold as it was widespread, as hard to believe as it was transformative. As late as the spring of 1969, Richard Nixon, then newly ensconced in the White House, could still express satisfaction at news of the SDS takeover of Harvard’s University Hall. “Nixon told me that he was happy it had happened at Harvard,” explained Henry Kissinger. “At first I thought he was gloating at the discomfiture of his enemies. In fact, he had something else in mind: ‘Harvard is the leading university in the country. It will set an example for how to handle student upheavals,’” Nixon said.25 It set an example, all right, but not the one President Nixon was undoubtedly hoping for. Notwithstanding his own antipathies toward the liberal establishment, Nixon obviously still assumed the grown-ups were in charge.

Why weren’t they? This is the great ponderment behind the culture wars that have riven society for half a century. The answer lies somewhere between the abdication of the adult and the rise of the adolescent that took place as American society redirected its energies from realizing its destiny to raising its young. While children always were and remain any society’s future, what had taken place was a cultural shift in emphasis from the distant horizon, from exploration, conquest, and settlement, to the family room; from the big picture to the good life; from looking out to looking within. Maybe, briefly, the 1960s were the real “end of history”—a pause, anyway, during which the upper end of the American middle class, and especially its young, took the opportunities offered by expanding leisure and unprecedented luxury to work on the real problem: its happy, indulgent, guilty, disaffected self.

In his magisterial history of Western civilization, Jacques Barzun makes note of “the loss of nerve typical of periods of decadence.”26 Barzun—who, incidentally, as a professor and former provost of Columbia played no role during the university’s upheavals in 1968—was describing eighteenth-century France, but he could have been describing twentieth-century America, where the rotting heart of American academia revealed that same typical loss of nerve. Atypical, though, was the cause. In the American example, the ancien régime didn’t shrink from an invading army, an invading people, or an invading ideology. It retreated from the ultimate “enemy within”: its own children.

Peter L. Berger describes a “new cultural conception of childhood” that made the youth movement possible in the first place. It was linked, he writes, not only to the vast numbers of children born during the Baby Boom—seven or eight million of whom had matriculated by 1970 at rapidly expanding colleges and universities—but also to an “abrupt decline in child mortality.” This, Berger maintained, transformed the parent-child relationship.

Today, most children grow to maturity. One has to grasp the emotional consequences that this transformation has had for parents if one is to understand its staggering significance. Probably for the first time in human history, when a child is born today, the parents can bestow love on this child without having to reckon with the probability that, in the very near future, their grief will be all the more bitter for it. There is a new emotional calculus along with the new demographic one—and it is important to recall that all of this is very recent indeed.27

This “new emotional calculus” is crucial to any postmortem on the death of the grown-up. Once upon a time, more children living longer would have been a boon, say, to the family farm. In a technological economy, though, few kids feed chickens or bale hay. This has made childhood less useful to parents, even as it has made it more fun for children. At the same time, as the human condition has continually improved, childhood has also been increasingly buffered from the pain of illness and death. Additionally, according to Berger’s thesis, there was one more new attribute to modern childhood: “Within this physical and social setting the modern child, not surprisingly, comes to feel very early that he is a person of considerable importance.”28

Sure enough, self-esteem was not a problem with the occupiers of Harvard’s University Hall—or Columbia’s Hamilton Hall, or Berkeley’s Sproul Hall. And occupying Harvard’s University Hall—or Columbia’s Hamilton Hall, or Berkeley’s Sproul Hall—was not a problem for doting parents, not to mention doting professors. “Think of it as one of your own children who had been beaten like that!” cried a member of the Harvard philosophy department, “almost tearful” in his remarks at a faculty meeting about the “police brutality” that ended the University Hall takeover. Such empathy carried the day—the era, really. Sure, there was tough talk from the odd alum: “I don’t give a damn whose head got bloodied. If you get mixed up in a thing like that, you deserve what you get,” one told The New York Times. And sure, there was quiet anger in the occasional Harvard professor. After his “almost tearful” colleague spoke, Harvard professor John V. Kelleher recalled, “For the first and only time I wanted to get up and speak. I wanted to say that if it was one of my children, I could hardly wait to get him home so that I could give him a rousing kick where it wouldn’t blind him. But I didn’t, because an old friend was sitting in front of me; and his son had been in the hall.” But such were the voices of the largely Silent Majority; they may well have voted, but they didn’t make much noise.29

The fact is, general support for the Vietnam War under President Johnson, and, later, President Nixon, remained fairly solid during periods of student upheaval—although such support was going to be lower among the increasingly liberal subset of society that sent its children to, or taught them in, college. But what did these supporters say to their children? It’s not difficult to imagine long-distance arguments over what was going on at school between Junior and the ’rents—liberal or conservative—but what harsh words ever led to harsh actions? That is, what collegiate revolutionary ever saw his dining hall contract canceled, or his bursar account closed? (As one historian remarked, he knew of no other uprising in history in which the revolutionaries had fellowships.30) The conduct of the war in Vietnam, the pace of civil rights reform, or university slumlordship wasn’t ever the parenting issue. What concerned Mom and Dad—or should have—had to do with Junior’s behavior. Dirty words. Shoving. Pushing. Cutting class. Cutting fire hoses. Waving guns. Taking things. Breaking things. Throwing things (paving stones, Molotov cocktails). Burning things (buildings, records, research). But it didn’t concern them—at least not in any consequential way. Indeed, at the University of Chicago, which may be the one campus where administrators acted swiftly to expel students who had occupied a building, “parents took out newspaper advertisements protesting the draconian punishment visited upon their darlings, thus providing a clue to what had gone wrong with their children.”31

Central to the surrender of the adult, then, was the collapse of the parent. As much as any political, demographic, or economic factors, this made the ascendancy of youth possible, and possibly inevitable, first on campus, and, later, in the wider culture. So much for the World War II–winning Greatest Generation, whose own offspring, spoiled “youths” in the 1950s, became everyone’s spoiled youth movement in the 1960s. Life may have been tough for the men and women whose formative years were marred by Depression and war, but theirs was the spawn of Dr. Spock’s “permissive society.”

Then again, there are those, particularly among our elites, who find the permissive society explanation—the spoiled brat theory—too simplistic. After all, the theory ignores the “real” issues in all their complexities: namely, that the Vietnam War was “immoral”; the American government was “corrupt”; our young people were “pure.” It also fails to take into account that the North Vietnamese were good (all communists, of course, were good), and the South Vietnamese were bad (all anticommunists, of course, were neanderthals). All of which is, well, pretty simplistic.

Of course, simplistic or not, and even spoiled or not, the youth movement had an impact. In the fall of 1969, Richard Nixon addressed the emerging danger of mob influence. (Worth noting, for the sake of context, is that at no point in 1969 did any Gallup Poll show public support for Nixon’s conduct of the war slip below 44 percent—at which point opposition stood at 26 percent. Even at the height of huge public demonstrations in the fall, 58 percent of the public supported the president, with 32 percent opposed.32)

In San Francisco a few weeks ago I saw demonstrators carrying signs reading: “Lose in Vietnam, bring the boys home.”

Well, one of the strengths of our free society is that any American has a right to reach that conclusion and to advocate that point of view. But as President of the United States, I would be untrue to my oath of office if I allowed the policy of this Nation to be dictated by the minority who hold that point of view and who try to impose it on the Nation by mounting demonstrations in the street.

For almost 200 years, the policy of this nation has been made under our Constitution by those leaders in the Congress and in the White House elected by all of the people. If a vocal minority, however fervent its cause, prevails over reason and the will of the majority, this Nation has no future as a free society.33

The words of the Republican president essentially expressed the logic behind the liberal argument against the Movement—which, of course, is not to be confused with the liberal argument against the war. It jibes with what Seymour Martin Lipset said at Berkeley at the beginning; with what George F. Kennan said in the middle; and with what David Horowitz said two decades after it was all over: “In a democracy, where the people are sovereign, what justification can there be for self-styled ‘revolutionaries’ like ourselves? In rejecting the democratic process, we had rejected the people, setting ourselves over them in judgement as though we were superior beings.”34

Of course, even superior beings could be touchy. In that same fall of 1969, during the semester following Harvard’s spring revolt, Daniel Patrick Moynihan, then a Democratic member of the Nixon administration serving as counselor to the president, attended the annual Harvard-Princeton game. There, at Soldiers Field in Cambridge, Moynihan could see the biggest stumbling block to mounting a defense of the democratic process: the superior beings’ parents. In White House Years, Henry Kissinger recalled the memo Moynihan sent to the president on the subject.

It described a scene … in which the assembled graduates—worth, according to Pat, at least $10 billion—roared support when the Harvard University band was introduced, in a takeoff of Agnew’s denigrating phrase, as the “effete Harvard Corps of Intellectual Snobs.” Pat warned that while Nixon was right in resisting attempts to make policy in the streets, he should not needlessly challenge the young—because of their great influence on their parents. [Emphasis added.]35

In the cheers of the crowd, Moynihan heard more than just the sound of parental approval. As Peter Berger might well have pointed out, parental regard for their young had merged into a sense of shared identity, or perhaps shared vision. Parent and offspring alike saw that children could do no wrong. Which, in its universality, was a new one on the human race. It’s a safe bet that Ivy alumni at that same Harvard-Princeton matchup, circa, say, 1959—and certainly 1949 and earlier—would have slapped down any youth movement attempting to make policy in the streets.

But there was another factor in the emotional calculus that was changing the whole social equation: an increasing sense of self-awareness. This new state of self-consciousness was probably another tasty fruit of victory, relative peace, and rampant, increasingly technology-based prosperity. Indeed, such self-awareness had become a “chief characteristic of our culture,” high, low, and otherwise, wrote Columbia’s Lionel Trilling. From academia, journalism, entertainment, and advertising, he wrote, we learn “to believe not only that we can properly identify the difficulties presented by the society but also that we can cope with them, at least in spirit, and that in itself our consciousness of difficulties to be coped with gives us moral distinction [emphasis added].”36

What Trilling described was a perfectly phony brand of moral distinction, an ersatz morality, as in: I feel, therefore I am moral. Trilling found himself reflecting on it after reading the opening paragraph of the independent report on “the disturbances” at Columbia in 1968. These “disturbances” included the seizure of five campus buildings; the overnight imprisonment of the acting dean of the college and his two assistants; a manifesto from SDS leader Mark Rudd, addressed to the university president, threatening Columbia’s destruction (ending, “Up against the wall, motherf———”); spitting at and punching senior faculty members; urinating in trash cans, urinating out windows; destroying faculty research and papers; and the paralysis of a thirty-four-year-old policeman, caused when a student jumped him from above.

Four decades later, more memorable than the report’s findings is the surrealistic, worshipful tone that was set in the opening lines by report author Archibald Cox of the Harvard Law School: “The present generation of young people in our universities are the best informed, the most intelligent, and the most idealistic this country has ever known.” (Good thing—bad thing?—the report came out a few days before that best informed, most intelligent, and most idealistic young person Mark Rudd informed The Boston Globe, “We manufactured the issues.”37) Trilling describes his “natural bewilderment” on reading Cox’s words. Then he understood.

In his high estimate of the young, Professor Cox accepted the simulacrum for the real thing: he celebrated as knowledge and intelligence what in actuality is merely a congeries of “advanced” public attitudes. When he made his affirmation of the enlightenment of the young, he affirmed his own enlightenment and that of others who would agree with his judgment—for it is from the young and not from his own experience that he was deriving his values, and for values to have this source is, in the view of a large part of our forward-looking culture, all the certification that is required to prove that the values are sound ones. [Emphasis added.]38

The phenomenon the professor is getting at sounds very much like an old boy riff on the Twist—the juvenile dance craze adopted and popularized by adults. In “deriving his values from the young,” as Trilling wrote, Professor Cox and his fellow fact finders were defining their values down (to borrow Daniel Patrick Moynihan’s handy concept again) to those embodied by student protest. In affirming the enlightenment of the young—and thus his own, according to Trilling—Cox was likewise defining enlightenment down, mistaking “‘advanced’ public attitudes” (read: left-wing politics) for wisdom. This was a watershed moment. From this point forward, New Left values and “the enlightenment of the young” quintessentially defined the elites, effectively negating, and certainly downgrading, all experience and traditions that came before.

This elite embrace of youth-derived values and enlightened self-awareness became the basis of the 1960s legacy—and, thus, the basis of the so-called culture wars that would disrupt subsequent decades. In place of a hierarchy based on accrued wisdom, there would emerge a power structure based on accrued grievance. Authority and reason would give way to novelty and feelings. A new set of un-manners and non-mores would quickly overrun attitudes and practices that had evolved over generations by recasting refinement and restraint, honor and forbearance—virtues, not coincidentally, of maturity—as corrupt and phony, or, even worse, not “authentic.”

In that bid for “authenticity,” civility and decency, too, were quick casualties. Not for nothing, as noted by Diana Trilling at Columbia in 1968, did a filthy stream of public profanity rush through the various student upheavals. Indeed, the most memorable words of the movement are four-letter ones.

It was not alone President Kirk who was addressed as a motherf———. Vice-President Truman was a motherf———, Acting Dean Coleman was a motherf———, the police were—naturally—motherf———s, any disapproved member of the faculty was a motherf———. Rudd’s response to the mediating efforts was “bull———.” … At a tense moment on the steps of Low Library a Barnard girl-demonstrator jumped up and down in front of the faculty line—the faculty were wearing their white armbands of peace—compulsively shouting, “Shit, shit, shit, shit.”39

Small wonder, as Trilling also noted, one pun-prone professor dubbed the student revolutionaries, “Alma Materf———s.”

Oddly enough, these cataracts of obscenity were barely mentioned in the press, if at all, no doubt out of reflexive consideration for middle-class sensibilities. But, as Diana Trilling wrote, this phenomenon was “not of the gutter.” It was out of the mouths of babes from the middle class; and, as it turned out, few of their middle-class parents were willing to wash out the little darlings’ mouths with soap. “One discovered that a decent proportion of the decent American middle-class mothers and fathers of these young people, as well as other energetic spokesmen for progress, supported their offspring,” she wrote. Among the proud parents were the Rudds, with Mama Rudd giving “the proudest and tenderest interview to the Times about how her-son-the-rebel planted tulips in their suburban garden.” Up against the garden wall, motherf———, and all that. Indeed, roughly two hundred other mothers and fathers joined a Committee of Concerned Columbia Parents “to back their children and further harry the administration.”40 Strange conspiracy, indeed.

Not that everyone went along with it. The revolution on campus may well have successfully overturned the old order of the Establishment—and, more important, the established order of the old—but it didn’t overturn everyone. The youth movement booted maturity and experience from all that was deemed, in 1960s parlance, “relevant” and “valid” to the life worth living, but some few scattered grown-ups were left standing. Looking back now, these adult remnants resemble ghosts and scolds, rattling cages and writing screeds as their world disappeared, lost to that same strange conspiracy. If the 1960s tell the story of capitulation, they also write the book on “clash,” on frictions and collisions that to this day map out the shifting front in the so-called culture wars. Indeed, the culture wars themselves, from the university to the workplace to the family room, retain the basic aspects and patterns of that earlier clash—that strange conspiracy between young and old, child and parent, hip and square—that distinguish the 1960s from every decade that came before them.

Clash is key. Clash is pop versus rock; short hair versus long hair; restraint versus license. It was S. I. Hayakawa standing up to student militants at San Francisco State. It was Bosley Crowther of The New York Times crossing pens with Pauline Kael of The New Yorker over the amorality of Bonnie and Clyde.41 It was Police Code No. 205 versus the mouth of Lenny Bruce.42 It was any socially explosive face-off between hostile, if not actually warring, culture camps—the one established, the other revolutionary. Not that these two sides were evenly matched. In fact, all the Establishment ever did (ever does) was throw off a few sparks of reluctance to bear witness to cast-off traditions and subsiding sensibilities just as they were fizzing out. In the clash between Establishment (Nixon) and revolution (Elvis), revolution, it seems, wins every time. “We’re going to bury you. We’re gonna take over. You’re finished,” thirty-four-year-old Dennis Hopper (Easy Rider) said in 1970, poking a finger into the chest of seventy-one-year-old George Cukor (What Price Hollywood?, Dinner at Eight, The Women, The Philadelphia Story, Adam’s Rib). “Well, well, yes, yes,” said Cukor. “That’s very possible, yes, yes.” A few months later, John Wayne in True Grit, not Dustin Hoffman in Midnight Cowboy, managed to take home the Best Actor Oscar, but it was still the end of an era.43

That’s because it’s always easier to release a genie than to coax him back into the bottle. Or, for that matter, stuff Bonnie and Clyde back into the can. It’s easier to add women’s, Latino, and gender studies to the curriculum than to remove black studies from the curriculum. It’s always easier to be the first movie to use the word “f——” (M*A*S*H in 1970)44 than the next movie not to use it. Three esteemed professors may have resigned from Cornell in 1969 over the university’s lack of integrity, but the university still lacked integrity, not to mention the three esteemed professors.45 In such cases, culture clash merely marks the fault line of change, an aftershock of wishful thinking that follows the initial temblor. Indeed, as Walter Berns put it, writing about Cornell’s acquiescence to militant demands in 1969, “By surrendering to students with guns, [Cornell’s President Perkins] made it easier for those who came after him to surrender to students armed only with epithets (‘racists,’ ‘sexists,’ ‘elitists,’ ‘homophobes’).”46

In July 1965, seventeen months after the Beatles entered living rooms across America via The Ed Sullivan Show, and six or seven months after the student protest generation debuted at Berkeley, Esquire magazine offered up what may now be read as the perfect clash issue on American youth. It is that rare cultural relic: an expression of a purely adult sensibility in a non-adolescent cultural mainstream before the sea change. In conducting a study of teenage life from an adult point of view, this issue stood as a bulwark, temporarily, against the adolescent deluge to come. Not having yet defined down either its sense of enlightenment or its values, Esquire in July 1965 was openly contemptuous of the new obsessions with “teen-agerism,” and argued against it, albeit with an inkling that time was passing adults by.

The traditional view, unfortunately discarded, was that you were sort of apprentice adults living through an awkward period, waiting and preparing to take your place in the real or “adult” world. There was a lot to be said for that view—one of the best things that could be said for it was that it was true. It still is, but the cultist teen-ager doesn’t believe it anymore. He behaves as if he had arrived somewhere, as if he had already achieved his own sort of perfection, celebrating what he is (very little), instead of worrying what he may become (anything). His idols are only other teen-agers more or less like himself, which puts him in a rather static bind: we tend to become what we admire or at least move in that general direction. But the pro-teen is standing dead still, combing his hair and mouthing jargon. Hip or square makes no difference. He will never be anything but a teen-ager.47

Esquire also tried to show, not just tell, the folly of this philosophy in a fashion story—“Threads, or What the Well-Dressed Teen-ager Ought to Wear.” The story went beyond just showcasing standard-issue chinos and loafers and camel-hair coats. The fashion spread instructed readers on how “non-adult” the new styles were, and it did so by contrasting the timeless look of the prepster with the mod look of the wanna-Beatle.

So leave the tight pants and the boots and the pointed shoes and the latter-day zoot suits and the Martian getups to the idols.… Better yet, don’t even dress like a teen-ager. Pretend you never were one. Pretend you couldn’t care less. For instance, take the guy in the three-piece suit on the next page. It’s a grey herringbone tweed (Cricketeer) worn with brown brogues, a button-down shirt and a paisley tie. Contrast this with the guy in the background sporting boots and a zany belted suit. He probably paid as much for his suit as the other guy. But notice which looks more comfortable. Notice, too, which one has the chick.48

Surprise, surprise: In Esquire-land—which obviously included a photo studio—the tweedy (mature) teen, not the zany (adolescent) teen, got the girl, and who could ask for anything more? This dream was nice while it lasted. In the real world of 1965 beyond the photo shoot, though, brown brogues were fast losing to tight pants. What Esquire’s editors hadn’t quite grasped was the solidity of the emerging consensus on “teen-agerism.” The idea of never being anything but a teen wasn’t a threat; it was an aspiration. Soon, it would be a promise, and one that Esquire, among other magazines, would ultimately try to keep. In an editorial already sounding like a blast from the past, Esquire tried to appeal to teen logic, revealing an understanding of the more or less irrational forces that good old American capitalism had unleashed.

The danger is, though, that having become a matter of interest you will come to think of yourselves as therefore interesting, which is not quite the same thing. You have not created a valuable subcivilization merely by being too young to vote, although that is rather wise. Remember that no matter how many millions are spent catering to your taste in music, your taste in music remains very bad: even more millions are devoted to the study and treatment of your pimples, but that doesn’t make pimples a good thing.49

Such caustic confidence was as doomed as the cultural establishment that inspired it. But it makes a good marker. In July 1965, a mainstream publication not only could, but did voice an adult point of view on adolescent culture. Indeed, that same Esquire issue ran a memoir of the day Benny Goodman met the Beatles. Written by Goodman’s daughter, Rachel, the article chronicled a brief chat backstage—a cordial enough King of Swing making small talk with the Fab but clearly uninterested Four—and an even briefer review of the performance, which Goodman himself broadcast in exchange for publicity for an upcoming appearance of his own at the World’s Fair. “‘There was a very strong beat,’ Daddy said, ‘but otherwise the screams drowned out most of the sound. I’m afraid there’s not much I feel qualified to say about it.’”

Rachel Goodman elaborated:

In the tidal wave of screams that greeted them [the Beatles], I thought this was what the end of the world must be like. The din went past the painful, too loud for the ear to register. It was precisely the sensation of being behind a jet plane taking off: the same pitch and intensity. Flashbulb after flashbulb made the whole stadium white, the points of brightness popping everywhere, but giving the impression of a constant unearthly glow.…

Daddy, [sister] Benjie and I watched the stadium with much greater fascination than we did the performance. Benjie turned to me and said, “Just think what people from another planet would think if they found themselves here.” There was something so apocalyptical about it; pure frenzy, an almost mystical atmosphere of heavens opening up. I thought of the subtle element that held together these four boys performing not very well, and the incredible response of the kids. It was more complicated than sex alone or music alone. There is music and music. People screamed at Carnegie Hall for Richter, and indeed for my father, but it was after the performance, and in Daddy’s case, it had something to do with skill and rhythm which captivates in a different way.50

This reaction—the sensibility, the point of view—is not only antique, it is extinct, at least in the media mainstream. In our time, in our rock culture, such a critique of any leading music act would and could never appear. There is no one who sees things that way; no one, that is, who isn’t regarded as irrelevant, antiestablishment, and downright countercultural—a subversive, who, Soviet-style, really should be sent to Siberia. This is more evidence of just how far the fault lines have shifted. As they move, we move, which is why all manner of clash is left behind.

In the end, the absence of clash becomes as telling as clash itself. In 1977, the year Queen Elizabeth II celebrated twenty-five years on the British throne, the Sex Pistols—remember them?—marked the occasion with the release of their dumb, if nasty, punk anthem “God Save the Queen,” prompting what were still predictable clucks of outrage from defenders of the British institution. Indeed, the song was officially banned for a time from the land and the airwaves, a ban the band answered with a live performance on the Thames that ended in a few cheap thrills and several arrests.51 In other words, those were still the days when no rock star worth his authenticity would have dared cross the palace moat—nor would he have been invited to.

In 2002, when Queen Elizabeth II celebrated fifty years on her throne, Sir Paul McCartney, “First Lady of Soul” Aretha Franklin, and other bona fide rock icons marked the occasion by singing at the queen’s invitation at the “Party at the Palace.” Sex Pistol Sid Vicious may have been long dead of a drug overdose, but the spectacle offered a vivid juxtaposition of symbols: the amber-preserved trappings of monarchy versus the free flow of aging rock royalty, whose loyal subjects—fans—embody the image of mob rule. These rock ’n’ rollers never went to culture-war against the monarchy as openly as the Sex Pistols, but they certainly attacked the manners and mores of the vast middle class—said manners and mores that, in Britain, included, for example, hanging the queen’s picture in the parlor, sans irony, in reflexive obeisance to fealty, honor, duty, and other largely atavistic instincts stamped out in the rock revolution.

Given these seemingly natural cultural enmities, a golden jubilee invite to, for example, drug-addled, bleep-mouthed Ozzy Osbourne—at the time riding reality-show-high—should have struck a culturally significant spark or two somewhere in the realm. But no. As the aged Keeper of the Stiff Upper Lip and retinue prepared to receive the aging Advocate of Wild Abandon and mates at her own gala affair, there was no discernible tut-tutting, not a single letter to the editor wondering what the country was coming to. In the end, no one noted anything amiss about an event that brought together a man who bites bats with a woman who has a royal taster. Which goes to show the cultural revolution isn’t just over; it’s been forgotten entirely. This explains why, flashing forward to the 2004 Kennedy Center Honors in Washington, D.C., Billy Joel could celebrate Sir Elton John’s Lifetime Achievement Award by performing “The Bitch Is Back” for a black-tie crowd including President Bush, his White House cabinet, and a national television audience.

This was another transgressive moment of pomp and punkiness, a mix of cultivation and coarseness, but no one noticed the clash because there wasn’t any. Pomp is now open to punkiness, while punkiness will always tolerate a little pomp. The shared sensibility of a Richard Nixon and an Elvis Presley—both of whom knew better than to mix their cultural metaphors—is a long way in the past. In the years since, The New York Times has editorialized “in praise of the counterculture,” which shows, as Harvey Mansfield has written, “by its very appearance in the nation’s most prestigious newspaper how far the counterculture had become regnant.”52 The “bitch” isn’t just back, it’s here to stay.

To Mansfield, a paean to the counterculture in the “newspaper of record” has instantly, glaringly obvious implications. To almost everyone else—those who have become or were born insensate to cultural revolution—his words are meaningless. Either reaction tends to prove that the counterculture has become the establishment culture.

Sex toys and drug paraphernalia turn into Clinton White House Christmas tree ornaments, triggering the lonely ire of a solitary FBI agent on White House detail.53 An elementary school principal in upstate New York has a child out of wedlock, and townspeople castigate the anonymous citizen who publicly criticized her example.54 The Erotica USA convention sets up shop in the Jacob Javits Convention Center in New York City, arousing zero complaints, but presumably plenty of customers. “‘For 35 years, I’ve been a pornographer, and we’ve always been underground,’ said Al Goldstein, who—between signing autographs—was running a booth that displayed his [Screw] magazine … ‘I never dreamed we’d be in the Javits Center. It is such a class place.’”55

So is the St. James Theater on West Forty-fourth Street. There, on the fiftieth anniversary in 1993 of the premiere of Rodgers and Hammerstein’s Oklahoma!, then-seventy-six-year-old retired Broadway baritone (and father of Bonnie) John Raitt made a surprise appearance to sing the rousing title song, a melodious anthem of ascendant Americanism.56 It must have been some surprise: The audience that night had paid to see Tommy, the 1969 rock opera by The Who (“Pinball Wizard,” “The Acid Queen”), a fever dream of the drug culture. A half century earlier, Oklahoma! broke theatrical ground by bringing the folkloric American characters of the Great Plains to Broadway; Tommy, on the other hand, made musical history by taking audiences along on an orchestrated drug trip. In other words, the 1943 season that saw the opening of Oklahoma! was, by 1993 when John Raitt opened his mouth to sing, a very long time ago. Then again, 1969 was itself nearly a quarter century past. The difference is that the culture of the 1960s remains eternally accessible to contemporary audiences.

There’s a reason: We live in a 1960s world, suffused with a 1960s sensibility that is informed, if not sustained, by the very contagious rebel-persecution complex. Writing in 1997, Todd Gitlin declared, “In the not-very-gay nineties, a president associated with [the 1960s], whether he likes it or not, has had to devote considerable energy to wriggle away from the reputation.”57 Maybe Gitlin was referring to Clinton’s sidestepping of such age-old transgressions as draft evasion or womanizing. But to what extent, if at all, did our 1990s president ever have to distance himself from the 1960s era of his youth?

In answer, it’s worth considering a little-noted (i.e., clashless) excursion the First Family took back in 1997 to celebrate daughter Chelsea’s seventeenth birthday. Flying to New York, the Clintons attended three of Broadway’s most popular shows that year: Chicago, Rent, and Bring in ’da Noise, Bring in ’da Funk. In the course of their theatrical whirl, the Clinton family contemplated same-sex kissing, heterosexual intercourse (simulated), dildos, masturbation, marijuana, and twin blasts of black racism and cultural separatism. They were also mooned. (According to The Washington Post, the mooning incident took place “at an angle to the president and his family that was as decorous as allowed by the act of pulling down one’s pants.”58) That this juxtaposition of the (pre-Lewinsky) presidency—not to mention the (pre-Lewinsky) presidential family—with so countercultural a cavalcade inspired little or no comment should lead us to rephrase the question. Rather than wonder how far the forty-second president wriggled from the 1960s, maybe it’s better to ask, where else could he have possibly gone?

In so many ways, the same question may be asked of the rest of us. The answer is nowhere. This is true, but not because time has stood still. There is a link between our affinity for the adolescent culture epitomized by the 1960s, and our even older aversion to maturity. “I recall Leo Rosten observing long before Columbia that, so far as he could see, what the dissatisfied students were looking for were adults—adults to confront, to oppose, to emulate,” Irving Kristol wrote in 1968. “It is not going to be easy to satisfy this quest, since our culture for many decades now has been plowing under its adults.” He continued,

I agree with Rosten that this is what is wanted, and I am certain it will not be achieved until our institutions of higher education reach some kind of common understanding on what kind of adult a young man is ideally supposed to become. This understanding—involving a scrutiny of the values of our civilization—will not come soon or easily, if it ever comes at all. But we must begin to move toward it.…59

How? Almost forty years ago, Kristol suggested what he called a paradoxical first step that would encourage a “variety of meanings [of adulthood] to emerge.” This sounds like one way to throw open the debate. But something else needs to happen first. Remember the child with fearless clarity who declared the emperor had no clothes? We need to take a look at our adolescent culture and declare it has no adults.