Early in 2002, Newsweek offered the latest in a series of buyouts, each one slightly less generous than the one before, and I elected to take it. That summer, Betty threw a surprise retirement party at our home that included our grandchildren as well as neighbors, old friends, and several of my Newsweek colleagues. The magazine’s editor, Mark Whitaker, and his wife, Alexis Gelber, an assistant managing editor, were in on the secret and drove up from the city to present me with the company’s equivalent of a gold watch: a framed Newsweek cover with my face on it. And then it was over. Almost.
Managing Editor Jon Meacham offered me a contract as a contributing editor, which gave me a title to put after my byline wherever my work appeared, and an unmarked office I could drop into whenever I wished. By then either early buyouts, retirement, or death had taken away most of my longtime colleagues, and with them much of the magazine’s institutional memory. I particularly missed Jack Kroll, Newsweek’s legendary arts editor and critic, who had been my lunch partner most days of the week. For years after his death in 2000, Jack’s disheveled office with its floor-to-ceiling hedgerow of books remained undisturbed as a kind of homage to this icon of Newsweek’s glory days. It was that kind of place.
Meacham’s promotion to editor of Newsweek in 2006 was historic, and not just because he was only thirty-seven years old: he was the only top editor in the magazine’s history who also functioned as the de facto religion editor when cover stories of that kind were to be written. This made sense because no one else on staff could match his knowledge of the field or enthusiasm for the subject. Jon is a graduate of the University of the South (Sewanee) and his immersion in the traditional Anglican liturgy, theology, and piety of southern Episcopalianism was as deep as mine was in midwestern Catholicism. Had the Anglican Communion established its own version of the Jesuits, I’ve always thought Jon would have made an ideal candidate.
During his brief tenure, Meacham tried to transform Newsweek into a magazine of essays and other long-form journalism. He had, he said, always admired eighteenth-century English publications like Tatler and The Spectator. “I’m not sure the future of Newsweek is with Addison and Steele,” I ventured when he told me of his plans. Jon lost his gamble—the odds it would pay off were always long—and the magazine was sold to The Daily Beast. Tina Brown tried to reanimate Newsweek digitally and in print, but the print version disappeared in November 2012. The brand, which was all that was left of the magazine, eventually migrated to its present owners, but not the Newsweek we experienced as an established set of journalistic norms and practices. That newsmagazine belongs to history.
The truth is, Newsweek, as well as Time and U.S. News & World Report, had been continuously “reinventing” the newsmagazine for decades in a protracted battle first with television and then with the Internet and the advent of the 24/7 news cycle. But you couldn’t really hear the death rattle until management began to close its news bureaus around the world. A newsmagazine doesn’t need paper to survive but it cannot exist without its own reporting. Much of what passes for news on digital imitations of a newsmagazine consists of stories first reported by others elsewhere, aggregated by editors whose main contact with the world is confined to the computer screen in front of them.
History is useful in discerning patterns in the past, and I hope this book is helpful in understanding how—in the relationships between religion, culture, and politics—we got to where we are. History can provide comparisons with the past and offer perspectives on the present, but even recent history cannot tell us what will happen next. If, as I have argued, the religious volatility of the last half of the twentieth century is best understood as a reaction to social, cultural, and political upheavals, then the first decade and a half of the new millennium should have witnessed similar echoing effects. But it did not.
The unprecedented terrorist attacks of September 11, 2001, which killed nearly three thousand people in the fiery destruction of the World Trade Center’s twin towers in lower Manhattan, the attack on the Pentagon, and a plane crash in Pennsylvania, made all Americans realize that their homeland was no longer invulnerable. Fortunately, the assault by Muslim terrorists brought the nation together—and with it an outpouring of national prayer as well as national mourning. Unlike in the previous century, there were no doomsday cults, no millennial forebodings, no retreats to rural—and presumably safer—communes.
Most Americans also supported the following month’s invasion of Afghanistan, followed in turn by President George W. Bush’s far more questionable decision in 2003 to invade Iraq, a military excursion into the quagmire of the Middle East that has kept Americans at war for the longest period in the nation’s history. (And despite various troop withdrawals, U.S. military involvement in the Middle East is again increasing.) But these military ventures, while accompanied by some voices of dissent, did not produce the kind of movement religion that accompanied the war in Vietnam. The main reason was that students were not subject to military duty and so could ignore the war; nor were Americans at home called upon to make wartime sacrifices. As before, those who volunteered to fight alongside career soldiers came mostly from that portion of the population with few other career options. Our “brave fighting men and women” were other Americans’ proxies.
And then, in December 2007, the $8 trillion housing bubble burst, setting off the worst recession since the Great Depression. The following year saw the near collapse of overleveraged investment banks, necessitating a federal bailout of the very financial institutions whose reckless practices had precipitated what became a global financial crisis. More than eight million workers lost their jobs, the average family income plummeted, and the number of Americans living in poverty rose to a post–World War II high. But unlike the 1960s, there were no poor people’s marches on Washington, perhaps because an African American was in the White House; no prophetic voices from the pulpits, perhaps because there were no longer any nationally prominent pulpits or prophets to fill them; no pastoral letter on the economy from the U.S. Catholic bishops, as in 1986, probably because they recognized their lack of competence in the intricacies of global finance. In sum, no religious mobilizations of any kind, because there was no one of sufficient stature to do the mobilizing. Yet racial tensions remain combustible in cities like Baltimore and Chicago, the gulf between the very rich and the rest of Americans has widened, and the cultural views and values of the American people (as registered by traffickers in the new, unfiltered “social” media) are more segmented than ever.
This segmentation is readily apparent in American religion. Recall the map of religious America I described in the first chapter of this book. There we saw that in the 1960s half the counties in the United States were dominated by citizens who identified with one or another Christian denomination, and that between them, Baptists and Catholics alone accounted for 40 percent of the American population. Today, the map’s chromatics would have to include a generous portion of white—a noncolor—to identify the religiously nonaffiliated who now represent the largest single “denomination” in nineteen states and nearly a quarter of the population. A contemporary map would also have to be shaded to indicate levels of commitment within those areas where one denomination still dominates. In Massachusetts, for example, Catholics still outnumber other religious groups, yet only 17 percent of them regularly attend Sunday Mass. As a visual, the contours of a current map of religious America would resemble a mound of ice in midsummer: still rather thick down the middle and across its (southern) base but rapidly melting at its (coastal) edges.
The unanticipated surge in the nonaffiliated has bolstered the view that the United States, long considered a religious outlier among the advanced societies in the West, was in fact treading the same path as Europe toward secularization, albeit after a delayed start and—until now—at a slower pace.1 The likelier explanation is that institutional religion is experiencing a long-overdue winnowing effect. American “belongers-but-not-believers” and the vague “believers-but-not-belongers” are properly self-identifying as Nones. That’s clarifying. I think John Green is right in estimating that only one in four adult Americans put religion at or near the center of their lives.
We are also witnessing a generational slide as older and typically more religious Americans die off and are replaced by younger generations for whom religion has become progressively less relevant to their own self-identity. But the more encompassing fact is that most young Americans between the ages of eighteen and thirty do not readily identify with any institutions—political, civic, academic, or religious. This weak identification with basic social institutions is what makes the experience of growing up today sharply different from the experience of those who came of age in the 1950s and early 1960s. You don’t have to be a social scientist to recognize the difference. It’s enough to be a grandparent.
At the outset of this book I located my youthful self at the center of concentric circles of belonging: family, yes, and outlying relatives, and the town that we felt we owned even more than it owned us. But just as significant were the schools whose teachers understood that their role was to form as well as to inform their students, and the church that seemed to be everywhere I chose to be. Even now, I abide in these institutions, much as readers do in their favorite books, and they abide in me. As George P. Fletcher has argued, “historical selves” (of the kind I invoked in the first chapter of this book) are not built on the personal choices of autonomous individuals; rather, they are rooted in the rich historical loom of relationships based on mutual bonds of loyalty, obligation, and trust.2
But in their journey toward adulthood most young Americans now follow a different social script. Rather than defining themselves through relationships formed within family, neighborhoods, churches, schools, and teachers, the young imagine—and culturally are encouraged to believe—that the point of growing up is to discover, nurture, and express an inwardly derived, original, and authentic self—independent of institutionally structured relationships with others. Already in 1997, philosopher Charles Taylor observed that successive generations of young Americans, many of them now parents themselves, have embraced an ethics of self-authorization, self-fulfillment, and personal choice. Selves so conceived lack a social history and a capacity for self-criticism and self-restraint. Nonetheless, as Taylor points out, they remain heavily dependent on outside recognition and affirmation. Indeed, recognition denied is often considered “a form of oppression.”3
Every new generation inhabits social structures created by their elders. If the young no longer understand themselves in relation to these inherited social institutions, neither do these institutions support, in the ways they once did, basic social needs. We acknowledge that basic institutions like families and schools are failing the poor, especially African American youth. But anyone interested in the future configurations of American faith, culture, and politics must begin by looking at how young Americans of the striving middle classes experience marriage, family, neighborhoods, and schools.
For nearly all of human history, children have typically been raised by tribes, clans, and variously extended networks of kin. In the long view of history, therefore, the intact nuclear family (mom, dad, and the kids) is a recent social arrangement, one that is fraught with its own set of economic, emotional, and psychological problems.4 The family is still the primary institution through which children acquire a sense of self and of belonging. But over the last half century, progressive public acceptance of cohabitation, single parenthood, and no-fault divorce has profoundly compromised marriage and family as foundational social institutions. Across all social classes, marriage has become one lifestyle choice among others, rather than a set of governing cultural norms that places restrictions and expectations upon the married as well as the unmarried. More than 40 percent of new mothers are unmarried and at least half of all American children can expect to live at least part of their lives in a single-parent household. Despite the heroic efforts of many single parents, fractured families are inherently less stable than intact ones. For children growing up in these circumstances, marriage and family acquire a decidedly tentative character. And, we know, children of divorce are far more likely than others to eventually divorce themselves, thus extending the cycle of institutional instability.
Even so, children have never been more dependent on parents for achieving later success. Parents who see that homework gets done and school assignments turned in on time are a child’s single most valuable educational asset, regardless of teachers and classroom size. If children see their parents socially engaged, politically involved, or religiously convicted, chances are that when they arrive at adulthood they will be, too. With the advent of the two-paycheck family, parental stress has intensified. The same smartphone that allows parents to stay connected with their children also allows bosses and clients to contact them at any time. From arranged “playdates” for their children, to mandated attendance at weekend soccer games and after-work trips to the grocer’s, parental time is measured by an endless series of schedules to be met. Making them mesh is now a discipline all its own. The leisure of unencumbered weekends that even blue-collar workers once enjoyed is history. The Sabbath as a day of rest is a reliquary concept now found only in the Bible.
Neighborhoods as wider social spaces for childhood being, doing, and belonging have changed, too. The specific social gravity of any neighborhood can be measured by the number of people down the street, on the next block, or at the local stores who can greet kids by name and feel free to help or scold if necessary. But in many inner-city neighborhoods, gangs, guns, and drug dealing make out-of-doors a perilous place to be. Even in upscale neighborhoods, fearful parents keep kids on a short tether because drugs and danger can be found in every community. It is rare to see kids roaming unattended or squads of bicycling preteens freewheeling down suburban streets. Rarer yet is the sight of teenagers cutting neighbors’ lawns, delivering newspapers, shoveling snow, stocking store shelves, caddying, or pumping gas. As late as the 1980s, working jobs like these was not only a way of earning spending money but also a way of relating to adults within the larger community.5 Today, all these jobs are performed by adults trucked in from elsewhere. For a great many young Americans, the real neighborhood is now the nearest shopping mall, a social destination where youthful selves are tutored in the techniques of consumerism.
Forty years ago, social historian Christopher Lasch argued that the authority and competence of the family had been undermined by outsourcing the socialization of the young to teachers, experts, and social planners. In this way, he said, “the apparatus of mass tuition” had become “the successor to the church in a secularized society.”6 Lasch, who died in 1994, did not live long enough to witness the latest iteration of a process that he traced to the 1930s. Back then, the interventions of educators and social planners were aimed mainly at immigrants with the goal of “Americanizing” them. Today this form of socializing the young extends across the economic spectrum, beginning with day care for infants as young as six months, through preschool to primary school, with after-school “enrichment” programs. To be sure, many of these programs are designed to give children from disadvantaged families a much-needed “head start” in developing learning and social skills. But in upscale suburbs and city neighborhoods they function primarily as places to keep children occupied and safely off the streets until parents arrive from work to fetch them home.
Despite the increase in the number of mothers who work full-time outside the home, recent surveys tell us that today’s parents actually spend more time interacting with their children than did parents in the less hurried 1950s. In part this is because children are no longer allowed to be on their own, as I have noted. But another reason is that parents in the striving middle classes are more intent on giving their children a competitive edge in the race for higher-income jobs in later life. Unlike in the Fifties, there are few manufacturing jobs awaiting high school graduates, few family businesses like insurance agencies and retail shops to pass on to children—no upward mobility at all without the certification that comes with a college degree or better. Since degrees in engineering, law, or business are not inheritable, parents who are ambitious for their children invest a great deal of time preparing them for the higher-education credentials chase.7
Thus, early in their high school years today’s adolescents learn that if they are to “make something of themselves” their immediate task is to assemble a resume—their first exercise in self-branding—that will get them into a proper college. That means not only working up a worthy grade-point average and taking Advanced Placement courses, but also engaging in a mix of school-sponsored athletics and extracurricular activities in order to attract the attention of college admissions officers. For most adolescents, this means no time for goofing off and little time for a full night’s sleep. Proof of service to the community is also necessary for building an attractive resume, although time spent cooking for an aged aunt or wheeling an infirm grandparent to the park does not earn service points. The contradiction inherent in coerced altruism should be self-evident. Lasch, I think, would recognize it as the last bureaucratic bow toward the old Protestant principle that public schooling ought to build character.
The most important rite of passage for young Americans today is their transition to college. Now, as in the past, college is where our adolescents go—or are sent—to prepare themselves socially and intellectually for adulthood. But over the last half century, student aspirations have changed dramatically. In 1967, 86 percent of incoming freshmen said their aim was to develop “a philosophy of life.” Fifty years later, the same percentage cited “being financially successful” as their goal—a major reason why humanities departments are withering on the academic vine.8
Today, most college students and their parents take a decidedly instrumentalist view of higher education, and who can blame them? The gaudy sticker prices (up to three hundred thousand dollars for four years at a college or university) and the prospect of many years paying off student debts raise legitimate questions about value received. For starters, today’s undergraduates need fewer credit hours than were required of students in the 1950s to acquire the same bachelor’s degrees. That translates into a full semester less of classroom instruction, yet students now pay many times the price in inflation-adjusted dollars.
Although the undergraduate population has expanded exponentially since the baby boomers came of age, it’s not the students who have changed so much as the institutions most of them attend. American universities have come to resemble huge youth preserves, with as many as sixty thousand students at large state campuses, where students are comfortably fed and housed, provided with contraception, state-of-the-art gyms and fitness centers, seasonal spectacles of intercollegiate sports, and a menu of touring entertainers. But in matters like academic rigor, curriculum coherence, love of learning, adult supervision, and mentorship, students are left to their own devices. Date rape is just one of the consequences of treating adolescents as if they were adults. As education critic Andrew Delbanco observes, the most obvious change in undergraduate education is the expansion of student freedom—“not just sexual freedom but what might be called freedom of demeanor and deportment, freedom of choice as fields and courses have vastly multiplied, and, perhaps most important, freedom of judgment as the role of college as arbiter of values has all but disappeared….Except in the hard sciences, academic failure, especially in the elite colleges, is rare and cheating, except in the military academies, tends to be treated as a minor lapse.”9 This is college that is neither Alma nor Mater. On the contrary, observes sociologist Christian Smith, author of several books on American youth, “it’s like putting a bunch of novice tennis players together on the court and expecting them to emerge later with advanced skills and experience.”10
Criticism of undergraduate education has mounted over the last decade and the evidence produced by researchers is sobering. Two features stand out. First, college students are routinely awarded high grades—in fact, the most common grade (43 percent) is A—for little effort. By comparison, in the 1960s, when students in each class were still graded on a curve, only 15 percent were awarded A’s.11 B is now the gentleman’s C for students who are just passing time in college. More than a third of students, according to one major research sample, reported spending fewer than five hours a week studying alone. For that they were rewarded with a grade-point average of 3.2. Over the course of their undergraduate studies a preponderance of students showed little improvement—in some cases none at all—in fundamental skills like writing, creative thinking, and analytic reasoning, based on Collegiate Learning Assessment scores.12 At graduation, those students who drift through college are rewarded with Wizard of Oz degrees that tell prospective employers only where they spent the last four to six years.
The second feature is the lack of student interaction with adults during their undergraduate years. Typically, students neither seek out their professors for conversation outside the classroom nor are sought after by their teachers. Most, in fact, are taught by teaching assistants or adjunct professors in their first two years, while tenured professors appear only for lectures. Assignments are given, received, and graded via email and Internet posting. Nor are undergraduates interested in the adult world outside the campus. They rarely read a newspaper, in print or online, and just as rarely discuss current events with family or friends.13
The minority of young Americans who arrive on campus aiming to develop a “philosophy of life” discover soon enough that neither the curriculum nor the student culture encourages reflection on the good life, much less on how to wrestle with the moral choices they personally face in the freedom that college life confers. The reigning ethic is moral self-authorization and nonjudgmentalism: what is right for me may not be right for you but no one has the right to judge anyone else. Nor should anyone be required to justify their actions. In fact, most students lack a moral vocabulary for doing so.
In the most searching studies we have of the moral lives of American collegians, sociologist Christian Smith found that many of the students he and his colleagues studied either could not identify a moral problem they had recently faced or misidentified a problem that was not moral at all. Asked what made something right, an all-too-typical response was this: “I mean for me I guess what makes something right is how I feel about it, but different people feel different ways, so I couldn’t speak on behalf of anyone else as to what’s right and what’s wrong.”14 This soft-core moral relativism is part of what Smith calls “the dark side” of emergent adulthood, but it is also the prevailing moral context against which students with a sturdier sense of right and wrong must defend themselves in and out of classrooms.
In a pair of companion studies, Smith and his colleagues at the University of Notre Dame provide similar assessments of young Americans’ attitudes toward religion as they moved through high school and college. Regardless of religious background, their studies found that the majority of young Americans hewed to what they call a “new de facto religion: moralistic, therapeutic deism.” The basic beliefs of MTD are easily summarized:
First, a God exists who created and orders the world and watches over human life on earth. Second, God wants people to be good, nice, and fair to each other, as taught in the Bible and other world religions. Third, the central goal in life is to be happy and to feel good about oneself. Fourth, God does not need to be particularly involved in one’s life except when God is needed to resolve a problem. Fifth, good people go to heaven when they die.15
If moralistic, therapeutic deism sounds terribly abstract and anemic, that’s because it is. It’s religion with a shrug.
Since the 1970s, the transition of young Americans from adolescence to adulthood has expanded into a genuinely new phase of the American life cycle that social scientists have variously labeled “delayed adolescence,” “adultolescence,” “youthhood,” and “emerging adulthood.” The causes are many but the cumulative effect has been the postponement by a dozen years or more of the standard responsibilities that mark adulthood: a real career, a stable residence, marriage, and parenthood. For the disciplined, talented, and ambitious, these transition years provide time for further education, for creative exploration of career options, and for learning—often for the first time—how to budget money and time.
But for a great many others it’s a time of drift—a continuation of adolescent college habits like binge drinking, sexual hookups, intellectual stagnation, and, for more than a third of them, extended dependence on parents for money and housing. Whatever their living arrangements or degree of economic success, most emergent adults continue to inhabit same-age social enclaves. They remain relatively isolated from adults who could be their mentors, still uninvolved in civic, political, or religious organizations and the concerns, constraints, and social practices these institutions entail. I think this goes a long way toward explaining why, despite the turbulent early years of this century, young Americans have been so unconcerned and politically quiescent. Life’s on hold; adulthood can wait.
Given what we are learning about emergent American adults and the social structures they inhabit, there are no good reasons to believe that the immediate future will be any different. Their widespread inability to think through moral problems or even to recognize them, and even more their unwillingness to hold themselves accountable to anyone but themselves, should disturb us deeply. So should their imperviousness to values that transcend consumerism and “being well-off.”
The social history of the next half century will depend in part on how—and how many—young Americans overcome the limitations of their protracted coming-of-age. Can they build for their own children more responsible social institutions than those they themselves experienced? The future of American faith, culture, and politics may not be all that hangs in the balance.
But the young cannot do this alone. In his study of the human life cycle, Erik Erikson described the challenge of those who live into old age—a stage that I honestly if reluctantly acknowledge to have reached—as one that pits the temptation to terminal self-absorption against the opportunity to exercise continued care and concern for those generations moving behind us. Which may be the real reason I have written this book.
One of the blessings of old age is the clarity with which diminishing energy of mind and body allows us to see what has been our human lot all along—namely, contingency, transience, and finitude. We cannot control what may happen to us. Nothing lasts forever. We must die. These hold true for believer and nonbeliever alike. They are the existential facts of life that all religions in different ways address.16 In reply, Christians like myself are called to abide in Faith, Hope, and Love. What matters is that God’s grace is everywhere.