Einstein, the Automobile,
and the Marginalization of Grandma
Albert Einstein dominates every part of the twentieth century including, and more or less directly, religion. He began his significant work in 1902, but it is 1905 that is known as Annus Mirabilis, the year always to be marveled at. In that year Einstein published four papers that changed the consensual illusion forever.
First, he postulated that the photoelectric effect could be explained if light were understood as being at times “bundles,” or what he called “quanta,” interacting with matter. Max Planck, another mighty giant of the century, had introduced such an idea in 1900 on the basis of hypothetical mathematics, but it was Einstein who gave us the quantum world incarnated. And as surely as Newton had once upon a time postulated the classical physics that was the descriptor of the visible world, so Einstein’s students, associates, and even some detractors would give us the quantum physics that was the descriptor of the invisible world. As had been true with Faraday’s work, so again much of the kingdom of the angels, of the mystery of soul, was forever breached by the simple process of being exposed as physical and subject to incredible, but still describable, laws.
In the Annus Mirabilis, Einstein also published a paper on Brownian motion, though he apparently was unaware at first that the phenomenon he was studying went by that name. Robert Brown, who died in 1858, had been a friend and confidant of Darwin. He was also a crotchety, but methodical, botanist who first noticed that any small, small bit of anything—dead or alive—will zig and zag about frantically when it is suspended in a liquid. Because he described that fact, the zigging and zagging is named after him—Brownian motion.
Amateur as well as professional scientists had already played with, and commented upon, Brownian motion for half a century before Einstein ever decided to try to describe quantitatively the nature of the motion. In his study, though, Einstein demonstrated that the movement of tiny things in liquid suspension is proof of molecular activity and, as a result, offered almost irrefutable support for the existence of atoms. The angst of the mid-twentieth century had been born. Welcome to the birthing cries of a world that understood, for the first time in human history, that we really could destroy the earth and each other totally, completely, without hope of escape. Welcome to Hiroshima.
In the third of his 1905 papers, Einstein—brilliant, sassy, and twenty-six years old—published the theory that, over the course of his lifetime, would cause him the greatest consternation. Based on his work on the electrodynamics of moving bodies, Einstein postulated the “special theory of relativity.” In effect, what special relativity did was overthrow any notion that there might be such a thing as absolute space or absolute time by showing that both are dependent upon an observer and that each of them is perceived differently, depending on the observer doing the observing.
Heisenberg and Uncertainty
The special theory led Einstein to argue, in his fourth paper, that matter and energy, which had always been thought to be separate entities, were equivalent, giving the world what is perhaps its most famous scientific formula: E = mc². But the special theory also led, in 1927, to what undeniably is the most famous principle in twentieth-century science—Heisenberg’s Uncertainty Principle. It was this that would break the heart of Einstein and, in many ways, that of his century.
The Heisenberg Uncertainty Principle has been reduced, by now, to an almost commonplace tidbit of everyday conversation: You can measure the speed (more precisely, the momentum) of something (in Heisenberg’s case, a particle) or you can measure its position; but you cannot measure both exactly at the same time. That is, the more you know about the speed of a thing, the less you know about its position, until finally one has to concede that the act of observing itself changes the thing observed.
Einstein saw Heisenberg’s Uncertainty as a corruption, and even a reprehensible misuse, of his special theory of relativity, arguing that it destroyed the basis for any “fact” in life. Einstein was right in his interpretation of the consequences, but Heisenberg was right in his science. Nor would the Heisenberg Principle stay safely tucked away in physics labs. Instead, “uncertainty” became the only fact that could be accepted as fact, not only in the popular mind, but also in large segments of the academic mind as well.
In particular, literary deconstruction planted its standard dead in the center of Heisenberg, claiming that there is no absolute truth, only truth relative to the perceiver. And, as an obvious consequence, all writing—be it sacred or secular—has no innate meaning until it is read and, therefore, has no meaning outside of the circumstances and disposition of the reader. Enter the battle of The Book. Enter the warriors, both human and inanimate, who will hack the already wounded body of sola scriptura into buriable pieces. Enter the twentieth century’s great, garish opening in the cable’s waterproof casing of story.[9]
Looking for the Real Jesus
But in the name of historical accuracy as well as fairness, we need to remind ourselves, before we go any further, that “Scripture only and only Scripture” really was, if not badly wounded, then certainly badly bruised, well before Einstein or Heisenberg ever came along. Their work would only reinforce and broaden an investigation already in progress.
At about the time this country was being established, a German theologian, Hermann Samuel Reimarus, first asked the question that would haunt the twentieth century far more than it ever haunted his own. Basically, what Reimarus asked himself and his colleagues was a deceptively simple question: What, he said, if Jesus of Nazareth and the Jesus of Western history and thought are not the same?
Although Reimarus eventually wrote a masterful treatise, The Aims of Jesus and His Disciples, to address the subject, neither his question nor his carefully considered responses unsettled many folk at the time, simply because they had little or no access to either Reimarus or his ideas. His basic question, however, would prove to be like the miseries in Pandora’s box; once it had been articulated, there was no putting it back in the recesses of academic halls and moldering libraries ever again. Most auspiciously, it would be asked in print again in 1906 by Albert Schweitzer in a book called The Quest of the Historical Jesus.
Schweitzer, unlike Reimarus, was a popular public figure, an organist of some international stature, at that time, as well as a clergyman. He lived, as well, in the early twentieth century, when the beginnings of mass communication—cheap books, ubiquitous newspapers, a reliable and inexpensive postal service—made it harder to keep ideas contained, especially if they were a bit scandalous or insurrectionist. And Schweitzer’s ideas were; for he concluded that Jesus of Nazareth was not the same entity as the Christ of Western Christianity and Western thought. He concluded as well that we could never know that “real” or historical Jesus. As a result of its huge, popular impact, Schweitzer’s Quest is usually regarded now as marking the end of one era in sola scriptura and empowering the opening of another.
Some four decades later, before the midcentury, scholars would begin to theorize, using literary deconstruction and form criticism, just where and how editors or redactors had changed original texts into those we today recognize as the canonical Gospels. Others would work to physically discover and define the accumulating layers of text that undergird the editions we have. The discoveries of Nag Hammadi and Qumran in 1945 and 1947 respectively, along with more recent archaeological finds, would furnish primary sources in physical support of much of what earlier had been only theory. By the closing decades of the twentieth century, Jesus scholarship, with Reimarus, Schweitzer, and Heisenberg as its intellectual forebears, had become the life work, in public space, of superb and popularizing scholars like Marcus Borg, John Dominic Crossan, Elaine Pagels, and Karen King.
What their work in aggregate seemed to offer up to public view was a Jesus who was as much guru and sage as God Incarnate. In response, other, equally well-known and popularly published thinkers and researchers, working with the same tools and as various in background as Fr. Raymond E. Brown and Rabbi Jacob Neusner or Bishop N. T. Wright, worked to lessen the subjectivity of Jesus scholarship by focusing on the Judaism in which He lived, contending that historical context is the soundest critical tool available to us for faithful exploration and discovery. Either way, Heisenberg, had he been alive in 2000, would undoubtedly have been amazed at just how much difference a little physics can make in a village church.
And though Einstein may have deplored Heisenberg’s Uncertainty Principle and correctly foreseen, unlike Heisenberg, what the cultural and religious ramifications of it would most surely be, he too cannot be allowed to leave the Scripture only–only Scripture conversation scot-free. In 1915–16, Einstein published what was the last of his great papers, his “General Theory of Relativity.”
Out of the mathematics of general relativity would come ideas and postulates that are themselves also matters now of household conversation: time as another, and fourth, dimension; time as capable of being slowed; the ongoing expansion of the universe; the Big Bang. And in conjunction with the work of other brilliant, popularly known scientists like the astronomer Edwin Hubble, the physics that general relativity crowned would eventually make it possible, on July 20, 1969, for Neil Armstrong and Buzz Aldrin to walk on the surface of Earth’s moon. In doing so, they walked on what always before had been the footstool of God, and that made all the difference. Literalism based on inerrancy could not survive the blow (though it would die a slow and painful death); and without inerrancy-based literalism, the divine authority of Scripture was decentralized, subject to the caprices of human interpretation, turned into some kind of pick-and-choose bazaar for skillful hagglers. Where now is our authority?
Enter Pentecostalism
But if 1905 had been an annus mirabilis, 1906 could hardly be called a slouch either. In February of that year, a young black preacher named William Seymour left Houston, Texas, headed toward Los Angeles and the call to come and preach his strange doctrine that baptism in the Holy Spirit was accompanied by the gift of speaking in tongues (both glossolalia and xenolalia). Within less than two weeks, the church that originally had invited Seymour to preach had barred its doors against him and his appalling doctrine, forcing him to move his sermonizing to the home of a supportive couple, Richard and Ruth Asberry. From the Asberrys’ modest home on North Bonnie Brae Street in Los Angeles, Seymour preached to a small, but growing crowd without incident or fanfare until April 9. That night, during the evening sermon, one of Seymour’s listeners, Edward S. Lee, suddenly spoke in tongues for the first time. Three days later, Seymour himself received the gift, as did many of the others present.
Word of what was happening on North Bonnie Brae spread like a wildfire through the Latino and Negro communities of L.A. and, shockingly enough for those days, through Caucasian ones as well. The next night, so large a crowd of every race and social class and both genders gathered on the porch of the Asberry home that the porch itself collapsed, doing damage as well to the house’s foundation. Two days later, on April 14, 1906, Seymour preached his first sermon in an old, cleaned-up and converted livery stable at 312 Azusa Street, and the rest is history. The rest is soul-changing, history-changing history, in fact; Pentecostalism would become a major player in the new rummage sale.
There had been a series of pentecostal-like events before the Azusa Street Revival. Some were as far away as Wales and Switzerland, and others as close as western North Carolina. Charles Parham, whose ministry was located in Kansas, for instance, was the one who originally had taught Seymour; and Parham is still regarded today as one of the founders of Pentecostalism. It is always Azusa Street, however, that is acknowledged as its true starting point. And over the next century, Pentecostalism, Azusa Street style, would sweep not only North America but the whole globe. By 2006, the number of Pentecostal and Charismatic Christians would exceed five hundred million, making them the second-largest body of Christians in the world, after Roman Catholicism.[10]
Because Pentecostalism had its roots deep in egalitarianism, it was to come into North American Christian experience as the first, visible fulfillment of the apostle’s cry that “In Christ, we are all one body.” Pentecostalism’s demonstration of a Church of all classes and races and both genders became a kind of living proof text that first horrified, then unsettled, then convicted, and ultimately helped change congregational structure in the United States, regardless of denomination. In addition, the often loud, often apparently disorderly, always musical and participatory worship of the Pentecostal movement came in time to make the worship of the established Protestant denominations look as if it were somewhere between corseted and downright dead by boredom. Participatory worship became the standard, especially in evangelical Christianity, which is Pentecostalism’s nearest kin in bloodline.
The impact of the African-American experience on North American Christianity was and remains enormous. To begin even to sketch it requires a freestanding volume just on that subject alone, a largess we do not have, unfortunately. Suffice it here, then, simply to say that the African-American community in 1900 was, by and large, the only part of American Christianity that had an active, native, or “largely untheologized,” community-accepted spirituality. One of the great gifts of Pentecostalism to the greater body of the whole Church was its origins in, and incorporation of, the African-American spiritual experience. The efficacy of historic black spirituality and the immediacy of palpable contact with the divine which it enabled have been central to Pentecostalism since Azusa Street. It is almost undoubtedly this last component of Pentecostalism that has caused it, quite literally, to encompass the globe as well as change the ways and expectations of non-Pentecostal worshipers.
All that having been said, however, we must hasten to say that in terms of the Great Emergence as an event in religio-cultural history, there is an even greater point to be made here. Pentecostalism by definition assumes the direct contact of the believer with God and, by extension, the direct agency of the Holy Spirit as instructor and counselor and commander as well as comforter. As such and stated practically, Pentecostalism assumes that ultimate authority is experiential rather than canonical. This is not to say, or even to imply, that Pentecostalism denies the Holy Scriptures. It is to say, rather, that forced into a choice between what a believer thinks with his or her own mind to be said in the Holy Scripture and an apparently contradictory message from the Holy Spirit, many a Pentecostal must prayerfully, fearfully, humbly accept the more immediate authority of the received message. The same thing is true when the contradiction occurs between a received message and the words of a pastor or bishop. Pentecostalism, in other words, offered the Great Emergence its first, solid, applied answer to the question of where now is our authority. Probably just slightly more than a quarter of emergent Christians and the emergent Church are Pentecostal by heritage or affinity, and they have brought with them into the new aggregate this central belief in the Holy Spirit as authority.
Leaving Grandma in the Rearview Mirror
Having come from so lofty a set of considerations as those about Pentecostalism, we need to look at something that appears far more mundane and less portentous, at least at first blush. That is, before we leave the early years of the twentieth century, we have to look at the automobile. It had been around for many years by the time 1900 arrived, especially in Europe, where men like Karl Benz were making automotive history long before the average American ever thought about driving one of the things. So it was not the automobile per se that would impact American Christianity. It was the Tin Lizzie, the Flivver, the Model T. Call it by whatever popular name you want, it came upon us in 1908, and it was affordable, reliable, easier than a horse and buggy to care for . . . and fun! America took to the roads and never looked back.
The car was a boon that, like a sharp knife, cut two ways, however. It freed Americans to roam at will, thereby loosening them from the physical ties that had bound earlier generations to one place, one piece of land, one township, one schoolhouse, and one community-owned consensual illusion, of which a large component was the community church. The affordable car enabled city dwelling in a way that had not been possible for many Americans in the past. It also provided, very early, the mechanism by which what had been the Sabbath became Sunday instead.
Family afternoons on Grandpa’s front porch after Grandma’s hearty Sabbath lunch gave way to spins out into the countryside with or without a Sunday picnic. Sabbath afternoons with one’s kin gave way as well to carefully tabulated afternoon calls on friends who lived down the road a bit. Within a few decades, the Tin Lizzie and her offspring would so erode the Sabbath that Sunday would become the day for shopping, for mall visits, movies, and dozens of Little League games, not to mention a significant number of major league ones. Sunday evening services all but disappeared; and early Sunday morning ones (or Saturday evening ones) were invented in order to allow the faithful to get their Sabbath worship over and done with early enough so that there would still be some Sunday left to enjoy.
None of this is inherently either bad or good, so much as it just is. What we have is a set of cultural shifts that came about, in large measure, because of yet another piece of technology, in this case the automobile. What does matter, though, is that Reformation Christianity had rested for centuries on biblical literacy, the nuclear family, and the conserving effect of shared, multigenerational reading, theology, and worship. While we may, at first glance, scoff at Norman Rockwell’s short, chubby, apron-clad, wispy-haired Grandmas serving feasts to multigenerational hordes, a second glance should tell us something else. When mid-twentieth-century Caucasian Protestantism lost Rockwell’s Grandmas, it lost a large part of itself.
It was Grandma, in general, who asked during each Sabbath lunch exactly what little Johnny had learned in Sunday School. And while Johnny might be forgiven for occasionally fluffing a question or two, his parents most surely would not, were it to be discovered that Johnny had not even been to Sunday School in the first place. It was Grandma as well who, by and large, rode herd on the preacher and his tendency toward fancy or newfangled sermons and imported theories of God. Grandma was, in essence, a brake—a formidable one, in fact—on social/cultural/theological change. And because she was and because she asked often and directly about the biblical instruction going on in her families’ homes, she served as something somewhere between the Archivist and the Enforcer of Protestant codes and sheer Bible fact and story. When the Tin Lizzie took away her kingdom of influence, it was Protestantism more than Grandma that came untethered and was diminished. We should note as well that the redefinition of traditionally female roles across all the generations was, and still is, a principal contributor to the shredding of the cable and the exposing of its parts. It certainly is one to which we shall return.
The Influence of Karl Marx
American Christianity in the first two decades of the twentieth century was directly impacted, of course, by more than scientific discoveries and technological inventions as such. Whenever the question of the rightful placement of authority begins to come into play, it is political theory that most markedly begins to change. It is, in principle anyway, the task of political theory to accommodate the secular part of the authority question by furnishing it with new answers. Unfortunately, the answers so born are never entirely secular in scope, implementation, or aftereffects.
More than one historian has remarked that the French Revolution of 1848 was born before its time. That is, it was a kind of limited (though deadly) and preliminary dry run for what was to become, less than seventy years later, the first in a series of wars that would mark the twentieth century as the bloodiest in human history. Karl Marx, with Friedrich Engels, published the Communist Manifesto in 1848; and Marx’s fingerprints were all over that revolution. Despite the ferocity of the revolt and the radical propositions that lay beneath it, Marx’s theories of economics and political structure did not enjoy broad circulation or really have much global impact until the closing decades of the nineteenth century.
As with Einstein, so with Marx, in that it would probably be impossible to overstate the influence he and his ideas would come to have on the world of the twentieth century. Like Einstein, Marx built upon the work of those who had come before him, being at times more a realizer than an innovator. In particular, Marx built upon the theories of Georg Wilhelm Friedrich Hegel. Hegel, who died when Marx was only thirteen, had taught that everything had, inherent in it, its opposite. Good and evil were not antithetical to one another, but rather were two parts of a thing that itself would exist only so long as the two were in opposition to one another. Once the two opposites in any thing had resolved their conflict, they would synthesize, and the thing they were would cease to be. Thus all life was only a becoming, never a being. And all of creation was simply pieces and parts of some great Absolute that was itself becoming.
Hegel’s dialectic and its corollaries were revolutionary ideas that, at the time, lacked any popular audience or influence. Marx’s contribution originally was to take Hegel’s Absolute and de-spiritualize it, so to speak, yielding what came to be known as dialectical materialism. He argued that the becoming process had to happen now and not later, on earth and in temporal affairs and not in some state of affairs-yet-to-be. To that end, government or the state becomes the presence of the Absolute on earth, and it is the duty and salvation of every person to serve the state. And to that end, all other forms of authority must be eliminated, principally all notions of god or God and all forms of organized religion. They MUST be stamped out for the state to be supreme, and the state must be supreme for the people to thrive.
Marx would mix this Hegelian heritage with his own theories of economics, publishing in 1867 the first volume of his other, great work, Das Kapital. The basic argument here was that those who make and own goods will always be looking for the means by which to make more things more cheaply. At some point, the owner-citizens would succeed so well that they would drive the worker-citizens, on whose backs their economic empires were built, to revolt. Such revolt would destabilize and wreck the state. Such a turn of events must, therefore, at all costs be prevented. Prevention lay in making sure that there never was reason for revolt, and that could only happen if the state removed all means of ownership from individual people and instead owned everything itself in trust for the good of all people.
It is a line of thought that is all too familiar to almost every North American Christian, regardless of his or her age. As an attempt to answer the question of where to place authority, it was a frontal attack not only on religion but also on traditional Reformation concepts about human responsibility, individual worth, and existential purpose. Twenty million people in the Soviet Union alone would be sacrificed on the altar of such thinking before Stalin was done with it.
But there was also a genuine attractiveness to Marx’s ideas, and we must be quite clear about that. Good people with bright minds and empowered backgrounds, many of them artists and singers and intellectual leaders, earnestly argued, often to their own social and professional detriment, the virtues of a socialist or a communist state. They argued against the chaos of money-based power and the recurrence of the devastation of worldwide depressions like the Great Depression of 1929. They argued, instead, for the advantages of an authority based on a rational determination of what is best for the most people at any given time and for a kind of proto-secular humanism. This approach, they argued, trumped completely some God-infused, biblically defined code or hierarchy that had been designed for premodern societies. Enlightenment and reason, they said, had set humanity free from ignorance and social vulnerability by furnishing us, instead, with scientifically accurate descriptions of what the cosmos really is and how it works.
An old axiom of folk wisdom holds that one always picks up a bit of whatever it is that one opposes simply by virtue of wrestling with it. As folk wisdom goes, this piece contains an inordinate amount of accuracy. Twentieth-century Christianity in this country met the statism and atheism in communist theory head-on, and American political theory militated from the beginning against the heinous brutality inherent in unfettered power. Nonetheless, we voted in Roosevelt’s New Deal and Johnson’s Great Society.
Likewise, the midcentury local church was reconceived as the centralized, hierarchical, and stabilizing organization, the life-giving replacement for, and integration of, all that had been lost when urbanization and automotive mobility ripped us away from a common imagination. Churches began to have more building programs for basketball courts and swimming pools and fellowship halls than for sanctuaries and naves. Hugely expensive to maintain as well as to build, none of those courts and pools and meeting halls had as much to do with spiritual or religious growth in faith as they did with effecting a uniformity of social experience and formation that would be conducive to a uniformity of belief. And the thing to be believed in was a God-infused, biblically sanctioned code of conduct that would have made Jonathan Edwards proud. More to the point, as a code of conduct, it was to be believed in as a means of salvation which, as it turns out, is considerably different from believing in God-among-us as a means of salvation.
The Spiritual Strand and Alcoholics Anonymous
By the 1970s, the young men and women who had been products of all those basketball courts and fellowship halls were rebelling against the burden and the sterility and the disconnect with reality that those institutions constituted. Those children of the late ’40s and the ’50s who were entering their adulthood would be spiritual, they said, but no longer and never again religious. The first strand in the braid had just been pulled up out of the cable for inspection. It would take almost half a century to finally work it back into place again. But more than just rebellion per se was behind the “I’m spiritual but not religious” mantra.
When speaking of which sociocultural events in the twentieth century most affected North American Christianity and its shifting relationships with spirituality, many sociologists of religion will cite the founding of Alcoholics Anonymous as the first in the list of prime movers. AA officially dates itself, as it should, from 1935 when Bill Wilson and Dr. Bob Smith began to formalize a method of addiction recovery. In actuality, as with Pentecostalism and Azusa Street, so with AA and 1935. That is, AA also had its precursors, primarily in parts of early twentieth-century evangelicalism like the Oxford Group or Calvary House. It certainly had its roots, to some extent, in the work of William James, whose Varieties of Religious Experience, published in 1902, still stands today as one of the early twentieth century’s most seminal books. In any event, by 1935 Wilson and Smith had evolved six “principles” or “steps” toward recovery. Shortly thereafter, Wilson would rework the six into smaller units, the result being the now-familiar Twelve Steps of almost every recovery group since.
The informing thing about AA, however, was not so much the Steps themselves as their bases and their implementation. The Steps repeatedly make the point that the addict can be helped only by God . . . not God by the name of Jehovah or El or Adonai or Yahweh or Jesus, but “God as we understand Him.” “Choose your own concept of God” was to be one of the early principles that liberated Wilson from his own torment, and he would remain true to it throughout his life. God could even be addressed not as God, but as a/the Higher Power. In fact, health itself seemed to depend upon one’s having the power or facility to make just such a leap from the doctrinal to the experiential, and who could effectively argue with that, especially given the increasingly obvious success rate AA was producing?
More than the principle of generic God, which arguably has its popular accession here, AA also assumed from the start that the addicted were better, more effective healers of the addicted than were non-addicted (or non-confessing) experts and authorities, including most particularly pastors and clerics. Now help—effective, productive, demonstrable help—was coming from other, equally wounded and empathetic nonprofessionals. While the American experience was built from the start on anticlericalism, AA and its success, however unintentionally, delivered a serious blow to the role and authority of the clergy, especially Protestant clergy, in this country. That professional standing and influence would receive other, debilitating blows over the rest of the twentieth century, especially during the Civil Rights movement and the Vietnam War; but AA was the first to strike a blow right at the Pastor’s Study as the seat of all good advice, holy counsel, wisdom, and amelioration.
Not only did AA, almost by default, begin to supplant the pastoral authority of the professional clergy and open the door to spirituality in the experiencing of a nondoctrinally specific Higher Power, but it also revived the small-group dynamic that would come to characterize later twentieth-century Protestantism and, paradoxically, to enable the disintegration of many of its congregations into pieces and parts. Indeed, so dramatic was the aftereffect of AA’s small-group model, some commentators do not even regard it as having had any substantial relation at all to the small-group phenomenon of early Methodism, choosing instead to see AA’s approach as being of a different and far more intentional and defined kind. Whatever the case may be, AA opened the floodgates to spirituality by removing the confines of organized religion. The great irony in all of this is that many, many AA groups now meet in church buildings and/or are housed in church-owned property.
Strangers and Countrymen
Even those historians of American religion who commence their commentary on the “spiritual but not religious” phenomenon by citing the advent of AA as its prime enabler, have no problem putting their finger on 1965 as another—or the other—great impetus to the burgeoning of free-form spirituality during the latter half of the last century. This was the year in which Congress passed the Immigration and Nationality Act of 1965.
America, which in its common imagination sees itself as a country of immigrants, has in reality had a very checkered history where immigration policy is concerned. During the closing decades of the nineteenth century, much of the bitterness and furor was over Chinese immigration and the influx of cheap labor that was synonymous with it. During those years, the colorful railroad barons of the era were trying to outdo one another in laying down the tracks that, by century’s end, would connect our East Coast with our West. The problem was that the barons were not laying those tracks with American labor. Indeed not! It was the work-for-next-to-nothing Chinese immigrant who was exploited. And while the barons profited outrageously and the Chinese labored in conditions somewhere between serfdom and outright slavery, it was the average American manual laborer who, caught between the two, starved.
The resulting animus was so vocal and ultimately so violent that Congress in 1882 banned any further immigration of Chinese into this country. Over the years after that, other bills barring entry to all people of Asian descent and/or denying full citizenship to those who were already in the United States were enacted; the United States became almost entirely devoid of Asian influence or perspective; and Pearl Buck’s China was as close to any cultural engagement with “the Far East” and its ways as Americans ever got. But then the Second World War came and the Korean War came and, after that, the Vietnam War began its slow march toward disaster, all of them involving Asian theatres of operation, all of them eventuating in person-to-person, human contact between young Americans and the peoples and ways of Asia. Human nature is driven by the imperatives from which it comes, though, and with increasing frequency, the person-to-person contact slipped into romantic love between soldiers and the Asians with whom they wished to spend their lives and by whom they wished to have children. This time the pressure on Congress was diametrically opposite to that of the previous century. This time the cry was for full freedom of immigration and full access to the privileges and status of citizenship. The 1965 Act granted those very things. It also opened the doors wide to a spirituality that did not require a wrap of religion to function.
Generalizations are dangerous in that they invite the truth of what they say to be destroyed by the inaccuracies or inapplicability of the details that they are generalizing. Nonetheless, generalizations usually have a substantial core of truth in them, as well as provide an economy of observation. The generalization to be made here is that before the coming of the twentieth century, the bulk of American Christianity was word based, rationally argued, and singularly lacking in aesthetic experience. For a rural culture closely entwined with the flow of the natural world and deeply engaged in physical labor, such limitations arguably are of minimal concern. But by the end of the First World War and certainly by the end of the Great Depression, Americans were no longer primarily rural. Instead, they were city dwellers and technology users with that previously unheard-of and very mixed gift of “free time” or “time on their hands.”
A New Religion
The boundary line between free time and boredom is not a clear-cut one; but eventually free time will lead most of us to increasing awareness of our internal experience. The problem for thousands of American Christians—and especially for the American Protestant majority—was that the Christianity they had been born into had given them little or no religion-based vocabulary and few or no religion-based practices or canons by which to articulate, assess, utilize, or interpret this burgeoning world of subjective experience. The words of the more or less new science of psychology were ready to hand, but they were also, by intention, as rational, clinical, sterile, and unsatisfying as they could possibly be. And then came the 1965 Immigration Act—or more to the point, then came Buddhism.
Then came Buddhism with its rich, rich narrative of wisdom experience, with its centuries of comfortable conversation about the life of the human spirit, with its full vocabulary and lush rhetoric, with its sensible and sensate practices for incorporating the body into the spirit’s world, with its exotic ornaments and tranquil aesthetic, with its assurance that worthy and even enviable cultures can arise from meditation as readily as from a frenetic work ethic, with its emphasis on stillness and its teaching about the reality beyond the illusion.
Then came Buddhism with all the tools and appointments needed to enter the subjective experience fully and fearlessly . . . fully, fearlessly, and unencumbered by theism.
The pivot point here is not, per se, the fact that Buddhism, at least in some of its branches, is nontheistic. The pivot point is that, because of its being nontheistic, Buddhism can insinuate itself, quite innocently even, into the practice of almost any institutionalized religion without abrasion or apparent conflict for that religion’s faithful. But what happened after 1965 and for two or three decades afterward was much nearer to a wildfire than to infiltration. What happened was that American Christians—and American Jews with them—rushed like the subjectively starving people they were toward the feast of Asian spiritual expertise and experience. Books on how to be a Buddhist Christian or a Buddhist with a proclivity for Christian theology made the country’s bestseller lists time and again. Sanghas sprang up, as did Buddhist retreat centers, most of which drew non-Buddhist retreatants in increasing numbers; and satori became a buzzword as well as a goal. The gates were indeed now open. The case had been clearly made that the journey of the spirit did not require the baggage of religion to be a worthy and rewarding trek.[11]
The Drug Age
At the same time that Buddhism was opening new worlds to Americans’ exploration, so too was the third, great causative agent in this burgeoning of nondoctrinal spirituality. The drug age that came upon us in the 1960s and ’70s probably has spawned more human sorrow and waste and wreckage than did any of the century’s wars. Yet devastating as that time was and debilitating as its consequences continue to be, the drugs it proffered also proffered a radically different understanding of reality and a radically adjusted perception of subjectivity.
Not only were young America’s initial experiments with drugs often approached in religious terms and their results expressed in religious rhetoric, but the vividness of the experience also militated for some deeper, more sophisticated cartography of what the world of the nonphysical was and by whom or what it was inhabited. The barrenness in American culture of Christian teaching about spirituality—and indeed the barrenness of the spirituality that was taught—was equaled only by the stumbling and ineptitude with which an ill-prepared American Protestantism began to try to address the shifting situation. The result was a further exacerbation of “I’m spiritual but not religious” among those who knew to the depths of their interiors that there was more here than the Church had ever told them about. Maybe, even, there was more here than the Church had ever known . . . a possibility very analogous to the repercussions of Columbus’s not falling off the flat world of Latin theology, and with much the same disorienting consequences.
While no one wishes to belabor a point, especially in this kind of general survey, we still cannot leave the drug era without noting as well that more than any other single thing, drugs opened to public view the question of what is consciousness. As a question, the nature of consciousness certainly, as we have already noted, has long roots in history and strong ones in the work of nineteenth-century medicine and pure science, but it has its first stentorian cry of full birthing here. There is a clear trajectory from Timothy Leary straight to the Great Emergence and our current disorientation about what exactly consciousness is and we are.
The Erosion of Sola Scriptura
When we look at the question of consciousness in terms of the drug revolution, we obviously are revisiting one of our secondary questions, this time in terms more of the experiential and immediate than of the theoretical. We need to stop a moment and do the same thing now with our overarching question of authority, and for the same reasons.
As we know, sola scriptura, scriptura sola had answered the authority question in the sixteenth century and, more or less, had sustained the centuries between the Great Reformation and the latter half of the nineteenth century when the seeds of the Great Emergence were being planted. But there was—and still is—another, ongoing chain of experiential events that leads inexorably from the nineteenth century straight to the disestablishment of “only Scripture and Scripture only” in American Christian belief.
The first such blow to Luther’s resolution of the authority question came in this country with the Civil War and the years preceding it. While the Bible does not order up slavery as a practice to be followed by the faithful, it certainly does acknowledge it as an institution. And while it does not sanction slavery, it likewise nowhere condemns it. We do ourselves and our understanding of our forebears a great disservice if we do not acknowledge the fact that on the very basis of this biblical ambivalence, thousands and thousands of godly and devout Christians fought for the practice of slavery as being biblically permitted and accepted. No one presumably is naive enough to think that the War Between the States did not have huge cultural and economic factors at work in every heated debate that preceded the outbreak of war. It is equally naive and redactionist, however, to ignore the fact that America’s Protestant churches almost all split in two, violently and on theological grounds, over the issue of scriptural teachings about slavery. Those agonized cries on both sides of the divide have to be remembered now for what they were: the fearsome cries of those for whom the undergirding of “Scripture only and only Scripture” had been, if not ripped asunder, then most certainly set atilt.
Because the business of one person’s owning another person is neither morally defensible nor economically sensible in an industrialized society, we got over this major blow to sola scriptura. It was a slow and sometimes exquisitely painful recovery, but we did recover, until the Great War rattled our bars again, this time over gender instead of race. Although we may argue with some success that the Garden of Eden does not really make woman subject to man, it is impossible to argue that St. Paul does not operate from that principle. Yet now, in this new century, American women were demanding with increased ferocity their equal enfranchisement in American life and politics. This clearly was a violation of the Bible’s way! . . . Well, it may have been, but the truth was that the biblical way simply could not stand up to the grinding, day-by-day onslaught of domestic pressure. In a relatively short time, women got the vote, and men got their suppers hot and on time again. It was hardly a religious solution, but nonetheless it was a very welcome one.
By midcentury, a far more intractable question had arisen, however: that of divorce. There is almost no way to revisit the divorce debates without unearthing personal stories of the abuses and horrors that led, ultimately, to its acceptance into American Christianity. Every family has its tales about the great-aunt who was beaten routinely by her husband or the family reduced to chronic illness and malnutrition by an alcoholic householder or the distant cousin who was repeatedly abused sexually because the non-offending parent could neither control the situation nor find faithful means to escape it.
In all truth, we must acknowledge that what the Bible actually says about divorce is not quite so black-and-white or unbending as were the Church’s teachings on the subject. That distinction either was not seen at the time, however, or else it was seen by the average preacher as only a fine line which it was very dangerous to cross. But in time divorce came anyway, leaving in its wake the inevitable and predictable carnage of family instability and too easy an escape from the problems of shared living. And leaving in its wake as well another—and this time more intimate and personal—blow to sola scriptura. Now the Church was accepting what clearly it had taught against for centuries. Beyond that, and even more discouraging or debilitating, was the fact that before century’s end, the Church would be accepting divorced clergy as not only professionally able but also morally uncompromised.
The next assault in this progression of assaults was the ordination of women to the Protestant clergy. Here it is indeed impossible to wiggle around the scripturally recorded edict that a woman must keep quiet in the assembly. If she has questions, St. Paul says, she is to ask them of her husband later and at home. This time there was not, and could never be, any question of alternative interpretations or variant translations or Jewish practices that had been rendered obsolete by Christianity’s coming.
The ordination of women was followed, of course, by their elevation to the episcopacy in the Episcopal Church in the United States. Clearly the battle of “Scripture only” was being lost. Now there was only one more tool left in sola scriptura’s war chest. There was only one more pawn left on the board, only one more puck on the playing field. Enter “the gay issue.”
To approach any of the arguments and questions surrounding homosexuality in the closing years of the twentieth century and the opening ones of the twenty-first is to approach a battle to the death. When it is all resolved—and it most surely will be—the Reformation’s understanding of Scripture as it had been taught by Protestantism for almost five centuries will be dead. That is not to say that Scripture as the base of authority is dead. Rather it is to say that what the Protestant tradition has taught about the nature of that authority will be either dead or in mortal need of reconfiguration. And that kind of summation is agonizing for the surrounding culture in general. In particular, it is agonizing for the individual lives that have been built upon it. Such an ending is to be staved off with every means available and resisted with every bit of energy that can be mustered. Of all the fights, the gay one must be—has to be—the bitterest, because once it is lost, there are no more fights to be had. It is finished. Where now is the authority?
The Corporeal Strand
Before we leave this particular line of thought, however, we need to note one more thing of significance about the progression of assaults on Protestantism’s interpretation of Scripture as sole authority. While the erosion of sola scriptura is clearly an erosion of the base of traditional, denominational Protestantism’s authority, we must remember that it is a corporeal, not a spiritual or moral, issue. It is part of the second strand of the interior braid in our cable of meaning. That is, because Protestantism planted its standard dead center of a biblical absolutism without mercy or malleability, it planted itself in doctrine, in a codified set of beliefs that must be adhered to. Protestants are and always have been “believers,” one’s beliefs becoming one’s self-definition of what “Christian” is. Defined as a codified set of beliefs, doctrine, once it exists, is by definition proof positive that an institutionalized form of religion exists. It is proof positive that a set of religious sensibilities has now assumed body and form and power. It is corporeal.
Our North American fingering of the second or corporeal strand in the braid has been going on for decades, of course, in more ways than changes in social mores. Most commonly, it has presented itself as dissension over a proposed new hymnal or a translation of Holy Writ that differs in some way from that of previous decades or a reintroduction of ancient practices more associated in the popular mind with Latin Christianity than with Early Church Christianity. Raucous as some of those scrimmages have been, they have lacked the trans-denominational ferocity of the race/gender/sexual preference progression. They were not, in other words, fights that jumped a communion’s walls to involve the surrounding, general culture. The fact that race/gender/sexual preference have jumped the barriers and become cultural fights means that we may be nearing the end of our absorption with the corporeal strand; we may be almost ready to think about stuffing it, like spirituality, snugly back into the braid so that we can begin to focus, ever so loudly, on morality.
The Moral Strand
Our re-formation absorption or fascination with morality—with the third strand in the cable’s braid—is usually presented as having begun to rear its head with Roe v. Wade and the abortion issue. As an interpretive position, that one is arguable. That is, the protesting pro-lifers generally claim the doctrinal position of biblical literalism as the basis for their stance. “Thou shalt not kill” and “Let the little children come unto me” are indeed clearly biblical, as well as pertinent, citations. On the other hand, what allows the argument (and what will block its resolution for many years) is a moral, rather than a doctrinal, issue. That issue is the distinctly emergent, definitively second-tier, question of what is and is not a human being. Is a morula a human being or a product of conception? Does a blastula know itself? Does an embryo? A fetus? When? And is knowing self a definition of life? Is the perception of pain, life? By what standard of assessment? And so on and so forth.
Where one chooses to position the pro-life/pro-choice debate does not change the fact, however, that since April 2005, we, as a culture in re-formation, have been deeply preoccupied with fingering the third strand of the braid. Terri Schiavo died in April of that year; and the months running up to her death and those running down from it since have been ones of distinctly moral debate. “Thou shalt not kill” still appertains, but to permit death is not the same as to inflict it. And the distinction between permit and inflict lies inexorably buried in the question of what is human consciousness and/or consciousness’s relation to humanness.
Almost as much to the point is the fact that mercy is too fluid a concept to be doctrinalized. Yet, it is theories of mercy that shape and inform the morality of permit. We have sensed this for quite some time now, of course. As a people, we were first flummoxed by it well before the Schiavo case, in the public furor that attended Dr. Jack Kevorkian. The problem is that, all these years later, we still have not conceptualized an ethos based on it. Generally accepted principles of morality are a work in progress for emergence culture, in other words. Presumably, they will be for quite some time yet.
Technological Advances
Time and space will hardly permit the elaboration of some of the three-dozen-plus other social, technological, political, and cultural changes that rose up in the peri-Emergence of the twentieth century. Certainly, before we leave this part of our discussion, though, we need to acknowledge just a few of them, even if we do so with no more than a brief mention.
We need to remind ourselves, for instance, of two things we already know: first, the religious expression or result of the Great Emergence is a new configuration of Christianity, and second, this new “emerging” or “emergent” Christianity is fundamentally a body of people, a conversation, if you will. Only after that does it become a corpus of solutions and characteristics, accommodations and principles. It is a conversation being conducted, moreover, by people from diverse cultures and points of reference, as well as from widely divergent Christian backgrounds.
As we will soon see, approximately one quarter of today’s “emergents” and “emergings” are Roman Catholic, not Protestant, in background and natal formation. For that reason, any treatment of the peri-Emergence must acknowledge the presence and enormous, formative impact of both Vatican I and Vatican II on Roman Catholicism in particular and on re-traditioning and emergent/emerging Christianity in general.
Vatican I, convoked in 1869, technically did not end until 1960, when Pope John XXIII formally closed it in order to make way for Vatican II in 1962. The two councils, which have been the basis of innumerable volumes in and of themselves, anticipated, as Protestantism did not, the central questions of the new re-formation. In effect, they did much of the original spade work or heavy lifting, so to speak, in that they attempted to forestall the questions by answering them before they could be fully articulated in the communion at large.
While Vatican I most famously dealt with the authority issue by establishing the principle of Papal Infallibility as dogma, it also dealt extensively with Latin understanding of Scripture and its applications, origins, and role. Vatican II, which is more familiar to most Americans, was a course correction of another sort. That is, it sought to ameliorate much of the Church’s traditionalist reaction to modernism; but it was also deeply engaged with the issues surrounding ecumenism, interfaith dialogue, and the formulation of an acceptable theology of religion. Regardless of what form or forms of Christianity may rise up out of the Great Emergence, in other words, it is safe to say that much of the thinking and many of the effectual conclusions will have had their initial roots in the Vatican Councils.
We need, certainly, to recognize here the impact of medical advances: first, how they drastically changed the form and nature of perceived human vulnerability and, as a result, the popular understanding of exactly what the role of the Church and/or its clergy was and is in healing. Second, those very advances, with their greater skills in defeating disease and staving off death, have eventuated, obviously, in questions exactly like the Schiavo and Kevorkian ones. Less flamboyant and far less theoretical and distant, however, are the questions they have evoked about routine geriatric treatment and end-of-life intervention, its morality, its imperatives, its costs, and its standards.
We must recognize that the coming of individually programmed technologies like the Sony Walkman or the iPod or the programmable cell phone made superb music not only accessible outside of churches and concert halls, but also made it highly participatory. One has only to watch folk, their ears soundly plugged, walking down the street with their fingers clicking, their feet jazzing, and their eyes half closed to understand why performed music coming from ordinary organs to seated audiences in meetinghouse sanctuaries lacks a certain immediacy and/or street appeal. Perhaps no other single thing has so threatened and changed the hegemony of formal Christian worship as has this shift in our general affection from performed to participatory music.
We cannot ignore the fact that computer science has unleashed upon us nanotechnology and artificial intelligence and concepts like the Singularity with all their concomitant legal, moral, and religious questions. The problem inherent in all of them is that we are a public whose extant religious institutions have to date shown themselves to be ill-prepared both theologically and intellectually to wrestle with the practical implications involved in such intellectual and technological developments.
We must acknowledge as well that the world has indeed gone flat again, the Reformation’s nation-state having given way to the Emergence’s globalization. Cash, which replaced blood as the basis of power during the peri-Reformation, now has had to cede power over to sheer information in the Emergence. And to some greater or lesser extent, every social or political unit is in thrall to those who know the most about how to destroy the most or expedite the most, whether such threatening agents be next door or three continents away.
We cannot ignore the passing of much religious experience, instruction, and formal worship from sacred space to secular space and, perhaps even more significantly, into electronic space. The progression from the radio preachers of the first half of the twentieth century to the television “sermons” or visits of Bishop Fulton Sheen in the midcentury to the televangelists of the latter half of the century to the churches and worship sites of the Internet is an uninterrupted movement to a more and more interiorized or imaged religious praxis. Millions of Americans now receive their entire pastoral care and have their whole religious instruction and engagement on the Internet through websites ranging from the sociability of worship in Second Life to the prayerful quiet of gratefulness.org to the informational and formative offerings of sites like beliefnet.com.
Nor can we, in speaking of the computer and cyberspace, forget that both have connected each of us to all the rest of us. The hierarchal arrangement or structure of most extant Churches and denominations is based on the hierarchal arrangement of the Reformation’s evolving nation-states. It is, however, quite alien and suspect, if not outright abhorrent, to second-generation citizens of cyberspace where networking and open- or crowd-sourcing are more logical and considerably more comfortable. In our connectedness, of course, we also experience with immediacy the pain and agony, incongruities and horrors, of life as it is lived globally, forcing the question of theodicy to take on a kind of total-humanity angst or urgency that has not accrued since the Black Death leveled the earth five and six centuries ago. The rise of aggressive atheism in the opening decade of the twenty-first century, in fact, finds much of its explanation and raison d’être in this very fact.
It has been said over and over again—and quite correctly—that the Reformation’s cry of sola scriptura was accompanied and supported by the doctrine of the priesthood of all believers. The computer, opening up, as it does, the whole of humankind’s bank of collective information, enables the priesthood of all believers in ways the Reformation could never have envisioned. It also, however, opens up all that information to anybody, but without the traditional restraints of vetting or jurying; without the controls of informed, credentialed access; and without the accompaniment or grace of mentoring. It even opens up with equal élan the world’s bank of dis-information. To the extent that faith can be formed or dissuaded by the contents of the mind as well as those of the heart, then such license has huge implications for the Great Emergence and for what it will decide to do about factuality in a wiki world.
Rosie the Riveter
But before we conclude our overview of how the Great Emergence came to be, and of the more obvious events of the twentieth century that have shaped emergents themselves, we need to look in detail at one last chain of circumstances. At first blush, this one may seem as peripheral as did our discussion of the coming of the Model T (to which it is, by the way, related). In reality, though, like the coming of the family automobile, this string of changes has worked in concert with the rest of the twentieth century to create what arguably may be one of the most informing elements of them all.
Her name was Rosie—Rosie the Riveter—and she was born in 1941. That was the year that the gathering storm of World War II broke forth in all its fury, and there was no more accommodation. America was at war, the irony being that we had next to nothing with which to fight a war. Our tanks and guns and ships were all antiques wrapped in mothballs or else they were the property of Japan, to whom we had sold them some years before simply as a means of getting rid of them all. The First World War had, after all, been the war to end war . . . in theory, that is.
When war was declared, soldiers were mustered up and conscription was begun, the result being that over the next five years almost every able-bodied American male was on active military duty. We affectionately called him/them “Johnny” and prayed for safety now and peace soon. But Johnny had nothing with which to protect himself and very little with which to fight his way toward peace. Johnny did, however, have a wife. We, within a matter of months, came to call her Rosie; and the years would make of her one of America’s most loved and honored icons.
The traditional family, the so-called nuclear family that the peri-Reformation created and Protestantism enshrined, was, as we all know, hierarchal. The male head-of-house was the unit’s chief defender, provider, and director. Second in command was “his” wife whose area of influence and responsibility was domestic primarily, and social only secondarily, if at all. The children were the plebes of the family, but there was no question that in most cases, it was for them and their furtherance that the family existed. The father’s economic and managerial efforts and the mother’s domestic ones were directed toward the support and maintenance of the home.
When war came, however, and all the Johnnies went off empty-handed to fight, the American government turned to the only workforce still available to us. We turned to Rosie. And to get Johnny safely home and to assure that their children would never live under an alien regime, America’s women responded. Young and middle-aged women who had never worked for pay in their lives, much less outside of their own homes, took their little ones down the street to Grandma’s house or Aunt Susie’s, then rolled up their sleeves, punched in their time cards, and went to work throwing the rivets that made the planes that made America and home safe again.[12]
By any name, what America’s women of the 1940s did was an amazement. Women, made strong by years of hoeing gardens, toting wet laundry, chopping stove wood, and riding herd on children, took their mechanical and economic naiveté in hand and assumed the very same jobs that their husbands had always said were unladylike, not fitting, too arduous. If one listens to the Rosies of World War II, however, if one reads their memoirs and letters, one finds not the faintest whiff of feminism. If there is pride here—and there is—it is pride in a job well done for the sake of protecting what was and is.
Certainly there was a paycheck. It was needed, because Johnny’s army pay was hardly equal to the routine costs of caring for a house and children. If there was a shifting about in the nature and range of social contact, then that was only coincidental to the business of throwing rivets. If there were a certain subtle easing of stress when there was no director other than one’s self to determine domestic policy and decisions, then it was embraced as compensatory, not as a pleasure to be desired forever. If there were a kind of unnatural relief in being able to hand one’s children off to others for much of the day and sometimes even for a night or two, then weariness and duty overwhelmed any luxuriating in some sense of false freedom.
Before the war’s end, over twenty million American women would be gainfully employed in defense work. Once the war was over, though, and once the men were home, most of those women went back quite willingly, even gratefully, to the domestic role they had originally been reared to fulfill. There were two or three problems, however, with that resumption of business-as-once-it-had-been. The first was the kind of restiveness that attends when one has seen a wider world and then is returned to a more socially, fiscally, and potentially restricted one. Perhaps not an expressed or even an explosive problem, that restiveness was nonetheless an erosive one.
The second problem was that with war and the increasing sophistication required to win it had come such technology as the world never before had known. There was little or no chopping of stove wood to occupy an hour a day and considerable amounts of energy. The stove in the kitchen worked on switches, and the heat came from an automatic furnace in the basement. There were no more dirty clothes to scrub on the scrub board, and no more wet laundry to tote outside and hang on the line. One machine did the washing, and six inches away was the matching machine that did the drying. Sweeping and mopping gave way to vacuuming. In fact, Mrs. Johnny found her gender-assigned work strangely lacking in physical outlets or logistical challenges. She also found herself possessed of hours of time and little notion of exactly what to do with them.
The third problem was that the children whose fathers had left for war and come back again, whose mothers had worked the factories and manned, quite literally, the war effort, remembered a different domestic structure and a different set of domestic politics. They remembered, and their notions of home had been shaped by, five years when the rules had been different. They remembered when Mama had been somebody, when her picture—or the picture of some other lady just like her—had been plastered on walls and public buildings all over town as evidence of the best of America and the American spirit. Some of them—a lot of them, perhaps—remembered when there had been no fights after lights-out, when harsh words overheard had not threatened one’s sense of safety and stability. Some of them undoubtedly remembered a time when mama had had money to share or even, occasionally, to spend on her own fancy.
At the risk of once again generalizing too much, it is still true that the stereotypical or average Rosie took care of her restiveness by increasing her social life via the telephone and the nearest church. The church, in fact, became for her and many of her kind the solution of choice for that second problem of freed or empty time. The midcentury church could invent programs faster than its women congregants could man them; and busy is, if not good, then at least sedative. In effect, Rosie morphed into June Cleaver, and Johnny morphed into Ward. Beaver, being a boy, was the cliché of national choice for all the happy children in post–World War II, proper, American, Christian society.
The pity was that Beaver had a brother, Wally, but no sister. Had she existed, however, she might have left us some kind of archival record of how she got from being June’s little girl to being one of Betty Friedan’s groupies. The third problem, in other words, was one neither Rosie nor Johnny could fix. The memory—the actual, lived knowledge—of another way of being female was ingrained in the heads and hearts of thousands of young women who had been born under the original model, been reared in the amended one, and been returned in the heat of adolescence to the original one.
They had no catchy name, those young women who had seen a different way of being female; but they had fury and intention. Never, never would they be the submissive wives their mothers had begun as and returned to. They would riot and defy, but they would also get themselves college educations and teach themselves and each other financial acumen. They would break the old rules and live in freedom with a man, but without the entanglement of legal indenture or the liability of common finances. They would work harder than men, if that’s what it took, but they would, at whatever cost, be respected as equal, not secondary, citizens.
The world had pivoted.
Family Reconfigured
Re-formations do indeed always have the requirement of answering anew the question of the proper location and definition of authority. They likewise always have one or two subsidiary questions, like the ones the Great Emergence faces in the need for a theology of religion and in the definition of what “human” is and what consciousness is and does. There just may be, however, a third subsidiary question for this re-formation of ours, though ours will not be the first upheaval in which it has reared its head. The third question is, “What now is society’s basic or foundational unit?”
For five hundred years, the nuclear family was the established unit upon which the larger society was itself established. In this country alone, for over three centuries, everything from our legal codes and political proselytizing to our religious propaganda and church programming assumed and rested upon that unit with its traditional deployment of responsibilities and its unquestioned chain of command. When the country preacher of mid-twentieth-century America decried divorce as a threat “right at the heart of America,” he was neither in error nor benighted. He may not have been arguing from religious conviction so much as from his own private unease about what a Pandora’s box there was for the status quo at the end of that road; but he was still right.
When, in the same two or three decades, not only divorce, but Rosie came upon us, there was no turning back, except for one small thing. True equality of the sexes in opportunity and public power would forever be limited, or so the common wisdom held, so long as women were emotionally and physically compromised by monthly menses and by pregnancy. Those fundamental functions of biology would hold forever, and forever guarantee at least some periods of vulnerability when the male’s natural strength and protective aggression would be necessary and appreciated. And that would have been true had 1960 not come upon us and, with it, the release to general use by the Food and Drug Administration of the birth control pill.
It slipped up on us, so to speak. Greeted first with curiosity, then with tentative acceptance, and within two or three years hailed as God’s gift to the overly fertile, the pill soon thereafter became God’s gift to emancipation, God’s tool for total equality. The playing field was now level in a way that even legal divorce could never have made it.
“An important meeting is coming up at the office next Friday, and I must be at my very best, but my period is due on Wednesday. I can’t afford the distraction or the dullness of menstruation, so I’ll just take an extra day or two of pills. It won’t hurt anything, and nobody will be the wiser,” Rosie’s daughter says to herself. And she, too, is right. It did not hurt anything, and it did ensure that she was at the top of her game. And by the turn of the twenty-first century, not only would the American woman have changed, but so also would the pill. By the turn of the century, the science behind the pill would have advanced enough so that not only could America’s women delay childbearing for as long as they wished, but they could also completely block menses for years without any obvious detriment.
There is, again, nothing inherently right or wrong in these changes. There is only change itself. What change meant, in this scenario, was that now the average husband and wife were, first, a two-income family. Increasingly, those incomes were close to equivalent; sometimes they were even disproportionate in favor of the woman’s wages. Because money is power, money—a salary check of her own—is also freedom; or it is a ready-to-hand ticket to freedom should the present arrangement cease to be acceptable for some reason or other.
Additionally, both marriage partners, for the first time in American history, were receiving not only their fiscal but also their psychological income and rewards from sources external to their family unit. And for perhaps the first time in human history, the home—the physical place and the children and relationships that were in it—was not what work was first and foremost about. Once upon a time, the father had gone forth to conquer the world only so he could bring the world, or at least a portion of it, home as trophy and enabling means for the family. Once upon a time, the mother had been there always to soothe and appreciate the father and to create the home for which he worked. Now both went forth to conquer the world. In doing so, they had to “make arrangements for child care,” which usually meant that one or another of them also had to pick up weary children on the way home from work. And home? Why, home was no more the reason for work. It was instead the place where all the members of the family came to regroup and regain the energy required to go back out there and conquer again.
With the automobile, as we have noted, we lost some of the conservatory influence of the traditional matriarch. With the acceptability of divorce, with the Rosie years of World War II and their aftermath, and finally with the coming of the pill, we lost the traditional mother, and with her going, we lost the traditional or nuclear family. Census figures, early in the twenty-first century, predictably enough, were already showing declining birth rates among women of European descent and an advancing age, across the board, for first pregnancies. But they were also reporting for the first time in our history that just barely—by a slim point or two, but still irrefutably—more Americans lived in nontraditional family structures than traditional ones. Slightly more of us, in other words, lived alone or out of wedlock or in extended families or with affinity groups than lived in households composed of married partners of the opposite sex rearing the biological or adopted children of their union. Where now—or what now—is the basis for our social order?
While we do not presently know the answer to that question (though there are some intriguing and educated guesses moiling about), we do know one of the more obvious problems that has arisen from our lack of, if not an answer, a temporary fix. The most obvious is—and has been for three or four decades—that once the female is occupied outside the home for a full working day, she suffers the same physical and mental exhaustion as does the male. What that translates to is the complete reorientation of the evening hours in the family’s life. The solidifying bond of a shared meal is often sacrificed, certainly, but more to the point for the Christianity of the Great Emergence, so too is the traditional time of family-based religious instruction and formation.
Scripture’s Place
When World War II broke out, the average American youngster, whether Protestant or Roman Catholic, was possessed of a reasonable familiarity with Bible stories and a formative grasp of the religious and moral points contained in them. Most of that sub-rosa information had been instilled at home in dinner conversations, family altars, Bible-story reading, and bedtime prayers. Biblical literacy and cultural literacy were totally entwined, one with the other, as were biblical and familial instruction. When the mother as principal storyteller and domestic rabbi ceased, bit by bit, to function in those roles, America’s younger generations became more and more untethered from the parables and prophecies, interpretations and principles that supported both the story itself and the consensual illusion that was based on it.[13]
The result, theologically, for both emergent Christianity and the reactive bodies of American Protestantism and American Roman Catholicism is stark. Each one of them, in dealing with Americans under fifty, is dealing in large measure with scriptural innocents whose very ignorance is pushing them in one of two directions. Either innocence of scriptural experience is propelling them to seek ever more eagerly for structured engagement with it, or else a total lack of prior exposure is propelling Scripture itself farther and farther into the attics of life where all antiques are stored for a respectful period of time before being thrown completely away. Which extreme is worse is hard to say, for naifs of every kind are vulnerable at every turn . . . easily exploited, easily crippled, easily sacrificed.
But enough of such overviews, listings, and history. It is time to turn our attention at last to the more immediate present and to our near future. It is time to answer our final question: Where is this thing going, even as it is carrying all of us along with it in its mad careen?