SIX

Faith-Based Certainty Meets the Gospel of Doubt

Two paragraphs into Civilization and Its Discontents, Freud’s 1929 treatise on man’s “unhappiness in culture,” a dilemma surfaced. Having described how a recent book of his treated religious ideas “as an illusion”—indeed, as “fulfillments of the oldest, strongest and most urgent wishes of mankind”—Freud stopped almost dead in his tracks.1 It was not that he failed to grasp the urgency of such wishes. A sense of their appeal simply was not in him.

When his friend Romain Rolland (author of several religious biographies and winner of the Nobel Prize for Literature) insisted that religion gave him a “sensation of ‘eternity,’ a feeling as of something limitless, unbounded,” Freud was frankly puzzled (CD, 11). “I cannot discover this ‘oceanic’ feeling in myself,” he wrote bluntly. “From my own experience, I could not convince myself of the primary nature of such a feeling. But this gives me no right to deny that it does in fact occur in other people.” The matter, for Freud, boiled down to how we interpret such feelings. As he noted, “It is not easy to deal scientifically” with them (12).

He was certainly right on that score. Setting that matter aside for now, it is a surprising beginning to a now-famous argument about our “unhappiness” in modern society.2 Having used religion to voice its deepest needs, humanity, Freud argued, found those needs sharply in conflict, at times irreconcilable. Serving as “cultural ideals” (CD, 38), the gods helped to solve riddles and settle doubts about the meaning and purpose of life, including the suffering it causes. But the idea that everyone must follow the same illusion—and that others should be converted to it—troubled Freud, who was quick to point out the price of unbelief for skeptics and doubters. “The impossibility of proving the truth of religious doctrines … has been felt at all times—undoubtedly, too, by the ancestors who bequeathed us this legacy. Many of them probably nourished the same doubts as ours, but the pressure imposed on them was too strong for them to have dared to utter them.” “Since then,” Freud concludes, meaning between antiquity and his own generation of Victorian doubters, “countless people have been tormented by similar doubts, and have striven to suppress them because they thought it was their duty to believe.”3

In the years immediately preceding the Holocaust and the Stalinist gulags, before Freud himself had to flee Vienna for London—aged eighty-two—to escape Nazi persecution, he continued to call religion a man-made response to spiritual and anthropological needs.4 According to Freud, we cannot divorce one from the other. The individual comfort that friends like Rolland draw from spiritual practices unites wider groups of people in a set of shared beliefs, which mark them as distinct from other groups drawn to different, often conflicting creeds. Antagonism predictably erupts over whose beliefs have greater force, cogency, or priority. As Freud noted with bitter sadness, his historical perspective greatly overdetermined by the surrounding anti-Semitic violence in Austria and Germany:

The Jewish people, scattered everywhere, have rendered most useful services to the civilizations of the countries that have been their hosts; but unfortunately all the massacres of the Jews in the Middle Ages did not suffice to make that period more peaceful and secure for their Christian fellows. When once the Apostle Paul had posited universal love between men as the foundation of his Christian community, extreme intolerance on the part of Christendom towards those who remained outside it became the inevitable consequence. To the Romans, who had not founded their communal life as a State upon love, religious intolerance was something foreign, although with them religion was a concern of the State and the State was permeated by religion. Neither was it an unaccountable chance that the dream of a Germanic world-dominion called for anti-semitism as its complement; and it is intelligible that the attempt to establish a new, communist civilization in Russia should find its psychological support in the persecution of the bourgeois. One only wonders, with concern, what the Soviets will do after they have wiped out their bourgeois. (CD, 61-62)

The passage is well known to philosophy freshmen as a textbook example of what Freud dubbed the “narcissism of minor differences” (CD, 61). The smallest discrepancies in perception and belief assume outsized importance—are enough, indeed, to create sectarian splits within all the major faiths. The outcome of those splits depends greatly on the meaning we give them and the context in which they unfold, but anyone who has studied recent history will note how quickly they can escalate, to the point of apparently justifying the eradication of the opposing group from the very face of the earth.

Freud’s passage glosses over the fate of many religious doubters, pariahs, and minorities. Still, as Christopher Hitchens underlines with awful precision in his survey of religious conflict from Belfast, Belgrade, and Beirut to Bethlehem, Baghdad, and Bombay, “just to stay within the letter ‘B,’” the fundamental truth of Freud’s observation is hard to dispute.5

At the same time, that base-level disconnect between Freud and Rolland means that the two friends were still miles apart over the fundamental question of religious feeling—the sensation of “eternity”—that one of them insisted was real and the other assured him was not.

Whom we take to be right or off-base on that matter says a lot about how we’ll react to broader cultural debates about the role that belief and faith should play in society today. To secularists, Freud’s admission, “I cannot discover this ‘oceanic’ feeling in myself,” may communicate insight, including into how belief works. To his many detractors, by contrast, including those drawn to Carl Jung’s interest in mysticism, it conveys a personal limitation, and even (for orthodox believers) a moral deficiency.

When Jung was asked during a 1959 BBC interview if he believed in the existence of God, he replied a little differently: “I don’t believe, I know.”6 The statement conveys such unshakable certainty and, by extension, so little self-doubt, that it seems to end discussion, making belief a bedrock for knowledge.

To Freud, however (and I think he was right here), there were risks, even dangers, in confusing one with the other. Indeed, his own life experience painfully confirmed it, with his daughter, Anna, detained for twelve hours of interrogation by the Gestapo and his four sisters murdered in Nazi concentration camps. His concerns about enmity and intolerance—within and over religion—proved to be tragically well founded. So, too, were his general anxieties about the historically fraught boundary between belief and hostility.

In the wealth of books to appear recently about these ongoing concerns, from The God Delusion to God Is No Delusion, from The End of Faith to The Reason for God, and from Irreligion to Answering the New Atheism, one topic recurs, even to the point of predictability: the wide chasm over how we perceive faith and doubt, including what we think both should mean and accomplish.7 The chasm is, in a way, quite similar to the mental gulf separating Freud from Rolland. It also reenacts nineteenth-century debates over reason and faith.

The sense of mystery that Michael Novak relishes, for instance, in No One Sees God (his account of the “dark night” that atheists and believers apparently share), is to Richard Dawkins a tiresome obfuscation, when it is not a dangerous “delusion.” Novak stresses that atheists and believers alike can spend “long years in the dark and windswept open spaces between unbelief and belief.”8 As Novak observes, both groups may also experience “the same ‘dark night’ in which God’s presence seems absent, and the conflict between faith and doubt stems not from objective differences but from divergent attitudes toward the unknown.”9

For Dawkins, by contrast, the aim of encircling belief with reason is to “eviscerate” religious faith, to help bring about its end.10 Small wonder that Dawkins’ approach enrages the devout: it looks to them like an all-out assault on their most cherished beliefs. And in many respects, it is. Numerous questions ensue: Which side permits criticism? Which, a tradition of questioning and doubt? Above all, whose criteria are right?

Dawkins insists that he distinguishes at bottom between the ordinarily “deluded” and the literalists whom he once called “faith-heads.” Still, one could be forgiven for missing such subtleties in his sweeping denunciations of religious belief, especially after he conflates religious moderates and extremists, and likens faith to a “celestial comfort blanket.”11 It is noticed less often that he reserves particular scorn for agnostics—oddly, even for those like Thomas Huxley, “Darwin’s bulldog,” who stop a fraction shy of the certainties that Dawkins professes.

To be sure, Dawkins lets a “robust Muscular Christian” preacher from his schooldays condemn agnostics as “namby-pamby, mushy pap, weak-tea, weedy, pallid fence-sitters” before he chimes in: “He was partly right, but for wholly the wrong reason” (GD, 46). Still, the term pap comes to characterize, for Dawkins, “a deeply inescapable kind of fence-sitting, which I shall call PAP (Permanent Agnosticism in Principle)” (47). So it is not, he writes, a complete accident that the acronym echoes his schoolmaster’s ridicule.

“Agnosticism, of a kind, is an appropriate stance on many scientific questions,” Dawkins concedes, but in his estimation it is not the right stance to adopt over minor, apparently easily settled matters such as whether God exists (GD, 47). “Agnosticism about the existence of God belongs firmly in the temporary or TAP category,” he states categorically. Case apparently closed. “Either he exists or he doesn’t. It is a scientific question” (48).

So that little conundrum is settled, after all! Concerning Huxley, Dawkins is a fraction more hesitant: “One doesn’t criticize T. H. Huxley lightly. But … in his concentration upon the absolute impossibility of proving or disproving [the existence of] God, [he] seems to have been ignoring the shading of probability. The fact that we can neither prove nor disprove the existence of something does not put existence and non-existence on an even footing” (GD, 49). True. But how one establishes consensus there depends greatly on whom one asks in the first place. “Contrary to Huxley,” Dawkins continues, “I shall suggest that the existence of God is a scientific hypothesis like any other” (50).

So that minor quandary is settled, too!

Huxley believed that science and religion belong to different realms, an idea that Stephen Jay Gould more recently dubbed a matter of “non-overlapping magisteria”: religion is aligned with emotion and feeling, and science tied to facts.12 But Dawkins replaces what he calls Huxley’s “agnostic faith”—the insistence that such fundamental questions likely will never be answered—with the proposition that they could be settled by scientific discovery, though we cannot yet say when. This is, in one respect, where scientific speculation and religious faith share some common ground, though Dawkins would never call his expectation a kind of faith. Still, Huxley’s firm open-endedness, even as it put the burden of proof on religion, troubles Dawkins. “Why there almost certainly is no God” is his preferred statement, though he cannot quite give up that “almost” (GD, 111). Yet he quotes Huxley’s famous statement in “Agnosticism” (1889): “I was quite sure I had not [solved the problem of existence], and had a pretty strong conviction that the problem was insoluble.”13

The difference comes down to the gap between Dawkins’ “almost certainly” no God and Huxley’s “pretty strong conviction” that the problem of existence is insoluble. It isn’t quite an example of the “narcissism of minor differences,” since a philosopher could still drive a truck between those statements, but it doesn’t seem to warrant the breezy dismissal that Dawkins ultimately gives it: “I am agnostic only to the extent that I am agnostic about fairies at the bottom of the garden” (GD, 51).

To invoke only John Humphrys’ fascinating book In God We Doubt: Confessions of a Failed Atheist (which he or his publisher decided to retitle for American audiences as Confessions of an Angry Agnostic, as if these things had to be ramped up in the United States before anyone would listen), there is plenty about secular materialism that leaves many people cold and unsatisfied.14 Indeed, one rather hefty question confronts those who wrestle with the limits of secularism: Are materialist explanations of the world enough to satisfy everyone, including those longing for a sacred reality?15 At some point, it needs to be asked whether secularism fails to answer a fundamental need in many for a purpose and reality that surpasses human comprehension.

When the debates become so heated, polarized, and trivialized that pressing ontological matters are likened to belief in “fairies at the bottom of the garden,” and even agnosticism is cast as weak-kneed evasion, there’s a clear whiff of hubris and hyperbole in the air. As Chris Lehmann noted in Reason Magazine of the “new atheists” and their critics, “Each side retreats to its corner, more convinced than ever that the other is trafficking in pure, self-infatuated delusion for the basest of reasons: Believers accuse skeptics and unbelievers of thoughtless hedonism and nihilism; the secular set accuses the believoisie of superstition and antiscientific senselessness.”16

Dawkins’ contempt for agnosticism is in one sense surprising, as doubt tempers extremism, religious and otherwise, while “healthy agnosticism,” Bernard Lightman reminds us, “actively questions everything, including itself.”17 For that reason, nineteenth-century agnosticism amounted to neither fence-sitting nor “mushy pap” but a stance advancing a plain and laudable admission: “I don’t know.” Indeed, “I probably won’t ever know.” It makes doubt integral to one’s position. And in doing so it assumes a level of neutrality in the faith wars that leaves people with the option to change their minds—which seems to be part of what irks absolutists on either side. As Bill Maher puts it in Religulous, his light-hearted but probing 2008 film about faith, while debating a handful of truckers in a North Carolina chapel over whether eternal salvation is possible: “Yeah, you could be right. I don’t think it’s very likely. But yes, you could be right. Because my big thing is, I don’t know. That’s what I ‘preach.’ I preach the gospel of I don’t know. I mean, that’s what I’m here promoting: doubt. It’s my product. The other guys are selling certainty. Not me [laughing]. I’m in the corner with doubt.”18

Maher’s comic perspective is an antidote to extremism, and surely one of its remedies. When certainty seems so necessary that any doubt feels like a catastrophic admission of weakness, it is too easy to group fanatics over there, to represent oneself (and one’s group) as under attack, and to miss the elements of strident certainty that mark one’s own absolute objections to the absolutism of others. At the same time, it is a convenience—and an error—to continue talking about “The Three Atheists,” as Dawkins, Hitchens, and Sam Harris are often dubbed, as if they were the only ones, and as if they did not have large areas of disagreement (just as their critics do).19 As Hitchens points out early in his book, God Is Not Great, “My own annoyance at Professor Dawkins and Daniel Dennett, for their cringe-making proposal that atheists should conceitedly nominate themselves to be called ‘brights,’ is part of a continuous argument” (GING, 5).

Those who can get past the subtitle of Hitchens’ book, How Religion Poisons Everything, may be surprised to read his admission, concerning his first religious teacher in southwest England, “If I went back to Devon, where Mrs. Watts has her unvisited tomb, I would surely find myself sitting quietly at the back of some old Celtic or Saxon church” (GING, 11). It is an appealing image, not because it puts an avowed atheist in a church, but because it so clearly shows his understanding of the history and diversity of religious practice. “If I went back” echoes Hardy’s syntax and sentiment, even as Hitchens consciously invokes another Victorian, George Eliot, whose novel Middlemarch (1871-72) tries to sum up the secular accomplishments of her idealistic protagonist, Dorothea Brooke. “If I went back” also reminds us, as Dawkins unwittingly made clear in his dealings with “Darwin’s bulldog,” of the ongoing importance of the Victorians.

One reason is that their energetic debates about faith and reason helped to set the terms and parameters of comparable ones today. Yet while the Victorians engaged intensively with all facets of religious belief and doubt, Dawkins, as Terry Eagleton justly complains, glosses this cultural and religious history with almost scandalous imprecision, as if none of it really mattered in the first place, no matter how deeply embedded its traditions remain in our lives and shared past.20 When put that way, Dawkins has comparatively less to say to us than Hardy, Froude, Huxley, Chambers, or Eliot, while they and many other Victorians have a lot more to contribute on religious faith and doubt than we have fully acknowledged.

When I recently flew to Cincinnati to visit the so-called Creation Museum, on the outskirts of the city, that denial of nineteenth-century arguments struck me in full force. At first it seemed easy to dismiss the animatronic versions of Adam and Eve gazing passionately at each other not too far from a hungry-looking dinosaur. It was just cheesy Midwest shtick, I thought, and not to be taken seriously. “Enjoy the first six days of history,” I was cheerfully told, before I munched a sandwich in Noah’s Café, the organization’s restaurant, and listened to a friendly volunteer report that they had had seventy-four thousand visitors in the first few months of opening, with three thousand stopping by on a normal day.

I sat in an auditorium where the seat in front of me sprinkled tiny jets of water in my face as a film tried to impress on me what a global deluge might look and feel like. I read posters explaining, quite seriously, how two members of every species, including dinosaurs, could conceivably be made to squeeze onto a large wooden ark without eating each other first (a handy to-scale model nearby helped make that seem a fraction more plausible). The “museum” Web site even encouraged visitors to stop by its Dinosaur Den with the tag: “Our dinosaurs cater to groups.”21


Figure 18. Adam and Eve Prepare for Passion at the Creation Museum, Petersburg, Kentucky, September 2007. Author collection, reproduced with permission.

But it was difficult to ignore the distortions that other visitors (including dozens of schoolchildren) were taking very seriously indeed. “What did dinosaurs eat?” asks one of the informational signs. Presumably not Adam and Eve, looking so intently at each other nearby. Though the fangs on the tyrannosaur looked more forbidding than the plastic snake representing Satan, all my Tennysonian thoughts of “Nature, red in tooth and claw” apparently were wrong.22 Before Adam’s fall, every dinosaur was herbivorous! The sign nearby read: “God said, ‘To every beast of the earth and to every bird of the air, and to everything that creeps upon the earth, wherein there is life, I have given every green herb for food’” (Genesis 1:30). For the museum “curators,” one imagines, the fangs on that tyrannosaur had evolved after the Fall.

Not surprisingly, too, the Creation Museum argues that dinosaurs existed quite recently and may not even be extinct. As Ken Ham puts it in “What Really Happened to the Dinosaurs?” a chapter of his New Answers Book: “According to the Bible, dinosaurs first existed around 6,000 years ago. God made the dinosaurs, along with the other land animals, on Day 6 of the Creation Week (Genesis 1:20-25, 31). Adam and Eve were also made on Day 6—so dinosaurs lived at the same time as people, not separated by eons of time.”23 Dinosaurs, continues Ham, president and CEO of Answers in Genesis (USA) and author of The Lie: Evolution, “could not have died out before people appeared because dinosaurs had not previously existed; and death, bloodshed, disease, [meat-eating], and suffering are a result of Adam’s sin (Genesis 1:29-30; Romans 5:12, 14; 1 Corinthians 15:21-22).”24

If we can get past the syllogism that no other land creatures could have preceded man because the Bible dates their origin to the same “day,” Ham’s assertion might strike us as running together two quite different issues. Indeed, one need not jettison the entire concept of sin (or collective guilt) to insist that dinosaurs still pre-existed us, and by millions of years. Nor, of course, need one defend Genesis to the letter to maintain a faith in God. The secular scholar Keith Thomson, for instance, acknowledges the “glory” of the opening verses of the Bible in ways that grasp their beauty as a meditation on origins: “And the earth was without form, and void; and darkness was upon the face of the deep.”25


Figure 19. Girl Playing Near Dinosaurs at the Creation Museum, Petersburg, Kentucky, September 2007. Author collection, reproduced with permission.

To insist that such lines be burdened with scientific accuracy is to overlook that the Bible was written centuries before the emergence of modern science, under completely different conditions. At the same time, to ignore the poetry of Genesis and to insist that materialism resolve such vast enigmas as why we evolved on this planet is to risk making “science … the sacrosanct fetish,” as the ambassador cautions in Joseph Conrad’s Edwardian novel The Secret Agent.26

One’s immediate question to the founders of the Creation Museum may well be, to paraphrase Bill Maher, “Why does insisting on the coexistence of Adam and Eve with the dinosaurs matter so much?” especially when so many other Christians take the Book of Genesis on faith, not as historical truth. What is at stake in arguing so vociferously against the teaching of evolution, moreover, when the Roman Catholic Church, encouraged by Pope Pius XII, more or less accommodated itself to that phenomenon in the 1950s?

In the decades since, the Catholic Church has in fact moved from neutrality over the issue to implicit acceptance of it through “theistic evolution,” where faith and scientific evidence about human evolution are not in apparent conflict, though humanity is still regarded as a special creation.27 Even so, it bears repeating that James Hutton struggled to make that argument in 1788, and that Sir Charles Lyell reluctantly abandoned it in 1859 when the full weight of scientific evidence made clear to him that it was no longer tenable, no matter how much he wanted to believe otherwise.

With their fierce commitment to creationism, American fundamentalism and Christian Evangelicalism have, since the 1920s, largely turned their backs on Republican freethinkers like Robert Ingersoll, who christened doubt “the womb and cradle of progress,” and pro-evolutionary Calvinists like Asa Gray, who as early as 1874 argued, “The attitude of theologians toward doctrines of evolution, from the nebular hypothesis down to ‘Darwinism,’ is no less worthy of consideration, and hardly less diverse, than that of naturalists.”28 With its own forms of unbelief comparatively weaker than those that emerged on the other side of the Atlantic, the United States also tended to reject the looser interpretations of the Bible that transformed 1860s Britain. Those, we have seen, allowed biblical accounts to be cast as metaphorical, a move anathema to literalists then and now. Denying the credibility of such readings, literalists denounced as heresy the idea that Christianity might adapt to scientific discovery. They superimposed on scripture a pristine start for Adam and Eve and ignored that the extant version of Genesis is likely only part of a much longer original story.29 In doing so, they pushed aside Christians in the 1890s who felt it necessary to concede that “the old belief that the beginning of creation preceded our time by only about six thousand years” has been “shown to be untenable.”30

In 1925, Tennessee passed a law preventing any state-funded educational establishment from teaching “any theory that denies the story of divine creation of man as taught in the Bible, and teach instead that man has descended from a lower order of animals.”31 The law effectively made it illegal to deny the biblical account of man’s creation. Challenging the constitutionality of the law, the American Civil Liberties Union approached John T. Scopes, a teacher who was willing to insist that evolutionary theory be part of the curriculum for high-school students in the state. During what would soon be dubbed the Scopes Monkey Trial, the prosecution, led by celebrity lawyer and former secretary of state William Jennings Bryan, lambasted evolutionary theory for contending that the descent of man was “not even from American monkeys, but from old world monkeys.” The courtroom response was laughter.32

Scopes was found guilty and fined. The Tennessee Supreme Court reversed the conviction on a technicality concerning the fine, but the state legislature did not repeal the 1925 Butler Act until May 17, 1967. And though the trial became a turning point in the U.S. version of the conflict over evolution and creationism, repeal of the law did not end or resolve the underlying antipathy, rooted in biblical literalism, to evolutionary theory and its scientific argument.

Nor does the Scopes trial appear to be over in some parts of the country. A recent study of religious beliefs in thirty-four industrialized nations found that more people in the United States doubted evolution (60 percent) than in any other country but Turkey (at 75 percent).33 The results of a 2007 Gallup poll were even more lopsided, finding that only 14 percent believed in evolution without God (up from 10 percent in 1997) and that the majority of Republicans doubt the theory of evolution.34 Those numbers came two decades after the U.S. Supreme Court ruled in 1987 that creationism could not be taught alongside evolutionary theory on the grounds that creationism is designed explicitly to advance religious interests.35 Nor are matters substantially better in Britain: in January 2006, BBC News reported that “just under half of Britons accept the theory of evolution as the best description for the development of life.”36

Yet the Christians who supported the Creation Museum with twenty-seven million dollars in donations feel embattled and under attack.37 One exhibit on display there is a model of an imaginary screeching teacher, dubbed “Miss E. Certainty,” whose answer to the question “What if I don’t believe?” is, “Then you’re in violation of the Constitution.” That’s not the answer most agnostics and atheists would give to such a question. Nor do most teachers confuse knowledge with belief. To insist that they be the same, after all, is the cause of so much trouble. What the organizers of the “exhibit” obviously meant, and perhaps did not feel the need to spell out, was something more like: “What if I don’t believe in secular principles?”

That is not a trivial or easily answered question, given the number of Americans who are currently asking it. But one plausible rejoinder could run, “Well, then you have a serious problem with more or less the entire history of your country, including not only its founding documents, its Constitution, and its Bill of Rights, but also the founders of those principles.”

It is often forgotten that one founding father, Thomas Jefferson, decided to remove from the Bible the many parts of it that he considered false, supernatural, and distractingly magical. He did so while in office, it is notable to recall, excising the virgin birth, the miracles, the resurrection, and indeed any suggestion that God recognized Jesus as part of the divinity. The result, completed in 1820 but published posthumously in 1895, has since become known as The Jefferson Bible. Jefferson himself called it The Life and Morals of Jesus of Nazareth, Extracted Textually from the Gospels in Greek, Latin, French, and English.38

In the United States today, however, the entanglement of religion with politics is far greater than in Jefferson’s day or even that of James Madison, the country’s fourth president, who railed against “religious bondage” and insisted on a celebrated solution to it: “Religion & Govt. will both exist in greater purity, the less they are mixed together.”39

During the 2004 presidential campaign, President George W. Bush openly told the American public: “I believe that God wants everybody to be free. That’s what I believe. And that’s part of my … foreign policy.”40 Small wonder that even mainstream U.S. News and World Report, characterizing the campaign as one in which “churchgoers and secular voters live in parallel universes,” called its report “Separate Worlds.”41 It was uncannily true.

By contrast, most British papers are fairly bullish about the prospect of the Church of England’s disestablishment as the country’s official religion. Nor is the archbishop of Canterbury especially troubled, saying that it wouldn’t be “the end of the world.”42 He comes across as sanguine about the realities facing the Church, which simply doesn’t interest large numbers of Britons (though controversy continues to flare over the ordination of women and the openly gay).43 Indeed, when I was last in England (where I was born and lived the first twenty-four years of my life), Birmingham Cathedral was being pilloried for suggesting that it might open a wine bar and introduce loyalty cards to help fill the empty pews. “Cathedral wine bars,” declared the cathedral’s first director of hospitality and welcome, “should be seen as a potential commercial operation with profits going into the upkeep of the building and paying for evangelistic work.”44

What got more press in Britain was a winter 2008 campaign to raise money for a series of advertisements that would appear on thirty London buses. These would bear the message: “There’s probably no God. Now stop worrying and enjoy your life.”45 That is quite a statement, after all, and much of its impact hangs on the interesting modifier, “probably.” The campaign was a response to the “Jesus Said” ads that had appeared on London buses earlier that summer. Those had displayed a series of Bible quotations, followed by a link to a Web site, where visitors were told that non-Christians “will be condemned to everlasting separation from God and then [will] spend all eternity in torment in hell. … Jesus spoke about this as a lake of fire prepared for the devil.”46

The counter-campaign to add agnostic posters raised more than enough money (rather quickly, as it happens, after word spread that there had been little enthusiasm for it). Sporadic complaints were heard, along with reports of a driver refusing to drive one of the buses because he opposed the message. The group Christian Voice even grumbled to the Advertising Standards Authority that the second campaign (though not the first) broke the country’s regulatory code stipulating that “marketers must hold documentary evidence to prove all claims.” Understandably or not, that drew “peals of laughter” from the British Humanist Association, which had organized the counter-response.47 Its campaign went on to raise enough money to put the posters on eight hundred buses nationwide.

It may be harder to imagine similar advertisements in the United States, even in liberal pockets of the country. Yet in fact the American Humanist Association launched a nearly identical campaign in Washington, D.C., just after the one in Britain. Buses in the capital carried the slogan, “Why believe in a God? Just be good for goodness sake.”48 And in April 2009, the New York Times reported that a group of secular humanists had put up a billboard in Charleston, South Carolina, bearing the words: “Don’t Believe in God? You Are Not Alone.” Although the group was relatively small, the reporter noted, the share of residents of the state claiming “no religion” had more than tripled, to 10 percent (from 3 percent in 1990).

Across the United States, in fact, that total is now hovering between 15 and 16 percent of the general population (up from 8 percent in 1990), and the trend toward secularism and away from religious affiliation is growing.49 So much so that a recent op-ed piece on Fox News carried the hyperbolic title, “Where Have All the Christians Gone?” “Christianity is plummeting in America,” the author Bruce Feiler warned, “while the number of non-believers is skyrocketing.” Feiler continued, “The number of Christians has declined 12% since 1990, and is now 76%, the lowest percentage in American history. The growth of non-believers has come largely from men. Twenty percent of men express no religious affiliation; 12% of women. Young people are fleeing faith. Nearly a quarter of Americans in their 20s profess no organized religion.” Today, he concluded, “the rise of disaffection is so powerful that different denominations needs [sic] to band together to find a shared language of God that can move beyond the fading divisions of the past and begin moving toward a partnership of different-but-equal traditions. Or risk becoming Europe, where religion is fast becoming an afterthought.”50

That Feiler spoke of “plummeting” rates of faith, with the number of Christians in America still at 76 percent, indicates how devout the country was before the 1990s and arguably how devout it remains. To equate secularism with a “rise of disaffection” is a characterization as distorted as Richard Dawkins’ calling atheists “brights.” But while Christians such as Feiler are feeling anxious and embattled, many non-Christians are frankly very worried about the arguments and rhetoric driving religious fanaticism in the United States.

Unlike in nineteenth-century America, today far larger numbers of the country’s citizens want to “reclaim America for Christ” and argue that there are really only two types of people in the world: those who will be saved, based on their love of Jesus, and those who will be “left behind,” to deal with the aftermath of a religious apocalypse that they, as Christians, are disturbingly keen to instigate. As Hitchens puts it, a weighty—and increasingly obstreperous—aspect of religion “looks forward to the destruction of the world. By this,” he adds, “I do not mean it ‘looks forward’ in the purely eschatological sense of anticipating the end. I mean, rather, that it openly or covertly wishes that end to occur” (GING, 56).51

Earlier in this book, we saw the Reverend John Cumming writing, in his Apocalyptic Sketches (1853), about the coming day of judgment circa 1867, when the “Christian dispensation would come to a glorious end.”52 One wonders how he felt in the final years of his life, before his death from natural causes in 1881. Still, military technology in Cumming’s day extended largely to muskets, swords, and sabers, not precision-guided missiles and nuclear weapons designed to wipe out entire populations and make their land uninhabitable for decades.

Almost cheering on that end, “30-40 percent of Americans” (according to one recent estimate)53 seem as if they cannot wait to test the “egotistical … hope that [they] will be personally spared, gathered contentedly to the bosom of the mass exterminator, and from a safe place observe the sufferings of those less fortunate” (GING, 57). They have been encouraged—even coached—in this thinking by the Left Behind series, coauthored by Christian evangelists Tim LaHaye and Jerry Jenkins, which has sold more than seventy million books worldwide since it debuted in the 1990s, the decade when End Times assumptions were truly reborn.

When Newsweek decided to feature the two men on a May 2004 cover, it added in boldface: “The elites may not recognize them, but LaHaye and Jenkins are America’s best-selling authors. At a time of uncertainty, they have struck a chord with novels that combine scriptural literalism with a sci-fi sensibility.”54

“Sci-fi sensibility” may not be too far off the mark, but the religious apocalypse that LaHaye and Jenkins fervently describe and anxiously await is far from small bore. They not only welcome but actively promote what they call “a literal interpretation of end-time prophecies.”55 Hitchens quotes a snapshot: “The blood continued to rise. Millions of birds flocked into the area and feasted on the remains … and the winepress was trampled outside the city, and blood came out of the winepress, up to the horse’s bridles, for one thousand six hundred furlongs” (GING, 57). This is “sheer manic relish,” Hitchens points out, “larded with half-quotations” (57). To LaHaye and Jenkins, however, “all prophecy should be interpreted literally whenever possible. We have been guided,” LaHaye continues, in a notably passive clause, “by the golden rule of interpretation: When the plain sense of Scripture makes common sense, seek no other sense” (TBLB, 7; emphasis in original).

As but one example of such guided reading, LaHaye and Jenkins turn the period of “tribulation,” which Jesus is said to undergo in the Gospel of Matthew, into a 490-year interval (seventy sets of seven-year periods). With a preposterous fantasy of accuracy, they even proclaim that a “divine prophetic clock” began ticking on March 5, 444 BC, “when the Persian king Artaxerxes issue[d] a decree allowing the Jews to return under Nehemiah’s leadership to rebuild the city of Jerusalem” (TBLB, 99). It is not the first time—and it likely won’t be the last—that people have had visions of messianic certainty about the beginning and end of life, even down to the day and minute. The Reverend Cumming, recall, found evidence for the End Times in “everything from the French Revolution to the Irish potato famine to the invention of the telegraph and steamship.”56

Isaiah, Jeremiah, and especially the Book of Revelation repeatedly mention the ancient city of Babylon. For LaHaye and Jenkins—and, presumably, many of their readers—that means but one thing: “The city of New Babylon [Baghdad] will be rebuilt in Iraq in the last days as a great world political and economic center for the Antichrist’s empire” (TBLB, 125). Reflexive irony, alas, is missing here. The empire in question apparently did not fabricate evidence of an imminent attack to justify its costly, protracted, and illegal occupation of that land. If there is cause for marvel, it is surely how quickly the catastrophes of American foreign policy are turned into opportunities for religious prophecy.

If anything, the 2008 American presidential election intensified such fundamentalist anxiety. As Barack Obama’s and John McCain’s campaigns slogged it out in the final months, the mainstream newsmagazine Time found itself needing to ask, quite seriously, whether one advertisement from McCain’s increasingly erratic campaign appealed directly to Christian fundamentalist voters and Left Behind readers. That was the one, supposedly tongue in cheek, that asked whether candidate Obama was really “The One”—as in, The New Messiah. It was also the ad that led Time to wonder, with good reason, whether the Republican nominee was cravenly trying to stoke apocalyptic fear among Evangelicals.

“It should be known that in 2008 the world shall be blessed,” the August McCain ad warned portentously. “They will call him ‘The One.’ And the world will receive his blessings.” The ad encouraged viewers to scorn Barack Obama’s experience and popularity—normally seen as a plus—by transforming these qualities into disturbing hints of messianism.57 If the McCain camp “wanted to be funny,” one Democratic consultant noted with amazing restraint, “if they really wanted to play up the idea that Obama thinks he’s the Second Coming, there [were] better ways to do it. Why use awkward lines like, ‘And the world will receive his blessings’?”58

Why indeed? It was ominous and disturbing, not funny at all. Time—whose article bore the boggling title “An Antichrist Obama in McCain Ad?”—was not alone in noting that the ad, and especially its biblical language, openly invoked the Left Behind series.59 Those books feature a charismatic young political leader, Nicolae Carpathia, who promises to heal the world after a time of deep division. The series finally exposes him as the Antichrist, but not before he tries to end interfaith strife with the ominous, even heretical slogan “We Are God.”

Of course, many religious traditions—some of them Christian—come very close to voicing that exact sentiment. Among Evangelicals, however, such ideas as healing the world of religious division must seem as if they come from the devil himself. Hence the dangerous high-wire act that McCain’s ad performed. When Rick Davis, McCain’s campaign manager, followed up by calling media criticism of vice presidential nominee Sarah Palin “literally an attack on Christianity itself,” he revealed the campaign’s scorched-earth policy.60 The McCain camp was playing with fire and clearly knew it.

The disturbing backstory to that moment in McCain’s campaign concerns the Evangelical literalism that underwrote it. Soon after candidate Obama began racking up a string of victories in the Democratic primary during the spring of 2008, Time noted that a Google search for “Obama and Antichrist” could generate more than seven hundred thousand hits, “including at least one blog dedicated solely to the topic.” “A more obscure search for ‘Obama’ and ‘Nicolae Carpathia,’” the columnist continued, “yield[ed] a surprising 200,000 references.”61

In the teeth of such religious certainty, it is worth noting that there were many other responses to Obama’s convincing victory, including by Maira Kalman, who published a series of beautiful paintings about American democracy and the 2008 campaign called “The Inauguration: At Last.” At the start of that series, Kalman depicted an angel almost blessing the following text: “Hallelujah. The Angels are Singing on this Glorious Day.”62 The many millions of Americans who swept Obama into power are likely to have felt similarly about his resounding, and very secular, victory.

Long before his party won large majorities in the House and Senate, then, Obama was represented, even by stalwart political rivals, as the enemy incarnate. Most of that animus has been directed at his person and his religious beliefs, which are well documented in his books, but some of it stemmed from an early dispute with the influential Evangelical James Dobson, founder of Focus on the Family.

Dobson attracted press attention when he criticized then Senator Obama for “distorting” the Bible and adopting a “fruitcake interpretation” of the Constitution. His comment stemmed from Obama’s simple, earlier insistence that it would be “impractical” to govern solely from the word of the Bible and especially the Ten Commandments. Unlawful, too, given the Constitution.63

Since Obama would not back down over this fight, he may unwittingly have fueled resentment among influential Evangelicals, particularly since he was chiding them on home turf. “Which passages of Scripture should guide our public policy?” he asked appropriately. “Should we go with Leviticus, which suggests slavery is okay and that eating shellfish is an abomination? Or we could go with Deuteronomy, which suggests stoning your child if he strays from the faith, or should we just stick to the Sermon on the Mount?”64


Figure 20. Maira Kalman, “The Inauguration: At Last,” New York Times, January 29, 2009. Courtesy Maira Kalman.

Backed into a corner, Dobson replied that Obama “should not be referencing antiquated dietary codes and passages from the Old Testament that are no longer relevant.” But moving the debate to questions of relevance and hinting at his own selective application of verses was Dobson’s immediate undoing. Leviticus is an old standby, not least because it contains a broad range of injunctions—including that eating “all that are in the waters, in the seas and in the rivers” is “an abomination” (11:9-12). The charge is thrice repeated; and Leviticus is no more merciful on those who eat hare, fowl, and pork. So Obama cautioned, to cheers, “Before we get carried away, let’s read our Bible”—as worshippers and readers, that is, who recognize, without sanctioning, the ancient allusions.65 Dobson didn’t appear to take kindly to the humiliation.

Amid all this religious certainty, what then of doubt and its ongoing importance?

One answer comes from a Pulitzer Prize-winning play, Doubt: A Parable, which John Patrick Shanley, its author, has since adapted into a major motion picture. Both the play and the film are set in 1964, at a Catholic church and school in the Bronx. And though the play has nothing to say about scientific uncertainty and the teaching of evolution, the opening scene is a sermon that Father Flynn feels moved to give on the subject of religious doubt amid a growing sense of confusion and collective self-doubt in the United States as a whole.

Only a few months have passed since the Kennedy assassination, and the most common feeling is still “profound disorientation. Despair. ‘What now? Which way? What do I say to my kids? What do I tell myself?’ [The catastrophe] was a time of people sitting together, bound together by a common feeling of hopelessness. Your bond with your fellow beings was your despair. It was a public experience, shared by everyone in our society. It was awful, but we were in it together!”66 From that togetherness over uncertainty, Flynn tells his congregation, in a statement that for some still borders on heresy, “doubt can be a bond as powerful and sustaining as certainty” (D, 6).

That Shanley casts his theater audience as the assembled congregation adds something to the atmosphere and tension of the play that the film cannot quite recapture. Willingly or not, the audience is rendered a silent witness to the extraordinary drama that unfolds before them.

Anyone who has seen the play or the film will know that Doubt is extremely hard to sum up. One’s sense of what takes place, and why, is almost impossible to disentangle from the perspectives and judgments that an audience inevitably brings to the drama. Should we see the play—and its cast—through the lens of certainty or of doubt? There is no single way to read the play, of course, and no one telling us which is the right answer or interpretation. Still, how we read it is rather like a Rorschach test of our opinions and convictions, religious and otherwise.

The central conundrum is whether something untoward, perhaps sexual, has occurred between Father Flynn and Donald Muller, the school’s “first Negro student,” whom Flynn sees as the object of minor bullying (D, 21). Having joined the predominantly Irish and Italian school two months earlier, at a time when many U.S. schools were still segregated, Muller “has no friends” and is “thirteenth in class.” He does, however, have “a protector” in Father Flynn, who is described, rather open-endedly, as having “taken an interest” in him (19, 20).

With the Catholic Church making headlines in the late 1990s and early 2000s for lawsuits brought by hundreds of abused pupils against predatory priests, few audience members could fail to imagine the moral dilemma that Sister Aloysius experiences, even briefly, over whether to investigate further. At the same time, as in a Hawthorne novel or The Crucible, Arthur Miller’s remarkable adaptation of the 1692 Salem witch trials, the “certainty” (her word) with which she pronounces Father Flynn guilty on highly equivocal “evidence” (mostly hearsay) is tinged—even greatly overdetermined—by her own interest in seeing him punished for misdemeanors that may be more in her head than outside it (D, 54).

True, Father Flynn resigns abruptly, believing that she has dredged up some professional disgrace from his past. It is never clear whether that infraction was sexual harassment, something far less troubling, or even whether it was trumped up earlier, too. At the same time, Sister Aloysius is deeply disturbed by Flynn’s efforts to modernize the church, in part by being more friendly to the pupils and congregation. He gives the boys a talk, for instance, on “how to be a man,” which irks her, not least because it includes advice on how to ask girls out on dates (D, 18). Far from being treacherous or predatory, it all looks very innocent, even admirable. And of course it may be.

Given all the uncertainties—religious and otherwise—that swirl in Shanley’s play, the audience and the reader have to cope with a profound, almost unsettling ambiguity over what they believe happened, and why. As Shanley asks in the play’s preface, “Have you ever held a position in an argument past the point of comfort? Have you ever defended a way of life you were on the verge of exhausting? … I have. That’s an interesting moment” (D, vii–viii).

For Shanley, doubt and uncertainty bring to the fore “something silent under every person and under every play.” They also manifest, however awkwardly, “something unsaid under any given society” (vii). It is doubt, he says, “(so often experienced initially as weakness) that changes things.” Yet doubt oddly “requires more courage than conviction does, and more energy; because conviction is a resting place and doubt is infinite—it is a passionate exercise” (viii, ix). That may explain why Sister Aloysius breaks down moments before the final curtain, apparently victorious but actually “bent with emotion.” One critic called her character “a triumph of hard-won conviction over human indecisiveness,” and indeed her exclamations (the play’s final words) are meant to seem like a catharsis of self-recognition, though one coming far too late to reassure her or the audience: “I have doubts! I have such doubts!” (D, 58).67

One of the many lessons of Shanley’s play is how little “room or value [is] placed on doubt,” a quality, Shanley asserts, that is still “one of the hallmarks of the wise.”68 Another lesson is how firmly, even tenaciously, we hold onto our beliefs. Certainly we do not give them up lightly or willingly! An ability to withstand opposition to, even derision of, one’s beliefs and faith is how large numbers of worshippers self-define. That should be a fairly important reminder to those seeking to temper belief with reason, the better to “eviscerate” faith. The sense of assault on belief is precisely what gives it tenacity in the teeth of all opposition. It sends belief into a realm of justification from which it is hard to withdraw.

Beliefs harness intense psychological and political power, and the world still wrestles with the near-insuperable task of trying to honor everyone’s religious tenets even as they predictably collide with those of others. Small wonder, then, that the experience of losing such beliefs (or tempering them with doubt) can be dramatic, often culminating in a profound reorientation to the world as it is. Similarly life-altering, I hope to have shown, is the sense of uncertainty, creativity, and peculiar freedom that can ensue when one set of explanations gives way, leaving in its wake concerns and dilemmas that faith once seemed to answer. I quoted Leslie Stephen’s “Agnostic’s Apology” in the previous chapter, including his insistence that the central “question is not which system excludes the doubt, but how it expresses the doubt.”69 But it was his daughter Virginia Woolf who captured the full intensity of that insight:

The mind is full of monstrous, hybrid, unmanageable emotions. That the age of the earth is 3,000,000,000 years; that human life lasts but a second; that the capacity of the human mind is nevertheless boundless; that life is infinitely beautiful yet repulsive; that one’s fellow creatures are adorable but disgusting; that science and religion have between them destroyed belief; that all bonds of union seem broken, yet some control must exist—it is in this atmosphere of doubt and conflict that writers have now to create.70

If “all bonds of union seem[ed] broken” for Woolf in the late 1920s, what of the “powerful and sustaining … bond” of doubt that Father Flynn invokes at the start of Shanley’s play almost a century later? The kind of “public experience” of despair following the Kennedy assassination or 9/11, more recently, is not of course the same as the religious crises we have followed in this book. Nor does the doubt that Sister Aloysius states at the end of the play mean that she’s capable of forming a bond with the man whose life she not only has derailed but also might easily have destroyed—and perhaps for no reason at all.

The desire and willingness to voice such doubt is, however, a start. As Shanley claims, “Doubt is nothing less than an opportunity to reenter the Present,” to rethink how it came to exist in this form, and how one assesses the historical difference between our now and our then (D, viii). It “stimulates the evaluation of beliefs,” Robert Baird emphasizes, including ones that may “be misplaced,” and thus is a catalyst for change and moderation.71 Much of the ability to voice that doubt is due to freethinkers such as William Nicholson dramatizing “The Doubts of Infidels”; to Robert Chambers inquiring into “the mode in which the Divine Author proceeded in the organic creation”; to Thomas Huxley standing up to ridicule from the bishop of Oxford; and to James Anthony Froude, among others, wondering, “Why is it thought so very wicked to be an unbeliever?”72

Through these thinkers and the “passionate exercise” of their doubt, we have an “opportunity to reenter the Present” by questioning what beliefs mean to us and what role we are prepared to assign them (D, ix). The rise of religious extremism in many parts of the world makes such questioning more urgent than ever. The process of working through doubt remains one of the best ways to go on thinking, reflecting on choices, and wondering at uncertain outcomes.