Chapter Eight

WONDERING

Incest at “a Genetically Discreet Remove”

Invited by the University of Miami to address members of the class of 2005, the columnist repaid this courtesy by telling them that even though they surely had showered before donning their caps and gowns, each of them had about a trillion bacteria feeding on the 10 billion flakes of skin each of us sheds in a day. If each 2005 graduate were disassembled into his or her constituent atoms, each graduation gown would contain nothing but atomic dust. But as currently assembled, this stardust—really: we are all residues of the big bang—is living stuff, capable of sublime emotions like love, patriotism, and delight in defeating Florida State.

The body of every Miami graduate has about 10 thousand trillion cells, each containing a strand of DNA that, uncoiled, would extend about six feet. If that person’s DNA were spliced into a single strand, it would extend 20 million kilometers—enough to stretch from Miami to Los Angeles and back 2,270 times.

So says Bill Bryson, author of the delightful A Short History of Nearly Everything. According to him, everyone now alive contains some Shakespeare. That is, some of the physical stuff he was made of. And Julius Caesar’s stuff, and Genghis Khan’s and Charlemagne’s. And Charlemagne’s cook’s. There are trillions of trillions of atoms in each of us, so lots—probably billions—of atoms have been recycled in each of us from Beethoven. In that sense we all are, as Bryson says, reincarnations.

Indeed, each member of Miami’s class of 2005 is related to every other member and to—facts must be faced—every graduate of Florida State. It took 2 parents to produce each of us, and 4 people to produce our parents. If we look back eight generations, to Lincoln’s day, Bryson says that more than 250 people contributed to the creation of each of us. Look back to Shakespeare’s day, and we are directly descended from 16,384 ancestors. Look back 64 generations, to the era of the Roman Empire, and we have a thousand trillion ancestors.

But wait. A thousand trillion people is thousands of times more than the number of human beings who have ever lived. So everyone is the product of a lot of incest—but incest at what Bryson calls “a genetically discreet remove.” This extended single family—humanity—inhabits the little planet Earth, whose continents are wandering.
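
The arithmetic behind that claim is easy to check. A brief sketch (in Python, purely illustrative; the figure of roughly 100 billion humans ever born is a commonly cited demographic estimate, not Bryson's) shows how quickly the doubling of ancestors outruns the species:

```python
# Each generation back doubles the number of ancestral "slots":
# 2 parents, 4 grandparents, 8 great-grandparents, and so on.
EVER_BORN = 100_000_000_000  # commonly cited estimate of all humans ever born

for generations in (8, 14, 30, 40):
    print(f"{generations} generations back: {2 ** generations:,} ancestral slots")

# Find the first generation at which the slots exceed everyone who ever lived.
g = 1
while 2 ** g <= EVER_BORN:
    g += 1
print(f"The doubling passes {EVER_BORN:,} at about generation {g}.")
# The only way the ledger can balance is if the same ancestors occupy many
# branches of the family tree: Bryson's incest "at a genetically discreet remove."
```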

Bryson says Europe and North America are moving away from each other at about the speed that a fingernail grows—about two yards in a normal human lifetime. The African continent is creeping northward and someday will squeeze the Mediterranean Sea out of existence and will shove up a chain of mountains as high as the Himalayas extending from Paris to Calcutta.

The Earth is restless partly because its molten core retains heat amazingly well: It has lost only about two hundred degrees in the 4 billion years since the planet coalesced. Not that we have come close to that core: Bryson says that if the planet were an apple, our underground exploration would not yet have broken the skin.

The sun around which Earth orbits is one of perhaps 400 billion stars in the Milky Way, which is a piddling galaxy next door to nothing much. There are perhaps 140 billion galaxies in the still-unfolding universe. If all the stars in the universe were only the size of the head of a pin, they still would fill Miami’s Orange Bowl to overflowing more than 3 billion times.

We should by now be used to strange thoughts. It has been one hundred years since June 1905, when Albert Einstein began publishing the scientific papers that taught us that gravity bends light, that space and time are warped, that matter and energy are interchangeable, that the mass of an object increases the faster it moves, and that the experience of time is a function of speed.

But there is a not-at-all-strange reason that a Washington columnist would belabor Miami graduates with strange facts. It is this: The more they appreciate the complexity and improbability of everyday things—including themselves—the more they can understand the role that accidents, contingencies, and luck have played in bringing the human story to its current chapter. And the more they understand the vast and mysterious indeterminacy of things, the more suited they will be to participate in writing the next chapter.

This is so because the greatest threat to civility—and ultimately to civilization—is an excess of certitude. The world is much menaced just now by people who think that the world and their duties in it are clear and simple. They are certain that they know what, or who, created the universe and what this creator wants them to do to make our little speck in the universe perfect, even if extreme measures—even violence—are required.

America is currently awash in an unpleasant surplus of clanging, clashing certitudes. That is why there is a rhetorical bitterness absurdly disproportionate to our real differences. It has been well said that the spirit of liberty is the spirit of not being too sure that you are right. One way to immunize ourselves against misplaced certitude is to contemplate—even to savor—the unfathomable strangeness of everything, including ourselves.

[MAY 23, 2005]

An Intellectual Hijacking

Not since the medieval church baptized, as it were, Aristotle as some sort of early—very early—church father has there been an intellectual hijacking as audacious as the attempt to present America’s principal founders as devout Christians. Such an attempt is now in high gear among people who argue that the founders were kindred spirits with today’s evangelicals, and that they founded a “Christian nation.”

This irritates Brooke Allen, an author and critic who has distilled her annoyance into Moral Minority: Our Skeptical Founding Fathers. It is a wonderfully high-spirited and informative polemic that, as polemics often do, occasionally goes too far. Her thesis is that the six most important Founders—Franklin, Washington, Adams, Jefferson, Madison, and Hamilton—subscribed, in different ways, to the watery and undemanding Enlightenment faith called deism. That doctrine appealed to rationalists by being explanatory but not inciting: it made the universe intelligible without arousing dangerous zeal.

Eighteenth-century deists believed there was a God but, tellingly, they frequently preferred synonyms for him—“Almighty Being” or “Divine Author” (Washington) or “a Superior Agent” (Jefferson). Having set the universe in motion like a clockmaker, Providence might reward and punish, perhaps in the hereafter, but does not intervene promiscuously, or perhaps at all, in human affairs. (Washington did see “the hand of Providence” in the result of the Revolutionary War.) Deists rejected the Incarnation, hence the divinity of Jesus. “Christian deist” is an oxymoron.

Allen’s challenge is to square the six founders’ often pious public words and behavior with her conviction that their real beliefs placed all six far from Christianity. Her conviction is well documented, exuberantly argued, and quite persuasive.

When Franklin was given some books written to refute deism, the deists’ arguments “appeared to me much stronger than the refutations; in short, I soon became a thorough deist.” Revelation “had indeed no weight with me.” He believed in a creator and the immortality of the soul, but considered these “the essentials of every religion.”

What Allen calls Washington’s “famous gift of silence” was particularly employed regarding religion. But his behavior spoke. He would not kneel to pray, and when his pastor rebuked him for setting a bad example by leaving services before communion, Washington mended his ways in his austere manner: he stayed away from church on communion Sundays. He acknowledged Christianity’s “benign influence” on society, but no ministers were present and no prayers were uttered as he died a Stoic’s death.

Adams declared that “phylosophy looks with an impartial Eye on all terrestrial religions,” and told a correspondent that if they had been on Mount Sinai with Moses and had been told the doctrine of the Trinity, “We might not have had courage to deny it, but We could not have believed it.” It is true that the longer he lived, the shorter grew his creed, and in the end his creed was Unitarianism.

Jefferson, writing as a laconic utilitarian, urged his nephew to inquire into the truthfulness of Christianity without fear of consequences: “If it ends in a belief that there is no god, you will find incitements to virtue in the comforts and pleasantness you feel in its exercise, and the love of others which it will procure you.”

Madison, always commonsensical, briskly explained—essentially, explained away—religion as an innate appetite: “The mind prefers at once the idea of a self-existing cause to that of an infinite series of cause & effect.” When Congress hired a chaplain, he said “it was not with my approbation.”

In 1781, the Articles of Confederation acknowledged “the Great Governor of the World,” but six years later the Constitution made no mention of God. When Hamilton was asked why, he jauntily said, “We forgot.” Ten years after the Constitutional Convention, the Senate unanimously ratified a treaty with Islamic Tripoli that declared the United States government “is not in any sense founded on the Christian religion.”

Allen neglects one argument for her thesis that the United States is a “secular project”: The Constitution mandates the establishment of a political truth by guaranteeing each state the same form of government (“republican”). It does so because the Founders thought the most important political truths are knowable. But because they thought religious truths are unknowable, they proscribed the establishment of religion.

Allen succumbs to what her six heroes rightly feared—zeal—in her prosecution of today’s religious zealots. In a grating anachronism unworthy of her serious argument, she calls the founders “the very prototypes, in fact, of the East Coast intellectuals we are always being warned against by today’s religious right.” (Madison, an NPR listener? Maybe not.) When she says, “Richard Nixon and George W. Bush, among other recent American statesmen,” have subscribed to the “philosophy” that there should be legal impediments to an atheist becoming president, she is simply daft. And when she says that Bible study sessions in the White House and Justice Department today are “a form of potential religious harassment that should be considered as unacceptable as the sexual variety,” she is exhibiting the sort of hostility to the free exercise of religion that has energized religious voters, to her sorrow.

Two days after Jefferson wrote his letter endorsing a “wall of separation” between church and state, he attended, as he occasionally did, religious services in the House of Representatives. Jefferson was an observant yet unbelieving Anglican/Episcopalian throughout his public life. This was a statesmanlike accommodation of the public’s strong preference, which then as now was for religion to have ample space in the public square.

Christianity, particularly its post-Reformation ferments, fostered attitudes and aptitudes associated with popular government. Protestantism’s emphasis on the individual’s direct, unmediated relationship with God, and the primacy of individual conscience and choice, subverted conventions of hierarchical societies in which deference was expected from the many toward the few. But beyond that, America’s founding owes much more to John Locke than to Jesus.

The founders created a distinctly modern regime, one respectful of preexisting rights—natural rights, not creations of the regime. And in 1786, the year before the Constitutional Convention constructed the regime, Jefferson, in the preamble to the Virginia Statute for Religious Freedom, proclaimed that “our civil rights have no dependence on our religious opinions, any more than our opinions in physics or geometry.”

Since the founding, America’s religious enthusiasms have waxed and waned, confounding Jefferson’s prediction, made in 1822, four years before his death, that “there is not a young man now living in the United States who will not die an Unitarian.” In 1908, William Jennings Bryan, the Democrats’ presidential nominee, said his Republican opponent, William Howard Taft, was unfit because, being a Unitarian, he did not believe in the Virgin Birth. The electorate yawned and chose Taft.

A century on, when the most reliable predictor of a voter’s behavior is whether he or she regularly attends church services, it is highly unlikely that Republicans would nominate a Unitarian. In 1967, when Governor George Romney of Michigan evinced interest in the Republican presidential nomination, his Mormonism was of little interest and hence was no impediment. Four decades later, the same may not be true if his son Mitt, also a Mormon, seeks the Republican nomination in 2008.

In 1953, the year before “under God” was added to the Pledge of Allegiance, President Dwight D. Eisenhower declared July 4 a day of “penance and prayer.” That day he fished in the morning, golfed in the afternoon, and played bridge in the evening. Allen and others who fret about a possibly theocratic future can take comfort from the fact that America’s public piety is more frequently avowed than constraining.

[OCTOBER 22, 2006]

From Dayton, Tennessee, to Rhode Island’s Committee on Fish and Game

John Scopes attended high school in Salem, Illinois, where his commencement speaker was the town’s most famous native son, William Jennings Bryan. Their paths would cross again.

Eighty years ago, Scopes, twenty-four, a high school football coach and general-science teacher, attended a meeting in Robinson’s drugstore in Dayton, Tennessee. There, to the satisfaction of community leaders who thought that what was to come would be good for business, Scopes agreed to become the defendant in a trial testing Tennessee’s law against teaching “any theory that denies the story of the divine creation of man as taught in the Bible, and to teach instead that man has descended from a lower order of animals.”

So began “the most widely publicized misdemeanor case in American history.” That is Edward J. Larson’s description of the “monkey trial” in his 1997 Pulitzer Prize–winning Summer for the Gods: The Scopes Trial and America’s Continuing Debate over Science and Religion. With that debate again at a rolling boil, that book by Larson, professor of history and law at the University of Georgia, demonstrates that the trial pitted a modernism with unpleasant dimensions against a religious fundamentalism that believed, not without reason, that it was faithful to progressive values.

By 1925, many Christian geologists were comfortable with the fact that Earth has a long geologic history. They saw God immanent in the dynamic of appearance and disappearance of life forms. What most distressed some Christians was not the fact of evolution but the postulated mechanism—a nature-red-in-tooth-and-claw randomness that erased God’s purposefulness and benevolence.

Since the publication of Charles Darwin’s Origin of Species in 1859, religiously motivated critics of the theory of evolution by natural selection had stressed the supposed failure of paleontology to supply the “missing link” that would establish continuity in the descent of man.

Darwinism did not ignite a culture war until the 1920s, when high school education became common in the rural South, where Christian fundamentalism was strong. When school seemed to threaten children’s souls, fundamentalists sought and found a champion in Bryan, a three-time Democratic presidential nominee and star of the prosecution team in Scopes’s trial.

Scopes’s defense, led by Clarence Darrow, stressed individual rights—academic freedom. The prosecution stressed the community’s right to control the curriculum of public schools. As a young man, Bryan had been a force for progressivism understood as, Larson says, a “sunny faith in the curative power of majoritarian reforms,” such as popular election of U.S. senators. So the vocabulary of progressivism served Bryan’s argument that the issue was not what should be taught, but who should decide.

He, like many antievolutionists, believed that the idea of natural selection fueled merciless social Darwinism in domestic policies and militarism and imperialism among nations, justifying the survival of the fittest nations or races, and their dominion over lesser breeds. Modernists considered World War I a progressive crusade. Bryan resigned in protest as President Wilson’s secretary of state.

Many scientists at the time were, Larson says, receptive to the idea that we could channel human evolution through selective breeding. Some believed that acquired human characteristics could be inherited, hence improvement of the human race could be engineered. And many evolutionary biologists embraced eugenics. By 1935, thirty-five states had laws compelling the sexual segregation and sterilization of people considered unfit—the mentally ill and retarded, habitual criminals, and epileptics.

Today’s proponents of “intelligent design” theory are doing nothing novel when they say the complexity of nature is more plausibly explained by postulating a designing mind—aka God—than by natural adaptation and selection. By 1925, Larson’s book notes, “Christian apologists had long regarded the intricate design of the eye as a ‘cure for atheism.’”

The problem with intelligent-design theory is not that it is false but that it is not falsifiable: Not being susceptible to contradicting evidence, it is not a testable hypothesis. Hence it is not a scientific but a creedal tenet—a matter of faith, unsuited to a public school’s science curriculum.

The Dayton jurors were eager to get on with their lives—“The peach crop will soon be coming in,” one said—and did not even sit down before deciding that Scopes was guilty. But Bryan did not believe penalties should be attached to antievolution laws—“We are not dealing with a criminal class”—and offered to pay Scopes’s $100 fine.

Bryan died five days after the trial. Scopes left to study geology—how fitting—at the University of Chicago and became a petroleum engineer. The argument about science, religion, the rights of communities’ majorities, and academic freedom rolled on, but not everywhere. When an antievolution bill was introduced in the Rhode Island Legislature, it was referred to the Committee on Fish and Game.

[JULY 4, 2005]

Earth: Not Altogether Intelligently Designed

Earth, that living, seething, often inhospitable and not altogether intelligently designed thing, has again shrugged, and tens of thousands of Pakistanis are dead. That earthquake struck ten months after the undersea quake that caused the December 2004 tsunami that killed 285,000 in Asia. Americans reeling from Katrina, and warned of scores of millions of potential deaths from avian flu, have a vague feeling—never mind the disturbing rest of the news—of pervasive menace from things out of control. Too vague, according to Simon Winchester.

His timely new book, A Crack in the Edge of the World: America and the Great California Earthquake of 1906, teaches—reminds, really—that we should have quite precise worries about the incurably unstable ground on which scores of millions of Americans live. This almost certainly will result in a huge calamity, probably in the lifetime of most people now living.

Before the study of plate tectonics revolutionized geology just forty years ago, that science, Winchester writes, was concerned with “rocks, fossils, faults and minerals that were scattered around simply and solely on the surface of the earth.” But the surface consists of between—depending on how they are defined—six and thirty-six floating plates, which Winchester calls “rafts of solid rock.” The plates’ slow movements are powered by earth’s molten innards, the boiling and bubbling radioactive residue of the planet’s formation 4.5 billion years ago.

The plates grind against—and slide up on, or plunge below—one another. But not smoothly, which is the lethal problem. When friction freezes them for a while, stupendous energy builds up until, suddenly, plates unlock and the energy is released, sometimes in ways that seem to involve related spasms around the world.

On the last day of January 1906, that seismically dangerous year, an earthquake in Ecuador and Colombia of perhaps 8.8 magnitude on the Richter scale killed about 2,000. Sixteen days later there was a large Caribbean quake, followed five days later by one in the Caucasus, and on March 17 by one that killed 1,228 on the island of Formosa. On April 6, a ten-day eruption of the volcano Vesuvius began with rocks blown forty thousand feet into the air over Naples. Two days after Vesuvius subsided, San Francisco was knocked down, and twenty-six hundred acres of it were then devoured by three days of fires. About 3,000 San Franciscans died then, four months before a Chilean quake killed 20,000.

San Francisco’s quake was smaller than the series of shocks around New Madrid, Missouri, over a few winter weeks in 1811–1812. They were strong enough to ring the bells in a Charleston, South Carolina, church that was later destroyed in that city’s 1886 quake. Scores of millions of Americans now live on the unstable faults that shook mid-America in 1811–1812.

For San Francisco, the bad news is that the quake that killed sixty-three in 1989 (6.9 magnitude, compared to 8.3 in 1906) was caused not by the San Andreas fault, but by a neighboring one. So the big menace, the San Andreas, has not recently lurched, as it surely will because it is moving, sporadically, in grinding concert with the Pacific Plate. Since 1906 there have been only five major earthquakes along the 750 miles of the San Andreas, and none in Northern California. The U.S. Geological Survey estimates a 62 percent probability of a quake in that area of at least 6.7 magnitude before 2032. Pondering the prosperous town of Portola Valley, south of San Francisco, exactly astride two of the most active strands of the San Andreas, Winchester, like many geologists who have warned the town, is fascinated by “humankind’s insistent folly in living in places where they shouldn’t.”

After Earth’s heavings subside, they reverberate in people’s minds. Winchester says that when the 1755 Lisbon earthquake killed sixty thousand, “priests roved around the ruins, selecting at random those they believed guilty of heresy and thus to blame for annoying the Divine, who in turn had ordered up the disaster. The priests had them hanged on the spot.”

The 1883 eruption of Krakatoa in what is now Indonesia fueled the growth of an extremist strain of Islam, bent on purging society of impurities displeasing to God. That strain has twice recently been heard from in Bali terrorist attacks.

San Francisco’s 1906 disaster prompted the explosive growth of a Pentecostal movement based in Los Angeles, a movement then embryonic but now mighty. Yet when A. P. Hotaling’s whiskey warehouse survived San Francisco’s postquake inferno, a wit wondered:


If, as some say, God spanked the town

For being over frisky,

Why did He burn the churches down

And save Hotaling’s whiskey?


Good question.

[OCTOBER 11, 2005]

Intelligent Design and Unintelligent Movies

This summer’s movie stars are not the usual bipeds, but other animals—emperor penguins and grizzly bears. Their performances are pertinent to some ongoing arguments.

March of the Penguins raises this question: If an Intelligent Designer designed nature, why did it decide to make breeding so tedious for those penguins? The movie documents the seventy-mile march of thousands of Antarctic penguins from the sea to an icy breeding place barren of nutrition. These perhaps intelligently but certainly oddly designed birds march because they cannot fly. They cannot even march well, being most at home in the sea.

In temperatures of eighty below and lashed by one-hundred-mile-per-hour winds, the females take months to produce an egg while the males trek back to the sea to fatten up. Returning, the males are entrusted with keeping the eggs warm during foodless months while the females march back to the sea to fill their stomachs with nutriments they will share with the hatched chicks.

The penguins’ hardiness is remarkable, as is the intricate choreography of the march, the breeding, and the nurturing. But the movie, vigorously anthropomorphizing the birds, invites us to find all this inexplicably amazing, even heroic. But the penguins are made for that behavior in that place. What made them? Adaptive evolution. They have been “designed” for all that rigor—meaning they have been shaped by adapting to many millennia of nature’s harshness.

Speaking of harshness, Timothy Treadwell, college dropout, drug abuser, and failed actor, became a Southern California beach bum, had a heroin overdose, and then an epiphany: He must spend summers in Alaska “protecting” the grizzlies. The idea that these huge, robust carnivores need protection provided by this mentally wobbly narcissist—a developmentally arrested adolescent in his forties—would be funny, had not Alaska officials “hauled four garbage bags of people out of the bear” that devoured Treadwell and his girlfriend at the end of his fifth summer filming grizzly bears to which he gave cute names like Mr. Chocolate and Sgt. Brown.

About half of Grizzly Man, a documentary about Treadwell, is his film. The rest consists of interviews with, among others, a dry-eyed Alaskan who says “he got what he was asking for.” Although Treadwell has been described as an “animal lover,” the grandiosity of his self-praise as he preens and waxes metaphysical in front of his camera reveals that his great love was himself. His cooing of “I love you” at magnificently indifferent bears and his swooning over the warmth of bear feces (“This was just inside of her!”) are as repulsive as his weeping over evidence that nature really is red in tooth and claw.

Evidence such as bear cubs killed by mature male bears so the mother will stop lactating and be sexually available. Call that the Summer of Love, Alaska-style.

Treadwell was not far from mental illness, or from a social stance—nature is sweet, civilization is nasty—not easily distinguished from mental illness. Call it Sixties Envy. So, see Grizzly Man, then read T. C. Boyle’s 2003 novel Drop City.

It is about a bunch of Treadwells who, having dropped out and dropped acid, are addled but able to see that their California commune, based on “voluntary primitivism,” has become overrun with inane philosophy and the communards’ sewage. Also, the county sheriff is angry. So a few of them decide to found Drop City North in Alaska. As one of these pioneers explains, in Alaska there are “no rules,” but there are food stamps.

There they plan “to live the vegetarian ideal,” but where will the cheese come from, now that a wolverine has eaten the communal goats? When an Alaskan explains that “we eat bear and anything else we can get our hands on,” a nature worshipper is horrified:

“‘But to kill another creature, another living soul, a soul progressing through all the karmic stages to nirvana’—she paused to slap a mosquito on the back of her wrist with a neat slash of her hand—‘that’s something I just couldn’t do.’

“‘You just did.’

“‘What? Oh, that. All right…I shouldn’t have…but a bug is one thing…and like a bear is something else. They’re almost human, aren’t they?’”

The movies and novel prompt a thought: Reality’s swirling complexity is sometimes lovely, sometimes brutal; its laws propel the comings and goings of life forms in processes as impersonal as Antarctica is to the penguins or the bears were to Treadwell or Alaska was to Drop City North. It is so grand that nothing is gained by dragging an Intelligent Designer into the picture for praise. Or blame.

[AUGUST 28, 2005]

The Pope, the Neurosurgeon, and the Ghost in the Machine

A concatenation of three events last week—two protracted deaths and one literary birth—was, as a stimulus to reflection, remarkable. Or, some will say, providential.

In a utilitarian, if humane, place, a hospice in Florida, a woman tangled in some toils of modernity—medical technology, and the machinery of litigation and legislation—died because, after fifteen years in a vegetative state, and supposedly out of respect for her autonomy, nutrition and hydration were withdrawn from her. In any other age—even a generation or two ago—she would not have become an appendage of devices that can sustain the body, or most of it, while a part of it, the brain, has stopped performing the functions essential to personhood, as normally understood.

There was another death, in a place of purposeful splendor, the Vatican, which was built as a defiant assertion of confident authority in response to the tempest of the Reformation. The foremost contemporary steward of an ancient faith, Pope John Paul II did more than anyone in our lifetimes to make vivid the task of defining respect for life in the context of modern science.

It used to be said matter-of-factly that a person who died “gave up the ghost.” But are we still confident there is, in the language of a modern philosopher, a “ghost in the machine”?

Last week a gifted novelist published a new work that is, among other things, a materialist’s manifesto. Ian McEwan lives in London, where Saturday is set. His protagonist, Henry Perowne, has the sensibility of today’s post-Christian Europe. Perowne believes we are, in a sense, machines—matter and nothing more. He thinks as those people do who say, “I do not have a body, I am a body.” Perowne is a neurosurgeon.

With sharpened steel a neurosurgeon slices and splices and pares physical matter to palliate injuries to minds—to consciousness. Pharmacology also can do that. McEwan writes:

“A man who attempts to ease the miseries of failing minds by repairing brains is bound to respect the material world, its limits, and what it can sustain—consciousness, no less. It isn’t an article of faith with him, he knows it for a quotidian fact, the mind is what the brain, mere matter, performs.”

Perowne, the voice of scientifically sophisticated secularism, and presumably of McEwan, almost lyrically, and rightly, exhorts us to appreciate the “wonder of the real.” One can, however, imagine a faint, droll smile flickering on the strong, intelligent face of John Paul II were he to have read those almost casually appended three words—“consciousness, no less.” He might think to himself: The materialist must not tarry, he must hurry on, because as Emerson said, when skating on thin ice, safety lies in speed.

This pope might have read Emerson, and it is easy to imagine him, before frailty conquered his body, keeping abreast of contemporary literature, including McEwan. Before he was John Paul II he was Karol Wojtyla, a skiing poet, playwright, and philosopher. And a defining theme of his papacy was the compatibility of faith and science, the explainer of reality. The explainer, but only up to a point, so far.

Perowne reads an arresting sentence from Darwin—“There is a grandeur in this view of life”—and says, yes, indeed:

“Endless and beautiful forms of life, such as you see in a common hedgerow, including exalted beings like ourselves, arose from physical laws, from war of nature, famine and death. This is the grandeur. And a bracing kind of consolation in the brief privilege of consciousness.”

But a bemused John Paul II, no stranger to materialism, dialectical and otherwise, might have responded: There you go again—that word consciousness. What is the grandeur in the spectacle, however interesting, of the blind, brute, violent necessity of physical laws at work? Is consciousness of an existence supposedly governed by such laws really much of a privilege?

It is, John Paul II might have responded to Perowne, one thing to say that a cosmic sneeze, aka the big bang (never mind what, or who, or Who, produced that) set all this physical-law-governed necessity in motion, and this process resulted in matter, including the cooling, unstable and sometimes lethally violent lump of it called Earth. But how much progress has science really made in explaining how some matter came to be conscious of itself?

Such arguments are not just hardy perennials in philosophy, they are part of today’s political arguments. Arguments about school curricula (evolution, intelligent design). And about conundrums that modern science confronts us with concerning the end of life and the waning of consciousness. And about aborting what some people call “fetal material” as it grows toward consciousness.

Alas, death has removed from the unending and probably unendable debates about respect for consciousness and life the intellectual pope who, one imagines, appreciated the profundity, and perhaps the finality, of the philosophic jest: “Of course I believe in free will—I can’t help it.”

[APRIL 11, 2005]

How Biology Buttresses Morality, Which Conforms to…Biology

Science is reshaping the argument about whether nature or nurture is decisive in determining human destinies, and about what the answer means for social policy. Consider a fascinating new report arguing the scientific evidence for the importance of “authoritative communities”—groups, religious or secular, devoted to transmitting a model of the moral life.

The report is from the thirty-three research scientists, children’s doctors, and mental health and youth services professionals comprising a commission jointly sponsored by the Dartmouth Medical School, the Institute for American Values, and the YMCA of the USA. The report’s conclusion is in its title: human beings are Hardwired to Connect.

In an era of increasing prosperity, the evidence of children’s failures to thrive—depression, anxiety, substance abuse, conduct disorders—is also increasing. Pharmacological and psychotherapeutic responses to such deteriorating mental and social health are necessary but insufficient. Also needed is recognition of how environmental conditions—the social environment—contribute to childhood suffering.

The problem is a deficit of connectedness. The deficit is the difference between what the biological makeup of human beings demands and what many children’s social situations supply in the way of connections to other people, and to institutions that satisfy the natural need for moral and spiritual meaning.

The need expresses itself in religious cravings—the search for moral meaning and an openness to the possibility of a transcendent reality. The need is natural in that it arises from “our basic biology and how our brains develop.” The report draws upon the science of infant attachment, and of brain development, particularly during adolescence, when the brain changes significantly.

The report argues that our understanding of children’s difficulties is thwarted by the assumption that each child’s problems are exclusively personal and individual, an assumption that ignores social and communal factors. In fact, it argues, we are “biologically primed” for finding meaning through attachments to others.

The need for meaning is increasingly discernible in the basic structure of the brain. “The idea,” says Allan N. Schore of the UCLA School of Medicine, “is that we are born to form attachments, that our brains are physically wired to develop in tandem with another’s, through emotional communication, beginning before words are spoken.”

Furthermore, the report says, social environments that meet—or defeat—this need “affect gene transcription and the development of brain circuitry.” And “a social environment can change the relationship between a specific gene and the behavior associated with that gene.” A child’s “relational context,” says Schore, “imprints into the developing right brain either a resilience against or a vulnerability to later forming psychiatric disorders.”

“The biochemistry of connection” will seem too, well, deflating for some people’s comfort. The report cites, for example, another study that says oxytocin, a hormone, enters a woman’s bloodstream during sexual intercourse, childbirth, and lactation, promoting, the report says, “emotional intimacy and bonding (also sometimes known as ‘love’).” In men, marriage—sexual and emotional intimacy with a spouse—seems to lower testosterone levels, thereby lowering the “biological basis for violent male behavior and male sexual promiscuity.”

So biology, it seems, buttresses important moral conventions. And they may have evolved in conformity to biological facts.

The scientific fact, if such it is, that religious expression is natural to personhood does not vindicate any religion’s truth claims. A naturalistic hypothesis is that the emotions of religious experience have neurobiological origins: The brain evolved that way to serve individual and group survival.

In any case, the social utility of religion remains. And there may be a biological basis for religious affiliation reducing the risk of certain pathologies, and even enhancing immune systems.

The most basic authoritative community, the family, is the most crucial. Its decline weakens the other institutions of civil society. The result is a thinness of social connectedness, and what Tocqueville warned was a risk of American individualism—each person confined “entirely within the solitude of his own heart.”

Hardwired to Connect suggests that there is no simple “versus” in “nature versus nurture.” There is a complex interaction, which means, among many other important things, that IQ is not a simple genetic inheritance; it is a function of that inheritance and the influence on it of a context of connections.

The implication for governance is that social policies should foster the health of authoritative communities, especially given the fact that the yearning for such communities among adolescents often takes the form of gang membership. And evidently the Bush administration’s belief in the wisdom of delivering social services through faith-based institutions is not just a matter of faith.

But then, the utility of faith does not establish its validity.

[SEPTEMBER 21, 2003]

The Space Program’s Search for…Us

PASADENA, CALIFORNIA—On September 8, a spacecraft will insert into the atmosphere over Utah a glider that will spiral to Earth carrying a sealed canister containing gusts of solar wind captured over the past two years. The wind contains dust, gases, and other possible evidence of the dynamics of the solar system, dynamics that have somehow given rise to the splendor of…us. NASA’s name for the canister project: the Genesis mission.

September 8 will be just another day at the office here on the campus of the Jet Propulsion Laboratory, where office work is not mundane but otherworldly. JPL—an appendage of, but not contiguous to, Caltech—may be the only place on the planet where you can gather around a lunch table with people who, in a sense, work on another planet.

On a recent day, some of them were behind a laboratory at a pile of sand that resembles the surface of Mars. They were trying to drive a rover, like the two currently on Mars. The JPL scientists were trying to operate one with the kind of mechanical defect—an inoperative drive wheel—that is giving a slight limp to one of the rovers on the red planet 110 million miles away.

One Mars rover was landed in a crater—what one scientist calls “an interplanetary hole-in-one.” Both were expected to rove about six hundred meters, but they have covered three thousand. Their batteries are recharged daily by solar panels; they already have lasted twice as long as had been expected and may last ten times longer.

Earth, which is constantly changing, became home for life 4 billion years ago. We know neither the conditions then nor the processes by which life ignited. However, Mars may have had an early history like Earth’s. One question the rovers may answer is: Were there, long ago, pools of standing water—standing for hundreds of millions of years—where life could have developed?

The rovers’ arms, manipulated at JPL, put instruments in contact with rocks and read their mineral contents. By drilling into rocks, through several billion years worth of settled dust, the rovers have found sediments that were formed in bodies of water. Within the next decade, samples may be put robotically in canisters and launched off Mars to rendezvous with an orbiting spacecraft for a six-month trip to Earth.

It would be understandable if the people at JPL were jubilant, having just—after a seven-year flight—precisely inserted the Cassini spacecraft into orbit around Saturn, threading a gap in its rings. This multinational project will include putting instruments on Titan, Saturn’s largest moon, which may have some tantalizing similarities to Earth of 4 billion years ago.

But people here know that all their marvels—JPL’s deep-space control center is monitoring thirty-five space ventures—are performed against a backdrop of deepening public indifference. And cosmology’s human capital is declining as young scientists choose other career paths.

The public’s diminishing capacity for astonishment is astonishing. Perhaps second only to Einstein’s question (Did God have a choice in the creation of the world?) is this one: How did matter, which is what we are, become conscious, then curious? Not all clues can be found on Earth.

Curiosity is why a Voyager spacecraft launched in 1977 is now 8.5 billion miles away. It is, in the scheme of things, just next door: traveling now at 1 million miles per day, it would have to continue for forty thousand more years just to be closer to another star than to our sun. Still, here in our wee solar system—our little smudge on the skies of uncountable billions of galaxies—Voyager’s and JPL’s other undertakings must be measured against Einstein’s axiom: “All our science, measured against reality, is primitive and childlike—and yet it is the most precious thing we have.”
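
Those Voyager figures can be roughed out on the back of an envelope. A small sketch (Python, illustrative only; the distance of about 4.2 light-years to the nearest star is a standard figure, not the column's) shows why even forty thousand years is merely the neighborhood:

```python
# Rough check of the Voyager arithmetic: at 1 million miles per day, how long
# until the craft is nearer another star than the sun?
MILES_PER_LIGHT_YEAR = 5.88e12     # standard conversion
nearest_star_light_years = 4.2     # Proxima Centauri, approximately
miles_per_day = 1_000_000          # the column's figure for Voyager's speed

halfway_miles = nearest_star_light_years * MILES_PER_LIGHT_YEAR / 2
years = halfway_miles / (miles_per_day * 365)

print(f"Halfway point: {halfway_miles:.2e} miles")
print(f"Travel time at Voyager's pace: about {years:,.0f} years")
# Roughly 34,000 years, the same order of magnitude as "forty thousand more years."
```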

All space programs search for…us. For, that is, understanding of how we came to be. Does that mean space exploration amounts to species narcissism? Yes, and that is an excellent thing. It is noble to strive to go beyond the book of Genesis and other poetry, to scientific evidence about our origins, and perhaps destiny.

The Scottish physicist James Clerk Maxwell (1831–1879), an early authority on Saturn’s rings, had, as cosmologists should, a poetic bent:


At quite uncertain times and places,

The atoms left their heavenly path,

And by fortuitous embraces,

Engendered all that being hath.


The phrase “fortuitous embraces,” although lovely, is not explanatory. Knowledge, tickled from the heavens, is the business of a small band of possible explainers—the people of JPL and NASA, government at its best.

[AUGUST 26, 2004]

Nuclear Waste: That’s Us

The Department of Transportation deals with the movement of things, which is important. The Department of Agriculture deals with food, which is vital. However, the National Aeronautics and Space Administration deals with the origin, nature, and meaning, if any, of the universe. Attention should be paid.

Space lost its hold on America’s imagination after the last lunar expedition in 1972. But the really exciting research had just begun, with the 1965 discovery that the universe is permeated with background radiation, which confirmed that a big bang had indeed set what are now distant galaxies flying apart.

A famous aphorism holds that the most incomprehensible thing about the universe is that it is comprehensible. It is remarkably so because of advances in particle physics and mathematics. And because of magnificent telescopes, like the Hubble, which is now eleven years old and due to cease functioning in 2010. Operating above the filter of Earth’s atmosphere, it “sees” the past by capturing for analysis light emitted from events perhaps—we cannot be sure how fast the universe is expanding—12 billion to 14 billion years ago.

Astronomy is history, and NASA’s Next Generation Space Telescope, coming late in this decade, will see even nearer to the big bang of 13 billion to 15 billion years ago. That was when, in a trillionth of a trillionth of a trillionth of a second, the universe inflated from a microscopic speck to all that now can be seen by NASA’s wondrous instruments.

Mankind is being put in its place, but where is that? Mankind felt demoted by Copernicus’s news that this cooled cinder, Earth, is not the center of the universe. Now Martin Rees, Britain’s Astronomer Royal, in his new book, Our Cosmic Habitat, adds insult to injury: “particle chauvinism” must go. All the atoms that make us are, it is truly said, stardust. But Rees puts it more prosaically: They are nuclear waste from the fuel that makes stars shine.

So, is life a cosmic fluke or a cosmic imperative? Because everything is a reverberation from the big bang, what is the difference between fluke and imperative?

Rees says our universe is “biophilic”—friendly to life—in that molecules of water and atoms of carbon, which are necessary for life, would not have resulted from a big bang with even a slightly different recipe. That recipe was cooked in the universe’s first one-hundredth of a second, when its temperature was a hundred thousand million degrees centigrade. A biophilic universe is like Goldilocks’ porridge, not too hot and not too cold—just right.

Here cosmology is pressed into the service of natural theology, which rests on probability—actually, on the stupendous improbability of the emergence from chaos of complexity and then consciousness. Natural theology says: A watch implies a watchmaker, and what has happened in the universe—the distillation of the post-big-bang cosmic soup into particles, then atoms, then, about a billion years ago, the first multicellular organisms that led, on Earth, to an oxygen-rich atmosphere and eventually to us—implies a Creator with a correspondingly precise design.

Perhaps. But not necessarily, unless you stipulate that no consequential accident is an accident. “Biological evolution,” says Rees, “is sensitive to accidents—climatic changes, asteroid impacts, epidemics and so forth—so that, if Earth’s history were to be rerun, its biosphere would end up quite different.” There is a lot of stuff in the universe—the estimated number of stars is 10 followed by 22 zeros. But as to whether there are other planets with life like Earth’s, Rees says the chance of two similar ecologies is less than the chance of two randomly typing monkeys producing the same Shakespearean play.

“Eternity,” says Woody Allen, “is very long, especially toward the end.” The end of our universe—long after our sun has died, 5 billion years from now—is certain to be disagreeable.

In his book on the universe’s infancy (The First Three Minutes), Steven Weinberg concludes that “there is not much of comfort” in cosmology. It indicates that Earth, “a tiny part of an overwhelmingly hostile universe,” is headed for “extinction of endless cold or intolerable heat,” either an unending expansion or a fiery collapse backward—a big crunch.

Yet research like NASA’s is its own consolation. “The effort to understand the universe is,” says Weinberg, “one of the very few things that lifts human life a little above the level of farce, and gives it some of the grace of tragedy.” Not a negligible mission for NASA.

[MARCH 24, 2002]

The Loudest Sound in Human Experience

Ira Gershwin didn’t know the half of it. He said the Rockies may crumble, Gibraltar may tumble. But terra firma itself is far from firm.

Even the continents are wandering, half an inch to four inches a year. Earth is a work in violent progress. The engine of its evolution is heat—boiling gas, molten rock, and other stuff—left over from the planet’s formation 4.5 billion years ago. The heat frequently bursts through Earth’s crust, although rarely as catastrophically as it did 120 years ago on the island of Krakatoa.

If Simon Winchester is correct in his new book—Krakatoa: The Day the World Exploded, August 27, 1883—the current trial in Indonesia of accused perpetrators of last year’s terrorist bombing in Bali may be part of the lingering reverberation of the volcanic eruption—the loudest sound in modern human experience, heard three thousand miles away—that made an island disappear.

Billions of tons of material—six cubic miles of it—were hurled 120,000 to 160,000 feet in the air. They filtered sunlight, lowering Earth’s temperature and creating spectacular sunsets that for months inspired painters and poets.

And in the East Indies outpost of the Dutch empire, where a notably relaxed and tolerant Islamic faith had long flourished, Krakatoa, by terrifying and dispossessing people, may have catalyzed the much fiercer form of Islam that fused with anticolonialism. It is alive and dealing death today.

Although the people of the East Indies will be forgiven for not appreciating this at the time, Winchester says volcanoes are part of what makes this planet hospitable to humans. They do not erupt so promiscuously as to render the planet unfit for life. And by churning Earth’s mantle, they bring fertile soil and useful minerals to the surface, thereby sustaining the outer earth and the biosphere. For a while.

As Earth heads for frigid lifelessness, the leakage of heat from Earth’s interior causes currents of matter to flow—movements measured in millimeters a year—above the molten core and below the crust. Just as, writes Winchester, “one sees working in a vat of vegetable soup simmering on the stovetop.”

Science in the 1960s at least explained what had long pricked curiosity—the matching concavity of Africa’s west coast and the convexity of South America’s east coast. According to the study of plate tectonics, there are, depending on how they are defined, between six and thirty-six rigid plates on Earth’s surface. In “subduction zones,” where one plate slips beneath another, the descending plate pulls down untold billions of tons of material and water. This fuels white-hot seas of soup in immense chambers, from which energy seeks to break through Earth’s surface.

Which is what happened in 1883 in the archipelago that now is Indonesia. Krakatoa’s eruption resulted in the destruction of 165 villages and the death of 36,417 people. Most died not from the searing ash, pumice, and gas but from giant sea waves produced by Earth’s spasm.

The shock wave circled Earth seven recordable times. Sea surges were detectable in the English Channel. Three months after the eruption, firemen in Poughkeepsie, New York, scrambled in search of what they thought was an immense conflagration that caused the sky to glow. Actually, the glow was light refracted by Krakatoa’s debris.

The first major catastrophe to occur after the invention of the telegraph and undersea cables, Krakatoa produced an intimation of the “global village” seventy-seven years before Marshall McLuhan coined that phrase to describe the world-contracting effect of television. Krakatoa was, Winchester argues, “the event that presaged all the debates that continue to this day: about global warming, greenhouse gases, acid rain, ecological interdependence.” Suddenly the world seemed to be less a collection of isolated individuals and events and more “interconnected individuals and perpetually intersecting events.”

As an epigraph for his book, Winchester chose this from a W. H. Auden poem written in 1944, when the world was in agony and, unbeknownst to Auden, potentially world-shattering knowledge was being acquired at Los Alamos, New Mexico:


At any given instant

All solids dissolve, no wheels revolve,

And facts have no endurance—

And who knows if it is by design or pure inadvertence

That the Present destroys its inherited self-importance?


Geology has joined biology in lowering mankind’s self-esteem. Geology suggests how mankind’s existence is contingent on the geological consent of the planet. Although the planet is hospitable for the moment, it is indifferent—eventually it will be lethally indifferent—to its human passengers.

[MAY 22, 2003]

L = BB + pw + BC/BF

One hundred years ago, a minor Swiss civil servant, having traveled home in a streetcar from his job in the Bern patent office, wondered: What would the city’s clock tower look like if observed from a streetcar racing away from the tower at the speed of light? The clock, he decided, would appear stopped because light could not catch up to the streetcar, but his own watch would tick normally.

“A storm broke loose in my mind,” Albert Einstein later remembered. He produced five papers in 1905, and for physicists, the world has never been the same. For laypeople, it has never felt the same.

In his book Einstein’s Cosmos, Michio Kaku, professor of theoretical physics at the City University of New York, makes Einstein’s genius seem akin to a poet’s sensibility. Einstein, says Kaku, was able to “see everything in terms of physical pictures”—to see “the laws of physics as clear as simple images.”

Hitherto, space and time were assumed to be absolutes. They still can be for our everyday business, because we and the objects we deal with do not move at the speed of light. But since Einstein’s postulate of relativity, measurements of space and time have been understood to be relative to the observer’s speed.

One implication of Einstein’s theories did have thunderous practical consequences: Matter and energy are interchangeable, and the mass of an object increases the faster it moves. In the most famous equation in the history of science, energy equals mass multiplied by the speed of light squared. A wee bit of matter can be converted into a city-leveling amount of energy.
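
The scale of that conversion is worth pausing over. A short sketch (Python, purely illustrative; the one-gram example and the TNT comparison are added here, not the column's) works the famous equation once:

```python
# E = m * c**2: the energy latent in one gram of matter.
c = 3.0e8                  # speed of light in meters per second (approximate)
mass_kg = 0.001            # one gram, expressed in kilograms

energy_joules = mass_kg * c ** 2
tons_of_tnt = energy_joules / 4.184e9   # about 4.184e9 joules per ton of TNT

print(f"Energy in one gram of matter: {energy_joules:.1e} joules")
print(f"Equivalent to roughly {tons_of_tnt:,.0f} tons of TNT")
# On the order of 9e13 joules, or about 21,000 tons of TNT: a wee bit of matter,
# a city-leveling amount of energy.
```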

In the 1920s, while people were enjoying being told that space is warped and it pushes things down (that is the real “force” of gravity), Einstein became an international celebrity of a sort not seen before or since. Selfridges department store in London pasted the six pages of an Einstein paper on a plate-glass window for passersby to read. Charlie Chaplin said to him, “The people applaud me because everyone understands me, and they applaud you because no one understands you.”

The precision of modern scientific instruments makes possible the confirmation of implications of Einstein’s theories—e.g., the universe had a beginning (the big bang) and its expansion is accelerating; time slows in a large gravitational field and beats slower the faster one moves; the sun bends starlight from across the sky and there are black holes so dense that they swallow light. Does all this bewilder you? The late Richard Feynman, winner of the Nobel Prize in physics, said, “I think I can safely say that nobody understands quantum mechanics.”
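
One of those implications, the slowing of moving clocks, can at least be put into numbers. A brief sketch (Python, illustrative only) applies the standard special-relativity formula; gravitational time dilation is left aside:

```python
import math

# A clock moving at speed v ticks at a fraction sqrt(1 - (v/c)**2) of the rate
# of a clock at rest (the reciprocal of the Lorentz factor).
def ticking_rate(fraction_of_light_speed: float) -> float:
    return math.sqrt(1.0 - fraction_of_light_speed ** 2)

for v in (0.1, 0.5, 0.9, 0.99):
    print(f"At {v:.0%} of light speed, a clock runs at {ticking_rate(v):.3f} of its normal rate")
# At everyday speeds the effect is imperceptible, which is why space and time
# could pass for absolutes until 1905.
```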

Three years ago we learned that the Milky Way galaxy, our own, contains at its center a black hole weighing as much as 2 million suns. “Thus,” says Kaku, “our moon revolves around the earth, the earth revolves around the sun, and the sun revolves around a black hole.” Can this story have a happy ending?

Science offers no guarantees. Astronomy evicted us from our presumed place at the center of the universe many centuries before we learned that “center” is unintelligible in an expanding universe where space and time are warped. And before nineteenth-century biology further diminished our sense of grandeur by connecting us with undignified ancestors, eighteenth-century geology indicated that seashells unearthed on mountain tops proved that Earth has a longer, more turbulent and unfinished history than most creation stories suggest. December 26, 2004, brought another geological challenge to the biblical notion of an intervening, caring God.

Einstein’s theism, such as it was, was his faith that God does not play dice with the universe—that there are elegant, eventually discoverable laws, not randomness, at work. Saying “I’m not an atheist,” he explained:

“We are in the position of a little child entering a huge library filled with books in many different languages. The child knows someone must have written those books. It does not know how. It does not understand the languages in which they are written. The child dimly suspects a mysterious order in the arrangement of the books but doesn’t know what it is.”

A century on from Einstein’s “miracle year,” never mind E = mc². Try this: L = BB + pw + BC/BF. Meaning: Life equals the big bang, followed by lots of paperwork, ending with either a big crunch, as the universe collapses back on itself, or a big freeze as it expands forever.

A bad ending? Compared to what? Everything, as has been said since Einstein, is relative.

[JANUARY 6, 2005]

Wonder What We Are For? Wondering

ATOP MAUNA KEA, ON HAWAII’S BIG ISLAND—On a clear day, you can see almost forever. With the help of adaptive optics, almost back to the beginning of this universe. And it is usually clear here at 13,796 feet above sea level, and above half of the atmosphere’s oxygen. That is why the W. M. Keck Observatory’s two telescopes, primarily operated by the University of California and the California Institute of Technology, are here, far from urban lights and above much of the atmosphere, which, although it makes the stars twinkle prettily, does so by distorting light.

Hence the need for adaptive optics. This technology became available for civilian science when the end of the Cold War led to the declassification of some devices developed for the Strategic Defense Initiative (SDI).

The Keck telescopes—the world’s largest—are gathering light produced shortly (as these things are reckoned; about 800 million years) after the big bang, just under 14 billion years ago. Analysis of the light, which can be done by astronomers working anywhere, yields information about the life cycle of stars. (Grim news: our star, the sun, is doomed, so we are, too, in less than 2 billion years.) The Keck telescopes have detected more than sixty planets orbiting other stars. Ten years ago, the only planets we knew of were those orbiting our own sun.

It is axiomatic that not only is the universe stranger than we know, it is stranger than we can know. But one reason the Keck telescopes are significantly augmenting our store of knowledge is the application to astronomy of adaptive optics developed for SDI. SDI’s challenge is to target, from space, ballistic missiles launched on Earth. This requires making ultraprecise measurements from space, through the distortions of Earth’s atmosphere. Astronomy’s challenge involves looking outward—analyzing light that is distorted by the atmosphere before it reaches telescopes on Earth.

The Keck telescopes each weigh three hundred tons, stand eight stories tall, and involve operations of more precision than those of the finest wristwatch. They can gather 13-billion-year-old light that is 500 million times fainter than the naked eye can see. They gather the light using a primary mirror ten meters (thirty-three feet) in diameter, composed of thirty-six hexagonal segments, each engineered to conform to within a millionth of an inch of a single continuous surface.

But the really remarkable device is a mirror about the size of a man’s hand. Distortions in the gathered light are removed by bouncing the light off this mirror, which has four hundred pistons operated by tiny, computer-driven motors that make adjustments in the mirror’s surface 642 times a second.

From 1609, when Galileo built a refracting telescope (a lens assisting the naked eye), until the Hubble space-based telescope was launched in 1990, the atmosphere complicated astronomy. However, Hubble, which cost more than all other telescopes in history combined, does not make Earth-based telescopes anachronistic.

Hubble and its successors—next comes the Next Generation Space Telescope—operate in the cold vacuum that is space. But a multinational consortium has proposed an Overwhelmingly Large Telescope that would gather light on Earth with two thousand panels of mirrors in an apparatus the size of a football field. Ever-better land- and space-based telescopes will find tantalizing hints about how the expansion of the universe (actually, this universe; there may be many others) began—a big bang?—and whether it will continue to expand or will collapse back on itself in a big crunch.

In any case, Earth’s fate is not going to be pretty, so what’s the use in wondering? Because wondering is what we are for.

Mauna Kea is a dormant volcano. About forty miles southeast of here, lava from the Kilauea volcano, boiling with heat left over from Earth’s formation 4.5 billion years ago, has recently been spilling across a highway and into the ocean. To stand a few hundred feet from the stream of lava plunging into the Pacific, amid the searing heat and sulfurous fumes, is to sense what the Keck Observatory, in its very different setting, explores—the violent impermanence that permeates the entire universe.

“We are curious people,” says Keck Observatory director Frederic Chaffee matter-of-factly. “And the universe is an amazing place.” The most amazing things in it are the curious creatures. They have evolved literally from stardust, becoming conscious beings capable of building—indeed, their glory is that they are, in a sense, incapable of not building—mountaintop telescopes, silhouetted against the edge of the atmosphere, searching for clues as to how all this started and how it will end.

[SEPTEMBER 1, 2002]