The world exists, as I understand it, to teach the science of liberty.
—RALPH WALDO EMERSON
Science and democracy are based on the rejection of dogmatism.
—DICK TAVERNE
One can live by dogma or by discovery. Dogmas (from the Greek for received opinions that “seem good”) may seek to unify people (as is the implied intent of religious dogma, religio being Latin for “binding together”) but insofar as a dogma must be taken on faith it winds up bifurcating humanity into a faithful us and a suspect other. Scientific discovery might have divided the world, but instead has found that all human beings are kin—to one another and to all other living things—in a universe where stars and starfish alike obey the same physical laws. So as humans move from dogma toward discovery, we increasingly find ourselves inhabiting one world.
This development raises the prospect that as the influence of science grows, people may overcome old prejudices and parochialisms and treat one another more liberally. To an extent this is already happening—the world today is more scientific and more liberal, better informed and less violent, than it was three centuries ago—but with such prospects have also come problems. Religious and political dogmatists react against science and liberalism with everything from denial and attempted suppression (of, for instance, the teaching of biological evolution) to terrorism. The liberal democracies have too often responded to such threats with insecurity rather than strength, reverting in times of trouble to illiberal practices little better than those of their adversaries. Meanwhile scientific findings challenge everybody’s received opinions, while the growth of technology creates conundrums—with global warming currently at the top of the heap—that unless competently addressed threaten to reverse much of the progress our species has so recently made.
Dogmatists like to portray science as just another dogma—to the brazen all is brass—but science is a method, not a faith, and the unity of the universe was discovered by scientists who set out to demonstrate no such thing. When Newton identified the laws of gravitation he did not assert that they held sway everywhere, but wondered whether “God is able…to vary the laws of nature…in several parts of the universe.” The physicist Ernest Rutherford, whose experiments exposed the structure of the atom, was so skeptical about drawing grand implications that he threatened to bar from his laboratory any scientist who so much as uttered the word “universe.” When the astronomer Edwin Hubble established that the Milky Way was one among many galaxies, he called them “island universes” and questioned whether “the principle of the uniformity of nature” pertained across such enormous distances. This is the opposite of starting with a deeply held faith and accumulating evidence to support it. Scientists have a story of discovery to tell, dogmatists a story of obedience to authority.
The scientific discovery that everything—and everybody—is interwoven with everything else was a boon for liberalism, which took a unified view of humanity before such a stance could be justified empirically. When John Locke argued for human equality under the law, women were still considered unfit to vote and Europeans thought of their black and brown brothers as the benighted descendants of the biblically accursed Ham. The liberal claim that people ought to have equal rights was a theory, vulnerable to test by experiment and properly to be judged by the results. The experiment having since succeeded, and science having determined that all human beings belong to the same species, we can now understand that we’re all us; there is no other.
Darwin’s discovery that biological evolution functions through random mutation and natural selection revealed the common ancestry of all human beings, but it did so at the cost of exposing the unsettling fact that we are here by virtue of chance. Genes mutate randomly, DNA/RNA copying errors altering the genetic inheritance of every organism. Changes in the environment—which are themselves random, to a first approximation—can alter circumstances in such a way that previously marginal mutants are better able to survive and reproduce than are those superlatively adapted to the prior order. The environmental changes involved may be as slow as the parting of continents or as sudden as an asteroid impact, but they never cease: Stasis is an illusion. Homo sapiens did not emerge because they were superior to other animals, but because their ancestors happened to be in the right place at the right time. This rather stark finding is difficult for humans to absorb; hence we are apt to regard ourselves as distinctly different from the other animals, and to imagine that we are here for a special purpose. To entertain this illusion is to approach biology the wrong way round.
The cognitive scientist Daniel Dennett, in his book Darwin’s Dangerous Idea, offers a thought experiment to illuminate just how unaccustomed human beings are to thinking in Darwinian terms. In his scenario, you meet a gambler who claims that he can produce a man who, in your presence, will win ten consecutive coin tosses. You take the bet, knowing that the chance of anyone’s winning ten straight coin tosses is only 1 in 1,024. The gambler shows up in the morning accompanied by 1,024 men, who proceed to toss coins. At the end of the first round, half the men have lost. The same happens on the ensuing rounds, until only two men remain for the tenth and final round, the winner of which has indeed won ten straight coin tosses. He wasn’t personally destined to win, any more than any given tennis player is destined to win next year’s Wimbledon singles title, but somebody had to win, and it just happened to be him. “If the winner of the [coin-tossing] tournament thinks there has to be an explanation of why he won, he is mistaken,” Dennett notes. “There is no reason at all why he won, there is only a very good reason why somebody won.”
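Dennett’s tournament is easy to replicate for oneself. The following sketch, in Python, simulates it under the simplifying assumption that the entrants are paired off each round and the winner of each coin toss advances; run it and somebody always emerges with ten straight wins, even though no particular player was ever destined to.

```python
import random

def coin_toss_tournament(players=1024):
    """Pair off the entrants each round; the toss winner of each pair advances.
    With 1,024 entrants, the tournament always lasts exactly ten rounds."""
    entrants = list(range(players))
    rounds = 0
    while len(entrants) > 1:
        winners = []
        for a, b in zip(entrants[::2], entrants[1::2]):
            winners.append(a if random.random() < 0.5 else b)  # a fair coin toss
        entrants = winners
        rounds += 1
    return entrants[0], rounds

champion, rounds = coin_toss_tournament()
print(f"Player {champion} won {rounds} straight coin tosses")  # somebody always does
```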
So to wonder, “Why am I here?” is to ask the wrong question. Nothing requires that you or I exist, or that the human species exists; it’s just that so long as there is life on Earth some creatures will exist, and you and I happen, at present, to be among them. I may imagine that my existence is magically full of hidden meanings—just as amateur gamblers think they perceive patterns in the wholly random behavior of roulette wheels—but the silent majority of species that once thrived and are now gone would take a decidedly different view of the matter, were they around to be interviewed about it. Evolution reveals that humans got here the way everything else got here, through a long historical process of accident and selection.
This discovery offends secular as well as religious dogmatists.
Leftists have opposed evolution out of fear that genetics might reveal human beings to be other than John Locke’s blank slates, whose faults were entirely attributable to their social milieu and who could, therefore, be redeemed entirely through political reforms. Many believed, as had Rousseau, that “Man is naturally good, and only by institutions is he made bad.” Progressive constructs like behavioral psychology and the Standard Social Science Model (“Instincts do not create customs; customs create instincts…. The putative instincts of human beings are always learned and never native”) were based on a genetics-free environmental determinism—a belief that, as the anthropologist Margaret Mead put it, “Human nature is almost unbelievably malleable.” The young Charles Darwin, while sailing round the world aboard the scientific research vessel Beagle, wrote in his notebook, “He who understands baboon would do more towards metaphysics than Locke.” Mead and many other progressives feared that Darwin was right, and made extravagant assertions to the contrary. “Give me a dozen healthy infants,” claimed the psychologist John B. Watson, “and my own specified world to bring them up in and I’ll guarantee to take any one at random and train him to become any kind of specialist I might select—doctor, lawyer, merchant chief…regardless of his talents.” In studying the black and brown peoples of America and the South Pacific, Darwin, a steadfast abolitionist, was struck by “how similar their minds were to ours” and by “the close similarity between the men of all races in taste, dispositions and habits.” Yet many progressives rejected whole swathes of his work rather than question their dogmatic belief that human behavior must be determined entirely by societal circumstances.
Rightists have depicted Darwinism as the engine driving socialistic efforts to engineer the human gene pool. In this narrative, the publication of Darwin’s books in the latter half of the nineteenth century was soon followed by an upwelling of public enthusiasm for eugenics—the political commandeering of biological selection to attain political ends—that reached its apotheosis in the Nazi death camps and sterilization campaigns. This view of history is, while odd, far from fanciful; many early Darwinists did favor eugenics. In England, the science-fiction writer H. G. Wells, a prominent socialist, called for “the sterilization of failures” to bring about “an improvement of the human stock,” while cheerfully proposing that state breeding methods be used to reduce the world’s “swarms of black and brown, and dirty-white and yellow people.” His compatriot George Bernard Shaw argued that “the only fundamental and possible socialism is the socialization of the selective breeding of Man.” John Maynard Keynes, who served on the board of the British Eugenics Society, called eugenics “the most important, significant and, I would add, genuine branch of sociology.” In America, Justice Oliver Wendell Holmes maintained that by ruling in favor of “sterilizing imbeciles…I was getting near the first principle of real reform” (his italics) while the New Jersey governor and future president Woodrow Wilson created a state board to determine whether “procreation is inadvisable” for the “feebleminded, epileptics, and other defectives,” including criminals and those living in poorhouses. Conservatives sensitive to this sad chapter in history may perhaps be forgiven for confusing Darwinian-sounding leftist dogma with evolutionary science, although their stance is more difficult to justify now that eugenics has been driven from the political arena.
Religious rather than secular convictions, however, motivate most of those Americans who reject evolution. Among the liberal democracies such staunch religious dogmatism is almost exclusively an American phenomenon. (The reason, economists theorize, is that innovative new churches found fresh parishioners in free-market America while European churches stayed put and lost their market share. If so, Christian fundamentalism is a product of what might be called social Darwinism.) In the United States, however, this latest spiritual awakening has been sufficiently influential that nearly a third of American teachers say they feel pressured to omit evolution from their lessons, or to mix in nonscientific concoctions such as creationism or intelligent design theory, while even science museums shy away from presenting films and exhibits about evolution.
The religious dogmatists who campaign against Darwin’s theory of evolution seem to think that it involves little beyond the relatively narrow question of human origins—but its implications are much broader than that. In addition to being the only way to make sense of biology, evolution sheds light on phenomena ranging from universal behavior patterns (altruism, selfishness, “dad or cad” mating strategies) to the arts, where greatness equals universality (the Tokyo String Quartet playing Beethoven, Shakespeare performed in French) and so may harbor clues to our common evolutionary heritage. “If one reads accounts of the systematic nonintrusive observations of troops of bonobo,” writes the novelist Ian McEwan, “one sees rehearsed all the major themes of the English nineteenth-century novel: alliances made and broken, individuals rising while others fall, plots hatched, revenge, gratitude, injured pride, successful and unsuccessful courtship, bereavement and mourning.” Thinking itself may be evolution in action. According to the theory of neural Darwinism, the brain works by generating myriads of neuron patterns that jockey for preeminence, with the survivors mapping themselves onto cortical tissue in a contentious dynamic that works much like competition within and among biological species. The American neurobiologist Gerald Edelman, who originated the theory, maintains that “the key principle governing brain organization is a populational one and that in its operation the brain is a selective system” involving “hundreds of thousands of strongly interconnected neurons.” Neuron groups that prevail are able to maintain and enlarge their territory. As Edelman puts it, “There is a kind of neuroecology occurring at several complex levels of development and behavior…. In certain respects, groups within a region [of brain tissue] resemble a species subdivided into many races, each interacting mainly within itself, but occasionally cross-breeding.” Neural Darwinism may explain, among other things, why the human brain is able to compose music and do physics even though these activities do not seem to have been historically essential to human survival. If the brain functions in an evolutionary way, it has to be big, because most of its activities—thoughts, prethoughts, and unconscious impulses—amount to experiments, most of which are selected against. It may seem strange to think of evolution as taking place in the brain at lightning speed, but electrochemistry moves at the speed of thought, as thinking itself constantly demonstrates.
Dogmatism has been losing ground in the liberal-scientific world—even in the United States, where the ranks of the religiously devout are in general decline despite a fundamentalist resurgence. Meanwhile, though, religious and political extremism is on the rise in parts of the Middle East and the wider Muslim world, reaching a reduction to absurdity in the campaigns of Islamist terrorists. (The word Islamist designates Muslim extremists inclined toward violence. It came into general use with the 1979 hostage takings at the U.S. embassy in Iran. It relates to Islam in somewhat the way that communist relates to commune.) Islamism remains a minority dogma, with surveys indicating that Muslims generally support democracy and human rights while rejecting terrorism and theocracy, but terrorism always draws vastly more attention than its support merits; that’s the point of it.
Following the 9/11 attacks, Americans asked, “Why do they hate us?” and wondered whether Islam itself was to any extent responsible for Islamist terrorism. Many articles and books explored this question but mostly followed predictable political precepts. The left proclaimed Islam and Islamism to be almost equally innocent, and blamed terrorism on the evils of an allegedly neocolonialist West. The right often lumped Islam and Islamism together on one side of a “clash of civilizations”—the title of a widely read 1993 Foreign Affairs essay by the political scientist Samuel Huntington that also contributed the only marginally more illuminating claim, “Islam has bloody borders.” From a liberal perspective, however, it is clear that Islam has little to do with Islamism—which is better understood as just one more totalitarian political dogma. All such dogmas justify present sins as the price to be paid to attain a future perfection. For communists, the wished-for state was Rousseau’s heaven on earth, where nobody could own anything or evince anything short of complete faith in the government. For the Nazis it was a Wagnerian opera, with blond Nordic heroes playing all the leading roles. For Islamists it is the “Caliphate,” their term for the domain carved out circa 632–750, when Islamic rule expanded from Arabia to Spain in the west and Afghanistan in the east—an epoch to which the Islamists would have all Muslims repair.
That regime was less an empire than a collection of loosely allied caliphates (from khalifa, meaning “successor of Muhammad”), but it displayed the attitudes characteristic of empires: It cherished the heroic narrative of its own establishment, took pride in its considerable artistic and intellectual attainments, and regarded the outside world with complacent indifference.
Its establishment narrative was one of extraordinarily rapid military victories achieved from the very dawn of Islam. The prophet Muhammad began dictating his revelations in the year 610, when he was about forty years of age, and continued until three years prior to his death in 632. His words were subsequently collected as the Quran (“recitation”), a standardized edition of which appeared in 653. The Quran contains advice on almost every aspect of life, bound together by a call to return to traditional Arab values and to spread them across the known world—and spread they did, with remarkable alacrity. By 661, just one generation after Muhammad’s demise, the first four caliphs had extended the reach of Islam (“submission”) from Persia to Egypt. By 750, Islam ruled Spain and had driven deep into the Byzantine Empire, where the tide turned when the Muslim forces—beset by Greek Fire, an early form of napalm—failed to conquer Constantinople (today’s Istanbul, which in 1453 would be conquered by the Turks and become central to the Ottoman Empire). Artistically and intellectually, the Islamic world could boast of exquisite architecture, mosaics, poetry, and calligraphy; significant contributions to mathematics (algebra is an Arabic word for an Arabic invention) and astronomy (many bright stars, such as Algol, Altair, and Rigel, bear Arabic names); plus libraries housing the Greek and Latin classics that would incite the European Renaissance. Though imposed by force, early Muslim rule was fairly liberal by the standards of its day. The caliphs generally paid respect to the teachings of Moses and Jesus, tolerated the religious observances of non-Muslims living under their governance, and accepted a degree of emancipation of women (whose veiling and sequestering were neither practiced nor advocated by Muhammad), although conservative Muslims maintained that a woman should leave home on only three occasions—when she took up residence with her bridegroom, when she attended to the death of her parents, and when her body was carried to the cemetery.
The caliphs were intrigued by the arts and crafts of the Far East that came through on the Silk Road, but paid scant attention to the rise of science and liberalism in a relatively primitive Europe. When Napoleon invaded Egypt in 1798, he might almost have come from Mars. The historian Abd al-Rahman al-Jabarti, who was living in Cairo at the time, noted wonderingly that should a Muslim such as himself approach the scholars on Napoleon’s staff and display a desire for knowledge, “They showed their friendship and love for him, and they would bring out all kinds of pictures and maps, and animals and birds and plants…. I went to them often, and they showed me that.” The European scientific revolution, notes Bernard Lewis, “passed virtually unnoticed in the lands of Islam, where they were still inclined to dismiss the denizens of the lands beyond the Western frontier as benighted barbarians…. It was a judgment that had for long been reasonably accurate. It was becoming dangerously out of date.” By the time Sadik Rifat Pasha, the Ottoman ambassador to Vienna in 1837, warned that the Europeans were flourishing thanks to a combination of science, technology, and “the necessary rights of freedom,” it was already too late. Western ships ruled the seas; British forces were extending their power inland from the coasts of India; and the U.S. Navy, having been created expressly for this purpose, had curtailed the plundering of American merchant ships and the hostage taking of their passengers by North African pirates hailing from what the Americans called the Barbary Coast. By 1920 Arabia was encircled by the British Empire—while the Ottoman Empire, having sided with Germany in World War I, had seen its lands divided between Britain and France.
The technological exploitation of Arabian oil made matters worse. When the American explorer Joel Roberts Poinsett (for whom the poinsettia is named) spotted a pool of petroleum in Persia in 1806 and speculated that it might someday prove useful as a fuel, nobody yet knew what to do with it. By the middle of the twentieth century a thirst for oil had drawn the Western powers ever more deeply into the Middle East. The states possessing large oil reserves—Iraq, Saudi Arabia, and to a lesser extent Kuwait, Oman, Bahrain, Qatar, and the United Arab Emirates—were transformed from an assortment of small fishing, herding, and trading villages into economically more vertical societies where the few who controlled the oil became rich and the rest stayed poor. Such inequities were offensive to Islam—which, like Christianity, had originated as a religion centered on the poor and devoted to social justice—but the attempts of Muslim leaders to redress them by resorting to wealth redistribution through state socialism failed.
Most Muslim intellectuals accommodated the incursions of the West by adopting what they found to be useful in Western thought and otherwise sticking to their own traditions. This was the approach taken by moderate nineteenth-century thinkers like the journalist and educator Muhammad Abdu, who maintained that science and democracy could strengthen Islamic societies, and Sayyid Ahmad Khan, who, finding no contradiction between science and Islam, established a school where Muslims could study science. But a minority reacted by reverting to political extremism.
The more retrograde among these extremists drew inspiration from the eighteenth-century fundamentalist Abdul Wahhab, who shrugged off a thousand years of scholarship to claim that the literal words of the Quran should govern political procedures and judicial values. (He had a particular enthusiasm for punishing women convicted of adultery by stoning them to death.) Assailed by many of his fellow Muslims, Wahhab came under the protection of a local chieftain, Muhammad ibn Saud, who in 1744 founded the first Saudi state—the forerunner of today’s Saudi Arabia, a police state that has spent nearly one hundred billion dollars indoctrinating students around the world in Wahhab’s intolerant doctrines.
The progressives among the modern Islamic radicals, eager to find the wellsprings of Western power, tapped into the Parisian intellectual currents of the 1920s and 1930s—and there, seeking dreams, reaped nightmares. Their desire for social justice (Muslims fast during Ramadan to stay mindful of the poor) resonated with communism. Their hopes of regaining the lost glories of the Caliphate drew them to fascism. Their thirst for philosophical verities attracted them to the pretensions of Rousseau, Nietzsche, and Hegel. Thus inspired, they set up Islamist organizations like the Young Egypt Society, the Arab Socialist Baath Party, and the Muslim Brotherhood. The members of Young Egypt called themselves Greenshirts, in emulation of the Nazi Brownshirts. The Baath Party—“We were racist,” recalled Sami al-Jundi, one of its early leaders, “admiring Nazism, reading its books and the source of its thought, particularly Nietzsche”—empowered Saddam Hussein, the Iraqi ruler who modeled his career on Stalin’s and murdered a quarter-million Iraqis. The Muslim Brotherhood spawned Hamas and Al-Qaeda. Its resident ideologue was Sayyid Qutb.
Born in 1906 in the Upper Egyptian village of Musha, Qutb attended a provincial school, memorized the Quran, then studied education at Cairo University. He emerged as a young intellectual of the middling sort—hanging around after graduation to teach a few classes, writing novels that nobody read, and spouting quotations from Hegel, Heidegger, and the French eugenicist and Nazi sympathizer Alexis Carrel. By the late forties Qutb’s views had become sufficiently irritating that the government shipped him off to the United States, where he took graduate courses in education and found little that met with his approval. American men, he wrote in his journal, were “primitive” though “armed with science”; the women in their tight-fitting sweaters were “live, screaming temptations”; and everyone was enamored of jazz, the “music that the savage bushmen created to satisfy their primitive desires.” A dogmatic dualist, Qutb saw the world as divided between a perfect but as yet unrealized Islamist law and a scientific, technological culture that was materially powerful but devoid of “human values.” The Caliphate had long ago done wonderful things like clothing naked Africans and delivering them “out of the narrow circles of tribe and clan [and] the worship of pagan gods into the worship of the Creator of the worlds.” (“If this is not civilization, then what is it?”) The West, in contrast, was preoccupied with science, which can discover “only what is apparent,” makes overweening claims (“Darwinist biology goes beyond the scope of its observations…only for the sake of expressing an opinion”), and anyway was already in decline: “The resurgence of science has…come to an end,” Qutb asserted, because science “does not possess a reviving spirit.” To thinkers of this stripe, the world is a nightmare from which humankind shall awaken only once the wished-for dogmas gain control.
On returning home Qutb became an advisor to Gamal Abdel Nasser, a former member of the fascist Young Egypt Society who had switched to a Stalinist-style pan-Arabism, becoming a Hero of the Soviet Union and winning the Order of Lenin. When Nasser shrank from subjecting Egypt to outright sharia, or Islamic religious law, members of the Muslim Brotherhood tried to assassinate him. Qutb’s younger brother Muhammad escaped the ensuing government crackdown by fleeing to Saudi Arabia, where he became a professor of Islamic Studies. (One of his students was the wealthy and indolent, later to become austere and fanatically dedicated, Osama bin Laden, who went on to underwrite the 9/11 attacks, denounce freedom and human rights as “a mockery,” and declare it Muslims’ religious duty to murder millions of Americans.) Sayyid Qutb remained in Egypt. He was jailed and tortured but nonetheless managed to write dozens of books before being hanged in prison in 1966, at age fifty-nine, on charges of advocating the violent overthrow of the government. The Quran forbids Muslims from attempting to depose any Muslim ruler, but Qutb got around this prohibition by claiming that a “society whose legislation does not rest on divine law…is not Muslim, however ardently its individuals may proclaim themselves Muslim.”
Islamism thus resuscitates the totalitarian enthusiasms that nearly wrecked Europe. As a recent study puts it, “The line from the guillotine and the Cheka to the suicide bomber is clear.” Nor were the shocks of 9/11 required for Americans to see that bright line. As early as 1954, the historian Bernard Lewis noted “certain uncomfortable resemblances” between communism and Islamism:
Both groups profess a totalitarian doctrine, with complete and final answers to all questions on heaven and earth…. Both groups offer to their members and followers the agreeable sensation of belonging to a community of believers, who are always right, as against an outer world of unbelievers, who are always wrong…. The humorist who summed up the communist creed as “There is no God and Karl Marx is his Prophet,” was laying his finger on a real affinity. The call to a communist Jihad, a Holy War for the faith—a new faith, but against the self-same Western Christian enemy—might well strike a responsive note.
Selling state power in the Arab world was made easier by the fact that much of the Middle East was already statist: Even today most of Saudi Arabia’s adult males, and 95 percent of those in Kuwait, work for the government, while 80 percent of Iran’s wealth is controlled by the government.
As is customary in such campaigns, the Islamists portrayed the West as a decaying shell and Western man as, in Qutb’s words, “suffering from affliction [and] killing his monotony and weariness by such means as…narcotics, alcohol and…perverted dark ideas.” (Westerners must be deluded or debased; otherwise there would be no accounting for their support of liberal democracy.) The Islamist remedy was to enforce total obedience to an ideology capable of inspiring people’s imaginations and dictating their every action—an ideology that combined the tactics of fascism and communism with faith in the Quran as a guide to good government. “Religion and politics are one and the same in Islam,” stated Gulam Sarwar, whose books have influenced madrasa students in Britain and elsewhere. “Just as Islam teaches us how to pray, fast, pay charity and perform the Haj, it also teaches us how to run a state, form a government, elect councilors and members of parliament, make treaties and conduct business and commerce.” Once Quranic law was imposed by all-powerful governments, it would be possible to resurrect the Caliphate.
To this end idealistic students were recruited and organized into cells. In his book The Islamist, Ed Husain recounts how he became beguiled by radical Islamist literature while studying medicine in London. When his father, a devout Muslim, saw in his room a quotation from Hasan al-Banna, the founder of the Muslim Brotherhood—“The Quran is Our Constitution; Jihad is Our Way; Martyrdom is Our Desire”—he reacted with apprehension:
My son, the Prophet is not our leader, he is our master, the source of our spiritual nourishment. Leaders are for political movements, which Islam is not. The Quran is his articulation, as inspired by God, not a political document. It is not a constitution, but guidance and serenity for the believing heart. How can you believe in these new definitions of everything we hold so dear?
Undeterred, Husain and his fellow Islamist students circulated posters proclaiming, “Islam: The Final Solution.” “We failed to comprehend the totalitarian nature of what we were promoting,” he writes.
“Democracy is haram! Forbidden in Islam,” a cell member scolded Husain. “Don’t you know that? Democracy is a Greek concept, rooted in demos and kratos—people’s rule. In Islam, we don’t rule; Allah rules…. The world today suffers from the malignant cancers of freedom and democracy.” Husain eventually fell in with members of Hizb ut-Tahrir, the political party founded by Taqiuddin al-Nabhani, whose works enthralled Husain until he discovered that they were “based on the writings of Rousseau [who] called for God to legislate, because man is incapable of legislation,” and Hegel, the author of stirring totalitarian edicts such as, “The state is the march of God through the world.”
Like the European totalitarians who preceded them, the Islamists preach an ideology of purification through total mobilization, violent struggle, and death. “Combat is today the individual duty of every Muslim man and woman,” asserts the Abu-Hafs al-Masri Brigades, an affiliate of Al-Qaeda. “History does not write its lines except with blood,” declares Abdullah Azzam, a disciple of Qutb who fought with bin Laden in Afghanistan. “Glory does not build its lofty edifice except with skulls. Honor and respect cannot be established except on a foundation of cripples and corpses.” Posters decorating the walls of Palestinian kindergartens proclaim, “The Children are the Holy Martyrs of Tomorrow.” Nor have such injunctions remained on the level of street marches, or of teenagers blowing themselves up in marketplaces on the promise of eternal salvation. When the Islamist scholar Hassan al-Turabi (B.A., Khartoum University; M.A., London School of Economics; Ph.D., the Sorbonne) seized power in Sudan and instituted sharia and jihad, his reign resulted in the deaths of more than a million Sudanese. The Islamists would bring such campaigns to the world at large: “Islam is a revolutionary doctrine and system that overthrows governments [and] seeks to overturn the whole universal social order,” writes Abul Ala Mawdudi, a Pakistani journalist who promotes Islam as a political ideology. “Islam wants the whole earth and does not content itself with only a part thereof. It wants and requires the entire inhabited world…. It is not satisfied by a piece of land but demands the whole universe [and] does not hesitate to utilize the means of war to implement its goal.”
To justify such ambitions, Islamists fan the flames of resentment over colonialism and the intrusion of Western wealth, popular culture, and technological power into the Middle East. “The problem of modern Islam in a nutshell,” writes Omar Nasiri, a Moroccan who once advocated global jihad, is that “we are totally dependent on the West—for our dishwashers, our clothes, our cars, our education, everything. It is humiliating and every Muslim feels it…. We were the most sophisticated civilization in the world. Now we are backward. We can’t even fight our wars without our enemies’ weapons.” In a widely quoted passage, David Frum and Richard Perle describe the situation this way:
Take a vast area of the earth’s surface, inhabited by people who remember a great history. Enrich them enough that they can afford satellite television and Internet connections, so that they can see what life is like across the Mediterranean or across the Atlantic. Then sentence them to live in choking, miserable, polluted cities ruled by corrupt, incompetent officials. Entangle them in regulations and controls so that nobody can ever make much of a living except by paying off some crooked official. Subordinate them to elites who have suddenly become incalculably wealthy from shady dealings involving petroleum resources that supposedly belong to all. Tax them for the benefit of governments that provide nothing in return except military establishments that lose every war they fight: not roads, not clinics, not clean water, not street lighting. Reduce their living standards year after year for two decades. Deny them any forum or institution—not a parliament, not even a city council—where they may freely discuss their grievances. Kill, jail, corrupt, or drive into exile every political figure, artist, or intellectual who could articulate a modern alternative to bureaucratic tyranny. Neglect, close, or simply fail to create an effective school system—so that the minds of the next generation are formed entirely by clerics whose own minds contain nothing but medieval theology and a smattering of third world nationalist self-pity. Combine all this, and what else would one expect to create but an enraged populace ready to transmute all the frustrations in its frustrating daily life into a fanatical hatred of everything “un-Islamic.”
The Islamist diagnosis is that Muslims, though they rank among the world’s most religiously devout peoples, just aren’t devout enough—that only by bringing back old-time religion plus totalitarian governance will Muslims surpass the West. Somehow they will do this without resorting to science, since, as the London-based Islamist writer Ziauddin Sardar asserts, “Western science is inherently destructive and does not, cannot, fulfill the needs of Muslim societies.” Impatient with compatriots who blanch at their extremism, Islamist terrorists are as content to murder Muslims (the “near enemy”) as they are Christians (the “distant enemy”). “Fighting the near enemy is more important than fighting the distant enemy,” declared the Egyptian Islamist Abd al-Salam Faraj. “In jihad the blood of the Muslims must flow until victory is achieved.” Their actions have matched their words: To date, the majority of their victims have been Muslims. Islamists have murdered Muslim real estate agents, barbers, and ice vendors for selling services that were not available when the Prophet was alive. To discourage the education of women, the Taliban throws acid in the faces of Muslim schoolgirls and blows up their schools. Such conduct is so far removed from Muslim morality that the terrorists, when called upon to explain themselves, soon abandon their religious rhetoric and revert to the language of totalitarian realpolitik. Fouad Ali Saleh, the leader of an Islamist student network that detonated bombs in Paris in the eighties, was confronted at his trial by a man who had been badly burned in one of the attacks. “I am a practicing Muslim,” said the victim. “Did God tell you to bomb babies and pregnant women?” Saleh replied not with a quotation from the Quran, but with an appeal to ethnic grievances worthy of Lenin: “You are an Algerian. Remember what [the French] did to your fathers.”
A liberal-scientific diagnosis of what has gone wrong in the Middle East is that the problem arises from a paucity of science and liberalism.
Regarding science: Arab investment in R&D is only a tenth of the world average and a third of what other developing countries spend. Muslim nations produce 8.5 scientists, engineers and technicians per 1,000 persons—the world average is 40—and contribute less than 2 percent of the world’s scientific literature. When the Turkish-American physicist Taner Edis was asked, “How would you assess the state of scientific knowledge in the Islamic world?” he replied, “Dismal. Right now, if all Muslim scientists working in basic science vanished from the face of the earth, the rest of the scientific community would barely notice.”
Regarding liberalism: Only a quarter of the world’s Muslim-majority nations are electoral democracies, compared to almost three quarters of the non-Muslim nations. No Arab leader has yet lost power in a general election, nor are many likely to expose themselves to any such peril, considering the wealth that customarily comes with the position: Seven of the world’s ten richest heads of state are Arabs. Saudi Arabia has more military fighter jets than trained pilots, presumably because at least a third of the jets’ purchase price is kicked back to the royal family. A public appearance by a Middle Eastern ruler, writes the Tunisian historian Mohammed Talbi, “always triggers thunderous applause. The most zealous vow to sacrifice their blood for him and shriek their undying loyalty until their voices are hoarse and their bodies exhausted…. It is a gripping spectacle, greatly enjoyed by the leader, whose passage through the crowds has the effect of a huge collective brainwashing.” That so many such dictators have enjoyed American support, and that so much of their wealth has come from oil, fosters Islamist condemnations of the West as brutal and untrustworthy.
“The Middle East has become dominated by a totalitarian model that destroyed traditional freedoms and stifled economic growth, while educating generations of Arabs to oppose commerce, pragmatic compromise, and western science,” notes the political scientist Colin Rubenstein. Sexism pervades society, from education (four-fifths of adult women in Bahrain, Jordan, Kuwait, Qatar, and the United Arab Emirates are illiterate) to the law (Iranian law weighs the testimony of female witnesses as half that of males), and even in the terrorist cells, which pay women half what men are paid to strap on a bomb and blow themselves up on a bus or in a marketplace. Young Arabs find their prospects crimped by mediocre schools, high unemployment rates, and repressive social mores that make it difficult to meet, much less romance, members of the opposite sex. Iranians and Saudis who do manage to get together are discouraged by the morals police from holding hands, dancing, or attending mixed-gender theatrical events. The three-thousand-seat King Fahd Cultural Center, built in Riyadh, Saudi Arabia, in 1989 at a cost of $140 million, sat empty for a decade owing to the risks involved in actually staging a play. Small wonder that a majority of Arab youths say they want to emigrate.
Islamism is opposed by most Muslims—in Pakistan, Egypt, Morocco, Indonesia, and many other Islamic nations. Thousands of young Iranians defied their government and took to the streets on the night of 9/11 to hold candlelight vigils for the American victims. Reformers like the Iranian chemist and philosopher of science Abdul Karim Soroush (“science operates in a matrix of freedom of research and adversarial dialogue of ideas, a practice incompatible with political repression”) and the Tunisian philosopher Rachid al-Ghannouchi (“the only legitimacy is the legitimacy of elections”) have continued to speak out even when jailed by the authorities or beaten up by Islamist goons. A majority of Arabs polled say they approve of “American freedom and democracy.”
Can such sentiments be translated into genuinely democratic governments? Doubters argue that Muslims are not ready for democracy or that liberalism is not part of their culture, but such claims are too vague to be disproved and there are factual grounds for hope. Millions of Muslims already live in democratic nations, 200 million of them in Indonesia alone. Economically, a number of Middle Eastern nations are wealthy enough to be promising candidates for democracy, were the wealth better distributed. They include the United Arab Emirates, Qatar, Bahrain, and Iran (which has a pseudodemocratic government kept firmly under Islamist rule). Extremism may be brazen in the Middle East and the voices of moderation muted, dictatorship the rule and liberal governance the exception, the Islamists full of passionate intensity and moderates slow to respond, but those who blame the situation on Islam itself are invited to compare the torpid, muddled reactions of many European and American politicians, preachers, and scholars to the threats posed by communism and fascism just decades ago—or, for that matter, the cynicism with which science, liberalism, and democracy continue to be viewed by many in the West today. If rising income levels tend to foster liberalism, science, and democracy, then to claim that the troubles afflicting the Islamic world are cultural rather than material is to make the rather tortured case that Muslims are somehow unique.
As anguished and confused as Americans may have been over the 9/11 attacks, their responses were on the whole admirably liberal, and reported instances of public hostility toward Muslims were rare. But while the George W. Bush administration rightly drew a clear distinction between Islam and Islamism, it also demonstrated a lamentable insecurity regarding the strength of America’s liberal-democratic institutions. Foreign nationals were kidnapped and tortured while others were held without bail or legal representation in a makeshift prison at the Guantanamo naval base, President Bush going so far as to assert that he had the power to imprison American citizens, without legal counsel, if he deemed them to be a security threat. Such measures were imposed dogmatically, as if it were self-evidently true that liberal democracies are incapable of dealing with terrorism through legal due process. The administration ignored the many foreign-policy and intelligence professionals who warned that, as one put it, they had made the United States look like “a fearful superpower that has relaxed its own standards of openness and the rule of law at home.”
While these steps were being taken by a conservative American administration, progressives in England promoted illiberal campaigns aimed at scotching anti-Islamic “hate speech.” As so often happens when civil liberties are suppressed, the intentions were good but the consequences lamentable. A woman picnicking in London’s Parliament Square was arrested for having the word “Peace” inscribed in icing on a cake, another for reading aloud the names of British soldiers killed in Iraq. A conservative member of the Dutch parliament, Geert Wilders, was turned back at Heathrow Airport to prevent his accepting an invitation to show the House of Lords a film he had made depicting the Quran as a fascist book—the Foreign Office advising that his presence in England might “threaten community harmony and therefore public security.” For one liberal democracy to bar another’s elected representative from showing its own legislature a film that could be viewed by anyone with Internet access was so odd that even many British Muslims were moved to remind their compatriots of the virtues of liberalism. “Freedom of speech should be protected—so long as people do not use this freedom to call for violence against others,” said the Quilliam Foundation, an antiterrorism Muslim organization funded in part by the government, adding that it would have been better to admit Wilders and evaluate his views “through debate and argument.”
Across a wide political spectrum—from conservatives who would abridge the legal rights of suspected terrorists to progressives who would silence politically incorrect opinions about them—liberalism was mistaken for weakness. “It seems we need to fight the battle for the Enlightenment all over again,” lamented the author Salman Rushdie, who opposed hate speech legislation even though he had spent years in hiding after the Ayatollah Khomeini of Iran issued a fatwa in 1989 urging his murder. When Theo van Gogh, who had also made a film critical of Islam, was shot while riding his bicycle and lay dying on an Amsterdam sidewalk, he pleaded with his assassin, “Surely we can talk about this.” The young thug instead slit van Gogh’s throat, kicked him, and walked away. (Later arrested, he was sentenced to life in prison without parole.) What did van Gogh die for, if not free speech? “If liberty means anything at all,” observed George Orwell, “it means the right to tell people what they do not want to hear.”
Many Muslims share with other religious believers a conviction that religion is the sole or at least the most effective defender of morality. It is not. If it were, religious believers ought at the very least to commit fewer serious crimes than do atheists and agnostics, but such is not the case. As many surveys have shown, atheists and agnostics are, if anything, less apt to commit serious crimes—and they maintain this ethical record even though they belong to the most distrusted minority in the modern world. What is called secularism—meaning atheism, agnosticism, or simply having no interest in religious faith—is on the rise in the United States, having jumped from 8 percent of the population in 1990 to 15 percent in 2008. The trend is geographically widespread—secularism is growing in all fifty states—and likely to accelerate. While only 5 percent of Americans born before 1946 describe themselves as nonbelievers, that number more than doubles for those born in the years 1946–1964 and reaches nearly 20 percent for Americans born since 1977. Yet the American violent crime rate remained flat from 1990 to 1993, and has since been declining. Indeed, crime correlates inversely with levels of religious conviction, if it correlates at all. While 15 percent of all Americans identify themselves as having no religious beliefs, the Federal Bureau of Prisons reports that nonbelievers make up only two-tenths of 1 percent of its inmates. (Christians constitute about 80 percent of the American population and 75 percent of its prisoners.) A ten-year study of death-row inmates at Sing Sing found that 91 percent of those executed for murder were Christians, less than a third of 1 percent atheists. Similar anticorrelations between religion and crime are found internationally. Only 20 percent of Europeans say God plays an important part in their lives, as opposed to 60 percent of Americans, but Europe’s crime rates are lower than America’s. Denmark and Sweden rank among the most atheistic nations in the world—up to three quarters of their citizens identify themselves as nonbelievers—yet these Godless souls somehow enjoy admirably low levels of corruption and violent crime while scoring near the top of the international happiness indices.
Religious fundamentalists are often surprised to hear this, just as their forebears were surprised to learn from explorers’ reports that upright Hindus and Buddhists living in faraway lands comported themselves as ethically as did Anglican bishops. But the basis of such confusion disappears when the genesis of morals is examined empirically. The basic tenets of morality, such as prohibitions against murder and incest, are common to most peoples and most religions. This makes sense if the moral precepts evolved over time, socially and perhaps biologically, because they promoted human survival—as they obviously do—and are reflected in religious texts rather than having been handed down from heaven. If morality evolved, rather than having been independently invented by thousands of gods, people should behave at least as ethically without religion as with it—as, evidently, they do.
Neither liberalism nor science need quarrel with religion. Liberals defend religious rights and support the separation of church and state—a stance that benefits churches as well as states, as many wise religious leaders have pointed out. President Jefferson’s 1802 letter advocating “a wall of separation between church and state” was written at the instigation of the famous Baptist minister John Leland, who presented Jefferson with a 1,235-pound wheel of cheese to express his gratitude for Jefferson’s keeping religion safe from politics. American Baptists heeded Leland’s doctrine for almost two centuries thereafter, only to repudiate it in 1998 in hopes of gaining political leverage—a step that awakened widespread opposition to the Christian right and sparked a spate of widely read books assailing religion. Liberals caution the faithful that seeking political privileges for their particular church is unlikely to benefit it or any other religious group, in the United States or any other liberal democracy.
Scientists originally were as religious as the rest of the population, but the scientific process and the knowledge it obtains are so different from religious practices and doctrines that it is becoming increasingly difficult, as science progresses, to accommodate both within a single worldview. Religions value faith but scientists have found, often to their own embarrassment, that having faith in an idea has no bearing on whether the experimental evidence will verify it. (Nobody asked for, much less prayed for, irrational numbers or quantum nonlocality, but they became part of science anyway; nature is as it is, regardless of what we wish for.) Scientific theories stand or fall on their ability to make accurate predictions; religions have such a poor record in this regard that to champion divine prophecy is to risk being thought supercilious or deranged. Religions account for natural phenomena by positing the existence of an invisible and miraculously complex agency, science by sticking to discernable phenomena that are simpler than what they seek to explain. In that sense, God is literally the last thing a scientist should look for when studying nature.
Global warming, discovered by various scientists over more than a century of research, has opened an arena of contention among dogmatists, discoverers, and everyone in between that is likely to be with us for many years to come. The phenomenon may be threatening but the ongoing matter of dealing with it, involving as it must all the peoples of the world and raising scientific and ethical questions that resound down through future generations, could have a healthy effect on global discourse. Extremists on the left see global warming as a condemnation of capitalism and globalization and an invitation to impose stricter state controls. Their counterparts on the right dismiss it as a hoax or an exaggeration and express a touching faith that God or nature will see us through—the same God or nature that, in spasms like the Permian mass extinction of 248 million years ago, has repeatedly and unapologetically killed off most living species. Clearly, dogma won’t help. Global warming will require ongoing scientific investigation, quantitative analysis, and open, liberal discussion and debate on a worldwide scale.
Its essentials are not terribly complicated. The earth’s atmosphere is thin—thinner, relative to the planet’s diameter, than the layer of moisture covering one’s eyes. Through this membrane passes sunlight, which warms the earth’s surface. Much of the resulting heat is radiated back into the atmosphere as invisible, long-wavelength (“infrared”) light. Nitrogen and oxygen comprise 99 percent of the atmosphere. These gases are transparent to infrared light, so if they were the whole story the heat would simply be radiated back into space, in which case the earth would be considerably colder than it is. But the remaining 1 percent of the atmosphere includes several “greenhouse” gases—notably water vapor, carbon dioxide (CO2), and methane—that absorb the infrared light rather than letting it escape. They act like the closed windows of a parked car, trapping heat and warming the planet. Since the dawn of the Industrial Revolution, humans have been pumping carbon dioxide and other greenhouse gases into the atmosphere at an accelerating rate, through such activities as burning coal in power plants and gasoline in automobiles. It is estimated that nearly a third of the CO2 in the atmosphere today is anthropogenic.
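How much colder? The standard back-of-the-envelope estimate from physics textbooks (the numbers below are conventional values, not figures taken from this chapter) balances the sunlight the planet absorbs against the infrared an airless Earth would radiate back to space, and arrives at a surface temperature of roughly eighteen degrees below zero Celsius, some thirty-three degrees colder than the observed average of about fifteen degrees.

```python
# Zero-dimensional energy-balance sketch, using standard textbook constants.
SOLAR_CONSTANT = 1361.0  # sunlight arriving at Earth's distance, in W/m^2
ALBEDO = 0.30            # fraction of that sunlight reflected straight back to space
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4     # absorbed power averaged over the sphere
t_no_greenhouse = (absorbed / SIGMA) ** 0.25     # temperature needed to radiate it all away

print(f"Without greenhouse gases: {t_no_greenhouse - 273.15:.0f} degrees C")
print("Observed global average surface temperature: about 15 degrees C")
# The gap of roughly 33 degrees is the warming supplied by water vapor, CO2, and methane.
```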
Meanwhile the earth’s average temperature has been slowly rising, up about 0.74 degrees Celsius from 1905 to 2005. Some of the warming may be due to other causes, such as fluctuations in the amount of energy released by the sun or by the climatic cycles that, over the eons, have plunged the earth into a series of ice ages. Conceivably, all the warming might be of nonhuman origin, but the present trend is historically unusual—the world today evidently is hotter than it has been for many centuries—and anthropogenic greenhouse gases constitute the most plausible explanation for it.
If temperatures continue to rise, the results could be extremely unpleasant and dauntingly expensive to reverse. An increase of only one to two degrees Celsius over preindustrial levels could drive a third of the world’s living species to extinction. An increase of two to three degrees could sharply reduce the world’s supplies of drinking water and acidify the oceans, with unknown consequences for fish stocks and other marine ecosystems. A three- to four-degree increase could melt the Antarctic and Greenland ice caps, raising sea levels and driving hundreds of millions of people from their homes. (Sea levels have risen nearly a foot already, impacting coastal real estate markets and obliging the 11,000 citizens of low-lying Tuvalu, a tiny Pacific nation where the intruding ocean is already contaminating drinking water and destroying farmland, to apply to New Zealand for permission to emigrate there en masse.) At higher levels the threat arises of runaway warming, in which greenhouse gases currently locked into permafrost are released in such quantities that global temperatures spiral upward, utterly out of control.
On which point it may be useful to contemplate Venus, the brightest planet in the sky. Venus is virtually Earth’s twin—the two planets have nearly the same diameter and mass—but while much of the earth’s carbon is bound up in its oceans and plants and in fossil fuels like coal, oil, and natural gas, the carbon on Venus resides in its atmosphere. The surface temperature of Venus is 457 degrees Celsius, hot enough to melt lead. Should the earth be pushed into runaway greenhouse warming, it might wind up resembling the Venus of today.
That, in essence, is the problem. The current rise in both anthropogenic greenhouse gases and global temperatures might be a coincidence, but how much do you want to bet on it? The cost of bringing global warming under control is about 1 percent of global GDP, estimates the economist Nicholas Stern, whereas the cost of inaction, should global warming continue, “will be equivalent to losing at least 5 percent of global GDP each year, now and forever.”
You can dispute Stern’s figures, but unless you can demonstrate that a significantly hotter world would be a boon for humanity and that there is no reason to worry about a runaway greenhouse effect—a very long row to hoe—you come up against questions of what to do about it, now or later. Those who doubt that human activity has anything to do with global warming often point out that scientists have made mistakes in the past (which they certainly have) and that their models of global climate are inadequate (which they may be). But nothing can be accomplished if those involved are required to know everything about the relevant systems and to predict their every possible future. If you examine a pond where fish are dying and discover a pipe through which raw carbolic acid is gushing into the water, you are not obliged to ascertain every specific of the pond’s ecosystem before plugging the pipe to see whether that allows the fish population to recover. Just such empirical approaches have succeeded in mitigating previous air-pollution problems: Urban smog levels were reduced by fitting automobiles with catalytic converters, acid rain diminished by fitting sulfur-reducing scrubbers on the smokestacks of coal-burning power plants, and ozone depletion addressed by banning chlorofluorocarbons, which had been employed in refrigeration systems. Had the industrialized nations instead waited until their scientists could accurately model the atmosphere in every detail, people today would be choking on auto emissions while forests and lakes were dying from acid rain and sterilizing ultraviolet sunlight poured through an ever-widening ozone hole.
Global warming was discovered in a haphazard manner that illustrates the benefits of having an open scientific community where all sorts of individuals are free to pursue their interests, report their findings, and make themselves heard. In the eighteenth century a wealthy Swiss amateur physicist named Horace-Bénédict de Saussure—a pioneering alpinist who climbed with barometers, thermometers, and hygrometers in his kit, paying close attention to the play of sunlight at altitude and the way different materials reflected or absorbed it—noted that a black box in direct sunlight got warmer when a thin pane of glass was placed on top of it. The reason, a scientist would say today, is that the glass, like the atmosphere, is opaque to infrared light—but that much would not be understood until William Herschel discovered infrared light decades later.
Meanwhile the French scientist Joseph Fourier realized that Saussure’s result might explain why the earth is so warm. Fourier had already calculated that the earth would be cold enough to freeze the oceans unless some unknown mechanism was retaining part of the sun’s heat. Now he theorized that the atmosphere was responsible—that it acted like the sheet of glass atop Saussure’s black box. The question of just how this worked remained for John Tyndall, in England, to investigate in 1859. Measuring how readily infrared radiation passed through various atmospheric gases, Tyndall established that methane and carbon dioxide are opaque in the infrared and so retain solar heat. With that it became apparent that the greenhouse effect keeps the earth warm.
At first global warming seemed to be strictly of scientific interest, although two worrisome possibilities had appeared in the scientific literature by the end of the nineteenth century. The first was the prospect of climatic feedback loops, identified by the Swedish physicist Svante Arrhenius. People had been thinking of the climate as a linear system—pump in some additional greenhouse gas and you get back a directly proportional amount of global warming—but Arrhenius realized that one such change might amplify others. Suppose the amount of carbon dioxide in the atmosphere increases, warming the air. Warm air holds more water vapor than cold air does, and water vapor is a major greenhouse gas, so a slight initial increase in one greenhouse gas can spur the increase of another. The second possibility—that human activity itself might warm the world—emerged when Arrhenius, intrigued, consulted a colleague, Arvid Högbom, who had been studying how carbon dioxide erupts from volcanoes, circulates in the air, and is absorbed by trees and seas. Högbom had considered that coal burning in factories might contribute to the greenhouse effect, but anthropogenic emissions were still negligible in the 1890s, so few paid much attention. By 1908, though, the rate of coal burning in the industrialized world was rising rapidly, and Arrhenius speculated in print that industrialization might eventually contribute enough CO2 to cause significant global warming. The problem was thought to be remote—nobody anticipated that the human population would quadruple by the end of the twentieth century or that its per capita energy use would also quadruple, resulting in a sixteenfold increase in CO2 emissions—and anyway, as the historian of science Spencer Weart remarks, “Warming seemed like a good thing in chilly Sweden.”
The first study connecting the rise in atmospheric greenhouse gases with rising global temperature was presented in the 1930s by an English steam-power engineer and amateur meteorologist named Guy Stewart Callendar, who by checking old weather reports confirmed rumors that average temperatures were increasing. He then compiled measurements of atmospheric CO2 and found that it, too, was increasing. Addressing the Royal Meteorological Society in London in 1938, Callendar suggested that the two trends were related—that industrial dumping of carbon dioxide into the air contributed to the warming trend. “We owe much to Callendar’s courage,” writes Weart. “His claims rescued the idea of global warming from obscurity and thrust it into the marketplace of scientific ideas.” It helped that more than a few scientists were willing to consider evidence presented by a rawboned engineer for whom scientific research was only a hobby.
In 1956, Gilbert Plass of the U.S. Office of Naval Research ran some calculations on an early digital computer and predicted that the ongoing release of greenhouse gases by human activity would raise earth’s average temperature by 1.1 degrees Celsius per century. His work attracted the attention of the Caltech chemist Charles David Keeling, who established carbon dioxide monitoring stations in California, Antarctica, and atop Mauna Loa in Hawaii. Despite a break in data gathering imposed when his funding ran out, Keeling’s results clearly showed a substantial rise in global CO2 levels, from an average of around 315 parts per million (ppm) in the late 1950s to over 370 ppm by 2002. (Updated estimates find levels of 280 ppm in 1850 and 380 ppm in 2008.) Moreover, the rate was accelerating as industrialization spread: The world in 2008 produced three times as much CO2 as in 1957. Obviously this could not continue indefinitely—project the current rate of rise far enough into the future and you get Venus—but too little was understood about the atmosphere to establish how much CO2 it could absorb safely. “The atmosphere is not a dump of unlimited capacity,” warned the National Academy of Sciences in 1966, but, “We do not yet know what the atmosphere’s capacity is.”
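To make the phrase “project the current rate of rise far enough into the future” concrete, here is a minimal illustrative sketch in Python. It assumes nothing beyond the two figures quoted above (roughly 315 ppm in 1958 and 380 ppm in 2008) and a constant exponential growth rate inferred from them; real carbon-cycle projections are far more elaborate, and since the actual rate has been accelerating, this simple extrapolation is if anything conservative.

```python
# Illustrative only: extrapolate atmospheric CO2 from the two figures
# cited in the text (about 315 ppm in 1958, about 380 ppm in 2008),
# assuming a constant exponential growth rate. Real projections rest on
# detailed carbon-cycle and emissions models, not a two-point fit.

ppm_1958, ppm_2008 = 315.0, 380.0
years_elapsed = 2008 - 1958

# Annual growth rate implied by the two measurements (about 0.4 percent)
rate = (ppm_2008 / ppm_1958) ** (1 / years_elapsed) - 1

def projected_ppm(year):
    """Naive exponential extrapolation forward from the 2008 figure."""
    return ppm_2008 * (1 + rate) ** (year - 2008)

for year in (2050, 2100, 2200):
    print(f"{year}: roughly {projected_ppm(year):.0f} ppm")
```

On these assumptions the curve crosses 500 ppm around 2080 and keeps climbing; the toy model says nothing about how the climate responds, only that the trend, left alone, does not level off.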
As more scientists got into global-warming research, the geological history of global temperature change became better understood. Cesare Emiliani, a geologist who collected core samples from the ocean floor (creating a mathematical formula to determine exactly how many bottles of beer could be stored in the empty tubes, based on the crew’s anticipated consumption rate and the scheduled dates when each tube would be crammed with ancient mud and sand), compiled a temperature record dating back three hundred thousand years. Emiliani found that there have been dozens of ice ages—not just four, as had been thought—and that their chronology fits the theory that ice ages result from slow, cyclical variations in the earth’s orbit and the precession of its axis. By the 1980s, sufficient core-sample data were on hand to show that global carbon dioxide levels historically have been low during the ice ages and high when the earth was warm. The other greenhouse gases appear to have behaved similarly. Methane, for instance, is mostly locked up in peat bogs and permafrost, which together hold more greenhouse gases than all human activities have yet released, but it could escape should the world warm sufficiently. Molecule for molecule, methane is twenty times as potent as carbon dioxide in fueling the greenhouse effect. The permafrost is already thawing and the world’s wetlands warming, with studies indicating that atmospheric methane levels are increasing by more than 10 percent per decade.
Each ice age typically was preceded by a period of global warming. One way to visualize how this can happen is by considering the phenomenon known as glacial surge. Giant glaciers like those in Antarctica, Greenland, and Alaska date back to the last ice age, when New York and Chicago were buried under a mile of ice. Glaciers normally move with proverbial slowness but occasionally slip, accelerating to velocities as high as three hundred meters per day. A feedback loop is thought to be responsible: Water under the glacier melts sufficiently to create a slippery surface, the glacier speeds up, and friction from the faster movement melts still more ice, so the glacier accelerates further. In recent decades, geologists have found that the mechanisms holding glaciers in place can be so delicate that a small rise in global temperatures suffices to release massive amounts of glacial ice: The glaciers could slide off the landmasses that support them and splash into the sea. They contain enough water to raise global sea levels by several meters—enough to flood New York and many other cities, much of Florida and Bangladesh, and the entirety of inhabited atolls like Tuvalu. Geologists caution that such a catastrophe might be followed not by further warming but by the onset of a new ice age. The reason is that the collapsing glaciers would create huge icebergs, which, combined with the ice remaining on Greenland and Antarctica, would make the earth a much whiter planet, one that reflects much more sunlight than before and consequently cools down rather rapidly. That is why global warming is sometimes called global climate change, to emphasize that it can bring cold as well as heat.
By 1981, enough scientists were convinced that increased levels of greenhouse gases could set off rapid climate change to put the story on the front page of the New York Times. Public response was torpid until the summer of 1988 brought heat waves not experienced in the United States since the Dust Bowl of the 1930s; four in five Americans thereafter had at least heard of the greenhouse effect. A turning point came in 1990, when 170 scientists of the Intergovernmental Panel on Climate Change (IPCC) released the first in a series of reports confirming that the world was getting warmer and stating that human activities were at least partly responsible. “Warming of the climate system is unequivocal,” warned the IPCC in 2007, adding that global sea levels were rising and northern-hemisphere snow cover shrinking, that anthropogenic greenhouse gas emissions had increased 70 percent between 1970 and 2004, and that “most of the global average warming over the past 50 years is very likely due to anthropogenic GHG [greenhouse gas] increases.”
The IPCC uses the words very likely to mean “having a probability of over 90 percent.” So when it reports that “most of the observed increase in global average temperatures since the mid-twentieth century is very likely due to the observed increase in anthropogenic GHG concentrations,” it is assigning a probability of more than 90 percent to the theory that anthropogenic greenhouse emissions are warming the planet. Other conclusions assigned that probability in the 2007 report are that “the observed widespread warming of the atmosphere and ocean, together with ice mass loss…is not due to known natural causes alone,” that human activities “contributed to sea level rise during the latter half of the 20th century,” and “that climate change can slow the pace of progress toward sustainable development,” retarding or reversing the recent gains made in alleviating poverty and hunger.
Dogmatists accustomed to the rhetoric of religious or political certitude may regard such quantifications as shilly-shallying, but all predictions are statements of probabilities. The odds are good that the sun will rise tomorrow morning, but they are less than 100 percent, as are the odds that the book you are reading will still be here ten seconds from now. (It is possible, though unlikely, that the sun will explode tonight, or that every atom in the book will simultaneously perform a quantum leap to the asteroid belt.) Lyman Bryson, writing a half century ago about the “moral atmosphere in which [scientists] live and for which its acolytes are trained,” cited as an example of scientific ethics “that no judgments will be given as absolutes.” To dismiss the IPCC conclusions because they are merely probabilistic is to play Russian roulette with a ten-shot revolver containing nine live rounds.
Global warming poses ethical issues of long duration and universal human concern—issues which demand quantitative analysis, and cannot be addressed satisfactorily by invoking the on-off switches (Thou shalt do this, Thou shalt not do that) that characterized the moral precepts of old. With the rise of science and mathematics it became commonplace to bifurcate human thought into two realms—quantitative thinking (employed in science, finance, and the like) and something called qualitative thinking, alleged to be the realm of creative pursuits like painting and poetry. This distinction is probably false—most likely all thought is quantitative, whether we know it or not—but in any event the ethics of global warming demand quantitative analysis on virtually all levels. Rich nations produce far more greenhouse gases per capita than poor nations do—but how much more, weighed against the rich world’s contribution to global productivity and growth? Greenhouse gases put into the atmosphere today will stay up there for decades: Is it ethical to bequeath them to our descendants, and if so to what degree?
Risks are calculated by multiplying the odds that something will happen by the cost if it does. The potential costs of global warming are high, so the risk must be taken seriously even if the odds of its occurring are low. The question of when to address it is also important, but more subtle than is generally realized. All else being equal, it should be cheaper to head off catastrophic global warming now than later; only a slight course correction is required to avoid an iceberg miles ahead, whereas if you wait till the last moment, the energy required to evade the iceberg may exceed the capacity of your ship. But all else is not equal. Our descendants, decades from now, will probably still be dealing with global warming, but they will have the advantages of advanced technology and a stronger economy—or of neither, should climate change badly erode the economy. Any responsible policy has to take these prospects into account.
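In symbols, with numbers invented purely for illustration (they are not drawn from the IPCC or from Stern), the rule just stated is the familiar expected-value calculation:

\[
\text{expected cost} = p \times C,
\qquad\text{e.g.}\qquad
0.10 \times \$100\ \text{trillion} = \$10\ \text{trillion},
\]

so even a modest probability attached to a very large loss yields an expected cost too big to ignore, which is why low odds alone are no excuse for inaction.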
Suppose that within the next few decades the world has taken all the relatively easy steps available to abate global warming—improving energy efficiency, reducing carbon emissions through cap-and-trade or some such approach, and promoting the development of new technologies ranging from solar cells to genetically engineered, carbon-devouring trees—and has now come up against the harder, more costly measures required to bring the problem fully under control. Should more money be spent immediately, or later? At one extreme, all the money in the world could immediately be allocated to the problem, but that would wreck the economy, inflicting more human suffering at once than global warming would have wrought in the long term. At the other extreme, nothing more would be spent, but then the problem might simply get worse. The point is that light-switch ethics—pounding the table and insisting, “We must save the lives of our grandchildren,” or, alternatively, “We must leave free enterprise alone”—are inadequate when dealing with predictions, since predictions are inherently probabilistic. You can ignore the odds, just as you can close your eyes while driving a speeding automobile, but doing the job responsibly means staring at a lot of curving lines on graphs and trying to find the sweet spot. One line predicts the growth of the world economy, which increases the potential ability of our descendants to deal with the problem. Another projects the long-term economic decline that global warming itself could occasion, which weakens our descendants’ spending power, though it may also reduce pollution. A third line plots the “discount rate,” which measures how much less a cost or benefit counts for when it arrives in the future rather than today—and so forth. Responsible stewardship requires dealing in quantities and probabilities, with little recourse to comforting certitudes.
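A toy calculation, with figures invented solely for illustration (they correspond to no actual climate-policy estimate), shows how the discount rate bends the comparison between spending now and spending later:

```python
# Toy "spend now versus spend later" comparison for an abatement project.
# All numbers are invented for illustration; real integrated-assessment
# models, and the choice of discount rate itself, are heavily debated.

def present_value(amount, years_from_now, discount_rate):
    """What a future cost or benefit is worth in today's terms."""
    return amount / (1 + discount_rate) ** years_from_now

SPEND_NOW = 1.0      # trillion dollars spent immediately
SPEND_LATER = 3.0    # trillion dollars for the same fix, 40 years from now
YEARS = 40

for rate in (0.01, 0.03, 0.05):
    later_today = present_value(SPEND_LATER, YEARS, rate)
    verdict = "defer" if later_today < SPEND_NOW else "act now"
    print(f"discount rate {rate:.0%}: deferred cost is worth "
          f"{later_today:.2f} trillion today -> {verdict}")
```

Run with these made-up numbers, the verdict flips from acting now to deferring as the discount rate moves from 1 to 5 percent, which is the uncomfortable point: much of the ethical argument hides inside the choice of that single parameter, not in the arithmetic.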
A quantitative ethics involving intersecting lines on graphs may be less than ideal, but traditional ethics are not ideal, either. One way to evaluate ethical systems is to ask which are invariant, meaning that they are accepted and practiced across virtually all cultures and nationalities.
Religious ethical systems tend to mix invariant prescriptions with others more peculiar to the cultures from which each religion emerged. Six of the Ten Commandments, for example, are invariant—honor your parents; do not commit murder, adultery, or theft; and do not bear false witness or covet your neighbor’s property. The other four are parochial: To observe the Sabbath and to worship only Yahweh, making no graven images of him and never taking his name in vain, are precepts neither honored nor esteemed in all cultures. Nor is invariance characteristic of hundreds of other religious injunctions, such as the Bible’s advice that if parents find their son to be stubborn and rebellious, “All the men of the city shall stone him to death,” or the Quran’s advising women to “stay in your houses and not display your finery.” The Golden Rule—“Do unto others as you would have others do unto you,” or, as Kant put it, “Act only on those maxims that you can at the same time will should become a universal law”—is invariant across the human species, and hence is widely regarded as an objective basis for ethics generally.
Among political philosophies, conservative values of patriotism and honoring military service are embraced by many cultures but can break down over national differences: American conservatives supported the mass demonstrations in which hundreds of thousands of Iranians protested the rigged elections of 2009, but Iranian conservatives did not; American conservatives are anticommunist, but many Chinese conservatives belong to the Communist Party. Progressivism lacks invariance to the extent that leftists in various societies differ in their goals and in their estimations of how best to achieve them: Progressives like to blame social problems on wealthy elitists, yet many wealthy elites are progressive. The one genuinely invariant political philosophy is liberalism. Its prescription of personal freedom and universal human rights works without amendment in all cultures. When people express admiration for American freedom and democratic values, they are admiring not the fact that Americans are patriotic, or that they have Medicare and Social Security, but that thanks to liberal democracy the American electorate has the power to go from a President George W. Bush to a President Barack Hussein Obama—or the other way around.
Science is likewise invariant: Scientific research is practiced in much the same ways everywhere, deriving results that apply throughout the known universe. Therefore it would be surprising if science did not eventually imply and evince invariant ethical standards. Some of these emerging values may strike us as odd—if, for instance, humans are animals, and differ from other animals only by degrees, then shouldn’t the other animals have some rights?—but the teachings of Jesus seemed odd at first, too, and yet have proved lastingly popular. What might a scientific ethics look like? Two promisingly invariant precepts were suggested recently by the geneticist Sydney Brenner, who when asked by a student at the Salk Institute what commandments should govern the behavior of scientists, replied, “To tell the truth,” and, “To stand up for all humanity.”
People everywhere wish for a better world—a more peaceful and prosperous world, where their children can live healthy, happy lives—and they have long sought the right intellectual tools with which to pursue this goal. Religion works best when it emphasizes common decency, philosophy when it stresses our ignorance, art when it exposes us to visions larger than ourselves, history when it draws lessons from the past—but the most effective tools are liberalism and science. They may on occasion lead to harmful results, as may anything else: You can poison a prophet with a Girl Scout cookie. But science and liberalism have an unequaled capacity for doing good—for reducing cruel ignorance and villainous certitude, encouraging freedom and effective government, promoting human rights, putting food in the mouths of the hungry and attainable prospects in their future. If we keep our heads, use our heads, nourish learning, tend the fires of freedom, and treat one another with justice and compassion, our descendants may say of us that we had the vision to do science, and the courage to live by liberty.