2

SURVIVING THE CENTURY


As an astronomer, I sometimes get mistaken for an astrologer—but I cast no horoscopes and have no crystal ball. The past record of scientific forecasters is, in fact, dismal. When Alexander Graham Bell invented the telephone, he enthused that “some day, every town in America will have one.” The great physicist Lord Rutherford averred that nuclear energy was moonshine; Thomas Watson, founder of IBM, said, “I think there is a world market for maybe five computers”; and one of my predecessors as England’s Astronomer Royal said that space travel was “utter bilge.” I won’t add to this inglorious roll call. Instead, let us focus on a key question: How can our scientific capabilities be deployed to ease rather than aggravate the tensions that the world will confront in the coming decades?

Our lives today are molded by three innovations that gestated in the 1950s, but whose pervasive impact certainly wasn’t then foreseen. Indeed, forecasters generally underestimate long-term changes, even when overestimating short-term ones. It was in 1958 that Jack Kilby and Robert Noyce built the first integrated circuit, the precursor of today’s ubiquitous silicon chip—perhaps the most transformative single invention of the last century. It has spawned the worldwide reach of mobile phones and the Internet, promoting economic growth while itself being sparing of energy and resources. In the same decade James Watson and Francis Crick discovered the bedrock mechanism of heredity—the famous double helix. This launched the science of molecular biology, opening profound prospects whose main impact still lies ahead. Ten years ago, the first draft of the human genome was decoded. It was a huge international project, acclaimed by President Clinton and Prime Minister Blair at a special press conference, and the cost was around $3 billion. Today, genome sequencing—the “read out” of our genetic inheritance—is becoming a routine technique that costs only a few thousand dollars.

And there’s a third technology that emerged during this period: space. It’s over fifty years since the launch of Sputnik, an event that led President Kennedy to inaugurate the Apollo program to land men on the Moon. Kennedy’s prime motive was of course superpower rivalry; cynics could deride it as a stunt but it was undoubtedly an extraordinary technical triumph. And Apollo had an inspirational legacy too. Distant images of Earth—its delicate biosphere of clouds, land, and oceans contrasting with the sterile moonscape where the astronauts left their footprints—have, ever since the 1960s, been iconic for environmentalists.

But of course there was always a dark side to space. Rockets were primarily developed in order to carry nuclear weapons, and those weapons were themselves the outcome of the World War II Manhattan Project, which inaugurated the nuclear age and was even more intense and focused than the Apollo program. We lived, throughout the Cold War, under a threat of nuclear catastrophe that could have shattered the fabric of civilization, a threat especially acute at the time of the Cuba crisis in 1962. It wasn’t until he’d long retired that Robert McNamara, then US Secretary of Defense, spoke frankly about the events in which he’d been so deeply implicated. In his confessional documentary film The Fog of War, he said, “We came within a hair’s-breadth of nuclear war without realizing it. It’s no credit to us that we escaped—Khrushchev and Kennedy were lucky as well as wise.” Indeed on several occasions during the Cold War the superpowers could have stumbled toward Armageddon.

The threat of global nuclear annihilation involving tens of thousands of bombs is, thankfully, in abeyance, but this prospect hasn’t gone for good: we can’t rule out, by midcentury, a global political realignment leading to a standoff between new superpowers that could be handled less well or less luckily than was the Cuban missile crisis. And the risk that smaller nuclear arsenals proliferate, and are used in a regional context, is higher than it ever was. Moreover, al-Qaida-style terrorists might someday acquire a nuclear weapon and willingly detonate it in a city, killing tens of thousands along with themselves.

The nuclear age inaugurated an era when humans could threaten the entire Earth’s future. We’ll never be completely rid of the nuclear threat—H-bombs can’t be disinvented—but the twenty-first century confronts us with grave new perils. They may not threaten a sudden worldwide catastrophe, but they are, in aggregate, worrying and challenging. Some will be consequences of new technologies that we can’t yet envisage—any more than Rutherford could have predicted thermonuclear weapons.

World population trends

There’s one trend that we can predict with confidence. There will, by midcentury, be far more people on the Earth than there are today. Fifty years ago, world population was below 3 billion. It has more than doubled since then and reached 7 billion in 2011; the projections for 2050 range between 8.5 and 10 billion, the growth being mainly in the developing world. More than 50 percent of the world’s population now live in cities, and this proportion is growing.

The majority of the world’s people live in countries where fertility has fallen below the replacement level of about 2.1 births per woman. The falls over the last twenty-five years have been dramatic (by 50 percent in Brazil, and more than 70 percent in Iran). The current rates range from 7.1 in Niger to 1.2 in South Korea; the European average is about 1.4. This so-called demographic transition is a consequence of declining infant mortality, availability of contraceptive advice, and women’s education, among other things. However, more than half of those in the developing world who are alive today are less than twenty-five years old, and life expectancy is rising. That’s why a continuing population rise until midcentury seems almost inevitable. If the demographic transition quickly extended to all countries, then the global population could gradually decline after 2050. But numbers are rising fast in India, whose population is projected to overtake China’s by 2030 and could exceed 1.6 billion by 2050. Population projections for Africa are also rising. A hundred years ago, Ethiopia’s population was 5 million. That figure is now about 80 million and is predicted to almost double by 2050. The populations of both Sudan and Uganda are also estimated to more than double by midcentury, putting increasing pressure on the water resources of the Nile basin. In total, estimates suggest that there could be a billion more people in Africa in 2050 than there are today.
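The rough arithmetic behind these figures can be checked with a short sketch. This is illustrative only, assuming simple compound growth at a constant average rate—not a demographic projection:

```python
# Back-of-envelope check of the population figures quoted above,
# assuming constant compound (exponential) growth between the dates.

def implied_annual_rate(start, end, years):
    """Average annual growth rate taking `start` to `end` in `years`."""
    return (end / start) ** (1.0 / years) - 1.0

# World: roughly 3 billion fifty years before 2011's 7 billion.
world_rate = implied_annual_rate(3.0, 7.0, 50)

# Ethiopia: about 5 million a century ago to about 80 million today.
ethiopia_rate = implied_annual_rate(5.0, 80.0, 100)

print(f"World, last fifty years: ~{world_rate:.1%} per year")
print(f"Ethiopia, last century:  ~{ethiopia_rate:.1%} per year")

# At Ethiopia's historical rate, the doubling time is ln(2)/rate,
# roughly 25 years -- consistent with the near-doubling projected
# for 2050, if (and only if) that rate were to persist.
```

The calculation shows why momentum matters: even modest-sounding annual rates compound into doublings within a generation.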

But the trends beyond 2050 will depend on what people now in their teens and twenties decide about the number and spacing of their children. In Africa there are around 200 million women who are denied such a choice. Enhancing the life chances of Africa’s poorest people—by providing clean water, primary education, and other basics—should be a humanitarian imperative. But it would seem also a precondition for achieving throughout that continent the demographic transition that has occurred elsewhere. Failure to achieve this would be a failure of governance—the resources required are modest—and a continental-scale tragedy that would also trigger massive migratory pressures. (An earlier Reith Lecturer, Jeffrey Sachs, pointed out that the resources needed to achieve the UN’s millennium goals, and thereby enhance the lives of the world’s “bottom billion,” were less than those possessed by the thousand hyperwealthy individuals in the world.)

To feed Africa’s people, farming productivity must be enhanced (but without degrading the soil or ecology, as is now happening) using modern agricultural methods, among which genetic modification could be one. And water shortages, perhaps aggravated by climate change, must be contended with. To produce a kilogram of vegetables with present methods takes 2,000 liters of water; a kilogram of beef takes 15,000 liters. But as well as biological advances, modern engineering practices must be adopted to conserve water, reduce food waste, and so on.
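A back-of-envelope sketch makes the water figures concrete. The liters-per-kilogram values are those quoted above; the daily diet quantities are hypothetical, chosen purely for illustration:

```python
# "Virtual water" embodied in a day's food, using the per-kilogram
# figures quoted in the text. Diet quantities below are hypothetical.

WATER_PER_KG = {"vegetables": 2_000, "beef": 15_000}  # liters per kg

def daily_water_liters(diet_kg):
    """Total embodied water for a day's food, given kg of each item."""
    return sum(WATER_PER_KG[item] * kg for item, kg in diet_kg.items())

# Compare a vegetarian day with the same day plus 200 g of beef.
vegetarian = daily_water_liters({"vegetables": 1.0})
with_beef = daily_water_liters({"vegetables": 1.0, "beef": 0.2})

print(f"Vegetarian day:        {vegetarian:,.0f} liters")
print(f"With 200 g of beef:    {with_beef:,.0f} liters")
```

Even a small portion of beef more than doubles the embodied water of the day’s food, which is why diet features so prominently in discussions of carrying capacity.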

It is not possible to specify an optimum population for the world. This is because we can’t confidently conceive what people’s lifestyles, diet, travel patterns, and energy needs will be beyond 2050. The world couldn’t sustain anywhere near its present population if everyone lived like present-day Americans, profligate of energy and resources. On the other hand, more than 10 billion people could live sustainably, with a high quality of life, if all adopted a vegetarian diet, traveling little but interacting via super-Internet and virtual reality (though this particular scenario is plainly not probable, nor, necessarily, attractive). So it’s naive to quote a single unqualified headline figure for the world’s carrying capacity or optimum population.

Two messages, in particular, should surely be proclaimed more widely. First, enhanced education and empowerment of women within this decade—surely a benign priority in itself—would reduce fertility rates in the poorest nations, and could thereby reduce the projected world population beyond 2050 by as much as a billion. Second, the higher the post-2050 population becomes, the greater the pressures on resources—especially if the developing world, where most of the growth will be, narrows its gap with the developed world in its per capita consumption of energy and other resources.

Over 200 years ago, Thomas Malthus famously argued that populations would rise until limited by food shortages. His gloomy prognosis has been forestalled—despite a sevenfold rise in population since his time—by advancing technology and the green revolution. He could one day be tragically vindicated. But this need not happen. There is real potential, in Africa and elsewhere, for sustainably enhancing food production so that the larger and more demanding population expected in 2050 can be fed—but only if current knowledge and techniques are efficiently and appropriately applied.

It is important to highlight global population because the topic seems currently underdiscussed. That is because doom-laden forecasts made in the past have proved off the mark, and because it is deemed by some a taboo subject—tainted by association with eugenics in the 1920s and 1930s, with Indian policies under Indira Gandhi, and, more recently, with China’s effective but hard-line one-child policy.

Responses to potential climate change

Another firm prediction about the post-2050 world is that, as well as being more crowded, it will on average be warmer. Human actions—mainly the burning of fossil fuels—have already raised the carbon dioxide concentration higher than it’s ever been in the last half million years. The graph of carbon dioxide concentration, the so-called Keeling Curve, shows a steady rise over the last fifty years—a mere geological instant—superimposed on the readily detectable annual oscillations caused by the seasonal growth and decay of vegetation in the Northern Hemisphere. These beautiful and precise measurements are entirely uncontroversial. Moreover, according to business-as-usual scenarios where we remain as dependent as today on fossil fuels, the concentration could reach twice the preindustrial level within fifty years, and go on rising. This much is uncontroversial too. Nor is there significant doubt that CO2 is a greenhouse gas, and that the higher its concentration rises, the greater the warming—and, more important still, the greater the chance of triggering something grave and irreversible: rising sea levels due to the melting of Greenland’s ice cap; runaway greenhouse warming due to release of methane in the tundra; and so on.

There is, as mentioned in Chapter 1, still substantial uncertainty about just how sensitive the temperature is to the CO2 level. The climate models can, however, assess the likelihood of a range of temperature rises. It is the high-end tail of the probability distribution that should worry us most: the small probability of a really drastic climatic shift. Climate scientists now aim to refine their calculations, and to address questions like: Where will the flood risks be concentrated? What parts of Africa will suffer severest drought? Where will the worst hurricanes strike? Will extreme weather events become more frequent? The headline figures routinely quoted—2-, 3-, or 5-degree rises in the mean global temperature—might seem too small to fuss about, but two comments should put this into perspective. First, even in the depth of the last ice age the mean temperature was lower by just 5 degrees. Second, it’s important to realize that the rise won’t be uniform: the land warms more than the sea, and high latitudes more than low. Quoting a single figure glosses over shifts in global weather patterns that will be more drastic in some regions than in others (and indeed may cause some regions to cool rather than warm). A mean global rise of 4 degrees could lead to a warming of 10 degrees centigrade (18 degrees Fahrenheit) in western and southern Africa. Indeed the worst effects of any warming may be in Africa and Bangladesh, which have contributed least to the emissions. Rising carbon dioxide could induce relatively sudden flips rather than just gradual changes.

The science of climate change is intricate, but it’s straightforward compared to the economics and politics. The economist Nicholas Stern, in his influential review “The Economics of Climate Change,” written for the British government in 2006, averred that a response to global warming needs “all the economics you ever learnt, and some more. It’s a market failure on a colossal scale.” It poses a unique political challenge for two reasons. First, the effect is nonlocalized: CO2 emissions from the United States have no more effect there than they do in Australia, and vice versa. That means that any credible regime whereby the polluter pays has to be broadly international. Second, there are long time lags; it takes decades for the oceans to adjust to a new equilibrium, and centuries for ice sheets to melt completely. Even though anthropogenic climate change is already perceptible, the main downsides of global warming lie a century or more in the future. Concepts of intergenerational justice then come into play: How should we rate the rights and interests of future generations compared to our own? What discount rate should we apply?

The declared political goal has been to halve global carbon dioxide emissions by 2050. On the basis of the best current modeling, this is the reduction needed in order to bring below 50 percent the probability that the mean temperature will rise by more than 2 degrees. This target corresponds to a ration of 2 tons of carbon dioxide per year for each person on the planet. For comparison, the current US level is 20 tons per person, per year, the European figure is about 10, the Chinese level is already 5.5, and the Indian is 1.5. This target must be achieved without stifling economic growth in the developing world, where the emissions in the short term are bound to rise, so it’s the richer countries that must take the lead in making cuts. (In particular, nothing should take priority over quick action to bring electricity to the poorest, whose meager power supply, from the burning of cow dung or wood, is hazardous to health. Deaths from indoor household fuel pollution, worldwide, run at 1.6 million per year—as many as from malaria. Two billion lives could be transformed without adding more than 1 percent to current emissions.)
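The arithmetic behind the 2-ton ration can be sketched briefly. The per-capita figures are those quoted above; the assumed 2050 population of 9.5 billion is chosen from within the 8.5–10 billion range given earlier:

```python
# Back-of-envelope check of the per-capita carbon "ration" quoted
# above. POP_2050 is an assumption within the range given earlier.

POP_2050 = 9.5e9   # assumed midcentury population
RATION = 2.0       # tons of CO2 per person per year

# Implied global emissions budget, in gigatons per year.
budget_gt = RATION * POP_2050 / 1e9
print(f"Implied global budget: ~{budget_gt:.0f} Gt CO2 per year")

# How deep a cut each region would need from today's per-capita level.
current = {"United States": 20, "Europe": 10, "China": 5.5, "India": 1.5}
for region, tons in current.items():
    cut = 1 - RATION / tons
    if cut > 0:
        print(f"{region}: cut of ~{cut:.0%} from {tons} t/person")
    else:
        print(f"{region}: still below the ration at {tons} t/person")
```

The sketch makes the asymmetry vivid: the ration implies cuts of roughly 90 percent for the United States and 80 percent for Europe, while India currently sits below the ration altogether.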

Success in halving global carbon emissions would be a momentous achievement—one where all nations acted together, in the interests of a future beyond the normal political horizon. The meager progress in Copenhagen in December 2009 led to a pessimism that was only partly allayed by the outcome of the Cancun meeting a year later. On the other hand, odd though this may sound, the political response to the 2009 financial crisis may offer encouragement. Who would have thought three years ago that the world’s financial system would have been so transformed that big banks were nationalized? Likewise, we need coordinated outside-the-box action to avoid serious risk of a long-term energy crisis.

There is, incidentally, at least one precedent for long-term altruism. In discussing the safe disposal of nuclear waste, policy makers talk with a straight face about what might happen more than 10,000 years from now, thereby implicitly applying zero discount to the cost of future hazards or benefits. To concern ourselves with such a remote “posthuman” era might seem bizarre, but all of us can surely empathize at least a century ahead. Especially in Europe, we’re mindful of the heritage we owe to centuries past, and history will judge us harshly if we discount too heavily what might happen when our grandchildren grow old.

The European Union (EU) is pursuing progressive policies aimed at reducing the carbon emissions of its member states. Yet whatever the EU and the rest of the world do, if the United States and China do not alter their current policies there is little or no hope of meeting the targets: these two giants between them contribute over 40 percent of total global emissions. Despite the pledge made by President Obama at his inauguration that containing climate change was a priority goal, political paralysis seemingly prevails at the federal level, though there are positive initiatives at the level of local communities, third-sector organizations, cities, and states. China’s leaders (many of whom are trained engineers) seem aware of their nation’s vulnerability to climate change and are investing in renewable technologies and nuclear power on a substantial scale. They emphasize the need to improve energy efficiency, but the rapid economic growth (and dependence on coal-fired power stations) is generating emissions that are still increasing year by year, albeit more slowly than the GNP. Whatever targets are set, they won’t be met without a transition to a lifestyle dependent on clean and efficient energy technology. Enlightened business leaders recognize that the actions needed to abate the threat of climate change will create manifold new economic opportunities.

The world spends more than $5 trillion a year on energy and its infrastructure. But currently far too little is invested in developing techniques for economizing on energy, storing it, and generating it by low-carbon methods. Certainly the major utilities are spending very little: according to the journalist Tom Friedman, US energy utilities spend less on R&D than the American pet-food industry does. The main US investments are in small start-up companies, especially in solar energy. (In Britain, meanwhile, R&D within the energy industries is now expanding, but has yet to fully attain the pre-privatization level of the late 1980s.) There’s a glaring contrast here with the fields of health and medicine, where the worldwide R&D expenditures are disproportionately higher. The clean energy challenge deserves a priority and commitment akin to the Manhattan Project or the Apollo Moon-landing. Indeed it would be hard to think of anything more likely to enthuse young people toward careers in engineering than a firmly proclaimed priority to develop clean energy for the developing and the developed world.

My own country, Britain, is so small that its stance may seem of marginal import: our carbon emissions constitute only 1 or 2 percent of the problem. But we have tried to exert leverage in two respects. We gained influence because of Tony Blair’s efforts at the Gleneagles G8 Summit, which he hosted in 2005, and have already enshrined in the Climate Change Act a commitment to cut our own emissions by 80 percent over the next forty years. It’s important to give credit to several politicians, of all parties, who worked hard to keep these issues high on the agenda even though long-term altruism is plainly not a vote-winner; the Conservative-led coalition currently in power has not yet backtracked.

The United States and Europe should both prioritize the technologies needed for a low-carbon economy. There’s a need to ensure our own energy, but, beyond that imperative, it’s in our interest not to fall behind the Chinese in developing clean energy technologies for which the demand will be worldwide.

Wave or tidal energy is a niche market, but it’s one where Britain has a competitive advantage. This island nation has the geography—capes round its coast with fast-flowing tidal currents—and also expertise in marine technology spun off from North Sea oil and gas projects. There is a long-studied scheme to get 7 percent of our electric power by tapping the exceptional (40-foot) tidal range in the estuary of the River Severn by a 10-mile-long barrage.

What about biofuels? There’s been ambivalence about them because they compete for land use with food-growing and forests, but in the long run GM techniques may lead to novel developments: bugs that break down cellulose, or marine algae that convert solar energy directly into fuel. Another need is for improved energy storage. Steven Chu, the Nobel Prize–winning physicist whom President Obama appointed as Energy Secretary, has given priority to improving batteries—for electric cars, and to complement unsteady power sources such as sun and wind.

As to the role of nuclear power, I myself would favor Britain and the United States having at least a replacement generation of power stations. But risks can be catastrophic and the nuclear nonproliferation regime is fragile. One can’t be relaxed about a worldwide program of nuclear power unless internationally regulated fuel banks are established to provide enriched uranium and remove and store the waste—and unless there is a strictly enforced safety code to guard against risks analogous to those from poorly maintained “third-world airlines.” Despite the ambivalence about widespread nuclear energy, it’s surely worthwhile to boost R&D into fourth-generation reactors, which could be more flexible in size, and safer. The industry has been relatively dormant for the last twenty years, and current designs date back to the 1960s.

And of course nuclear fusion, the process that powers the Sun, still beckons as an inexhaustible source of energy. Attempts to harness this power have been pursued ever since the 1950s, but the history here is of receding horizons: commercial fusion power is still at least thirty years away. The challenge is to use magnetic forces to confine gas at a temperature of millions of degrees—as hot as the center of the Sun. Despite its cost, the potential payoff is so great that it is surely worth continuing to develop experiments and prototypes. The largest such effort is the International Thermonuclear Experimental Reactor, internationally funded and based in France; similar projects are being pursued in Korea and elsewhere. (These all involve magnetic confinement of ultrahot gas. An alternative concept, whereby tiny deuterium pellets are imploded and heated by converging beams from immense lasers, is being pursued at the Livermore Laboratory in California, but this seems primarily a defense project to provide lab-scale substitutes for H-bomb tests, where the promise of controlled fusion power is a political fig leaf.)

An attractive long-term option is solar energy—huge collectors in the desert could generate power that’s distributed via a continent-wide smart grid. Achieving this would require vision, commitment, and public-private investment on the same scale as the building of the railways in the nineteenth century. Indeed, any transformation of the world’s energy and transport infrastructure is a massive project that would take several decades. (And to match the United States and China the EU needs a more coordinated and less labyrinthine organizational structure. Henry Kissinger once famously complained, “When I want to speak to Europe, whom do I call?” Pan-European decisions are needed on more and more issues.) Even those who favor renewable energy in principle often oppose specific proposals because of their environmental or aesthetic impact. But we need to remember that all the renewables—wind, tidal, solar, or biofuels—have downsides, so some such impact is unavoidable. So of course does nuclear power, though the aggregate depends on how much energy per capita is needed for the lifestyle that we want. And of course alternative energy sources would supersede coal mines and oil rigs, of whose environmental and human risks we are all too aware already.

Many of us still hope that our civilization can segue toward a low-carbon future and a lower population, and can achieve this transition without trauma and disaster. But that will need determined action by governments, urgently implemented; and such urgency won’t be achieved unless sustained campaigning can transform public attitudes and lifestyles. Of course, no politician will gain much resonance by advocating a bare-bones approach that entails unwelcome lifestyle changes. The priority for all developed countries should be to implement measures that actually save money—by using energy more efficiently, insulating buildings better—and to incentivize new clean technologies so that, as fossil fuel prices rise, a transition to clean energy is less costly. But what is very important is to prioritize the development of those new energy sources, be they wind, tides, solar, or nuclear. There may be need for a new international body, along the lines of the World Health Organization or the International Atomic Energy Agency, to monitor fossil fuel use and facilitate the transition to clean energy.

The strongest motive for urgent action is that the worst-case climatic scenarios are so severe. Even within fifty years, we could face serious climatic stresses if effective mitigation is too long postponed and the warming rate is at the upper end of the predicted range.

In twenty years, we will know—from firmer science, improved computer modeling, and also from a longer time span of data on actual climatic trends—whether the feedback from water vapor and clouds strongly amplifies the effect of CO2 itself in creating a greenhouse effect. If so, and if the world consequently seems on a rapidly warming trajectory because international efforts to reduce emissions haven’t been successful, there may be a pressure for panic measures. These would have to involve a plan B—being fatalistic about continuing dependence on fossil fuels, but combating its effects by some form of geoengineering. One option is to counteract the greenhouse warming by, for instance, putting reflecting aerosols in the upper atmosphere or even vast sunshades in space. The political problems of such geoengineering may be overwhelming. Not all nations would want to turn down the thermostat equally, and there could be unintended side effects. Moreover, the warming would return with a vengeance if the countermeasures were ever discontinued; and other consequences of rising CO2 (especially the deleterious effects of ocean acidification) would be unchecked. An alternative strategy, which currently seems less practicable, would involve direct extraction of carbon from the atmosphere, either by deploying, on vast scales, the principles used by scrubbers to purify air in submarines, or else by growing trees and “fixing” the carbon they absorb as charcoal. This approach would be politically more acceptable: we’d essentially just be undoing the unwitting geoengineering we’ve done by burning fossil fuels.

It seems prudent at least to study geoengineering, to clarify which options make sense and perhaps counter undue optimism about a technical quick fix of our climate. However, it already seems clear that it would be feasible and affordable to throw enough material into the stratosphere to change the world’s climate—indeed what is scary is that this capacity might be within the resources of a single nation, or even a corporation or plutocratic individual. Very elaborate climatic modeling would be needed in order to calculate the regional impacts of such an intervention. That is why it is crucial to sort out the complex governance issues raised by geoengineering and to do this well before there is any chance that urgent pressures for action will build up.

Other vulnerabilities

Energy security, food supplies, and climate change are the prime long-term “threats without enemies” that confront us, all aggravated by rising populations. But there are others. For instance, rapid changes in land use may jeopardize whole ecosystems. There have been five great extinctions in the geological past; human actions are causing a sixth. The extinction rate is a hundred, even a thousand, times higher than normal. We are destroying the book of life before we have read it. To quote E. O. Wilson, one of the world’s most distinguished ecologists, certainly its most eloquent: “At the heart of the environmentalist world view is the conviction that human physical and spiritual health depends on the planet Earth . . . Natural ecosystems—forests, coral reefs, marine blue waters—maintain the world as we would wish it to be maintained. Our body and our mind evolved to live in this particular planetary environment and no other.”

Biodiversity is often proclaimed as a crucial component of human well-being and economic growth. It manifestly is: we’re clearly harmed if fish stocks dwindle to extinction; there are plants in the rain forest whose gene pool might be useful to us. But for Wilson these instrumental—and anthropocentric—arguments aren’t the only compelling ones. For him, preserving the richness of our biosphere has value in its own right, over and above what it means to us humans.

So far my focus has been on threats that we’re collectively imposing on the biosphere. But it is important also to highlight a new type of vulnerability that could stem from empowerment of individuals or small groups by fast-developing technologies.

Almost all innovations entail new risks. Most surgical procedures, even if now routine, were often fatal when pioneered; and, in the early days of steam, people died when poorly designed boilers exploded. But something has changed. Hitherto, most of the risks caused by humans have been localized and limited. If a boiler explodes it is horrible, but there is a limit to just how horrible. But there are new hazards whose consequences could be so widespread that even a tiny probability is disquieting. Global society is precariously dependent on elaborate networks—electricity grids, air traffic control, international finance, just-in-time delivery, and so forth. It’s crucial to ensure maximal resilience of all such systems, and maximum security against sabotage, at a time when concern about cyber attack, by criminals or by hostile nations, is rising sharply. Otherwise the manifest benefits of those systems could be outweighed by catastrophic (albeit rare) breakdowns cascading through them.

There are other potential vulnerabilities. It’s becoming feasible, for instance, to stitch together long strands of DNA, and thereby construct from scratch the blueprint of an organism. The potential for medicine and agriculture is huge, but there are risks too. Already the genomes for some viruses—polio, Spanish flu, and SARS—have been synthesized. Expertise in such techniques will become widespread, posing a manifest risk of bioerror or bioterror. In the early days of recombinant DNA (gene splicing) in the 1970s, there was concern about unintended consequences, and a moratorium was imposed after a conference of experts held at Asilomar, California. This moratorium soon came to seem unduly cautious, but that doesn’t mean that it was unwise at the time, since the level of risk was then genuinely uncertain. It showed that an international group of leading scientists could agree to a self-denying ordinance, and influence the research community powerfully enough to ensure that it was implemented. There have recently been moves to control the still more powerful techniques of synthetic biology that can create organisms from the ground up by synthesizing a new genome bit by bit. A voluntary consensus is harder to achieve today: the academic community is far larger, and competition (enhanced by commercial pressures) is more intense.

We’re kidding ourselves if we think that those with technical expertise will all be balanced and rational: expertise can be allied with fanaticism—not just the types of fundamentalism with which we are currently familiar, but that exemplified by some New Age cults, extreme eco-freaks, violent animal rights campaigners, and the like. And there will be individual weirdos, with the mindset of those who now unleash computer viruses—the mindset of arsonists. The global village will have its village idiots. In a future era of vast individual empowerment, where even one malign or foolish act could be too many, how can our open society be safeguarded? Perhaps our society will make a shift toward more intrusion and less privacy. (Indeed, the rash abandon with which people put their intimate details on Facebook, and our acquiescence in ubiquitous CCTV, suggests that such a shift would meet surprisingly little resistance.) Or will there be pressures to constrain diversity and individualism? These may become serious issues.

Some years ago I wrote a short book on the theme of this chapter, entitled Our Final Century? My British publishers deleted the question mark; the American publishers changed the title to Our Final Hour (the US public seeks instant [dis]gratification). I conjectured that, taking all risks into account, there was only a 50 percent chance that we would get through to 2100 without a disastrous setback. This seemed a depressing conclusion. However, I have been surprised by how many of my colleagues thought a catastrophe was even more likely than I did, and so considered me an optimist. I am actually an optimist—at least a techno-optimist. Over the last decade, we’ve experienced astonishing advances in communication and in access to information. Our lives have been hugely enriched by consumer electronics and web-based services that we would willingly pay far more for, and which surpass any expectations we had a decade ago. And the impact on the developing world has been dramatic: there are more mobile phones than toilets in India. Mobile phones have penetrated Africa too, offering rural farmers access to market information that prevents them from being ripped off by traders, and enabling money transfers. There is now great demand for low-power solar generators in order to charge them up. (In his 2010 BBC series A History of the World in 100 Objects, Neil MacGregor, Director of the British Museum, chose a solar-powered lamp and charger as the 100th object—emblematic of the transformational power of optimally applied science in impoverished regions.) Broadband Internet, soon to achieve worldwide reach, should further stimulate education and the adoption of modern health care, agriculture, and technology.

Recent history augurs well for the transformational power of optimally applied science; and we can hope that health and agriculture surge ahead likewise. There seems no scientific impediment to achieving a sustainable world beyond 2050, in which the developing countries have narrowed the gap with the developed, and all benefit from further advances that could have as great and benign an impact as information technology has had in the last decade. But the intractable politics and sociology—the gap between potentialities and what actually happens—engender pessimism. Will richer countries recognize that it’s in their self-interest for the developing world to prosper, sharing fully in the benefits of globalization? Can nations sustain effective but nonrepressive governance in the face of threats from small groups with high-tech expertise? Can the focus of our sympathies become more broadly international? And, above all, can our institutions prioritize projects that are long-term in a political perspective, even if a mere instant in the history of our planet?

A cosmic perspective

I’ll conclude with a cosmic vignette. Suppose some aliens had been watching our planet from afar for its entire history. What would they have seen? Over nearly all that immense time, 45 million centuries, Earth’s appearance would have altered very gradually. Continents drifted; the ice cover waxed and waned; successive species emerged, evolved, and became extinct. But in just a tiny sliver of the Earth’s history—the last one-millionth part—patterns of vegetation altered at an accelerating rate. This signaled the advent of agriculture and the growing impact of humans.

Then, in just one century, came other changes. The amount of carbon dioxide in the air began to rise anomalously fast. The planet became an intense emitter of radio waves (the output from TV, cell phone, and radar transmissions). And something else unprecedented happened: small projectiles launched from the planet’s surface escaped the biosphere completely. Some were propelled into orbits around the Earth; some journeyed to the Moon and planets. If they understood astrophysics, the aliens could predict that the biosphere would face doom in a few billion years when the Sun flares up and dies. But could they have predicted this sudden “fever” less than halfway through the Earth’s life?

And if they continued to keep watch, what might these hypothetical aliens witness in the next hundred years—in this unique century? Will a final spasm be followed by silence? Or will the planet itself stabilize? And will some of the objects launched from the Earth spawn new oases of life elsewhere? The answer depends on how the challenges I’ve addressed can be met. Twenty-first-century science, if optimally applied, could offer immense benefits to the developing and the developed worlds—but it will present new threats. To confront these successfully—and to avoid foreclosing humanity’s long-term potential—is the political and social challenge for the coming decades.