“You cannot do business on a dead planet.”
– David Brower
WE GREW TOO BIG TOO FAST, and now we’re running up against intractable physical limits. We were warned, and we paid no attention. In 1972, a few young scientists, computer modelers mostly, used a new program called World3 and a skeptical intelligence to write the first iteration of The Limits to Growth. Their aim, and that of their sponsors, a global think-tank called the Club of Rome, was to use systems analysis to examine the implications of unchecked growth on a small, closed-ecology planet like Earth. The book predicted, among other things, that economic growth could not continue indefinitely because of the limited availability of natural resources, particularly oil.
The lead author of The Limits was Donella Meadows, who died in 2001, and the report she and her colleagues produced became a bestseller, selling 30 million copies in as many languages. In 2004, a “30-year update” was produced by Meadows’s husband, Dennis, and another of the original four, Jorgen Randers, reflecting the by now conventional wisdom that human beings and the natural world are on a collision course. The major change in emphasis after the 30-year lapse was a shift away from the potential shortage of raw materials, especially nonrenewable resources. On reflection, the original authors had asked an intelligent question, but the wrong one. At current rates of consumption, there is enough copper, iron, and nickel, say, to last for centuries, even without recycling. In the interim, though, the question has become not whether we can grow but whether we should.
Reactions to the first iteration of The Limits from orthodox economists and politicians were vigorous, sustained, and vituperative. Economic theorists dismissed Meadows and her sponsors on the grounds that she was not an economic technician, and was therefore not privy to what they considered the deeper mysteries of economics. The main argument was that The Limits failed to use one of economic theory’s dearest theorems, the feedback loop of pricing – the allocation of scarce resources over time and between sectors. (At its simplest, the theory says that if prices rise through scarcity, money will pour into finding substitutes, which will then become abundant, and prices will drop.) The authors were thus said to have underestimated the powers inherent in technological advance. New York Times columnist Thomas Friedman is among the main proponents of the notion that endless growth is possible. “The limits of growth were overcome in the 1970s with technology,” he declares in The World Is Flat. “We got smarter than before, equipment became more efficient, and energy consumption per head was lower.”1
Forty years on, it is easy to see that if The Limits asked the wrong question, then the criticism came in turn to the wrong conclusions. It is no longer a matter of whether a resource will be exhausted, or whether substitutes can be found, but of the impact on the global ecology of the wastes that extraction and manufacturing impose. And the original study did plead for “profound, proactive, societal innovation through technological, cultural and institutional change in order to avoid an increase in the ecological footprint of humanity beyond the carrying capacity of planet Earth.”
Ah, say the economists, but modern technology has less impact than the old, manufacturing is being “dematerialized” through efficiency and invention, and we can go on for ever.
Ah, say the ecologists, but you cannot reduce industrial impact to zero, and the scale is constantly increasing.
Impasse.
The Limits to Growth did at least introduce into popular consciousness several ecological terms without which intelligent debate is impossible. These were the concepts of overshoot, throughput, and “sinks, sources, and synthesis.”
“Overshoot” is what happens when a system shoots past a limit because its corrective feedback arrives too late, and the sensation is familiar to anyone who has ever driven a car on ice or gotten caught in a skid, when the tendency to overcorrect is almost impossible to resist. In an economy, overshoot is similar – a tendency to blow past a limit, then overcorrect for a belatedly perceived error. On the very first page, The Limits’ authors declared that “the three causes of overshoot are always the same, at any scale from personal to planetary. First, there is growth, acceleration, rapid change. Second, there is some form of limit or barrier, beyond which the moving system may not safely go. Third, there is a delay or mistake in the perceptions and the responses that strive to keep the system within its limits.”2 In a later chapter, they write: “The final contributor to overshoot is the pursuit of growth. If you were driving a car with fogged windows or faulty brakes, the first thing you’d do would be to slow down. You would certainly not insist on accelerating … Constant acceleration will take any system, no matter how clever or farsighted and well designed, to the point where it can’t react in time.”3 The global economy, in this view, is still in overshoot mode and has yet to apply corrections.
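The mechanism is easy to see in a toy model. What follows is a minimal sketch in Python – emphatically not World3, just a made-up stock with invented parameters – in which growth is throttled by the distance to a limit, but the system “perceives” that distance only with a delay. With immediate feedback it settles at the limit; with delayed feedback it overshoots, exactly the “delay or mistake in the perceptions and the responses” the authors describe.

# A toy model of overshoot (hypothetical parameters, not World3):
# a stock grows toward a limit, but the corrective signal it reacts to
# is the stock as it stood `delay` steps earlier.
def simulate(limit=100.0, growth=0.05, delay=10, steps=400):
    stock = 10.0
    history = [stock]
    for _ in range(steps):
        # React to old information: the stock as it looked `delay` steps ago.
        perceived = history[max(0, len(history) - 1 - delay)]
        # Logistic growth, slowed by the *perceived* distance to the limit.
        stock += growth * stock * (1 - perceived / limit)
        history.append(stock)
    return history

print(f"peak with delayed feedback:   {max(simulate(delay=10)):.1f}")  # sails past 100
print(f"peak with immediate feedback: {max(simulate(delay=0)):.1f}")   # stops at the limit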
“Sources and sinks” are obvious enough to any ecologist: what goes into an economy must eventually come out as waste. Sources are natural resources, either extracted through mining or harvested; sinks are where waste goes. These sinks are not necessarily discrete basins; some, like the waters that receive runoff and the air that receives pollution, are global. Some of what they receive is not even pollution: carbon dioxide – regarded as the main culprit in global warming – is not a real pollutant; it is a perfectly natural gas, now being produced in unnatural quantities. Orthodox economics has traditionally ignored sinks.
“Throughput” is the flow of materials and energy from sources, through the economy, and into sinks; it is basically a measure of economic activity. “Synthesis” is harder to grasp and harder to measure, though ecological footprint comes close.
The Limits update sounded a cheerful note. “The good news is that the current high rates of throughput are not necessary to support a decent standard of living for all the world’s people. The ecological footprint could be reduced by lowering population, altering consumption norms, or implementing more resource-efficient technologies.”4
The first reason for weaning ourselves off fossil fuels is that they – or rather their profligate and careless overuse, and the number of us who are using them – are threatening the way of life we have made on our planet. We’re killing ourselves with the stuff.
For the purposes of this chapter, let’s stipulate that global warming is real. We will deal with the politics a little later – with the exaggerations and hyperboles of both sides (neither side is made up just of fools and villains). So let’s accept here that climate change is a fact, and climate chaos a real possibility. If any doubts remained, they would surely have been put to rest by a 2,000-year decadal study of Arctic temperatures undertaken by a slew of scientists under the umbrella heading Arctic Lakes 2K Project. Their report, lead-authored by Darrell Kaufman of the Northern Arizona University School of Earth Sciences and Environmental Sustainability, found unambiguously that a pervasive cooling in progress 2,000 years ago continued through the Middle Ages and into the Little Ice Age (from about 1300 to the middle of the nineteenth century), a trend consistent with other studies showing a gradual reduction in the intensity of summer solar radiation, caused by changes in Earth’s orbit. This cooling trend was reversed by the beginning of the twentieth century, when temperatures began to increase rapidly, with four of the five warmest decades of the 2,000 years occurring between 1950 and 2000. The data are consistent with human-caused warming.5 Finally, in May 2010, the U.S. National Research Council published three new reports, drawing on five years of peer-reviewed science, and concluded that the data showed unambiguously that the world’s governments “should act, now.” A supplementary document, issued a few months later, concluded that “each 1°C of warming will reduce rain in the southwest of North America, the Mediterranean and southern Africa by 5–10 percent, cut yields of some crops, including maize (corn) and wheat, by 5–15 percent, and increase the area burned by wildfires in the western United States by 200–400 percent.”
The magic numbers to bear in mind in the debate over energy and climate change are these: 350 ppm (parts per million) CO2, and 2 degrees Celsius.
Carbon dioxide isn’t the only greenhouse gas, by any means – methane and nitrous oxide are considerably more potent.6 Nitrous oxide comes from farming, nitric acid production, fossil fuel combustion, livestock manure management, and human sewage. Potent sources of methane include landfills, coal mining, natural gas and oil processing systems, livestock and manure, wastewater treatment, and rice cultivation. But all terrestrial plants produce some methane; again, it is a matter of degree. Nevertheless, it has become conventional to express all gas emissions and concentrations in terms of “CO2 equivalent” effects.
In the year 2000, global greenhouse gas emissions were 34 billion tons of CO2 equivalent per year. This works out to 5.5 tons for every person on the planet. In the West, we emit a great deal more.
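The arithmetic is simple division by the world population of the day, a little over 6 billion:

\[
\frac{34 \times 10^{9}\ \text{t CO}_2\text{e per year}}{6.2 \times 10^{9}\ \text{people}} \approx 5.5\ \text{t per person per year}
\]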
When climate scientists first began to look at the “greenhouse effect” sometime in the 1970s, the target they based their models on was a figure of 550 parts per million of atmospheric CO2. The number was chosen arbitrarily – they needed something to base their studies on, after all – and 550 ppm was roughly double the levels found just before the Industrial Revolution. Early studies, particularly a 1979 one for the U.S. National Academy of Sciences by weather researcher Jule Charney, showed a 550 ppm world warming noticeably, somewhere between 1.5 and 4.5 degrees Celsius.
A decade later, climate scientist James Hansen appeared before the U.S. Congress to announce that the greenhouse effect had been detected at work, that it was real, and that it was changing the climate already. His testimony was regarded as alarmist, and largely dismissed, but when his climate models successfully predicted how much the earth would cool after the eruption of Mount Pinatubo in 1991, he began to be taken more seriously.
Scientists, and some politicians, subsequently concluded that the consequences of 2 to 4 degrees warming would be too dire, and that warming must be held below 2 degrees. That, in turn, meant lowering the target for CO2 to 450 ppm from 550, which, it was hoped, would avoid what the UN’s Framework Convention on Climate Change called “dangerous climate change,” a fraught phrase and one itself dangerously open to interpretation.
Soon, however, it became clear that in the real world outside climate models change was happening faster than predicted – Arctic sea ice was thinner than expected, Greenland’s glaciers were retreating, tropical and desert regions were expanding. Atmospheric CO2 concentrations had long since ticked upward past 350 ppm, reaching 385.2 ppm by 2008, and Hansen and others soon concluded that bringing them back to 350 ppm was the only safe way to ensure 2 degrees or less of warming.
This is not an insignificant revision. As Richard Monastersky pointed out in a climate survey for the journal Nature, the difference between 350 and 450 ppm “is not just one of degree. It’s one of direction. A CO2 concentration of 450 ppm awaits the world at some point in the future that might conceivably, though with difficulty, be averted. But 350 ppm can be seen only in the rear-view mirror … the world needs not just to stop but to reverse course.”7 As Hansen said in the same article, “When you say 450 or 550, you’re talking about what rates of growth you are going to allow. When you say we have to get to 350, that means you have to phase down CO2 emissions in the next few decades.”8 Moreover, the effects of warming would persist for centuries. The National Research Council has concluded that if CO2 ever did reach 550 ppm, it would initially stabilize temperatures at 1.6°C higher than now – but further warming would leave the total temperature rise closer to 3°C, and would persist for millennia.
Consider the scale of the action needed. Tim Jackson, sustainable development adviser to Britain’s former Labour government, uses the IPAT formula – environmental Impact = Population × Affluence × Technology – to calculate the needed changes:
The global population is just under 7 billion and the average level of affluence is around $8,000 per person. The T factor [in IPAT] is just over 0.5 tons of carbon dioxide per thousand dollars of GDP – in other words, every $1,000 worth of goods and services using today’s technology releases 0.5 tons of CO2 into the atmosphere. So today’s global CO2 emissions work out at 7 billion × 8 × 0.5, or 28 billion tons per year. [The real number, as indicated, is higher than this – 34 billion tons.] The IPCC [Intergovernmental Panel on Climate Change] has stated that to stabilize greenhouse gas emissions in the atmosphere at 450 ppm, we need to reduce annual CO2 emissions to less than 5 billion tons by 2050. With a global population of 9 billion thought likely, that works out at an average carbon footprint of less than 0.6 tons per person – considerably less than in India today.9
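Written out, using Jackson’s own figures (the 2050 line simply divides the IPCC’s 5-billion-ton ceiling by the 9 billion people he anticipates):

\[
\begin{aligned}
I = P \times A \times T &= (7\times10^{9}\ \text{people}) \times (8\ \text{thousand \$ per person}) \times (0.5\ \text{t CO}_2\ \text{per thousand \$}) = 28\times10^{9}\ \text{t per year},\\
\frac{I_{2050}}{P_{2050}} &\le \frac{5\times10^{9}\ \text{t}}{9\times10^{9}\ \text{people}} \approx 0.56\ \text{t per person per year}.
\end{aligned}
\]

Note what the formula implies: even if affluence merely stayed at today’s $8,000 a head, the technology factor T would have to fall from 0.5 to about 0.07 tons per thousand dollars – roughly a sevenfold improvement – and any growth in affluence pushes the required improvement higher still.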
For Western industrialized countries this is a pretty drastic change. Some countries, including Britain, have committed to at least a 60 percent reduction in greenhouse-gas emissions by 2050. (Canada has suggested it could manage 3 percent by 2020, but is unlikely even to achieve that; 2010 figures suggest that emissions will actually continue to rise until 2015 at least, and in 2012 will be 30 percent above 1990 levels.10) And even if global emissions were reduced by 60 percent by 2050, temperatures would still probably rise by more than 2°C.
This means that global emissions will have to drop by 70 to 85 percent by 2050, at least. Which in turn means no more fossil fuels. And soon.
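The upper end of that range follows directly from the numbers above – getting from current emissions down to the IPCC’s 5-billion-ton ceiling means

\[
1 - \frac{5}{34} \approx 0.85 \qquad \left(\text{or, from Jackson’s 28-billion-ton figure, } 1 - \frac{5}{28} \approx 0.82\right),
\]

a cut of more than 80 percent from either baseline.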
The second good reason for weaning ourselves off fossil fuels, especially oil, is that sometime soon there won’t be any.
There are two, or perhaps three, quite different versions of the state of our oil supplies. One is that there is no problem – if we begin to run out, human ingenuity and technical advances will always find more, or find alternatives. This comforting view is nicely represented by the Heritage Foundation’s Becky Norton Dunlop, who said in a discussion in 2008 that “they’ve been predicting peak oil since the 1950s, and now we have more than ever.” Until recently this has been the position of orthodox economics, which has remained insouciant, taking on faith that price increases will simply lead to more exploration and therefore more oil, or that some replacement technology will surely be found in time. Another equally insouciant view comes from University of Calgary economist John R. Boyce, who pooh-poohs the whole idea of oil peaking, and whose data show that production always rises when prices do – or at least that it always has in the past. (Boyce’s analysis rather optimistically assumes the past is always a good guide to the future.)
The second version is that peak oil (the point at which global production starts to decline) is either upon us, or imminent, and will have a cataclysmic effect on the global economy. The proponents of this theory are legion, but the most interesting is a dire U.S. Joint Forces Command study, released in April 2010, that predicts that the world will begin to face shortfalls in petroleum within two decades.11 Its reasoning is simple: the globalized economy depends on cheap oil, but we have already burned most of the easily obtained free-flowing oil. That’s why we’re now spending vast amounts to wring oil from the deep sea (BP in the Gulf of Mexico, for example) and the tar sands. As traditional oil fields become depleted, the world needs to find another Saudi Arabia every seven years just to maintain production, an unlikely event for two very good reasons: there may simply not be another dozen or so Saudi Arabias anywhere on the planet, and the oil industry is short of equipment and manpower after decades of underinvestment when prices were low.
The third view is agnostic on whether oil has peaked or not, but believes that we should stop using it anyway, because its presence is more damaging than its absence could ever be. The absence of petroleum might well damage the economy, in this view, but it would benefit the planet, and all of us in the long run.
All three of these viewpoints are firmly held, usually by people with impeccable credentials. But the optimists and the pessimists are not as far apart as they appear. They merely differ on the timing of the production peak by a few decades. All agree that petroleum is the most limited of the important fossil fuels (natural gas and coal are far more available), and that global production will reach a maximum during the first half of this century, and decline sharply thereafter. (I like the formulation by David Lloyd Greene of the U.S. National Transportation Research Center that proponents of peak oil “are sometimes referred to as pessimists, and at other times as geologists, while deniers are either called optimists or economists.”)
For now, the best guess is probably that of the United Kingdom’s Energy Research Centre, which charts something of a middle course between the Pollyannas and the Jeremiahs. World production is likely to peak before 2030, and could reach its limits before 2020, the center’s 2009 report said. Think of the implications. If, sometime in the next few years, a consensus were to develop that peak oil is imminent and production in decline, it would set off a spike in prices and a market panic, dumping the world once more into a recession from which it might not emerge for a very long time.
The world needs to be weaned off fossil fuels, and soon. Nevertheless, late in 2009, the newly rescued car companies announced a new line of gas-guzzling muscle cars.
Can the world support the number of people who now live here, nearly 7 billion and rapidly counting? Yes.
Are the food and water crises caused by too many people? Only partly.
Are European and North American populations actually static or declining? Yes.
Is it true that the environmental movement has historically been rather too prone to fits of gloomy Malthusian doomsaying? Yes, it is.
But are human numbers a problem? Yes, they are, because rapidly increasing numbers dramatically compound our use of the earth’s resources, including its dwindling fossil fuels. And our numbers are rapidly increasing, despite a slowing rate of growth.
In the time it takes you to finish reading this sentence, seven people will have been added to the world’s population. Even with significant reductions in birthrate, the population is expected to increase from 6.7 billion now to 9.15 billion by 2050. In other words, between 2005 and 2050 the world population will grow by 2.45 billion – Earth’s entire population in 1950.
About 250 million people lived on Earth two millennia ago. Now there are double that number in the Middle East and North Africa alone. And there are 1 million new Egyptians every year. Niger, a country already poor and stressed, has 15 million people today and could hit 80 million by 2050. We will need the equivalent of 50 new Nile Rivers to provide enough water just to grow food for all the new people to be born in the next few decades – people arriving at a rate of a Great Britain and more a year. The projected population increase would mean that even if we could reduce the per capita global carbon footprint by an unlikely 50 percent (with all the efficiency, conservation, and considerable political co-operation that would take), the increase in people would simply wipe it out.
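Two quick checks on these figures. The world’s net increase – the “Great Britain and more” a year just mentioned – runs at roughly 80 million people annually, which works out to about two and a half people a second, or seven in the few seconds it takes to read a sentence. And Niger’s projected jump from 15 million to 80 million over forty years implies a sustained growth rate of over 4 percent a year:

\[
\frac{80\times10^{6}\ \text{people per year}}{3.16\times10^{7}\ \text{seconds per year}} \approx 2.5\ \text{people per second};
\qquad
\left(\frac{80}{15}\right)^{1/40} - 1 \approx 0.043 = 4.3\%\ \text{per year}.
\]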
You can’t solve anything without also solving population. But it is really not the sheer numbers that count. It wasn’t raw numbers that worried Thomas Malthus, the godfather of gloomy prognosticators. He was concerned about the intersection of resource availability (food, in his case) and numbers – the same worry that concerned Adam Smith in The Wealth of Nations. In those more innocent days, the worry was that human numbers would outstrip available food, leading to famines, starvation, and epidemics. Now we have a different set of concerns. The eminent environmentalist James Lovelock, who has become for our times what Malthus was for his, puts it this way in The Vanishing Face of Gaia: “True enough, the world total of domestic and industrial emissions of 30 gigatons of carbon dioxide annually is far too great, but so are the consequences of too many people competing for land with the natural forests of the world.”12
The other concern related to population – other than sheer numbers – is technological. This is the development of what the biologist Alfred J. Lotka in 1925 called “exosomatic organs” – that is, machines. For all of planetary history, and for all human history until comparatively recent times, energy was endosomatic – muscle power, if you want a less technical description. This is still true of most species, with a few insignificant exceptions – chimpanzees fishing for termites with straws, or otters using stone tools to break open shellfish. Man’s first exosomatic instrument was probably a club, which extended his arm and helped him transcend his biological limits. “This … brought down upon the human species two fundamental and irrevocable changes,” says mathematician and ecologist Nicholas Georgescu-Roegen. “The first is the irreducible social conflict which characterizes the human species. The second is man’s addiction to exosomatic instruments.”13
Only a species with a genius for exosomatic extension could get to where we are today: the average North American born in the 1990s will produce in a lifetime about 1 million kilograms of atmospheric wastes, 10 million kilograms of liquid wastes, and 1 million kilograms of solid wastes. The same person will consume 700,000 kilograms of minerals and 24 billion BTUs of energy, which is equivalent to about 4,000 barrels of oil, and will eat 25,000 kilograms of major plant foods and 28,000 kilograms of animal products, provided in part by slaughtering 2,000 animals.14
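(The oil figure is a standard conversion – one barrel of crude contains about 5.8 million BTU – so

\[
\frac{24\times10^{9}\ \text{BTU}}{5.8\times10^{6}\ \text{BTU per barrel}} \approx 4{,}100\ \text{barrels},
\]

in line with the 4,000 cited.)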
So human population is a problem compounded by human ingenuity. Sheer numbers multiplied by our exosomatic extensions are stripping our planet of its resources, and threatening our very existence, as well as that of other species. But there is a solution and it lies in solving both aspects of the problem – in reducing our numbers, and in reducing our harmful technological effects. If you live on a finite planet, you must, ipso facto, live within your resource base. And since the capacity to increase resources from that base is bounded, there must, in the end, be a balance between population and those resources. We simply cannot continue to grow forever, either in numbers or in wealth.
The issue of human population is fraught with deep emotions, predictably and necessarily. In individual terms, to perpetuate family and to defeat death by leaving behind something of yourself is a powerful generative force. And in the context of evolution, a species’ duty is, after all, to survive. What to do, then, when a single species threatens to overwhelm the carrying capacity of the planet, as ours does?
The multiple crises are real, and the linkages between them are clear.
The climate crisis is largely an industrial/energy crisis, which in turn is based on population numbers. Oil prices are linked to food production, and both are fine-tuned to ultimate supply and interim demand, and are therefore connected to the population issue. Oil, of course, is a critical component of the energy issue, perhaps the critical part. Reducing the scale of its use will help solve the global warming problem, which in turn will help farming, through less desertification and less damage to ecosystems. Like the pollution issue, food – and population – will have to be addressed in concert with the other crises. In which case, the solutions to all of them become relatively easier.
That’s the point, isn’t it? They’ll all become easier if we tackle them together.