As we saw in chapter 2, scientists have many ways of figuring out how much carbon dioxide was in the atmosphere in the distant past—by looking at air bubbles trapped in ancient ice that preserve samples of the atmosphere from as long ago as 800,000 years or by examining forams at the bottom of the sea. More recently, around the late nineteenth century, people started taking direct air samples and measuring their CO2 content. And of course, scientists do that today as well, with even greater accuracy.
Based on all of these ways of measuring, scientists agree that CO2 levels varied between 260 and 280 parts per million, or ppm (that is, 260–280 CO2 molecules for every million air molecules), during the most recent ten thousand years. Levels rose gradually over this time, settling at about 280 ppm by the start of the Industrial Revolution, around 1800. Some people argue that this slow preindustrial rise was a result of human activities, primarily the clearing of forests for agriculture.
But in the late eighteenth century, the Scottish inventor James Watt patented an efficient version of the steam engine, which helped launch modern industrial civilization. Steam engines ran mostly on coal, which releases CO2 when burned. The machines of the Industrial Revolution fueled a dramatic rise in economic activity, which in turn triggered a huge population increase, from about 700 million people in 1750 to almost 7 billion today.
Nearly all of those people, especially in developed countries like the United States, use energy for transportation, heating, and—something new since 1900 or so—electricity. While steam engines have become mostly a quaint curiosity, most of the energy we use still comes from the burning of carbon-rich coal, oil, and natural gas.
It’s not even a little bit surprising, therefore, that CO2 levels in the atmosphere have climbed from about 280 ppm in 1800, at the beginning of the Industrial Revolution, to about 390 ppm in 2011, an increase of nearly 40 percent.