9 CLIMATE MODELS AND THE FUTURE
UP TO NOW in this book, the focus has been on past and present climate and how the climate system works. The workings of the climate system are at the heart of our ability to make realistic projections of how climate will change in the future. To make such projections, we need models, so this chapter describes the general nature of climate models. Models, of course, are not reality, so inevitably the question of how well climate models represent the real world arises. The models possess both intrinsic and practical limitations, which, along with the uncertainties about how the climate system works, translate into uncertainty in projections. Nonetheless, experience suggests that climate models provide some useful and insightful information about our future.
The big question of what the future climate will be hinges less on uncertainties in the science-based model projections than on uncertainty about future levels of greenhouse-gas emissions. Accordingly, we must be able to relate greenhouse-gas emissions to climate change, which is one of the important uses of climate models. This brings us to a final question: how do we go about developing sensible policies to respond to climate change? The response has to be based on risk assessment, which we touch on as a way of determining the climate-change targets needed to guide policy.
What Are Climate Models?
Models of the climate system evolved from early computer models that sought to forecast weather. The basic processes, represented by mathematical equations in the models, are the motions of air and water around the planet in response to uneven heating by the Sun, the Coriolis force, and other factors, with the motions constrained by conservation of mass, energy, and momentum.1 The most complex models, known as atmosphere–ocean general circulation models, treat the coupled circulation of the atmosphere and the ocean, but simpler models of just the atmospheric circulation are also of great use. The models produce a three-dimensional picture of how temperature and other fundamental characteristics of the climate system change over time.
To solve the equations that describe their motions, models represent the atmosphere and the ocean as three-dimensional grids of points, such that each grid point represents a certain volume and has a specified set of properties. In the case of the atmosphere, for example, the properties include barometric pressure, wind velocity, humidity, and temperature. As the properties of one point change, so do those of all the neighboring points. The models calculate how the properties of each point change with time in response to external influences, such as the amount of solar energy received, and to changes in the neighboring points’ properties.
The distances between grid points are important. The closer the grid points, the finer the model’s spatial resolution—in other words, the smaller the features that can be distinguished. Good spatial resolution is necessary for depicting certain features, such as patterns of tropical rainfall, that can influence projections of the future. To illustrate, in the National Center for Atmospheric Research CCSM3 model, the atmosphere volume is represented by a 2.8 × 2.8 degree (about 300 × 300 kilometers [180 × 180 miles] at the equator) horizontal grid of points. The vertical dimension is accommodated by a stack of 26 more closely spaced grids. The ocean volume in this model is represented by a 1 × 1 degree horizontal grid of points (in places the spacing is finer) stacked 40 levels deep. These spacings determine the resolution of the atmosphere and ocean phenomena that can be modeled. Smaller features, such as individual storms or ocean eddies, do not appear in the model simulations. The time step (typically a half-hour) is also important in producing realistic solutions. However, available computer power sets practical limits on spatial and temporal resolution.
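To make the grid idea concrete, here is a toy sketch in Python. The grid size, the starting temperatures, and the mixing coefficient are all invented for illustration; this is not code from any actual climate model. It advances a single two-dimensional field of grid-point temperatures in half-hour time steps, with each point adjusting toward the mean of its neighbors.

```python
import numpy as np

# Hypothetical toy grid: 2.8-degree spacing gives roughly a 64 x 128 (lat x lon) grid.
n_lat, n_lon = 64, 128
temperature = np.full((n_lat, n_lon), 288.0)   # kelvin, uniform starting state
temperature[:n_lat // 4, :] -= 30.0            # a colder "polar" band, purely for illustration

dt_seconds = 1800.0       # a half-hour time step, as mentioned in the text
exchange_rate = 1e-5      # made-up coefficient for neighbor-to-neighbor mixing

def step(field):
    """Advance the toy field one time step: each grid point relaxes toward
    the mean of its four neighbors (a crude stand-in for transport).
    Boundaries simply wrap around, which a real model would not do."""
    north = np.roll(field, 1, axis=0)
    south = np.roll(field, -1, axis=0)
    east = np.roll(field, 1, axis=1)
    west = np.roll(field, -1, axis=1)
    neighbor_mean = (north + south + east + west) / 4.0
    return field + exchange_rate * dt_seconds * (neighbor_mean - field)

for _ in range(48):       # one simulated day of half-hour steps
    temperature = step(temperature)
```

A real model carries many coupled three-dimensional fields (pressure, humidity, winds, and so on) and solves the conservation equations rather than this simple mixing rule; the point here is only that grid spacing and time step are the knobs that set spatial and temporal resolution.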
One challenge in climate models is to represent the feedbacks realistically. To understand why, imagine that one of our grid points represents a volume of air over the North Atlantic Ocean: How are we going to determine the temperature of that volume? Suppose the air is colder than the ocean, so heat moves directly from the ocean to the atmosphere. The properties of the water and the atmosphere, on the one hand, and the equation describing heat transfer, on the other, are well known, so the amount of heat that will flow between the two can readily be calculated. However, heat is also transferred by evaporation of the water, which is dependent on temperature. The water vapor forms clouds, which shade and thus cool the ocean surface, so we have to know the proportion of clouds. Here we must estimate. Perhaps we have an equation that relates the fraction of clouds to relative humidity; such an equation is known as a parameterization. It may be based on theory or on observations; in any case, the parameterization is only an approximation of reality and perhaps not a very good one. Different climate models have different parameterizations relating the system’s different characteristics, and for this reason they yield somewhat different results.
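As a purely illustrative example of such a parameterization, the short Python function below maps relative humidity to a cloud fraction: zero below a threshold, rising toward full cover at saturation. The functional form and the threshold value are invented for this sketch; real schemes are more elaborate and are tuned against theory and observations.

```python
def cloud_fraction(relative_humidity, rh_threshold=0.6):
    """Hypothetical parameterization: no cloud below a threshold relative
    humidity, then cloud fraction rising quadratically to 1 at saturation.
    Both the form and the threshold are made up for illustration only."""
    if relative_humidity <= rh_threshold:
        return 0.0
    scaled = (relative_humidity - rh_threshold) / (1.0 - rh_threshold)
    return min(1.0, scaled ** 2)

# Two models with different thresholds give different cloud cover for the
# same humidity, and therefore somewhat different simulated climates.
print(cloud_fraction(0.8))                      # about 0.25
print(cloud_fraction(0.8, rh_threshold=0.7))    # about 0.11
```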
Specifying a set of properties for our volume of atmosphere for given values of ocean temperature, ocean albedo, sea-surface roughness, wind velocity, cloud cover, atmospheric aerosol load, atmospheric greenhouse-gas content, solar radiation, and so on leaves plenty of room for uncertainty. Indeed, much of the work in climate modeling revolves around developing realistic parameterizations relating the various feedbacks. The interaction of the small-scale physics from which local characteristics are computed and the extension of these features in space result in extremely complex climate models, which is to be expected because, after all, the models are constructed to mimic an extremely complex real world.
Are Climate Models Credible?
Can climate models produce sensible results? Two important characteristics of the models suggest they can and do. First, the models mimic the “emergent” character of the real climate system. In other words, large-scale features develop as a consequence of the system’s complexity, not because the models embody any mathematical description of the phenomena. For example, the Intertropical Convergence Zone (ITCZ) of tropical rainfall is a feature of the real climate system and results from a combination of processes, such as the Coriolis force, the seasonal cycle of insolation, and the convective motions in the atmosphere (chapter 2). The ITCZ also appears in the models and results only from the forces acting on each grid point, not because ITCZ-like features have been built into them.2
One may wonder how climate models can be meaningful in light of the fact that weather is chaotic. In chaotic systems, small differences in initial conditions lead to large differences in how the systems evolve. The evolution of chaotic systems is therefore predictable only in the short term, and the predictions rapidly lose accuracy and thus meaningfulness the farther out into the future we go. For that reason, the weather forecasts for today or tomorrow are now fairly accurate, but forecasts four or five days out are not. Climate models, in contrast, do not display chaotic behavior.3 Instead, they produce stable climates—that is, ones that persist through time. This stability is the second of the two important characteristics noted above. Moreover, as time passes, the models also display familiar natural phenomena, such as seasonal cycles, trade winds, modulations in the jet stream, common weather patterns, cyclonic storm patterns, and even events like El Niño–Southern Oscillation (ENSO), but without changes in long-term average conditions (unless external forcings are folded into the model).4
Because climate models display the same emergent behavior and reach stable states, just as the real climate system does, they allow researchers to explore how the real system works. Thus one can query a model regarding what happens to the climate system in response to a specific forcing, such as (no surprise) an increase in the greenhouse-gas content of the atmosphere—which is in fact what modelers do. In this way, the model becomes the basis for projections.
Climate models cannot be evaluated precisely. They can, however, be at least generally evaluated by being tested against the climate of the recent past, particularly of the past century or even of the past 25 years, which is well documented from satellite observations. Thus when fed data on greenhouse-gas emissions and other known quantities, the models closely simulate climate of the past century. For example, they capture changes due to external forcings, such as the global cooling due to the eruption in 1991 of Mount Pinatubo.5
In addition, certain periods of large climate shifts in the more distant past, such as the end of the most recent glacial maximum, serve as “benchmarks” for climate models. In general, the models are adept at simulating these shifts, although the climates of more remote periods are obviously far less precisely known than that of the previous century.
An interesting characteristic of climate models is that the average of a number of models better represents the basic features of the observed climate system than does any individual model. The implication is thus that the simulations have myriad small biases, but that in aggregate many of the biases cancel each other out. For this reason, the 2007 report of the Intergovernmental Panel on Climate Change (IPCC) simulated climate with an “ensemble” of 23 well-tested models.6
Modelers can also glean valuable information by producing multiple simulations with the same model. A climate model’s output is a combination of natural climate variability and external forcings. The former is random and thus unpredictable, representing model “noise.” Running a number of simulations under identical forcings is a way to obtain a measure of the noise and to distinguish it from the effects of external forcings.
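The following minimal Python sketch, using entirely invented numbers, illustrates the bookkeeping: averaging several runs made under identical forcings gives an estimate of the forced signal, while the run-to-run spread measures the internal-variability “noise.”

```python
import numpy as np

# Hypothetical annual global-mean temperature anomalies (degrees C) from five
# runs of the same model under identical forcings; all numbers are invented.
runs = np.array([
    [0.10, 0.18, 0.22, 0.35, 0.41],
    [0.05, 0.21, 0.19, 0.30, 0.45],
    [0.12, 0.15, 0.26, 0.38, 0.39],
    [0.08, 0.20, 0.17, 0.33, 0.47],
    [0.11, 0.16, 0.24, 0.31, 0.42],
])

ensemble_mean = runs.mean(axis=0)   # estimate of the forced signal common to all runs
spread = runs.std(axis=0)           # run-to-run spread: internal-variability "noise"

print(ensemble_mean)
print(spread)
```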
A particular model’s ability to simulate the present climate is not necessarily a good criterion with which to judge its ability to predict future climate correctly. The climate system is now operating and will continue to operate beyond the bounds of conditions with which we have experience. In fact, for the future there is no single “best model.” One model may be more accurate under one set of conditions, but another model may be more accurate under another set. As a corollary, the range of model output is not a real measure of uncertainty, which might in reality be greater.
Despite the growing confidence in models and their “credibility,” they are not portrayals of reality because they contain many uncertainties and do not include all natural phenomena. For example, water vapor and cloud feedbacks (chapter 5) are significant sources of uncertainty. Also, most models do not now include the poorly understood feedbacks involving the terrestrial biosphere. Model simulations are more realistic for some parameters—notably changes in global mean temperature—than for others. Among the latter, for example, is change in sea level, which is unlikely to be accurately predicted because the models cannot consider all the important mechanisms of ice sheet loss (chapter 8).
Simulating the Twentieth Century: Anthropogenic Versus Natural Causes of Climate Change
As noted in chapter 7, the observed changes in climate, particularly over the past three or four decades, are consistent with the assertion that warming is the result of human activities—it is being driven mainly by anthropogenic forcings rather than by natural forcings. Model simulations support this assertion. But attributing the warming to human activities requires modelers to distinguish between the effects of external forcings, be they natural or anthropogenic, and of natural internal climate variability. The distinction thus must be made for a specific timescale because internal climate variability operates across a wide range of timescales, as described in chapter 1. Even so, attribution can seldom, if ever, be made with absolute certainty, so in practice it means being able to state that an observed change is consistent with the attributed cause.
The most general measure of warming is global mean surface temperature. In an effort to identify the cause of warming, the 2007 IPCC report compared 58 simulations from 14 models of twentieth-century warming.7 When the models were run with both anthropogenic and natural forcings, the simulations closely replicated the observed warming. But when they were run with only natural forcings—that is, volcanic eruptions and variations in solar irradiance—the simulations displayed a significant and increasing divergence from the observed trend beginning around 1960 (figure 9.1). In other words, to reproduce the actual late-twentieth-century warming, the models required the input of anthropogenic forcings, specifically those due to greenhouse gases and aerosols.
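The logic of such a comparison can be caricatured in a few lines of Python. The numbers below are invented, and actual detection-and-attribution studies use far more sophisticated statistical “fingerprinting,” but the idea is the same: if the observed warming trend lies well outside the spread of trends produced by natural-only simulations, then natural forcing and internal variability are unlikely explanations.

```python
import numpy as np

# Invented numbers for illustration: late-twentieth-century warming trends
# (degrees C per decade) from natural-only simulations versus an observed trend.
natural_only_trends = np.array([0.02, -0.01, 0.04, 0.00, 0.03, -0.02, 0.01])
observed_trend = 0.17

mean = natural_only_trends.mean()
sigma = natural_only_trends.std()

# If the observed trend lies far outside the spread of the natural-only runs,
# natural forcing and internal variability are unlikely to explain it.
z = (observed_trend - mean) / sigma
print(f"observed trend is {z:.1f} standard deviations above the natural-only runs")
```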
If the warming is due primarily to anthropogenic forcings rather than to natural internal climate variability, then other signatures of the warming should also be apparent in the model simulations. The spatial pattern of warming is one such diagnostic. The entire globe has essentially warmed, with the Arctic in particular and to a lesser extent the Antarctic warming much more rapidly than the middle and tropical latitudes. This pattern is not expected from natural modes of climate variability such as modulations of the North Atlantic Oscillation (chapter 2) and ENSO (chapter 3). Simulations that included anthropogenic as well as natural forcings replicated the observed spatial pattern of warming; those with no anthropogenic component did not (figure 9.2). This result has also been found to be the case in analyses of regional patterns of warming.8
FIGURE 9.1
Modeling global mean surface temperature
The observed change in global mean surface temperature (black line) during the twentieth century was compared with (a) 58 simulations (range shaded in light orange) from 14 climate models that included both anthropogenic and natural forcing factors, the ensemble mean of which is shown by the heavy red line; and (b) 19 simulations (range shaded in light blue) from 5 climate models that included only natural forcing factors, with the ensemble mean indicated by the heavy blue line. The vertical lines indicate times of major explosive volcanic eruptions. Only the simulations that include anthropogenic forcings (a) reproduced actual climate observations. (After Hegerl et al. 2007:fig. 9.5)
Several other diagnostic, temperature-based indexes have also been developed. They include the land–ocean temperature contrast, the Northern Hemisphere’s north–south temperature gradient, temperature difference between hemispheres, and temperature variation with the seasons.9 These indexes’ utility is that they should display a coherent response to greenhouse-gas warming, but not to natural variability. The indexes change coherently through the twentieth century, and model simulations indicate that anthropogenic forcing can account for all the changes.
The other observed changes described in chapter 7—fewer cold days and nights, more warm nights, warming of the troposphere but cooling of the stratosphere, rise of the tropopause, increasing heat content of the ocean, and even heat waves—arise only in simulations that include anthropogenic forcings.10 Although no one claims that climate models are precise representations of nature, they do provide strong, credible evidence that the warming of the late twentieth century is driven by anthropogenic forcings.
FIGURE 9.2
Spatial distribution of observed warming compared with the distribution obtained from model simulations, 1901–2005 and 1979–2005
The top maps (“Observed”) show the observed gridded temperature change. The middle maps (“ALL simulations”) show the average gridded temperature change of 58 simulations from 14 models that included both anthropogenic and natural forcings and most closely resemble the observed temperature change (reds). The bottom maps (“NAT simulations”) show the average gridded temperature change of 19 simulations from 5 climate models that included only natural forcings. (From Hegerl et al. 2007:fig. 9.6)
Chapter 7 also noted that the probability of the occurrences of highly unusual events, such as the European heat wave in 2003, has increased because of anthropogenic forcings. Observations support this assertion; in addition, model simulations show that distinct geographical patterns of extreme heat emerge in a world driven by greenhouse gases, but not in one driven only by natural forcings.11
Peering into the Future
Credible predictions of future climate obviously depend on the availability of climate models that are adequate representations of nature. They also obviously depend on the rate at which greenhouse gases continue to be emitted. Patently not obvious, however, is what emission rates will actually be because they will depend on a complex group of interacting factors, such as the increase in population, the growth of the economy, the distribution of income, the policies of governments, the pace of technological change, the adoption of new technologies, and so on.
EMISSION SCENARIOS AND TEMPERATURE PROJECTIONS
It is, of course, impossible to know how the various factors affecting emission rates will play out to shape our future world. In order to provide a means of making model projections of future climate, the IPCC and other agencies have developed a set of “emissions scenarios”: “Scenarios are [alternative] images of the future…. They are neither predictions nor forecasts. Rather, each scenario is one alternative image of how the future might unfold.”12
In its 2007 report, the IPCC chose three previously developed (in 2000) emission scenarios as a basis for making model projections.13 The three scenarios (described in detail in note 13) were chosen to cover a widely varying future world, ranging from one in which emissions are high (A2) to one in which they are medium (A1B) to one in which they are low (B1).14 To emphasize again, the future is hardly certain, so wishful thinking aside, no one scenario is more or less probable than any other. In the IPCC’s modeling effort,15 the projected increase in global mean surface temperature by the year 2100, relative to the 1980 to 1999 average, is 3.6°C (6.5°F) for the high-emissions (A2) scenario; for the medium-emissions (A1B) and low-emissions (B1) scenarios, the projected increases are, naturally, smaller: 2.8°C (5°F) and 1.8°C (3.2°F), respectively (figure 9.3).
There are several things to note about these projections. First, these temperature increases refer to the mean of an ensemble of the 23 models. To reiterate a point made earlier, the ensemble mean has been found to represent past climate more accurately than individual models do, and the same may hold for future projections as well.
FIGURE 9.3
Projected changes in mean global surface temperature under three emissions scenarios
The heavy lines are the ensemble means of 23 model simulations run as part of the report of the Intergovernmental Panel on Climate Change (Meehl et al. 2007), and the shading refers to the ±1 standard deviation range of individual models. Also shown is the change in temperature if the greenhouse-gas content of the atmosphere had been held constant at year 2000 concentrations. The brackets to the right of the graph show the approximate uncertainties in future temperature changes resulting from lack of scientific knowledge and from growth of emissions, assuming that the three emission scenarios represent the possible range in emissions growth. (After Intergovernmental Panel on Climate Change 2007:fig. SPM 5)
Second, note how the projected temperatures evolve with time in figure 9.3. The temperatures projected for the high- and medium-emissions (A2 and A1B) scenarios do not significantly diverge from each other until the late twenty-first century. This outcome reflects the mitigating effect of aerosols on warming (chapter 5), which is greatest in the high-emissions (A2) scenario (that is, the atmosphere in the A2 world contains more surface-cooling aerosols than it does in the A1B world). Another feature is that the projected temperatures under all scenarios are about the same until about 2030, after which the full benefit of the low-emissions (B1) scenario starts to become apparent. One reason for this outcome is that, at least for the next decade, we will be experiencing warming that is already “committed” or “in the pipeline” from recent greenhouse-gas emissions (chapter 5), as illustrated by the curve in figure 9.3 showing how warming would have been dampened had the greenhouse-gas content of the atmosphere been held constant at year 2000 concentrations. According to this projection, there would have been about 0.4°C (0.7°F) of committed warming in the system had greenhouse gases been kept at 2000 levels. But the levels are now higher, and the committed warming is thus greater.
Third, figure 9.3 shows the shaded range in model results for each scenario,16 which by 2100 is approximately ±0.5°C (0.9°F). Again, recall that this uncertainty is not the true one, which may be somewhat larger; it simply reflects the differences among models. Nonetheless, this spread of 1°C (that is, ±0.5°C) is probably a reasonable minimum estimate of the uncertainty, given that we do not understand the climate system completely. Contrast this 1°C uncertainty with the spread in projected global temperature across the three emission scenarios, which is nearly 2°C (3.6°F). In other words, uncertainty about how the future plays out depends more on the growth of emissions than on lack of scientific knowledge.
OTHER PROJECTED CHANGES
Changes in global mean surface temperature do not tell the whole story, of course. There are also distinct geographical patterns of temperature change, with the greatest warming occurring over land, especially over the Arctic, and the least warming occurring over the Southern Ocean and the North Atlantic (figure 9.4). The warming is also accompanied by other projected changes, which, not surprisingly, are the ones that are already beginning to occur.
FIGURE 9.4
Projected changes in temperature, precipitation, and sea-level air pressure for winter (December–February) and summer (June–August), 2080–2099
The patterns of these changes, under the IPCC medium-emissions (A1B) scenario, in part reflect large-scale changes in atmospheric circulation. The stippled regions denote those changes for which there is a relatively high degree of agreement among the different model simulations. hPa = hectopascal = millibar. (From Meehl et al. 2007:fig. 10.9)
Most ominously, precipitation patterns are projected to change, with a large increase in rainfall in equatorial regions, less precipitation in the midlatitudes, and somewhat greater precipitation at high latitudes (see figure 9.4). The changing patterns of precipitation are reflected in changes in air pressure, which themselves are an indication of how atmospheric circulation patterns may be changing (see figure 9.4). In particular, the sea-level pressure changes indicate a poleward expansion of the Hadley cells (chapter 2), which in turn forces a poleward shift of midlatitude storm tracks, in part accounting for the higher precipitation at high latitudes and the drop in precipitation at midlatitudes. The latter will occur especially in the summer, with an attendant decrease in soil moisture because of the combined effects of less rain and higher temperature. The precipitation will be concentrated in more intense but less frequent events, with longer periods of no precipitation. These factors imply that the midlatitudes will experience a much greater risk of drought. The models also project a future with even more frequent and more intense heat waves, fewer and less severe cold spells, a steady decrease in the daily temperature range as nights become warmer, and a decrease in the number of frost days throughout the high and middle latitudes.
The model simulations also indicate that the oceans will continue to warm. The warming will initially occur slowly and be restricted mainly to the ocean mixed layer, typically the upper 100 meters (330 feet) or so, but later in the twenty-first century it will begin to extend into the deep ocean. Recall that the slow warming of the ocean represents committed warming, which cannot be stopped. Because the ocean is slow to warm, the surface water remains cooler than its equilibrium temperature with the atmosphere would dictate, which keeps the atmosphere cooler than it would otherwise be.
The ocean and living biota are projected to become progressively less efficient at removing carbon dioxide (CO2) from the atmosphere, a potentially important feedback that will force CO2 levels higher than they would otherwise be. As the atmospheric CO2 content rises, the ocean will become more acidic, upsetting the biological cycling of carbon in ways that are not understood (chapter 4). Warming will bring widespread thawing of permafrost as well as emissions of CO2 and methane from the carbon now held in permafrost, another potentially important feedback (chapter 8).
The extreme warming of the Arctic results in loss of sea ice, as the people and wildlife that live there are already experiencing (chapter 8). Under the high-emissions (A2) scenario, models indicate that the Arctic becomes ice free in the summer by the latter part of the twenty-first century. But the extent of sea ice has been shrinking much faster over the past several years than projected (chapter 8), suggesting that an ice-free Arctic may be upon us much sooner than that.
The models also indicate that sea level will rise. For example, under the medium-emissions (A1B) scenario, models project that by 2100 sea level will be 0.21 to 0.48 meter (8.3 to 18.9 inches) higher than the 1980 to 1999 average sea level. This projection considers only sea-level change from thermal expansion of the ocean, melting of glaciers and ice caps, and changes in rates of snowfall. It does not consider disintegration of parts of the Greenland and West Antarctic ice sheets. The increased flow of ice and other changes to those ice sheets, as we have seen (chapter 8), may begin to have an important influence on sea level in this century, but will likely have greater influence in subsequent centuries.
Perspective
How do we go about developing a rational policy response to climate change in view of the uncertainties about what the future will bring? First, we should recall the argument that future climate depends on future greenhouse-gas emissions and realize that these emissions are closely tied to our energy future. In other words, rational response to climate change essentially involves developing rational energy policy.
One way of figuring out what to do in the face of uncertainty is through risk assessment. At the most basic level, this assessment involves (1) identifying a consequence of climate change and then calculating its cost, (2) determining the probability that the consequence will come about as a function of temperature increase,17 and (3) establishing the cost of preventing the consequence (or, alternatively, the cost of adaptation) as a function of the probability.18 Needless to say, numerous complexities accompany such an analysis. Some consequences, such as destruction of sensitive ecosystems, are far more likely to occur with a small degree of warming than are other consequences, such as widespread drought with a substantial negative impact on agriculture. Also, we can only consider the probabilities that events will occur, and these probabilities increase with warming.
Then there is the matter of determining costs. Those associated with the consequences of warming are difficult to assess because they depend, in part, on the degree of warming; the costs of mitigation and adaptation efforts are even more difficult to determine for the same reason. The point, however, is that assessing risks in an uncertain and probabilistic world is a rational basis for determining target temperatures above which mitigation and/or adaptation costs to society are less than the costs of consequences.
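A toy calculation, with entirely invented numbers, shows the kind of comparison involved: for each candidate temperature target, the expected cost of a consequence (its probability times its cost) is weighed against the cost of the mitigation needed to hold warming to that target.

```python
def expected_damage(prob_of_consequence, damage_cost):
    """Expected cost of an impact: probability times the cost if it occurs."""
    return prob_of_consequence * damage_cost

# Invented, purely illustrative numbers (in arbitrary cost units): the
# probability of a damaging outcome rises with the warming target chosen,
# while the mitigation cost of meeting the target falls.
targets = {
    "2 degree target": {"probability": 0.2, "damage": 100.0, "mitigation": 30.0},
    "3 degree target": {"probability": 0.5, "damage": 100.0, "mitigation": 15.0},
    "4 degree target": {"probability": 0.8, "damage": 100.0, "mitigation": 5.0},
}

for name, t in targets.items():
    total = expected_damage(t["probability"], t["damage"]) + t["mitigation"]
    print(f'{name}: expected damage + mitigation cost = {total:.0f}')
```

In this made-up example the tighter target happens to have the lowest combined cost; with different probabilities and costs the balance could tip the other way, which is precisely why estimating those quantities carefully matters.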
Risk assessment is a formidable subject far beyond this book’s scope. An essential part of assessing risk, however, is being able to relate greenhouse-gas emissions to climate change, which is what climate models allow us to do. The IPCC’s effort to relate emissions and global temperature change is illustrated in table 9.1, which shows the changes in year 2050 emissions as a percentage of year 2000 emissions that will bring about various target temperature increases.
TABLE 9.1 CHANGES IN THE GROWTH OF CARBON DIOXIDE (CO2) EMISSIONS REQUIRED BY 2050 TO BRING ABOUT SPECIFIC WARMING TARGETS
a Calculated radiative forcing from greenhouse-gas buildup. W/m2 = watts per square meter.
b ppm = parts per million by volume.
c CO2-eq. is the concentration of CO2 that would have the same radiative forcing as the forcing due to all of the greenhouse gases (CO2, methane, nitrous oxide, ozone, halocarbons).
d The equilibrium global mean temperature (the temperature at the time the climate finally stops changing—that is, the time that all the committed warming has occurred) above the preindustrial temperature, based on a best estimate of climate sensitivity to radiative forcing.
Source: The data were derived by the 2007 IPCC report from numerous studies. See T. Barker, I. Bashmakov, L. Bernstein, J. E. Bogner, P. R. Bosch, R. Dave, O. R. Davidson, B. S. Fisher, S. Gupta, K. Halsnæs, G. J. Heij, S. Kahn Ribeiro, S. Kobayashi, M. D. Levine, D. L. Martino, O. Masera, B. Metz, L. A. Meyer, G.-J. Nabuurs, A. Najam, N. Nakićenović, H.-H. Rogner, J. Roy, J. Sathaye, R. Schock, P. Shukla, R. E. H. Sims, P. Smith, D. A. Tirpak, D. Urge-Vorsatz, and D. Zhou, “Technical Summary,” in Climate Change 2007: Mitigation of Climate Change. Contribution of Working Group III to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, edited by B. Metz, O. R. Davidson, P. R. Bosch, R. Dave, and L. A. Meyer (Cambridge: Cambridge University Press, 2007), 25–94.
Two points are evident. First, limiting the global mean temperature increase to, say, 3°C (5.4°F), which is a commonly cited, prudent target temperature increase, will require limiting atmospheric CO2 content to less than about 480 parts per million. To achieve this goal, global CO2 emissions by 2050 will have to be reduced below the year 2000 emission level or at best to that level—not just slowed and certainly not allowed to grow. Second, because of the inertia in the climate system, the reduction will have to begin soon—probably within the next decade—if the target is to be achieved. Perhaps the particular goal of limiting the temperature increase to less than 3°C is impractical. With this question in mind, it is now time to turn our attention to our energy future.