Heidi Cullen, Climate Expert
Claudia Tebaldi, Climate Central
This week on The Weather Channel we aired, for the first time, a look at what state-of-the-art climate models say about future regional temperature changes in the United States. You can view the segment here. These complex computer models are an important part of developing adaptation strategies at the regional and, someday, even the local level. And they become even more useful (and policy-relevant) when we focus on timescales over the next several decades. Because the strengths and weaknesses of these models are at the heart of how we plan climate adaptation strategies, we wanted to take this opportunity to drill down, provide some background on what these models are, and offer a frank discussion of their strengths and weaknesses.
A climate model is essentially a 'twin Earth'. It's a handy way to run experiments on the planet and see how variables like temperature and precipitation change over time when we add things like CO2 to the atmosphere. These models represent the surface of the Earth, the depth of the oceans, and the layers of the atmosphere as a set of regularly shaped boxes. Thinking for simplicity only about the surface of the Earth, just imagine taking the globe and overlaying it with a mesh grid, and you get a pretty good sense of what it looks like (see picture).
Calculations performed within each grid cell represent how that area responds to the winds, the sun, the surface, and the amount of greenhouse gases (GHGs). These models serve as a way to fast-forward in time and project how much warmer (or wetter) things could get in the future. Climate models are designed to simulate the decade-to-decade evolution of climate, not to predict precisely what a single day will look like; they are different in that sense from weather forecast models.
Here is the map that we presented in the broadcast of the difference (in degrees F) between July temperatures by 2050 and the current July climatology. The temperature differences shown were calculated by averaging together the output of twenty Global Climate Model (GCM) simulations of the Earth's future temperature, assuming that greenhouse gases continue to rise.
Model Resolution and Uncertainty
The resolution of these models is determined by the size of the squares (we refer to them as grid cells) overlaid on the globe. The current generation of models uses grid cells of about 120 by 120 miles (200 x 200 km). They are this size mainly because of computing resource constraints: the smaller the grid cells, the higher the resolution, and the more computation required. Computing power continues to increase, and the next generation of models will use smaller grid cells. Because of current constraints, however, the models have to approximate processes that can't be simulated explicitly because they happen 'within the grid cell'. Atmospheric convection is a good example, since a thunderstorm cloud does not generally span hundreds of kilometers.
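To get a feel for why finer resolution is so costly, here is a rough back-of-the-envelope sketch (ours, not from the broadcast). It assumes a uniform grid over the Earth's surface and the common rule of thumb that halving the grid spacing also forces a proportionally shorter numerical time step:

```python
# Back-of-the-envelope sketch: why smaller grid cells are expensive.
# Cell counts assume a uniform grid over the Earth's surface area of
# roughly 510 million square km.
EARTH_SURFACE_KM2 = 510e6

for dx_km in (200, 100, 50):
    n_cells = EARTH_SURFACE_KM2 / dx_km**2
    # Halving the spacing quadruples the cells, and the time step
    # usually shrinks in proportion, so total cost grows ~8x per halving.
    rel_cost = (200 / dx_km) ** 3
    print(f"{dx_km:>3} km grid: ~{n_cells:,.0f} surface cells, "
          f"~{rel_cost:.0f}x the computation of a 200 km grid")
```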
Approximations inevitably introduce uncertainty, which is why we take an average of 20 climate model projections (called an 'ensemble'). Different models use different solutions to approximate what they cannot represent directly in each grid box, and as a result they project different degrees of warming. This ensemble approach is a way to start working toward a probability distribution of the changes in store, in other words, to get at the statistics of what the possible future temperature range looks like, and hence what might be more or less likely to occur. If we are pressed to present a single estimate, choosing the ensemble average is a way of relying on the consensus of these models rather than on any single one. The assumption is that the approximation errors of individual models tend to cancel each other out when we average their projections, bringing out the common, most robust tendencies.
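As a minimal sketch of the idea, using made-up numbers rather than real model output, here is how one might combine 20 projections for a single grid cell; the real maps repeat this at every cell:

```python
# Minimal ensemble sketch with synthetic numbers: each "model" is one
# projected July warming (deg F) for a single grid cell.
import numpy as np

rng = np.random.default_rng(0)
# 20 hypothetical model projections of warming for one grid cell
projections = rng.normal(loc=4.5, scale=1.0, size=20)

ensemble_mean = projections.mean()             # the single "consensus" value
spread = np.percentile(projections, [10, 90])  # a sense of the range

print(f"ensemble mean: {ensemble_mean:.1f} F")
print(f"10th-90th percentile range: {spread[0]:.1f} to {spread[1]:.1f} F")
```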
As mentioned in the video piece, the rise in greenhouse gases was determined by what is known as the "A1B scenario". This scenario has concentrations of carbon dioxide (CO2) increasing from 385 parts per million (ppm) today to 600+ ppm by 2050, hypothesizing a rate of increase of 1.7% per year. This can be viewed as an optimistic pathway for future emissions, because the rate of greenhouse gas emissions at the global scale has recently accelerated, from 1.3% per year in the decade 1990-1999 to 3.3% per year between 2000 and 2006. If this faster pace of emissions continues, i.e., were we to follow a "business as usual" scenario, we would find ourselves well beyond the 600 ppm concentration level by 2050.
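To see how quickly those two paces diverge, here is a hedged illustration (our arithmetic, not part of the broadcast). The quoted percentages describe emissions growth rather than concentrations directly, so this only shows the relative gap between the pathways by 2050:

```python
# Compounding the quoted annual growth rates out to 2050. This is an
# illustration of divergence between pathways, not a carbon-cycle model.
years = 2050 - 2008

a1b_factor = 1.017 ** years     # ~1.7%/yr, the A1B-like pace
recent_factor = 1.033 ** years  # ~3.3%/yr, the 2000-2006 pace

print(f"A1B pace:    emissions x{a1b_factor:.1f} by 2050")
print(f"Recent pace: emissions x{recent_factor:.1f} by 2050")
print(f"The recent pace ends up {recent_factor / a1b_factor:.1f}x higher")
```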
Thus an important source of uncertainty is the actual quantity of greenhouse gases (GHGs) present in the atmosphere by 2050. But because the temperature response to these gases builds up slowly (in fact, a lot of the warming we are likely to experience is already built into the system because of our past and current emissions), the effect of GHGs in the atmosphere by 2050 is actually quite similar across the different emissions scenarios. By 2100, however, the effects vary widely, because the atmosphere has had ample time to respond to the GHG forcing, which differs significantly depending on the emission pathway. This highlights the difficulty of predicting, responding to, and limiting man-made climate change: its effects unfold over long timescales, so the consequences of our choices today also play out very slowly over time.
One thing we'd like to caution is that this is not a map of the exact difference in temperature between July 2008 and July 2050. Rather, the map shows the difference between the average temperatures during two periods of time: two 20-year means, one centered around 1990 and one centered around 2050, which we call "climatologies". Also, do not treat this as a fine-resolution map to be trusted blindly over regions of complex topography (mountains and coastlines, which create microclimates). Think of these maps instead as estimates that we hope are robust at the broad regional scale. Finally, as we already pointed out, the map is an average of what twenty models project if we follow a particular path. It is not a definitive prediction, but a glimpse of a potential future.
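In code, the distinction between a single-year difference and a climatology difference looks something like this (synthetic numbers, purely for illustration):

```python
# Sketch of a "climatology difference": two 20-year means of July
# temperature, not a comparison of two individual Julys.
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical July mean temps (deg F) for one location: a 20-year
# window centered on 1990 and one centered on 2050, each with
# year-to-year noise on top of a warming shift.
july_1990s = 75 + rng.normal(0, 1.5, size=20)
july_2050s = 79 + rng.normal(0, 1.5, size=20)

change = july_2050s.mean() - july_1990s.mean()
print(f"projected change in July climatology: {change:.1f} F")
```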
Why Do We Have Confidence in Them?
These models are approximations, but they are a collection of much of what we know about how the climate system works. The models have gone through decades of incremental development, testing, and validation. They reproduce many aspects of the changes in climate that have occurred so far, or that happened during natural experiments such as large volcanic eruptions, which gives us confidence that they are capturing the key forces at play going into the future. This is especially true of their simulations of temperature, which changes relatively smoothly and gradually over time and space, at least where decadal averages are concerned.
All models agree that warming is part of our future; in other words, this map of average warming does not hide any disagreement among models about the overall direction of temperature change. There is more disagreement over changes in rainfall, which in some regions of the world are so uncertain that some models predict increases and others decreases in precipitation for a given region. And, if anything, scientists worry that the range of warming displayed by this ensemble of models understates the upper limit of what could actually happen, because of the many complexities that models don't yet include, such as the response of vegetation to climate change.
Local Warming
The US July difference map shows that the Midwest and West are the most dramatic hot spots. Temperatures there are expected to rise roughly four to five degrees, because high pressure tends to set up and dominate in these regions. As we mentioned in the video piece, this is consistent with what we've already seen over the past 30 to 40 years. Here is a look at how July temperatures averaged over the period 1998 to 2007 (the most recent decade) compare to July average temperatures from 1968 to 1996.
We also presented a closer look at three specific cities: New York City; Kansas City, Kansas; and Boise, Idaho. The average high temperature in July in New York City today is 84 degrees, but by the year 2050 it jumps to 88 degrees. For Kansas City, July average high temperatures today are 88 degrees, but by 2050 they're up five degrees. And for Boise, you go from an average high of 90 all the way to 96 degrees, a six-degree difference.
We produced this comparison by looking at the changes in mean temperature in the grid box where each city is located. Other climate processes too detailed to be represented in such boxes may influence temperature in big cities, in particular the urban heat island effect. Of course, that would likely enhance the degree of warming projected here.
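For the curious, finding "the grid box where each city is located" amounts to a simple index calculation. Here is a hypothetical sketch assuming a regular 2-degree latitude-longitude grid (roughly the 200 km cells described earlier); real model grids differ in their details:

```python
# Hypothetical lookup of the grid cell containing a city on a regular
# 2-degree lat/lon grid. Illustrative only; actual GCM grids vary.
import math

def grid_cell(lat: float, lon: float, spacing_deg: float = 2.0):
    """Return (row, col) indices of the cell containing (lat, lon)."""
    row = math.floor((lat + 90) / spacing_deg)
    col = math.floor((lon + 180) / spacing_deg)
    return row, col

print(grid_cell(40.7, -74.0))   # New York City
print(grid_cell(39.1, -94.6))   # Kansas City
print(grid_cell(43.6, -116.2))  # Boise
```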
Even imposing changes in mean temperature onto current maximum temperatures to project changes in these "extremes", as we did, may be a conservative choice. This comes down to the statistical characteristics of the temperature record at a location: collected over time, the measurements form a roughly bell-shaped distribution around a mean value. The way we estimate changes in maximum temperatures, which are the measurements falling in the right tail of that bell-shaped curve, is to assume that the change in mean temperature simply shifts the entire distribution to the right without changing its shape. Studies have indicated that these extremes may actually take on a life of their own, with changes in variability (the width of the bell-shaped curve) adding to the changes in the mean. If that is true, the tail of the distribution is not just shifting but stretching to the right, making higher temperatures more and more likely. So, we caution against taking these projected changes in maximum temperatures too literally, but if anything we would bet on them being conservative estimates.
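Here is a minimal numerical sketch of that shift-versus-stretch argument, with made-up numbers: a normal distribution of daily July highs with a mean of 84 F and a standard deviation of 5 F, and an arbitrary 95 F "extreme heat" threshold:

```python
# Probability of exceeding an extreme-heat threshold under (a) today's
# hypothetical distribution, (b) a pure mean shift of +4 F, and (c) a
# mean shift plus a wider distribution. All numbers are illustrative.
from scipy.stats import norm

threshold = 95.0  # arbitrary "extreme heat" threshold, deg F

today = norm.sf(threshold, loc=84, scale=5)      # P(T > 95) now
shifted = norm.sf(threshold, loc=88, scale=5)    # mean shift only
stretched = norm.sf(threshold, loc=88, scale=6)  # shift + more variance

print(f"today:           {today:.1%} of days above {threshold} F")
print(f"shift only:      {shifted:.1%}")
print(f"shift + stretch: {stretched:.1%}")
```

Even in this toy example, the shift alone multiplies the frequency of days above the threshold several times over, and adding variability on top increases it further, which is exactly why a pure-shift assumption tends to be conservative.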