I thought it would be interesting to look at "local" climate change rather than "global" climate change.
So I went to the US Historical Climatology Network
and downloaded the past century's temperature data for the four USHCN stations surrounding Madison (Brodhead, Darlington, Watertown, and Portage). Madison itself doesn't qualify as a USHCN station, probably because of its history of urbanization.
I took the average of those four stations to represent Madison's climate. I also calculated the long-term trend (using a LOESS function) and the two-sigma envelope around that trend. That envelope gives an idea of the year-to-year random variability -- 19 out of 20 years should fall within that envelope.
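The trend-plus-envelope calculation can be sketched roughly like this. This is a minimal LOESS (locally weighted linear fits with tricube weights), and the temperature series here is a synthetic stand-in for the four-station average, not the actual USHCN values:

```python
import numpy as np

def loess(x, y, frac=0.5):
    """Crude LOESS: a locally weighted linear fit at each point,
    using tricube weights over the nearest neighbors."""
    n = len(x)
    k = max(3, int(frac * n))
    trend = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                      # k nearest years
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3  # tricube weights
        coef = np.polyfit(x[idx], y[idx], 1, w=np.sqrt(w))
        trend[i] = np.polyval(coef, x[i])
    return trend

# Synthetic stand-in for the four-station average (illustration only)
rng = np.random.default_rng(1)
years = np.arange(1900, 2012, dtype=float)
temps = 45.0 + 0.01 * (years - 1900) + rng.normal(0.0, 1.2, years.size)

trend = loess(years, temps)
resid = temps - trend
sigma = resid.std(ddof=1)
lower, upper = trend - 2 * sigma, trend + 2 * sigma

# Roughly 19 out of 20 years should land inside the envelope
frac_inside = np.mean((temps >= lower) & (temps <= upper))
```

The `frac` smoothing parameter is a guess; the real analysis would tune it so the trend captures multi-decade changes without chasing individual years.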
Unfortunately the USHCN data don't yet include 2012, but I was able to track down a preliminary version of the station data for Watertown and Portage. In recent years the average of those two has been virtually identical to the average of all four stations, so I think it's a fair comparison. I showed 2012 on the graph but didn't use it in any of the calculations that follow.
Here are the results:

Fig 1. Thin red line is annual temperature; thick red line is long-term trend; dashed lines are two-sigma envelope. Red dot at upper right is preliminary figure for 2012.
Next, I compared this to the output from a global climate model simulation of the 20th century. The model was the National Center for Atmospheric Research's Community Climate System Model (NCAR CCSM). Here is the comparison of the model results for south-central Wisconsin to the actual weather station data:

Fig 2. Same as previous, but blue lines represent climate model output.
Note that the model's temperature trend has matched the observed temperature trend pretty well over the past few decades. Since 1970, the model shows less warming than actually occurred -- about 50% less.
On the other hand, the model seems to underestimate how much random year-to-year variation in temperature there is. The standard deviation of the detrended residuals from the model is about half of the value from the actual weather station data.
So, model-Wisconsin has a bit less warming than real-Wisconsin, and also a bit less random year-to-year variation.
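That variability comparison comes down to the standard deviation of the detrended residuals. A sketch, using a plain linear detrend as a stand-in for the LOESS fit and made-up series in place of the real station and model data:

```python
import numpy as np

def detrended_sd(years, temps):
    """Standard deviation of residuals after removing a linear trend
    (a simple stand-in for detrending against the LOESS curve)."""
    coef = np.polyfit(years, temps, 1)
    resid = temps - np.polyval(coef, years)
    return resid.std(ddof=1)

rng = np.random.default_rng(2)
years = np.arange(1900, 2012, dtype=float)
trend = 45.0 + 0.01 * (years - 1900)
obs = trend + rng.normal(0.0, 1.2, years.size)    # observed: noisier
model = trend + rng.normal(0.0, 0.6, years.size)  # model: about half the noise

# With these synthetic inputs, the ratio comes out near 0.5,
# mirroring the roughly-half variability noted above.
ratio = detrended_sd(years, model) / detrended_sd(years, obs)
```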
The next figure adds in a typical projection of 21st century temperatures for the Madison area, from the same NCAR CCSM model. It's based on the IPCC's "A1B" emissions scenario, a scenario that assumes that CO2 emissions keep growing for a while before stabilizing in the middle of the century. It's considered a "middle-of-the-road" emissions scenario.
Because (as we saw above) the model seems to be underestimating the random year-to-year variability in temperature, I scaled up the variability to match that of the observed data from the 20th century. This doesn't affect the long-term trend
at all -- I didn't change that -- but it gives a more realistic view of how "noisy" the temperatures are likely to be in any given time period.

Fig 3. Same as previous, but with expanded variance for model output; green lines represent IPCC A1B scenario projection.
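The variance inflation is just a rescaling of the model's residuals about its own trend, which leaves the trend itself untouched. A sketch with placeholder series (the trend slope and sigma values here are illustrative, not the CCSM numbers):

```python
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(2000, 2100, dtype=float)
model_trend = 46.0 + 0.05 * (years - 2000)           # made-up warming trend
model = model_trend + rng.normal(0.0, 0.6, years.size)

sigma_obs = 1.2                                      # observed 20th-c. variability
sigma_model = (model - model_trend).std(ddof=1)

# Scale residuals so the model's year-to-year noise matches the
# observed sigma; the long-term trend itself is unchanged.
scaled = model_trend + (model - model_trend) * (sigma_obs / sigma_model)
```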
Oops! Houston, we have a problem. The temperatures seem to have gone off the top of the graph. Here's a re-scaled version:

Fig 4. Same as previous, with addition of line representing 2012 temperature.
For comparison, I added a line at the "2012 temperature" level. As you can see, by sometime around 2050, what was formerly considered a fairly hot year in Wisconsin (actually, the hottest on record...) will be normal, or even cooler than normal.
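The "record year becomes normal" reading is just the first year where the projected trend reaches the 2012 level. With a made-up linear projection standing in for the CCSM output (the warming rate and 2012 value below are placeholders):

```python
import numpy as np

years = np.arange(2013, 2101)
proj_trend = 46.5 + 0.06 * (years - 2013)  # hypothetical trend, deg F per year
t2012 = 48.8                               # placeholder for the 2012 record

# First projected year whose trend value meets or exceeds the record;
# with these made-up numbers it lands in the early 2050s.
crossover = years[np.argmax(proj_trend >= t2012)]
```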
So. Two questions that I'd ask myself:

How much does average temperature really matter?
Any change in water availability is probably much more important ... but it's also much harder to predict how this will change. In general, a warmer climate means more evaporation and a drier landscape. But whether precipitation increases by 10% or decreases by 10% will matter a lot. Unfortunately, I don't think there's much agreement on this.
The statistics of "extreme" weather also matter a lot. A slight increase (or decrease) in the frequency of droughts, floods, tornadoes, etc. could cost (or save) the state a lot. Again, though, I don't think this kind of thing can be predicted reliably at the regional scale we're looking at here.

Do I really believe this?
I wouldn't put too much weight in the predictions of global climate models. I think they're still a bit too primitive and limited to make reliable predictions, and the uncertainty in the emissions scenario is very large. We could end up burning much more fossil fuel than expected, or much less.
That said, I think this is an interesting exercise. The real-Wisconsin in 2050 or 2100 might be hotter or cooler than the model-Wisconsin, but without any way to judge which of those is more likely, I'd go with this model as a rough approximation of what to expect.
There are other ways of studying past climate change that don't rely on models. Even if no one had ever programmed a numerical climate model, we would expect the global temperature to increase just due to the fact that we know we're increasing the CO2 concentration of the atmosphere, and we know that CO2 absorbs infrared radiation. You don't need a model to tell you that a planet whose atmosphere resists the loss of heat to space will warm up.
---------------------

As I understand it, 2012 was the warmest year on record for the continental US, and from my calculations it also set a new record for the south-central Wisconsin area. According to the Wisconsin State Climatologist's Office, for the state as a whole 2012 was tied for second-warmest on record.