The nice part about science is the ability to retest things as new data and better methods become available. In the case of climate change, new data and updated models are producing increasingly higher warming projections for the end of this century. MIT joined other groups in retesting its predictions with its Integrated Global System Model (IGSM), which is used to make probabilistic projections of climate change from 1861 to 2100. Back in 2003, when the original predictions were made, the projected end-of-century median surface temperature was 2.4°C above the climatological average of the preceding century. Armed with additional data and significant updates to the model, their latest projection is an astounding 5.1°C (median value) for the 2091 to 2100 period. That's more than double the value found just a handful of years ago. I can guarantee, and I'm sure they would agree, that their data is not complete, nor does their model account for every critical feedback process, many of which we're only now becoming aware of.
The new study also includes updated predictions of CO2 concentrations over the next 80 years. The new 5th percentile projection, at just under 700ppm, is higher than the 2003 median (50th percentile); current values are 387ppm and rising. The new 50th percentile projection of 866ppm is almost as high as the 2003 95th percentile projection of 900ppm. Finally, the new 95th percentile projection registers at a nearly unfathomable 1100ppm. CO2 concentrations approaching 1100ppm would certainly open the door to out-of-control climate feedback processes, the kind nobody would want to deal with.
Warming in their simulations ranges from 3.1°C to 7.3°C by 2100. They make sure to note that not one of their 400 simulations resulted in a globally averaged temperature increase of less than 2°C. Not one. That's a very significant result. Why the big change? The authors explain:
Rather than interacting additively, these different effects appear to interact multiplicatively, with feedbacks among the contributing factors, leading to the surprisingly large increase in the chance of much higher temperatures.
That multiplicative description is characteristic of non-linear systems, and the climate system is one. It's quite frankly something that many climate change deniers/delayers either don't understand or gloss over. Additive changes in GHG emissions result in multiplicative surface temperature changes down the road, so we don't have to inject much additional CO2 or other gases to generate large temperature increases. Which small additive change in emissions will cause more feedback processes to kick in? We don't know. As such, I don't think it's worth continuing to emit GHGs until we see that the feedbacks have kicked in; by then it will be too late to slow things down.
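To see why the distinction matters, here is a purely illustrative toy calculation (not the IGSM, and the feedback values are made up for the sake of the example): three hypothetical feedback factors are applied to a 2.4°C baseline, once by simple addition of their effects and once multiplicatively, where each feedback amplifies the others.

```python
# Toy illustration only: hypothetical fractional contributions of
# three feedback factors (invented numbers, not from the MIT study)
factors = [0.10, 0.15, 0.20]

baseline_warming = 2.4  # deg C, the 2003 median projection cited above

# Additive combination: the effects simply sum
additive = baseline_warming * (1 + sum(factors))

# Multiplicative combination: each feedback amplifies the total so far
multiplicative = baseline_warming
for f in factors:
    multiplicative *= (1 + f)

print(f"additive:       {additive:.2f} C")        # 3.48 C
print(f"multiplicative: {multiplicative:.2f} C")  # 3.64 C
```

The gap between the two grows quickly as more feedbacks are added or as individual feedbacks strengthen, which is the basic reason a handful of interacting processes can push the upper tail of the projections so much higher than an additive intuition would suggest.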
Another important result: polar amplification is present in their simulations. Just as has already been observed over the past 30 years, polar temperatures are expected to increase more than temperatures across the mid-latitudes and tropics, with some differences between the Northern and Southern Hemispheres. Their median (50th percentile) projection calls for a 10°C rise at the north pole by 2091-2100 compared to 1981-2000, a 7°C rise at 45°N, and a 6°C rise at the Equator. At 45°S, the median temperature change is predicted to be slightly more than 4°C; at the south pole, about 7°C.
Does anybody think Arctic sea ice can persist year round with annual temperatures 10°C warmer? I certainly don't. The report identifies a 5% probability of Arctic Ocean ice disappearing in the summer by 2100. I don't think it will take until 2025 before that happens. Again, the poles are observed and sampled very sparsely in both time and space. We simply don't have a solid understanding of how polar climate dynamics behave, not in stable conditions and certainly not in unstable conditions.