Weatherdem's Weblog

Bridging climate science, citizens, and policy


GHG Emissions: 2C Pathways Are A Fantasy

A new commentary piece in Nature Climate Change makes significant errors and propagates a mistaken core assumption that too many in the climate community share: that with enough [insert: political will or technological breakthroughs], 2C warming is still an achievable goal.

I have disagreed with this assumption and therefore its conclusion for six years – ever since establishment Democrats decided to waste valuable political capital on a right-wing insurance bill with a vague promise that climate legislation would come someday.  That decision essentially assured that, absent a global economic collapse or catastrophic geologic events, our species would easily push the planet past 2C warming.

The following graphic shows global historical emissions in solid black.  The green curve represents the fantasy projection of an emissions pathway that leads to <2C global warming.  As you can see, emissions have to start declining this year in the assumed scenario.  The yellow curve represents what is likely to happen if climate action is delayed for 8 years while this year’s emissions hold constant.  Achieving the same long-term warming cap becomes increasingly difficult because of that 8-year delay.

The red curve builds on the yellow curve projection by keeping the next 8 years’ emissions constant but reducing federal money to research decarbonization technology.  That research is the linchpin of any emissions pathway that could potentially lead to a less warm climate.  Decarbonization technology has to be not only fully researched but fully deployed on a planetary scale for the 2C pathway to happen.  It’s hard to see on this graph, but global emissions have to go net negative for us to achieve <2C warming.  The yellow curve has a harder time achieving that than the green curve, and the red curve doesn’t get there even a century from now.  But the red curve isn’t the most likely pathway – it wasn’t in 2010 and it isn’t today.

The most likely pathway is the solid black curve out to 2125.  It assumes the same things as the red curve and adds an important component of reality: emissions are likely to increase in the near term due to continued growth in fossil fuel use.  Natural gas and coal plants continue to be built – any assumption otherwise might be interesting academically but has no place in the real world.  By assuming otherwise, scientists make themselves a target of future policy makers, because the latter won’t pay attention to the nuanced arguments scientists make once it’s clear we’re hurtling past 2C.  Once we burn increasing amounts of fossil fuels during the next 8 years (as we did the 8 years before that, and so on), it becomes harder still to cut emissions fast enough to keep global warming <2C.  The reasons should be obvious: the emitted GHGs will radiatively warm the planet as long as they’re in the atmosphere, and it will take even more technological breakthroughs to achieve the level of carbon removal necessary to keep warming below any given level.


The authors recognize this challenge:

[…]to remain within a carbon budget for 2 °C in the baseline scenario considered, peak reduction rates of CO2 emissions around 2.4% per year are needed starting mitigation now. A global delay of mitigation action of eight years increases that to 4.2% per year (black dashed in Fig. 1a) — extremely challenging both economically and technically. The only alternative would be an overshoot in temperature and negative emissions thereafter. Research in negative emissions should therefore be a priority, but near term policy should work under the assumption that such technology would not be available at large scale and low cost soon.
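The delay penalty the authors describe can be sketched with toy arithmetic.  The numbers below (current emissions, remaining budget, horizon) are illustrative assumptions of mine, not the commentary's data, so the rates that come out differ from the paper's 2.4% and 4.2% figures – but the qualitative effect of an 8-year delay on the required cut rate is the same:

```python
# Toy illustration: why delaying mitigation raises the required cut rate.
# All numbers below are illustrative assumptions, not the commentary's data.

def cumulative_emissions(e0, rate, years):
    """Total emissions if annual emissions e0 decline by `rate` per year."""
    total, e = 0.0, e0
    for _ in range(years):
        total += e
        e *= (1.0 - rate)
    return total

def required_rate(e0, budget, years, lo=0.0, hi=1.0):
    """Bisect for the constant annual decline rate that stays within `budget`."""
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if cumulative_emissions(e0, mid, years) > budget:
            lo = mid  # still over budget: need faster cuts
        else:
            hi = mid
    return hi

E0 = 40.0          # assumed current emissions, GtCO2/yr
BUDGET = 800.0     # assumed remaining carbon budget, GtCO2
HORIZON = 100      # years considered

now = required_rate(E0, BUDGET, HORIZON)
# Delay 8 years at constant emissions, then mitigate with the shrunken budget.
delayed = required_rate(E0, BUDGET - 8 * E0, HORIZON - 8)
print(f"start now: {now:.1%}/yr, after 8-year delay: {delayed:.1%}/yr")
```

Eight years of flat emissions eat a fifth of this assumed budget, so the required decline rate jumps sharply – the same shape of result the authors report.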

I disagree with the authors’ conclusion:

Society is at a crossroad, and the decisions made in the US and elsewhere over the next 4–8 years may well determine if it is possible to limit climate change to levels agreed in Paris.

We passed the crossroad already.  It really doesn’t matter when; the fact is we passed it.  I think it is a waste of time to examine low-end emission scenarios for policy purposes.  They serve some scientific use, but policy makers need relevant scientific advice, and 2C scenarios don’t provide it.  They perpetuate a myth and therefore pose a real danger to society.  The so-called reality-based community needs to critically self-examine what it’s doing and why.  We’re headed for >3C warming and we need to come to terms with what that means.


Climate Papers

I found this article from a Tweet this morning:
Prof John Mitchell: How a 1967 study greatly influenced climate change science

The Carbon Brief blog asked climate scientists to nominate the most influential refereed climate paper.  Manabe & Wetherald’s 1967 paper, “Thermal Equilibrium of the Atmosphere with a Given Distribution of Relative Humidity,” was the winner.  The paper was the first to incorporate in a model the transfer of heat from the Earth’s surface to the atmosphere and back.  Their model produced surface temperatures that were closer to reality than previous efforts.  They also tested constant and doubled atmospheric CO2 and found the global mean temperature increased by 2.4C under the doubling scenario.  In a nutshell, a simple model in 1967 projected the same warming signal as dozens of more sophisticated models do today.
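Manabe & Wetherald's radiative–convective model is far beyond a blog sketch, but the headline result – a fixed warming per CO2 doubling – can be illustrated with a zero-dimensional energy-balance calculation.  The logarithmic forcing expression is the standard simplified approximation (Myhre et al.); the sensitivity parameter is an assumed value chosen to reproduce the 1967 figure, not something derived here:

```python
import math

def co2_forcing(c, c0=280.0):
    """Simplified CO2 radiative forcing, dF = 5.35 * ln(C/C0), in W/m^2."""
    return 5.35 * math.log(c / c0)

# Assumed climate sensitivity parameter, ~0.65 K per W/m^2, chosen so a
# doubling (~3.7 W/m^2 of forcing) yields ~2.4 C, matching the 1967 result.
LAMBDA = 0.65

def equilibrium_warming(c, c0=280.0):
    """Equilibrium warming for concentration c relative to baseline c0."""
    return LAMBDA * co2_forcing(c, c0)

print(equilibrium_warming(560.0))  # doubling from 280 ppm: ~2.4 C
```

The point of the exercise is the one the post makes: the doubling response falls out of very simple physics, which is why a 1967 model and today's models agree on the signal.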

I am not the first to pose the following question: what additional value do today’s extensive models provide over simple models?  Climate scientists still use simple models in their investigations.  They’re obviously useful.  But posing the question differently addresses my more recent interests: does the public derive more value from today’s climate model results than it did from simpler and cheaper models?  The most obvious addition to me is the increasing ability to resolve regional climate change, which is more variable than the global mean.  I do wonder how the public would react if they heard that climate models largely generate the same projections, given the amount of money invested in their development and analysis.  We have a partial answer already with the growth of climate skeptics in the public sphere.  Some people are obviously drawn to the problem.  As complex as all the aspects of the problem are, and as busy as most people are, perhaps it is in science’s best interest not to make too much noise.

I will also note that one of the drawbacks of climate science in the academy is the utter lack of historical context for results.  My experience really has been the proverbial information dump as part of the information deficit model of learning.  The Facts Speak For Themselves.  I don’t remember hearing about this article that so many in my field consider seminal.  My colleagues would benefit from exposure to the history of their science.


Three Factors Contributed To Natural Global Warming 14,500 Years Ago

A paper in the July 17th issue of Science presents results identifying three factors that contributed to the last major period of natural global warming:

  • an increase of about 40 parts per million in atmospheric carbon dioxide
  • a strengthening of the Atlantic Ocean’s conveyor belt circulation
  • the release of heat stored in the ocean over thousands of years

Known as the Bølling-Allerød, this period about 14,500 years ago saw the climate system warm suddenly (in climatological terms) and strongly.  Critical steps along the way began with glacial melt.  Once that stopped, the enormous subsurface heat that had accumulated for 3,000 years was released over mere decades.  This huge heat flux melted the sea ice in the Arctic and warmed Greenland.  In terms that most of us can understand, global sea level rose by 16 feet and temperatures in Greenland soared by up to 27 degrees Fahrenheit over several hundred years.  This shows that global warming events can occur quite rapidly and have devastating effects on the entire globe.

The findings come from simulations run with the Community Climate System Model (CCSM), a collaborative modeling effort based at NCAR.

What’s similar in today’s world?  Well, the CO2 increase for starters.  We have raised the atmospheric CO2 concentration by roughly 100 ppm over pre-industrial levels, and the rate of increase has accelerated to about 2 ppm per year in recent decades.  Today’s concentration is ~385ppm, whereas the average over the past 1000 years was ~280ppm.  The IPCC considered cases where CO2 concentrations by 2100 would range from 500ppm to 1000ppm.  Even with aggressive action to reduce CO2 emissions, concentrations will rise for the remainder of this century.  What will that do?  How is that condition related to the other two factors that contributed to the last period of major global warming?
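A rough constant-growth extrapolation shows how quickly the low end of the IPCC's 2100 range comes into reach.  The ~2 ppm/yr rate and the ~385 ppm starting point are assumptions drawn from the figures above; real growth has been accelerating, so this is a floor, not a forecast:

```python
# Rough extrapolation: years until a given CO2 concentration is reached,
# assuming a constant growth rate.  Both defaults are assumed values
# taken from the figures discussed in the text.
def years_to_reach(target_ppm, current_ppm=385.0, growth_ppm_per_yr=2.0):
    return (target_ppm - current_ppm) / growth_ppm_per_yr

print(years_to_reach(500.0))   # ~58 years at a constant 2 ppm/yr
```

Even under this deliberately conservative assumption, the 500 ppm case arrives well within this century.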

The oceans are expanding due to the addition of thermal energy.  How long will the oceans be able to take up excess heat before re-releasing it into the atmosphere?  That’s an important question that nobody can answer today.

Will the Atlantic Ocean’s conveyor belt circulation strengthen?  Or will we only face 2 of the 3 factors?  These are important questions, and we shouldn’t gamble with the answers in order to protect immoral corporate profits.  I am not overstating things when I say the fate of today’s societies and ecosystems hangs in the balance.


Sweet Modeling Video

I found a very nicely put together short video that shows some of the parameters considered in general circulation models. Go to the following blog:

Ice Blog

The page is in French, so look for the term “ici” (“here”) in parentheses. It’s a link that will let you download the video (36.4MB). Play it in your favorite software and enjoy the show!