Weatherdem's Weblog

Bridging climate science, citizens, and policy



April 2018 CO2 Concentrations: 410.26 ppm

During April 2018, the Scripps Institution of Oceanography measured an average CO2 concentration of 410.26 ppm at the Mauna Loa, Hawai’i Observatory.

This value is important.  Why?  Because 410.26 ppm is the largest CO2 concentration value for any April in recorded history.  The unending trend toward higher concentrations with time, no matter the month or specific year-over-year value, as seen in the graphs below, is more significant.

When I wrote about this topic a few years ago, no monthly or annual CO2 average had yet exceeded 400 ppm.  In the intervening time, concentrations passed that threshold.  In fact, monthly CO2 concentrations have not fallen below 400 ppm since January 2016, and annual concentrations have not done so since 2015.

How do concentration measurements change during calendar years?  The following two graphs demonstrate this.


Figure 1 – Monthly CO2 concentration values (red) from 2014 through 2018 (NOAA).  Monthly CO2 concentration values with seasonal cycle removed (black).


Figure 2 – 60 year time series of CO2 concentrations at Mauna Loa Observatory.  The red curve represents the seasonal cycle based on monthly average values.  The black curve represents the data with the seasonal cycle removed to show the long-term trend.  This graph shows the recent and ongoing increase in CO2 concentrations.  Remember that as a greenhouse gas, CO2 increases the radiative forcing of the Earth, which increases the amount of energy in our climate system.
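To illustrate how the black trend curves in Figures 1 and 2 are produced, here is a minimal sketch that strips the seasonal cycle from monthly Mauna Loa values with a 12-month centered moving average.  NOAA's published trend uses a more careful curve fit, and the file name and column layout below are assumptions for illustration only.

```python
# Sketch: approximate NOAA's deseasonalized trend with a 12-month centered
# moving average of monthly Mauna Loa CO2 values.  The CSV name and columns
# are assumed; NOAA's official trend series uses a more sophisticated fit.
import pandas as pd

df = pd.read_csv("co2_mlo_monthly.csv")  # assumed columns: year, month, co2_ppm

# Centered 12-month rolling mean smooths out the seasonal (red) cycle
df["co2_trend"] = df["co2_ppm"].rolling(window=12, center=True, min_periods=12).mean()

# Year-over-year difference (e.g., April 2018 minus April 2017) also avoids
# the seasonal cycle because it compares the same calendar month
df["yoy_change_ppm"] = df["co2_ppm"].diff(12)

print(df[["year", "month", "co2_ppm", "co2_trend", "yoy_change_ppm"]].tail(12))
```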

CO2 concentrations are increasing at an increasing rate – not a good trend with respect to minimizing future warming.  Natural systems are not equipped to remove CO2 emissions quickly from the atmosphere.  Indeed, natural systems will take tens of thousands of years to remove the CO2 we emitted in the course of a couple short centuries.  Human systems do not yet exist that remove CO2 from any medium (air or water) at a large enough scale to make a difference to planetary CO2 concentrations.  CO2 removal systems are not likely to exist for some time.  I’ve written a sentence like that for nearly a decade now.  Unfortunately, NOAA will extend the right side of the above graphs for years and decades to come.

CO2 concentrations rise when there is more CO2 emitted into the Earth system than removed from it.  Recently, humans have only increased the CO2 emission rate, which means that CO2 concentrations have to rise, absent carbon sinks becoming more efficient.
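A back-of-the-envelope conversion shows why concentrations must rise at current emission rates.  The conversion of roughly 2.13 GtC (about 7.8 GtCO2) per ppm of atmospheric CO2 is standard; the emission rate and airborne fraction below are round-number assumptions for illustration.

```python
# Back-of-the-envelope: concentration rise implied by current emissions.
# ~2.13 GtC per ppm is a standard conversion; the emission rate and the
# airborne fraction (share of emissions that stays in the air after ocean
# and land uptake) are illustrative round numbers.
GTCO2_PER_PPM = 2.13 * (44.0 / 12.0)   # ~7.8 GtCO2 per ppm of CO2

annual_emissions_gtco2 = 40.0          # assumed global emissions, GtCO2/yr
airborne_fraction = 0.45               # assumed fraction remaining airborne

ppm_per_year = annual_emissions_gtco2 * airborne_fraction / GTCO2_PER_PPM
print(f"Implied rise: about {ppm_per_year:.1f} ppm per year")  # ~2.3 ppm/yr
```

That result is close to the 2-3 ppm per year increases visible in Figure 2, which is why the trend cannot bend until emissions fall or sinks take up a larger share.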

The rise in CO2 concentrations will slow down, stop, and reverse when we decide it will.  It depends first and foremost on the rate at which we emit CO2 into the atmosphere.  We can choose 400 ppm or 450 ppm or almost any other concentration target (350 ppm seems out of reach within the next couple hundred years).  That choice is dependent on the type of policies we decide to implement.  It is our current policy to burn fossil fuels because we think doing so is cheap, although current practices are massively inefficient and done without proper market signals.  We will widely deploy clean sources of energy when they are cheap, the timing of which we have some control over.  We will remove CO2 from the atmosphere if we have cheap and effective technologies and mechanisms to do so, which we also control to some degree.  These future trends depend on today’s innovation and investment in research, development, and deployment.



GHG Emissions: 2C Remains a Fantasy

In the post-Paris Accord climate world, analyses of theoretical scenarios that have a reasonable likelihood of keeping global temperatures “only” 1.5°C warmer than pre-industrial levels have become all the rage (one of the latest examples I’ve seen).  I’ve written posts about the mythology associated with 2°C scenarios given the reality of countries’ historical CO2 emissions.  Given that reality, 1.5°C scenarios reside even further in the realm of fantasy.

The primary reason is the lack of scalable technologies to remove CO2 from the atmosphere.  That is not a judgment; it is an observation about how things exist in the real world.  Theoretical studies have their utility.  My ongoing anxiety revolves around policy makers’ dependence on those studies to inform their decision making.

The resources required to deploy global-scale renewable or nuclear energy are mind-boggling enough.  If we must add to that infrastructure additional technologies that remove CO2 from the atmosphere, we need to be sober about our expectations of doing so.

For the record, decisions like the Canadian government’s purchase of Kinder Morgan’s Trans Mountain pipeline for C$4.5bn (US$3.45bn) will raise the future requirements for deploying additional renewable energy and CO2 removal technologies.  They make it harder to achieve the already fantastical targets that many climate activists are focused on.  The decision ensures that the recent plateau in CO2 emissions will remain a historical anomaly:

[Image: global CO2 emissions time series showing the recent plateau (Phys.org, 2017)]

Set in the context of future emission scenarios, the decision should be framed more as one that locks us into a warmer global future:

[Image: historical CO2 emissions alongside future emission scenarios (Phys.org, 2017)]

To have any hope of a <2°C world, global CO2 emissions need to peak.  They had not done so as of 2017 and likely will not in 2018 or in the following handful of years, absent some financial or geopolitical disaster.  The right-hand time series is clear: if we continue emitting anywhere near 35-40 GtCO2 every year, it becomes increasingly likely that global temperatures will rise 3-4°C by 2100.
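A toy budget calculation makes the point concrete.  The remaining-budget values below are round numbers of the kind published in IPCC-style assessments; actual estimates vary with the assumed probability of success and with non-CO2 forcing, so treat them as illustrative assumptions rather than authoritative figures.

```python
# Toy calculation: how quickly constant emissions exhaust a remaining carbon budget.
# Budget values are illustrative round numbers (published estimates vary widely);
# the emission rate matches the 35-40 GtCO2/yr discussed above.
annual_emissions_gtco2 = 40.0   # assumed constant global emissions, GtCO2/yr

remaining_budgets_gtco2 = {
    "~1.5C": 400.0,    # assumed remaining budget from the late 2010s
    "~2.0C": 1200.0,   # assumed remaining budget from the late 2010s
}

for target, budget in remaining_budgets_gtco2.items():
    years = budget / annual_emissions_gtco2
    print(f"{target}: budget exhausted in roughly {years:.0f} years at constant emissions")
```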

The difference between 1.5°C and 2°C sounds very small to most people.  The impacts of that small difference are actually big:

[Infographic: impacts of 1.5°C vs. 2°C of warming (Carbon Brief, 2016)]

The impacts are what policy makers are responsible for.  And this infographic does not show what impacts are likely at 3°C or 4°C.  We can attribute this lack of information to the aforementioned excessive focus on 1.5°C and 2°C by the research community.  It will be hard to decide on climate policy moving forward if we are not appropriately informed about the risks that today’s decisions are locking in.



GHG Emissions: 2C Pathways Are A Fantasy

A new commentary piece in Nature Climate Change continues a pattern of significant errors and propagates a mistaken core assumption that too many in the climate community share: that with enough [insert: political will or technological breakthroughs], limiting warming to 2C is still an achievable goal.

I have disagreed with this assumption and therefore its conclusion for six years – ever since establishment Democrats decided to waste valuable political capital on a right-wing insurance bill with a vague promise that climate legislation would come someday.  That decision essentially assured that, absent a global economic collapse or catastrophic geologic events, our species would easily push the planet past 2C warming.

The following graphic shows global historical emissions in solid black.  The green curve represents the fantasy projection of an emissions pathway that leads to <2C global warming.  As you can see, emissions have to start declining this year in the assumed scenario.  The yellow curve represents what is likely to happen if climate action is delayed for 8 years and this year’s emissions remain constant during those 8 years.  It gets increasingly difficult to achieve the same long-term warming cap because of that 8 year delay.

The red curve builds on the yellow curve projection by keeping the next 8 years’ emissions constant but reducing federal money for decarbonization research.  That research is the linchpin of any emissions pathway that could potentially put us on course for a less warm climate.  Decarbonization technology has to be not only fully researched but fully deployed on a planetary scale for the 2C pathway to happen.  It’s hard to see on this graph, but global emissions have to go net negative for us to achieve <2C warming.  While the yellow curve has a harder time achieving that than the green curve, the red curve doesn’t get there even one century from now.  But the red curve isn’t the most likely pathway – it wasn’t in 2010 and it isn’t today.

The most likely pathway is the solid black curve out to 2125.  It assumes the same things as the red curve and adds an important component of reality: emissions are likely to increase in the near-term due to continued increased fossil fuel use.  Natural gas and coal plants continue to be built – any assumption otherwise might be interesting academically but has no place in the real world.  By assuming otherwise, scientists make themselves a target of future policy makers because the latter won’t pay attention to the nuanced arguments the scientists will make once it’s clear we’re hurtling past 2C.  Once we burn increasing amounts of fossil fuels during the next 8 years (as we did the 8 years before that and so on), it is harder still to cut emissions fast enough to try to keep global warming <2C.  The reasons should be obvious: the emitted GHGs will radiatively warm the planet so long as they’re in the atmosphere and it will take even more technological breakthroughs to achieve the level of carbon removal necessary to keep warming below some level.

[Figure: historical global emissions and projected emission pathways (Nature Climate Change commentary, December 2016)]

The authors recognize this challenge:

[…]to remain within a carbon budget for 2 °C in the baseline scenario considered, peak reduction rates of CO2 emissions around 2.4% per year are needed starting mitigation now. A global delay of mitigation action of eight years increases that to 4.2% per year (black dashed in Fig. 1a) — extremely challenging both economically and technically. The only alternative would be an overshoot in temperature and negative emissions thereafter. Research in negative emissions should therefore be a priority, but near term policy should work under the assumption that such technology would not be available at large scale and low cost soon.
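The quoted peak reduction rates come from the authors’ scenario machinery, but a much simpler toy model captures the same logic: if emissions decline exponentially at rate r, cumulative future emissions are roughly E0/r, so a fixed budget plus a delay at constant emissions forces a steeper decline.  The budget and emission values below are assumptions for illustration and will not reproduce the paper’s exact 2.4% and 4.2% figures.

```python
# Toy model: required exponential decline rate to stay within a carbon budget,
# with and without a delay at constant emissions.  The budget and emission
# values are illustrative assumptions, not the paper's scenario inputs.
def required_decline_rate(budget_gtco2, current_rate_gtco2, delay_years=0.0):
    """If emissions follow E0*exp(-r*t), cumulative emissions from t=0 onward
    equal E0/r, so r = E0 / (budget remaining after the delay)."""
    remaining = budget_gtco2 - delay_years * current_rate_gtco2
    if remaining <= 0:
        raise ValueError("Budget already exhausted during the delay.")
    return current_rate_gtco2 / remaining

E0 = 40.0        # assumed current global emissions, GtCO2/yr
budget = 1200.0  # assumed remaining budget for ~2C, GtCO2

for delay in (0, 8):
    r = required_decline_rate(budget, E0, delay)
    print(f"Delay of {delay} years: emissions must fall ~{100 * r:.1f}% per year")
```

Even in this stripped-down version, an eight-year delay visibly steepens the required decline, which is the commentary’s central point.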

I disagree with the authors’ conclusion:

Society is at a crossroad, and the decisions made in the US and elsewhere over the next 4–8 years may well determine if it is possible to limit climate change to levels agreed in Paris.

We passed the crossroad already.  It really doesn’t matter when; the fact is we passed it.  I think it is a waste of time to examine low-end emission scenarios for policy purposes.  They serve some scientific use.  Policy makers need relevant scientific advice, and 2C scenarios don’t provide it.  They perpetuate a myth and therefore pose a real danger to society.  The so-called reality-based community needs to critically self-examine what it’s doing and why.  We’re headed for >3C warming and we need to come to terms with what that means.



Wildfire – Policy and climate change

I read a wildfire article today that was breathless about the scope of total acreage burned across the drought-stricken northwest US and of course included a climate change angle.  This is the first wildfire article I’ve read that did not include some mention of decades of ill-conceived fire policies in the intermountain West.

Let’s not mince words: a lot of fires are burning on a lot of acres this year primarily because of those man-made policies.  Millions of acres of forest are overcrowded because people put out small fires for decades and allowed trees (fuel) to grow and grow.  Fire is a natural process that we purposefully interrupted.  Prior years with extensive fires also generated media and environmentalist attention.  The difference between then and now is that climate activists politicized the science.  An EcoWatch article now contains no mention of historical decisions because it is more important to satisfy the environmentalist base by claiming nature is pure without humans and impure with us.

This is disappointing but not surprising.  For now, I am glad there are more responsible media outlets that continue to acknowledge the very real and dominant influence people have on forests (forest management), the very real and strong influence nature has on forests (drought), as well as the growing influence that people will have on forests in the future (climate change).



Climate Links: Resilient Arctic & Pacific Decadal Oscillation

Items that caught my eye this morning on Twitter:

“Scientists: Arctic Is More Resilient To Global Warming Than We Thought”  Who is “we” and why did “we” think the Arctic wasn’t resilient?  The second question is easier to answer: because climate scientists with bullhorns told us for years that the Arctic was “DOOMED!”  I’ve written about this topic and knew when the record low extent and volume occurred in 2012 that it was likely one bad year and not the end of the world.  I haven’t seen or heard from those same scientists who breathlessly told the public about the doomed Arctic in 2012 that they were wrong (boy, were they wrong!).  While this article makes that point today (3 years too late), I don’t expect anyone to remember it when the Arctic has another bad summer.  2013 was a good Arctic summer – cooler than recent years – and guess what?  Arctic ice responded by … growing – you know, what it’s supposed to do according to physics.  Headslap.

“To truly grasp what we’re doing to the planet, you need to understand this gigantic measurement”  This is an article about “giga”: what the prefix means and why people should know about it.  I disagree with it from the perspective that the explanations don’t draw on anything people can actually relate to.  For example: a gigaton is “more than 6 million blue whales”.  Who knows how much a big whale weighs?  Can you envision 6 million of them?  The basic problem with giga is that it is so big it defies our everyday experiences.  The superficial problem is that, despite being written by someone who understands science, the article likely misses its intended audience and thus is not useful.
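For what it’s worth, the arithmetic behind the comparison is simple; whether the comparison lands is another matter.  The whale mass below is a rough assumption.

```python
# The arithmetic behind the "blue whales per gigaton" comparison.
# A large adult blue whale is on the order of 150 tonnes (rough assumption).
GIGATON_IN_TONNES = 1_000_000_000
blue_whale_tonnes = 150  # assumed mass of a large adult blue whale

whales_per_gigaton = GIGATON_IN_TONNES / blue_whale_tonnes
print(f"About {whales_per_gigaton / 1e6:.1f} million blue whales per gigaton")
```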

At the risk of delving too far into technical issues, this article is useful for me personally based on relevance to my geographic region’s upcoming winter weather: “Subtle Differences to Previous El Niños Key to Winter Forecast, And Why the PDO Matters“.  The PDO is the Pacific Decadal Oscillation – a low-frequency natural climate pattern that has direct and indirect influences on weather across at least the western half of the nation.  Many people in my professional community are aware that there is currently a strong El Nino in the equatorial Pacific.  I noted from recent write-ups that there are also warm sea surface temperature anomalies in the north Pacific (e.g., off the west coast of the U.S.) – in addition to the warm anomalies across the central and eastern equatorial Pacific (hallmarks of El Ninos).

The northern Pacific anomalies are an oddity and this article helps explain why they might be present.  In the late 1990s, the PDO likely entered a cool phase, which helps explain a couple of things.  First, El Niños during the 2000s and 2010s had lower amplitude (they were cooler).  And second, global temperatures didn’t rise as quickly after 1998 as they did during the previous PDO phase (1977-1998).  This observation is also known as the global warming “hiatus” or “pause”.  A cool PDO means Pacific sea surface temperatures are cooler than average.  One effect of this is that the Pacific absorbs heat from the atmosphere and keeps the atmosphere cooler than it otherwise would be.  The opposite is also true: warmer than average Pacific SSTs release more heat to the atmosphere than is absorbed and the atmosphere warms more than average.

Which leads me to the next article: “Has the PDO Flipped to a Warming Phase?”  The PDO typically stays in a warm or cool regime for 10 to 30 years – hence the “multidecadal” characterization.  As I wrote above, the PDO moved into a cool phase in the late 1990s.  Recent positive temperature anomalies (since 2014) might indicate that the PDO is only temporarily positive, or they might indicate that the PDO has switched back to a positive, or warm, phase.  That would have significant implications for global weather patterns until it switches back to its negative phase.  For example, I wrote above that there is currently a strong El Niño.  If the PDO has switched to a positive phase, it could enhance any El Niño.  It would also do what I described above: release heat back into the atmosphere.  This means that if the PDO has indeed switched, the global warming “hiatus” is over.

Back to the Colorado forecast.  With a warmer north and equatorial Pacific (positive PDO and ENSO), what kind of winter can we expect?  The answer won’t surprise you: it’s hard to tell.  There are few similar historical examples that scientists can use to issue a reliable forecast.  They have to determine if warmer ocean temperatures persist and how the atmosphere’s jet stream responds.  Ridge and trough locations over the western US will ultimately determine when we get snow and how much snow we get during each storm.



Commercialized Carbon Removal

The Guardian has an interesting article about startups working on capturing carbon dioxide from the air – a necessary technology if we are serious about any kind of climate goal with less CO2 than our business-as-usual pathway.  What are the companies doing and why is it important?  The article highlights three businesses that started in the 2000s: Carbon Engineering, Global Thermostat, and Climeworks.  “All are aiming to make low-carbon fuels, using recycled CO2 and renewable energy to power the process.”  They’re making low-carbon fuels because small markets for them already exist.  Those markets will have to increase in size and scope for these startups to grow.

That’s one important aspect.  Another is the so-called “climate goal”.  As stated by international groups and increasing numbers of scientists, the climate goal is keeping CO2 concentrations below a certain threshold in order to keep global temperatures from increasing more than 2C.  There is another international climate conference scheduled for later this year – the Conference of the Parties in Paris, France (COP-21).  Leading up to that conference, countries are submitting carbon reduction goals that can be internationally monitored and verified.  Goals stated so far will definitely not meet the <2C target – read about the UK’s goals here, for example, or here about the 45 countries that have already submitted plans.  The reason is simple: we cannot deploy renewable energy generation, institute efficiency measures, or change fuel types fast enough to halve global CO2 emissions within 10 years and go net negative by 2050.

What these companies are doing is directly related to that last goal: net negative CO2 emissions.  It won’t happen by 2050 or any other year unless we push serious money into research, development, and deployment.  It will take the private and public sectors working together for decades to achieve utility-scale CO2 withdrawal from fuel combustion and from the free atmosphere.  These companies, and others, are at the forefront of that industry.  But they need capital – a lot of it.  Cash flow at this juncture is crucial for them to survive long enough to deploy their technologies and evolve them as they run in the real world.  Anyone have a spare million or so dollars sitting around?



Water efficiency

I saw a tweet last night that I found interesting.

“The only thing that can.”  I hope Peter means water-use efficiency for all users.  The graphic he includes with this tweet suggests he’s focused on household toilet usage in California.  I’ll round the numbers used in the graphic: 1980 usage was 800,000 acre-feet per year; current usage without any efficiency measures would be 1,200,000 acre-feet per year.  Current savings from efficiency improvements: 640,000 acre-feet per year.  Additional potential savings: 290,000 acre-feet per year.

The 640,000 AFY is laudable.  That’s a lot of water that Californians don’t have to use and thankfully aren’t using.  That is a real accomplishment.  An additional 290,000 AFY is a good goal to work on – why waste a resource when you don’t have to?

But toilet water usage isn’t the primary use of California water – and it’s that small point that troubled me when I saw the tweet.  Total water usage in California is 40,000,000 AFY.  The 640,000 AFY of savings represents just 1.6% of total usage.  It also represents a >50% reduction from what toilet water usage could be without any efficiency measures.  What I want to know is what efficiencies water-thirsty California agriculture is implementing.  Agriculture is by far the dominant user of water – if that sector achieved just 1% efficiency, how much more water could California save, given the scale of agricultural usage compared to residential usage?
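A quick calculation makes the scale argument explicit.  The roughly 80% agricultural share of developed water use is a commonly cited round number and is treated here as an assumption.

```python
# Scale comparison: toilet efficiency savings vs. a modest agricultural efficiency gain.
# The statewide total and toilet figures come from the graphic discussed above; the
# ~80% agricultural share of developed water use is a commonly cited round number
# (an assumption here, not a figure from the tweet).
total_use_afy = 40_000_000        # statewide water use, acre-feet per year
toilet_savings_afy = 640_000      # achieved savings from toilet efficiency
toilet_potential_afy = 290_000    # additional potential toilet savings

ag_share = 0.80                   # assumed agricultural share of developed use
ag_one_percent_afy = total_use_afy * ag_share * 0.01

print(f"Toilet savings as a share of total use: {100 * toilet_savings_afy / total_use_afy:.1f}%")
print(f"A 1% agricultural efficiency gain: {ag_one_percent_afy:,.0f} AFY "
      f"(vs. {toilet_potential_afy:,} AFY of remaining toilet potential)")
```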

Agriculture is a sizable part of the California economy – a $43 billion industry that generates $100 billion in overall economic activity.  Because of that, agriculture wields political clout in Sacramento.  This means that while physical scientists can inform policymakers on the ongoing drought, we need the social sciences to inform policymakers on how to deal with it.  I would also like to see quantitative results of efficiency gains by sector.



Climate Papers

I found this article from a Tweet this morning:
Prof John Mitchell: How a 1967 study greatly influenced climate change science

The Carbon Brief blog asked climate scientists to nominate the most influential refereed paper.  Manabe & Wetherald’s 1967 paper, “Thermal Equilibrium of the Atmosphere with a Given Distribution of Relative Humidity”, was the winner.  The paper incorporated the transfer of heat from the Earth’s surface to the atmosphere and back into a model for the first time.  Their model produced surface temperatures that were closer to reality than previous efforts.  They also compared constant and doubled atmospheric CO2 and found that global mean temperature increased by 2.4C under the doubling scenario.  In a nutshell, a simple model in 1967 projected the same warming signal as dozens of more sophisticated models do today.
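To put the 1967 number in context: the standard simplified expression for CO2 radiative forcing (a later fit from Myhre et al., not something in the 1967 paper) gives about 3.7 W/m² for a doubling, so Manabe & Wetherald’s 2.4C response implies a sensitivity parameter of roughly 0.65 C per W/m².  A sketch of that arithmetic:

```python
# Back-of-envelope context for Manabe & Wetherald's 2.4C doubling result.
# The 5.35 * ln(C/C0) forcing expression is a later simplified fit (Myhre et al.),
# used here only to translate "doubled CO2" into a forcing in W/m2.
import math

delta_f_doubling = 5.35 * math.log(2.0)   # ~3.7 W/m2 for 2x CO2
warming_per_doubling = 2.4                # Manabe & Wetherald (1967), degrees C

sensitivity_parameter = warming_per_doubling / delta_f_doubling
print(f"Forcing for doubled CO2: {delta_f_doubling:.2f} W/m2")
print(f"Implied sensitivity: {sensitivity_parameter:.2f} C per W/m2")
```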

I am not the first to pose the following question: what additional value do today’s extensive models provide over simple models?  Climate scientists still use simple models in their investigations.  They’re obviously useful.  But posing the question differently addresses my more recent interests: does the public derive more value from today’s climate model results than it did before from simpler and cheaper models?  The most obvious addition to me is the increasing ability to resolve regional climate change, which is more variable than the global mean.  I do wonder how the public would react if they heard that climate models largely generate the same projections, given the amount of money invested in their development and analysis.  We have a partial answer already with the growth of climate skeptics in the public sphere.  Some people are obviously drawn to the problem.  As complex as all the aspects of the problem are and as busy as most people are, perhaps it is in science’s best interest not to make too much noise.

I will also note that one of the drawbacks of climate science in the academy is the utter lack of historical context for results.  My experience really has been the proverbial information dump as part of the information deficit model of learning.  The Facts Speak For Themselves.  I don’t remember hearing about this article that so many in my field consider seminal.  My colleagues would benefit from exposure to the history of their science.



U.S. Energy Information Administration: Reference Projection

EIA released its 2015 reference case for electricity generation between 2000 and 2040.  The upshot: while they expect natural gas and renewables to continue their growth in the U.S.’s overall energy portfolio, coal is still very much in the mix in 2040.  From a climate perspective, if their reference projection becomes reality, we easily pass 2C warming by 2100.

Their reference projection “reflects current laws and regulations—but not pending rules, such as the Environmental Protection Agency’s Clean Power Plan“.  So it is no surprise that current laws and regulations result in passing the 2C threshold (or rather, in GHG emissions that would lead to passing it).  The EPA’s Clean Power Plan isn’t in effect yet – and it will take time to analyze changes to actual generation once its final form does take effect.


Figure 1. EIA’s Reference Case analysis and projection of U.S. electricity generation (2000-2040).

The good news is that renewables’ share grows during the next 25 years.  Again, there’s no surprise there.  Nor is it surprising to see natural gas’ share also grow.  If you look at the left y-axis, the absolute share of renewables exceeds that of natural gas.  The bad news (from a 20th-century climate perspective) is that coal remains 34% of electricity generation in this scenario.  That news is tempered by the fact that, in both absolute and percentage terms, coal use is lower during the next 25 years than the last 15 years.  The absolute numbers are most frustrating from a climate perspective.  In 2040, this scenario projects >1.5 trillion kilowatt hours of coal generation.  Absent additional policy measures, that value remains largely unchanged during the next 25 years.  How do we address that?  Well, beating people over the head with scientific consensus claims hasn’t worked (and won’t in the future either): the American public knows what causes global warming, once you get past self-identity question framing.  Once you interact with Americans on familiar terms, they’re much more willing to support global warming-related policies than many climate activists want you to believe.
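To translate that 1.5 trillion kilowatt hours into climate terms, a typical coal emission intensity of roughly 0.9-1.0 kg CO2 per kWh (a round-number assumption, not an EIA figure) puts U.S. coal generation alone on the order of 1.4-1.5 Gt CO2 per year in 2040.

```python
# Rough conversion of projected coal generation into annual CO2 emissions.
# The emission intensity is a typical round number for coal plants (an assumption),
# not a value taken from the EIA report.
coal_generation_kwh = 1.5e12    # projected U.S. coal generation in 2040, kWh/yr
kg_co2_per_kwh = 0.95           # assumed average coal emission intensity

annual_co2_gt = coal_generation_kwh * kg_co2_per_kwh / 1e12   # kg -> Gt
print(f"Roughly {annual_co2_gt:.1f} Gt CO2 per year from coal generation alone")
```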


Figure 2. EIA’s renewable generation by type.

The EIA projects wind penetration to continue as it has for the last decade – almost doubling in absolute terms in the next 25 years.  We need that deployment and more to make a serious dent in GHG emissions.


Figure 3. EIA’s six cases in their 2015 annual report.

You can see how different assumptions impact EIA’s projections of electricity generation in 2040 compared to the 2013 historical case.  Don’t hope for high oil prices: renewables constitute more than 1 trillion kilowatt hours in that case, but coal also grows to nearly 2 trillion kWh!  Putting dreams aside, I don’t think those coal plants will all be running highly efficient carbon capture and sequestration technologies.

We still need RD&D for multiple technologies.  To do that, we need policies that prioritize innovative – and yes, risky – programs and projects.  The government is the only institution that can reliably assume that level of risk.  If we want to avoid 4C or 6C, we can; we need innovative policies and technologies today to stay below those thresholds.



Warming Pause Research Continues

By my count, there are four leading scientific explanations for the short-term slowdown overlaying the long-term rate of global surface warming.  They are:

Enhanced heat uptake by the world’s oceans, especially over the Pacific due to enhanced easterly winds

Sustained small to medium-sized equatorial volcanic activity

Slightly reduced solar irradiance

Natural variability, including the Interdecadal Pacific Oscillation

One interesting aspect of these explanations is that most researchers believe their own explanation is the leading one.  It is a symptom of the state of climate science: the proliferation of specializations leads to poor cross-disciplinary communication.  Someone might have this within their purview, but I am currently unaware of anyone apportioning the relative causality of these explanations together.  Attribution is challenging, of course, but such an effort seems worthwhile to me.
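One standard way to apportion these short-term influences is a multiple regression of global temperature on ENSO, volcanic aerosol, and solar indices, in the spirit of Foster & Rahmstorf (2011).  The sketch below uses synthetic placeholder series so it runs on its own; real work would substitute published indices (e.g., MEI, stratospheric aerosol optical depth, total solar irradiance) and handle lags and autocorrelation.

```python
# Sketch: apportioning short-term temperature variability among candidate drivers
# with a multiple linear regression (in the spirit of Foster & Rahmstorf, 2011).
# Synthetic placeholder series stand in for published indices; real analyses would
# also account for lags and autocorrelated residuals.
import numpy as np

rng = np.random.default_rng(0)
n_months = 240

# Placeholder predictors (would be MEI, stratospheric AOD, and TSI anomalies)
enso = rng.normal(size=n_months)
volcanic = np.abs(rng.normal(size=n_months)) * 0.1
solar = rng.normal(size=n_months) * 0.05
trend = np.linspace(0.0, 0.4, n_months)   # underlying warming trend, degrees C

# Synthetic "observed" temperature: trend + weighted drivers + noise
temperature = (trend + 0.1 * enso - 0.3 * volcanic + 0.5 * solar
               + rng.normal(scale=0.05, size=n_months))

# Design matrix: intercept, linear trend, and the three candidate drivers
X = np.column_stack([np.ones(n_months), np.arange(n_months), enso, volcanic, solar])
coefficients, *_ = np.linalg.lstsq(X, temperature, rcond=None)

for name, value in zip(["intercept", "trend/month", "ENSO", "volcanic", "solar"], coefficients):
    print(f"{name:>12}: {value:+.4f}")
```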

Some recent science updates on these explanations:

Heat Uptake by Several Oceans Drives Pause

Reconciling Warming Trends