Weatherdem's Weblog

Bridging climate science, citizens, and policy



UN Continues to Issue Irrelevant Pleas for Climate Action

The United Nations will issue yet another report this year claiming that deep greenhouse gas emission cuts are within reach.  As reported by Reuters (emphasis mine):

It says existing national pledges to restrict greenhouse gas emissions are insufficient to limit warming to 2 degrees Celsius (3.6 Fahrenheit) above pre-industrial times, a U.N. ceiling set in 2010 to limit heatwaves, floods, storms and rising seas.

“Deep cuts in greenhouse gas emissions to limit warming to 2 degrees C … remain possible, yet will entail substantial technological, economic, institutional, and behavioral challenges,” according to the draft due for publication in Copenhagen on Nov. 2 after rounds of editing.

Substantial is an understatement.  To have a better-than-even chance of keeping global mean annual temperature from rising more than 2 degrees C, emissions have to peak by 2020 and go negative by 2050.  Technologies simply do not exist today that would achieve those difficult tasks while meeting today's energy demand, let alone the energy demand of 2050.

The following quote points toward understanding the scale of the problem:

Such a shift would also require a tripling or a quadrupling of the share of low-carbon energies including solar, wind or nuclear power, it said.

That's actually an underestimate of the required low-carbon energies, because achieving <2C warming will require net-negative carbon, not just low carbon.  But let's stick with their estimate for argument's sake.  Low-carbon technologies currently provide 16% of the global energy portfolio.  I'm not entirely certain whether the tripling quote refers to this 16%, for the following reason: "traditional biomass" (wood and similar materials) represents 10% of the global energy portfolio, or about 63% of the low-carbon share.  We're obviously not going to burn more of this material to provide energy to the global energy-poor or to industrial nations.  Wind, solar, modern biomass, and geothermal together account for 0.7% of the global energy portfolio.  That is a key figure.  How many news stories have you seen touting wind and solar deployment?  All of those small utility-scale plants globally account for less than 1% of total global energy.
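For concreteness, the shares above can be checked with a little arithmetic.  The 16%, 10%, and 0.7% figures come from the paragraph; everything else is derived from them:

```python
# Global energy portfolio shares quoted in the post.
low_carbon_share = 0.16          # all low-carbon sources combined
trad_biomass_share = 0.10        # "traditional biomass" (wood and similar)
modern_renewables_share = 0.007  # wind, solar, modern biomass, geothermal

# Traditional biomass as a fraction of the low-carbon total: ~63%.
trad_biomass_frac = trad_biomass_share / low_carbon_share

# Even quadrupling today's modern renewables yields under 3% of global energy.
quadrupled_modern = 4 * modern_renewables_share
```

The 63% figure is just 10/16, and the quadrupled 0.7% is 2.8% – which is why tripling or quadrupling only the modern-renewables slice cannot be what the UN means.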

So perhaps the UN is referring to the 16% figure rather than the 0.7% figure, because even quadrupling the latter yields only 2.8% of total global energy.  But that makes the point above even more valid: new solar, wind, and nuclear deployment would have to supply not only 15.3% of today's global energy, but 45% of today's global energy.  How much new low-carbon energy is that?  A lot.  The US alone would require either 1 million+ 2.5MW wind turbines, or 300,000+ 10MW solar thermal plants, or 1,000+ 1GW nuclear power plants (more than the total number of nuclear plants operating today – globally).  And this doesn't include any requirements to update national transmission grids, deploy CCS, or address sequestration.  As I said, the scale of this problem is vast, and it is completely glossed over by previous – and, it appears, current – UN reports.
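A quick back-of-the-envelope check shows why those three options are roughly interchangeable.  The unit counts come from the paragraph above; the capacity factors (wind ~30%, solar thermal ~25%, nuclear ~90%) are my assumptions and vary by site and technology:

```python
# Average delivered power implied by each deployment option:
# nameplate capacity x capacity factor = average output.
def avg_power_gw(units, nameplate_mw, capacity_factor):
    """Average delivered power in GW for a fleet of identical plants."""
    return units * nameplate_mw * capacity_factor / 1000.0

wind_gw    = avg_power_gw(1_000_000, 2.5, 0.30)   # 1M 2.5MW turbines
solar_gw   = avg_power_gw(300_000, 10.0, 0.25)    # 300k 10MW solar thermal
nuclear_gw = avg_power_gw(1_000, 1000.0, 0.90)    # 1,000 1GW reactors
```

Under these assumptions each option delivers on the order of 750–900 GW of average power, which is why the three counts can stand in for one another as routes to the same goal.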

Look, the reasons to decarbonize are valid and well-recognized.  Emissions are driving planetary changes at rates that occur only very rarely in geologic history.  Those changes will accelerate throughout the 21st century and beyond.  Yet this remains the obsessive focus of most climate activists.  The problem remains how to achieve deep decarbonization – what policies will facilitate that effort?  The fact remains that no economy has decarbonized at requisite rates – and that includes economies that historically widely deployed nuclear and biomass energy.  The UN continues to issue reports that are wildly out-of-date the day they’re issued.  They do themselves and the world’s population no favors by doing so.  We need new methods and new frameworks within which to define and evaluate problems.



On False Equivalence

The Guardian recently ran a couple of really bad climate pieces.  The first has a headline guaranteed to draw eyes: "Miami, the great world city, is drowning while the powers that be look away".  Who would possibly allow a "great world city" to drown?  The monsters!  Note that the author is billed as a "science editor", which I take to mean he understands basic scientific concepts such as uncertainty, time scale, and accuracy.  What does Robin McKie have to say?

The effect is calamitous. Shops and houses are inundated; city life is paralysed; cars are ruined by the corrosive seawater that immerses them. [...] Only those on higher floors can hope to protect their cars from surging sea waters that corrode and rot the innards of their vehicles. [...] Miami and its surroundings are facing a calamity worthy of the Old Testament.

Really?  Old Testament calamity? Inundated. Paralysed. Ruined. Corrode and rot.

That's fairly flowery language for a science editor.  How much of it is based in reality?  There are definitely localized effects of sea level rise in Miami.  Seawater is corrosive.  But I missed the news reports of Miami's calamities and inundations, of a paralyzed city.  He describes serious effects, but they aren't nearly as extensive or horrific as his article portrays.

Or, as Time writer Michael Grunwald writes, “I’m sorry to spoil the climate porn, but while the periodic puddles in my Whole Foods parking lot are harbingers of a potentially catastrophic future, they are not currently catastrophic. They are annoying. And so is this kind of yellow climate journalism.”

I agree with Michael on this one.  This type of journalism works against taking the very action that Miami actually is doing right now to adapt to a changing reality.  This quote says it perfectly:

What’s happening in the Middle East right now is calamitous. A blocked entrance is inconvenient.

Thank you, Michael, for some overdue perspective.  He adds,

But let’s get real. The Pacific island of Kiribati is drowning; Miami Beach is not yet drowning, and the Guardian’s persistent adjective inflation (“calamitous,” “astonishing,” “devastating”) can’t change that.

This encouraged a number of climate porn addicts to take to Twitter and denounce Grunwald for not wanting to be part of their tribe.  Their tweets displayed the camps people fall into.

Here is what folks were trying to say: person A has a gun held to their head right now; person B will die sometime in the future, but we don't know exactly when.  And since the same characteristic will eventually apply to both persons, they both share existential threats.  Ask Kiribatians how much of their daily life is affected by sea level rise and I'd bet dollars to doughnuts you'll get a very different answer than a Miamian's.  And contrary to most climate activists' belief, that's not because Miamians are climate-uneducated.  It's because their daily lives aren't affected by climate change today to the same degree that a Kiribatian's is.  Saying they are doesn't make it so.

I also agree with Mike that this fact doesn't alter the need to mitigate and adapt.  I agree with TheCostofEnergy that Miami and island nations face different timing and resource issues.  That is precisely why island nations face an existential threat today and Miami doesn't.  Island nation people have nowhere to move to.  Their islands will disappear and they will be forced to move, which presents an enormous cultural disruption.  Miami has much more adaptive capacity than island nations do.  Miami will have to adapt; there is no doubt about that.  But that's not an existential threat, except in some absurdly narrow use of the term.

Disaster porn language usage has to stop.  It’s not accurate.  It dissuades instead of incentivizes action.  It breaks down instead of builds trust.



Deep Decarbonization Pathways Interim Report Released

An international group of folks put together an interim report analyzing "Deep Decarbonization Pathways".  Decarbonization refers to the process of using less carbon within an economy.  The intent of the report was to show ways forward to keep global mean temperatures below 2C.  Readers of this blog know that I no longer think such a goal is achievable given the scope and scale of decarbonization.  We have not moved off a "business-as-usual" emissions path, and we have run out of time to reduce GHG emissions quickly enough to meet this goal.  I argue the exact opposite of what the authors describe in their summary:

We do not subscribe to the view held by some that the 2°C limit is impossible to achieve and that it should be weakened or dropped altogether.

Therein lies the main problem with this report.  The authors use a threshold that was determined without robustly analyzing the actions necessary to achieve it.  In other words, they constrain themselves a priori by adopting the 2C threshold.  A more useful result would ascertain what real-world requirements exist to support different warming values, in terms real people can intuitively understand.  The report is not newsworthy in that it reaches the same results that other reports reached by making similar assumptions.  Those assumptions are necessary and sufficient to meet the 2C threshold, but examination unveils something few people want to recognize: they are unrealistic.  I will say that this report goes into more detail about those assumptions than any report I've read to date.  The detail is only slightly deeper than the assumptions themselves, but it is illuminating nonetheless.

An important point here: the authors make widespread use of “catastrophe” in the report.  Good job there – it continues the bad habit of forcing the public to tune out anything the report has to say.  Why do people insist on using physical science, but not social science to advance policy?

On a related note, the report's graphics are terrible.  They're cool-color only, which makes copied figures look junky and interpretation harder than it should be.  So they put up multiple barriers between the report's results and its audience.  I'm not sure why, if the intent is to persuade policy makers toward action.




N.C.’s Sea Level Rise Reaction

Many people involved in climate activism have probably heard of North Carolina’s reaction to sea level projections.  The reaction has been exaggerated by some of those same activists.  I read this article and had the following thoughts.

By the end of the century, state officials said, the ocean would be 39 inches higher.

There was no talk of salvation, no plan to hold back the tide. The 39-inch forecast was “a death sentence,” Willo Kelly said, “for ever trying to sell your house.”

Coastal residents joined forces with climate skeptics to attack the science of global warming and persuade North Carolina’s Republican-controlled legislature to deep-six the 39-inch projection, which had been advanced under the outgoing Democratic governor. Now, the state is working on a new forecast that will look only 30 years out and therefore show the seas rising by no more than eight inches.

Up to this point, readers probably have one of two reactions.  They either agree with quoted environmentalists and think N.C. tried to “legislate away sea level rise.”  Or they agree with Kelly’s reactions and the legislature’s boundaries on projection scope.

I think the reactions were entirely justified from a personal standpoint, and easy to predict if anyone had stopped to think things through.  Nearly everybody would have the same reaction if their property were at risk of being declared worthless – regardless of the underlying reason.  Why?  Because you have an emotional attachment to your property that far exceeds any attachment to a 90-year sea level projection.  You're going to react to the former more strongly than the latter.  The article identifies the underlying process:

“The main problem they have is fear,” said Michael Orbach, a marine policy professor at Duke University who has met with coastal leaders. “They realize this is going to have a huge impact on the coastal economy and coastal development interests. And, at this point, we don’t actually know what we’re going to do about it.”

This is the problem with the vast majority of climate activists’ language: they coldly announce that civilization will collapse and won’t offer actions people can take to avoid such a collapse.  Well, people will respond to that language, just not the way activists want them to.  People will fight activists and identify with climate skeptics’ arguments since they view the announcements as a threat to their way of life.

Where I differ with Kelly and others is this: she and other coastal residents had better look for viable long-term solutions before that 30-year period is over.  If they prevent long-term planning beyond 2040, inland residents of N.C. will be unfairly burdened with the cost of subsidizing Kelly and others for their lifestyle choices.

Kelly’s view is not without merit, to be sure:

Long before that would happen, though, Kelly worries that codifying the 39-inch forecast would crush the local economy, which relies entirely on tourism and the construction, sale and rental of family beach houses. In Dare County alone, the islands’ largest jurisdiction, the state has identified more than 8,500 structures, with an assessed value of nearly $1.4 billion, that would be inundated if the tides were 39 inches higher.

That's 8,500 structures in just one county – worth $1.4 billion, an average of about $165,000 per structure.  I would absolutely fight to keep my $165,000 of worth as long as I could.  Nationwide, the estimate is $700 billion; not a trivial sum, is it?  The article has this choice quote:

“What is it you would ask us to do differently right now? Tell people to move away?”   “Preaching abandonment is absurd. People would go in the closet and get the guns out.”

The Coastal Resources Commission bungled their attempt to evaluate the science and establish policy.  By the time they announced results with no action plans, rumors fed by misunderstanding and bias confirmation ran rampant.  The result was Kelly’s actions to change the time horizon that planners could use.

So what are the solutions?  The Commission should establish and maintain relationships with stakeholders.  Get to know the mayors and planners and scientists and property owners.  Find out what their interests are and what motivates them to do what they do.  Identify actions they can take in the next 30 years that set them up for success afterward.  But don't release information without context, because sea level rise is likely to accelerate in the second half of the 21st century, while most people will focus on potential direct threats to themselves and their livelihoods, not global concerns.  So get into the weeds with folks.



REMI’s Carbon Tax Report

I came across former NASA climate scientist James Hansen's email last week supporting a carbon tax.  At the outset: I fully support this policy because it is the most economically efficient way to achieve CO2 emission reductions.  An important point is this: it matters a great deal how we apply the tax and what happens to the revenue it raises.  Many policy analysts think the only way a carbon tax will ever pass is for the government to distribute the revenue to all households as dividends.  This obviously has appealing aspects, not least of which is that Americans love free stuff.  That is, we love to reap the benefits of policies so long as they cost us nothing.  That attitude is obviously unsustainable – you need only look at the state of American infrastructure today to see the effects.

All that said, the specific carbon tax plan Hansen supported came from a Regional Economic Models, Inc. (REMI) report, which the Citizens Climate Lobby (CCL) commissioned.  The report found what CCL wanted it to find: deep emission cuts can result from a carbon tax.  There isn't anything surprising in this – many other studies found the exact same result.  What matters is how the emission cuts are achieved.  I think this study is another academic dead-end because I see little evidence for how the proposed tax actually achieves the cuts.  It looks like REMI does what the IPCC does – it assumes large-scale low-carbon energy technologies.  The steps for developing and deploying those technologies are not clearly demonstrated.  Does a carbon tax simply equate to low-carbon technology deployment?  I don't think so.

First, here is an updated graphic showing REMI’s carbon emission cuts compared to other sources:

[Figure: US CO2 emissions – EPA 2014 rule vs. EIA projections vs. Kyoto scenarios vs. REMI 2014]

The blue line with diamonds shows historical CO2 emissions.  The dark red line with squares shows EIA’s 2013 projected CO2 emissions through 2030.  EIA historically showed emissions higher than those observed.  This newest projection is much more realistic.  Next, the green triangles show the intended effect of EPA’s 2014 power plant rule.  I compare these projections against Kyoto `Low` and `High` emission cut scenarios.  An earlier post showed and discussed these comparisons.  I added the modeled result from REMI 2014 as orange dots.

Let me start by noting I have written for years now that we will not achieve even the Kyoto `Low` scenario, which called for a 20% reduction of 1990 baseline emissions.  The report did not clearly specify what baseline year they considered, so I gave them the benefit of the doubt in this analysis and chose 2015 as the baseline year.  That makes their cuts easier to achieve since 2015 emissions were 20% higher than 1990 levels.  Thus, their “33% decrease from baseline” by 2025 results in emissions between Kyoto’s `Low` and `High` scenarios.
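The baseline sensitivity is easy to see with indexed numbers.  Setting 1990 emissions to 100 (an arbitrary index, my choice for illustration) and 2015 emissions 20% higher, per the discussion above:

```python
# Index 1990 emissions to 100 for illustration.
e_1990 = 100.0
e_2015 = 1.20 * e_1990           # 2015 emissions ~20% above 1990 levels

# REMI's "33% decrease from baseline" by 2025, from a 2015 baseline.
remi_2025 = e_2015 * (1 - 0.33)

# The same cut expressed against Kyoto's 1990 baseline shrinks to ~20%.
cut_vs_1990 = 1 - remi_2025 / e_1990
```

A 33% cut from 2015 is thus only about a 20% cut from 1990 – right around the Kyoto `Low` scenario – which is why the choice of baseline year matters so much.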

REMI starts with a $10/ton carbon tax in 2015 and increases that tax by $10 each year; in ten years, carbon costs $100/ton.  That is an incredibly aggressive taxing scheme, and the increase would have significant economic effects.  The report describes massive economic benefits.  I will note that I am not an economist and don't have the expertise to judge the economic model they used.  I will also note that, as a climate scientist, I know all models rest on fundamental assumptions that affect the results they generate.  The assumptions REMI made likely have some effect on their results.
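The tax schedule itself is simple to express.  This is just the report's described schedule ($10/ton in 2015, rising $10 each year) written as a function; the pre-2015 behavior is my assumption:

```python
def carbon_price(year, start_year=2015, start_price=10, annual_step=10):
    """Carbon price in $/ton under the REMI schedule described above."""
    if year < start_year:
        return 0  # no tax before the start year (assumed)
    return start_price + annual_step * (year - start_year)
```

By the schedule's tenth year (2024), the price reaches $100/ton.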

Why won’t we achieve these cuts?  As I stated above, technologies are critical to projecting emission cuts.  What does the REMI report show for technology?

[Figure: REMI 2014 US electrical power generation – baseline and carbon-tax scenarios]

The left graph shows US electrical power generation without any policy intervention (the baseline case).  The right graph shows generation resulting from the $10/year carbon tax policy.  Here are their model's results: old unscrubbed coal plants go offline in 2022, and old scrubbed coal plants go offline in 2025.  Think about this: there are about 600 coal plants in the US, generating the largest single share of electricity of any power source.  The carbon tax model results assume that other sources will replace ~30% of US electricity in 10 years.  How will that be achieved?  This is the critical missing piece of the report.

Look again at the right graph.  Carbon captured natural gas replaces natural gas generation by 2040.  Is carbon capture technology ready for national-level deployment?  No, it isn’t.  How does the report handle this?  That is, who pays for the research and development first, followed by scaled deployment?  The report is silent on this issue.  Simply put, we don’t know when carbon capture technology will be ready for scaled deployment.  Given historical performance of other technologies, it is safe to assume this development would take a couple of decades once the technology is actually ready.

Nuclear power generation also grows a little bit, as does geothermal and biopower.  This latter technology is interesting to note since it represents the majority of the percentage increase of US renewable power generation in the past 15 years (based on EIA data) – something not captured by their model.

The increase in wind generation is astounding.  It grows from a few hundred terawatt-hours to over 1,500 TWh in 20 years' time.  This source is the obvious beneficiary of a carbon tax.  But I eschew hard-to-understand units.  What does it mean to replace the majority of coal plants with wind plants?  Let's step back from academic exercises that replace power generation wholesale and get into practical considerations.  It means deploying more than 34,000 2.5MW wind turbines operating at 30% capacity factor per year, every year.  (There are other metrics by which to convey the scale, but they deal with numbers few people intuitively understand.)  According to the AWEA, there were 46,100 utility-scale wind turbines installed in the US at the end of 2012.  How many years did it take utilities to install those?  Think of the resources required to install almost as many wind turbines in just one year as already exist in the US.  Just to point out one problem with this installation plan: where do the required rare earth metals come from?  Another: are wind turbine supply chains up to the task of manufacturing 34,000 turbines per year?  Another: are wind turbine manufacturing plants equipped to handle this level of work?  Another: are there enough trained workers to supply, make, transport, install, and maintain this many turbines?  Another: how is wind energy stored and transmitted from source regions to use regions (thousands of miles apart in many cases)?
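The turbine arithmetic behind numbers like these follows one pattern: nameplate capacity times capacity factor times hours in a year gives annual output, and a generation target divided by that gives a turbine count.  The totals depend heavily on the assumed capacity factor and target, so treat this as a method sketch rather than a reproduction of the report's exact figure:

```python
HOURS_PER_YEAR = 8760

def annual_twh_per_turbine(nameplate_mw=2.5, capacity_factor=0.30):
    """Annual generation of one turbine, in TWh (1 TWh = 1e6 MWh)."""
    return nameplate_mw * capacity_factor * HOURS_PER_YEAR / 1e6

def turbines_needed(target_twh, nameplate_mw=2.5, capacity_factor=0.30):
    """Turbine count required to supply target_twh of generation per year."""
    return target_twh / annual_twh_per_turbine(nameplate_mw, capacity_factor)
```

A 2.5MW turbine at a 30% capacity factor produces about 0.0066 TWh per year, so every additional 100 TWh of annual wind generation implies roughly 15,000 more turbines – before accounting for storage, transmission, or retirement of older machines.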

Practical questions abound.  This report is valuable as an academic exercise, but I don't see how wind replaces coal in 20 years' time.  I want it to, but putting in a revenue-neutral carbon tax probably won't get it done.  I don't see carbon capture and sequestration ready for scaled deployment in 10 years' time.  I would love to be surprised by such a development, but does a revenue-neutral carbon tax generate enough demand for risk-averse private industry to perform the requisite R&D?  At best, I'm unconvinced it will.

A little checking reminded me that British Columbia implemented a carbon tax in 2008; it currently stands at $40 (Canadian).  Given that, you might think it serves as a good example of what the US could do with a similar tax.  But dig a little deeper and you find British Columbia gets 86% of its electricity from hydropower and only 6% from natural gas, making it a poor test bed for evaluating how a carbon tax affects electricity generation in a large, modern economy.



More on EPA’s Proposed CO2 Emissions Rule: Podesta; Role of Science

I just found this article and wanted to point out a couple of things related to my post on the EPA’s proposed CO2 emissions rule.  The first (emphasis mine):

In a two-hour interview conducted just weeks before his return to Obama’s inner circle as White House Counsel, Podesta told me that the president had been willing to take risks and expend political capital on the climate issue. “But fifty years from now, is that going to seem like enough?” Podesta asked. “I think the answer to that is going to be no.”

Podesta blamed Obama’s spotty climate record in part on the president’s top aides during his first term (aides who Podesta, as Obama’s transition director in 2008, helped select). The aides’ attitudes about climate change, Podesta recalled, were dismissive at best: “Yeah, fine, fine, fine, but it’s ninth on our list of eight really important problems.”

I agree with Podesta’s assessment that fifty years from now people will look back and judge that Obama and everyone else didn’t do enough to curtail GHG emissions and prevent a great deal of additional global warming.  That isn’t a slight on Obama’s character – or anyone else’s – it’s a statement on how I view action on the topic.

Isn't it interesting that Podesta helped select the same aides who refused to push climate higher on the problem list?  Podesta is a smart guy – he knew what people's pet issues were and what wasn't on their list of priorities.  So in the same interview in which Podesta says Obama's climate actions won't seem like enough in fifty years, he lays some of the blame for that inaction at the feet of first-term aides who didn't prioritize climate.  Perhaps a little self-assessment didn't make the article due to editing, but it would be nice to see people take responsibility for how we've gotten here.  That includes Democrats and climate activists right along with Republicans and skeptics.

The next quote really rankles me:

The Obama Administration’s newly proposed regulations on power plants illustrate how the president continues to fall short of what science demands in the face of rapidly accelerating climate change. From a scientific perspective, there is much less to these regulations than either industry opponents or environmental advocates are claiming.

[...]

The science he is faced with [...] demand actions that seem preposterous to the political and economic status quo.

This language implicitly assumes that what certain people want should take precedence over what others want.  The author, like many others, would like those certain people to be scientists rather than conservative theologians or accountants or anyone else.  Science doesn't demand anything in this or any other instance.  We use physical science to assess what the physical effects of GHGs have been and will be on the climate system.  That's where physical science ends.  If you want to do anything about that information, you bring in social science – political science, sociology, environmental science, philosophy, etc.  Those fields have much to say about what to do and why a particular course of action might be desirable – see normative theory.

Too many people confuse the two.  Or more accurately in the climate change realm, they argue using physical science as a proxy in normative debates.  This is a large source of the polarization of science today.  Instead of using proxies, people should debate the core issues.  If the core issue is the political left versus right, the debate should be on value systems and specific values.  Instead, people drag climate science into the normative debate and among the results is the refusal to accept climate science as valid by skeptics.  This has more to do with perception of legitimate authority than the actual science.

Back to the science:

Podesta, however, acknowledged that Obama’s climate policy (as it stood last November) would not hit the 2°C target. “Maybe it gets you on a trajectory to three degrees,” he said, “but it doesn’t get you to two degrees.”

I wrote much the same thing.  The science is quite clear on this.  Whether you think the policy is bad or good or whether hitting or not hitting the 2°C target is a bad or good thing are separate discussions.  Personally, I think not hitting the 2°C target is a bad thing.  But I know that’s a normative judgment about a scientific result.  I therefore support more effective policy actions such as a carbon tax.

Again, this rule is merely proposed at this time.  EPA originally said it would propose the rule in 2011-2012, then put it on indefinite hold so Obama could run for re-election.  It will now face legal challenges.  It will not go into effect for at least two years, and quite possibly four to six years after all the legal challenges.  In that time frame, we will have at least one new president, who will put their choice for EPA administrator in place, who will be responsible for directing the agency on the rule’s implementation.  The rule will be effective until 2030 and will face two additional presidential election results.  Do climate activists think Republicans will leave the rule alone through 2030?  How do we square that with the knowledge the rule is far from sufficient to limit warming to <2°C?  What are the next policy steps with these real world boundaries?


3 Comments

EPA’s Proposed CO2 Emissions Rule in Context

[Graph: US energy-related CO2 emissions – EPA 2014 proposal vs. EIA 2012 projections vs. Kyoto targets]

If you follow climate and energy news, you probably have or will encounter media regarding today’s proposed CO2 emissions rule by the EPA.  Unfortunately, that media will probably not be clear about what the rule means in understandable terms.  I’m writing this in an attempt to make the proposed rule more clear.

The graph above shows US CO2 emissions from energy consumption.  This includes emissions from coal, oil, and natural gas.  I have differentiated historical emissions in blue from 2013 EIA projections made in red, what today’s EPA proposal would mean for future emission levels, and low and high reductions prescribed by the Kyoto Protocol, which the US never ratified.

In 2011, historical US energy-related emissions totaled 5,481 million metric tons of CO2 (MMT).  For the most part, you can ignore the units and just concentrate on the magnitude: 5,481.  If the EPA's proposed rule goes into effect and achieves what it sets out to achieve, 2020 emissions could be 4,498 MMT and 2030 emissions could be 4,198 MMT (see the two green triangles).  Those 2030 emissions would be lower than at any time since 1970 – a real achievement.  The other comparisons make apparent, however, that this potential achievement isn't earth-shaking.

Before I get further into that, compare the EPA-related emissions with the EIA's projections out to 2030.  These projections were made last year and assume business as usual – i.e., no federal climate policy or EPA rule.  Because energy utilities closed many of their dirtiest fossil fuel plants following the Great Recession due to higher operating costs, and because of the partial shift from coal to natural gas, the EIA now projects emissions just above 2011's level and below the all-time peak.  I read criticism of EIA projections this weekend (I can't find the piece now) that I think was too harsh.  The EIA has historically projected emissions in excess of reality.  I don't think those over-predictions are bad news or preclude the projections' use in decision-making.  If you know a prediction has a persistent bias, you can account for it.

So there is a measurable difference between EIA emission projections and what could happen if the EPA rule is enacted and effective.  With regard to that latter characterization, how effective might the rule be?

If you compare the EPA emission reductions to the Kyoto reductions, it is obvious that the EPA's reductions fall short of the minimum required to avoid significant future climate change.  But first, it is important to recognize a key difference between Kyoto and the EPA rule: the Kyoto pathways are measured against 1990 emissions, while the EPA rule is measured against 2005 emissions.  What happened between 1990 and 2005 in the real world?  Emissions rose by 19%, from 5,039 MMT to 5,997 MMT.  The takeaway: emission reductions using a 2005 baseline will result in higher final emissions than the same percentage reductions using a 1990 baseline.

If the US ratified and implemented Kyoto on the `Low` pathway (which didn’t happen), 2020 emissions would be 4,031 MMT (467 MMT less than EPA; 1445 MMT less than EIA) and 2050 emissions would be 2,520 MMT (no comparison with EPA so far).  If the US implemented the `High` pathway, 2020 emissions would be 3,527 MMT (971 MMT less than EPA!; 1,949 MMT less than EIA!) and 2050 emissions would be drastically slashed to 1,008 MMT!
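The baseline arithmetic above can be checked in a few lines.  This is a sketch only: the percentage cuts below are inferred from the figures quoted in this post (Kyoto `Low`/`High` as 20%/30% below 1990 in 2020, and the EPA proposal as roughly 25%/30% below 2005 in 2020/2030), not taken from the official texts.

```python
# Baseline emissions (MMT CO2) quoted in this post
BASELINE_1990 = 5039  # Kyoto baseline
BASELINE_2005 = 5997  # EPA baseline

def cut(baseline, percent):
    """Emissions remaining after a percentage reduction from a baseline."""
    return round(baseline * (1 - percent / 100))

# Kyoto pathways (1990 baseline)
kyoto_low_2020 = cut(BASELINE_1990, 20)   # 4031 MMT
kyoto_high_2020 = cut(BASELINE_1990, 30)  # 3527 MMT

# EPA proposal (2005 baseline; percentages inferred from the post's figures)
epa_2020 = cut(BASELINE_2005, 25)  # 4498 MMT
epa_2030 = cut(BASELINE_2005, 30)  # 4198 MMT

# The same nominal "30% cut" lands 671 MMT higher from the 2005 baseline
print(cut(BASELINE_2005, 30) - cut(BASELINE_1990, 30))  # 671
```

The point of the exercise: the choice of baseline year matters as much as the headline percentage.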

Since we didn’t implement the Kyoto Protocol, we will not attain even 2020 `Kyoto Low` emissions in 2030.  Look at the graph again.  Connect the last blue diamond to the first green triangle.  Even though they’re the closest pair, you can immediately see we have a lot of work to do to achieve even the EPA’s reduced emissions target.  Here is some additional context: to keep the 2100 global mean temperature rise under 2°C, we have to achieve the lowest emissions pathway modeled by the IPCC for the Fifth Assessment Report (see blue line below):

[Graph: observed global CO2 emissions against the IPCC AR5 emission pathways]

Note the comment at the bottom of the graph: global CO2 emissions have to turn negative by 2070, following decades of declines.  How will global emissions decline and turn negative if the US emits >3,000 MMT annually in 2050?  The short answer is easy: they won’t.  I want to combine my messages so far in this post: we have an enormous amount of work to do just to reach the EPA’s level.  That target is less ambitious than Kyoto’s Low pathway, which itself would have required a lot of work in historical terms.  That work now lies in front of us if we really want to avoid >2°C warming and its effects.  I maintain that we will not reduce emissions commensurate with <2°C warming.  I think we will emit enough CO2 that our future will follow the RCP6.0 to RCP8.5 pathways seen above – 3-5°C warming and related effects.

Another important detail: the EPA’s proposed rule has a one-year comment period, which will result in a final rule.  States then have another year to develop individual plans to achieve their reductions (a good idea).  The downside: the rule won’t take effect until 2016 – only four years before the first goal.  What happens if the first goal isn’t achieved?  Will future EPA administrators reset the 2030 goal so it is more achievable (i.e., allow higher emissions)?  Will lawsuits delay implementation for years?  There are many potential setbacks for this rule.  And it doesn’t achieve <2°C warming – not even close.


Leave a comment

NASA & NOAA: April 2014 Warmest Globally On Record

According to data released by NASA and NOAA this month, this April was the warmest April globally on record.  Here are the data for NASA’s analysis; here are NOAA’s data and report.  The two agencies use different analysis techniques, which in this case produced slightly different temperature anomaly values but the same overall rankings within their respective data sets (in most months, the analyses produce different rankings).  The two techniques provide a check on one another and give us confidence that the results are robust.  Before the details, I will remind readers that month-to-month and year-to-year values and rankings matter less than the long-term climatic warming.  Weather is the dominant factor in monthly and yearly conditions, not climate.

The details:

April’s global average temperature was 0.73°C (1.314°F) above normal (the 1951-1980 base period mean of 14°C), according to NASA, as the following graphic shows.  The past three months had a +0.63°C temperature anomaly, and the latest 12-month period (May 2013 – Apr 2014) had a +0.62°C anomaly.  The time series graph in the lower-right quadrant shows NASA’s 12-month running mean temperature index.  The 2010-2012 downturn was largely due to the last La Niña event (see below for more).  Since then, ENSO conditions returned to a neutral state (neither La Niña nor El Niño).  As previously anomalously cool months fell off the back of the running mean, the 12-month temperature trace tracked upward again throughout 2013 and 2014.
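A 12-month running mean like NASA’s is just a trailing-window average.  Here is a minimal sketch with made-up anomaly values (not NASA’s data), showing how the trace climbs as older cool months fall out of the window:

```python
def running_mean(values, window=12):
    """Trailing running mean: entry i is the average of the window ending at month i."""
    return [sum(values[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(values))]

# Hypothetical monthly anomalies: six cool months followed by twelve warm ones
anomalies = [0.5] * 6 + [0.7] * 12
trace = running_mean(anomalies)

# The mean starts at 0.6 and rises toward 0.7 as cool months drop off the back
print([round(t, 2) for t in trace])
```

This is why a running mean can keep rising for months after conditions stabilize: the trend reflects what is leaving the window as much as what is entering it.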


Figure 1. Global mean surface temperature anomaly maps and 12-month running mean time series through April 2014 from NASA.

According to NOAA, April’s global average temperatures were +0.77°C (1.386°F) above the 20th century average of 13.7°C (56.7°F).  NOAA’s global temperature anomaly map for April (duplicated below) shows where conditions were warmer and cooler than average during the month.


Figure 2. Global temperature anomaly map for April 2014 from NOAA.

The preceding two figures also demonstrate the value of two independent analyses.  Despite differences in the specific global temperature anomalies, both analyses picked up the same spatial temperature patterns and their relative strengths.

Influence of ENSO


Figure 3. Time series of weekly SST data from NCEP (NOAA).  The highest interest region for El Niño/La Niña is `NINO 3.4` (2nd time series from top).

There has been neither an El Niño nor a La Niña in the past couple of years.  This ENSO-neutral phase is common.  As you can see in the NINO 3.4 time series (2nd from top in Figure 3), Pacific sea surface temperatures were relatively cool from January through March, then quickly warmed.  This switch occurred because the normal easterly winds (blowing toward the west) across the equatorial Pacific relaxed and two significant westerly wind bursts occurred in the western Pacific.  These anomalous winds generated eastward-moving Kelvin waves, which cause downwelling and surface mass convergence; warm SSTs collect along the equator as a result.  These Kelvin waves eventually crossed the entire Pacific Ocean, as Figure 4 shows.


Figure 4.  Sub-surface Pacific Ocean temperature anomalies from Jan-Apr 2014.  Anomalously cool eastern Pacific Ocean temperatures in January gave way to anomalously warm temperatures by April.  Temperatures between 80W and 100W warmed further since April 14.

The Climate Prediction Center announced an El Niño Watch earlier this year.  The most recent update says the chance of an El Niño during the rest of 2014 exceeds 65%.  There is no reliable prediction of the potential El Niño’s strength at this time.  Without another westerly wind burst, the El Niño will likely not be very strong.  Even moderate-strength El Niños impact global weather patterns.

An important detail is whether the potential 2014 El Niño will be an Eastern or a Central Pacific El Niño (see figure below).  Professor Jin-Yi Yu and colleagues first proposed the distinction in a 2009 Journal of Climate paper.  More recently, Yu’s work suggested that a recent trend toward Central Pacific El Niños influenced the frequency and intensity of recent U.S. droughts.  This type of El Niño doesn’t cause global record temperatures, but it still alters atmospheric circulations and the jet stream, which in turn shifts which areas receive more or less rain.  If the potential 2014 El Niño is the Eastern Pacific type, we can expect monthly global mean temperatures to spike, along with the precipitation anomalies commonly attributed to El Niño.


Figure 5. Schematic of Central-Pacific ENSO versus Eastern-Pacific ENSO as envisioned by Dr. Jin-Yi Yu at the University of California – Irvine.

If an El Niño does occur later in 2014, it will mask some of the deep-ocean heat absorption by releasing energy back to the atmosphere.  If that happens, the second half of 2014 and the first half of 2015 will likely set global surface temperature records.  2014, 2015, or both could set the all-time global mean temperature record (currently held by 2010).  Some scientists recently postulated that an El Niño could also trigger a shift from the current negative phase of the Interdecadal Pacific Oscillation (IPO; or the PDO for just the northern hemisphere) to a new positive phase.  This would be similar in nature, though different in detail, to the shift from La Niña or neutral conditions to El Niño.  If this happens, the likelihood of record hot years would increase.  I personally do not believe this El Niño will shift the IPO phase: I don’t think it will be strong enough, and I don’t think the IPO is in a state conducive to a switch.

The “Hiatus”

Skeptics have pointed out that warming has “stopped” or “slowed considerably” in recent years, hoping to introduce public confusion on this topic.  What is likely going on is quite different.  An energy imbalance exists – less energy is leaving the Earth than arriving, due to atmospheric greenhouse gases – yet the surface temperature rise has seemingly stalled, so the excess energy must be going somewhere; energy doesn’t just disappear.  That somewhere is likely the oceans, and specifically the deep ocean (see figure below).  Before we cheer about this (since few people want surface temperatures to continue rising quickly), consider the implications.  If you add heat to a material, it expands.  The ocean is no different: sea levels are rising in part because of heat added to the ocean in the past.  The heat that entered in recent years won’t manifest as sea-level rise for some time, but it will happen.  Moreover, when the heated water returns to the surface, that heat will be released to the atmosphere, raising surface temperatures and adding water vapor to the warmer atmosphere.  Thus, the immediate warming rate might have slowed, but we have locked in a higher future warming rate.


Figure 6. Recent research shows anomalous ocean heat energy locations since the late 1950s.  The purple lines in the graph show how the heat content of the whole ocean has changed over the past five decades. The blue lines represent only the top 700 m and the grey lines are just the top 300 m.  Source: Balmaseda et al., (2013)

You can see in Figure 6 that the upper 300 m of the world’s oceans accumulated less heat during the 2000s (5×10^22 J) than during the 1990s.  In contrast, the heat accumulated in ocean waters between 300 m and 700 m greatly increased during the 2000s (>10×10^22 J).  We cannot and do not observe the deep ocean with great frequency.  We do know from frequent and reliable observations that the sea surface and relatively shallow ocean did not absorb most of the heat in the past decade.  We also know from satellite observations how much energy came to and left the Earth.  If we know how much energy came in, how much left, and how much the land surface and shallow ocean absorbed, it is a relatively straightforward computation to determine how much energy likely remains in the deep ocean.
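That residual bookkeeping can be sketched in a few lines.  The numbers here are illustrative placeholders, not the actual observed values from the satellite record or Balmaseda et al.:

```python
# Planetary heat budget over a decade, in units of 10^22 J (hypothetical values)
net_toa_imbalance = 12.0    # energy in minus energy out, from satellite observations
land_surface_and_ice = 1.0  # heat absorbed by land and by melting ice
shallow_ocean = 6.0         # well-observed upper-ocean heat gain

# Energy is conserved, so the remainder must reside in the poorly observed deep ocean
deep_ocean = net_toa_imbalance - land_surface_and_ice - shallow_ocean
print(deep_ocean)  # 5.0
```

The logic is simple subtraction; the hard scientific work is in measuring each of the terms on the right-hand side accurately.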

Discussion

The fact that April 2014 was the warmest on record despite a negative IPO and neutral ENSO is eye-opening.  I think it highlights an even lower-frequency signal underlying the IPO, ENSO, and April weather: anthropogenic warming.  That signal is not oscillatory; it is increasing at an increasing rate and will continue to do so for decades to centuries.  How long that continues, and its eventual magnitude, depend on our policies and activities.  We continue to emit GHGs at or above the high end of the range simulated by climate models.  Growth in fossil fuel use at the global scale continues, and that growth dwarfs any effect of a switch to energy sources with lower GHG emissions.  I don’t think that will change during the next 15 years, which would lock us into the warmer climate projections through most of the rest of the 21st century.  The primary reason is the scale of humankind’s energy infrastructure: switching from fossil fuels to renewable energy will take decades.  Acknowledging this isn’t defeatist or pessimistic; it is, I think, critical to identifying appropriate opportunities and implementing policy responses of the type and scale needed to encourage that switch.


1 Comment

Climate and Energy Topics – 21 May 2014

The New York Times’ Andy Revkin had this very interesting post last week: “Three Long Views of Life With Rising Seas”.  He asked three people for their long-term views on how humans might deal with the centennial-scale effects of Antarctic glacier melt.  Some of their (partial) responses merit further thought:

Curt Stager, Paul Smith: Imagine the stink we would all raise if another nation tried to take even one inch of our coastline away from us – and yet here is a slow taking of countless square miles from our shores by a carbon-driven ocean-turned-invader.

David Grinspoon: But I think if our society is around for several more centuries we will have to have found different ways to deal collectively with our world-changing technologies. If we’ve made it that far, we’ll find ways to adapt.

Kim Stanley Robinson: It was when the ice core data in Greenland established the three-year onset of the Younger Dryas that the geologists had to invent the term “abrupt climate change” because they had so frequently abused the word “quick” sometimes meaning several thousand years when they said that. Thus the appearance of “Abrupt Climate Change” as a term (and a National Research Council book in 2002).

Andy Revkin finished with: The realities of sea-level rise and Antarctic trends and China’s emissions, etc., make me feel ever more confident that the [bend, stretch, reach, teach] shift I charted for my goals in my TEDx talk (away from numbers and toward qualities) is the right path.

Chinese coal use almost equals that of the rest of the world combined, according to the EIA:

[Chart: Chinese coal consumption compared with the rest of the world, EIA]

This is but one reason I believe <2°C warming is already a historical consideration.  All of this coal production and consumption would have to stop immediately if we are to have any hope of meeting this political goal.  That will not happen – absent coal-fired power, which constitutes the majority of generation, the global economy would spin into a depression.

On the good-news front, U.S. consumers are expanding home energy efficiency and distributed power generation, according to Deloitte.  These practices started with the Great Recession but, for the first time, are continuing as the economy “recovers”.  In 2013, new solar growth occurred among families making between $40,000 and $90,000.  The most engaged demographic could be Generation Y: one-third said they “definitely/probably” will buy a smart energy application, up from 28 percent in 2011.

I’ve let my drought series lapse, but I have kept watching conditions evolve across the country.  California has obviously been in the news due to its drought and wildfires.  All of California is currently in a “severe” drought for the first time since the mid-1970s (see picture below).  The quick science point: this has happened before (many times, some worse than this) and isn’t primarily caused by anthropogenic forcing.  The quick impacts point: California’s population is double what it was in the mid-1970s, so the same type of drought has more impact.  Wrapping these points together: drought impacts could be greater in the 2010s than in the 1970s due to sociological rather than physical factors.  An important caveat: Californians are more adept now at planning for and responding to drought.  They recognize how dry normal conditions can get and have adapted more than other places in the U.S.  Drought conditions likely won’t improve until next winter’s rainy season, since last winter was a bust for them.

[Map: California drought conditions, May 2014]

An incredible story comes from the New York Times about what it takes to engage communities on climate and energy issues.  Nebraska farmers and ranchers are fighting against the Keystone XL pipeline.  Why, you might ask?  Well, they’re certainly not a bunch of hippie greens.  No, they’re responding to their lifestyle and value system.  If KXL is built, it will be built on their land.  That means someone will take away small pieces of many farmers’ land, because the locals have already refused $250,000 payments for them.  If KXL is built, it will put locals’ cattle at risk.  Who will suffer if the pipeline leaks?  The cows, the ranchers, and the Ogallala Aquifer, of course.  A critical piece of the article is this:

Here was one of the best stories she’d ever seen: Conservative American farmers rise up to protect their land. She could use the image of the family farm to reframe the way Nebraskans thought about environmentalism. It wasn’t going to be Save the Sandhill Cranes. It was going to be Save the Neighbors.

To get Nebraskans to respond to environmental issues, you have to engage them on their values, not yours (unless, of course, you share them).  This is the key that environmentalists have missed for decades, and it’s part of the reason why environmentalism is so politicized.  It’s why conservatives tend not to respond to climate-activism framing.

There’s plenty more where this came from.  Stay tuned.
