Weatherdem's Weblog

Bridging climate science, citizens, and policy



Energy and Climate Stories Via Charts

The following charts show different pieces of a sobering story: the US and the world have not done enough to move away from carbon-intensive energy and will not in the foreseeable future.  This shouldn’t come as any great surprise, but I think these charts let us see the story graphically instead of just hearing the words.  Graphics tend to have a larger impact on thought retention, so I’m going to use them to tell this story.


Figure 1. Annual global installations of new power sources, in gigawatts.  [Source: MotherJones via BNEF]

This figure starts the story off on a good note.  To the left of the dotted line is historical data; to the right is BNEF’s projection.  In the future, we expect fewer new gigawatts from coal, gas, and oil, and many more new gigawatts from land-based wind and from small-scale and utility-scale solar photovoltaics (PV).  Thus the good news: within the next couple of years, renewable energy sources will account for more new gigawatts than dirty energy sources.  At the same time, this graph is slightly misleading.  What about existing energy production?  The next chart takes that into account.


Figure 2. Global energy use by generation type, in gigawatts.  [Source: MotherJones via BNEF]

The story just turned sober.  In 2030, coal should account for ~2,000GW of generating capacity compared to ~1,200GW today.  Coal is the dirtiest of the fossil fuels, so absent radical technological innovation and deployment, 2030 emissions will exceed today’s due to coal alone.  We find the same storyline for gas and, to a lesser extent, oil: more generation in 2030 than today means more emissions.  We need lower emissions if we want to reduce atmospheric CO2 concentrations.  The higher those concentrations, the warmer the globe will get until it reaches a new equilibrium.

Compare the two graphs again.  What does the rapid increase in renewable energy generation witnessed over the last decade, and expected to continue through 2030, actually deliver by 2030?  Perhaps ~1,400GW of wind generation (about the same as gas) and up to 1,600GW of total solar generation (more than gas but still less than coal).  This is an improvement over today’s generation portfolio, of course.  But it will not be enough to prevent >2°C mean global warming and all the subsequent effects that warming will have on other earth systems.  The curves delineating fossil fuel generation need to slope toward zero, and that doesn’t look likely to happen before 2030.

Here is the basic problem: billions of people lack reliable energy today.  They want energy and, one way or another, will get it someday.  Thus, total energy generation will continue to increase for decades.  The power mix is up to us.  The top chart will have to look dramatically different for the mix to tilt toward majority, and eventually exclusively, renewable energy.  The projected additions of new renewable energy will have to double, triple, or more to achieve complete global renewable energy generation.  Instead of a couple hundred gigawatts per year, we need a couple thousand gigawatts per year.  That requires a great deal of innovation and deployment – more than even many experts appreciate.

Let’s take a look at the next part of the story: carbon emissions in the US – up until recently the largest annual GHG emitter on the globe.


Figure 3. Percent change in the economy’s carbon intensity 2000-2010. [Source: ThinkProgress via EIA]

As Jeff notes, the carbon intensity of the economy (the amount of carbon released for every million dollars the economy produces) dropped 17.9 percent over those ten years.  That’s good news.  Part of the reason is bad news: the economy became more energy-efficient partly because of the recession.  People and organizations stopped doing some of the most expensive activities, which also happened to be some of the most polluting activities.  We can attribute the rest of the decline to the switch from coal to natural gas.  That switch is good for US emissions but bad for global emissions, because we sell the coal we no longer burn to other countries – as Figure 2 shows.


Figure 4. Percent change in the economy’s total carbon emissions 2000-2010. [Source: ThinkProgress via EIA]

Figure 4 re-sobers the story.  While the economy became less carbon intensive, total emissions from 2000 to 2010 dropped only 4.2%.  My own home state of Colorado, despite having a Renewable Energy Standard that mandates renewables in the energy mix, saw a greater than 10% jump in total carbon emissions.  Part of the reason is that Xcel Energy convinced the state Public Utilities Commission to approve new, expensive coal plants.  Why?  Xcel is a for-profit corporation, and new coal plants added billions of dollars to the positive side of its ledger, especially since it passed those costs on to its ratepayers.

In order for the US to achieve its Copenhagen goal (a 17% reduction from 2005 levels), more states will have to show total carbon emission declines post-2010.  While 2012 US emission levels were the lowest since 1994, we still emit more than 5 billion metric tons of CO2 annually.  Furthermore, the US deliberately chose 2005 as its baseline because emissions were near their historical high that year.  The Kyoto Protocol, by contrast, challenged countries to reduce emissions compared to 1990 levels.  The US remains above 1990 levels, which were just under 5 billion metric tons of CO2.  A 17% cut from 1990 emissions amounts to roughly 850 million metric tons.  Once we achieve that decrease, we can talk about real progress.
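
To make the baseline arithmetic concrete, here is a minimal sketch in Python using the round numbers cited above; the 2005 figure is my assumed round value, and official EPA inventory numbers differ slightly.

```python
# Back-of-the-envelope check of the two emission baselines discussed above.
# Values are in billions of metric tons (Gt) of CO2 per year.
EMISSIONS_1990 = 5.0   # "just under 5 billion metric tons" (from the text)
EMISSIONS_2005 = 6.0   # assumed round value near the historical high
EMISSIONS_2012 = 5.2   # "more than 5 billion metric tons" (from the text)

copenhagen_target = EMISSIONS_2005 * (1 - 0.17)     # 17% below 2005 levels
baseline_1990_target = EMISSIONS_1990 * (1 - 0.17)  # 17% below 1990 levels

print(f"Copenhagen-style target: {copenhagen_target:.2f} Gt CO2/yr")
print(f"1990-based target:       {baseline_1990_target:.2f} Gt CO2/yr")
print(f"Cut needed from 1990:    {EMISSIONS_1990 * 0.17:.2f} Gt (~850 Mt)")
```

Under these assumptions the two targets differ by nearly a billion tons per year, which is why the choice of baseline matters so much in these pledges.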

The bottom line is this: the cumulative amount of carbon emitted into the atmosphere is what matters if we want to limit the total warming that will occur this century and over the next few tens of thousands of years.  There has been a significant lack of progress on that:


Figure 5. Historical and projected energy sector carbon intensity index.

We are on the red-line path.  If that remains our reality through 2050, we will blow past a 560 ppm atmospheric CO2 concentration – twice the pre-industrial value of ~280 ppm – which means we will blow past the 2-3°C sensitivity range that skeptics like to talk about the most.  That temperature range only applies if we limit CO2 concentrations to two times their pre-industrial value.  We are instead on an 800-1,100 ppm concentration pathway, which would mean up to 6°C warming by 2100 and additional warming beyond that.
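
For readers who want to connect concentrations to temperatures themselves, here is a minimal sketch of the standard logarithmic rule of thumb, assuming a 280 ppm pre-industrial baseline and a 3°C-per-doubling sensitivity (a commonly cited central estimate, not a value from this post):

```python
import math

# Equilibrium warming under the standard logarithmic rule of thumb:
# dT = S * log2(C / C0), where S is sensitivity per doubling of CO2.
S = 3.0     # °C per doubling of CO2 (assumed central estimate)
C0 = 280.0  # ppm, pre-industrial concentration

def equilibrium_warming(c_ppm: float) -> float:
    """Equilibrium temperature change (°C) for a CO2 concentration."""
    return S * math.log2(c_ppm / C0)

for c in (560, 800, 1100):
    print(f"{c:>4} ppm -> ~{equilibrium_warming(c):.1f} °C at equilibrium")
```

At 1,100 ppm the rule of thumb lands near the 6°C figure mentioned above.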

The size and scope of the energy infrastructure required to achieve an 80% reduction in US emissions from 1990 levels by 2050 are mind-boggling.  It requires 300,000 10-MW solar thermal plants or 1,200,000 2.5-MW wind turbines or 1,300 1-GW nuclear plants (or some combination thereof) by 2050, because we have to replace existing dirty energy generation facilities as well as meet increasing future demand.  And that’s just for the US.  What about every other country on the planet?  That is why I think we will blow past the 2°C threshold.  As the top graphs show, we’re nibbling around the edges of a massive problem.  We will not see a satisfactory energy/climate policy emerge on this topic anytime soon.  The once-in-a-generation opportunity to do so existed in 2009 and 2010, and national-level Democrats squandered it (China actually has a national climate policy, by the way).  I think the policy answers lie in local and state-based efforts for the time being.  There is too wide a gap between the politics we need and the politics we have at the national level.
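
As a rough plausibility check on those counts, the sketch below converts each technology into the number of units needed to deliver a fixed amount of annual generation.  The generation target and capacity factors are my round assumptions, not numbers from the study the post draws on.

```python
# Rough check of the build-out counts quoted above. Capacity factor is
# the fraction of nameplate output a plant actually delivers over a year;
# it is why ~1,300 nuclear plants can stand in for ~3,000 GW of wind/solar.
TARGET_TWH = 10_000   # assumed annual clean-generation target, in TWh
HOURS_PER_YEAR = 8760

plants = {
    "10-MW solar thermal plant": (0.010, 0.30),  # (GW each, capacity factor)
    "2.5-MW wind turbine":       (0.0025, 0.30),
    "1-GW nuclear plant":        (1.0,    0.90),
}

for name, (gw_each, cap_factor) in plants.items():
    twh_each = gw_each * cap_factor * HOURS_PER_YEAR / 1000  # TWh/unit/yr
    print(f"{name}: ~{TARGET_TWH / twh_each:,.0f} units")
```

With these assumptions the counts land within a factor of about 1.3 of the figures quoted above, which is about as close as a back-of-the-envelope check can get.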



Obamacare’s ‘Cadillac Tax’ Exposes Policy Weaknesses

In 2009 and 2010, I had many discussions with people about the Affordable Care Act (Obamacare).  At the outset, let me explain that real health care reform would have meant expanding Medicare to every American.  Medicare has the lowest overhead of any insurance program and would have provided health care to everybody regardless of income or any other metric.  My fallback position was a Medicare opt-in as part of state-based or national health exchanges.  Let the private for-profit corporations compete against Medicare in the free market.  As conservatives usually say (but ran away from in this instance), let the market decide.  Well, we all know how non-free the market is.  Conservatives and Libertarians love to pick winners: as long as they’re winning.

Instead, President Obama spent two years’ worth of political capital on a search for his First Grand Bargain.  And make no mistake: he got exactly what he wanted.  Instead of health care reform, Americans were saddled with a health insurance giveaway.  Millions of Americans won’t be allowed to make a choice in the market; they will be forced to buy something.  That is a disgusting development in our country’s history.

Here is an anecdote that demonstrates the fundamental weakness of the “reform”: “While it might reduce health care spending, for many people it doesn’t reduce the cost of care.”  If you’re healthy, things will be great because you’ll receive free or cheap preventative care.  If you’re really sick, things will get worse because you’ll pay more and more for the same care you’ve been receiving.  Oops.  As Joan says, “if you have a serious health issue and were previously uninsured because of your pre-existing condition, you can at least get insurance now.”  Note the critical missing piece in that sentence: you won’t get quality care; you’ll get insurance.  Which, depending on your socioeconomic status, means you could get good care or crappy care.  That is the big reform as part of the President’s Grand Bargain.

Joan goes on to say, “The actual health care they receive needs to be made less expensive. That’s where the next steps in reform have to be made.”

Um, duh.  But just when will we make those next reform steps?  That was the elephant in the room in my 2009-2010 discussions with Obamacare zealots.  Nobody was willing to say how they would make those next steps … or when.  The only thing they would say was that it would eventually happen because incrementalism was the proper strategic political choice.  It became clear to me later that incrementalism works for folks in the establishment.  It keeps them employed for years and decades as tiny steps are taken every decade or two.  Meanwhile, Abbey and Casey Bruce’s bills will double in cost.  How many millions of Americans face higher medical bills in 2014 because the establishment folks decided incremental steps are best?  President Obama and a bunch of other folks were reelected in 2012.  Are they pushing additional health care reform?  No, and they won’t.  They did health care reform.  We’ll have to wait until some undetermined point in the future to try for true health care reform again.



Can scientific issues be up for political debate?

The short answer should obviously be yes.  But within the climate change realm, there are some folks who think that scientific realities should dictate political attitudes:

Even as some studies suggest the potential for double-digit warming across the globe, the media has been stubbornly silent, treating climate change as an issue that is still up for political debate, instead of a scientific reality.

That is a dangerous viewpoint to hold and to operate from.  This isn’t an either-or choice to make.  Politics and science are two very different enterprises for many different reasons.  Would these same advocates accept dictated political attitudes based on religious reality?  Of course they wouldn’t.  So why should others blindly adopt their viewpoint?

This is but one example of climate advocates trying to silence others’ opinions – the same behavior they accuse the fossil fuel industry of directing at them.  Which leads us to a rather inevitable conclusion: the fight isn’t about “reality” vs. politics (note the frame – if you don’t agree, you’re not a part of someone’s “reality”).  The fight is over value systems.  Many climate activists are using science as a proxy in a battle that demands other tools.

Another note: if the media isn’t paying “enough attention” to your BIG problem, perhaps the problem lies in your messaging and not the media’s bias.  Doubling down on used-up rhetoric isn’t going to sell your story any better.



NASA & NOAA: April 2013 13th Warmest Globally On Record

According to data released by NASA and NOAA last week, April was the 13th warmest April globally on record.  Here are the data for NASA’s analysis; here are NOAA’s data and report.  The two agencies have slightly different analysis techniques, which in this case resulted in different temperature anomaly values but the same overall ranking.  Most months, the analyses result in different rankings.  The two techniques provide a check on one another and give us confidence that their results are robust.

The details:

April’s global average temperatures were 0.50°C (0.9°F) above normal (1951-1980), according to NASA, as the following graphic shows.  The past three months have a +0.53°C temperature anomaly.  And the latest 12-month period (May 2012 – Apr 2013) had a +0.59°C temperature anomaly.  The time series graph in the lower-right quadrant shows NASA’s 12-month running mean temperature index.  The 2010-2012 downturn was largely due to the latest La Niña event (see below for more) that ended early last summer.  Since then, ENSO conditions have remained in a neutral state (neither La Niña nor El Niño).  Therefore, as previously anomalously cool months fall off the back of the running mean, and barring another La Niña, the 12-month temperature trace should track upward again throughout 2013.
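
For anyone who wants to reproduce a 12-month running mean like NASA’s, here is a minimal sketch; the anomaly series is a made-up placeholder standing in for the GISS monthly global means.

```python
import numpy as np

# 12-month running mean: each point averages the current month and the
# preceding 11, which removes the seasonal cycle from a monthly series.
rng = np.random.default_rng(0)
anomalies = 0.55 + 0.1 * rng.standard_normal(120)  # fake 10-year series, °C

running_mean = np.convolve(anomalies, np.ones(12) / 12, mode="valid")
print(f"{len(running_mean)} running-mean values from {len(anomalies)} months")
```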


Figure 1. Global mean surface temperature anomaly maps and 12-month running mean time series through April 2013 from NASA.

According to NOAA, April’s global average temperatures were 0.52°C (0.94°F) above the 20th century mean of 13.7°C (56.7°F).  NOAA’s global temperature anomaly map for April (duplicated below) shows where conditions were warmer and cooler than average during the month.


Figure 2. Global temperature anomaly map for April 2013 from NOAA.

The preceding two figures also demonstrate the value of having two different analyses.  Despite differences in specific global temperature anomalies, both analyses picked up on the same temperature patterns and their relative strengths.

Both analyses show much cooler than normal conditions over most of North America, Europe, and northeast Asia.  As I’ve discussed elsewhere, this is in response to the abnormal jet stream.  Large, unmoving high pressure centers blocked the jet stream at different locations in the Northern Hemisphere multiple times this winter and spring.  The jet stream therefore assumed a high amplitude pattern where the trough and ridge axes were tens of degrees of latitude apart from one another.  When this happens, very cold air is pulled southward and warm air is pulled northward (look at central Eurasia).  In April 2013, the specific position of the high pressure centers caused cold air to spill southward over land as opposed to over the oceans.  These cold air outbreaks were an advantage for the US in that severe storms were unable to form.  This situation obviously broke down in the past couple of weeks and we have correspondingly seen devastating severe weather outbreaks across the south-central US.

During the second half of last year, an ENSO-neutral state (neither El Niño nor La Niña) began, which continues to this day:


Figure 3. Time series of weekly SST data from NCEP (NOAA).  The highest interest region for El Niño/La Niña is NINO 3.4 (2nd time series from top).

The last La Niña event hit its largest (most negative) magnitude more than once between November 2011 and February 2012.  Since then, tropical Pacific sea-surface temperature anomalies peaked at +0.8°C (y-axis) in September 2012.  You can see the effect the last La Niña had on global temperatures via this NASA time series.  Both the sea surface temperature and land surface temperature time series decreased from 2010 (when the globe reached record warmth) to 2012.  So a natural, low-frequency climate oscillation affected the globe’s temperatures during the past couple of years.  Underlying that oscillation is the background warming caused by humans.  And yet temperatures were still in the top 10 warmest for a calendar year (2012) and for individual months, including through March 2013, in recorded history.  We ascribe a certain status to top-10 events.  April 2013 obviously missed the top-10 threshold, but it remains close to that level of anomalous warmth: the difference in temperature between the 10th and 13th warmest Aprils is measured in tenths of a degree.

Skeptics have pointed out that warming has “stopped” or “slowed considerably” in recent years, hoping to introduce public confusion on this topic.  What is likely going on is quite different: since an energy imbalance exists (less outgoing energy than incoming energy due to atmospheric greenhouse gases) and the surface temperature rise has seemingly stalled, the excess energy is going somewhere.  That somewhere is likely the oceans, and specifically the deep ocean (see figure below).  Before we all cheer about this (since few people want surface temperatures to continue to rise quickly), consider the implications.  If you add heat to a material, it expands.  The ocean is no different; sea levels are rising in part because of heat added to the ocean in the past.  The heat that has entered in recent years won’t manifest as sea-level rise for some time, but it will happen.  Moreover, when the heated water comes back up to the surface, that heat will be released to the atmosphere, which will raise surface temperatures and add water vapor to the warmer atmosphere.  Thus, the immediate warming rate might have slowed down, but we have locked in a higher future warming rate.
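
The expansion mechanism is easy to put numbers on.  Below is an order-of-magnitude illustration only; the warming amount, layer depth, and expansion coefficient are all my round assumptions, not measurements.

```python
# Thermal expansion of seawater: sea-level rise ~= alpha * dT * layer depth
# for a uniformly warmed layer.
ALPHA = 1.5e-4      # per °C, assumed mean expansion coefficient of seawater
LAYER_DEPTH = 700   # m, the upper-ocean layer highlighted in Figure 4
DELTA_T = 0.1       # °C, assumed uniform warming of that layer

rise_m = ALPHA * DELTA_T * LAYER_DEPTH
print(f"~{rise_m * 1000:.0f} mm of sea-level rise from this warming alone")
```

Even a tenth of a degree spread through the upper ocean translates into roughly a centimeter of eventual sea-level rise, which is why the buried heat matters.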


Figure 4. New research that shows anomalous ocean heat energy locations since the late 1950s.  The purple lines in the graph show how the heat content of the whole ocean has changed over the past five decades. The blue lines represent only the top 700 m and the grey lines are just the top 300 m.  Source: Balmaseda et al., (2013)

Paying for recovery from seemingly localized severe weather and climate events is, and always will be, more expensive than paying to increase resilience to those events.  As drought continues to impact US agriculture, as Arctic ice continues to melt to new record lows, and as storms come ashore and impact communities that are not prepared for today’s high-risk events (due mostly to poor zoning and destruction of natural protections), economic costs will accumulate in this and future decades.  It is up to us how much cost we subject ourselves to.  As President Obama begins his second term with climate change “a priority”, he tosses aside the most effective tool available and the one most recommended by economists: a carbon tax.  Every other policy tool will be less effective than a Pigouvian tax at minimizing the actions that cause future economic harm.  It is up to the citizens of this country, and others, to take the lead on this topic.  We have to demand common sense actions that will actually make a difference.  But be forewarned: even if we take action today, we will still see more warmest-ever La Niña years, more warmest-ever El Niño years, more drought, higher sea levels, increased ocean acidification, more plant stress, and more ecosystem stress.  The biggest difference between the efforts of the 1980s and 1990s to scrub sulfur and CFC emissions and future efforts to reduce CO2 emissions is this: the first two yielded almost immediate results, while it will take decades to centuries before CO2 emission reductions produce tangible results humans can see.  That is part of what makes climate change such a wicked problem.



State of Polar Sea Ice – April 2013: Arctic Sea Ice Decline and Antarctic Sea Ice Gain

Global polar sea ice area in April 2013 tracked back to climatological normal conditions (1979-2009) from the temporary surplus of the previous two months.  This follows January and February’s improvement from September 2012’s significant negative deviation from normal conditions (from -2.5 million sq. km. to +750,000 sq. km.).  While Antarctic sea ice gain was slightly more than the climatological normal rate following the austral summer, Arctic sea ice loss was slightly more than normal during the same period.

Arctic Sea Ice

According to the NSIDC, sea ice loss during April measured 1.5 million sq. km.  This melt rate was approximately normal for the month, so April’s extent remained below average again.  Instead of measuring near 15 million sq. km., April 2013’s average extent was only 14.37 million sq. km., a 630,000 sq. km. difference.  In terms of annual maximum values, 2013’s 15.13 million sq. km. was 733,000 sq. km. lower than normal.

Barents Sea (Atlantic side) ice once again fell below its climatological normal value during the month after remaining low during most of the winter.  Kara Sea (Atlantic side) ice temporarily recovered from its wintertime low extent and reached normal conditions – a difference from spring 2012’s conditions – before the 2013 melt caused the extent to fall below normal again.  The Bering Sea (Pacific side), which saw ice extent growth due to anomalous northerly winds in 2011-2012, saw similar conditions from December 2012 through February 2013.  This caused anomalously high ice extent in the Bering Sea again this winter.  As it did previously this winter, an extended negative phase of the Arctic Oscillation allowed cold Arctic air to move far southward and allowed warmer than normal air to move north over parts of the Arctic.  The AO’s tendency toward its negative phase in recent winters is related to the lack of sea ice over the Arctic Ocean in September each fall.  Warmer air slows the growth of ice, especially ice thickness.  This slow growth allows more melt than normal during the subsequent summer, which helps establish and maintain negative AO phases.  This is a destructive annual cycle for Arctic sea ice.

In terms of climatological trends, Arctic sea ice extent in April has decreased by 2.3% per decade, the lowest of any calendar month.  This rate is closest to zero in the late winter/early spring months and furthest from zero in late summer/early fall months.  Note that this rate also uses 1979-2000 as the climatological normal.  There is no reason to expect this rate to change significantly (much more or less negative) any time soon, but increasingly negative rates are likely in the foreseeable future.  Additional low ice seasons will continue.  Some years will see less decline than other years (e.g., 2011) – but the multi-decadal trend is clear: negative.  The specific value for any given month during any given year is, of course, influenced by local and temporary weather conditions.  But it has become clearer every year that humans have established a new climatological normal in the Arctic with respect to sea ice.  This new normal will continue to have far-reaching implications on the weather in the mid-latitudes, where most people live.




April 2013 CO2 Concentrations: 398.35 ppm

During April 2013, the Scripps Institution of Oceanography measured an average CO2 concentration of 398.35 ppm at their Mauna Loa Observatory in Hawai’i.

This value is a big deal.  Why?  Because 398.35 ppm is not only the largest CO2 concentration value for any April in recorded history, it is the largest CO2 concentration value for any month in recorded history.  More on that below.  This year’s April value is 1.90 ppm higher than April 2012’s!  Year-over-year differences for a given month typically range between 1 and 2 ppm, so this jump of 1.90 ppm is within that range.  It is also ~0.9 ppm less than March’s year-over-year change and 1.47 ppm less than February’s year-over-year change of 3.37 ppm.  The unending trend toward higher concentrations with time, no matter the month or specific year-over-year value, as seen in the graphs below, is more significant.
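
Here is the year-over-year arithmetic spelled out, using only values quoted in this and the preceding posts (the April 2012 value is implied by the 1.90 ppm jump):

```python
# Year-over-year (same month, consecutive years) concentration changes.
apr_2013 = 398.35   # ppm, measured (from the text)
apr_2012 = 396.45   # ppm, implied by the 1.90 ppm jump
feb_yoy = 3.37      # February's year-over-year change (from the text)

apr_yoy = apr_2013 - apr_2012
print(f"April year-over-year change: {apr_yoy:.2f} ppm")
print(f"Smaller than February's change by {feb_yoy - apr_yoy:.2f} ppm")
```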

Let’s get back to that all-time high concentration value.  The yearly maximum monthly value normally occurs during May.  Last year was no different: the 396.78 ppm concentration in May 2012 was the highest value reported last year and, prior to the last three months, in recorded history (neglecting proxy data).  I expect May of this year to produce another all-time record value.  That value will hold first place until February 2014.  I wrote the following three months ago:

If we extrapolate last year’s maximum value out in time, it will only be 2 years until Scripps reports 400 ppm average concentration for a singular month (likely May 2014; I expect May 2013’s value will be ~398 ppm).  Note that I previously wrote that this wouldn’t occur until 2015 – this means CO2 concentrations are another climate variable that is increasing faster than experts predicted just a short couple of years ago.

For the most part, I stand by that prediction.  But actual concentration increases might prove me wrong.  Here is why: the difference in CO2 concentration between May 2012 and March 2012 was 2.33 ppm (396.78 – 394.45).  If we do the simplest thing and add that same difference to this March’s value, we get 399.67 ppm.  That is awfully close to 400 ppm, but less than the 399.93 ppm extrapolation I performed in February.  It’s also close to the 399.3 ppm extrapolation I calculated in March.  I discussed May 2013’s projection with Sourabh after February’s post.  They predicted a 399.5-400 ppm concentration for May 2013.  For the second month in a row, I think NOAA will measure May 2013’s mean concentration near 399.3 ppm.
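
The extrapolation itself is one line of arithmetic; here it is as a sketch, with the March 2013 value implied by the 399.67 ppm result above:

```python
# Seasonal-difference extrapolation: assume this year's March-to-May rise
# matches last year's, then add it to this year's March value.
may_2012, mar_2012 = 396.78, 394.45   # ppm (from the text)
mar_2013 = 397.34                     # ppm, implied by the 399.67 result

seasonal_rise = may_2012 - mar_2012   # 2.33 ppm
may_2013_estimate = mar_2013 + seasonal_rise
print(f"Estimated May 2013 concentration: {may_2013_estimate:.2f} ppm")
```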


Figure 1 – Time series of CO2 concentrations measured at Scripps’ Mauna Loa Observatory in April of each year from 1958 through 2013.

CO2Now.org added the `350s` and `400s` to this month’s graphic.  I suppose they’re meant to imply concentrations shattered 350 ppm back in the 1980s and are pushing up against 400 ppm now in the 2010s.

How do concentration measurements change within a calendar year?  The following two graphs demonstrate this.


Figure 2 – Monthly CO2 concentration values from 2009 through 2013 (NOAA).  Note the yearly minimum observation is now in the past and we are one month removed from the yearly maximum value.  NOAA is likely to measure this year’s maximum value near 399ppm.


Figure 3 – 50 year time series of CO2 concentrations at Mauna Loa Observatory.  The red curve represents the seasonal cycle based on monthly average values.  The black curve represents the data with the seasonal cycle removed to show the long-term trend.  This graph shows the recent and ongoing increase in CO2 concentrations.  Remember that as a greenhouse gas, CO2 increases the radiative forcing of the Earth, which increases the amount of energy in our climate system.

In previous posts on this topic, I showed and discussed historical and projected concentrations at this point in the post.  I will skip that for now because I saw a graphic last month that provides a useful, different context for the same conversation:


Figure 4 – CO2 concentration (top) and annual average growth rate (bottom). Source: Guardian

The top part of Figure 4 should look familiar – it’s the black line in Figure 3.  The bottom part is the annual change in CO2 concentrations.  If we fit a line to the data, the line would have a positive slope, which means annual changes are increasing with time.  So CO2 concentrations are increasing at an increasing rate – not a good trend with respect to minimizing future warming.  In the 1960s, concentrations increased at less than 1 ppm/year (average rate of increase in the bottom graph by decade).  In the 2000s, concentrations increased at 2.07 ppm/year.  This isn’t surprising – CO2 emissions continue to increase decade after decade.  Natural systems are not equipped to remove CO2 emissions quickly from the atmosphere.  Indeed, natural systems will take tens of thousands of years to remove the CO2 we emitted in the course of a couple short centuries.  Human systems do not yet exist that remove CO2 from any medium (air or water).  They are not likely to exist for some time.  So NOAA will extend the right side of the above graphs for years and decades to come.
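
The “fit a line to the data” step looks like this in practice.  Only the 1960s and 2000s rates come from the text; the intermediate decades are assumed placeholders for NOAA’s published decadal averages.

```python
import numpy as np

# Linear fit to decadal-average CO2 growth rates (ppm/yr vs. decade midpoint).
decade_midpoints = np.array([1965, 1975, 1985, 1995, 2005])
growth_rates = np.array([0.9, 1.3, 1.6, 1.5, 2.07])  # ppm/yr, mostly assumed

slope, intercept = np.polyfit(decade_midpoints, growth_rates, 1)
print(f"Trend: +{slope * 10:.2f} ppm/yr of additional growth per decade")
```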

The greenhouse effect details how these increasing concentrations will affect future temperatures.  The more GHGs (CO2 and others) are in the atmosphere, all else equal, the more radiative forcing the GHGs cause.  More forcing means warmer temperatures as energy is re-radiated back toward the Earth’s surface.  Conditions higher in the atmosphere affect this relationship, which is what my volcano post addressed.  A number of medium-sized volcanoes injected SO2 into the stratosphere (which is above the troposphere – where we live and our weather occurs) in the last decade.  Those SO2 particles, because of their chemical and radiative properties, reflected incoming solar radiation.  So while we emitted more GHGs into the troposphere, less radiation entered the troposphere (the bottom layer of the atmosphere) in the past 10 years than in the previous 10 years.  With less incoming radiation, the GHGs re-emitted less energy toward the surface of the Earth.  This is likely part of the reason why the global temperature trend leveled off in the 2000s after its relatively rapid run-up in previous decades.
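
The forcing side of that relationship has a standard simplified expression (Myhre et al. 1998), which readers can evaluate for any concentration; this sketch is independent of the volcano discussion above:

```python
import math

# Simplified CO2 radiative forcing: dF = 5.35 * ln(C / C0) W/m^2,
# with C0 the pre-industrial concentration (Myhre et al. 1998).
C0 = 280.0  # ppm

def co2_forcing(c_ppm: float) -> float:
    """Radiative forcing (W/m^2) relative to pre-industrial CO2."""
    return 5.35 * math.log(c_ppm / C0)

print(f"Forcing at 398 ppm: {co2_forcing(398):.2f} W/m^2")
```

Stratospheric aerosols temporarily offset part of this forcing, which is the leveling mechanism described above.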

This situation is important for the following reason.  Once the SO2 falls out of the atmosphere, the additional incoming radiation will encounter higher GHG concentrations than was present in the late 1990s.  As a result, we will likely see a stronger surface temperature response sometime in the future than the response of the 1990s.

The remainder of the reason is the oceans.  Thanks to the Interdecadal Pacific Oscillation’s most recent negative phase, the Pacific in particular absorbed heat energy near the surface and transported it to the deep ocean instead of allowing the heat to accumulate near the surface.  The following graphic shows how heat absorption by global oceans changed in recent years:


Figure 5. New research that shows anomalous ocean heat energy locations since the late 1950s.  The purple lines in the graph show how the heat content of the whole ocean has changed over the past five decades. The blue lines represent only the top 700 m and the grey lines are just the top 300 m.  Source: Balmaseda et al., (2013)

This temporary energy transport to the deep ocean is good news in the short term: global surface temperatures slowed their rise during the 2000s compared to the 1990s and 1980s.  That does not mean, however, that global warming has stopped, as ideological skeptics want you to believe.  That heat energy still exists in the Earth’s climate system.  The oceans move heat around the planet just as the atmosphere does.  The very large amount of extra heat currently in the deep ocean will eventually come back up to the surface.  When it does, it will add to the surface warming signal.  So we can expect an extra rise in global mean surface temperatures sometime in the future.  Thus, this is not good news in the long term.

The rise in CO2 concentrations will slow down, stop, and reverse when we decide it will.  It depends primarily on the rate at which we emit CO2 into the atmosphere.  We can choose 350 ppm or 450 ppm or any other target.  That choice is dependent on the type of policies we decide to implement.  It is our current policy to burn fossil fuels because we think doing so is cheap, albeit inefficient and without proper market signals.  We will widely deploy clean sources of energy when they are cheap, the timing of which we control.  We will remove CO2 from the atmosphere when we have cheap and effective technologies and mechanisms to do so, which we also control.  These future trends depend on today’s innovation and investment in research, development, and deployment.  Today’s carbon markets are not the correct mechanism, as they are aptly demonstrating.  We will limit future warming and climate effects when we choose to do so.



Denver’s April 2013 Climate Summary With A Bonus

During the month of April 2013, Denver, CO (link updated monthly) recorded a 74°F difference between its maximum and minimum temperatures.  This fact tells us nothing about how temperatures compared to climatological norms, however.  For the entire month, Denver was 4.7°F below normal (41.7°F vs. 46.4°F).  The maximum temperature of 80°F was recorded on the 29th, while the minimum temperature of 6°F was recorded on the 10th.  Here is the time series of Denver temperatures in April 2013:


Figure 1. Time series of temperature at Denver, CO during April 2013.  Daily high temperatures are in red, daily low temperatures are in blue, daily average temperatures are in green, climatological normal (1981-2010) high temperatures are in light gray, and normal low temperatures are in dark gray. [Source: NWS]

There is a big disparity between 2013 temperatures and normal temperatures, especially the daily maxima.  Three outbreaks of Arctic air impacted Denver during the month, which set record low temperatures on four different days.  This graph also shows something else that is eye-opening: five daily maximum temperatures were equal to or lower than the climatological daily minimum temperature!  As someone who was ready for spring to spring, I found April a disappointing weather month.
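
That “highs below the normal lows” statistic is easy to compute from a daily record.  The sketch below uses short made-up placeholder lists; the real inputs would be the NWS daily highs and the 1981-2010 normal lows for April.

```python
# Count days whose observed high failed to reach the climatological low.
observed_highs = [38, 55, 61, 22, 70, 33, 80]  # °F, placeholder values
normal_lows = [34, 35, 35, 36, 36, 37, 37]     # °F, placeholder values

cold_days = sum(h <= nl for h, nl in zip(observed_highs, normal_lows))
print(f"{cold_days} day(s) with a high at or below the normal low")
```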

But it also got me to thinking about the difference between spring 2013 and spring 2012.  As many of us remember, temperatures in the US in 2012 were very warm compared to climatological norms.  So how different were temperatures in Denver in February-March-April 2013 versus 2012?  I decided to take a look.  Let’s start with extending the dates in Figure 1 back to the beginning of February 2013:


Figure 2. Time series of temperature at Denver, CO during February-April 2013.  Daily high temperatures are in red, daily low temperatures are in blue, climatological normal (1981-2010) high temperatures are the top dark gray line, and normal low temperatures are the bottom dark gray line. [Source: NWS]

This graphic demonstrates the same story that I wrote above, as well as in my March and February Denver Climate Summary posts.  February was obviously colder than normal due to extended cold air masses over the area.  March and April were also colder than normal, but this was due to vigorous mid-latitude cyclones that brought Arctic air masses south over the area.  This is evident in the significant dips in both maximum and minimum daily temperatures: one at the beginning of March, another at the end of March, and three in April.

With this chart in mind, let’s look at the difference between 2012 and 2013.  First, daily maximum temperatures:


Figure 3. Time series of maximum temperature at Denver, CO during February-April 2012 and 2013.  2013 temperatures are in brick-red, 2012 temperatures are in red, and climatological normal (1981-2010) high temperatures are the dark gray line with green crosses. [Source: NWS]

My memory of 2012’s maximum temperatures was close to reality.  February 2012 was colder than I remembered, but my memory was likely colored by the warmth of April 2012 and the record-setting daily highs in the summer of 2012.  Figure 3 shows a very large difference between daily maximum temperatures in 2012 and 2013, especially after the 22nd of March.  I didn’t remember the cold snap on April 3, 2012.  This graphic shows, by proxy, the lack of spring synoptic storms in 2012.  Daily maximum temperatures rarely fell below the normal for the date.  Instead, April temperatures were as much as 20°F warmer than normal on some dates, and regularly 10°F warmer than normal.  In contrast, 2013 temperatures were often 25-30°F colder than normal.  The difference between the two years’ temperatures is a measure of interannual weather variability.  I have more on that below.


Figure 4. Time series of minimum temperature at Denver, CO during February-April 2012 and 2013.  2013 temperatures are in blue, 2012 temperatures are in green, and climatological normal (1981-2010) low temperatures are the dark gray line with brown pluses. [Source: NWS]

Again, February 2012’s temperatures were similar to February 2013’s.  The specific dates of temperature swings obviously vary between the two years.  March 2012 and March 2013 also look similar, up until the 22nd of March (see maximum temperatures above also).  Thereafter, the time series diverge, with much colder air in place over Denver four different times through the end of April.  2012 had warmer than normal minimum temperatures through most of April.  The combination of warmer than normal nights and days and a relative lack of precipitation in 2012 set the stage for the record-setting warmth of the summer as well as the rapid onset of drought conditions, which are still largely present now.

Interannual Variability

I have written hundreds of posts on the effects of global warming and the evidence of climate change effects within the temperature signal.  This series of posts takes a very different look at conditions.  Instead of multi-decadal trends, this series looks at highly variable weather effects on a very local scale.  The interannual variability I’ve shown above is a part of natural change.  Climate change influences this natural change – on long time frames.  The climate signal is not apparent in these figures because they cover too short a duration.  The climate signal is instead apparent in the “normals” calculation, which NOAA updates every ten years.  The most recent “normal” values cover 1981-2010.  The temperature values of 1981-2010 are warmer than the 1971-2000 values, which are warmer than the 1961-1990 values.  The interannual variability shown in the figures above will become a part of the 1991-2020 through 2011-2040 normals.
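
The sketch below shows how those decade-stepped 30-year normals are built from an annual series; the warming trend in the synthetic data is a placeholder for a real station record.

```python
import numpy as np

# NOAA-style "normals": average each 30-year window, stepping by decade.
years = np.arange(1951, 2011)
annual_mean_temp = 46.0 + 0.03 * (years - years[0])  # °F, synthetic trend

for start in (1961, 1971, 1981):
    in_window = (years >= start) & (years <= start + 29)
    normal = annual_mean_temp[in_window].mean()
    print(f"{start}-{start + 29} normal: {normal:.1f} °F")
```

Each successive window averages in warmer recent years, so the normals themselves creep upward – exactly the pattern described above.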

Precipitation

Precipitation was above normal again during April 2013, extending this new trend to three months.  During the month, 1.87″ of liquid water equivalent precipitation fell, compared to 1.71″ normally.  The wettest April on record was in 1983 when 4.56″ of precipitation fell.  There were three notable weather events during April: a 6″+ snowstorm on the 9th, a 7″+ snowstorm on the 15th, and a 5″+ snowstorm on the 22nd.  In total, the NWS recorded 20.4″ of snow.

The recent precipitation surplus reduced northeast CO drought severity in the last three months, but has not yet broken the drought.  Above-average precipitation will have to fall for longer than three months for that to happen.  The NWS expects continued drought conditions across most of Colorado through the next three months.  Additional improvement in eastern Colorado might occur, but NOAA and the CPC expect western Colorado’s drought to remain the same or worsen.



Ideology and Misperception in Energy and Climate

I could write a dissertation on this topic and spend the rest of my life researching and publishing on it.  I will have to settle for a short blog post for now, because my own research is in need of my attention.

People posted a number of tweets and articles on how “Political ideology affects energy-efficiency attitudes and choices“, which is the title of a new PNAS article.  The upshot: ideology trumps the free market.  This isn’t a surprise to me anymore – I’ve studied plenty of cases in the past two years that demonstrate this phenomenon.  In this case, peoples’ purchases of energy-efficient light bulbs were most influenced by the bulb’s labeling.  The study used two stickers: “Protect the Environment” or blank.  In both cases, the researchers made the same bulb benefits (energy use & cost) available to each potential purchaser.  The only difference was the presence of a blank or pro-environment sticker on the packaging.  With the pro-environment sticker, conservatives were less likely to purchase the CFL bulb.  Without it, conservatives and liberals were equally likely to purchase the CFL bulb.  That behavior isn’t rational, and rationality is a significant assumption of modern economic theory.  The result shows, unsurprisingly, that peoples’ behavior depends on their personal ideology and value system.  This has obvious implications for climate change activists: you have to operate within the value system of your targeted audience if you want them to receive your proposals well.  Beating the same drums harder won’t make conservatives care about climate change.

Climate groups are willfully failing elsewhere.  A new poll from the Yale Project on Climate Change Communication and the George Mason University Center for Climate Change Communication demonstrates that increasing numbers of Americans are drawing scientifically unsupported causal links between recent weather events and climate change.  According to 50% of respondents, global warming made the warmest year on record in the US (2012) more severe.  A similar number believe the ongoing US drought is worse due to global warming.  The results go on and on.

Here is the rub: these beliefs have no basis in scientific fact.  2012 US temperatures were largely influenced by natural interannual variability.  The year was warmer than 1998 by more than 1°F, which is significant.  But identifying a global warming signal in one year’s temperature data for the US is beyond the current capabilities of science.  We can say more robustly that the 2000s were significantly warmer than the 1990s, which were warmer than the 1980s, etc.  2012’s temperatures were extreme, and they had implications that are still being felt by human and ecological systems.  The important point is this: are existing systems capable of handling today’s weather extremes?  If not, we should do something.

The belief in climate change-enhanced drought is also unsupported, as I wrote a couple of weeks ago.  Initial findings from a NOAA-led team were unable to detect a global warming-related signal in the onset, magnitude, or extent of the extraordinary 2012 drought.  This isn’t particularly surprising when you consider that the last two droughts of similar extent and severity occurred in the 1950s and 1930s – prior to much anthropogenic forcing.  Specifically, they found that “The interpretation is of an event resulting largely from internal atmospheric variability having limited long lead predictability.”  Again, this drought is producing effects, but it isn’t directly attributable to climate change.  The question remains: are existing systems capable of handling these types of extreme events?  If they aren’t, we should do something about them, not draw unscientific causal linkages in an effort to build support for change.

The IPCC’s SREX report (Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation), issued just last year, reinforces this message.  There is a detectable global warming signal in a few measurable parameters such as temperature, water vapor, and sea level change.  But the climate system retains a great deal of natural variability which scientists do not fully understand.  Climate conditions will change in the next 90 years, but the likelihood of those changes varies.  Weather conditions may or may not change.  Their inherent transience makes it difficult to ascribe causal factors behind any changes.  Note further that climate projections of the 2090s are not climate conditions of the 2090s or 2010s.  Identifying likely future changes does not translate to detecting those changes today.

Yale and George Mason should digest their poll results along with the latest guidance from scientific peer-reviewed literature to help guide their communication efforts moving forward.  Given the results of this latest poll, they have their work cut out for them.  Framing, whether it is related to selling CFLs to a diverse public or differentiating between weather and climate, is critically important in climate communication.