Weatherdem's Weblog

Bridging climate science, citizens, and policy



Research: Updated Projections of Future Sea Level Rise

An international team of climate scientists led by Anders Levermann wrote a paper that appeared in the Proceedings of the National Academy of Sciences (PNAS) of the US describing long-term (2,000-year) sea level changes in response to different stabilized temperature thresholds.  You can find a short Reuters summary of the paper here.  I will provide more detail and share some observations on this paper, which has garnered a good amount of attention in climate activist circles since publication.

First, a little historical perspective.  Global mean sea levels rose about 0.2m across the 20th and early 21st centuries.  Prior to research conducted in the past five years, projections of additional 21st-century sea-level rise ranged from another 0.2m to 2.0m.  These projections did not, in general, consider feedbacks; the parent simulations did not include cryosphere processes (i.e., melting glaciers and the land-based Greenland and Antarctic ice sheets).  More recent research included more feedbacks and cryosphere processes, but their treatment remains immature.  Additionally, recent research started to examine projections based on realistic emissions scenarios, as researchers came to accept that policymakers are unlikely to enact meaningful climate policy any time soon.  As such, sea-level projection ranges increased, which is where this latest paper comes in.

Levermann et al. reported a sea-level rise projection of 2.3 m per °C of warming over the next two thousand years.  Benjamin Strauss’s PNAS paper put this into context:

[W]e have already committed to a long-term future sea level >1.3 or 1.9m higher than today and are adding about 0.32 m/decade to the total: 10 times the rate of observed contemporary sea-level rise

Thus, if global temperatures rise only 1°C and stabilize there (an extremely unlikely scenario), sea levels two thousand years from now could be 2.3 m higher.  This might not sound like much, but an additional 7.5 feet of sea level rise would inundate the homes of 1.5 million Americans at high tide.  With 2°C, sea levels could rise 4.6 m.  On our current emissions pathway, global mean temperatures would rise 4°C, which would result in an additional 9.2 m of sea level rise.  That’s 30 feet higher than today!  Levermann notes that these higher sea level projections are supported by sea level heights that occurred in the distant past (paleoclimate), even with their associated uncertainties.
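As a quick back-of-the-envelope sketch (assuming, as a simplification on my part, that the paper's 2.3 m/°C headline figure scales linearly with warming), the arithmetic above works out as:

```python
# Committed long-term (~2,000-year) sea level rise, using the Levermann et al.
# headline figure of ~2.3 m per °C of sustained warming.  Treating the scaling
# as linear is an illustrative simplification, not the paper's full method.
M_PER_DEG_C = 2.3
FEET_PER_METER = 3.281

for warming_c in (1, 2, 4):
    rise_m = M_PER_DEG_C * warming_c
    print(f"{warming_c}°C sustained warming: ~{rise_m:.1f} m (~{rise_m * FEET_PER_METER:.1f} ft)")
```

Running this reproduces the figures in the text: roughly 7.5 ft at 1°C and about 30 ft at 4°C.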

According to Strauss’s analysis, such a rise would threaten more than 1,400 municipalities – and those are just U.S. municipalities that exist today, not tomorrow.  Globally, billions of people would be adversely affected.  What social stresses would billions of people moving inland exert?  The U.S. experienced its own small glimpse into this future post-Hurricane Katrina as a few thousand people permanently abandoned their below-sea level neighborhoods.

And now a couple of important points.  Some of the processes Levermann utilized involved linear change.  In complex systems like the Earth’s climate, very few changes are linear; many more are exponential.  When I discuss linear and exponential change with my students, I include a number of different examples because our species doesn’t easily grasp exponential change.  We typically severely underestimate the final value of something that changes exponentially.  If changes within the climate system occur exponentially, the Levermann projections won’t hold, and their estimate will likely prove an underestimate.

A USA Today article on the Levermann paper has this quote:

Looking at such a distant tomorrow “could scare people about something that might not happen for centuries,” says Jayantha Obeysekera of the South Florida Water Management District, a regional government agency. He says such long-term projections may not be helpful to U.S. planners who tend to focus on the next few decades.

Should people be scared that their communities will be underwater in 2,000 years?  I don’t think they will be.  Few people pay attention to trends that will affect them in 2 or 20 years; 200 or 2,000 years are well beyond anybody’s individual concern.  But I think society as a whole should examine this updated projection.  Do we want to condemn one-third of Florida to the bottom of the Atlantic Ocean?  Do we want to condemn thousands of towns and cities to that fate, even if the time horizon is well beyond our lifetimes?

Moving beyond the inevitable question regarding fear, what do these results mean?  We need to include results like these in planning processes.  If nothing else, planners and policymakers have more realistic estimates of likely future sea level in hand.  Those estimates will continue to change (hopefully for the better) with additional research.  But decision-making shouldn’t stop with the expectation that some future projection might be perfect because it won’t be.  The decisions we make today will have profound effects on the eventual level of the sea in the distant future.  There will also be countless effects on the climate, societies, and ecosystems until we reach that level.   That is what today’s decision-making needs to address.

Climate Central has a useful map for investigating how sea level rise under different warming thresholds would affect US states at different points in the future.



Tropical Storm Dorian Moving West Across Atlantic

The fourth named storm of the 2013 Atlantic hurricane season formed earlier this week off the west coast of Africa.  T.S. Dorian is moving west under the influence of a strong mid-level ridge.  He encountered cool sea-surface temperatures a couple of days ago.  Though he has moved over warmer waters since then, high vertical wind shear and dry, stable air to his north have kept him at Tropical Storm strength.  Dorian is a compact tropical system and currently shows no banding features on satellite imagery.

T.S. Dorian’s position as of this morning was 17.7N, 43.4W with an estimated central pressure of 1006mb.  He is moving WNW @ 21mph and has maximum sustained winds of 50mph.

The National Hurricane Center’s official track forecast keeps Dorian south of 20N through the weekend as he approaches the Lesser Antilles from the northeast.  By Monday morning, the NHC’s current forecast has Dorian north of Puerto Rico as a Tropical Storm.  The forecast then shows Dorian north of the Dominican Republic Tuesday morning and just off-shore eastern Cuba Wednesday morning.  By Wednesday, the track uncertainty cone extends from Jamaica (south of Cuba) to the Bahamas (north of Cuba and east of southern Florida).  The official intensity forecast keeps Dorian at Tropical Storm strength through Wednesday.

Impacts to the United States, if any, will not occur until the second half of next week.



Australia Giving Up On Relatively Successful Carbon Market

Australia voted last week to scrap its carbon tax and replace it with a much less economically efficient cap-and-trade scheme.  The pro-business Reuters article acknowledges the only “positive” that results from this decision: businesses will save money.  Well, hallelujah.  I’m sure today’s children will be immensely grateful when they’re adults living with the resultant climate change effects that Australian businesses avoided paying for while saving a few billion dollars in the 2010s.  That’s one way to look at this news.  Let’s flesh out the landscape before throwing Australia under the bus too quickly.

To be fair, Australia simply moved up the date when they joined … the European carbon “market”.  You remember, that’s the market that severely over-supplied carbon credits at its outset and refused earlier this year to remove some of those excess credits for even a mere two years.  In essence, the European carbon market doesn’t work.  How can you tell?  Carbon costs €4.2/tCO2 today.  When the European market started, the cost was €31/tCO2.  At less than one-seventh of the original price, the market signal is clear: there are far too many allowances in the European market.  Have greenhouse gas emissions (note: CO2 isn’t the only GHG!) fallen in the EU since the market’s inception?  Yes, but this is a result of the continued economic malaise the Europeans inflict on themselves, as described by the European Environment Agency’s most recent report.
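A minimal sketch of that price signal, using only the two figures just quoted:

```python
# EU ETS carbon price at market inception versus today, per the figures above.
start_price_eur = 31.0   # €/tCO2 when the market started
today_price_eur = 4.2    # €/tCO2 at the time of writing

ratio = today_price_eur / start_price_eur
print(f"today's price is {ratio:.1%} of the original ({1 - ratio:.0%} decline)")
```

That works out to roughly 13.5% of the starting price, an 86% collapse in the price signal.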

There is a temporary benefit to the earlier Australian move to the EU’s ETS: the flow of carbon credits is one-way, from Europe to Australia.  Australia can’t export credits until July 2018.  So in the short term, Australia could help relieve the over-supply of EU carbon credits.  This might help raise the carbon price back to more realistic levels, but that won’t happen until 2016 at the earliest because of lower emissions and permit demand in Australia.

There are two big negative effects of moving from a fixed tax to a floating market.  The first is that carbon will become much cheaper in Australia: from A$25.40 per tonne to A$6 per tonne.  Is carbon really only worth A$6?  In an over-supplied market, perhaps it is.  The fact that not all industries participate in the carbon market means that the true carbon price is manipulated.  Of course, as much as folks like to talk about “free markets”, most markets are heavily manipulated by vested interests.  The second negative effect is local: the move removes A$3.8 billion from the Australian federal budget over four years.  Australia’s Prime Minister Kevin Rudd proposed to make up this budget shortfall by “removing a tax concession on the personal use of salary-sacrificed or employer-provided cars.”  Good luck with that, Mr. Rudd.  Everybody is loath to give up a financial benefit once they receive it.  Look – more market manipulation!

Australian coal companies were more than happy to propagate misinformation to Australian energy consumers: electricity price increases were due exclusively to the carbon tax!  This highlights a common problem with any carbon-pricing scheme: special interests can more easily spread misinformation and disinformation (and are often happy to do so!) than market proponents can spread true information.  The reason is often quite simple: the truth is complex and consumers don’t want to invest the time to understand why they pay the prices they pay.  How many consumers demanded energy utilities stop raising prices before carbon market inception?  Then who was responsible for price increases?  “Market forces” is the lame excuse dished out to the masses.  How about the relentless, unquenchable hunger for ever-rising profits?  Somehow, that’s alright, but accurately pricing a commodity is heresy.

An additional piece of context: Australia suffered from record heat waves, droughts, and floods in the past ten years.  The Australian public widely accepts that climate change is related to these disasters, and widely expresses a desire to “take action”.  Well, the government took action, and that same public cried uncle over slightly higher utility bills.  This proves the common refrain: people support climate policies … so long as those policies are absolutely free.  That belief smacks into reality awfully quickly.  It also demonstrates that there is no such thing as a “Climate Pearl Harbor” that leads to unequivocal support for a given climate policy.  The slow-acting nature of climate change works strongly against widespread, effective climate policy.



Current & Future U.S. Heat Waves

A substantial portion of the U.S. population experienced a heat wave during the past week.  Due to the number of people affected, the media spent some time on the topic.  As opposed to places like Las Vegas or Phoenix, where the “heat is supposed to happen”, folks normally accustomed to rather pleasant summer conditions experienced real heat again.  Heat waves of various intensities happen every year.  This heat wave is rather intense: it is breaking some heat records.  Some interesting factoids:

Temperatures at Newark Liberty Airport in New Jersey were recorded at 98 degrees at 1 p.m. local time on Friday, as the mercury hit 93 in Central Park. John F. Kennedy Airport in Queens, New York, recorded temperatures of 100 degrees on Thursday, beating out the previous record set for that date a year ago, and on Friday the heat index there reached 108.

Electricity usage soared to an all-time high in New York City as the work week closed out, provider Con Edison announced, as service hit a peak of 13,214 megawatts around 2 p.m. local time. The previous record was 13,189 megawatts on July 22, 2011, according to the company.

So, some serious heat and serious energy consumption.  The latter proves interesting to look at in more detail: if warming trends continue, power plants will be unable to operate like we expect them to due to water and infrastructure cooling requirements.  That spells trouble for people: the worst heat waves of the future might be accompanied by temporary brownouts and blackouts.  How manageable will heat waves be with no A/C?

What about the warming trend?  If we stay on our current greenhouse gas emissions pathway (the highest considered by climate models), look at the potential number of weeks with 100°F+ temperatures in 2090-2099:


Figure 1 – Projection of the number of weeks per year (2090-2099) with daily maximum temperatures exceeding 100°F, derived from the A1FI emissions pathway.

With this heat wave fresh in mind, imagine what it will be like later this century when there is more than one excessive heat wave per year in the Midwest and along the east coast.  Instead of five days of misery, what will 25 days be like?  How about 50 days of 100°F heat in Virginia, North Carolina, Kentucky, Illinois, Iowa, and Nebraska?  When 100°F daytime heat dominates one, two, or even three months every year and high nighttime temperatures accompany it, this week’s heat wave will seem refreshing by contrast.

That’s how we feel in Denver, CO this year.  In 2012, we had 73 total 90°F+ days – 13 of them at 100°F+ – and June 2012 ran 7.6°F warmer than normal; summer 2013 has been closer to average.  Yes, it’s been warm, and one 100°F day has occurred so far this year, but it feels almost pleasant in comparison to last summer, when the heat was relentless for months on end.

Three days of excessive heat is difficult to experience.  Three months is currently unimaginable.  How much worse future heat waves get is mostly within our control.  The sooner we significantly cut greenhouse gas emissions, the better things will end up for all of us.  But as the above graph demonstrates, the future could be quite hot if we continue along our current emissions pathway too much longer.



Car mileage evaluation criticism

The Denver Post Editorial Board took a stance on car mileage based on a recent Consumer Reports (CR) article.  Let me state at the outset that I regularly use Consumer Reports rankings as part of my purchase decision-making process.  That said, no testing regime is ever 100% complete, and results are even less often communicated well to non-experts.  At issue: CR performed independent tests on cars and calculated different miles-per-gallon values than those the EPA provided.  Should the EPA update its testing?  Perhaps, but the Editorial Board and CR didn’t provide an overwhelming case to do so.  Let’s look at what each entity said.

First, the Post:

Consumers could very well feel deceived by the numbers, but there are other issues at work.

Hybrids with just a single occupant can zip past traffic using high-occupancy-vehicle lanes in some parts of the country — including Colorado — because of their superior efficiency. The idea is to support, through public policy, efficient vehicles that generate less harmful emissions. But if they’re really not substantially more efficient, it’s neither environmentally beneficial nor fair to drivers of traditional vehicles that may, in reality, get similar gas mileage.

There are key parts to this section that I want to highlight.  What does “substantially more efficient” mean?  How much more efficient must a vehicle be to warrant public policy support?  In the Denver area, suburban drivers love their SUVs and trucks.  So to start, I’ll compare an SUV, a truck, a sports car, and a 2010 Prius.  And I’ll start with the location the Board identified as a policy recipient: the highway.  The EPA’s ratings for these four vehicles are: 15, 18, 26, and 48mpg (highway).  Should Prius drivers receive policy support for driving a vehicle that averages 2-3X as many mpg as the majority of other vehicles?

Let’s change the argument a little to better match the spirit of the article and consider the “combined” fuel efficiency.  This designation accounts for both street driving and highway driving.  The same four vehicles have the following combined ratings: 13, 14, 21, and 50mpg.  The complaint that CR and the Board have is that the hybrid value is too high (based on their own testing protocols, which aren’t detailed very well).  So let’s replace the hybrid’s EPA combined value with CR’s “overall” value: 44mpg.  Should we direct public policy toward cars that get 2-3X as many mpg as the majority of other vehicles?  Note that the comparison, and thus my argument, didn’t change.
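A small sketch of these comparisons (the vehicle labels are my shorthand; only the mpg figures come from the discussion above) shows the Prius’s multiple stays in the same ballpark whichever hybrid value we use:

```python
# Combined fuel-economy ratings quoted above for the non-hybrid vehicles,
# compared against the Prius's EPA combined value (50) and CR's "overall" value (44).
combined_mpg = {"SUV": 13, "truck": 14, "sports car": 21}
prius_epa, prius_cr = 50, 44

for label, mpg in combined_mpg.items():
    print(f"Prius vs {label}: {prius_epa / mpg:.1f}x (EPA), {prius_cr / mpg:.1f}x (CR)")
```

Either way, the hybrid delivers multiple times the mileage of the SUV and truck and roughly double that of the sports car, so the policy question is unchanged by the 6mpg dispute.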

The argument that CR made is that their hybrid vehicle test values differed from EPA values by more than 3mpg on average.  Yes, the Prius’s 6mpg gap is a difference.  Are consumers being tricked?  I don’t think so, and this is really where I split with CR.  Here is their take:

Overall, fuel-efficiency shortfalls have narrowed considerably over the years. When Consumer Reports conducted a similar study in 2005 that compared our gas-mileage results with the EPA estimates, we found that most cars got significantly fewer mpg than their window stickers promised. Conventional gas-powered vehicles missed their EPA estimates by an average of 9 percent, and hybrids by 18 percent.

So it isn’t as if CR is bashing hybrids; far from it.  All of the EPA sticker values differed from CR’s.  That isn’t surprising since CR tested vehicles differently.  One vital question: which test best mimics real-world driving?  Most people drive very aggressively (hard acceleration and braking), which has a big impact on gas mileage.  Are the CR values too high?  What if we consider additional factors: heat and cold, wind and rain.  Those happen in the real world.  But not in the CR tests.  Isn’t it interesting that a consumer advocacy group is challenging the EPA’s tests for being unrealistic when they themselves didn’t test vehicles in real-world conditions?  How far off are CR’s values?  We don’t know.  Should we reengineer public policy in the face of CR’s unrealistic tests?  For what purpose?  Should the EPA and CR develop a more rigorous testing protocol – perhaps one that both entities can perform and therefore directly compare against one another?

I have another problem with the CR quote.  Note the word “promised”.  The EPA doesn’t promise drivers that they’ll attain the sticker mileage.  In fact, the EPA goes out of its way to emphasize and explain that its values are merely estimates.  That’s not a promise, not even close.  It’s annoying that people read too much into things.  In fact, CR could do what I did: check the EPA website for common misconceptions.  The ninth most common:

9. Fuel Economy Label: “The EPA fuel economy estimates are a government guarantee on what fuel economy each vehicle will deliver.”
The primary purpose of EPA fuel economy estimates is to provide consumers with a uniform, unbiased way of comparing the relative efficiency of vehicles. Even though the EPA’s test procedures are designed to reflect real-world driving conditions, no single test can accurately model all driving styles and environments. Differing fuel blends will also affect fuel economy. The use of gasoline with 10% ethanol can decrease fuel economy by about 3% due to its lower energy density.

Like I said, the EPA goes out of its way to explain what its published values are.  CR and the Board should have done 60 seconds of checking before they called out the EPA for deceiving consumers with “promises”.

A more appropriate target is auto manufacturers, who know what the tests are and try their best to optimize the results.  An entity with a financial interest in optimizing estimated mileage should not be the one testing vehicles’ mileage.  As I wrote above, this is where CR and the EPA should work together to independently test vehicles.

But as far as the basic argument goes, hybrids do get better overall mileage than other vehicles.  They get their best mileage where manufacturers designed them to operate: in stop-and-go city traffic.  But they still drastically outperform their competition in highway driving, enough so that I think current public policies provide nearly the correct incentive for drivers to weigh another dimension when choosing which vehicle to purchase.



44.9% of the Contiguous United States in Moderate or Worse Drought – 9 Jul 2013

According to the Drought Monitor, drought conditions improved recently across some of the US.  As of Jul. 9, 2013, 44.9% of the contiguous US is experiencing moderate or worse drought (D1-D4), as the early-2010s drought continues month after month.  That is the lowest percentage in a number of months.  The percentage area experiencing extreme to exceptional drought increased from 14.6% to 14.8%, but this is ~4% lower than it was six months ago.  The eastern third of the US was wetter than normal during June, which helped keep drought at bay; the east coast in particular was much wetter than normal.  Instead of the Exceptional drought in Georgia and the Extreme drought in Florida of two years ago, there is flash flooding and rare dam water releases in the southeast.  Eight eastern states experienced their top-three wettest Junes on record.  The West is quite a different story.  Long-term drought continues to exert its hold over the region, which remains warmer and drier than normal month after month.


Figure 1 – US Drought Monitor map of drought conditions as of July 9th.

If we focus in on the West, we can see recent shifts in drought categories:


Figure 2 – US Drought Monitor map of drought conditions in Western US as of July 9th.

Early-year snowmelt relief was short-lived, as drought conditions expanded and worsened in the past three months.  More than three-fourths of the West is in Moderate drought.  More than half of the West is now in Severe drought.  And one-fifth of the West is in Extreme drought.

Temporary drought relief might occur in New Mexico and southern Colorado due to the recent heavy rains brought by a retrograding low pressure system that also brought cooler than normal temperatures to Oklahoma and Texas.

Here are the conditions for Colorado:


Figure 3 – US Drought Monitor map of drought conditions in Colorado as of July 9th.

There is some evidence of relief over the past six months here.  Severe drought area dropped from 95-100% to 83%.  Extreme drought area dropped significantly, from 53% to 39%.  Exceptional drought shifted spatially from central Colorado to southeastern Colorado, which left the percentage area near 17%.  The good news for southeastern Colorado is the recent delivery of substantial precipitation.  It isn’t likely to alleviate the long-term drought, but it will hopefully dent the short-term drought.

US drought conditions are more influenced by Pacific and Atlantic sea surface temperature conditions than the global warming observed to date.  Different natural oscillation phases preferentially condition environments for drought.  Droughts in the West tend to occur during the cool phases of the Interdecadal Pacific Oscillation and the El Niño-Southern Oscillation, for instance.  Beyond that, drought controls remain a significant unknown.  Population growth in the West in the 21st century means scientists and policymakers need to better understand what conditions are likeliest to generate multidecadal droughts, as have occurred in the past.

As drought affects regions differently, our policy responses vary.  A growing number of water utilities recognize the need for a proactive mindset with respect to drought impacts.  The last thing they want is for their reliability to suffer.  Americans are privileged in that clean, fresh water flows every time they turn on the tap.  Crops continue to show up at local stores despite terrible conditions in many areas of the nation (albeit at higher prices, as seen this year).  Power utilities continue to provide hydroelectric-generated energy.

That last point will change in a warming and drying future.  Regulations exist that limit the temperature of water discharged by power plants.  A generally warmer climate means warmer river and lake water today than existed 30 years ago.  Warmer water going into a plant means either warmer water coming out or a longer time spent in the plant, which reduces the amount of energy the plant can produce.  Alternatively, we can continue to generate the same amount of power if we are willing to sacrifice ecosystems that depend on a very narrow range of water temperatures.  As with other facets of climate change, technological innovation can help increase plant efficiency.  I think innovation remains our best hope to minimize the number and magnitude of climate change impacts on human and ecological systems.



June 2013 CO2 Concentrations: 398.58 ppm

During June 2013, the Scripps Institution of Oceanography measured an average CO2 concentration of 398.58 ppm at their Mauna Loa, Hawai’i Observatory.

This value is important because 398.58 ppm is the largest CO2 concentration value for any June in recorded history.  This year’s June value is 2.81 ppm higher than June 2012’s!  Year-over-year differences for a given month typically range between 1 and 2 ppm, so this jump is clearly well outside that range.  It is more in line with February’s year-over-year change of 3.37 ppm and May’s change of 3.02 ppm.  Of course, the unending trend toward higher concentrations with time, no matter the month or specific year-over-year value, as seen in the graphs below, is more significant.
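A quick sketch comparing the 2013 year-over-year jumps quoted above with that typical range:

```python
# 2013 year-over-year CO2 jumps cited in this post (ppm), compared with the
# typical 1-2 ppm year-over-year range for any given calendar month.
yoy_jumps_ppm = {"February": 3.37, "May": 3.02, "June": 2.81}
typical_lo, typical_hi = 1.0, 2.0

for month, jump in yoy_jumps_ppm.items():
    status = "above" if jump > typical_hi else "within"
    print(f"{month}: +{jump:.2f} ppm ({status} the typical range)")
```

All three months sit well above the typical range, which is the point: the anomaly is not one month’s blip but a pattern across the year.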

The yearly maximum monthly value normally occurs during May.  This year was no different: the 399.89 ppm concentration in May 2013 was the highest value reported this year and the highest in recorded history (neglecting proxy data).  I expected May of this year to produce another all-time record value, and it clearly did.  May 2013’s value will hold onto first place all-time until February 2014.


Figure 1 – Time series of CO2 concentrations measured at Scripps’ Mauna Loa Observatory in June of each year from 1958 through 2013.

CO2Now.org added the “350s” and “400s” labels to the past few months’ graphics.  I suppose they’re meant to show that concentrations shattered 350 ppm back in the 1980s and are pushing up against 400 ppm now in the 2010s.  I’m not sure they add much value to this graph, but perhaps they shape people’s perception of milestones within the trend.

How do concentration measurements change in calendar years?  The following two graphs demonstrate this.


Figure 2 – Monthly CO2 concentration values (red) from 2009 through 2013 (NOAA). Monthly CO2 concentration values with seasonal cycle removed (black). Note the yearly minimum observation occurred eight months ago and the yearly maximum value occurred last month. CO2 concentrations will decrease through October 2013, as they do every year after May.


Figure 3 – 50 year time series of CO2 concentrations at Mauna Loa Observatory (NOAA).  The red curve represents the seasonal cycle based on monthly average values.  The black curve represents the data with the seasonal cycle removed to show the long-term trend.  This graph shows the recent and ongoing increase in CO2 concentrations.  Remember that as a greenhouse gas, CO2 increases the radiative forcing of the Earth, which increases the amount of energy in our climate system.

CO2 concentrations are increasing at an increasing rate – not a good trend with respect to minimizing future warming.  Natural systems are not equipped to remove CO2 emissions quickly from the atmosphere.  Indeed, natural systems will take tens of thousands of years to remove the CO2 we emitted in the course of a couple short centuries.  Human systems do not yet exist that remove CO2 from any medium (air or water).  They are not likely to exist for some time.  So NOAA will extend the right side of the above graphs for years and decades to come.

This month, I want to spend some more time on this focus: CO2 concentration values.  Given our species’ penchant for round numbers, it came as little surprise that the corporate media placed an uncommon amount of attention on a value that carries relatively little meaning: daily CO2 concentrations at Mauna Loa surpassed 400 ppm for a day during the month of May.  In fact, both the media and many climate activists made a very big deal about this development (news articles and my Twitter feed “blew up” with this news).  I think that was largely a waste of time.  Again, the daily value itself didn’t represent any large difference once reached.  The climate system did not automatically kick into a different setting once concentrations passed 400 ppm for a day.  Nothing substantially new occurred that didn’t occur when concentrations were “only” 399 ppm (or 390 ppm or 380 ppm, for that matter).  Indeed, where are the articles this month, with daily values in the 398-399 ppm range?  They’re nonexistent.  So what is important: psychologically significant thresholds or the unending acceleration of concentrations across years and decades?

As I state in this series every month, the trend makes much more of a difference than any daily, monthly, or even yearly average value.  And that trend is accelerating upwards at a rate that many experts didn’t think possible even 10 years ago.  The effects of last year’s average CO2 concentrations won’t manifest in real-world terms until 30-50 years from now.  I didn’t see anybody else in May pointing out that important detail.  Similarly, I didn’t see any explanation that today’s mean temperatures are largely a result of CO2 concentrations from 30+ years ago.  Perhaps most importantly, climate activists didn’t mention that CO2 concentrations are rising at a rising rate despite decades of their “activism”.  That fact creates a rather uncomfortable situation because most activists are proponents of doing tomorrow what they did yesterday.  If those actions haven’t had any effect up until now, why the advocacy for the status quo when those same activists claim that the status quo is untenable?  If they really believed their catastrophic climate change claims, shouldn’t they honestly evaluate the effects their actions have had?  And if those actions produced far less meaningful progress than they state is absolutely required for the survival of our species and the planet (grandiose language, I know), why do their strategies and tactics remain largely unchanged?

I write these posts for people who are curious or interested in the state of a key climate variable.  Almost two years ago now, I realized that doomsday language turns a significant portion of my potential audience off from the get-go.  If we are to do something meaningful about climate change, we cannot afford the disengagement and hostility of one-third or more of our fellow global citizens towards climate activism.  I don’t want to simply treat people as empty vessels into which I can pour knowledge.  I want to engage them on ground that is similar between us precisely because I want to do something.  Screaming about a 400 ppm mean CO2 concentration for one day and then walking away from the variable until we pass the next perceived meaningful threshold doesn’t strike me as engagement.

The rise in CO2 concentrations will slow down, stop, and reverse when we decide it will.  It depends primarily on the rate at which we emit CO2 into the atmosphere.  We can choose 400 ppm or 450 ppm or almost any other target (realistically, 350 ppm seems out of reach within the next couple hundred years).  That choice is dependent on the type of policies we decide to implement.  It is our current policy to burn fossil fuels because we think doing so is cheap, although current practices are massively inefficient and done without proper market signals.  We will widely deploy clean sources of energy when they are cheap; we control that timing.  We will remove CO2 from the atmosphere if we have cheap and effective technologies and mechanisms to do so, which we also control to some degree.  These future trends depend on today’s innovation and investment in research, development, and deployment.  Today’s carbon markets are not the correct mechanism, as they are aptly demonstrating.  But the bottom line remains: We will limit future warming and climate effects when we choose to do so.