The recently released National Climate Assessment (NCA) from the U.S. government offers considerable cause for concern about climate calamity, but it downplays the decelerating trend in global surface temperature during the 2000s, which I document here.
Many climate scientists are currently working to figure out what is causing the slowdown, because if it continues, it would call into question the legitimacy of many climate model projections (and, conversely, offer some good news for our planet).
An article in Nature earlier this year discusses some of the possible causes for what some have referred to as the global warming “pause” or “hiatus”. Explanations include the quietest solar cycle in over a hundred years, increases in Asian pollution, more effective oceanic heat absorption, and even volcanic activity. Indeed, a peer-reviewed paper published in February estimates that about 15 percent of the pause can be attributed to increased volcanism. But some have questioned whether the pause or deceleration is even occurring at all.
Verifying the pause
You can see the pause (or deceleration in warming) yourself by simply grabbing the freely available data from NASA and NOAA. For the chart below, I took the annual global temperature difference from average (or anomaly) and calculated the change from the prior year. So the very first data point is the change from 2000 to 2001, and so on. One validation check is that the trends agree across both datasets. Both of these government sources show a slight downward slope since 2000:
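The chart's arithmetic is simple enough to reproduce yourself. Here is a minimal Python sketch of the same computation; the anomaly values below are invented for illustration (the real series comes from the NASA and NOAA downloads), so only the method, not the numbers, reflects the chart.

```python
# Sketch of the chart's computation, using made-up annual anomaly values.
# The real inputs are the NASA and NOAA global anomaly series, in degrees C
# above each dataset's baseline period.
anomalies = {
    2000: 0.39, 2001: 0.54, 2002: 0.63, 2003: 0.62, 2004: 0.54,
    2005: 0.68, 2006: 0.64, 2007: 0.67, 2008: 0.55, 2009: 0.66,
    2010: 0.72, 2011: 0.61, 2012: 0.65, 2013: 0.68,
}  # illustrative numbers only

years = sorted(anomalies)
# Year-over-year change: the first data point is 2000 -> 2001, and so on.
changes = [(y, anomalies[y] - anomalies[y - 1]) for y in years[1:]]

# Ordinary least-squares slope of those changes, to get the trend direction.
n = len(changes)
xs = [y for y, _ in changes]
ys = [c for _, c in changes]
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(f"{n} year-over-year changes, trend slope = {slope:+.4f}")
```

Even with these toy values the slope comes out slightly negative, which is the shape of the trend line described above; plugging in the actual NASA or NOAA series is a one-line substitution.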
You can see some of the spikes associated with El Niño events (when heat was released into the atmosphere from warmer than normal ocean temperatures in the tropical Pacific) that occurred in 2004-05 and 2009-10. But the warm changes have generally been decreasing while cool changes have grown.
To be sure, both sets of data points show an overall rise in temperature of +0.01C during the 2000s. But if current trends continue for just a few more years, the mean change for the 2000s will shift to negative; in other words, the warming would effectively stop. The current +0.01C increase in temperatures is insufficient to verify the climate change projections for major warming (even the low-end +1-2C) by mid-to-late century. A peer-reviewed study published in Nature Climate Change in 2013 drew the same conclusion: “Recent observed global warming is significantly less than that simulated by climate models,” it says.
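The "few more years" remark is easy to sanity-check with back-of-envelope arithmetic. The sketch below extends the series of year-over-year changes along a fitted trend line and counts how long until the running mean goes negative; every input value here is assumed for illustration, not taken from the actual fit.

```python
# Rough extrapolation: how many more years of the same downward trend until
# the mean year-over-year change turns negative? All inputs are assumed
# illustrative values, not the real fitted numbers.
mean_change = 0.01   # assumed mean year-over-year change so far, C/yr
n_changes = 13       # changes from 2000->2001 through 2012->2013
last_change = 0.03   # assumed most recent year-over-year change, C
trend = -0.02        # assumed slope of the trend line in the changes

total = mean_change * n_changes
years = 0
while total / n_changes > 0:   # mean change for the period still positive
    last_change += trend       # extend the series along the trend line
    total += last_change
    n_changes += 1
    years += 1
print(f"mean change turns negative after {years} more years (toy numbers)")
```

With these toy inputs the flip happens within a handful of years; the point is only that a modest sustained downward trend is enough to drag the period's mean below zero.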
Whenever this surprising result (that warming has slowed) is pointed out, it raises some objections. Here are a few (feel free to add your own in the comments section!):
“You are cherry-picking your start and end times.”
This is a common argument whenever any data are shown. The recently released National Climate Assessment used 1901 to 1960 as its definition of “normal” weather in a number of its benchmark analyses. Other reports use the mean over the entire century, while still others use the National Weather Service's conventional 1981-2010 climatology. All of this is cherry-picking one way or another. The key here is to see whether the data are behaving as they should.
For the chart I show above, I could easily have chosen the very warm 1998 as my starting point to amplify my trend line, but instead I simply chose the 2000s. There is, however, one point everyone should agree on: I am plotting temperature over the period with the highest global atmospheric CO2 concentration. Therefore, whatever you believe the climate sensitivity to be, the warming signal should be strongest in these recent years versus any others.
“The last decade was still the warmest of all time.”
This is true per the datasets I am using (NASA and NOAA), so no dispute there. However, for the climate change projections to verify, we need to keep breaking records more often than not. In the NASA dataset, 2013 broke only one monthly record (2012 only tied one), meaning that most of the time we are not moving upward. Without breaking new warm records, we continue to flatline and, each year, fall further and further behind projections.
“Your sample size is too small.”
Critics deem my thirteen data points from the 2000s insufficient to make any case at all. I could have expanded back to 1998 to raise the count to fifteen, and I readily admit that more data are better in these situations. The question then becomes: what sample size would you need before you started to worry that the climate models might be running too warm? The trend line for either dataset suggests the mean change could shift negative within just the next few years. Would that be sufficient?
Every person, and every scientist, may have a different definition here. I will say that the global annual temperature is not just one figure but an aggregation of thousands of measurements, a very large sample size in itself! The deceleration in warming, if it were to continue, is inconsistent with climate model projections. You can agree with that statement while adding the usual caveat that we need more data; I'm fine with that.
“The data are not accurate.”
This has become my new favorite, because for years and years, key figures in the climate change research community have used these very data to support the view that warming is occurring at an alarming pace. Now we hear from some scientists that these data are “masking” reality: that the real global warming is buried in the deep oceans, in areas that are difficult to measure.