You have to be very careful with averages; they are not as simple as you might think. That thought was uppermost in my mind when I was reading a recent paper in Nature Climate Change. It had been written up by the Press Association (PA) and repeated by the Guardian; I guess its multitude of environmental reporters, editors and heads had the day off.
The PA said we need to brace ourselves for accelerating climate change. It added that “new evidence suggests the rate at which temperatures are rising in the northern hemisphere could be 0.25°C by 2020 – a level not seen for at least 1,000 years.” Given that the annual average global surface temperature hasn’t risen at all for about 18 years, and that 2020 is only five years away, great changes must be coming. Or are they?
The PA continued: “The analysis, based on a combination of data from more than two dozen climate simulation models from around the world, looked at the rate of change in 40-year long time spans.”
Lead scientist Dr Steve Smith, from the US Department of Energy’s Pacific Northwest National Laboratory, said: “We focused on changes over 40-year periods, which is similar to the lifetime of houses and human-built infrastructure such as buildings and roads.”
Firstly, why 40 years? It’s said to be “similar” to the lifetime of houses. So why not 30, or 50? I hope the analysis doesn’t confine itself to just 40 years; that would be too much like cherry-picking.
Alas, it does. And there is the problem. Unless one performs the analysis over a range of averaging periods, it is almost useless. Also, when does one start? By calendar decades? What does nature know of decades, or of the arbitrary start points of ours?
Warning bells must also ring for anyone averaging real-world temperature data, with its well-known decadal changes (rising 1910–40, no change 1940–80, rising from 1980 to the late 90s, no change since), coupled with the IPCC view that human-induced climate changes only really became manifest after the 1950s.
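This sensitivity to window length and start year is easy to demonstrate. Below is a minimal Python sketch on purely synthetic data that crudely mimics the decadal structure just described (it is not real HadCRUT4 data, and the function name `decadal_trend` and the warming rates are my own illustrative choices):

```python
import numpy as np

def decadal_trend(years, temps, start, window):
    """Least-squares slope, in degrees C per decade, over [start, start + window)."""
    mask = (years >= start) & (years < start + window)
    slope_per_year = np.polyfit(years[mask], temps[mask], 1)[0]
    return slope_per_year * 10.0

# Synthetic series: flat to 1910, warming 1910-40, flat 1940-80,
# warming 1980-2000, flat afterwards (illustrative rates only).
years = np.arange(1900, 2015)
temps = np.zeros_like(years, dtype=float)
temps += 0.015 * np.clip(years - 1910, 0, 30)   # 1910-40 warming
temps += 0.020 * np.clip(years - 1980, 0, 20)   # 1980-2000 warming

# The same data yield quite different "rates of change" depending
# on where the window starts and how long it is:
for window in (20, 30, 40, 50):
    print(window, round(decadal_trend(years, temps, 1960, window), 3))
```

A 20-year window starting in 1960 sits entirely on a flat stretch and reports a trend of zero, while widening the same window to 40 years sweeps in the post-1980 rise and reports substantial warming; nothing about the underlying data has changed, only the averaging choice.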
Smith et al. use their 40-year averaging on real-world data and about two dozen climate models. They conclude that over the 900 years before the 20th century the 40-year average “rarely” exceeds 0.1°C per decade. I suppose the key word there is “rarely”!
In the paper the researchers say that sub-century rates of global surface temperature change have rarely been examined. I don’t think that is the case. They say that when averaging the temperature data over a 20-year period it is not easy to distinguish natural variability. I would argue that doubling the averaging period does not help that much either, particularly when one considers what the IPCC says about the onset of human influence on climate. Looking at their 40-year data from climate models and HadCRUT4, is there really a trend, and is it anthropogenic?
Also, looking at their Fig. 3, the rate of temperature change per decade from CMIP5 simulations is (globally) more than 0.2°C greater than that from observational data.
The result found by Smith et al. (2015) must not be taken in isolation and, in my view, should not have been published in isolation from a broader analysis. It over-emphasises the steep rise in global temperature seen in the 90s and distorts what has happened to global surface temperatures over the past 20 years. As I said, you have to be careful with averages; ask any bank manager.