The Committee on Climate Change has given its view on the much-discussed recent article on global warming predictions in the Mail on Sunday, written by David Rose. The article points out the disparity between model simulations of global warming and real data, suggesting that using models to formulate policy in such a situation might be unwise.
The Committee on Climate Change is an independent, statutory body established under the Climate Change Act 2008. Its role is to advise the UK Government and Devolved Administrations on emissions targets, and report to Parliament on progress made in reducing greenhouse gas emissions and preparing for climate change. It says it conducts independent analysis into climate change science, economics and policy.
In their criticism of the Mail’s article, Professor Sir Brian Hoskins, writing on behalf of the committee with assistance from its science adviser Dr Steve Smith, states that “like all scientists we take a skeptical stance, testing each assertion against the evidence.” If only they did. Throughout their riposte they emphasise models over real world data, and cite only the research that supports their position.
Hoskins and Smith say that three climate scientists have already taken issue with their quotes as used in the Mail on Sunday article. Each was given just a single quote. Piers Forster was quoted as saying that high estimates of climate sensitivity are unlikely, something he repeated on the Bishop Hill blog. Myles Allen was quoted as saying that future warming is likely to be significantly lower than once thought, something he repeated in the Guardian a few days later. James Annan claimed he did not say that climate sensitivity is likely to be half what many consider it to be, although he did tell the New York Times that values of climate sensitivity under 2 degrees are looking a lot more plausible than values above 4.5 degrees.
Hoskins and Smith say that the lack of warming in the last 15 years of global annual average temperature (land and ocean) is unimportant. They claim the temperature trend is rising once short-term factors such as El Nino, aerosols and solar effects are filtered out, and they quote one paper as evidence. In my recent report I also quote that paper as evidence, but I do not ignore the many others that take a very different view. What is more, their claim about the effect of aerosols on recent temperature is at odds with recent research suggesting that aerosols provide less cooling than some originally thought.
Hoskins and Smith refer to the climatewatch website in an attempt to show that some climate indicators, e.g. ocean heat, sea level, sea ice cover and mountain glaciers, indicate that the Earth is continuing to warm. Some of this data does not support their case. For instance, the ocean heat graph on that website shows no increase for a decade, and sea level has not changed its rate of rise for a century, with even some recent indications of slowing down.
Hoskins and Smith misuse statistics. Regarding the graph used in the Mail on Sunday, they say that over a 60-year period one would expect six data points to fall outside the 90% probability band. They suggest this would typically mean three data points above the band and three below, though with such small numbers the split would often be uneven. But that is not what the statistics mean in this case.
There is statistical uncertainty in the individual data points, but also between model runs, and the two are not the same thing. Looking at the real data in the graph shows seven consecutive data points outside the 50% probability zone. Even allowing for the fact that the data points are not independent but autocorrelated, the probability of this being due to chance is tiny. What the graph does show is that the real world data is reproduced in fewer than five percent of the computer model simulations. What is more, once the real data have taken a certain course, it is irrelevant what course is taken by the model forecasts. That is the essence of the Mail on Sunday story: that the model simulations are wrong at least 95% of the time. Given their spread, and the fact that individual model runs with tweaked input parameters are not statistically independent entities, it is quite possible that the model runs that do fit the real world data do so only by chance. This is quite enough to cast doubt on the validity of the climate models, and to question their ability to forecast the climate.
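The arithmetic behind these two claims can be sketched in a few lines. This is a minimal back-of-the-envelope illustration, assuming for simplicity that the annual data points are statistically independent (as noted above, they are autocorrelated, which weakens but does not overturn the conclusion):

```python
# Back-of-the-envelope check of the two statistical claims,
# under the simplifying assumption of independent annual data points.

n_years = 60
p_outside_90 = 0.10                    # chance one point lies outside the 90% band
expected_outliers = n_years * p_outside_90
# ~6 points expected outside the band over 60 years, as the committee says

p_outside_50 = 0.50                    # chance one point lies outside the 50% zone
p_run_of_7 = p_outside_50 ** 7         # seven consecutive points outside it
# 0.5**7 = 0.0078125, i.e. well under 1% if the points were independent;
# autocorrelation raises this somewhat, but it remains small

print(expected_outliers, p_run_of_7)
```

The point of the second calculation is that a run of seven consecutive outliers is a far rarer event than seven outliers scattered over 60 years, which is why quoting the expected count of six misrepresents what the graph shows.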
If this kind of data came from a drugs trial, the trial would have been stopped long ago, even allowing for the little-understood stopping-bias effect that occurs when looking for the first signs of effectiveness or harm in such trials.
The Mail on Sunday graph was based on comparing 25 climate model simulations with real world data. Personally, I find it more illuminating not to use probability bands on such graphs but instead to plot the trajectories of individual simulations. This is what is done in this graph for three sets of real world data and 38 climate simulations. Only two of the 38 models come anywhere near the real world data, and even then not particularly well.
I am disappointed in the Committee on Climate Change’s analysis of the Mail on Sunday article. One expects a more objective and sober assessment of empirical facts and research.