
Climate Sensitivity, Proxy Data And Statistics

In the debate over climate change one of the most misunderstood and misused terms is sensitivity. Climate sensitivity is usually defined as the change in global mean surface temperature following a doubling of atmospheric CO2 once equilibrium is reached. The concept seems simple but there is a catch: the definition of ‘equilibrium’, which depends on the timescale employed.
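To make the definition concrete, the standard logarithmic approximation for CO2 radiative forcing, ΔF = 5.35 ln(C/C0) W/m², can be combined with an assumed feedback parameter to turn a CO2 doubling into an equilibrium temperature change. Here is a minimal sketch in Python; the feedback value of 0.8 K per W/m² is purely illustrative, not a measured constant:

```python
import math

# Standard logarithmic approximation for CO2 radiative forcing:
# dF = 5.35 * ln(C / C0), in W/m^2, with C0 the preindustrial level.
def co2_forcing(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

# Equilibrium warming dT = lambda_sens * dF, where lambda_sens is an
# ASSUMED feedback parameter in K per (W/m^2), chosen for illustration.
def equilibrium_warming(c_ppm, lambda_sens=0.8, c0_ppm=280.0):
    return lambda_sens * co2_forcing(c_ppm, c0_ppm)

# A doubling (280 -> 560 ppm) gives dF = 5.35 * ln(2), about 3.7 W/m^2.
print(round(co2_forcing(560.0), 2))
print(round(equilibrium_warming(560.0), 2))
```

With these illustrative numbers the "sensitivity" comes out near 3 K per doubling, but the catch described above remains: the number only means something once you say which feedbacks, on which timescales, are folded into the feedback parameter.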

As it turns out, the timescales that nature uses, which can encompass thousands and even millions of years, cannot be compared with the century-long timescales used in climate models. A recent online article, published by Nature Geoscience, states that accurate prediction of Earth’s future warming hinges on our understanding of climate sensitivity. Moreover, only by studying climate change in the past (the paleoclimate), identifying all the factors involved and how they interacted, can our understanding of climate sensitivity be improved.

The reason people are interested in climate sensitivity is that it is a supposedly simple way of telling how fast the world will heat up because of human CO2 emissions. There are two basic ways of estimating climate sensitivity: using numerical climate models and studying climate change in the past. In “Where are you heading Earth?” Richard E. Zeebe, from the Department of Oceanography, University of Hawaii at Manoa, compares the two approaches and concludes that only by reconstructing Earth’s climate history can we accurately predict Earth’s future warming. Unfortunately, different climate feedbacks operate on different timescales, greatly complicating the analysis. The author provides the following example:

For example, continental ice sheets respond slowly to changes in radiative forcing and their feedback on temperature may be ignored in the model-derived equilibrium climate sensitivity on a centennial timescale. However, the very same feedback is naturally part of the equilibrium climate sensitivity derived from palaeoclimate records in cases where the temporal data coverage extends beyond the characteristic response time of ice sheets. In this example, the climate sensitivities derived from models and from palaeodata are obviously not the same and comparing the two is like comparing apples and oranges.
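The quoted distinction between fast and slow feedbacks can be illustrated with a toy linear response model: two exponential terms, one fast (surface ocean, decades) and one slow (ice sheets, millennia). A centennial window captures essentially all of the fast response but almost none of the slow one. All amplitudes and timescales below are illustrative assumptions, not fitted values:

```python
import math

# Toy two-timescale response to a constant forcing step: a fast
# component (tau ~ 10 yr, e.g. surface ocean) plus a slow component
# (tau ~ 3000 yr, e.g. ice sheets). Amplitudes/timescales are made up
# for illustration, not derived from any model or dataset.
def warming_at(t_years, fast_amp=2.0, fast_tau=10.0,
               slow_amp=1.5, slow_tau=3000.0):
    fast = fast_amp * (1.0 - math.exp(-t_years / fast_tau))
    slow = slow_amp * (1.0 - math.exp(-t_years / slow_tau))
    return fast + slow

century = warming_at(100.0)    # what a centennial model "sees"
equilibrium = warming_at(1e7)  # what a long palaeorecord "sees"
print(round(century, 2), round(equilibrium, 2))
```

At 100 years the toy system shows about 2 K of warming, while its true equilibrium is 3.5 K: the slow ice-sheet term is nearly invisible on the model timescale, which is exactly the apples-and-oranges problem Zeebe describes.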

If models and paleodata are like apples and oranges, which source should we trust? “Unfortunately, model-derived climate sensitivities are subject to large uncertainties,” Zeebe states. “This is not because climate models are flawed but simply because the climate system is complex and accurate predictions are inherently difficult.” While I agree that getting accurate predictions from a complex nonlinear system like a climate model is difficult, I also believe the models themselves to be fundamentally flawed. This is demonstrated every time a new missing factor is discovered or an existing mechanism is found to behave differently than previously assumed. Real-world observations often show model assumptions to be false, and that qualifies as flawed in my estimation.

Where does the author come down on the model vs data question? “Studying past climates to estimate climate sensitivity inarguably has one great advantage over theoretical computer models: it is based on actual data,” he states. Indeed, the entire point of the article is to stress the importance of enhancing our knowledge of paleoclimate conditions. The problem is there’s a dearth of proxy data.

The best data we have from prehistoric times are arguably the ice cores from Antarctica and Greenland. But those only go back about 800,000 years, possibly less if recently discovered basal freezing is altering the deepest, and hence oldest, portions of the ice. For older periods the proxy data come from ocean floor sediments. “In fact, most of what we know today about the climate of the past few hundred million years is based on deep-sea archives,” states the author, who then proceeds to decry recent cutbacks in ocean core funding (he is, after all, an oceanographer).

Section of ice core coming out of drill. Credit: Kendrick Taylor, WAIS Divide.

Zeebe goes on to describe measurements taken around the time of the PETM (the Paleocene–Eocene Thermal Maximum)—always a favorite among climate change researchers—pointing out that he and his colleagues “recently estimated the size of the PETM carbon input based on sediment records of deep-sea carbonate dissolution and showed that the subsequent rise in atmospheric CO2 alone was insufficient to explain the full amplitude of global warming.” The point here is not their conclusions or the comments of others regarding the new carbon release estimates, but rather the lack of comprehensive and reliable proxy data for paleoclimate research.

Ideal, of course, would be reconstructions of changes in past atmospheric CO2 concentrations based on direct proxy records to constrain carbon input and climate sensitivity — not only during the PETM but also during other climate episodes of the past. Although progress has recently been made to improve existing proxies for past atmospheric CO2 concentrations and seawater carbonate chemistry parameters, the uncertainties are still significant, particularly in the more distant past. At present, it seems that key to improving the accuracy of palaeoclimate-sensitivity estimates is to both refine existing pCO2 proxies and encourage creative minds to develop new pCO2 proxies.

To that end, the author calls for the establishment of a monetary prize to promote the development of better paleodata: “I suggest establishing a prize in climate science, sponsors willing, for anyone who can find a reliable and accurate proxy for past atmospheric CO2 concentrations that works over timescales from millennia to hundreds of millions of years.” Perhaps Dr. Zeebe has his entry ready to go if such a prize is established (just kidding).

Zeebe is not alone in calling for better resolution proxy data to provide historical insights into past climate change. In a companion article, “Convergent Cenozoic CO2 history,” David J. Beerling and Dana L. Royer claim that it is time for systematic testing of proxies, “against measurements and against each other.” The Cenozoic era, the past 65 million years of Earth’s history, experienced a wide range of large climate variations. These include the transition from an ice-free planet to the onset of the Pleistocene glacial–interglacial cycles, with a number of sudden climate shifts in between.

“A decade ago, efforts to reconstruct atmospheric CO2 levels during this era showed fundamental disagreements between different proxy indicators of atmospheric CO2 concentrations,” they state. “This was especially true for the first half of the Cenozoic, with discrepancies between proxies spanning a range from less than 300 ppm to more than 3,000 ppm.” Following recent revisions, atmospheric CO2 reconstructed from terrestrial and marine proxies is shown in the figure below.

While Zeebe is calling for developing new proxies, Beerling and Royer are proposing that existing proxies be refined and recalibrated. But developing proxies of atmospheric CO2 levels is a formidable interdisciplinary scientific challenge. Beerling and Royer describe the process:

The process begins with identifying a clear response in a biological or geochemical system to changes in atmospheric or oceanic CO2 concentrations. This response must be sufficiently large to be detected in the fossil or sedimentary record and it must persist in the fossil record without alteration at a later stage. If these conditions are met, the proxy can be calibrated against modern systems. Finally, the proxy must be shown to detect known changes in atmospheric CO2 concentrations over a time span also covered by independent reconstructions or records.
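The calibrate-then-invert workflow described above can be sketched with synthetic numbers: fit a hypothetical proxy signal (assumed, purely for illustration, to vary linearly with ln CO2) against modern observations, then invert the fitted relation to estimate past CO2 from a fossil measurement. Everything here (the proxy, the calibration pairs, the functional form) is made up to show the mechanics:

```python
import math

# Hypothetical modern calibration pairs: (CO2 in ppm, proxy value).
# Both the proxy and these numbers are synthetic, for illustration only.
modern = [(280.0, 1.10), (320.0, 1.22), (360.0, 1.33), (390.0, 1.40)]

# Ordinary least-squares fit of proxy = a * ln(CO2) + b.
xs = [math.log(c) for c, _ in modern]
ys = [p for _, p in modern]
n = len(modern)
xbar, ybar = sum(xs) / n, sum(ys) / n
a = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
b = ybar - a * xbar

# Inversion: given a fossil proxy value, estimate past CO2 in ppm.
def estimate_co2(proxy_value):
    return math.exp((proxy_value - b) / a)

# A fossil proxy reading well above the modern range implies a higher
# CO2 estimate; the number is illustrative, like the calibration data.
print(round(estimate_co2(1.70)))
```

Each real-world step hides an assumption this sketch glosses over: that the modern relation held in the past, that the signal survived burial unaltered, and that the inversion’s uncertainty is honestly propagated — which is exactly why Beerling and Royer call each stage error-prone.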

Each stage requires assumptions and introduces errors, making the process long, hard and fraught with danger (at least to data accuracy). “We recognize that it will not be easy to constrain Earth’s CO2 history by proxy, but it is crucial to continue to reduce uncertainties,” the authors state. But they are nothing if not optimists, claiming that a “consensus” is being reached with regard to such data. One would think that the term consensus would be avoided by anyone in climate research, given the ill repute it has garnered in the global warming debate.

Evidently consensus is broadly defined in this case. Currently, estimates of Pliocene (5.332 million to 2.588 million years ago) CO2 levels by numerous methods agree to within about 50 ppm. That is roughly half of the increase in atmospheric CO2 since preindustrial times (approximately 280 ppm to 390 ppm) attributed to human activity. Deeper in time, the differences grow considerably. If climate is as sensitive as they say, this level of accuracy is still insufficient. “A twofold variation in estimates derived from the different techniques remains,” Beerling and Royer admit. “It raises legitimate concerns over the credibility of the estimates of ancient atmospheric CO2 concentrations.”

Are the climate science estimates getting better? Is the historical data improving? Perhaps. But it is important to recognize that our current knowledge of paleoclimate remains spotty and filled with errors. So, if studying paleodata is superior to climate modeling in the elusive hunt for an accurate sensitivity reading, and the proxy data used to study paleoclimate are themselves suspect, where does that leave climate science? Right where its critics have suggested: an immature science that really cannot make any trustworthy predictions about future climate change. That fact is something we all should be sensitive to.

Be safe, enjoy the interglacial and stay skeptical.

The Resilient Earth, 8 July 2011