
Steve McIntyre: How The IPCC Fixed The Facts

Steve McIntyre, Climate Audit

Why did the IPCC delete a graph that shows the discrepancy between observations and climate projections?

AR5 Second Order Draft (SOD) Figures 1.4 and 1.5 showed the discrepancy between observations and projections from previous assessment reports. SOD Figure 1.5 (see below as annotated) directly showed the discrepancy for AR4 without additional clutter from earlier assessment reports. Even though AR4 was the most recent and most relevant assessment report, SOD Figure 1.5 was simply deleted from the report.

Nor can it be contended that IPCC erroneously located the projections in SOD Figure 1.5, as SKS claimed here with respect to SOD Figure 1.4. The uncertainty envelope shown in SOD Figure 1.5 was cited to AR4 Figure 10.26. As a cross-check, I digitized the relevant uncertainty envelopes from AR4 Figure 10.26 (which I’ll show later in this post) and plotted them in the figure below (A1B: red + signs; A1T: orange). They match almost exactly. Richard Betts acknowledged the match here.
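The cross-check above rests on digitizing an envelope out of a published figure. The essential step is calibrating each axis from two known tick marks and mapping clicked pixel positions into data coordinates; the sketch below illustrates that step with invented calibration points and pixel values, not the actual AR4 Figure 10.26 coordinates.

```python
# Sketch of converting digitized pixel coordinates to data coordinates, the
# kind of step involved in digitizing a published figure. Calibration points
# and the digitized pixel below are invented for illustration.
def make_axis_map(px0, px1, val0, val1):
    """Linear map from pixel position to data value, built from two calibration ticks."""
    scale = (val1 - val0) / (px1 - px0)
    return lambda px: val0 + (px - px0) * scale

# Calibrate each axis from two known tick marks (pixel position -> data value).
x_map = make_axis_map(100, 900, 1990, 2030)   # x axis: years
y_map = make_axis_map(700, 100, 0.0, 2.0)     # y axis: deg C (pixel y grows downward)

# A hypothetical digitized point on an envelope boundary.
year, temp = x_map(500), y_map(400)
print(round(year, 1), round(temp, 2))  # -> 2010.0 1.0
```

Repeating this for points along the upper and lower bounds of each scenario's envelope yields series that can be re-plotted over another figure for comparison.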

Figure 1. AR5 SOD Figure 1.5 with annotations showing HadCRUT4 (yellow) and uncertainty ranges from AR4 Figure 10.26 in 5-year increments (red + signs).

AR5 Figure 1.4
Having deleted the informative (perhaps too informative) SOD Figure 1.5, the IPCC left its only comparison between AR4 projections and actuals in the revised Figure 1.4, a figure that seems more designed to obscure than to illuminate.

In the annotated version shown below, I’ve plotted the AR4 Figure 10.26 A1B uncertainty range in yellow. Unfortunately, Figure 1.4 no longer shows an uncertainty envelope for AR4 projections. Here one has to watch the pea carefully. Uncertainty envelopes are shown for the three early assessments, but not for AR4, even though it is the most recent. All that is shown for AR4 are 2035 uncertainty ranges for three AR4 scenarios (including A1B) in the right margin, plus a spaghetti of individual runs (a spaghetti that does not correspond to any actual AR4 graphic). From the right-margin A1B uncertainty range, the missing A1B envelope can be more or less interpolated, as I have done here with the red envelope: I matched the 2035 uncertainty to the right margin and interpolated back to 2000 based on the shape of the other envelopes. The re-stated envelope is about twice as wide as the actual AR4 Figure 10.26 uncertainty envelope that had been used in SOD Figure 1.5. Even with this much-expanded envelope, HadCRUT4 observations are at its very edge – and well outside the actual AR4 Figure 10.26 envelope.
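The interpolation described above – anchoring a wedge at the right-margin 2035 range and tapering it back to a common starting point around 2000 – amounts to simple linear interpolation of the upper and lower bounds. A minimal sketch, with placeholder numbers rather than the values actually read off Figure 1.4:

```python
# Sketch of reconstructing an implied uncertainty wedge from a single
# end-point range, as described in the text. All numbers are placeholders,
# not digitized values from AR5 Figure 1.4.
import numpy as np

years = np.arange(2000, 2036)

# Hypothetical 2035 uncertainty range read off the right margin (deg C anomaly).
lo_2035, hi_2035 = 0.8, 2.4
# Assume the wedge converges to a common starting value in 2000.
start_2000 = 0.2

lower = np.interp(years, [2000, 2035], [start_2000, lo_2035])
upper = np.interp(years, [2000, 2035], [start_2000, hi_2035])

# Width of the implied envelope at a given year, e.g. 2012.
width_2012 = upper[years == 2012][0] - lower[years == 2012][0]
print(round(width_2012, 3))  # -> 0.549
```

Whether observations fall inside or outside such a wedge then depends heavily on the assumed end-point range and convergence point, which is exactly why the width of the re-stated envelope matters.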

Figure 2. AR5 Figure 1.4 with annotations. The yellow wedge shows the uncertainty range from AR4 Figure 10.26 (A1B). The red wedge interpolates the implied uncertainty range based on the right margin A1B uncertainty range.



In the final draft document sent to external reviewers, SOD Figure 1.5 directly compared projections from AR4 Figure 10.26 to observations, a comparison which showed that recent observations were running below the uncertainty envelope. The reference period for the AR4 uncertainty envelope was well-specified (1981-2000) and IPCC correctly transposed the envelope to the 1961-1990 reference period used in SOD Figure 1.5.
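Transposing an anomaly series or envelope from one reference period to another, as described above, is a constant shift: subtract the series’ mean over the new baseline window. The sketch below uses a synthetic series, not actual HadCRUT4 or AR4 data, purely to illustrate the arithmetic.

```python
# Sketch of rebaselining temperature anomalies between reference periods.
# The series here is synthetic, for illustration only.
import numpy as np

years = np.arange(1950, 2013)
rng = np.random.default_rng(0)

# Hypothetical anomaly series expressed relative to the 1981-2000 mean.
anoms_8100 = 0.01 * (years - 1990) + rng.normal(0, 0.1, years.size)
anoms_8100 -= anoms_8100[(years >= 1981) & (years <= 2000)].mean()

# To transpose to a 1961-1990 baseline, subtract the mean over 1961-1990.
offset = anoms_8100[(years >= 1961) & (years <= 1990)].mean()
anoms_6190 = anoms_8100 - offset

# The rebaselined series now averages to zero over 1961-1990.
print(abs(anoms_6190[(years >= 1961) & (years <= 1990)].mean()) < 1e-9)  # -> True
```

The same offset applies to a projection envelope expressed against the old baseline: shifting both bounds by the constant moves the whole envelope without changing its width, which is why a correctly transposed envelope supports a like-for-like comparison.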

IPCC defenders have purported to justify changes to the location of uncertainty envelopes from the three early assessment reports on the basis that IPCC had erroneously located them in SOD Figure 1.4. Thousands of institutions around the world routinely compare projections to actuals without making mistakes about what their past projections were. Such comparisons are simple accounting, rather than cutting-edge science. It is disquieting that such errors persisted into the third iteration of the documents and the final version sent to external reviewers.

But, be that as it may, there was no reference period error in the AR4 projections shown in SOD Figure 1.5. So reference period error cannot be the reason for the deletion of this figure.

Richard Betts did not dispute the accuracy of the comparison in SOD Figure 1.5, but argued that the new Figure 1.4 was “scientifically better”. But how can the comparison be “scientifically better” when uncertainty envelopes are shown for the three early assessment reports, but not for AR4? Nor can a comparison between observations and AR4 projections be made “scientifically better” – let alone valid in accounting terms – by replacing actual AR4 documents and graphics with a spaghetti graph that did not appear in AR4.

Nor is the new graphic based on any article in peer reviewed literature.

Nor did any external reviewers of the SOD suggest removal of Figure 1.5, though some (e.g. Ross McKitrick) pointed out the inconsistency between the soothing text and the discrepancy shown in the figures.

Nor, in the absence of error, is there any justification for such wholesale changes and deletions after the third and final iteration had been sent to external reviewers.

In the past, IPCC authors famously deleted data to “hide the decline” in Briffa’s temperature reconstruction in order to avoid “giving fodder to skeptics”. Without this past history, IPCC might be entitled to a little more latitude. However, neither IPCC nor its supporting institutions renounced such conduct or undertook to avoid similar incidents in the future. Thus, IPCC is vulnerable to concerns that its deletion of SOD Figure 1.5 was primarily motivated by a desire to avoid “giving fodder to skeptics”.

Perhaps there’s a valid reason, but it hasn’t been presented yet.
