Australian Met Office Under Pressure Over Manipulated Temperature Data

Graham Lloyd, The Australian

Some of Australia’s long-term temperature records may contain faults introduced by the Bureau of Meteorology’s computer modelling, according to a widely published expert.

[Chart: Average temperatures]

David Stockwell said a full audit of the BoM national data set was needed after the bureau confirmed that statistical tests, rather than direct evidence, were the “primary” justification for making changes.

Dr Stockwell has a PhD in ecosystem dynamics from ANU and has been recognised by the US government as “outstanding” in his academic field.

His published works include a peer-reviewed paper analysing faults in the bureau’s earlier High Quality Data temperature records that were subsequently replaced by the current ACORN-SAT.

Dr Stockwell has called for a full audit of ACORN-SAT homogenisation after analysing records from Deniliquin in the Riverina region of NSW, where homogenisation of raw data for minimum temperatures had turned a 0.7C cooling trend into a warming trend of 1C over a century.
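To see how a homogenisation adjustment can reverse the sign of a trend, consider a minimal sketch with entirely synthetic data (the station name, breakpoint year, and adjustment size below are hypothetical illustrations, not Deniliquin's actual record or the bureau's actual method):

```python
import numpy as np

# Hypothetical illustration only: synthetic annual minimum temperatures,
# not the real Deniliquin series or the BoM's actual adjustments.
rng = np.random.default_rng(0)
years = np.arange(1910, 2010)  # a century of annual observations

# Synthetic raw series with a slight cooling trend (about -0.7C per century)
raw = 10.0 - 0.7 * (years - years[0]) / 100.0 + rng.normal(0, 0.3, years.size)

# A homogenisation-style step adjustment: suppose a statistical test flags a
# breakpoint in 1960 and the pre-1960 segment is shifted down by 1.2C.
adjusted = raw.copy()
adjusted[years < 1960] -= 1.2

def trend_per_century(t, y):
    """Least-squares linear trend, expressed in degrees per 100 years."""
    slope = np.polyfit(t, y, 1)[0]
    return slope * 100.0

print(trend_per_century(years, raw))       # negative (cooling)
print(trend_per_century(years, adjusted))  # positive (warming)
```

A single step of this size near the middle of the record contributes roughly +1.8C per century to the fitted slope, which is more than enough to flip a -0.7C cooling trend to warming; this is the arithmetic behind concerns about relying on statistical breakpoint tests alone.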

The bureau said it did not want to discuss the Deniliquin findings because it had not produced the graphics, but it did not dispute the findings or that all of the information used had come from the BoM database.

Faced with a string of examples where the temperature trend had been changed after computer analysis, the bureau has defended its homogenisation process.

It has said that while some stations may show anomalous results, the overall record shows a warming trend similar to those reported by other international climate organisations.

Dr Stockwell does not suggest that the bureau tampered with the Deniliquin data but that the bureau may have placed too much trust in computer modelling.