Dec 23 2009

MET Office Did Not Release Actual HADCRUT3 Code

Steve McIntyre has a post up regarding the recent release of CRU code and data by the MET Office in the UK. I don’t have time for anything more than a quick post at the moment (I need to shovel some global warming that has my cars snowed in), but I can tell you now that the code released by the MET Office is not the same as the code contained in the leaked CRU data and emails, and is very likely not the code used to create the recent CRU temperature profiles used by the IPCC and others.

The two ‘programs’ just released by the MET Office can be found here and here.

First I checked the CRU files dumped to the public in late November to see if these files had been released previously. The answer is no: there are no such files. Specifically, there are no Perl scripts, which is the language these newly released files are written in.

We do know there is a ‘gridder’ program; the first file listed above at the MET Office site is one. A gridder is also mentioned in this key file from the CRU data dump, produced in 2004/2005, which has these telling entries:

Aims of the work

The work being done falls in five main areas:

  • Improved land data: additional data, extra quality control.
  • Comprehensive land error model: Add estimates of observation errors, extend existing sampling and bias uncertainty estimates to arbitrary grid resolutions.
  • Flexible gridder: make gridded fields on any spatial resolution.

Flexible gridder

Earlier versions of HadCRUT [1, 2] were produced only on a 5×5 degree resolution. Recent work on marine datasets [3] has enabled us to produce them on any resolution; this is valuable for regional studies and for comparison with model results. We have developed the same functionality for gridding the land temperature, and so we can make HadCRUT3 gridded fields at any spatial resolution.

This file also discusses the relative uncertainty, or error, in the CRU temperature profiles, an uncertainty large enough that the claims of significant global warming over the last 100 or so years are not statistically significant.

Anyway, the important takeaway here is the work that was done developing a gridding algorithm that can be set to any resolution. The Perl scripts released by the MET Office have no such flexibility in grid size.
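To make concrete what that flexibility means, here is a minimal sketch of a ‘flexible gridder’ whose cell size is a parameter. This is purely illustrative: the station data, function name, and binning scheme are hypothetical, not the CRU or MET Office code.

```python
# Sketch of a "flexible gridder": average station temperature anomalies into
# grid cells of an arbitrary resolution. Hypothetical data, not CRU/MET code.
from collections import defaultdict

def grid_anomalies(stations, resolution):
    """stations: list of (lat, lon, anomaly); resolution: cell size in degrees.
    Returns {(lat_index, lon_index): mean anomaly} on the chosen grid."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lat, lon, anom in stations:
        # Shift to non-negative coordinates, then bin by the chosen cell size.
        cell = (int((lat + 90) // resolution), int((lon + 180) // resolution))
        sums[cell] += anom
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

stations = [(51.5, -0.1, 0.4), (52.2, 0.1, 0.6), (40.7, -74.0, -0.2)]
print(grid_anomalies(stations, 5))    # a 5x5 degree grid, as in HadCRUT
print(grid_anomalies(stations, 2.5))  # the same stations on a finer grid
```

The point of the CRU work quoted above is exactly this: the resolution is an input, not something hard-wired into the script.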

I searched for ‘gridder’ in the CRU data dump and found some references in the infamous HARRY_READ_ME file:

The next opportunity comes at the output from anomdtb – the normalised values in the *.txt files that the IDL gridder reads. These are just files – one per month – with lists of coordinates and values, so ideal to add normalised values to.

I guess I need to finish the fortran gridder program. That would allow steamlining. Notes on that work are mainly in the file ‘gridder.sandpit’. Suffice to say, it works 🙂 Needs tweaking, and a few philosophical questions resolving, but apart from that..

When Harry was doing his work, the gridder was first written in IDL, and he recreated it in Fortran (both languages distinct from Perl). No Perl version seems to have existed.

Finally, if you check the coding standards applied, the Perl scripts are worse than the CRU code that was dumped and that Harry was slogging through. The first MET Office file has only this meager and rushed header:

#!/usr/bin/perl -w

# Make a time-series from gridded input file
# Global average calculated as (NH+SH)/2

Not very professional. No creation date, no change history, no authorship (though I could see keeping that under wraps at the moment). This indicates to me that these were rushed out after the CRU Climategate leak.
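For reference, the (NH+SH)/2 rule that header describes can be sketched as follows. This is a sketch with hypothetical anomaly values and a hypothetical function name, not the MET Office script; it just shows the arithmetic the comment claims.

```python
# Sketch of the (NH+SH)/2 global-average rule named in the script's header:
# compute each hemisphere's mean anomaly separately, then average the two
# hemispheres, so the data-rich NH does not dominate the global figure.
def global_average(cell_anomalies):
    """cell_anomalies: {(lat, lon): anomaly} with lat in degrees."""
    nh = [v for (lat, lon), v in cell_anomalies.items() if lat >= 0]
    sh = [v for (lat, lon), v in cell_anomalies.items() if lat < 0]
    nh_mean = sum(nh) / len(nh)
    sh_mean = sum(sh) / len(sh)
    return (nh_mean + sh_mean) / 2

cells = {(52.5, 2.5): 0.6, (42.5, -72.5): 0.2, (-32.5, 152.5): 0.1}
print(global_average(cells))  # roughly (0.4 + 0.1) / 2 = 0.25
```

Note the asymmetry this hides: a hemisphere with one station counts the same as a hemisphere with thousands.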

Here is a sample of a CRU header from some fortran code for comparison:

! samplingerror2.f90
! f90 main program written on 16.02.99 by Tim Mitchell
! last modification on 17.07.00
! f90 -o samplingerror2 initialmod.f90 runselectmod.f90 extractmod.f90 scalemod.f90 savemod.f90 loadmod.f90
! transformmod.f90 sortmod.f90 basicfun.f90 samplingerror2.f90

A tad better, but not professional by any means. Once I realized this is NOT the same code, I went and rechecked the MET Office site and noticed some subtle wording:

station_gridder.perl takes the station data files and makes gridded fields in the same way as used in CRUTEM3. The gridded fields are output in the ASCII format used for distributing CRUTEM3.

make_global_average_ts_ascii.perl takes the output of the gridded and makes a global average annual temperature anomaly timeseries, again in the same way as those distributed with CRUTEM3.

It seems the MET Office is also in the business of illusion and misinformation, in the same way the CRU was. Now all we need is a good explanation for this bait and switch. I’m not holding my breath on that one. My guess is we will start to see serious problems with the released data as well, as people start diving into it.

Now, off to shovel some snow.

You can read more here at Bishop Hill and here at WUWT.

Update: Another person has come to the same conclusions (and quicker – of course I was on an airplane for 7 hours yesterday …)


3 Responses to “MET Office Did Not Release Actual HADCRUT3 Code”

  1. […] This post was mentioned on Twitter by AJ Strata, Richard Liggins. Richard Liggins said: @petrolhead62 Doesn't look like it's the same code however: […]

  2. WWS says:

    someone’s been reading your blog!

    found this article on the web this morning and I would swear that it looks like a thin rewrite of a post I saw here a couple of weeks ago.

    but maybe it’s just great minds working on similar paths.

  3. sbd says:

    From: Phil Jones
    To: Gabi Hegerl , “Michael E. Mann”
    Subject: Re: Mann and Jones (2003)
    Date: Tue Aug 10 15:47:04 2004
    Cc: Tom Crowley

    No second attempt – don’t know what the first was? We’ll be doing a new instrumental data set (surprisingly called HadCRUT3), but that’s it at the moment. Attached is a good review of corals – just out.

    At 10:36 10/08/2004 -0400, Gabi Hegerl wrote:

    Hi Mike and Phil,

    Thanks! Yes, factor 1.29 will get me closer to my best guess scaling (factor 1.6 to same-size signals). The scaling is a tough issue, and I think there are lots of possibilities to do it depending on what one wants to do. For comparing underlying forced signals, I think tls is best. To get a conservative size paleo reconstruction (like what part of instrumental do we reconstruct with paleo), the traditional scaling is best.

    I’ll write up what Myles and I have been thinking and send it.

    Phil, if there is a second attempt at that with the Hadley Centre, let me know, I don’t like racing anybody!

    Michael E. Mann wrote:

Dear Phil and Gabi,
    I’ve attached a cleaned-up and commented version of the matlab code that I wrote for doing the Mann and Jones (2003) composites. I did this knowing that Phil and I are likely to have to respond to more crap criticisms from the idiots in the near future, so best to clean up the code and provide to some of my close colleagues in case they want to test it, etc. Please feel free to use this code for your own internal purposes, but don’t pass it along where it may get into the hands of the wrong people.

    In the process of trying to clean it up, I realized I had something a bit odd, not necessarily wrong, but it makes a small difference. It seems that I used the ‘long’ NH instrumental series back to 1753 that we calculated in the following paper:

    * Mann, M.E., Rutherford, S., Bradley, R.S., Hughes, M.K., Keimig, F.T., [1]Optimal Surface Temperature Reconstructions using Terrestrial Borehole Data, Journal of Geophysical Research, 108 (D7), 4203, doi: 10.1029/2002JD002532, 2003. (based on the sparse available long instrumental records) to set the scale for the decadal standard deviation of the proxy composite. Not sure why I used this, rather than using the CRU NH record back to 1856 for this purpose. It looks like I had two similarly named series floating around in the code, and used perhaps the less preferable one for setting the scale.

    Turns out, this has the net effect of decreasing the amplitude of the NH reconstruction by a factor of 0.11/0.14 = 1.29.

    This may explain part of what perplexed Gabi when she was comparing w/ the instrumental series. I’ve attached the version of the reconstruction where the NH is scaled by the CRU NH record instead, as well as the Matlab code which you’re welcome to try to use yourself and play around with. Basically, this increases the amplitude of the reconstruction everywhere by the factor 1.29. Perhaps this is more in line w/ what Gabi
    was estimating (Gabi?)

    Anyway, doesn’t make a major difference, but you might want to take this into account in any further use of the Mann and Jones series…

    Phil: is this worth a followup note to GRL, w/ a link to the Matlab code?

    p.s. Gabi: when do you and Tom plan to publish your NH reconstruction that now goes back about 1500 years or so? It would be nice to have more independent reconstructions published in the near future! Maybe I missed this? Thanks…

    From: David Parker
    To: “Mann, Michael”
    Subject: Heads up
    Date: Wed, 26 Mar 2008 12:45:42 +0000
    Cc: “Folland, Chris” , “Kennedy, John” , “Jones, Phil” , “Karl, Tom”


    Yes it was based on only Jan+Feb 2008 and padding with that final value but John Kennedy has changed / shortly will change this misleading plot!



    —–Original Message—–
    From: Michael Mann []
    Sent: 26 March 2008 11:19
    To: Folland, Chris
    Cc: Phil Jones; Thomas R Karl
    Subject: heads up

    Hi Chris (and Tom and Phil),

    I hope you’re all doing well. Just wanted to give you a heads up on something. Have you seen this?

    apparently the contrarians are having a field day w/ this graph. My understanding that it is based on using only Jan+Feb 08 and padding w/ that final value.

    Surely this can’t be?? Is Fred Singer now running the UK Met Office website?

    Would appreciate any info you can provide,


    Michael E. Mann
    Associate Professor
    Director, Earth System Science Center (ESSC)

    Department of Meteorology Phone: (814) 863-4075
    503 Walker Building FAX: (814) 865-3663
    The Pennsylvania State University email:

    David Parker Met Office Hadley Centre FitzRoy Road EXETER EX1 3PB UK
    Tel: +44-1392-886649 Fax: +44-1392-885681