We have earlier commented on trends in temperature. (When is a trend not a trend?) Here we look quite simply at the 30-year trend line from three different temperature data sets and a 23-model ensemble hind-cast/projection. The trends were calculated using the LINEST function in Excel. The year shown in the chart is the final year of each 30-year regression period. As can be seen, the 30-year trend is positive for most of the last hundred or so years. It reached a peak in 2004 or 2005, depending on the data series, and has since declined slightly. Whilst the three data sets, not surprisingly, show very similar trends, the modelled trends are quite different.
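For anyone wanting to reproduce the calculation outside Excel, a minimal sketch in Python, with NumPy's polyfit standing in for LINEST (the temperature series here is synthetic, purely for illustration):

```python
import numpy as np

def rolling_trend(years, temps, window=30):
    """Least-squares slope over each trailing `window`-year period,
    the same fit that LINEST performs, returned as (end_year, slope)."""
    out = []
    for i in range(window - 1, len(years)):
        x = years[i - window + 1 : i + 1]
        y = temps[i - window + 1 : i + 1]
        out.append((years[i], np.polyfit(x, y, 1)[0]))  # slope in degC/year
    return out

# synthetic series: a steady 0.01 degC/year rise, for illustration only
yrs = np.arange(1900, 2012)
temps = 14.0 + 0.01 * (yrs - 1900)
trends = rolling_trend(yrs, temps)
```

Each tuple gives the final year of the 30-year regression period and its slope, matching the convention used in the chart.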

Since 1883 there has been a peak or a trough in the trend line roughly every 30 years. Your guess as to what will happen in the future is as good as mine.

On 25 March 2012, the above paragraph was changed, with '30' replacing '25'. The following was also added.

In the next graph we look at the 30-year trend for 7 models. The choice of models was based on those listed in Table 6 of the IPCC “General Guidelines on the Use of Scenario Data for Climate Impact and Adaptation Assessment”, Version 2, June 2007. Where a model has more than one published simulation its results were averaged before being included in the graph. The simulation results come from the Climate Explorer site.

Elsewhere we have shown the results of the same seven models. Showing the differences as 30-year trends demonstrates just how widely the models' simulations diverge.


The Georgia Institute of Technology has recently released details of a study of the relationship between Arctic Sea Ice and Northern Hemisphere Snow Cover. The press release summarises it as “The researchers analyzed observational data collected between 1979 and 2010 and found that a decrease in autumn Arctic sea ice of 1 million square kilometers -- the size of the surface area of Egypt -- corresponded to significantly above-normal winter snow cover in large parts of the northern United States, northwestern and central Europe, and northern and central China.”

This goes some way to explaining the fact that whereas Arctic sea ice has been tending to decrease (at 53,000 km²/year), snow coverage has declined more slowly (at 22,000 km²/year). The importance of snow and ice lies in their role in the albedo feedback mechanism. Snow reflects almost all the incoming energy; water and land (at least the northern boreal forests, where most snow falls) reflect about 10%. So, other things being equal, the influences of snow and ice are comparable. But, and it’s a big but, other things are not equal. Both sea ice and snow cover vary seasonally. The following chart shows average monthly values for the period 1978 to 2011. This shows that in winter snow covers a much larger area than sea ice.

However, the albedo effect only applies when the sun is above the horizon. The next chart shows the areas adjusted for solar angle. In this case we have assumed a latitude of 80 °N for sea ice and 70 °N for snow, and multiplied the areas in the first chart by the sine of the sun angle at midday on the 15th of each month (calculated using the tool at http://aa.usno.navy.mil/data/docs/AltAz.php). This presents a very different picture and suggests that the influences of snow and ice are comparable – with snow perhaps predominating.
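The USNO tool gives exact angles; as a rough stand-in, the weighting described above can be approximated with a standard formula for solar declination. The 80 °N / 70 °N latitudes and the noon-only simplification are the same assumptions as in the text:

```python
import math

def noon_sun_sine(lat_deg, day_of_year):
    """Sine of the Sun's midday elevation at a given latitude.
    Uses a common approximation for solar declination; returns 0
    when the Sun is below the horizon (polar night)."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    elevation = 90.0 - lat_deg + decl  # midday elevation in degrees
    return max(0.0, math.sin(math.radians(elevation)))

# weighting factors on 15 June (day 166) at the two assumed latitudes
w_ice = noon_sun_sine(80.0, 166)   # sea ice, assumed ~80 N
w_snow = noon_sun_sine(70.0, 166)  # snow, assumed ~70 N
```

Multiplying the monthly areas by these factors reproduces the adjustment described above; in mid-winter the factor for 80 °N drops to zero, which is why the adjusted picture differs so much from the raw one.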

This calculation is subject to a large number of caveats. The sea ice area is based on NSIDC ‘sea ice extent’ which shows “the total area of ocean covered with at least 15 percent ice”. This is reasonable as a metric since sea ice is almost 90% below water but of course such ice is not reflecting radiation. The use of 70 °N and 80 °N respectively and mid-day sun angle are also only approximate. Ideally it would be necessary to track areas of ice and snow at different times of the year, at different latitudes and the energy reflected at different times of day. 

The final chart shows the solar-angle adjusted ice and snow area, calculated as for the previous chart. It shows that, despite the snow area being larger and declining more slowly than the ice area, the combined albedo-adjusted area has shown a steady decline.



The realclimate.org blog has a thread posted by Barry Bickmore related to an article which appeared in the Wall Street Journal (WSJ). The original article was written by a group of eminent scientists with little specific expertise in the science of climate change. To summarise in over-simplistic terms they said that they could accept Anthropogenic Global Warming (AGW) but not Catastrophic Anthropogenic Global Warming (CAGW). The WSJ published a response by a group of equally eminent climate scientists who supported CAGW.

The posting of Barry Bickmore looks at some of the claims in more detail than would be possible in a newspaper column. One point the first group of scientists had made was that climate models had not captured the recent stasis in temperatures. Bickmore’s response was that “individual models actually predict that the temperature will go up and down for a few years at a time, but the long-term slope (30 years or more) will be about what those straight lines say.” Below we show the annual temperatures, expressed as degrees Celsius, for 23 models (downloaded from the Climate Explorer site); the maximum, minimum and average for these 23 models; and the temperature from the HadCRU3 data series. As the HadCRU3 series only gives temperature anomalies we have adjusted it to give the same mean as the models for the period of overlap.

First of all, it can be seen that the chart supports Bickmore’s point. The average of the models follows the general trend of temperature from 1900 to the present, and many of the individual models have periods with little or no increase even after the effect of CO2 kicked in from the mid-1970s onward. The current period may have some similarities to the period 1910 to 1970, when a strong Atlantic Multidecadal Oscillation led to an enhanced temperature increase in the first half of that period and to a period of stasis in the second half. As that period of stasis, though with wide variations, lasted for about a quarter of a century, we may have another decade or so of level temperatures while it remains possible to defend the models. In that time it is likely that the models will improve and become better able to represent such periods.

What is surprising is the difference between the models. The average temperature of the ‘hottest’ model is 15.4 °C and of the ‘coolest’ is 12.4 °C. This is, if my maths is correct, equivalent to a difference of 15.92 W/m², an order of magnitude larger than typical anthropogenic forcing estimates.
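For what it's worth, the arithmetic can be checked by comparing black-body fluxes via the Stefan–Boltzmann law (a sketch only; the exact figure depends on the baseline assumed, so the result here is close to, but not exactly, 15.92):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def blackbody_flux(t_celsius):
    """Radiative flux of a black body at the given temperature."""
    return SIGMA * (t_celsius + 273.15) ** 4

# flux difference between the 'hottest' and 'coolest' model averages
diff = blackbody_flux(15.4) - blackbody_flux(12.4)  # roughly 16 W/m^2
```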

Although Bickmore doesn’t mention precipitation, we have produced a similar chart for it.

This also shows a large difference between the models but, unlike temperature simulation, little evidence that the underlying trend has been captured. In this case the ‘wettest’ model has an average precipitation of 1184 mm/year and the ‘driest’ a precipitation of 918 mm/year. This is equivalent to 19.0 W/m², again large compared to anthropogenic forcing.
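The conversion from precipitation to W/m² presumably goes via the latent heat released when the rainfall condensed; a sketch of that conversion, assuming a latent heat of about 2.26 MJ/kg (1 mm of rain over 1 m² weighs 1 kg):

```python
L_VAPOR = 2.26e6                     # latent heat of vaporisation, J/kg (assumed)
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def precip_flux_w_m2(mm_per_year):
    """Latent-heat flux equivalent of a precipitation rate."""
    return mm_per_year * L_VAPOR / SECONDS_PER_YEAR

# difference between the 'wettest' and 'driest' model averages
diff = precip_flux_w_m2(1184) - precip_flux_w_m2(918)  # ~19 W/m^2
```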


All scientists are well aware of the mantra “Correlation does not imply causation” and people have identified examples to illustrate this. Two spurious correlations I remember were one between the number of missionaries sent to Jamaica and the number of illegitimate babies born the following year, and another between the number of nesting storks and the number of babies born in some German towns. That said, correlation can indicate possible links; cancer and smoking being a case in point. When a physical explanation for a phenomenon is known then regression can also indicate the relative importance of different factors.

Scafetta’s models discussed at scepticalscience.com are examples of regression models which assume that global temperature can be explained by astronomical and climatic cycles superimposed on an underlying but unexplained trend. With some justification they have been dubbed ‘climastrology’.

To look at this in a bit more detail I’ve been playing around with two simple regression models. The first used three parameters: sunspots (SS, as a proxy for solar radiation), optical thickness (OT, to represent aerosols from volcanoes and elsewhere) and the Atlantic Multidecadal Oscillation (AMO, as a representative climatic cycle). This is similar to the model Foster and Rahmstorf used to determine the underlying temperature trend from 1979 to 2010, with AMO replacing the El Niño index. The most questionable parameter in both cases is the use of natural climatic oscillations as independent variables. The regression parameters were calculated using the HadCRU3 temperature base and the LINEST function in Excel.

The resulting equation was:

Temperature = 0.62 * AMO + 0.00098*SS + 0.092*OT – 0.44

The equation has an r2 value of 0.19 and a standard error estimate of 0.24 °C. One anomaly is that the coefficient for the Optical Thickness is positive – implying that volcanoes would increase temperature! From these statistics you would not expect the agreement to be very good and it is not.
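For readers without Excel, the same ordinary-least-squares fit can be sketched with NumPy. The input series here are synthetic (random predictors plus the coefficients quoted above), purely to show that the fit recovers known coefficients:

```python
import numpy as np

def fit_linear(temp, *predictors):
    """Ordinary least squares, the same fit LINEST performs.
    Returns coefficients in predictor order, then the intercept."""
    X = np.column_stack(list(predictors) + [np.ones(len(temp))])
    coeffs, *_ = np.linalg.lstsq(X, temp, rcond=None)
    return coeffs

# synthetic demonstration: build a series from known coefficients
rng = np.random.default_rng(0)
amo, ss, ot = rng.normal(size=(3, 100))
temp = 0.62 * amo + 0.00098 * ss + 0.092 * ot - 0.44
c = fit_linear(temp, amo, ss, ot)  # recovers 0.62, 0.00098, 0.092, -0.44
```

With the real AMO, sunspot and optical-thickness series in place of the random ones, this reproduces the Excel calculation.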

I then tried a four parameter regression, adding CO2. This gave the following equation.

Temperature = 0.49 * AMO + 0.00035 * SS - 0.15 * OT + 0.0082 * CO2 - 2.98

In this case the r2 value is 0.89 and the standard error of estimate is 0.091 °C. The coefficients are now in the right direction. For comparison I have also plotted the ensemble of IPCC climate models; in this case the standard error of estimate is 0.14 °C, rather worse than the regression model.

The regression model does as well as the IPCC ensemble in places where the ensemble performs well (the increase from 1970 to 2000) and does better in places where the ensemble is known to underperform (the increase from 1910 to 1945, the slight decline from 1945 to 1970 and the levelling off from 2000 to 2011). It is clear that the IPCC models will improve dramatically when they are able to simulate climate oscillations.

After I had developed this model I remembered that CO2 is not the only greenhouse gas, so I modified it to use CO2 equivalent (CO2Eqv). The data were taken mainly from the GISS site (http://data.giss.nasa.gov/modelforce/ghgases//GHGs.1850-2000.txt), updated to 2011. The new model gave the following regression:

Temperature = 0.55 * AMO + 0.000049 * SS - 0.35 * OT + 0.0177 * CO2Eqv - 2.98

In this case the r2 value is 0.90 and the standard error of estimate is 0.087 °C – a slight improvement. One interesting difference between the models is that in this model the influence of sunspots is considerably reduced and that of optical thickness (aerosols) is increased.

A final point – which may or may not be significant, but I throw it in for fun. The coefficient for CO2Eqv is 0.0177. During the period CO2Eqv increased from 537.4 to 956.8 ppm and accounted for 0.74 °C of warming. The ratio of final to initial CO2Eqv was 1.78. This implies a CO2Eqv sensitivity of 0.89 °C for a CO2Eqv doubling (0.74 × log(2)/log(1.78)), a figure close to most estimates. Given the warning with which I started this piece about not reading too much into correlations, it should be treated with caution.
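The doubling arithmetic itself is easy to check, using the figures quoted above (the warming is scaled logarithmically from the observed ratio to a factor of two):

```python
import math

warming = 0.74          # deg C attributed to CO2Eqv over the period
ratio = 956.8 / 537.4   # ratio of final to initial CO2Eqv, ~1.78

# scale the observed warming to one doubling
sensitivity = warming * math.log(2) / math.log(ratio)  # ~0.89 deg C
```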

The modelling using CO2 equivalent was added after the initial post. The final figure with corrected labelling was replaced on 4 March.


In the IPCC Fourth Assessment Report of 2007 many of the graphs used smoothed values of data series. This is perfectly valid as it facilitates seeing trends among the year-to-year fluctuations. The methods used are described in 'Appendix 3.A: Low-Pass Filters and Linear Trends'. For annual data they use a filter with 13 weights: 1/576 × [1, 6, 19, 42, 71, 96, 106, 96, 71, 42, 19, 6, 1]. An example of this is in figure 3.8. It is interesting to note that they have used an algorithm which allows smoothing right to the end of the data series.

We give below the three main global temperature series using the 13-point filter. At the end of each series we used only the part of the filter which applies retrospectively. The curves appear to show that the rate of temperature increase has fallen off in recent years.
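A sketch of the filter, renormalising the partial weights near the ends of the series (this is one interpretation of the end-treatment described above, not necessarily the IPCC's exact algorithm):

```python
import numpy as np

# the AR4 13-point filter weights, summing to 576
WEIGHTS = np.array([1, 6, 19, 42, 71, 96, 106, 96, 71, 42, 19, 6, 1]) / 576.0

def smooth(series):
    """Apply the 13-point filter. Near the ends, use only the weights
    that fall inside the series, renormalised to sum to 1."""
    n = len(series)
    half = len(WEIGHTS) // 2  # 6 points either side of the centre
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        w = WEIGHTS[lo - i + half : hi - i + half]
        out[i] = np.dot(series[lo:hi], w) / w.sum()
    return out
```

Because the weights are symmetric, the filter leaves a purely linear trend unchanged in the interior of the series; it is only at the ends that the truncation matters.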

The same annex also mentions using regression to estimate trends. For simplicity we have done this using 13-year series and the LINEST algorithm in Excel.

Using the 13-year period for trends gives the clear impression that the rate of warming has slowed dramatically. Perhaps as a result of this, it is now being suggested that we need at least 30 years to detect a trend. (Though it should be noted that I can find no reference to the need for a 30-year trend in the IPCC report.) In the following graph we have plotted the 30-year trend lines and the 30-year trend from the average of a model ensemble.

Even using the 30-year trend it is clear that the rate of temperature increase has fallen back. This graph also shows that the modelled 30-year trend was close to the observed one for the period 1975 to 2005 but diverged widely outside that period.


This book was better than I expected. My expectation was based not on Mann’s reputation/caricature, but on his previous book “Dire Predictions” co-authored with Lee Kump which we have reviewed previously.

On this site and in blog contributions I’ve recently been trying to promote two themes. The first is that the climate science community is weakening its case by trying to ignore inconvenient facts and data; it then gets doubly blamed for the cover-up and the inconsistency between their claims and the data. The second theme is that when presenting climate science to a largely lay audience, as Mann does here, a scientist has to be more careful than in a published paper. A paper will be thoroughly scrutinised by other scientists; lay people will not know if the wool is being pulled over their eyes.

Much of the book deals with the ‘climate wars’ aspect of the title and the fact that few attacks on climate scientists fail to include Mann or his ‘Hockey Stick’. In the USA the whole issue of climate is much more divisive than in the UK. In the UK the Climate Change Bill, mandating a reduction of 80% in CO2 emissions by 2050, was passed with only 5 votes against. At a recent lecture the Director of the Grantham Institute for Climate Change, Sir Brian Hoskins, was very frank about the shortcomings in climate science. None of this would be possible in the USA. And it goes a long way to explain why Mann devotes time to this topic.

One of my complaints about his previous book was that Mann blithely ignored any criticisms of his work. In this book he tackles some of them – even if not always head on. One example was the use of the word ‘censored’. When the data used for his original millennial temperature reconstruction were released they contained a folder called ‘censored’. This was regarded by anthropogenic global warming antagonists as proof of malfeasance; in reality it is a normal statistical term used to define a data sub-set excluded to test its importance to the overall conclusion. Another area he deals with is what he refers to as the ‘divergence problem’. This is the fact that, from about 1960 onwards, most tree rings fail to respond to global warming. In the case of his own work he simply says that his data sets ended in the 1970s and 1980s, and claims that the idea of adding the observed temperature for recent years to bring the data up to date – increasing the hockey stick appearance – was suggested by a reviewer. Elsewhere he deals with a “high-elevation site in western United States”, without actually calling the trees ‘bristlecone pines’, and accepts that their growth rates could have been influenced by CO2 enhancement rather than temperature increases. This had been a criticism of his record. Another criticism of his work had been that one of the proxy records, sediments from a lake in Finland, had not only been corrupted by upstream engineering works but had been used ‘upside down’. In one of his comments on this Mann says “one of our methods didn’t assume orientation, while the other used an objective procedure for determining it”. This appears to be an admission that the orientation might not have been correct, though elsewhere he says that this record did not change the overall conclusions.

So, if other climate scientists might have understood the oblique references in the book, how might the public react to it? Of course, few of them would have picked up the allusions, and many would have been unaware of the significance of some of the statements. Another objection to proxy records is that, for statistical reasons, they underestimate the variability of the parameter they are estimating. Mann recognises this and explains that it is a reason for the wide error bands. It is quite possible that increases in temperature such as those from 1910 to 1945 or 1975 to 2005 occurred in the past but did not register in the proxy record. Again, few members of the public reading this book would have understood the point; they would simply have seen the ‘blade’ of the hockey stick without realising that the handle could have been as curvy as the blade. Another example of misleading the public is the graph he presents of a projection of temperature made in 1988: he only includes data “available through 2005 in this analysis”, even though later data were available at the time of writing the book and show the projection to have been less accurate.

Elsewhere I have argued that there was a need for a book in a popular style to combat the popular books of AGW antagonists – this is indeed such a book. What is now needed is a book which arbitrates between the two sides.

Author: Michael E. Mann

Publisher: Columbia University Press, 2012
E-ISBN: 978-0-231-52638-8


This is the time of year when climate data sets are updated to include annual totals for the preceding year (in this case 2011). Most sites concentrate on temperature – though they sometimes include not just observed atmospheric temperature but also variables such as modelled projections and temperature in the oceans. One variable which is often forgotten is precipitation. After all, the positive feedback from water vapour assumes that it remains in the atmosphere rather than becoming precipitation.

On the chart below we use two data sets. The first is the NCDC 5° gridded precipitation anomaly at http://www1.ncdc.noaa.gov/pub/data/ghcn/v2/grid/grid_prcp_1900-current.dat.gz. To get a monthly global figure we averaged the data, cosine-weighted on latitude to compensate for the reducing grid-cell sizes. The values are in millimetres. The second data set was hind-cast/projected precipitation downloaded from the Climate Explorer web site at http://climexp.knmi.nl. The data set used was described as “all models, 20c3m/sresa1b” and included 23 models. These data were in mm/day, so to convert them to equivalent units they were multiplied by the number of days in the month. They were adjusted to give values relative to the period 1980 to 2010. As trends are masked by month-to-month variations, the five-year centred moving averages are also plotted.
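The cosine weighting can be sketched as follows. A uniform synthetic grid is used here purely to check the weighting, and NaN cells stand in for missing data (which the real NCDC grid contains):

```python
import numpy as np

def global_mean(grid, lats):
    """Area-weighted mean of a lat x lon grid. `lats` are the
    cell-centre latitudes in degrees; weights are cos(latitude) to
    compensate for grid cells shrinking towards the poles.
    Missing values (NaN) are excluded from both sums."""
    w = np.cos(np.radians(lats))[:, None] * np.ones_like(grid)
    mask = ~np.isnan(grid)
    return np.nansum(grid * w * mask) / np.sum(w * mask)

# a 36 x 72 grid of 5-degree cells, centres at -87.5 ... 87.5
lats = np.arange(-87.5, 90, 5)
grid = np.full((36, 72), 2.0)  # uniform field, so the mean must be 2.0
m = global_mean(grid, lats)
```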