

Maximum and minimum monthly records in global temperature databases

Posted on 15 March 2011 by shoyemore

The worldwide success of books like The Guinness Book of Records shows the human fascination with record-breaking: the smallest, the fastest, the farthest, the first. The frequency and size of records can tell us something about the underlying process. For example, the world mile record was broken on average once every 3 years between 1913 and 1999, by about 1 second per record. The shift in dominance of middle-distance running from the Anglophone countries to Africa means the record has been broken only once since 1993.

This post describes a method of recording and graphically presenting successive annual counts of record-breaking months by temperature, e.g. the warmest or the coldest since records were kept, over more than one database. The rate of appearance of record-breaking (warmest or coldest) months intuitively provides a signal of climate change complementary to the usual temperature data. See the “Further Reading” section at the end of the post.

Counts of maximum or minimum records are useful because they can provide evidence of a warming (or cooling) climate even when the temperature data appear static over short periods. As we will see, that is exactly what occurred in the 2000s.

Steps to follow:

(1)    Download monthly climate data into a spreadsheet, either the raw data or the temperature anomaly. For easier manipulation, re-arrange the data with successive years in rows underneath each other, and the months in 12 columns, from January to December.

(2)    Create a second matrix of rank numbers. In Excel, the RANK function returns the ranking of each monthly temperature datum relative to all values recorded for that month so far, i.e. from the top of the column down to the current row. Consult Excel Help for how to use RANK to find the minimum records, which you can do in a separate worksheet. The IF function can then be used to set all ranks other than the one of interest to 0. Figure 1 shows the result for the first four years, using GISS data as an example.

(3)    In a further column to the right, simply add the number of record months in each year.

(4)    If using more than one database, take the average. If, for 1960, the GISS database shows 1 new record month, NOAA shows 0, and HADCRUT shows 1, the average is (1+0+1)/3 ≈ 0.67 for 1960; enter this in a separate column of average yearly record months.

(5)    You now have two columns: one for the average number of maximum records and one for the average number of minimum records in each year. Use two further columns to create running totals of each, and a further column for the difference between the two running totals.
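For readers who prefer a script to a spreadsheet, the five steps above can be sketched in Python with NumPy. This is an illustrative sketch only: the array layout and the random stand-in data are assumptions, not the author's actual workbook.

```python
import numpy as np

def count_annual_records(anomaly):
    """anomaly: 2-D array, rows = years, columns = Jan..Dec.
    Returns per-year counts of new maximum and minimum monthly records,
    where a record is judged against all earlier values in the same column."""
    n_years, n_months = anomaly.shape
    max_counts = np.zeros(n_years, dtype=int)
    min_counts = np.zeros(n_years, dtype=int)
    for m in range(n_months):
        running_max = -np.inf
        running_min = np.inf
        for y in range(n_years):
            v = anomaly[y, m]
            if v > running_max:          # new warmest value for this calendar month
                max_counts[y] += 1
                running_max = v
            if v < running_min:          # new coldest value for this calendar month
                min_counts[y] += 1
                running_min = v
    return max_counts, min_counts

# Step 4: average the yearly counts over several databases
# (random noise here stands in for GISS, NOAA and HadCRUT)
rng = np.random.default_rng(0)
datasets = [rng.normal(size=(131, 12)) for _ in range(3)]
avg_max = np.mean([count_annual_records(d)[0] for d in datasets], axis=0)
avg_min = np.mean([count_annual_records(d)[1] for d in datasets], axis=0)

# Step 5: running totals and their difference
diff = np.cumsum(avg_max) - np.cumsum(avg_min)
```

With real data you would load the three databases' anomalies into the arrays instead of random noise; the `diff` column is then the quantity plotted in Figure 2.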


 Figure 1: Conversion of GISS temperature anomaly into a binary indicator of maximum monthly records for first four years.

We intuitively expect that, in a period of warming, there should be more maximum monthly records than minimum, and vice versa in a period of cooling. If we assume that the frequency and duration of warming and cooling periods even out in the long run (natural variation), the running totals of maximum and minimum records should be approximately equal. The differences obtained by subtracting one running total from the other should centre on zero like a sine wave. Figure 2 shows the annual differences in cumulative sums of average new maximum and minimum records in 3 databases (GISS, HADCRUT and NOAA from 1880 to 2010).


 Figure 2: Annual differences in cumulative sums of Average Annual Maximum and Minimum Monthly Records. As an example, in 1911 there was an excess of 30 minimum monthly records over maximum, counting since 1880.
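The null expectation described above, that with no trend the running-total difference centres on zero, can be checked by simulation. This is a sketch under the stated assumption of independent, trendless monthly values, not code from the post:

```python
import numpy as np

rng = np.random.default_rng(42)
n_years, n_trials = 131, 500
diffs_final = []
for _ in range(n_trials):
    x = rng.normal(size=(n_years, 12))   # trendless monthly "anomalies"
    max_c = np.zeros(n_years)
    min_c = np.zeros(n_years)
    for m in range(12):
        hi, lo = -np.inf, np.inf
        for y in range(n_years):
            v = x[y, m]
            if v > hi:                   # new maximum record for this month
                hi = v
                max_c[y] += 1
            if v < lo:                   # new minimum record for this month
                lo = v
                min_c[y] += 1
    # final difference between cumulative maximum and minimum record counts
    diffs_final.append(np.sum(max_c) - np.sum(min_c))

print(np.mean(diffs_final))
```

Averaged over many trials the difference hovers near zero, in sharp contrast to the sustained excess of maximum records visible in Figure 2.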


  • There is an “early measurement effect”: every monthly measurement in the first year is both a maximum and a minimum record. Subsequent months modify the records, so it takes a few years for the annual counts to settle down. Since the effect influences maximum and minimum records equally, Figure 2 is, on average, free of it.
  • In Figure 2, the early decades show perhaps a 20-year period of cooling. After 1920, a mid-century warming commences, and this looks like natural variation (a half-sine wave) up to about 1940.
  • Then a period of stasis ensues (for 12 years) until the excess of maximum over minimum records starts again with an accelerating increase up to 2010.
  • Figure 2 resembles charts of the temperature anomaly – but it has a different origin than subtracting the temperature observation from a chosen baseline. It is more “granular” than (for example) a LOESS smoother. However, it misses mid-century cooling, which did not generate any cold monthly records.
  • It is difficult to reconcile Figure 2 with the expectation of a long term average of 0, if the record months are occurring randomly and in equal proportions. The mathematics to prove this is a bit tougher, so we will not go into that level of detail.
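For the curious, the mathematics being alluded to is classical record theory: for independent, identically distributed values the probability that the n-th observation sets a new record is 1/n, so the expected number of records in n years is the harmonic number H_n ≈ ln n + 0.577. A quick numerical check (illustrative only, not from the post):

```python
import math

def expected_records(n):
    """Expected number of record highs among n iid observations (harmonic number H_n)."""
    return sum(1.0 / k for k in range(1, n + 1))

# Over the 131 years 1880-2010, a single trendless monthly series should set
# only about five or six new maximum records in total:
print(round(expected_records(131), 2))   # ≈ 5.46
# and the chance that the final year breaks the record is only 1/131:
print(round(1 / 131, 3))                 # ≈ 0.008
```

An accelerating count of new maxima, as in Figure 3, is therefore very hard to square with a stationary climate.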

 Figure 3 is a chart of the running total of new annual maximum monthly records, starting with the 1955 value set to 0. Note the non-linear, increasing trend: in each successive 10-year division, more records occur.


Figure 3: Cumulative Change in Annual Average Maximum Monthly Records since 1956. The 1955 value is set = 0.


  • It is possible to fit a function to the curve and use the model to predict the rate of occurrence of future new records. The mathematics of the curve fitting will not be described.
  • The rates are estimated from the fitted function, for different decades, in new maximum monthly records (r) per year:
    • 1960-1970            0.56r/yr
    • 1970-1980            0.94r/yr
    • 1981-1990            1.27r/yr
    • 1991-2000            1.56r/yr
    • 2001-2010            1.81r/yr
  • To understand the previous table better, in the decade 1960-1970, new maximum monthly records occurred on average about once every 21 months (=12 x 1/0.56). In the decade 2001-2010, they occurred on average every 7 months (=12x1/1.81).
  • Since each new maximum record exceeds the previous one by a small increment that reflects the underlying temperature rise, the average warming rate can be estimated from the record data. Let ∆T = the average temperature rise per new maximum record. Then Temperature Rate = ∆T x Rate of Occurrence of Records.
  • Plugging in ∆T=0.011C (estimated from the temperature record), the following values are estimated for temperature increase in degrees C per decade:
    • 1960-1970            0.07C/decade
    • 1970-1980            0.10C/decade
    • 1981-1990            0.14C/decade
    • 1991-2000            0.17C/decade
    • 2001-2010            0.20C/decade
  • Predictions for the next decade (assuming continuance of current conditions):
    • 2020 Rate = 2.33r/yr
    • 2020 Rate of Temperature Increase = 0.26C/decade
    • The probability of 2011 not having a new record month is 0.09
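The arithmetic behind these bullet points can be reproduced directly. Treating record arrivals as a Poisson process at rate r per year, the average spacing is 12/r months, the implied warming rate is ∆T x r, and the probability of a year with no new record is e^(−r). This is a sketch under those assumptions; the 2011 rate of about 2.4 records/yr is inferred from the quoted 0.09, not stated in the post:

```python
import math

dT = 0.011  # average temperature rise per new maximum record, deg C (from the post)

decade_rates = {          # new maximum records per year, from the fitted curve
    "1960-1970": 0.56,
    "1970-1980": 0.94,
    "1981-1990": 1.27,
    "1991-2000": 1.56,
    "2001-2010": 1.81,
}

for decade, r in decade_rates.items():
    spacing_months = 12 / r              # average gap between records
    warming = dT * r * 10                # deg C per decade
    print(f"{decade}: one record every {spacing_months:.0f} months, "
          f"{warming:.2f} C/decade")

# Poisson check: P(zero records in a year) = exp(-r).
# A fitted 2011 rate of roughly 2.4 records/yr reproduces the quoted probability:
print(round(math.exp(-2.4), 2))   # ≈ 0.09
```

The first decade comes out at 0.06 °C/decade rather than the 0.07 quoted above, presumably because the rate 0.56 is itself rounded.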

This basic, and even crude, analysis confirms the model of temperature rise given by mainstream climate science.  That is no surprise. However, it can be expanded to incorporate natural variation (factors like ENSO and volcanic eruptions) using methods like logistic regression, which is more robust than ordinary least squares. The advantage of this method is that the mathematics of a noisy temperature process has been replaced by the mathematics of a simple stochastic process. Stochastic processes are well understood and used in many situations like monitoring time between crashes of a computer system (in software reliability engineering) or time between events (in health survival analysis).

This analysis undermines, yet again, many of the simplistic contrarian models, e.g. that natural variability is driving warming, or that the earth has been cooling in the period 1998-2002. As Professor Richard Lindzen said: “Temperature is always rising and falling”. However, that implies an equalization of maximum and minimum monthly records over a long period. The number of minimum monthly records in these global temperature databases has not come close to the number of monthly maxima for some time. The last such sequence in these databases ended in 1917, almost one hundred years ago. The current rate of occurrence of minimum records is 0 per year, while the rate for maximum records consistently outstrips it by almost 2 per year, and rising.

Further Reading:

Benestad (2003), How often can we expect a record event?

Wergen (2010), Record-breaking temperatures reveal a warming climate.

Frei (2001), Detection probability of trends in rare events: theory and application to heavy precipitation in the Alpine region.


Comments 51 to 60 out of 60:

  1. I was trying to discuss statements made in your article and in your comments. Such as the statement in your article that "This analysis undermines, yet again, many of the simplistic contrarian models e.g. that natural variability is driving warming," Please explain how your analysis relates to attribution of the detected warming. You refer back to your comment #46. "In other words, we must look to processes that warm or cool the globe to explain the excursions in Figures 2 and 3. The conventional wisdom is that (human induced) CO2 warming did not set in on a large scale until the 1970s, whereas warming earlier in the century was due to other (natural) variations. There is nothing explicit in the chart to upset that view." I agree with your observations about conventional wisdom, but was hoping for something in your analysis that actually supported that conventional wisdom rather than merely being consistent with it; or an analysis that merely failed to explicitly disprove the conventional wisdom.
  2. Charley A #51 As I said above "the data are what the data are". This alternative look at the temperature record in the main supports the generally accepted views of climate science.
  3. Record-breaking events can also inform trends in variability. Those interested in this topic may find the following paper interesting. Anderson, Amalia, Alexander Kostinski, 2010: Reversible Record Breaking and Variability: Temperature Distributions across the Globe. J. Appl. Meteor. Climatol., 49, 1681–1691. doi: 10.1175/2010JAMC2407.1
  4. @ #53 Anderson -- that's a very interesting paper with a method of analysis I've not seen before. The concept of doing record-breaking analysis while moving back in time is one of those "why didn't I think of that!" ideas. The rather startling conclusion about decreasing variability of monthly average temperatures does not fit with the consensus thinking about the effects of global warming. More concretely, the paper finds about a 0.2C decrease over 106 years in the standard deviation of GHCN monthly temperatures, which have a starting deviation of about 1.8C. A non-trivial observation that doesn't seem to have gotten the appropriate amount of attention. Although the method does require detrending of the data, the detection of variability trends seems very robust to the detrending method chosen. I foresee this method being used to analyze other parameters where variability is of great interest, such as precipitation. I note that the paper took the GHCN monthly record as it stands, and there is no discussion of whether the observed 10% decrease in variability in the GHCN monthly temperature record is due to a true reduction in variability of temperatures or whether the observed decrease is merely an artifact of changes in measurement and record keeping.
  5. Anderson #53, Thanks for the reference.
  6. I should add that the second reference above - Wergen(2010) - also contains "reversible time" analysis.
  7. Charlie A: Your last observation gets to the heart of interpreting that paper: "I note that the paper took the GHCN monthly record as it stands, and there is no discussion of whether the observed 10% decrease in variability in the GHCN monthly temperature record is due to a true reduction in variability of temperatures or whether the observed decrease is merely an artifact of changes in measurement and record keeping." Further research is currently being done with the goal of answering that question. Shoyemore: Indeed, Wergen(2010) also contains "reversible time" analysis. It has been used occasionally throughout the history of record-breaking analysis, e.g. Benestad, RE. 2004. Record-values, nonstationarity tests and extreme value distributions. Global and Planetary Change 44:11–26. I think Benestad may have been the first to use "reversible time" analysis in studying temperature, though it was proposed as a statistical technique prior to this.
  8. This whole article, scientifically, is trash. There is no real quantification of cooling or warming that is occurring, just maximums or minimums. Using such data is hogwash. For instance, in the U.S. alone there has been over two dozen double record days where it was both the hottest and coldest day on record. You have to have quantification of the amount of heat vs time. This is not being done and people just don't understand how a year with everyday a broken heat record can actually be colder.
    Moderator Response: [DB] Thank you for stating your utterly unscientific, yet fascinating, opinion.
  9. Uh, Henry, no-one says this is the only line of evidence that warming is occurring... the mean temperature has also been rising, the trend in extremes has merely been consistent with this.
  10. Henry @58 "You have to have quantification of the amount of heat vs time. This is not being done" Yes it is! You might like to look at this Empirical Evidence of Warming Scroll down to the section on Total Heat Content and the study by Murphy et al.
