Extreme Events Increase With Global Warming
Posted on 11 November 2011 by Rob Painting
For a more basic version of this post, see here
That continued warming of the Earth will cause more frequent and intense heatwaves is hardly surprising, and has long been an anticipated outcome of global warming. Indeed the 1990 Intergovernmental Panel on Climate Change (IPCC) Policymakers Summary stated that:
"with an increase in the mean temperature, episodes of high temperatures will most likely become more frequent in the future, and cold episodes less frequent."
One such "high temperature episode" was the monster summer heatwave centred near Moscow, Russia in 2010, where temperatures rocketed well above their normal summertime maximum and were so record-shattering they may have been the warmest in almost 1000 years.
Rahmstorf and Coumou (2011) developed a statistical model and found that the likelihood of record-breaking extremes depends on the ratio of the trend (warming or cooling) to the year-to-year variability in the record of observations. They tested this model by analysing global temperatures and found that warming increased the odds of record-breaking. When applied to the July 2010 temperatures in Moscow, they estimated an 80% probability that the heat record would not have occurred without climate warming.
Figure 1 - Probability of July average temperature anomalies in Moscow, Russia since 1950. This image shows that the average temperature in Moscow for July 2010 was significantly hotter than in any year since 1950. Credit: Claudia Tebaldi and Remik Ziemlinski. From ClimateCentral.org
Some statistical background
Earlier statistical work on record-breaking events has shown that for any time series that is stationary (i.e. has no trend), the probability of record-breaking falls with each subsequent observation. This is known as the 1/n rule, where n is the number of observations made so far. For example, the first observation has a 1-in-1 chance of being the record extreme (100%), the second has a 1-in-2 chance (50%), the third a 1-in-3 chance, and so on.
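A quick way to see the 1/n rule in action is to simulate a large number of trend-free series and count how often the n-th observation sets a new record. The short sketch below is my own illustration (not code from the paper), using Gaussian noise; the simulated record probabilities converge on 1/n.

```python
import numpy as np

rng = np.random.default_rng(42)

n_series = 100_000   # number of simulated trend-free (stationary) series
length = 10          # observations per series

# Each row is one stationary series of Gaussian 'weather' noise
x = rng.standard_normal((n_series, length))

# An observation sets a record when it equals the running maximum of its series
is_record = x >= np.maximum.accumulate(x, axis=1)

for n in range(1, length + 1):
    simulated = is_record[:, n - 1].mean()
    print(f"observation {n}: simulated record probability {simulated:.3f}  (1/n = {1/n:.3f})")
```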
Climate heat records do not follow this stationary behaviour, as one might expect in a warming world. Previous work in this area has shown that the slowly warming mean (average) temperature is responsible for this nonstationarity.
Monte Carlo
Following on from this earlier work, Rahmstorf and Coumou (2011) sought to disentangle the two effects of the change in mean temperature (climate warming) and the random fluctuations of weather, so as to find out the contribution of each to record-breaking. To do this, the study authors turned to Monte Carlo simulations. These are computer-generated calculations which use random numbers to obtain robust statistics. A useful analogy here is rolling a die. Rolling once tells us nothing about the probability of a six turning up, but roll the die 100,000 times (as in this experiment) and you can calculate the odds of rolling a six.
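To make the analogy concrete, a single roll tells us nothing, but a large number of simulated rolls recovers the 1-in-6 odds of a six. A minimal sketch of that idea (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)

rolls = rng.integers(1, 7, size=100_000)   # 100,000 rolls of a fair six-sided die (values 1-6)
prob_six = (rolls == 6).mean()

print(f"estimated probability of a six: {prob_six:.4f}  (exact value 1/6 = {1/6:.4f})")
```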
From the simulations the authors obtain 100 values representing a 100-year period. Figures 2(A) to 2(C) are the "synthetic" time series, while 2(D) and 2(E) are, respectively, the observed global mean and Moscow July temperatures. In all panels the data have been put into a common reference frame (normalized) for comparison. (See figure 1 for an example of a Gaussian or normal distribution. Noise represents the year-to-year variability.)
Figure 2 - Examples of 100-year time series of temperature, with unprecedented hot and cold extremes marked in red and blue. A) Uncorrelated Gaussian noise of unit standard deviation. B) Gaussian noise with an added linear trend of 0.078 per year. C) Gaussian noise with a non-linear trend added (smooth of GISS global temperature data). D) GISS annual global temperature for 1911-2010 with its non-linear trend. E) July temperature at Moscow for 1911-2010 with its non-linear trend. Temperatures are normalized with the standard deviation of their short-term variability (i.e. put into a common frame of reference). Note: For the Moscow July temperature (2E) the long-term warming appears to be small, but this is only because the series has been normalized with the standard deviation of that record's short-term variability. In other words it simply appears that way because of the statistical scaling approach - the large year-to-year variability in Moscow July temperatures makes the large long-term increase (1.8°C) look small when both are scaled. Adapted from Rahmstorf & Coumou (2011).
Follow steps one.....
Initially, the authors ran the Monte Carlo simulations under three different scenarios: no trend (2[A]), a linear trend (2[B]), and a nonlinear trend (2[C]). The results agree with previous statistical studies of record-breaking, namely that with no trend the probability of record-breaking falls with each observation (the 1/n rule), while with a linear trend the probability of record-breaking initially falls but eventually levels off, so that records go on accumulating at a steady (linear) rate. With a non-linear trend (2[C]), the simulations show behaviour characteristic of both the no-trend and linear-trend distributions. Figures 2(D) and 2(E) are the actual GISS global and Moscow July temperatures respectively.
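The behaviour of the three synthetic scenarios can be mimicked with a few lines of code: Gaussian noise alone, noise plus a linear trend of 0.078 standard deviations per year, and noise plus a nonlinear trend. The sketch below is only an illustration under those assumptions (it uses a smooth quadratic as a stand-in for the actual nonlinear climate trend), and simply counts hot records per century in each case.

```python
import numpy as np

rng = np.random.default_rng(1)

def count_hot_records(series):
    """Count how many values exceed all earlier values in the series."""
    return int(np.sum(series >= np.maximum.accumulate(series)))

years = np.arange(100)            # a 100-year synthetic period
sims = 20_000

totals = {"no trend": 0, "linear trend": 0, "nonlinear trend": 0}
for _ in range(sims):
    noise = rng.standard_normal(100)                                    # unit-variance 'weather' noise, as in Fig. 2(A)
    totals["no trend"] += count_hot_records(noise)
    totals["linear trend"] += count_hot_records(noise + 0.078 * years)  # as in Fig. 2(B)
    # Stand-in nonlinear trend: flat early on, accelerating warming later
    totals["nonlinear trend"] += count_hot_records(noise + 0.078 * years**2 / 99)

for name, total in totals.items():
    print(f"{name:16s} mean hot records per century: {total / sims:.2f}")
```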
.....two.....
Next, the authors looked at both the GISS global and Moscow July temperature series to see whether they exhibited a Gaussian-like distribution (as in the 'lump' in figure 1). They did, which supports earlier studies indicating that temperature deviations fluctuate about, and shift with, a slowly moving mean (as in the warming climate). See figure 3 below.
Figure 3 - Histogram of the deviations of temperatures of the past 100 years from the nonlinear climate trend lines shown in Fig. 2(D) and (E), together with the Gaussian distributions with the same variance and integral. (a) Global annual mean temperatures from NASA GISS, with a standard deviation of 0.088°C. (b) July mean temperature at the Moscow station, with a standard deviation of 1.71°C. From Rahmstorf & Coumou (2011).
.....and three.
Although the authors calculate probabilities for a linear trend, the actual trend for both the global and Moscow July temperature series is nonlinear. They therefore separated out the long-term climate signal from the weather-related annual temperature fluctuations, which gave them a climate 'template' on which to run Monte Carlo simulations with 'noise' of the same standard deviation (the spread of annual variability about the mean). This is a bit like giving the Earth the chance to roll the 'weather dice' over and over again.
From the simulations with and without the long-term trend, the authors could then observe how many times a record-breaking extreme occurred.
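In code, this comparison amounts to running the same Monte Carlo twice, once with the climate 'template' added to the noise and once with the noise alone, and counting how often the last decade produces a new record. The sketch below is a simplified illustration, not the authors' method in detail: it uses a linear stand-in for the actual nonlinear Moscow trend, with roughly the observed 1.8°C century warming and 1.7°C standard deviation, so the numbers will not match the paper's exactly.

```python
import numpy as np

rng = np.random.default_rng(2)

sims = 50_000
years = np.arange(100)
trend = 1.8 * years / 99          # ~1.8 °C warming over the century (linear stand-in for the template)
sigma = 1.7                       # standard deviation of Moscow July temperatures (°C)

def records_in_last_decade(series):
    """Number of new hot records set in the final 10 years of the series."""
    running_max = np.maximum.accumulate(series)
    return int(np.sum(series[-10:] >= running_max[-10:]))

with_trend = without_trend = 0
for _ in range(sims):
    noise = sigma * rng.standard_normal(100)        # roll the 'weather dice'
    with_trend += records_in_last_decade(trend + noise)
    without_trend += records_in_last_decade(noise)

expected_with = with_trend / sims
expected_without = without_trend / sims
print(f"expected last-decade records, with trend:    {expected_with:.3f}")
print(f"expected last-decade records, without trend: {expected_without:.3f}")
print(f"fraction attributable to the trend:          {1 - expected_without / expected_with:.2f}")
```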
Figure 4 - Expected number of unprecedented July heat extremes in Moscow for the past 10 decades. Red is the expectation based on Monte Carlo simulations using the observed climate trend shown in Figure 2(E). Blue is the number expected in a stationary climate (the 1/n law). Warming in the 1920s and 1930s, and again in the last two decades, increases the expectation of extremes during those decades. From Rahmstorf and Coumou (2011).
Large year-to-year variability reduces probability of new records
A key finding of the paper was that large year-to-year fluctuations in a set of observations with a long-term trend act to reduce, rather than increase, the odds of a record-breaking event, because the expected number of new records depends on the ratio of the trend to the standard deviation. The larger the standard deviation, the smaller the probability of a new extreme record.
This can be seen by comparing the standard deviations of the NASA GISS and Moscow July temperature records (figure 3, along with figures 2[D] and 2[E]). Because the GISS global temperature record has a smaller standard deviation (0.088°C), owing to its smaller year-to-year fluctuations in temperature, you will note in figure 2(D) that it has a greater number of record-breaking extremes than the Moscow July temperature record over the same period (2[E]). So, although the long-term temperature trend for Moscow in July is larger (100-year trend = 1.8°C), so too is the annual fluctuation in temperature (standard deviation = 1.7°C), which results in a lower probability of record-breaking warm events.
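This can be checked directly: two series with broadly similar trends but different levels of year-to-year noise produce very different numbers of records, because what matters is the ratio of trend to standard deviation. A small sketch, with illustrative values only, loosely mirroring the global and Moscow figures above:

```python
import numpy as np

rng = np.random.default_rng(3)

def mean_hot_records(trend_total, sigma, sims=20_000, length=100):
    """Average number of hot records per century for a linear trend plus Gaussian noise."""
    years = np.arange(length)
    trend = trend_total * years / (length - 1)
    total = 0
    for _ in range(sims):
        series = trend + sigma * rng.standard_normal(length)
        total += int(np.sum(series >= np.maximum.accumulate(series)))
    return total / sims

# Small variability, GISS-like: modest century trend but tiny standard deviation
print("trend 0.8 °C, sigma 0.09 °C:", round(mean_hot_records(0.8, 0.09), 2))
# Large variability, Moscow-July-like: bigger trend but much bigger standard deviation
print("trend 1.8 °C, sigma 1.7 °C: ", round(mean_hot_records(1.8, 1.7), 2))
```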
Interestingly, it's now plain to see why the MSU satellite temperature record, which has large annual variability (a large standard deviation, possibly from being overly sensitive to La Niña and El Niño atmospheric water vapor fluctuations), still has 1998 as its warmest year, whereas GISS has 2005 as its warmest year (2010 was tied with 2005, so is not a new record). Even though the trends are similar in both records, the standard deviation is larger in the satellite data and therefore the probability of record-breaking is smaller.
That other Russian heatwave paper
The results of this paper directly contradict the findings of Dole (2011), who discounted a global warming connection with the 2010 Russian heatwave. Rahmstorf & Coumou (2011) demonstrate that the warming trend from 1980 onwards (figure 4) greatly increased the odds of a new record-breaking warm extreme, and that such a record should in fact have been anticipated.
It turns out that Dole (2011) failed to account for a quirk in the GISS Moscow station data, in which the annual urban heat island (UHI) adjustment had wrongly been applied to the monthly July temperatures, even though UHI is a winter phenomenon in Moscow. This meant that the large warming trend evident there in July had been erroneously removed. This was confirmed by looking at Remote Sensing Systems (RSS) satellite data over the last 30 years, which show a strong warming trend in Moscow July temperatures. See the Real Climate post "The Moscow Warming Hole" for detail, and note Figure 5 below.
Figure 5 - Comparison of temperature anomalies from RSS satellite data (in red) over the Moscow region versus Moscow station data (in blue). Solid lines show the average July value for each year, whereas the dashed lines show the linear trend for 1979-2009. The satellite data have a trend of 0.45°C per decade, compared to 0.72°C per decade for the Moscow station data.
Whatchoo talkin' bout Willis?
Using the GISS data from 1911-2010 (a nonlinear trend), the authors calculate an 88% probability that the extreme Russian heatwave (a record in the last decade of the series) was due to the warming trend. Clearly the July 2010 temperatures in Moscow were a massive departure from normal, and including them would create bias, so the study authors exclude 2010 from their analysis and re-calculate for 1910-2009. They found a 78% probability that the freak heatwave was due to warming, and extending their analysis back to include the entire GISS temperature record (1880-2009), they found an 80% probability.
So to sum up:
- Rahmstorf and Coumou (2011) is a statistical/analytical study, which does not look at the physical causes of the 2010 Russian heatwave. Instead they assess the likelihood of record-breaking extremes.
- Based on earlier work, and confirmed by the Monte Carlo simulations: in a climate with no trend (no long-term warming or cooling), the probability of record-breaking extremes falls over time (perhaps contrary to popular belief).
- With a warming (linear) trend, the rate of record-breaking warm extremes initially falls, but eventually settles so that records accumulate steadily (linearly) over time.
- With a nonlinear trend (warming/stagnation intervals) the probability of record-breaking is a combination of the no-trend and linear-trend scenarios.
- Larger annual fluctuations (a larger standard deviation) reduce the number of records, whereas smaller fluctuations increase the number of record-breaking extremes.
- This helps explain why the satellite temperature record (large annual fluctuation) has 1998 as its warmest year, whereas GISS (small annual fluctuation) has 2005 - GISS has a greater likelihood of seeing record-breaking.
- By separating out the random (weather) component and the long-term (warming) component, the authors established there is an 80% probability that the 2010 July temperature record in Moscow would not have happened without climate warming from 1880-2009.
- The record-breaking extreme should have been anticipated, which contradicts earlier work on the Russian heatwave (Dole [2011]).
- In a warming world expect to see more record-breaking warm temperatures.