The climate 'hiatus' doesn’t take the heat off global warming
Posted on 29 April 2015 by Guest Author
by Matthew England, UNSW Australia
The recent slowdown in the rise of global average air temperatures during the first decade of the 21st century is being used as a touchstone argument for those who deny the science of global warming. But in research published today in Nature Climate Change, I and others show that the “hiatus” is just a blip on the radar compared to the long-term warming we have in store.
The slowdown or “hiatus” in warming refers to the period since 2001, when despite ongoing increases in atmospheric greenhouse gases, Earth’s global average surface air temperature has remained more or less steady, warming by only around 0.1C. This contrasts with the 1990s, when warming reached more than 0.2C.
But by looking at around 200 projections from climate models, and separating those that capture the slowdown from those that do not, we have shown that the slowdown does not affect long-term warming projections in any measurable way.
More extreme climate change contrarians have seized upon this so-called “pause” as a sign that global warming has stopped altogether, or that we are headed into a period of global cooling. Considering that global average temperatures are still rising by a variety of measures, just more slowly, and that the last record-cold year was back in 1909, this is quite a leap of faith.
Unfortunately using global average surface air temperatures as a measure of total warming ignores the fact that most of the heat (more than 93%) goes into our oceans, which continue to warm without any sign of a pause, as you can see below.
Brakes on global warming?
Global average surface air temperatures only reflect the heat present in the atmospheric layer immediately above the land/ocean surface. As heat gets sloshed around the oceans by processes such as El Niño, the overlying atmosphere responds by tracking these heat variations.
As a result, surface temperature is strongly affected by natural variability. Beyond year-to-year variability such as El Niño there are decade-to-decade changes, such as the Interdecadal Pacific Oscillation, which has been shown to have a marked impact on global temperature rise.
In particular the negative phase of the Interdecadal Pacific Oscillation can lead to dramatically increased trade winds and fewer El Niños – as has been occurring since 2001. The modulation of these processes can significantly impact global average temperatures.
It is no coincidence that the only year outside the 21st century that remains in the top 10 hottest years on record is 1998. That year saw the strongest El Niño ever recorded – a natural variation that added considerable heat to the 1998 result.
Ever since then climate change deniers have loved to produce graphs that start in 1998, although when 2005 broke through as the new warmest year on record, another new favourite start year was born. Cherry-picking data at its worst.
And just for the record: 2010 then effectively tied with 2005 for the hottest year, and now 2014 has beaten the lot. And all the while the coldest year, 1909, languishes as a fading memory of a time long past, predating humanity’s undeniable footprint on the climate. It’s a record that is certain to stand for a long time to come.
While 1998 was a warm year for the surface atmosphere, in the oceans, where most of the warming is going, the story is different. Looking at ocean temperatures down to 2 km, 1998 does not even make the cut for the top 10. Instead, total annual average ocean heat content has increased steadily during the hiatus, at quite a confronting rate given that this metric is closely tied to global sea-level rise.
The strong influence of natural variability on surface air temperatures is the reason that climate researchers regularly point out that any record shorter than around 20-30 years is not useful for detecting long-term trends associated with anthropogenic warming.
Step back from these short time scales and look at the past century to see the clear human influence.
Sifting the signal from the noise
Climate modellers have run hundreds of future warming simulations. Some of these simulations capture the slowdown if they happen to be in the right phase of the Interdecadal Pacific Oscillation.
But if you average all the model projections, the warming predicted for the past decade outpaces observations. This has been argued as evidence that scientists have overstated the threats posed by increasing concentrations of greenhouse gases.
Yet the climate science community has repeatedly shown compelling evidence that this slowdown reflects decade-to-decade variability.
Until now, however, no evaluation has been made of the possible consequences for long-term projections. Specifically, if the variability controlling the current hiatus is linked to longer-term sequestration of heat into the deep ocean, this might require us to recalibrate future projections.
With this in mind, we decided to test whether 21st century warming projections are altered in any way when considering only simulations that capture a slowdown in global surface warming, as observed since 2001.
As you can see in our paper, we looked at this by separating all available future projections into two groups - those that captured the current slowdown and those that did not. We then compared the warming throughout the 21st century for both groups.
We considered two well-known emissions scenarios taken from the latest IPCC report. The first scenario assumes greenhouse gas concentrations continue to rise unabated through the 21st century. The second assumes emissions are reduced to address global warming, peaking by 2040 before declining sharply.
It turns out that the difference between the “slowdown” set and the rest of the projections was negligible by the mid-21st century. And the longer the projections go on, the smaller the difference gets.
Under the high emissions scenario, for example, the difference in average projected end-of-century warming between the two groups of models is less than 0.1C: a tiny fraction of the projected 5C global warming if emissions are not curbed.
This clearly shows that the impact of the current hiatus is effectively non-existent in the context of long term warming.
Don’t wait for the warming to return
It is not news to climate scientists that natural variability impacts the short-term movement of global average temperatures in both directions. We have seen short-term pauses and even short-term reversals in global temperature rise in the past century, against the backdrop of an unambiguous long term warming trend.
We have also seen periods of more rapid surface atmosphere warming. In effect the surface warming comes in spurts, facilitated by phase shifts in the Interdecadal Pacific Oscillation.
And over time, despite the cycles of warming and cooling due to this oscillation, despite solar minima, despite cooling from volcanic eruptions and cooling from the massive loading in the atmosphere of anthropogenic aerosols, the trend in temperatures above the noise clearly continues upwards.
Our research shows that limiting the projections to models that capture the current slowdown in no way alters long-term projections. We can thus place extra confidence in the synthesised projections of the IPCC.
If we continue on our current high emissions trajectory the world will have no chance of staying under the 2C threshold that the federal government has committed to.
Limiting warming to 2C is a laudable aspiration – one that gives us a decent chance (not certainty) of avoiding dangerous interference with the climate system. Unfortunately if we hang around talking about a 15-year slowdown for too long, the chance to limit warming to less than the 2C threshold will rapidly disappear.
This article was originally published on The Conversation. Read the original article.
Professor England says
"As a result, surface temperature is strongly affected by natural variability. Beyond year-to-year variability such as El Niño there are decade-to-decade changes, such as the Interdecadal Pacific Oscillation, which has been shown to have a marked impact on global temperature rise.
In particular the negative phase of the Interdecadal Pacific Oscillation can lead to dramatically increased trade winds and fewer El Niños – as has been occurring since 2001. The modulation of these processes can significantly impact global average temperatures".
This suggests that factors other than human activity have an effect on global temperatures. But what percentage of global atmospheric temperature/ocean temperature change is due to natural factors and what to human activity? Without knowing this, how can the extent of the action appropriate to limit human impact be determined with any degree of accuracy? And why is the sea temperature given in joules rather than in degrees C, or conversely, why is surface temperature given in degrees C rather than joules? And can ocean temperatures be accurately measured to fractions of a degree, or to 100 or so zettajoules?
Sea "temperature" is not given in joules, ocean heat content is.
ryland - Attribution studies, looking at how much of recent climate change is due to us and how much is due to natural variations, are a very significant aspect of climate study. And multiple studies using different methods have come to the conclusion (see figure 2) that we are responsible for more than 100% of recent warming (with a small cooling influence from natural forcings).
See also an excellent overview of the current understanding of attribution at Realclimate, summarizing the information from IPCC AR5 - the best guess is that anthropogenic forcings are responsible for 110% of warming in the last half century, with only a 5% chance of the attribution being less than 50%. In other words, a 95% chance, given the evidence, that we are the dominant cause of recent warming.
To the extent we wish to minimize climate change, we have to address our responsibility for it.
ryland - In response to your second, and quite unrelated, question - "...can ocean temperatures be accurately measured to fractions of a degree or to 100 or so zettajoules?" - the answer is simply Yes, we can.
The opinion that we cannot measure to the stated level of accuracy is a common misconception - from the Law of Large Numbers we find that as the number of observations increases, the accuracy of the observations as a whole converges towards the real answer, regardless of the resolution of the individual measurements. Roll a sufficiently large number of dice, and the average of the die values will converge towards 3.5, with precision increasing along with the number of dice rolled.
"This suggests that factors other than human activity have an effect on global temperatures".
Well duh! Is someone suggesting that there aren't other factors? However, England is talking about the yearly-to-decadal variations in surface temperature, which are strongly dominated by internal variability due to an unevenly heated, water-covered planet. Climate is by definition about the 30-year averages. The currently observed change in climate (i.e. the long-term trend) since the 19th century, however, is mostly if not all due to human activity.
Surface temperature is important because that is what we experience, but it is also prone to high levels of variability. As the article points out, you can look at other metrics instead - OHC, sea level, global glacial mass - which have a much lower degree of internal variability.
Thank you for your "Well duh!". I trust that if you are participating in the MOOC your responses to students will be somewhat less denigrating, even if you regard questions asked/comments made as infantile. Having been a university professor for a very long time, I can assure you students regard responses such as "Well duh!" very unfavourably indeed.
ryland: "This suggests that factors other than human activity have an effect on global temperatures."
No one suggests that there aren't factors other than human activity that can affect global temperature. Except those setting up straw men to knock down, of course.
That said, other than an increase in solar intensity (the sole external source of energy) - which has not happened; indeed the recent trend has been the opposite - none of the internal variability factors that affect global temperature could have produced the monotonic warming we have observed over 35 years. On the contrary, they all eventually revert to the mean.
ryland - If your 15:41 comment is directed to me (that isn't clear), and I have been overly brusque, my apologies. However, I have frequently run across the assertion that it's impossible to measure temperature anomalies, sea level rise, and the like to the accuracies stated in peer-reviewed literature, and that assertion is incorrect.
The math behind the Law of Large Numbers goes back to Jacob Bernoulli in 1713, and is based on the statistics of measurements and random errors. A sufficient number of measurements describes a probability function around the correct value, and the more measurements you make the tighter the bounds of that probability, the higher the accuracy. That accuracy rapidly becomes smaller than the precision of any individual measurement. Given 'n' measurements and a measurement error with a standard deviation of 'S', the uncertainty scales with the number of measures by:
uncertainty = S / √n
Again, the simple case of dice is illustrative. If you roll a die five times, you might get the numbers 2, 5, 3, 6, 5, with a mean value of 4.2. But as you roll the die over and over, the measured mean value (assuming, of course, that the die isn't loaded) will converge towards the real average of 3.5. After 10,000 rolls the uncertainty in the estimate of the mean will be 100 times smaller than the standard deviation of a single die roll, far below the single-digit resolution of the die faces.
If the die is loaded, the measured mean value will be different, providing a reasonable test of whether or not the die is fair.
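For anyone who wants to see this convergence for themselves, here is a minimal simulation of the dice example (an illustrative sketch, assuming only Python with numpy; it is not from KR's original comment):

```python
# Illustrative sketch of the Law of Large Numbers with dice (assumes numpy).
# The uncertainty of the mean shrinks as S / sqrt(n), far below the
# one-unit resolution of an individual die face.
import numpy as np

rng = np.random.default_rng(42)
faces = np.arange(1, 7)
s_single = np.sqrt(np.mean((faces - faces.mean()) ** 2))  # ~1.708 for a fair die

for n in (5, 100, 10_000, 1_000_000):
    rolls = rng.integers(1, 7, size=n)  # n rolls of a fair die
    predicted = s_single / np.sqrt(n)   # uncertainty = S / sqrt(n)
    print(f"n={n:>9,}  mean={rolls.mean():.4f}  predicted uncertainty={predicted:.4f}")
```

After 10,000 rolls the predicted uncertainty is about 0.017 - two orders of magnitude below the resolution of a single roll, exactly as described above.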
Of course, the measurements might be biased high or low, which would be a systematic error. But additive systematic errors (offsets) are wholly cancelled out by looking at changes, at the anomalies. Systematic errors in scaling (such as XBT issues with speed of descent for ocean heat content) can be identified by proper calibration and cross comparison with other measures such as ARGO - and once found they can be corrected to produce a consistent and accurate record.
KR@8 Thanks for your courtesy and especially for your reply. Both are much appreciated. No, I wasn't referring to you, as your post at 4 was both innocuous and informative. Additionally, from your responses at 4 and 8, I doubt very much that you would even consider making derogatory remarks.
ryland: "This suggests that factors other than human activity have an effect on global temperatures."
The idea that only human activity has an effect on global temperatures would mean that the production of CO2 on Earth could somehow eliminate the 11-year solar cycle on the Sun, and ensure that volcanic activity on Earth suddenly became uniform. Such an implausible hypothesis is perhaps the reason that your comment received the "Well duh!" response.
Phil@10 You may possibly be right. However, if "well duh" is the accepted way on SkS of addressing posters who are not climate scientists and use imprecise sentences, then so be it. That said, "well duh" may not be the best way to get the recipient of the post to immediately see the validity and irrefutable logic of a particular argument. Moving on: KR, as you suggested I am reading the views of both Gavin Schmidt and Judith Curry on attribution. Hard for me to digest in a single sitting, but at the risk of another "well duh", a significant difference seems to be Curry's use of 30-year periods and Schmidt's dismissal of these as being too short to accurately evaluate the effects of natural factors. By the way, I didn't see a single "well duh" from either Schmidt or Curry.
[TD] For discussions of attribution, the "It's Not Us" thread would be better than this one. Note there are three tabbed panes--Basic, Intermediate, and Advanced.
It can be challenging to avoid becoming short with new participants, especially if the points raised have been discussed at great length in previous discussions with others. There are numerous pseudo-skeptic arguments that are considered 'zombies' because they keep being raised despite repeated refutation, and quite frankly the question of attribution for climate change is one of those (see the It's not us thread, ranked as the 56th most common climate denial myth). After a while it becomes tiring to refute a poor assertion for the Nth time...
My personal approach is to point new participants to the relevant information as much as possible - and save sharper tones for those who continue to repeat incorrect assertions in the presence of evidence to the contrary.
ryland, I've responded to you on the It's Not Us thread.
The ocean heat content graphic shows an increase over about 50 years of about 30x10^22 Joules for the top 2 km. What percent is this of the total? Is there an estimate for the total heat content of the ocean?
Ryland, please accept my humble apologies for the "well duh". As a some time moderator on this site, my fault was doubly bad. It was an unfortunate knee-jerk reaction to what appeared to me as a straw-man argument.
WxChief - please note the OHC quoted is actually the change in OHC since a baseline (the average of 1955-2006). There is very sparse data below 2000 m, but given the mixing mechanisms available, the change there is not expected to be large. Furthermore, it is constrained by steric sea-level rise (the Trenberth 2009 paper looked at this accounting exercise).
scaddenp - Is there an accepted value for the average 1955-2006 OHC?
The ups and downs, pauses and accelerations come from surface air temperatures being a consequence of sea surface temperatures, which are variable over decades due to ocean currents and overturning - i.e. things like ENSO and the PDO. Warming overlays this variability.
But, presuming I read the article correctly, if we are waiting for the "hiatus" to end in a hockey stick, it will probably be after the PDO shifts phase? I suggest - based on Foster and Rahmstorf's "Global Temperature Evolution" - that even a 'less likely due to PDO' El Niño or two will send us into record temperatures even without that shift. A positive PDO will just make the upturn more consistent and steeper in the short term.
Don't take this as gospel, but I think it is more complicated than that. An absolute OHC doesn't make a lot of sense. With respect to what? Absolute zero? The freezing temperature of water? What would you do with this number?
What I believe is actually calculated is the change of temperature against the average 1955-2006 temperature. That temperature change, together with the mass and heat capacity of seawater, is then used to calculate the change in OHC. This is a useful number.
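To make the arithmetic concrete, here is a back-of-envelope version of that calculation (a hedged sketch using textbook round numbers, not values from England's paper or any particular dataset):

```python
# Rough sketch: change in OHC = mass x specific heat x temperature change.
# All constants are textbook approximations, not values from the paper.
AREA = 3.6e14    # m^2, approximate global ocean surface area
DEPTH = 2000.0   # m, the 0-2000 m layer discussed in this thread
RHO = 1025.0     # kg/m^3, typical seawater density
CP = 3990.0      # J/(kg K), typical specific heat of seawater

mass = AREA * DEPTH * RHO          # ~7.4e20 kg of seawater
delta_ohc = 30e22                  # J, the ~50-year OHC rise quoted above
delta_t = delta_ohc / (mass * CP)  # implied average warming of the layer

print(f"implied mean warming of the 0-2000 m layer: {delta_t:.2f} K")  # ~0.1 K
# An enormous amount of energy corresponds to a small average temperature
# change, which is why OHC is reported in joules rather than degrees.
```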
Picking up on ryland's first post: Is an "Overall Weighted Average Sea+Land Temperature" published anywhere, calculated based on the following?
OWASLT = Sum(Temp x Mass x Heat Capacity) / Sum(Mass x Heat Capacity), looking at all mass components in the atmosphere plus the mass in the ocean (say down to 2000 m, or whatever depth would be appropriate with respect to available global data and that should rightfully be included for an all-inclusive weighted average temperature like this).
By combining everything into one OWASL temperature, this would remove all the surface-only decadal swings caused by heat sloshing back & forth between land & deep sea.
If total sea OHC is known (which is simply = temp x mass x Cp), then obviously sea temp is also known for all global sea locations and for all sea depths. Therefore, deriving the above OWASL temperature is 'doable'.
If sea OHC has been rising continually & steadily, and since this mass x Cp is 93% of the total mass x Cp of the globe, then obviously if an OWASLT were reported, it would also climb steadily year after year at the same steady rate (even during the last 10 years, when the rate-of-rise of surface-only temps was less than in the 1990's). Having such a metric would remove all the doubt that comes from looking only at the surface temp, and remove all contrarian ammunition, because there would be no hiatus in temperature rise using a metric like this.
Does anyone calculate & report an OWASL temp like this as their single, all inclusive metric number?
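For what it's worth, here is a quick sketch of the weighting arithmetic behind this proposal (a hypothetical metric; the masses and heat capacities are standard round numbers, and the anomalies are invented purely to show the weighting effect):

```python
# Hypothetical OWASLT arithmetic (illustrative only; the anomalies are made up).
M_ATM, CP_ATM = 5.1e18, 1004.0  # kg and J/(kg K) for the whole atmosphere
M_OCN, CP_OCN = 7.4e20, 3990.0  # kg and J/(kg K) for the 0-2000 m ocean layer

w_atm = M_ATM * CP_ATM
w_ocn = M_OCN * CP_OCN
print(f"ocean share of the combined heat capacity: {w_ocn / (w_atm + w_ocn):.1%}")

# A made-up year: surface anomaly swings to +0.2 K while the ocean layer sits at +0.05 K
t_atm, t_ocn = 0.2, 0.05
owaslt = (w_atm * t_atm + w_ocn * t_ocn) / (w_atm + w_ocn)
print(f"OWASLT anomaly: {owaslt:.4f} K")  # dominated almost entirely by the ocean term
```

The ocean term dominates so completely that surface swings barely move the combined number - which is the point of the proposal, but also, as the replies below note, why such a metric hides the surface warming that matters most to us.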
scaddenp @19, absolute heat content makes sense with respect to the heat content of the components at absolute zero temperature in a solid state. That is part of the definition of absolute zero. I agree, however, that the OHC anomaly is a far more useful value. Also, it can be determined with a much lower error margin.
sauerj @20, your OWASLT would closely approximate the "average temperature increase in the oceans" which deniers love to quote because its low numerical value is wonderfully deceptive. Here is an example from Bob Tisdale:
It is deceptive because nobody would care if the surface was warming as slowly as the average temperature increase across the ocean depths. Of course, it is not. It warms much faster and that creates a problem.
Further, such a measure has almost no scientific value. Surface temperature and OHC (or better, Global Surface Heat Content, bearing in mind that the entire ocean is less than 0.2% of the radius of the Earth) both have immediate scientific import, being values in the equation for climate sensitivity as calculated by energy balance. OHC is a direct factor in the time taken to reach equilibrium, and surface temperatures are the governing value for the effects of reaching equilibrium. Both, therefore, are useful. In contrast, OWASLT as an index just ignores useful information.
TC, are you aware of a graph that looks at warming rate vs depth?
Tristan @22, seeing as you ask, we have this from a recent, paywalled article, Unabated planetary warming and its ocean structure since 2006:
Also this from Purkey and Johnson 2010 for the 1990s and 2000s:
Thanks TC, exactly what I was looking for!
Hope you're feeling better.
England's paper has (predictably) been portrayed on Andrew Bolt's blog as a backdown requiring an apology:
"Warmists who denied the pause now claim to explain it"
LINK
While I understand that the paper is useful in explaining what the consequences are if there is indeed a pause, that is a long way from being established.
GISTEMP data since 2000 from the Skeptical Science trend calculator:
Trend: 0.09 ±0.13 °C/decade
For “two decades” it is
Trend: 0.12 ±0.09 °C/decade
So there is a warming trend, with a headline value lower than that for the statistically significant warming trend from 1979:
Trend: 0.16 ±0.04 °C/decade
But the error margins for the period since 2000 mean that for this short period there is a 95% chance the true trend lies between warming of as much as 0.22 °C/decade and cooling of as much as 0.04 °C/decade.
In fact both the shorter periods support the null hypothesis.
That is, they are statistically not distinguishable from the warming trend beginning in 1979.
null hypothesis
noun
1. (in a statistical test) the hypothesis that there is no significant difference between specified populations, any observed difference being due to sampling or experimental error.
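For the curious, the trend-and-uncertainty figures quoted above come from an ordinary least squares fit; a minimal sketch of the arithmetic is below. It uses synthetic data as a stand-in for GISTEMP, and note that the actual SkS trend calculator also corrects for autocorrelation, which widens the intervals beyond this naive estimate:

```python
# Sketch of trend +/- 2-sigma via ordinary least squares (assumes numpy).
# NOTE: the real SkS trend calculator also corrects for autocorrelation,
# so its uncertainties are wider than this naive OLS estimate.
import numpy as np

def ols_trend(years, anomalies):
    """Return (trend, 2-sigma uncertainty) in degrees C per decade."""
    coeffs, cov = np.polyfit(years, anomalies, deg=1, cov=True)
    return 10 * coeffs[0], 10 * 2 * np.sqrt(cov[0, 0])  # per year -> per decade

# Synthetic stand-in for monthly anomalies since 2000 (illustrative only)
rng = np.random.default_rng(0)
t = np.arange(2000, 2015, 1 / 12)
y = 0.009 * (t - 2000) + rng.normal(0, 0.1, t.size)  # ~0.09 C/decade plus noise

trend, unc = ols_trend(t, y)
print(f"Trend: {trend:.2f} +/- {unc:.2f} C/decade (2 sigma, OLS only)")
```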
[RH] Activated link to fix page formatting.
KR, so what you are saying to Ryland is... that 100% of all warming is due to human CO2 forcing, which is offset by tiny amounts of natural cooling... in other words the main driver of climate, using your reasoning, is offset by blips in natural variability... that again, naturally occurring blips cancel out that warm feeling we're all looking for. So what is the primary driver then?
[JH] Please note that posting comments here at SkS is a privilege, not a right. This privilege can be rescinded if the posting individual treats adherence to the Comments Policy as optional, rather than the mandatory condition of participating in this online forum.
Please take the time to review the policy and ensure future comments are in full compliance with it. Thanks for your understanding and compliance in this matter.
KR, I would like to point out that we do not possess the coverage needed to properly survey the ocean and its depths to come up with anything concrete. Our surveys currently rely on a lot of speculation.
[TD] You need to back up your contentions with data.
Can anybody tell me the spatial resolution and coverage of ocean temperature surveys? This question impacts the subject matter directly, as models use survey data.
[JH] Perhaps you should do your own research on this matter.
Owenvsgenius, you can't measure everything: this is called 'the uncertainty principle'. Thus all measurements have to be verified by other means, making repetition of results central to the scientific method!
@ Owenvsgenius
Here are a couple to give you a head start on your research:
Google Scholar wants to be your friend. Call him.
I think the information you want (the level of uncertainty in ocean indicators) can be found in von Schuckmann and Le Traon.
Owenvsgenius - See IPCC AR5 on detection and attribution for a summary of the literature on what percentage of recent warming is attributable to anthropogenic factors. The mean estimate is that ~110% of the warming over the last half-century is due to us.
I believe the other references you've been given on ocean sampling are a good start, and will defer additional comments in that regard until you've done some reading.
I hope this is a good thread for this. If not, feel free to delete this comment, (I know that is a dumb thing to say, because moderators always have that freedom!)
I looked at the graphs from wattsupwiththat, and I realized, in order to keep the "pause" going, denialists have to keep changing the starting date of the faux pause.
In March of 2014, they said global warming stopped in August of 1996. By April of 2015, they said global warming stopped in December of 1996. They had to change the start date because, even in the cherry-picked dataset they use (RSS), the Earth keeps warming.
The whole "pause" idea is a fraud.
I wrote a blog post about what I saw. If someone with better math skills than I have would falsify it, that would be useful.
The blog post is here.
If my post is nonsense, I would not be offended to learn that.
dcpetterson @33.
You ask for comment on your blog post.
You are correct that a regression for the full period covered in all three of Mad Monckton of Brenchley's analyses would provide a positive trend, but it would be very, very small (I make it +0.05ºC/century) and statistically insignificant.
His Lordship's delicate adjustments of the period being examined are purely so he can get a big fat zero in trend. Different choices of start and end points during this short period allow him to achieve this, as the wobbles in the RSS data (which are larger than the wobbles in surface temperature records) continue to bounce along just over and occasionally under zero. Your primary finding and your accusation of cherry-picked time-intervals as presented in your blog post are thus no great revelation.
You are correct in pointing to Monckton's dodgy choice of temperature record. RSS TLT attempts to measure a weighted temperature from the surface to the stratosphere. The weightings of the TLT measurements do give a lower average altitude than TMT, but the descriptor "lower" is otherwise less than accurate, and RSS TLT is certainly no substitute for surface measurements.
And your criticism that the period chosen by Mad Monckton is too short is also correct. He effectively is arguing using the contrarian 'escalator'.
As this woodfortrees plot demonstrates, the Viscount has managed to magic away most of the temperature increase shown in the RSS data.
My own analysis of the length of the 'pause' using RSS data shows that the accumulating RSS data give a steepening rate of warming up to mid-2004 (for surface temperature records the steepening continues into 2007). That means RSS TLT temperatures were accelerating up to 2004, which is entirely incompatible with a 'pause' in temperature rise starting in 1996. And plotting the trend of surface temperature measurements, including more recent data, demonstrates that since last autumn surface temperature records are again showing a steepening trend in global temperature rise. That surely means the 'pause' (if we were to call it that) has at the least 'paused'.
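To see how easily a 'pause' can be manufactured by choosing the start date, here is a toy demonstration. The synthetic series below merely stands in for RSS TLT - a modest warming trend plus large random wobbles - so the specific numbers are illustrative only:

```python
# Toy demonstration of start-date cherry-picking (assumes numpy).
# The synthetic series is NOT the RSS record: it is a modest warming trend
# plus large random wobbles, purely to show how short windows mislead.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1990, 2015, 1 / 12)  # monthly time axis
y = 0.015 * (t - 1990) + rng.normal(0, 0.15, t.size)  # 0.15 C/decade + noise

for start in (1990, 1996, 1998, 2005, 2010):
    mask = t >= start
    slope = 10 * np.polyfit(t[mask], y[mask], 1)[0]  # C/decade from that start
    print(f"start {start}: trend {slope:+.2f} C/decade")
# Trends over short windows scatter widely around the true value; with enough
# candidate start dates, one can usually be found that shows "no warming".
```

Picking whichever start date minimises the trend manufactures a 'pause' regardless of the underlying warming - which is precisely the cherry-pick described above.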