





Your questions on climate sensitivity answered

Posted on 26 September 2014 by Guest Author

This is a re-post from Roz Pidcock at Carbon Brief

How sensitive is the earth to carbon dioxide? It's a question that's at the heart of climate science.

It's also complicated, and scientists have been grappling with pinning down the exact number for a while now.

But while the exact value of climate sensitivity presents a fascinating and important scientific question, it has little relevance for climate policy while greenhouse emissions stay as high as they are.

Nevertheless, each time a new research paper comes out suggesting climate sensitivity might be low, it's misused by parts of the media to argue that cutting emissions isn't so urgent after all.

The latest example comes in an article in today's Times, which claims a new climate sensitivity estimate means "Climate change could be slower than forecast".

So what is climate sensitivity? What does and doesn't it tell us about future warming?

[Image: The Times headline on climate sensitivity]

Source: The Times, 26th September 2014

What is climate sensitivity?

Climate sensitivity is the warming we can expect when the concentration of carbon dioxide in the atmosphere reaches double what it was in preindustrial times.

Preindustrial levels were about 280 parts per million (ppm). We're at 400 ppm now, and at current emission rates we're due to hit 560 ppm soon after 2050.
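As a rough illustration of that timeline, the arrival year for 560 ppm can be sketched with a simple compounding model. The starting growth rate (~2.3 ppm/yr) and the 3% annual acceleration are assumptions chosen to reflect emissions continuing to rise, not figures from the article:

```python
# Sketch: when does atmospheric CO2 reach 560 ppm (double preindustrial)?
# Assumptions (not from the article): annual growth starts near the
# observed ~2.3 ppm/yr and accelerates ~3% per year as emissions rise.

def year_of_doubling(start_year=2014, start_ppm=400.0,
                     growth=2.3, accel=1.03, target=560.0):
    """Step the concentration forward year by year until it hits target."""
    year, ppm = start_year, start_ppm
    while ppm < target:
        ppm += growth
        growth *= accel   # emissions, and hence annual growth, keep rising
        year += 1
    return year

print(year_of_doubling())  # with these assumptions: shortly after 2050
```

With a constant 2.3 ppm/yr the doubling would arrive decades later; the acceleration term is what brings it in line with the article's "soon after 2050".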

Scientists estimate the value of climate sensitivity lies between 1.5 and 4.5 degrees Celsius. This doesn't mean we'll get that temperature rise immediately. Parts of the climate system respond slowly. Heat entering the oceans takes time to cause atmospheric warming, for example.

That means the full warming effect of the greenhouse gases we release into the atmosphere might not materialise until decades or centuries later.

That's why scientists sometimes talk about a simpler measure of climate sensitivity known as the Transient Climate Response (TCR).

TCR is easier to estimate because it ignores the slower changing parts of the climate system. It's a useful way to look at the short term rather than centuries from now.

But the same rule applies to both definitions. The higher the climate sensitivity, the more warming we'll see.

How do scientists calculate climate sensitivity?

Working out climate sensitivity is complicated, and there are three main ways to do it.

One method looks at how the Earth responded to natural greenhouse gas changes in its geological past. Another matches global surface temperatures with greenhouse gas concentrations and other forcings over the last century or so, to try to work out sensitivity from how the planet is responding - an approach known as 'energy budget models'. The third uses climate models to predict the theoretical effect of a doubling of carbon dioxide, based on scientists' understanding of how different elements of the climate system interact.

Any estimate of climate sensitivity comes with a range - a lower and upper limit within which the actual value could reasonably lie.

In its 2013 report, the Intergovernmental Panel on Climate Change (IPCC) gave a likely range for climate sensitivity of between 1.5 and 4.5 degrees Celsius. The likely range for TCR was between 1.0 and 2.5 degrees.

Are estimates of climate sensitivity getting lower?

No.

Some high-profile papers based on the energy budget model approach have emerged recently, suggesting the value of climate sensitivity lies at the bottom of the IPCC's likely range.

This week a paper by Judith Curry and Nic Lewis gave a value for TCR of 1.05 to 1.8 degrees Celsius, with a mid value of 1.33 degrees. This is similar to a paper last year by Otto et al. (2013) - both estimates are about 25 per cent lower than the IPCC's middle estimate of 1.8 degrees.

Though this flurry of recent papers has attracted some media attention and plaudits from climate skeptics, their publication doesn't mean scientists are "backing off" higher climate sensitivity, as some have suggested.

One criticism of the energy budget model approach that lies behind these kinds of studies is that it doesn't take into account the role of the oceans in taking up excess heat. Other estimates of climate sensitivity, using climate models, support the higher end of the IPCC's likely range.

The IPCC considers all the different ways of calculating climate sensitivity, without making a value judgment about which is best. So as long as there is a body of literature supporting both ends of the likely range, it won't be revising it any time soon.

It's important not to oversell the significance of a single paper, or a collection of papers that use one of several available methods.

Pinning down climate sensitivity is an area of ongoing scientific debate.  The hope is that new research can help narrow the range of uncertainty.

When a new study is published, on any scientific topic, it doesn't override all that came before it. It simply adds to the body of literature.

Does low climate sensitivity mean emissions cuts can happen more slowly?

No. Some parts of the media have interpreted low climate sensitivity as evidence that emissions cuts aren't urgent after all. For example, an article by journalist Ben Webster in today's Times said about Judith Curry and Nic Lewis's paper:

"[I]f the new paper is correct, the emissions cuts needed to prevent a dangerous rise in temperature could take place more slowly than governments have proposed."

This is definitely not the case. We're emitting carbon dioxide so fast that the difference between a low and a high value is largely irrelevant in climate policy terms.

As Myles Allen, professor at Oxford University and IPCC author, told us recently:

"A 25 per cent reduction in TCR would mean the changes we expect between now and 2050 might take until the early 2060s instead ... So, even if correct, it is hardly a game-changer."
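Allen's back-of-the-envelope point can be checked directly: if warming accrues roughly in proportion to TCR multiplied by elapsed time, a 25 per cent lower TCR stretches the time needed to reach any given temperature by a factor of 1/0.75. A minimal sketch (the 2014 baseline year and the linear-accrual assumption are simplifications, not from the quote):

```python
# Sketch of Myles Allen's arithmetic: warming expected by some year
# arrives later, not never, if TCR is 25% lower than the central estimate.
# Assumption: warming accrues roughly in proportion to TCR x elapsed time.

def delayed_year(target_year, baseline_year=2014, tcr_reduction=0.25):
    """Year at which the same warming is reached with a reduced TCR."""
    years_needed = target_year - baseline_year
    return baseline_year + years_needed / (1.0 - tcr_reduction)

print(delayed_year(2050))  # 2062.0 -- i.e. the "early 2060s" in the quote
```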

It's also important to remember that climate sensitivity is not the same as total warming.

Instead, it's what we'll get every time the carbon dioxide concentration doubles above pre-industrial levels.
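Because sensitivity is defined per doubling, expected warming grows with the logarithm of the concentration ratio. A minimal sketch of that relationship; the formula ΔT = S × log2(C/C0) is the standard textbook simplification rather than anything stated in the article, and the concentrations are illustrative:

```python
import math

# Warming under the standard logarithmic simplification:
# delta_T = S * log2(C / C0), where S is the per-doubling sensitivity.
# Concentrations below are illustrative, not from the article.

def warming(sensitivity, c_ppm, c0_ppm=280.0):
    """Equilibrium warming for a given sensitivity and CO2 concentration."""
    return sensitivity * math.log2(c_ppm / c0_ppm)

for s in (1.5, 3.0, 4.5):  # low, mid, high end of the IPCC likely range
    print(f"S = {s}: {warming(s, 560):.1f} C at 560 ppm, "
          f"{warming(s, 1120):.1f} C at 1120 ppm")
```

Even the low end of the range delivers another full increment of warming with each further doubling, which is the point the article is making.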

If emissions stay as high as they are, that means even a low value of climate sensitivity would see a significant amount of warming by the end of the century. The risks are large either way, Myles Allen explains:

"[A]ny revision in the lower bound on climate sensitivity does not affect the urgency of mitigation."

On the other hand, if climate sensitivity is at the upper end of the IPCC's likely range, reports suggesting a four degree temperature rise above pre-industrial levels by the end of the century are likely to be optimistic.

So while the precise value of climate sensitivity poses an interesting scientific question, tackling the policy response is by far the bigger problem. Without swift emissions cuts, we can expect a serious level of warming - whatever climate sensitivity ends up being.


Comments

Comments 1 to 22:

  1. There are a couple of points I'd make about Lewis & Curry's 1.64 degree C estimate of ECS (and by extension, all estimates derived from recent temperature observations, which are consistently lower than those derived from paleo proxy data or computer models).

    First, and most importantly, climate sensitivity is not really "a thing", in that it's not some fundamental, unchanging constant of the planet (like g = 9.8 m/s²). Climate sensitivity is the sum total of a whole array of forcing and feedback mechanisms, which are themselves not constants. So, ECS isn't always going to equal 3.0, or 1.64, or some other fixed value. It can (and very likely will) vary under differing climatic conditions as those underlying forcing and feedback mechanisms vary. Treating ECS like a constant is a useful shorthand for back-of-the-envelope estimates, but it's nowhere close to a law of physics.

    As a simple example, consider ice albedo feedback. It only applies at the transition zone of polar ice extent (it has no effect in places that are permanently frozen or permanently ice-free), and as that ice extent retreats to higher latitudes with less surface area and insolation, the magnitude of ice albedo feedback weakens, ultimately to zero. So, all else equal, you'd expect net feedback (and ultimately ECS) to be higher at the peak of an ice age than during an interglacial period. But all else won't be equal... other feedback mechanisms will each have their own dynamics, the sum total of which is unlikely to remain a constant. As such, different methods of estimation (paleo proxy measures vs. current observations) might yield markedly different findings, and both can be entirely correct about the conditions that they measured. So evidence that ECS was 4 degrees Celsius in the distant geological past does not contradict evidence that ECS was 1.6 degrees Celsius in the very recent past. Furthermore, if ECS is not a linear constant, then extrapolations about the future become increasingly uncertain the further out you project.

    Second, models all very sensibly assume ice albedo feedback is positive (i.e. warming causes polar ice to retreat, which causes more sunlight to be absorbed, which causes more warming). To just about everyone's surprise, for the last couple of decades, the southern hemisphere has experienced the exact opposite... ice extent has increased in the face of warming, acting as a damper rather than an amplifier (i.e. a source of negative feedback rather than positive), which would lower observation-based sensitivity estimates. Nobody can say for certain why this has occurred (there are several good theories), or for how much longer it will continue to happen, but in the long term, the anomaly should reverse and warming should lead to less ice and positive feedback. Looking at the recent temperature record, this anomaly will "contaminate" observations, and likely lead to discrepancies between observation-based estimates of ECS and model-based estimates. So evidence that ECS was 1.64 degrees C in the recent past does not contradict models that estimate ECS at 3.0 degrees C for the near future.

    Moderator Response:

    [Rob P] - There are no 'recent observations' for ocean heat content back in the 19th century. All these energy budget approaches have to use modelled estimates for that time period. 

     

  2. Russ R @1:

    1) You say "all estimates derived from recent temperature observations, which are consistently lower than those derived from paleo proxy data or computer models" as if it were actually true.  In fact, many estimates based on the instrumental record are as high as or higher than those from the paleo record and computer models.  On average, those from the instrumental record are slightly lower.  Perhaps more germane, the estimates from the instrumental record that you read are biased low due to your sources of information.  But neither of these facts makes your assertion true.

    2)  You are correct that climate sensitivity is "not a thing" (though neither is g).  Major changes in continental positions, changes in dominant ocean currents, changes in vegetation and rainfall patterns, and changes in GMST will all affect the value of the climate sensitivity.  Therefore paleoclimate estimates primarily provide a sanity range.

    However, you do not think the consequences of that through.

    What it means is that the purportedly low climate sensitivity over the twentieth century is no guarantee of a continuing low sensitivity into the future.  As global temperatures rise, it becomes likely that the low climate sensitivity of recent times (if it is a fact) will revert to the higher non-glacial climate sensitivity from paleodata (ie, regress towards the mean), so that the final impact of anthropogenic forcings will be determined by that higher climate sensitivity rather than the purportedly low current value.

    You argue against that based on an expected reduction in ice albedo feedback as the planet warms.  That expectation is correct, but there is also an expected increase in the strength of the water vapour feedback as the planet warms, due to the Clausius-Clapeyron relationship.  The ice albedo feedback is currently near a minimum and is very low compared to that during glacials (for example).  Any major warming, therefore, is likely to result in the increase in strength of the water vapour feedback dominating over the decrease in strength of the ice albedo feedback.

    You also appeal to uncertainty.  In decision making under risk, uncertainty is not your friend.  If you're playing Russian roulette, the game is not made safer if, before each go, the gun is loaded with a number of cartridges equal to the roll of a six-sided die minus 2.  On the contrary, the uncertainty makes the game even more foolhardy.  In this case, uncertainty may mean that nailing down the current ECS at a nice midrange 2.5 C does not guarantee us that it will not have pushed out to 4 C by the end of this century.

    3)  Despite the increase in sea ice in the Antarctic, the decrease in sea ice in the Arctic is larger in magnitude, and the decrease in winter snowfall in the northern hemisphere is larger still.  The net effect is that most models underestimate the ice/snow albedo feedback rather than the reverse.

  3. Roz Pidcock in the OP writes:

    "One criticism of the energy budget model approach that lies behind these kinds of studies is that it doesn't take into account the role of the oceans in taking up excess heat."

    In fact the formula used by Lewis and Curry is ECS = change in temperature divided by (change in forcing minus change of heat flux into the oceans).  Therefore they very explicitly take into account the role of oceans in taking up excess heat.  And Then There's Physics never said otherwise.  There are very valid criticisms (IMO) of their methods of taking OHC flux into account, but not that they ignored it altogether.
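The formula described in this comment can be sketched in a few lines. The doubling forcing F_2x = 3.71 W/m² is the commonly used value (not stated in the comment), and the numeric inputs below are illustrative placeholders in the spirit of AR5 energy-budget numbers, not Lewis and Curry's actual figures:

```python
# Energy budget estimate as described in the comment:
#   ECS = F_2x * dT / (dF - dQ)
# Ocean heat uptake dQ is subtracted from the forcing change, so the
# oceans ARE accounted for. All numeric inputs are illustrative only.

F_2X = 3.71  # W/m^2 forcing from a doubling of CO2 (commonly used value)

def energy_budget_ecs(d_temp, d_forcing, d_ocean_uptake):
    """Equilibrium climate sensitivity from an energy budget."""
    return F_2X * d_temp / (d_forcing - d_ocean_uptake)

def energy_budget_tcr(d_temp, d_forcing):
    """Transient climate response: the slow ocean term is left out."""
    return F_2X * d_temp / d_forcing

print(energy_budget_ecs(d_temp=0.75, d_forcing=1.95, d_ocean_uptake=0.35))
```

Note how a larger dQ raises the ECS estimate, which is why the treatment of ocean heat uptake is the main point of dispute in the comments that follow.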

  4. Hi Tom,

    You're right, that is what was figured out on the "And Then There's Physics" blog. In fact, L&C used heat uptake, but their value is unable to explain the real sea level rise in their final period, which would imply heat uptake from sources other than ocean heat uptake.

    Because they ignored those other sources, their dQ (heat uptake) is biased too low.

    That is what I wrote at "And Then There's Physics" on September 26, 2014 at 2:37 pm.

  5. Christian, hi.

    My original version of my comment mentioned, and linked to, your comment at ATTP.  Unfortunately it was lost and the repeat was much shortened.

  6. Although Tom has already pointed this out, since it was attributed to me, I will add that I didn't ever say that the method doesn't take the role of the oceans into account.  In determining the ECS it does indeed include the change in system heat uptake rate, which is dominated by energy going into the oceans.  As others have mentioned, there are issues with how they've determined their system heat uptake rates, but it isn't correct that the method doesn't consider the oceans at all.

  7. Speaking of ATTP (And Then There's Physics)...

    In his article, Study lowers range for future global warming, but does it matter? (Capital Weather Gang, Washington Post, Sep 26, 2014), Jason Samenow references and links to  ATTP's post, Lewis and Curry

    Here's what Samenow has to say:

    "The blog And Then There’s Physics lists several reasons why Lewis and Curry’s estimate could be too low, including not fully accounting for the transfer of heat between the ocean and atmosphere.

    "It further cautions one must be careful in leaping to conclusions from the results of a single study. Computer modeling studies generally estimate higher values for ECS, as do some studies based on paleoclimate data."

  8. How hot was the earth the last time CO2 was at 400 ppm?

    Or should we say 460 ppm CO2e?

    And how high were sea levels?

    Moderator Response:

    [Rob P] - Based on the relationship between atmospheric CO2 and sea level over the last 40 million years, we are likely committed to 24 metres (+7/-15 metres at 68% confidence) of future sea level rise.

    That seems to be supported by recent observations from the Antarctic showing that the West Antarctic ice sheet, and perhaps coastal sectors of the East Antarctic ice sheet, are beyond the point of no return. It will likely play out over millennia, but over a metre of sea level rise this century will cause widespread disruption and be extraordinarily expensive to deal with.

  9. ranyl & Rob P @8, with a slug of CO2 of a trillion tonnes or more, approximately 25% remains in the atmosphere in the long term.  About 50% of human emissions have already been drawn down, so another 50% remains to be drawn down.  That means to get a long term 400 ppmv CO2 concentration, in the short term CO2 concentrations need to rise to 620 ppmv.  For 460 ppmv CO2e, you need 640 ppmv of CO2 (as other WMGHG decay in the short term).  The upshot is that long term concentrations of 400 ppmv plus, along with the effects of such concentrations evident from the paleo record, are likely long term consequences of BAU (RCP 8.5 or even RCP 6) scenarios.  They are still very avoidable, however, if we get serious attempts to mitigate climate change.

  10. Sunlight is energy input that is absorbed in heating up the atmosphere and the oceans, as well as driving operations on the land. Energy is output by radiating out into space. The energy input and output had been roughly in balance for eons, so enabling the biological operations that organisms were dependent on. Then the emissions from fossil fuel usage rapidly upset that balance, so heat has accumulated in the atmosphere, oceans and land. The emissions have effectively placed a blanket around the globe, so causing the warming. Cutting back the rate of emissions will do no more than slow down the heating. The energy input and output balance has been irrevocably degraded by the operations of industry. Natural forces will very slowly restore this balance after the demise of industrialized civilization.

  11. Tom @ 9 - True, it depends how things pan out in the real world. If, for instance, carbon dioxide fertilisation does indeed cause the stupendous growth of land-based plants, as the carbon cycle models predict, much of the excess carbon dioxide will be drawn down. The evidence for this is pretty equivocal so far.

    Also, once the ice sheets are committed to collapse, there's no turning back. If we haven't passed that point, and humanity were to get its act together, the collapse of the ice sheets may be averted. Trouble is that empty assurances are all we have so far. Fossil fuel emissions keep climbing, and will continue to do so for the foreseeable future.

    As you may have figured out, averting dramatic future sea level rise is not something I'm very optimistic about. 

  12. Lewis & Curry (2014) is pretty much what you'd expect from the title 'The implications for climate sensitivity of AR5 forcing and heat uptake estimates' and from its authors - a bean-counter meets a quasi-holistic climatologist. The thrust of the study is to take the numbers from AR5 WG1 Appendix2 and shove the implications of them back at the IPCC. This is easily done but there is quite a bit of cheese-paring required to get the desired result. For instance, note how the 'headline' 1859-2011 result when compared with Otto et al inc. Lewis (2013) loses 5% of the ∆T and gains 25% of the ∆(F-Q).

    And choosing a different temperature record than HadCRUT4 would gain 5-10% more ∆T. Comparing the peak temperatures from the late-1800s & mid-1900s is potentially questionable unless you are signed up to a big constant-amplitude multi-decadal oscillation. There is certainly room for significantly higher sensitivity by taking different time periods if the Appendix 2 forcings are taken at face value (which is what the study is about). And as most of the warming has occurred recently, slow feedbacks will not have had time to act for 'most of the warming'. And the one natural wobbler of temperature that is beyond doubt, ENSO, is an unknown for the post-1850 period. ENSO could have elevated the 'headline' base temperatures just as it has mainly depressed the 'headline' end period. That could easily have clipped 10% off the ∆T used by Lewis & Curry. (I note the MacDonald & Case (2005) PDO reconstruction (wiki-graph) looks a bit positive for 1859-82, suggesting ENSO will indeed have been warming.)

    So the headline low ECS provided by Lewis & Curry (2014) is at best controversial.

    And do note, if it is as Lewis & Curry suggest, it only works if we are now about to experience a repeat of the cooling cycle seen twice before over the last 160 years. So hold onto your hats. The Kara Sea will melt away (or is it 'ice over'?) plunging the whole Northern Hemisphere into two decades of cooling and priming a negative AMO ready for another round of Wyatt's Stadium Wave. This I will enjoy seeing.

  13. ranyl & Rob P @8, with a slug of CO2 of a trillion tonnes or more, approximately 25% remains in the atmosphere in the long term. About 50% has human emissions have already been drawn down, so another 50% remains to be drawn down. That means to get a long term 400 ppmv CO2 concentration, in the short term CO2 concentrations need to rise to 620 ppmv. For 460 ppmv CO2e, you need 640 ppmv of CO2 (as other WMGHG decay in the short term). The upshot is that long term concentrations of 400 ppmv plus, along with the effects of such concentrations evident from the paleo record, are likely long term consequences of BAU (RCP 8.5 or even RCP 6) scenarios. They are still very avoidable, however, if we get serious attempts to mitigate climate change.

    Sorry Tom, but that is using models with many CO2 feedbacks not included: presuming a stronger fertilization effect than is being seen, not including forest fire feedbacks, and several others, despite the complexity of the models. And don't forget the widespread fertilization effect of nitrogen fertilisers, which has been more than enough to offset all the N2O emissions from them; stop using industrial fertilizers, as is necessary for biodiversity's sake, and that fertilization effect goes. And it has been shown that ecosystem disruption releases carbon (not included in those optimistic carbon withdrawal models), that extreme weather events and increased erosion are releasing soil carbon, and then there are the frozen sea bed stores of permafrost off Siberia that are releasing stuff, also not included.

    We are already committed to being above 400 ppm (and think 470 ppm CO2e) for, at the very best, at least another 200 years or so, so that is at least 80% of the warming of full equilibrium.

    Stiff the roses.

    At 350-400 ppm it was 3-5 C hotter, and that was not including the extra heat recently found in the western Pacific; the WPAC was 3 C hotter than thought, and that is a reasonable chunk of ocean to increase, and thus 5 C is the much more likely figure.

    There is no more room for any more carbon emissions, yet we are going to have lots more, for nothing stops overnight.

    I can get all the papers, and many more, to prove these points.

    Including several recent papers that match climate models to reality in terms of water vapour and cloud formation; only the models with a climate sensitivity of 3 C or above do the trick, and it is more likely to be 4 C, as many are now realising.

    We have a huge carbon debt not a budget.

    But anyway, apparently WW3 has kicked off, in papal terms, so that will add a whole load of carbon emissions and habitat destruction, with millions of people suffering on top. So let us be sure: 400 ppm will soon be a distant target everyone wishes we could get back to, but getting any CO2 out of the atmosphere any time soon will take a transformation of everything to achieve.

    And how much carbon extra is it going to take to actual make all the new power generation, electric cars, etc, and just how toxic are batteries and the like?

    Oh I know, let's pretend we can put another 1/3 again of CO2 into the atmosphere because some computer models (that have underestimated lots so far, e.g. Arctic sea ice) give a 2/3 chance of avoiding 2 C by 2100. Or, to put it another way, let's play Russian roulette with 2 bullets in the 6-chamber cylinder, with the future of humanity at stake?

    Now what is the carbon budget that gives a 95% chance of avoiding 2C when the CS is 4C?

  14. ranyl @13, it would be helpful to myself, and presumably other readers, if you distinguish quotation from your own words.  At a minimum, you should use quotation marks (on your keyboard next to the enter key).  It would also be helpful if you used the indent function from the wysiwyg panel in the comments screen, indicated by the quotation mark signal.

    Trivial points aside, it is very easy to get a long list of papers which indicate models may (there is disagreement on the point among relevant scientists) overestimate CO2 drawdown (or climate sensitivity).  It is equally easy to get long lists of papers which indicate models may underestimate the same.  Climate change deniers continually refer to the latter and ignore the former, in a process that is called pseudoscience.  It is no more scientific to continually refer to the former and ignore the latter.  The climate scientists who actually devise the models, such as David Archer, keep track of both, and revise the models on the basis of the balance of evidence.

    So:

    • While you can list a series of reasons to think the models underestimate draw down, it has recently been shown that volcanic emissions are significantly larger than previously thought, which implies a larger draw down rate, and hence that models underestimate the draw down.
    • The models in question have with reasonable accuracy retrodicted the Earth's carbon budget over the last 600,000 years.  While they are likely to be wrong in detail (as with all models), they are therefore unlikely to be wrong about the basic picture.
    • The higher temperatures and sea levels in the Pliocene were in a near full equilibrium condition.  That is, they were achieved as the Earth achieved its Earth System Climate Sensitivity, which is noticeably higher than the Equilibrium Climate Sensitivity or the more relevant (over the coming two centuries) Transient Climate Response.
    • Finally, anybody who knows me knows I am not sanguine about even 500 ppmv, let alone 650.  As a matter of urgency we need to stop net anthropogenic emissions before atmospheric CO2 tops 450 ppmv.  Not, however, due to some panicked forecasts about the effect of a current 400 ppmv 500-plus years down the track.
  15. For additional information, the often mentioned trillion tonnes of carbon limit on human emissions to avoid dangerous impacts of global warming amounts to a 540 ppmv limit.  That view, above all others, represents the scientific consensus on safe levels of emissions.  There are scientists who believe that even it overstates the necessary level of concern, but at least equally many who think it overstates the safe limits, including Hansen.

    The latter takes the view expressed by 350.org, ie, that 350 ppmv is the upper limit on safe greenhouse concentrations.  I have a problem with 350.org in that they appear to over-egg the data.  Hansen, for example (and no doubt sincerely), considers the possibility of runaway global warming real, even though the science is very firmly against him on that point.

    My biggest problem with the 350.org point of view is a failure to take into account the fact that, given zero net human emissions, CO2 levels will fall substantially if slowly; and that the full impacts of ECS will occur slowly and on approximately the same timescale as the initial natural draw down of CO2.  These are very important facts, and make an otherwise impossible task plausible.  That is, there is an economic cost in reducing CO2 emissions, at least in the short term, and reducing emissions to net negative values in the very short term as advocated by 350.org makes that economic cost sufficient that it could plausibly cause as much damage as 550 or even 650 ppmv of CO2.  In contrast, limiting net emissions to 0.75 to 1 trillion tonnes (475-540 ppmv) in the short term will have an initial short term economic cost substantially less than (for example) the cold war.  That is, it is achievable while retaining the economic ability to achieve other major ends.

    Developing the technology to limit CO2 emissions by that amount will also develop the technology to go the further and necessary step to achieve net zero anthropogenic emissions.  In particular, zero gross anthropogenic emissions will be impossible to achieve for a variety of reasons, and gross emissions above 5% of current emissions may result in either no natural draw down, or a long term slow build up of CO2 concentration.  Therefore, as with 350.org, I agree that we will need to develop a cost effective technology for carbon sequestration.  I disagree about the scale to which that is necessary, and the time period in which it is necessary.  

    (Apologies to the moderators for the extensive off topic post.)

  16. MA Rodger @12, even taken as a hypothetical exercise to determine the values of TCR and ECS using IPCC values, Lewis and Curry (hereafter L&C) is seriously flawed.  I explained part of the reasons in a comment at And Then There's Physics.  Further to that comment (quoted below), their method of determining the Heat Content Flux for 1859-1882 is not grounded in the IPCC, is highly dubious, and likely further deflates their headline results.  In addition, more recent findings since the IPCC report was published suggest a higher effective radiative forcing of aerosols is in order, which would increase the result still further.  So, as an attempt to determine ECS based on IPCC assumptions, the result is flawed - and even more flawed as an attempt to find the actual value of ECS.  (The determination of the value of the TCR is also flawed, but the problems are of little consequence given how close their value is to the IPCC values.)

     

    "I downloaded the AR5 forcing data from Annexe 2, HadCRUT4 from the Hadley Center, Domingues et al OHC data from the CSIRO and Levitus et al forcing data from the NODC. I then proceeded to calculate ΔT, ΔF, and ΔQ from that data, using L&C's value for Q over the period 1859-1882 (which I also dispute). The result was that L&C incorrectly estimated ΔT by 0.57% (small enough to be a rounding error), ΔF by 2.68%, and ΔQ by 17.03%. The latter two are too large to be rounding errors. All errors favour lower values for TCR and ECS. Combined, the errors deflated TCR by 3.16% and ECS by 8.27%.

    The “error” in ΔT is just a rounding error as noted. That in ΔF may be due to an adjustment to the aerosol forcing. If so, it means Lewis and Curry are not, after all, trying to show what is obtained from the IPCC data, and need to independently justify their choices of data. If they were trying to show the results of the IPCC data, and also obtain the difference when the forcing data is modified as Lewis claims it ought, then they should have shown both.

    The difference in ΔQ is the most interesting. I obtained the most recent values by downloading the 0-2000 meter pentadal record, and using the difference between successive values to determine the difference between individual years six years apart. I then used the 2005-2012 annual data as an anchor point from which annual values were reconstructed back to 1955. Comparison of rolling 5 year averages with the pentadal values showed a constant offset over the reconstructed period, which, because constant, has no effect on trends. I then deducted the 0-700 meter annual OHC, added in the Domingues 0-700 meter OHC and the Purkey and Johnson trend from 1992 (as per box 3.1), and divided by 0.93 to account for the heat going into ice loss, into the ground and into the atmosphere.

    Interestingly, my figures and L&C’s figures agree within 1% if I neither add in the Purkey and Johnson trend for OHC below 2000 m, nor apply a modifier for non-ocean heat storage. This looks like a likely source for the error.

    Resolving the errors results in a mean TCR of 1.37 °C, and a mean ECS of 1.79 °C per doubling of CO2. These are still low values, but well within the IPCC range. Further, at this stage the errors amount to errors in arithmetic rather than errors in assumptions (of which I believe there are plenty)."

    (Bolding added to draw attention to key points.)
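    For readers following along, the numbers in the quoted recalculation come from the standard energy-budget relations, TCR = F2x·ΔT/ΔF and ECS = F2x·ΔT/(ΔF − ΔQ). A minimal Python sketch, using assumed illustrative values for ΔT, ΔF and ΔQ (not the actual L&C or corrected inputs), shows how an understated ΔQ deflates the ECS estimate:

    ```python
    # Energy-budget estimates of TCR and ECS, as used in the method discussed
    # above. The input deltas below are illustrative placeholders only.
    F2X = 3.71  # W/m² forcing per doubling of CO2 (common approximation)

    def tcr(dT, dF, f2x=F2X):
        """Transient climate response: observed warming scaled to a doubling forcing."""
        return f2x * dT / dF

    def ecs(dT, dF, dQ, f2x=F2X):
        """Equilibrium sensitivity: heat uptake dQ offsets part of the forcing."""
        return f2x * dT / (dF - dQ)

    dT, dF, dQ = 0.71, 1.98, 0.36   # K, W/m², W/m² (assumed example values)
    print(round(tcr(dT, dF), 2))       # -> 1.33
    print(round(ecs(dT, dF, dQ), 2))   # -> 1.63

    # Understating dQ by ~17% (as in the quoted comparison) deflates ECS:
    print(round(ecs(dT, dF, 0.83 * dQ), 2))  # -> 1.57
    ```

    Because ΔQ sits in the denominator as ΔF − ΔQ, any error that shrinks ΔQ pushes the ECS estimate down, which is why the ΔQ discrepancy matters most.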

  17. Tom Curtis @16.

    You say "their method of determining the Heat Content Flux for 1859-1882 ... is highly dubious." Coincidentally, for other reasons, I recently extracted the thermosteric SLR data from Gregory et al (2013) by scaling their Fig 2c. This is apparently what L&C2014 did to obtain thermoSLR data, which they then used to infer OHC for years prior to the earliest OHC data. I must say, the resulting numbers do have a rather dubious feel. The Base Periods centred on 1870 & 1940 used by L&C2014 turn out to be the best choice possible for reducing Δ(F-Q) and thus for reducing ECS/TCR estimates. But I can now pick my own Base & End Periods. Using the first half decade and the last half dozen years of the Gregory et al thermoSLR data (periods of this length to avoid volcanoes) to infer Δ(F-Q), IPCC Annex II for ΔF and HadCRUT(C&W) for temperature, I get ECS=2.4ºC.

  18. MA Rodger @17, my take was that they used the CCSM4 spun up from 850 AD, as shown in figure 1 (red line), for pre-1950 ocean heat fluxes. Certainly, based on a pixel count, it gives the same values over 1859-1882. VarN in figure 2c (solid blue line) is similar, but smoother, and drops much lower around the turn of the last century, which would allow much higher climate sensitivity. I am not sure that would be the case with the figure 1 values.

    For an alternate approach, I used the ocean heat uptake efficiency from Gregory and Forster, along with HadCRUT4, to determine Q in the 19th century. The result is a mean ECS of 1.98 K per doubling of CO2, in line with Otto et al 2012. I discuss it in detail at ATTP. I do not claim, of course, that that is the best approach, although I think it is significantly better than L&C's method as applied (i.e., using a single run on a single ensemble member and an incorrect downscaling). But the factor of 23 difference in the resulting estimated Q means L&C need to seriously justify their choice at a minimum, something they have not done.
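    The ocean heat uptake efficiency approach approximates planetary heat uptake as Q ≈ κ·ΔT, so a 19th-century Q can be inferred from the temperature record alone. A rough Python sketch, with κ taken as 0.62 W m⁻² K⁻¹ (in line with Gregory and Forster) and all temperature and forcing inputs assumed purely for illustration:

    ```python
    # Ocean heat uptake efficiency sketch: Q ≈ kappa * dT.
    # All temperature/forcing inputs are illustrative, not the thread's data.
    F2X = 3.71     # W/m² per doubling of CO2 (common approximation)
    KAPPA = 0.62   # W m⁻² K⁻¹, uptake efficiency (Gregory & Forster value)

    def uptake(dT, kappa=KAPPA):
        """Heat uptake implied by warming dT above equilibrium."""
        return kappa * dT

    def ecs(dT, dF, dQ, f2x=F2X):
        """Energy-budget equilibrium climate sensitivity estimate."""
        return f2x * dT / (dF - dQ)

    # Assumed warming above equilibrium in the base (1859-1882) and
    # recent (1995-2011) periods, plus an assumed forcing change:
    q_base = uptake(0.05)     # small 19th-century uptake
    q_recent = uptake(0.80)
    print(round(ecs(0.75, 1.95, q_recent - q_base), 2))  # -> 1.87
    ```

    The point of the method is that it replaces a model-derived early-period Q with one tied directly to the observed temperature anomaly.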

  19. An amendment to my post @18. At ATTP, Paul S has pointed out an error I made in my use of the ocean heat uptake efficiency. Correcting for it, and allowing for non-ocean heat storage, results in a slightly lower ECS than just making the error corrections already noted. That would count as concurrence between the L&C estimate and an alternative reasonable approach.

  20. Tom Curtis @18.

    Thanks for pointing out my use of the wrong Gregory graph. It is, as you say, figure 1 that was used. I had assumed it was in figure 2 somewhere, as I couldn't see how L&C14 obtained such low numbers from figure 1. But I had failed to account for the 60% adjustment used by L&C14.

    For the record, repeating the exercise @17 with the figure 1 numbers yields ECS=2.13ºC.

  21. MA Rodger @20, did you downscale the ocean heat content by 0.6 to match L&C's method?

  22. Tom Curtis @21.

    Yes, I did remember. Bear in mind I am using the Gregory et al. figure 1 numbers to provide ΔQ at both ends of the period, my thinking being that if Gregory et al. is good enough for 1881, it's good enough for 2000 as well.
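    The effect of that 0.6 downscaling on an energy-budget estimate can be illustrated with assumed numbers (not the actual thread inputs): scaling ΔQ down enlarges the denominator ΔF − ΔQ, and so lowers the ECS estimate.

    ```python
    # Illustrative effect of downscaling model-derived heat uptake by 0.6,
    # as discussed in the comments above. All inputs are placeholders.
    F2X = 3.71  # W/m² per doubling of CO2

    def ecs(dT, dF, dQ, f2x=F2X):
        return f2x * dT / (dF - dQ)

    dT, dF = 0.76, 1.90       # K and W/m² (assumed)
    dQ_model = 0.50           # W/m² model-derived heat uptake (assumed)

    print(round(ecs(dT, dF, dQ_model), 2))        # -> 2.01 (unscaled)
    print(round(ecs(dT, dF, 0.6 * dQ_model), 2))  # -> 1.76 (after 0.6 scaling)
    ```

    This is why the choice of scaling, and whether it is applied at both ends of the period, matters so much for the headline sensitivity numbers.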
