


A detailed look at climate sensitivity

Posted on 8 September 2010 by dana1981

Some global warming 'skeptics' argue that the Earth's climate sensitivity is so low that a doubling of atmospheric CO2 will result in a surface temperature change on the order of 1°C or less, and that therefore global warming is nothing to worry about. However, values this low are inconsistent with numerous studies using a wide variety of methods, including (i) paleoclimate data, (ii) recent empirical data, and (iii) generally accepted climate models.

Climate sensitivity describes how sensitive the global climate is to a change in the amount of energy reaching the Earth's surface and lower atmosphere (a.k.a. a radiative forcing).  For example, we know that if the amount of carbon dioxide (CO2) in the Earth's atmosphere doubles from the pre-industrial level of 280 parts per million  by volume (ppmv) to 560 ppmv, this will cause an energy imbalance by trapping more outgoing thermal radiation in the atmosphere, enough to directly warm the surface approximately 1.2°C.  However, this doesn't account for feedbacks, for example ice melting and making the planet less reflective, and the warmer atmosphere holding more water vapor (another greenhouse gas). 

Climate sensitivity is the amount the planet will warm when accounting for the various feedbacks affecting the global climate.  The relevant formula is:

dT = λ·dF

Where 'dT' is the change in the Earth's average surface temperature, 'λ' is the climate sensitivity parameter, usually in units of Kelvin or degrees Celsius per watt per square meter (°C/[W m-2]), and 'dF' is the radiative forcing, which is discussed in further detail in the Advanced rebuttal to the 'CO2 effect is weak' argument.
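To make the formula concrete, here is a minimal Python sketch applying it to a doubling of CO2. The logarithmic forcing approximation dF = 5.35·ln(C/C0) and the no-feedback response of roughly 0.3°C per W m-2 are standard values from the literature rather than anything given in this post, so treat the numbers as illustrative:

    import math

    def forcing_co2(c_ppmv, c0_ppmv=280.0):
        """Radiative forcing (W m-2) from a CO2 change, using the standard
        logarithmic approximation dF = 5.35 * ln(C/C0)."""
        return 5.35 * math.log(c_ppmv / c0_ppmv)

    dF = forcing_co2(560.0)      # doubling from 280 to 560 ppmv: ~3.7 W m-2

    lambda_planck = 0.3          # °C/(W m-2): no-feedback (Planck) response, assumed
    lambda_best = 0.8            # °C/(W m-2): roughly the 3°C-per-doubling value

    print(f"dF for doubled CO2: {dF:.2f} W m-2")
    print(f"dT without feedbacks: {lambda_planck * dF:.1f} °C")  # ~1.1 °C
    print(f"dT with feedbacks:    {lambda_best * dF:.1f} °C")    # ~3.0 °C

The first result reproduces (to within rounding) the roughly 1.2°C of direct warming mentioned above; the second shows how a sensitivity parameter near 0.8°C per W m-2 translates into the roughly 3°C per doubling discussed below.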

Climate sensitivity is not specific to CO2

A common misconception is that the climate sensitivity and temperature change in response to increasing CO2 differs from the sensitivity to other radiative forcings, such as a change in solar irradiance.  This, however, is not the case.  The surface temperature change is proportional to the sensitivity and radiative forcing (in W m-2), regardless of the source of the energy imbalance. 

In other words, if you argue that the Earth has a low climate sensitivity to CO2, you are also arguing for a low climate sensitivity to other influences such as solar irradiance, orbital changes, and volcanic emissions.  Thus when arguing for low climate sensitivity, it becomes difficult to explain past climate changes.  For example, between glacial and interglacial periods, the planet's average temperature changes on the order of 6°C (more like 8-10°C in the Antarctic).  If the climate sensitivity is low, for example due to increasing low-lying cloud cover reflecting more sunlight as a response to global warming, then how can these large past climate changes be explained?
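To see why this is such a strong constraint, here is a rough sketch inverting the formula above (the candidate λ values are assumed purely for illustration):

    # Forcing needed to explain a ~6 °C glacial-interglacial swing: dF = dT / lambda
    dT_ice_age = 6.0                      # °C, as quoted above
    for lam in (0.2, 0.5, 0.8):           # candidate sensitivities, °C/(W m-2)
        print(f"lambda = {lam}: requires {dT_ice_age / lam:.0f} W m-2 of forcing")
    # A low sensitivity (0.2) would demand ~30 W m-2, far larger than any known
    # forcing change; ~0.8 requires ~7.5 W m-2, comparable to standard estimates.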


Figure 1: Antarctic temperature changes over the past 450,000 years as measured from ice cores

What is the possible range of climate sensitivity?

The IPCC Fourth Assessment Report summarized climate sensitivity as "likely to be in the range 2 to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded, but agreement of models with observations is not as good for those values."

Individual studies have put climate sensitivity from a doubling of CO2 at anywhere between 0.5°C and 10°C; however, as the data have steadily improved, it has become apparent that the extreme high and low values are very unlikely.  In fact, as climate science has developed and advanced over time, estimates have converged around 3°C.  A summary of recent climate sensitivity studies can be found here.

A study led by Stefan Rahmstorf concluded "many vastly improved models have been developed by a number of climate research centers around the world. Current state-of-the-art climate models span a range of 2.6–4.1°C, most clustering around 3°C" (Rahmstorf 2008).  Several studies have put the lower bound of climate sensitivity at about 1.5°C; on the other hand, several others have found that a sensitivity higher than 4.5°C can't be ruled out.

A 2008 study led by James Hansen found that climate sensitivity to "fast feedback processes" is 3°C, but when accounting for longer-term feedbacks (such as ice sheet disintegration, vegetation migration, and greenhouse gas release from soils, tundra or ocean), if atmospheric CO2 remains at the doubled level, the sensitivity increases to 6°C based on paleoclimatic (historical climate) data.

What are the limits on the climate sensitivity value?

Paleoclimate

The main limit on the sensitivity value is that it has to be consistent with paleoclimatic data.  A sensitivity that is too low would be inconsistent with past climate changes - basically, if some large negative feedback held the sensitivity that low, it would have prevented the planet from transitioning from ice ages to interglacial periods.  Similarly, too high a climate sensitivity would have produced more frequent and larger past climate changes than the record shows.

One recent study examining the Palaeocene–Eocene Thermal Maximum (about 55 million years ago), during which the planet warmed 5-9°C, found that "At accepted values for the climate sensitivity to a doubling of the atmospheric CO2 concentration, this rise in CO2 can explain only between 1 and 3.5°C of the warming inferred from proxy records" (Zeebe 2009).  This suggests that climate sensitivity may be higher than we currently believe, but it likely isn't lower.

Recent responses to large volcanic eruptions 

Climate scientists have also attempted to estimate climate sensitivity based on the response to recent large volcanic eruptions, such as Mount Pinatubo in 1991.  Wigley et al. (2005) found:

"Comparisons of observed and modeled coolings after the eruptions of Agung, El Chichón, and Pinatubo give implied climate sensitivities that are consistent with the Intergovernmental Panel on Climate Change (IPCC) range of 1.5–4.5°C. The cooling associated with Pinatubo appears to require a sensitivity above the IPCC lower bound of 1.5°C, and none of the observed eruption responses rules out a sensitivity above 4.5°C."

Similarly, Forster et al. (2006) concluded as follows.

"A climate feedback parameter of 2.3 +/- 1.4 W m-2 K-1 is found. This corresponds to a 1.0–4.1 K range for the equilibrium warming due to a doubling of carbon dioxide"

Other Empirical Observations

Gregory et al. (2002) used observed interior-ocean temperature changes, surface temperature changes measured since 1860, and estimates of anthropogenic and natural radiative forcing of the climate system to estimate its climate sensitivity.  They found:

"we obtain a 90% confidence interval, whose lower bound (the 5th percentile) is 1.6 K. The median is 6.1 K, above the canonical range of 1.5–4.5 K; the mode is 2.1 K."

Examining Past Temperature Projections

In 1988, NASA climate scientist Dr James Hansen published a groundbreaking study in which he used a global climate model to calculate future warming based on three different CO2 emissions scenarios labeled A, B, and C (Hansen 1988).  Now, after more than 20 years, we are able to review Hansen's projections.

Hansen's model assumed a rather high climate sensitivity of 4.2°C for a doubling of CO2.  His Scenario B has been the closest to reality, with the actual total radiative forcing running about 10% higher than in this emissions scenario.  The warming trend predicted in this scenario from 1988 to 2010 was about 0.26°C per decade, whereas the measured temperature increase over that period was approximately 0.18°C per decade - roughly 40% lower than Scenario B once the higher actual forcing is accounted for.

Therefore, what Hansen's model and the real-world observations tell us is that climate sensitivity is roughly 30-40% below 4.2°C - once again, in the neighborhood of 3°C for a doubling of atmospheric CO2.
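As a back-of-envelope check, here is a minimal sketch of that scaling argument (assuming, as a simplification, that inferred sensitivity scales linearly with the ratio of observed to expected warming trend; the input numbers are those quoted above):

    model_sensitivity = 4.2     # °C per doubling, Hansen's 1988 model
    trend_scenario_b = 0.26     # °C/decade, predicted 1988-2010
    trend_observed = 0.18       # °C/decade, measured
    forcing_ratio = 1.10        # actual forcing ran ~10% above Scenario B

    # Trend the model would have predicted given the actual forcing:
    trend_expected = trend_scenario_b * forcing_ratio

    inferred = model_sensitivity * trend_observed / trend_expected
    print(f"inferred sensitivity: {inferred:.1f} °C per doubling")  # ~2.6 °C

With the simpler uncorrected ratio (0.18/0.26), the same scaling gives about 2.9°C; either way the result lands well below Hansen's 4.2°C and near the 3°C consensus value.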

Probabilistic Estimate Analysis

Annan and Hargreaves (2009) investigated various probabilistic estimates of climate sensitivity, many of which suggested a "worryingly high probability" (greater than 5%) that the sensitivity is in excess of 6°C for a doubling of CO2.  Using a Bayesian statistical approach, this study concluded that

"the long fat tail that is characteristic of all recent estimates of climate sensitivity simply disappears, with an upper 95% probability limit...easily shown to lie close to 4°C, and certainly well below 6°C."
Annan and Hargreaves concluded that the climate sensitivity to a doubling of atmospheric CO2 is probably close to 3°C; it may be higher, but it's probably not much lower.
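A minimal sketch of the kind of Bayesian combination Annan and Hargreaves describe is shown below. The likelihoods are made-up Gaussians, purely for illustration; the point is that multiplying independent lines of evidence collapses the fat upper tail that any single uncertain estimate leaves open:

    import numpy as np

    S = np.linspace(0.0, 10.0, 1001)        # sensitivity grid, °C per doubled CO2

    def gaussian(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    prior = np.ones_like(S)                 # flat prior over 0-10 °C
    like_a = gaussian(S, 3.0, 1.5)          # e.g. a volcanic-response estimate (invented)
    like_b = gaussian(S, 2.8, 1.2)          # e.g. a paleoclimate estimate (invented)

    def upper_95(posterior):
        cdf = np.cumsum(posterior) / posterior.sum()
        return S[np.searchsorted(cdf, 0.95)]

    print(f"one line of evidence:  95% bound {upper_95(prior * like_a):.1f} °C")
    print(f"two lines of evidence: 95% bound {upper_95(prior * like_a * like_b):.1f} °C")
    # -> roughly 5.5 °C vs 4.4 °C with these invented inputs

With two independent constraints, the 95% upper bound drops from about 5.5°C to about 4.4°C, qualitatively mirroring the paper's conclusion.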


Figure 2: Probability distribution of climate sensitivity to a doubling of atmospheric CO2

Summary of these results

Knutti and Hegerl (2008) present a comprehensive, concise overview of our scientific understanding of climate sensitivity.  In their paper, they present a figure which neatly encapsulates how various methods of estimating climate sensitivity, examining different time periods, have yielded consistent results, as the studies described above show.  As you can see, the various methodologies are generally consistent with the range of 2-4.5°C, with few methods leaving open the possibility of lower values, but several unable to rule out higher values.


Figure 3: Distributions and ranges for climate sensitivity from different lines of evidence. The circle indicates the most likely value. The thin colored bars indicate the very likely range (more than 90% probability). The thicker colored bars indicate the likely range (more than 66% probability). Dashed lines indicate no robust constraint on an upper bound. The IPCC likely range (2 to 4.5°C) and most likely value (3°C) are indicated by the vertical grey bar and black line, respectively.

What does all this mean?

According to a recent MIT study, we're currently on pace to reach this doubled atmospheric CO2 level by the mid-to-late 21st century.

Figure 4: Projected decadal mean concentrations of CO2.  Solid red lines are the median, 5%, and 95% bounds from the MIT study; the dashed blue lines are the same from the 2003 MIT projection.
 
So unless we change course, we're looking at a rapid warming over the 21st century.  Most climate scientists agree that a 2°C warming is the 'danger limit'.   Figure 5 shows temperature rise for a given CO2 level. The dark grey area indicates the climate sensitivity likely range of 2 to 4.5°C.
 
Figure 5: Relation between atmospheric CO2 concentration and key impacts associated with equilibrium global temperature increase. The most likely warming is indicated for climate sensitivity 3°C (black solid). The likely range (dark grey) is for the climate sensitivity range 2 to 4.5°C. Selected key impacts (some delayed) for several sectors and different temperatures are indicated in the top part of the figure.

If we manage to stabilize CO2 levels at 450 ppmv (the atmospheric CO2 concentration as of 2010 is about 390 ppmv), according to the best estimate, we have a probability of less than 50% of meeting the 2°C target. The key impacts associated with 2°C warming can be seen at the top of Figure 5. The tight constraint on the lower limit of climate sensitivity indicates we're looking down the barrel of significant warming in future decades.
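The arithmetic behind that statement is easy to check with the logarithmic forcing approximation used earlier (dF = 5.35·ln(C/C0), with F_2x ≈ 3.7 W m-2 assumed): at the best-estimate sensitivity, the equilibrium warming for 450 ppmv already sits right at about 2°C, so roughly half the probability lies above the target.

    import math

    def equilibrium_warming(c_ppmv, sensitivity, c0_ppmv=280.0, f_2x=3.7):
        """Equilibrium warming (°C) at CO2 level c_ppmv, given a sensitivity
        in °C per doubling. Assumes dF = 5.35 * ln(C/C0)."""
        return sensitivity * 5.35 * math.log(c_ppmv / c0_ppmv) / f_2x

    for s in (2.0, 3.0, 4.5):             # IPCC likely range and best estimate
        print(f"sensitivity {s}°C: {equilibrium_warming(450.0, s):.1f} °C at 450 ppmv")
    # -> about 1.4, 2.1, and 3.1 °C respectively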

As the scientists at RealClimate put it,
"Global warming of 2°C would leave the Earth warmer than it has been in millions of years, a disruption of climate conditions that have been stable for longer than the history of human agriculture. Given the drought that already afflicts Australia, the crumbling of the sea ice in the Arctic, and the increasing storm damage after only 0.8°C of warming so far, calling 2°C a danger limit seems to us pretty cavalier."

This post is the Advanced version (written by dana1981) of the skeptic argument "Climate sensitivity is low". Note: a Basic version is on its way and should be published shortly.


Comments


Comments 1 to 50 out of 101:

  1. The line from RealClimate is the most telling, and it amazes me that people can ignore its impact: "... warmer than it has been in millions of years, ... longer than the history of human agriculture". It amazes me that anyone believes we can cause that great a change so quickly and not expect drastic and unpleasant changes as a consequence.
  2. Question. At what temperature does an Interglacial Period begin in Figure 1 and at what temperature does it end?
  3. "A common misconception is that the climate sensitivity and temperature change in response to increasing CO2 differs from the sensitivity to other radiative forcings, such as a change in solar irradiance. This, however, is not the case. The surface temperature change is proportional to the sensitivity and radiative forcing (in W m-2), regardless of the source of the energy imbalance. " The problem with this statement is that it assumes that variation doesn't exist between different sets of climate couplings/forcings-ie the idea that negative feedback can act on one parametre and not another. Eg: lets say we increase sunlight, which for sake of argument, reduces cloud cover in temperate regions, producing a positive feedback. T rises higer than it would from solar output alone. Eg2: lets say we increase c02, which warms the tropics, which, for the sake of argument, produces more low cloud cover (more water held by air in warmer temperatures) which increases endothermic cooling due to more coulds/precipitation (same mechasim as our bodies sweating-which is also probably ocuring now with increased in rainfall in tropics with La Nina). A negative feedback from a rise in C02. In the 2 above examples, one is a strong climate senstivity with regards to the sun, the other a low climate senstivity with regards to c02. Why do the 2 sensitivities have to always be the same?? (sun/c02)? If you argue that the sun would also produce more clouds in teh tropics from the same sort of warming this isnt necassarily so, because the increase in solar output is logarithmic between the tropics and the arctic due to variation in angle of incidence, whereas c02 would be more uniform from tropics to arctic. So not only is there possibly variations in feedbacks between c02/sun, but also variations in feedbacks between various focrings between the tropics and arctic. Even if the above examples are mistaken, I just dont see how all climates sensitivities have to be the same- ie all high climate sensitivity, or all low climate sensitivity. ?? (Moreovoer the 1.2 degrees with C02 regardless of feedback isnt assured either, due to much the same sort of issues).
  4. Could anyone explain how it got almost 3°C warmer than now prior to the last ice age, with lower CO2 levels?
  5. And could someone please explain to me one time why, in all the ice core data, we never see temperature continue to rise when CO2 "catches up" with it on a chart. I mean, logic would tell you that if the feedback thing were true, our peak temperatures in all cycles would FOLLOW the peak of CO2. It never happens that way, though.
  6. cruzn246 - see the Intermediate version of Climate's changed before. You are probably also interested in CO2 lags Temperature. In short - CO2 isn't the only forcing in town; it's just the one causing the current warming.
  7. cruzn246, in addition to the links scaddenp provided that directly answer your questions, you should also see the more general post CO2 Is Not the Only Driver of Climate.
  8. For contrast, here's the simplest possible explanation of sensitivity: http://www.ssec.wisc.edu/data/east/latest_eastwv.jpg When water vapor is uneven, the earth cools; when it is more uniform, the earth warms. Models show concentrated convection cools, high clouds warm, low clouds cool, and upper tropospheric water vapor warms, all resulting from the distribution of water vapor. Water vapor is distributed by weather, and weather is poorly modeled in GCMs at the smaller scales (local convection) where the detail is important.
  9. cruzn246 writes: Could anyone explain how it got almost 3°C warmer than now prior to the last ice age with lower CO2 levels? There are several contributing answers to this. First, the fluctuations you see in Fig. 1 took place over long periods of time -- by comparison, the modern CO2 & temperature increase is just getting started. Second, despite the caption to Fig. 1, it's not necessarily certain that global temperatures were that much warmer during the previous interglacial; the ice cores presumably weight temperatures in the Antarctic and the Southern Ocean more heavily than the rest of the globe. Third, as scaddenp and Tom Dayton note, there are other factors at play. As just one example, if we stopped emitting sulfate aerosols and let the atmosphere clear for a few months, we'd find that the radiative forcing from CO2 was being masked by aerosol cooling. Obviously, heavy industry was not quite as much of a factor 120,000 years ago! There are probably other points that I'm forgetting, too. The bottom line is that there's no one single answer to that question; it's a combination of multiple effects.
  10. I teach chemistry at the university level. The atmosphere is a chemical system in equilibrium with the chemical system of the oceans. Le Chatelier's principle states that a system in chemical equilibrium reacts to an external stimulus to maintain that equilibrium. So the system may react to changes by maintaining its equilibrium. What happens is this: the system changes very little until its ability to react is overwhelmed, then changes are rapid and drastic. This is why climate scientists talk about a "tipping point" beyond which the Earth cannot comfortably recover.
  11. Well Eric, firstly I don't think you can have some local perturbations somehow messing up an average. And I would support that by noting that the determinations of sensitivity from GCMs match rather well with estimates of sensitivity from empirical techniques such as eruption response and LGM data.
  12. On that note from scaddenp, I refer eric to my climate sensitivity article.
  13. Whoops, this is the climate sensitivity article. In which case I suggest that eric re-read it.
  14. I'm still puzzled by one group of scientists saying humidity in the atmosphere of our planet has trended down since 1958 and another group that says it has gone up with the rise in temperature. The downward trend was measured by NOAA. Am I missing something here... anyone?
  15. If the actual humidity in our atmosphere has gone down in real terms, as measured by radiosondes and satellites by NOAA, the discussion about climate feedbacks of CO2 being positive is over. You have to have higher humidity to trap more heat. Am I wrong... anyone?
  16. Re: adrian smits (14) Unsure where you're getting that downward trend in humidity claim from. This from NOAA contradicts that claim. Re: adrian smits (15) See above. And then see this for a better grounding in CO2's role in the greenhouse effect. Unless skeptics come up with a physics-based alternative to the well-understood physics of greenhouse gases, CO2 rules and water vapor, while important, is a bit player (I restrained myself with great effort from using "drools" vice the "great effort" bit I actually went with - pity). See also Richard Alley's talk: CO2 is the biggest control knob. The Yooper
  17. Adrian - I think you are getting confused over data sets. All the data sets that I have seen (e.g. see chapter 3 of IPCC WG1, AR4) show water vapour going up. Look for specific humidity or precipitable water. Perhaps you could provide a link to the data set from NOAA that you think contradicts this?
  18. Ned, back then there was stuff growing in Canada that usually grows well to the south. Of course it was much warmer worldwide then.
  19. That's why I asked the question. I've found graphs at WUWT from NOAA that show a downward trend, and I've also seen the graph you talked about, Daniel, so what's up with that? Is there a difference between general humidity and specific humidity? Specific humidity is the first time I've seen that term used.
  20. Re: adrian smits (19) Ask yourself this: Do I trust a graph from a blog citing a source, or do I double-check the graph on the blog versus what the source itself shows? In this case, I pointed you to the NOAA source. If what's on display at WUWT is different from what NOAA (the source) shows, why do you think that would be? You have an inquiring mind, else you would not be here. You'll figure it out. For your second question, if by general humidity you mean relative humidity:
    "Humidity is the amount of water vapor present in the air and the relative humidity is the measure of the amount of water vapor present in the air compared to the amount needed for saturation."
    while Specific Humidity is
    "the mass of water vapour in a sample of moist air divided by the mass of the sample."
    When the temperature of air is cooled or reduced, the relative humidity (RH) increases. The moisture content of the air remains the same until the RH rises to the point of 100% saturation and condensation occurs (source here). NOAA tracks specific humidity. Hope that's more clear than mud :) The Yooper
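    For anyone who wants to see the relationship in numbers, here is a small sketch converting relative humidity to specific humidity via the Magnus approximation for saturation vapor pressure (standard textbook constants, not NOAA's exact procedure):

        import math

        def saturation_vapor_pressure(t_c):
            """Magnus approximation for saturation vapor pressure, in hPa."""
            return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

        def specific_humidity(rh_percent, t_c, pressure_hpa=1013.25):
            """Specific humidity in g of water vapor per kg of moist air."""
            e = (rh_percent / 100.0) * saturation_vapor_pressure(t_c)
            return 1000.0 * 0.622 * e / (pressure_hpa - 0.378 * e)

        # The same 50% relative humidity means more water in warmer air:
        for t in (10.0, 20.0, 30.0):
            print(f"{t:.0f} °C at 50% RH: q = {specific_humidity(50.0, t):.1f} g/kg")

    So a falling relative humidity can coexist with a rising specific humidity if the air has warmed - which is why it matters which quantity a graph is actually plotting.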
  21. cruzn246 - warmer in Canada does NOT mean warmer worldwide. Nonetheless, it may have been warmer, say around 3000 BC, than now from other lines of evidence. However, the important question is: what forcing? Most likely it was solar, but it's not solar today.
  22. WUWT - well-known authoritative source of data. :-) I hope they gave you enough detail to go back to the original source.
  23. Hi, cruzn246. I agree that it was probably warmer than today at the peak of the previous interglacial; my point was just that we don't necessarily know exactly how much warmer, because Antarctic ice cores aren't necessarily giving us the global mean temperature -- temperatures in the Southern Ocean are probably weighted more heavily. As for the distribution of plant communities, the biosphere had thousands of years to adjust to warming temperatures at the time. Right now we're raising CO2 and temperatures on a decadal-to-century time scale. It takes a while for things to come into equilibrium (which won't actually happen until after we stop increasing CO2 concentration).
  24. scaddenp (#11): my mantra is "all climate is local". There is no such thing as "messing up an average" since the average is a function of all the locals. The satellite shot I showed may indicate some local uneven distribution of water vapor (at least in N. America and the adjacent Atlantic right now), but that unevenness is NOT balanced by some evenness somewhere else to produce an average. If there is more unevenness elsewhere then the world is cooling, period. Tomorrow it might even out and we will have global warming. What happens is entirely up to the local weather. Your appeal to models falls flat. Computation of the effects of Pinatubo, for example, is very crude. The aerosols affected weather differently as they spread, not just an oversimplified reduction in solar radiation as is performed in the model. The weather response to Pinatubo also included the fact that we were in El Nino beforehand, also poorly modeled in the GCM. Once the GCMs can replicate (not predict, obviously, since that requires unknowable initial conditions) the frequency and magnitude of climate features like El Nino, then they will be believable for modeling the response to Pinatubo. BTW, proving that humidity has gone up on average means nothing. The distribution of water vapor is the only thing that causes or doesn't cause global warming. If water vapor is evenly distributed then there is global warming, if not, global cooling. There is a large natural range encompassing both cases and a lot in the middle.
  25. Re: Eric ("skeptic") @ 24
    "BTW, proving that humidity has gone up on average means nothing. The distribution of water vapor is the only thing that causes or doesn't cause global warming. If water vapor is evenly distributed then there is global warming, if not, global cooling. There is a large natural range encompassing both cases and a lot in the middle."
    You are in error. 1. As my earlier link clearly shows, global humidity anomalies have increased since 1970 (by about 4%, equivalent to the volume of Lake Erie of the Great Lakes). As humidity levels in the air normalize within nine days (excess precipitates out while evaporation "refills the tanks"), in order for the air to hold increased moisture over time it must have warmed. Multiple global datasets show this warming. Look it up. 2. The GHG effects of CO2 work their stuff in the upper troposphere, above the major concentrations of water vapor in the lower troposphere. Water vapor acts as a feedback to the warming impetus caused by rising CO2 concentrations (forcings). This is all basic stuff (the physics of greenhouse gases is very well understood). Look it up (keywords: back radiation). Ample resources for education on both points exist on Skeptical Science. And on many other reputable websites. Pour the coppers of your pockets into your mind and your mind will fill your pockets with gold: Real Climate: Start Here The Discovery of Global Warming Richard Alley's talk: CO2 is the biggest control knob The Yooper
  26. Daniel Bailey (#25) said "in order for the air to hold increased moisture over time it must have warmed." True for an ideal situation (the C-C relationship), but certainly does not hold for a global average. If that global average increased moisture is evenly distributed then the world must be warmer on average. If not, then not necessarily, depending on how uneven the water vapor is (as a whole, not just the increase). This is a very common misconception here, and your second point is a good starting point: "Water vapor acts as a feedback to the warming". How? By absorbing and emitting IR, some of which returns downward. The distribution of water vapor is what determines the amount of GH warming from WV. There is more GH warming from WV on average if WV increases on average. But that is only true if the distribution of that WV stays the same (meaning weather stays the same on average). There are many threads here insisting that weather is changing and that the distribution of WV is more highly concentrated (increased precipitation extremes as one common example). When WV is highly concentrated, the areas with greater concentration reach saturation for IR absorption. The areas with less have less absorption than if the WV were more evenly distributed. The result is less warming (sensitivity) than if the weather remained constant with the increased warming from CO2.
  27. I've been wondering about this for a while, but doesn't climate sensitivity depend on your initial conditions to some degree? For example, if you start off with a positive forcing under glacial conditions, you need to take into account ice albedo feedback. On the other hand, if you start off in hothouse conditions where no significant ice caps exist, it seems that the same forcing would not be amplified in the same way. Am I missing something, or is it just that we focus on sensitivity under conditions comparable to those of today?
  28. Also, the link to Knutti & Hegerl 2008 is broken (there's a %20 tagged on at the end).
  29. Figure 1 displays temperatures for two Antarctic sites. I think it might confuse readers that they are called "global" in the caption.
  30. Werecow, yes the values for 'climate sensitivity' that the article describes are all predicated on the assumption of the current climate. Whatever the 'true' value is would not hold exactly the same after multiple 'doublings' of CO2 over a span of thousands of years. You may notice that most of the paleoclimate studies which attempt to figure out climate sensitivity are focusing on time periods where conditions were relatively close to those we have today. If sensitivity were a constant it wouldn't really matter what time period you looked at. That said, the single largest climate sensitivity factor we have identified is water vapor... and that is tied directly to temperature over a very wide range. Basically, this largest aspect of climate sensitivity wouldn't show significant variation unless the temperature got to the point where all water froze or all water evaporated.
  31. CBDunkerson: That makes sense. Thanks for clearing that up.
  32. #24 Eric (skeptic) at 20:09 PM on 9 September, 2010 proving that humidity has gone up on average means nothing. The distribution of water vapor is the only thing that causes or doesn't cause global warming. If water vapor is evenly distributed then there is global warming, if not, global cooling. There is a large natural range encompassing both cases and a lot in the middle. You are right. See this comment at another thread for example. This is how unevenly water is distributed in the atmosphere: With a simple zero-dimensional climate model it is very easy to demonstrate the effect. The more uneven the atmospheric water vapor distribution gets, the lower the average surface temperature goes. For a realistic range of parameters, entropy production of the system also goes up as water vapor gets lumpy, even if heat distribution along the surface is extremely efficient (same surface temperature everywhere). The main problem with analytic computational climate models is that they are unable to resolve these fine structures, so they simply apply averages at sub-grid scales. To put it another way: you can see through a barbed-wire fence easily, but if you take the average density of iron per unit area, it becomes indistinguishable from a thin but absolutely opaque iron plate.
  33. Eric (skeptic) and BP - Absolutely correct and very relevant: the distribution of cloud cover affects temperature. The question, which I have not seen solidly answered, is whether and how the distribution of cloud cover changes with temperature, and if so whether it acts as a positive (rising cloud height) or negative (expanded cloud area) feedback. Stephen Schneider in his last TV appearance stated that a 2% increase in cloud cover would halve the CO2-induced temperature rise (negative feedback), while a 0.2 km increase in cloud height would double it (positive feedback), and that our current measurements are not accurate enough to determine those changes. Hence the wide range of climate sensitivity due to cloud feedback.
  34. Berényi Péter @32 That is not a picture of water vapour
  35. Dana1981, Thanks for the excellent article. #3 Thingadonta, I think you raise some valid points in that the pattern of the change can be different between different sources of an imbalance. For instance, the pattern caused by an increase in solar output can be different from an increase in GHGs. There are articles on this site and elsewhere which identify these differences; for example, a larger rise in nighttime lows than in daytime highs is associated with a higher GHG content in the atmosphere where that would not be the pattern expected from an increase in solar output. However, the atmosphere and oceans do a pretty good job of distributing the energy over the globe, and in the long run, energy in is very close to energy out (radioactive decay and others making up the tiny fraction difference). #29 Lars, I think you'd have to propose a climate system where Antarctic and Greenland were isolated from the rest of the world for tens of thousands of years, in order to seriously question that there is a relationship between the ice core derived temperatures and the global average. It's pretty well known that the record is not a global proxy. However, they are, in part, based on oxygen isotope ratios. I'm willing to accept that oxygen is pretty well mixed in the biosphere over periods of thousands of years. Werecow, I'll second what CBDunkerson said, and add that an awareness that the sensitivity varies based on what is the current state can pretty readily be found in paleoclimate studies. On the distribution of water vapor and clouds: The really nice thing about the paleoclimate is that it takes all these considerations into account.
  36. BP, thanks for the link, I agree completely. The use of averages in grid cells is not much different, philosophically, than using world wide parameterizations for water vapor feedback. It's a lot of use of ideal or heavily simplified relationships, which are valid in some cases by definition and then assumed to have a plus and minus delta on each side of them that averages out. KR, the cloud argument is valid and affects both IR back radiation and albedo. But the concentration of water vapor outside of clouds is just as important (if not more). Each point in space and time will have some thermodynamic formula for back radiation. Nothing much can be said about an average for an area (e.g. the size of a GCM grid cell or the whole earth) using global-average sensitivity formulas as suggested above. The main problem being that the formula cannot account for water vapor unevenness. Using climate models with volcanic aerosol inputs without being able to model the weather (e.g. mesoscale convection) will simply produce an overestimate of sensitivity. For example http://www.independent.co.uk/news/science/pinatubos-summer-washout-volcanoes-erupt-and-the-worlds-weather-changes-bill-burroughs-explains-the-link-between-forecasting-such-effects-and-global-warming-1554179.html shows that weather itself was a negative feedback for this particular volcano preceded by El Nino (not lower humidity on a global average basis from a globally averaged volcanic cooling).
  37. Chris G (#35), you are mostly right that weather and other internal factors should have some paleo consistency. But not completely right, because of biosphere changes, geological and other long-term changes that affect weather. The bigger problem is that paleoclimate estimates require a mutual relationship with no other external factors. Unless there are paleo records of the other factors (partly possible) and a known relationship between those and the CO2 and temperature being estimated (not very well known). To take a simple example, if some solar magnetic effect (e.g. GCR flux) affects both temperature and CO2, then the sensitivity of temperature to increased CO2 alone cannot be determined from those two proxy measurements.
  38. Eric (skeptic) - I think that averages can be used if they are understood and properly chosen. As an example, Trenberth has recalculated upwards IR from the Earth's surface in his energy budgets to be an average of 396 W/m^2, rather than the earlier value 390 W/m^2, due to a better understanding of surface temperature variations and the effect via the T^4 relationship of temperature and energy emitted. If the average is properly chosen, it will be fine as a model input, even if the gridding is much larger than the fine level detail of water vapor levels or cloud coverage.
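    The Trenberth example is easy to reproduce: because emission scales as T^4, a surface with temperature variations emits more than a uniform surface at the same mean temperature. A toy check (the two-point temperature field is invented for illustration):

        SIGMA = 5.67e-8                # Stefan-Boltzmann constant, W m-2 K-4

        temps = [273.0, 303.0]         # assumed two-point field with mean 288 K
        mean_t = sum(temps) / len(temps)

        uniform = SIGMA * mean_t ** 4
        varying = SIGMA * sum(t ** 4 for t in temps) / len(temps)

        print(f"uniform 288 K surface emits {uniform:.0f} W/m^2")  # ~390 W/m^2
        print(f"varying surface emits      {varying:.0f} W/m^2")   # ~396 W/m^2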
  39. #33 KR at 02:22 AM on 11 September, 2010 the distribution of cloud cover affects temperature. It's not just cloud cover. I have shown you clouds because they are visible to the naked eye. The boundary of clouds is a special surface in the atmosphere separating regions with above 100% relative humidity from those below it. But all the other surfaces of equal relative or specific humidity are fractal-like. Precipitable water index distribution for North America: Now, let's consider a very simple climate model. There are two layers, the surface and the atmosphere. In such a model the atmospheric (absolute) temperature is always about 0.84 times the surface temperature, because from there half the thermal radiation goes up, half down (and 0.84 ≈ 2^(-1/4)). As the path length is fixed in this model, the IR optical depth τ is proportional to the concentration of GHGs in the atmosphere. For the sake of simplicity, let's suppose it is independent of wavelength in thermal IR. In this case the absorptivity/emissivity of the atmosphere is 1 - e^(-τ). Also, let the atmosphere be transparent to shortwave radiation. If I is the shortwave radiation flux at the surface and T is the absolute temperature there (and the surface radiates as a black body in IR), then I = (1 + e^(-τ))/2 · σ·T^4 (σ is the Stefan–Boltzmann constant). It is easy to see that for a given SW flux I, if the IR optical depth τ is increased, T should go up as well. However, let's make the model just a little bit more complicated. Let's have two compartments of equal area over which the sum of GHGs is constant but may differ between them. That is, in compartment A the optical depth is 2τ·cos²φ and in compartment B it is 2τ·sin²φ (the average is τ, of course). Also, let the heat transport between compartments be very efficient, so the surface temperature T is the same everywhere. In this case the effective optical depth is τ_eff = -log((e^(-2τ·cos²φ) + e^(-2τ·sin²φ))/2). Now, τ_eff happens to have a maximum at φ = 45°, where the GHG distribution is uniform between the compartments, and it decreases as the distribution gets more uneven. Therefore a small increase in overall IR optical depth τ due to increased GHG concentration can be compensated for by making its distribution skewed. Water vapor, as a not-well-mixed GHG, is perfect for this purpose. I do not put the expression for entropy production here, because it is a bit complicated. But you can figure it out yourself from the radiative entropy flux of a black body being (4/3)·σ·T^3. Anyway, overall entropy production is also increased by decreasing τ_eff, so the maximum entropy production principle pushes the climate system toward an uneven GHG distribution whenever it is possible. Note that cloud albedo is not taken into account at all in this discussion, only clear-sky water vapor distribution.
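    The claimed maximum at φ = 45° is easy to verify numerically; a quick sketch (with an illustrative τ = 1):

        import math

        def tau_eff(tau, phi):
            """Effective optical depth for two equal-area compartments with
            depths 2*tau*cos^2(phi) and 2*tau*sin^2(phi); their mean is tau."""
            a = math.exp(-2.0 * tau * math.cos(phi) ** 2)
            b = math.exp(-2.0 * tau * math.sin(phi) ** 2)
            return -math.log((a + b) / 2.0)

        for deg in (45, 30, 15, 0):    # 45° = uniform split, 0° = fully skewed
            print(f"phi = {deg:2d}°: tau_eff = {tau_eff(1.0, math.radians(deg)):.3f}")
        # -> 1.000, 0.880, 0.664, 0.566: tau_eff falls as the distribution skews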
  40. Berényi - a very interesting post here. Are you arguing a particular point from it, however? I cannot tell. As I stated in this post, small scale fractal phenomena can be incorporated into rather coarsely gridded models if the average values are appropriately generated and dealt with.
  41. #40 KR at 12:24 PM on 11 September, 2010 As I stated in this post, small scale fractal phenomena can be incorporated into rather coarsely gridded models if the average values are appropriately generated and dealt with. Yes, but the physics gets lost in the process. Equations governing averages can't be derived without being able to handle the true equations at all scales. Therefore you are left with some ill-founded empirical formulas. In the case of water vapor, simple averages are not even sufficient, no matter how you calculate the average. You also need higher moments of the distribution to get the average optical depth for grid cells. Do you also have equations for those to implement them in a computational model?
  42. "the physics gets lost in the process. Equations governing averages can't be derived without being able to handle the true equations at all scales. " Pardon? This is an extraordinary statement, flying in the face of both experimental and theoretical evidence in the modelling area that I work in. Can you please explain further what you mean? For the water vapour question, there appears to be a huge literature (eg look at cites for A,M. Tompkins 2002) but I know little about it. Why not ask say Gavin Schmidt directly about it instead of guessing? I would agree that all models are wrong, but some are useful. GCMs have been shown to have considerable skill however in many areas.
  43. BP, scaddenp is quite correct in his astonishment at your statement that "the physics gets lost in the process. Equations governing averages can't be derived without being able to handle the true equations at all scales." You are suffering from a severe case of naive reductionism. You'd better not take an aspirin for that condition, until the physicists have finished the Grand Unified Theory so they can model from the lowest level up, what aspirin will do!
  44. #43 Tom Dayton at 11:54 AM on 12 September, 2010 BP, scaddenp is quite correct in his astonishment at your statement that "the physics gets lost in the process. Equations governing averages can't be derived without being able to handle the true equations at all scales." No, he is not. Here is the paper he has referenced: J. Atmos. Sci., Volume 59, Issue 12 (June 2002) pp. 1917–1942. A Prognostic Parameterization for the Subgrid-Scale Variability of Water Vapor and Clouds in Large-Scale Models and Its Use to Diagnose Cloud Cover. Adrian M. Tompkins. The first thing to note is that there is nothing mysterious about "first principles". They are just the most thoroughly tested core of physics, presented in a manner that is consistent and understandable. By "tested" I mean a gazillion actually performed experiments and careful observations, any of which could have falsified these principles but failed to do so. Their understandability is also an indispensable criterion, because in certain directions human understanding has demonstrably more power than any algorithm. Of course understanding needs some algorithmic preprocessing of its input to be able to kick in, but then it performs something no (Turingian) computing can ever do. Among other things this ability is in its prime in debugging. The preprocessing I have mentioned is for transforming the proposition to be understood into a very special (recursively modular) form. With some irreducible representations it simply can't be done, so there are indeed things unattainable for us. A computational climate model which is transparent in this sense is still to be built. But back to your astonishment. You have probably heard of the Clay Institute's Millennium Prize Problems. These are extraordinarily hard and important mathematical problems, with a one million dollar prize for each (must be the least easy way to make a million bucks). Now, among these problems we can find the existence (or the lack of it) of well-behaved solutions of the Navier-Stokes equations for reasonable initial conditions. These equations describe the motion of fluid substances, pretty basic stuff for e.g. GCMs, so the truly astonishing fact is that the science is not settled at all, not even the science of basic mathematical tools. And the existence problem of solutions to incompressible fluid motion is just the tip of the iceberg; there are many more unresolved tidbits around turbulent flows. Rather expensive wind tunnels are not maintained by mere accident in an age of supercomputers. And now let's see what Tompkins did. He has recognized the fact that GCM performance depends not only on average humidity in grid cells, but also on finer details of its distribution inside the cells, that is, on higher moments like variance and skewness (or kurtosis, although he fails to mention this one), just like I was trying to show you above. Then, at least to my astonishment, he proceeds as follows: "From the brief review of statistical schemes it is apparent that a widely varying selection of PDFs [probability density functions] have been used. One reason for this is that it is difficult to obtain generalized and accurate information from observations concerning variability [of humidity] down to small scales" Difficult, indeed. The traditional approach in cases like this is to consider the difficulty as a challenge and start working on ways to overcome it. It is quite astonishing how often this kind of attitude has proven to be fertile.
But instead of aiming for actual data, he tries to circumvent the problem by resorting to a CRM (Cloud Resolving Model). That is, instead of going out and having a look or two at what's going on in nature, he applies another computational model, this time on a smaller scale. "The first aim of this paper therefore, is to use a 3D CRM, on domains on the order of a climate model grid box but also with relatively high horizontal resolution, to assess whether a generalized function form exists that can describe the total water variability" It would not be such a serious problem had he not called running a program an "experiment" and program output "data". "Examination of the PDFs every half hour throughout the experiment proved them to be very similar in characteristics, since the computational domain was sufficient in size to continuously contain an ensemble of clouds, and the initial conditions were a realistic field of clouds in a state of quasi-equilibrium. The data at the 65 536 grid points are divided into 200 bins of equal width [etc., etc.]" Of course Gedankenexperiments were always legitimate tools of scientific inquiry, but they are not substitutes for actual experiments or observations. Traditionally they were never used to settle the science, but to uncover holes in existing theory (to be filled later by either reshaping the theory or collecting relevant data). Note that the CRM he has experimented with has the same Navier-Stokes equations in its belly, also gridded, and therefore unable to handle sub-grid processes of its own. As flows tend to be turbulent down to microscopic scales (have you ever seen the fine threads of cigarette smoke?) and turbulence generates fractal structures, this model also has to be parametrized. In 3D flows (unlike in 2D ones) the energy in small-scale eddies never gets negligible by shrinking through many orders of magnitude (this is the very reason behind the mathematical troubles). So. Water vapor distribution statistics presented in this paper are neither rooted in first principles nor in actual observations; they just hover in thin air (they do what vapor is supposed to do, after all).
  45. BP - why do you think I suggested that you look at CITES of the paper, rather than the paper? You think this problem is unstudied, unsolved and unsolvable? Firstly, using CRM - do you believe that these are developed without reference to empirical principles? Also, cloud parameterization uses both CRM results AND observation results. The problem with reliance on observational results is that you cannot extrapolate with much certainty outside the range of observation parameters, whereas a model from basic principles allows more confidence. I am sure you would be highly critical of parameterizations that extrapolated from observations. Also NS. Do you suppose that if NS turns up in engineering, we are forced to shrug our shoulders and walk away? Of course not - there is a massive toolkit for dealing with NS. And yes, this can involve trading confidence in solutions at boundaries for information loss within. And by the way, surely most of thermodynamics is laws of averages (e.g. pressure, temperature), for which there are fundamental problems doing derivations from true equations at all scales. I'd say most of the whole science of chemistry fits this as well. So are you really looking to understand the limits and skill of models, or looking for excuses to dismiss them as fits another agenda?
  46. #45 scaddenp at 09:08 AM on 13 September, 2010 So are you really looking to understand limits and skill of models, or looking for excuses to dismiss them as fits another agenda? I am trying to understand what's going on. Have a look at this review article, please. Phil. Trans. R. Soc. A (2005) 363, 2931–2946 doi:10.1098/rsta.2005.1676 Published online 24 October 2005 Modelling climate change: the role of unresolved processes BY PAUL D. WILLIAMS "In climate models, too, it is tacitly assumed that the effectively random subgrid-scale events are so large in number that their integrated effect on the resolved scales is predictable, allowing it to be included in models. However, in fluids there is an enormous separation of scales between the microscale and the macroscale. There is no such ‘thermodynamic limit’ in the climate system, as suggested by figure 1. Phrased differently, if there were a billion clouds, gravity waves or ocean eddies in a GCM grid box then their impacts on the resolved flow would be predictable, like the temperature of a gas, and the current treatment of unresolved scales in climate models would be defensible. But such a separation of scales between the resolved and unresolved dynamics simply does not exist. The number of sub-grid-scale events per grid box is not large enough to permit the existence of a meaningful statistical equilibrium." But simply adding random noise to climate variables, as the author insists, is not a panacea either. As the climate system is not an ergodic process, it is a bit difficult to collect reliable empirical data on the statistics of the noise to be added, although, as we have seen above, in some cases the statistics alone can determine the outcome. This is all pretty fascinating stuff; I just wonder why this side of the coin is left out of climate communication entirely, including research press releases and this fine science blog itself.
  47. BP #46 I am trying to understand what's going on. Sorry but the evidence suggests that this is not the case. If it was, you would have dealt with the serious deficiency in a prior analysis or allowed someone else to do so. Fixing this problem is simple (I can do it for you if you don't know how to do it yourself), yet despite repeated requests you have not done so. I suggest you fix the problem so that you are not continually nagged about it, as it reflects very badly on your credibility at present
  48. Berényi - In regards to water vapor feedback, you should look at Dessler and Sherwood (2009) (as referenced by Chris Colose), where they discuss gridded averaging of complex behavior, and how it appears to work just fine in modeling. At its core, they state that "The large-scale wind and temperature fields that mainly control the humidity are explicitly calculated from the basic fluid equations, unlike small-scale processes that must be represented by crude parameterizations.", as the water vapor involved in the feedback is primarily in the upper tropical troposphere, above the majority of clouds.
  49. Berényi Péter, the millennium prize problem has a completely different goal than "simply" solving the Navier–Stokes equations. Are you just making noise to not let people understand?
  50. #49 Riccardo at 07:07 AM on 14 September, 2010 the millenium prize problem has a completely different goal than "simply" solve the Navier–Stokes equations I think I was clear enough. Re-read please. That problem is about the very existence of well-behaved solutions, which is of course different from actually solving the equations. It does not have immediate consequences either on numerical integration of these beasts which is relied upon heavily by GCMs. However, any step forward in this specific area of research would advance our general understanding of structures behind turbulent phenomena and that could be useful in climate modeling as well. In 3D flows a considerable portion of energy is being pushed to ever smaller scale features. This energy is thermalized eventually, but the road leading there is rather bumpy. The dissipation process is not uniform, on intermediate scales even focusing phenomena can develop giving birth to such extreme events as rainbands or tornadoes spawned from the eyewall of hurricanes making landfall. Therefore some rather tricky context-dependent statistics is required for the parametrization of sub-grid phenomena, one that is neither measured properly, nor can it be derived from first principles due to lack of understanding. BTW, I have found an essay which is highly relevant in this context. Bulletin of the American Meteorological Society Volume 86, Issue 11 (November 2005) pp. 1609–1614. DOI: 10.1175/BAMS-86-11-1609 The Gap between Simulation and Understanding in Climate Modeling Isaac M. Held "Should we strive to construct climate models of lasting value? Or should we accept as inevitable the obsolescence of our models as computer power increases?"




