





Can we trust climate models?

Posted on 24 May 2011 by Verity

This is the first in a series of profiles looking at issues within climate science, also posted at the Carbon Brief.

Computer models are widely used within climate science. Models allow scientists to simulate experiments that would be impossible to run in reality - particularly projecting future climate change under different emissions scenarios. These projections have demonstrated how strongly the actions taken today will shape our future climate, with clear implications for the climate policy decisions society makes now.

Many climate sceptics have, however, criticised computer models, arguing that they are unreliable or that they have been pre-programmed to produce specific results. Physicist Freeman Dyson recently argued in the Independent that:

“…Computer models are very good at solving the equations of fluid dynamics but very bad at describing the real world. The real world is full of things like clouds and vegetation and soil and dust which the models describe very poorly.”

So what are climate models? And just how trustworthy are they?

What are climate models?

Climate models are numerical representations of the Earth’s climate system. The numbers are generated using equations which represent fundamental physical laws. These physical laws (described in this paper) are well established, and are replicated effectively by climate models.

Major components of the climate system, such as the oceans, land surface (including soil and vegetation) and ice/snow cover, are represented by the current crop of models. The various interactions and feedbacks between these components have also been added, using equations to represent the physical, biological and chemical processes known to occur within the system. This has enabled the models to become more realistic representations of the climate system. The figure below (taken from the IPCC AR4 report, 2007) shows the evolution of these models over the last 40 years.

Climate model development over the last 40 years

Models range from the very simple to the hugely complex. For example, ‘earth system models of intermediate complexity’ (EMICs) are models consisting of relatively few components, which can be used to focus on specific features of the climate. The most complex climate models are known as ‘atmospheric-oceanic general circulation models’ (A-OGCMs) and were developed from early weather prediction models.

A-OGCMs treat the earth as a 3D grid system, made up of horizontal and vertical boxes. External influences, such as incoming solar radiation and greenhouse gas levels, are specified, and the model solves numerous equations to generate features of the climate such as temperature, rainfall and clouds. The models are run over a specified series of time-steps, and for a specified period of time.

As the processing power of computers has increased, model resolution has hugely improved, allowing grids of many millions of boxes and the use of very small time-steps. However, A-OGCMs still have considerably more skill in projecting large-scale than small-scale phenomena.
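
To make the grid-and-time-step idea concrete, here is a minimal sketch (not taken from any real GCM) that steps a one-dimensional temperature field forward in time on a grid of boxes using a simple diffusion equation. It only illustrates the numerical pattern described above: discretise space into boxes, apply a physical law, and march forward in small time-steps. All parameter values are illustrative.

    # A toy illustration of solving a physical equation on a grid of boxes
    # over discrete time-steps. This is NOT a climate model; it just shows
    # the numerical pattern a GCM applies on a vastly larger, 3-D grid.
    import numpy as np

    n_boxes = 50            # number of grid boxes (a real GCM has millions)
    dx = 1.0                # box width (arbitrary units)
    dt = 0.1                # time-step (kept small for numerical stability)
    diffusivity = 1.0       # illustrative physical constant

    # Initial state: a warm anomaly in the middle of the grid
    temperature = np.zeros(n_boxes)
    temperature[n_boxes // 2] = 10.0

    for step in range(1000):
        # Discretised diffusion equation: dT/dt = k * d2T/dx2
        laplacian = (np.roll(temperature, 1) - 2 * temperature
                     + np.roll(temperature, -1)) / dx**2
        temperature = temperature + dt * diffusivity * laplacian

    print("Peak anomaly after 1000 steps: %.3f" % temperature.max())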

IPCC model projections

As we have outlined in a previous blog, the Intergovernmental Panel on Climate Change (IPCC) developed different potential ‘emissions scenarios’ for greenhouse gases. These emissions scenarios were then used as inputs to the A-OGCMs. Combining the outputs of many different models allows the reliability of the models to be assessed. The IPCC used outputs from 23 different A-OGCMs, from 16 research groups, to reach its conclusions.

Can we trust climate models?

'All models are wrong, but some are useful' George E Box 

There are sources of uncertainty in climate models. Some processes in the climate system occur on such a small scale or are so complex that they simply cannot be reproduced in the models. In these instances modellers use a simplified version of the process or estimate the overall impact of the process on the system, a procedure called ‘parameterisation’. When parameters cannot be measured, they are calibrated or ‘tuned’, which means that the parameters are optimised to produce the best simulation of real data.

These processes inevitably introduce a degree of error - this can be assessed by sensitivity studies (i.e. systematically changing the model parameters to determine the effect of a specific parameter on model output).
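
As a hedged illustration of what such a sensitivity study looks like in practice, the sketch below sweeps one parameterised quantity of a toy one-box energy-balance model (the ocean mixed-layer depth) and records how the transient warming responds. The model structure and all numerical values are illustrative assumptions, not drawn from the article or from any published GCM.

    # Minimal sensitivity study: systematically vary one parameter of a toy
    # one-box energy-balance model and see how the output responds.
    # All numbers are illustrative assumptions.
    SECONDS_PER_YEAR = 3.15e7
    RHO_WATER = 1025.0       # kg/m^3
    CP_WATER = 3990.0        # J/(kg K)
    FEEDBACK = 1.25          # W/m^2 per K (illustrative net feedback parameter)

    def warming_after(years, mixed_layer_depth_m, forcing_rate=0.04):
        """Integrate dT/dt = (F(t) - FEEDBACK*T) / C with yearly steps.
        forcing_rate is W/m^2 per year (roughly a 1% CO2 ramp)."""
        heat_capacity = RHO_WATER * CP_WATER * mixed_layer_depth_m  # J/(m^2 K)
        temp = 0.0
        for year in range(years):
            forcing = forcing_rate * year
            temp += (forcing - FEEDBACK * temp) * SECONDS_PER_YEAR / heat_capacity
        return temp

    # Sweep the parameterised mixed-layer depth and compare the outputs
    for depth in (50, 70, 100, 150, 300):
        print("mixed layer %4d m -> warming after 70 yr: %.2f K"
              % (depth, warming_after(70, depth)))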

Other sources of potential error are less predictable or quantifiable, for example simply not knowing what the next scientific breakthrough will be, and how this will affect current models.

The IPCC AR4 report evaluated the climate models used for their projections, taking into account the limitations, errors and assumptions associated with the models, and found that:

“There is considerable confidence that AOGCMs provide credible quantitative estimates of future climate change, particularly at continental and larger scales.”

This confidence comes from the fact that the physical laws and observations that form the basis of climate models are well established, and have not been disproven, so we can be confident in the underlying science of climate models.

Additionally, the models developed and run by different research groups show essentially similar behaviour. Model inter-comparison allows robust features of the models to be identified and errors to be determined.

Models can successfully reproduce important, large-scale features of the present and recent climate, including temperature and rainfall patterns. However, it must be noted that parameter ‘tuning’ accounts for some of the skill of models in reproducing the current climate. Furthermore, models can reproduce the past climate. For example simulations of broad regional climate features of the Last Glacial Maximum (around 20,000 years ago) agree well with the data from palaeoclimate records.

Climate models have successfully forecast key climate features. For example, model projections of sea level rise and temperature produced in the IPCC Third Assessment Report (TAR - 2001) for 1990 – 2006 show good agreement with subsequent observations over that period.

So it is a question of whether the climate science community's understanding of the uncertainties is sufficient to justify confidence in model projections, and to justify basing policy on those projections. Whether we choose to accept or ignore model projections carries a risk either way. As Professor Peter Muller (University of Hawaii) put it in an email to the Carbon Brief:

“Not doing anything about the projected climate change runs the risk that we will experience a catastrophic climate change. Spending great efforts in avoiding global warming runs the risk that we will divert precious resources to avoid a climate change that perhaps would have never happened. People differ in their assessment of these risks, depending on their values, stakes, etc. To a large extent the discussion about global warming is about these different risk assessments rather than about the fairly broad consensus of the scientific community.”

It should be noted that limits, assumptions and errors are associated with any model, for example those routinely used in aircraft or building design, and we are happy to accept the risk that those models are wrong.

For more information about climate models:


Comments


Comments 1 to 50 out of 123:

  1. Is it worth reiterating at this point that you don't need a climate model if you just want to compute future temperature trends? All you need is an energy balance calculation or some kind of empirical model of it. Hansen points this out here, and numerous bloggers (including Tamino, Arthur Smith and Lucia) have reproduced the empirical calculation. The 20th century climate is enough to determine the parameters of the system (although not the climate sensitivity, which requires longer timescales but for that reason is irrelevant over the next few decades), allowing forecasts for the next few decades at least. The principal uncertainty according to Hansen is the value of the aerosol forcing. But even if this is wrong, the response function mops up the error to a large extent when tuning against the 20th century climate.
  2. Kevin, Much has been written in the blogosphere recently about aerosols, probably a result of Hansen's recent paper. http://www.columbia.edu/~jeh1/mailings/2011/20110415_EnergyImbalancePaper.pdf Hansen surmised that we have been underestimating the cooling effect of aerosols, and thereby overestimating the rate of heat uptake by the oceans, suggesting that the models need to be adjusted. His latest aerosol forcing is about half the GHG forcing, but with a rather large uncertainty. In fact, he states that many climate models only use the direct aerosol forcing, and ignore the indirect forcing which may be twice as much.
  3. It is also important to remember that models can underestimate problems as well as overstate the case. In the reference you cited they state in the abstract that sea level rise has been faster than estimated in 2000. The sea ice in the Arctic is also melting much faster than expected. Does anyone have similar examples where change is happening slower than expected?
  4. That model evolution figure is very cool. I hadn't seen that before.
  5. "A Climate Modelling Primer" by McGuffie and Henderson-Sellers is quite a good book on the subject. Goes through the historical development of climate models and covers some detail. Having said that, it is a bit over the top if you only want a basic understanding.
  6. This business about the aerosols, if what Hansen is saying is true and the cooling effect of aerosols is being underestimated, doesn't this prove that we can simply geo engineer our way out of the problem?
    Response:

    [DB] "...doesn't this prove that we can simply geo engineer our way out of the problem?"

    Umm, nottasomucha.  There is a logical disconnect in your thinking.  Discussing hypotheticals and theoreticals based on not-yet-published works is pretty off topic for this thread "Can We Trust Climate Models?"

  7. Mr Cadbury@6: Is conducting a massive global experiment to alter climate, on top of an existing global climate experiment (AGW), wise? Considering that humanity usually gets things badly wrong with any experiment with nature, I don't think adding another one is appropriate.
  8. BTW Mr Cadbury@6, the article is about climate models, so I suggest that before the moderator starts deleting comments, you get back on subject.
    Response:

    [DB] Agreed.  Jay is making an extrapolatory leap from 2+2=4 to calculus.  Do not pass go, doesn't parse, that dog doesn't hunt, etc.

  9. Kevin C, I used to hang out here frequently but the discussions are becoming less and less realistic. In your comment (#1 on this thread) you imply that future temperature trends can be predicted. If you can do this, please share your predictions. Can you predict temperature trends for the next few months? How about the next few years or the next few decades? Please submit your response in a graphical or spreadsheet format.
  10. GC - he is predicting temperature trends - which in climate means trends on a 30-year basis. You might have noticed that models do that job well.
  11. scaddenp: I don't know, I think I can make a fairly good prediction for the next few months. Right around here, it's going to get steadily cooler for about two months, then it's going to gradually stop cooling, and start warming up again. I rather suspect those living in the northern hemisphere (like our friend GC) will see the weather get a bit warmer over the next few months, then gradually start to cool. How do I know this? Because I have a model in my head about how the seasons work, based on a lot of personal experience, along with education about historical records that go back a very long time, and an understanding of the very large natural forcing factors that influence (regional) temperature on month-to-year timescales. Is that an accurate model? To some extent. Is it useful? Certainly! Even more so if you combine it with similar regional models of precipitation & sunshine. Farmers rely on such models every year when they plant their crops. Folks very much closer to the poles than I or GC might use it to tell them when to stock up on firewood, or check the furnace works, or similar such actions. I can't tell you what the temperature is going to be next Monday, though - you need a very different, far more sophisticated model for that. The weather bureau just happens to have one, though, and they're telling me it's going to be about the same max temperature as today, but with some showers around. Again, a useful model, with pretty good accuracy in the short term, and increasing uncertainty the further out you go. Kind of like the climate models, although on a different scale both temporally and spatially. gallopingcamel, have you read this post about Hansen's 1981 predictions? Looking at the 30 years of global temperature data prior to 1980, would you have made the same predictions that Hansen did back in 1980? I know I wouldn't have, without a lot of persuasion. Turns out his climate model was pretty much on the money, though. It's been more-or-less right for 30 years now, despite being orders of magnitude simpler than current climate models, and despite there being so much more discovered about how the climate works. So it's a useful model, certainly. (And the natural variability evident in the measured temperatures in that graph should educate you as to why asking for accurate predictions over any period less than 10-15 years is a fool's game)
  12. "Climate models have successfully forecast key climate features. For example, model projections of sea level rise and temperature produced in the IPCC Third Assessment Report (TAR - 2001) for 1990 – 2006 show good agreement with subsequent observations over that period." As far as I can see, the "good agreement" is only within a large natural variability, implying large error bars, meaning that they are only loosely constrained. What, then, is the meaning of: "There is considerable confidence that AOGCMs provide credible quantitative estimates of future climate change, particularly at continental and larger scales"? Where does this "considerable confidence" come from? What "credible quantitative estimates" can be made? It seems that they are "credible" only because the large error bars make it "credible" that reality will surely lie somewhere inside! For me a "good" model must reduce very significantly the uncertainty with respect to very crude estimates, for instance simple extrapolations of the past (which don't need any "model" actually). Only this can allow "non-trivial" predictions. I don't see yet where AOGCMs have performed better than these simple crude estimates.
  13. jarch: so you're saying that a complex, detailed climate model that predicts future climate trends that later observations closely match is not credible? I agree that we need a longer time period (30 years would be good, like the Hansen predictions I linked in my previous comment) to be really sure they're accurate, but if the best simulations of the climate agree closely with what actually happens over the following years, surely that's an indication that the simulations are at least a usefully good approximation of reality? In any event, did you actually look up the reference linked in that paragraph you quoted? It seems the agreement is pretty good, and the 'error' is much, much less than the range of natural year-to-year variability. The other point to consider, of course, is this: using our best understanding of all the factors that affect climate, scientists have constructed a model that closely matches what the earth's climate actually does. One of those factors (indeed, the dominant one lately) is the large & growing influence of human greenhouse gas emissions. Without greenhouse gases included, the model results are completely wrong. If you cannot demonstrate that the current understanding of natural climate forcings is completely wrong, then you have no valid argument.
  14. jarch Climate models ... Considerable confidence? About as much confidence as the rest of us have in our seasonal climate experience. Xmas Day for instance. Australians and others have visions of our Xmas Day spent playing cricket on the beach and Brits have similar idealised visions of a white Christmas. It's absolutely true that each is more likely in its own geographic area, but no sensible person does more than hope for those ideals. The Aussies might be stuck on a beach in a freezing wind. The Brits can look out over a miserable grey day with no sign of the picture postcard white blanket. These are perfectly natural variations within certain bounds. And climate models are much like our direct experience. Britons will never, ever have a calm, sunny 33C day for a Xmas lunch under a cloudless sky. Aussies will never, ever wake up on Xmas morning to a crisp white blanket of pristine snow over Sydney or Perth or Adelaide's suburban expanses. Climate models tell us what features are more likely in various places at various times. Most importantly, models, like our experience, tell us what is and is not surprising in particular places.
  15. Good article. I'm interested to know whether you are going to cover methods for downscaling in future articles? Global temperature trends are good to know but they don't tell us an awful lot about impacts in specific areas. Prediction of future rainfall trends, for example, is helpful in determining planning needs for water resources and flood protection. Such models exist in the UK and can predict rainfall trends at a resolution of 5km2. Predictions from the UK Climate Projections are used extensively in future UK planning at both national and regional levels. It is better to have an estimate with a degree of uncertainty than no estimate at all. That way at least you have some method for determining adaptation measures.
  16. While we're at it, I have a load of questions about GCMs which maybe someone can answer. Most of my detailed knowledge of GCMs comes from Science Of Doom's articles, which I may in turn have misunderstood. Here goes: 1. I understand from SoD that all but a handful (no more than 5) processes in GCMs are implemented directly from the underlying physics (with the only issues being fineness of sampling). The remaining 5 or so cannot be modelled on an appropriate scale and so have to be handled with empirical models. Is that correct? Does anyone know what these processes are? Can the empirical models be determined by fine-scale modelling of smaller systems? 2. My impression from SoD is that the parameters for the empirically determined processes are determined by fitting a stable pre-industrial climate and the forced 20th century climate - but only by fitting global observations such as global mean temperature or precipitation. Is that correct? If so, it would presumably be correct to regard any local behaviour as a true prediction of the model, which gives an independent (if hard to enumerate) indication of the validity of the model. 3. The 2011 Hansen draft paper linked by Eric@2 (thanks, I wrote a précis of it yesterday here) suggests that the rate of deep ocean mixing is wrong in GCMs. Is the deep ocean mixing modelled from the physics, or empirically? If empirically, then the error is already explained - the incorrect aerosol forcing. If physically, then some explanation is required of why the physical model is producing aphysical results. Has any been suggested? Thanks in advance for any pointers on these questions!
  17. Bern, Yes, I was aware of that 1981 Hansen prediction. I agree that it looks pretty good. So good in fact that one might be tempted to extrapolate it forwards to 2100 or backwards to 1930 or even 1850. However, when you extend the timescales, the wonderful correlation breaks down. Anyone who believes that CO2 is a major driver of global temperature is looking for hockey stick trends because that is what the CO2 concentration is doing. I tried to match the CO2 hockey stick to the Greenland temperatures shown in the attached graph which I prepared with the idea that temperature trends are magnified at high latitudes. Can you see the correlation? http://diggingintheclay.files.wordpress.com/2010/12/coastal-average.png?w=1024&h=621
  18. Kevin, I read both today's and yesterday's posts. I think the result of Hansen's paper is that there is still a lot that we do not understand. He offers several possible explanations, all of which will coincide with the observed data. However, that does not tell us which is correct, if any. My opinion is that many of the models underestimate several forcings, in addition to the aerosols. Ocean cycles are still being updated for model use, and may play a much more vital role than previously thought. Camel's link to Greenland temperatures may be largely due to the cyclic nature of the AMO. Phil Jones has co-authored a paper recently in Nature which shows the changes in SST in the north Atlantic during the past century. http://www.nature.com/nature/journal/v467/n7314/full/nature09394.html
  19. #9 Galloping Camel -- I think you misunderstand what Kevin C has stated in comment #1. If the only thing you are looking at is a global average temperature, then an incredibly simple 1-box model fed with a set of forcings data will have an output which is nearly identical to the most complex AOGCM. If you make a prediction for the annual net forcing for each year of the next couple of decades, then I will make a prediction of the global average temperature over that period that will be virtually indistinguishable from the AOGCM outputs. Indeed, each individual run of the typical AOGCM would probably have less correlation with the actual future temperature than my toy model, because of the internal variability of the AOGCMs.
  20. #9 Galloping Camel: Yes, Charlie understood me correctly. My only reservation is that a 1-box model is just a bit too simple, because the real system seems to require at least two characteristic periods to fit the 20th century behaviour. With only one, you have to fit the multidecadal response and lose the faster ones. This is most apparent in a failure to fit the response to volcanic forcings. (I haven't tested that however, it is my synthesis of several different bits of work by Tamino and the others.) I actually agree with Arthur Smith here that attaching a physical meaning to even the 2-box model is suspect. You may as well abandon the pretence of physicality and simply determine the response function by a parameterised deconvolution, where a parsimonious parameterisation is chosen to best fit the observed data. Two exponentials happen to work quite well, but it may be possible to do better. Conceptually, cross validation should tell us just how parsimonious to make the parameterisation, but looking at the curves my gut says that there is not enough data once you account for autocorrelation.
  21. "I tried to match the CO2 hockey stick to the Greenland temperatures shown in the attached graph which I prepared with the idea that temperature trends are magnified at high latitudes. Can you see the correlation?" So how about instead matching global temperature to total forcings? That is what the climate models are actually about. Claiming that modellers expect that climate is only based on CO2 is a straw man.
  22. Kevin C #20, OK, I accept that clarification. You were not claiming the ability to do something that has eluded everyone else. scaddenp @21, That is pretty much my view. There have been huge swings in global temperature over the last 50,000 years in spite of the fact that for most of that time CO2 concentrations were stable. To suggest that suddenly CO2 is a major factor makes no sense. Imagine that you have a magic wand that can eliminate all anthropogenic CO2 emissions overnight. Based on what GCMs can tell us, what would be the effect on global temperatures?
  23. "To suggest that suddenly CO2 is a major factor makes no sense." Then try reading some more. The planet has changed in the past because the forcings have changed in the past. Furthermore, our model for climate successfully predicts how much change will happen for a given change in forcing. Changes to CO2 in the past have always affected climate, but the CO2 changed as a feedback. You can't make Milankovitch forcings produce the scale of temperature change without the feedback from CO2. The problem with the idea that it is "just a natural change" is showing what natural forcing has changed that can explain the current climate. For climate with or without anthro forcings, see this figure. For the question as to what would happen if all anthro emissions stopped see Hare and Meinshausen 2006
  24. scaddenp @23, The figure you linked claimed ~0.75 degrees difference between the models with and without anthropogenic forcings over the last 100 years. Will similar trends extend over the next 100 years? I would wager $10,000 that it will not, but sadly I won't be around to collect my winnings. I could not get to the Hare et al. paper as it was behind a $34.95 pay wall. However, it does sound like something that addresses the right questions. You seem to be open minded so I want you to tune in to the History Channel at 9 p.m. Eastern Standard Time on Friday, May 27th. The program is called "Little Ice Age - Big Chill". This is global warming/cooling as seen by historians, archaeologists and geologists. Climate models have not done as well as historians when it comes to describing past climate changes. Let's continue this after you have watched the program.
    Response:

    [DB] An open-copy is available here.

  25. GC, the Little Ice Age has been discussed on this site already. If you care to read through that and some of the other articles linked, you'll find that climate models consider a wide range of natural forcings, in addition to human greenhouse emissions. It's only in the last century that GHG emissions have had a significant effect, and only in the last half century or so that they've come to dominate.
  26. Charlie #19: I just caught up with your post on the other thread here where you show your own simple model hindcast. You describe it as 'a simple linear + 1 lag model' - do I infer correctly from that that you are using two terms: one exponential lag and one which is a direct feed-through of the forcing (i.e. a delta-function response)? If so, that would answer my objection to the 1-box model, the additional linear term counting as the second box, for which the time constant is certainly very short and could probably just as well be a delta function. Kevin Postscripts: Charlie: Thank you for your persistence in engaging with my posts. We may be on different sides of the debate, but you consistently take the trouble to read and give interesting and useful pointers in response. I'm learning a lot. Moderators: I know some of my questions and explorations have been rather tangential to the articles concerned. I'm learning climate science as fast as I can and need to ask questions, and often an active article sparks a question. I'll try and take discussions to a relevant post in future, although my experience is that posting to an old article is not a good way to get discussion.
    Response:

    [DB] You will find moderation here at SkS provides an atmosphere conducive to learning.  Some off-topic dialogue is permitted where it is evident that individuals are trying to learn.  That being said, when the discussion endures, it is advisable to take the discussion to a more appropriate thread at some point.

    There are no dead or closed threads at SkS, only temporarily dormant ones.  Regular commenters follow the Recent Comments thread and will see anything posted there, regardless of the thread it is posted on.

  27. What is nice about the Hare paper is that they show the calculations and parameters, so that if you wish to substitute other numbers, you can plot your own values.
  28. DB, Thanks for that open-copy link. It addresses my question about the effect of abruptly ceasing emissions. Figure 1 on page 9 shows temperature rise accelerating if CO2 emissions cease, on the reasonable assumption that aerosol emissions would also fall. One thing that struck me is that the complexity of the hindcast is quite different from the forecasts. Clearly some tweaking went on in an effort to make the hindcast similar to instrumental readings. Even so, the hindcast does not fit convincingly even with the carefully chosen start date (1850). I hope you and Bern will take the trouble to watch the History Channel today at 9 p.m. It covers the LIA as you would expect from the title but it also looks back to the MWP. As I have said earlier on this thread the models only agree with the historians over a period of 150 years. If the models disagree with what historians tell us over longer periods of time, why would one have any confidence in their predictive power? Let's look at those predictions that use 2005 as the start date. Can we agree that the only curve that matters is #4 (Constant Emissions)? Reality may turn out to be a slight fall if there is a vast expansion of nuclear power or a weak global economy. Perhaps more likely, emissions may increase slightly owing to continuing rapid industrialization in densely populated countries such as China and India. Here are the predictions: Temperature rise 1850-2005 = 0.8 Kelvin; Temperature rise 2005-2025 = 0.4 Kelvin; Temperature rise 2005-2100 = 1.3 Kelvin. Not looking good so far. Since the "Hare" paper was written the "Tortoise" seems to be in charge. The pace of warming seems to be slowing rather than accelerating: http://www.drroyspencer.com/latest-global-temperatures/
  29. #26 Kevin C says "You describe it as 'a simple linear + 1 lag model' - do I infer correctly from that that you are using two terms: one exponential lag and one which is a direct feed-through of the forcing (i.e. a delta-function response)?" Here are some notes on how to make a simple spreadsheet that, given a set of forcings either projected or historical, will generate a global average temperature anomaly almost identical to the GISS-E AOGCM. I initially had several misunderstandings about the toy model, mostly because terms were used differently than in engineering fields. For example, a step change in forcings was called an impulse forcing. The points below hopefully make it easier to understand, or at least to avoid several of the misunderstandings I had.
      1. The model works with changes in forcings causing changes in the global temperature anomaly. The estimated global average temperature is then calculated as a running sum of the temperature changes.
      2. This works because the earth heats up in response to a step change in forcing, and the warmer earth radiates more energy, cancelling out the step in forcing. The time it takes for the temperature of the earth to respond to a step increase (or decrease) in forcing can be approximated by an exponential with a time constant in the 2 to 5 year range. Physically, this corresponds roughly to the time it takes to heat up the well-mixed layer of the ocean (i.e. the layer above the thermocline, typically considered to be an equivalent depth of 60 or 70 metres).
      3. The ultra-simple model is just a single multiplication: an X watt/m^2 step increase in forcing will cause a step increase of (lambda * X) degrees. delta T = lambda * delta F.
      4. The 1-box model is the same as above, except that the step change in forcing results in the same temperature delta, but over a period of a few years rather than instantaneously. The step change in forcing is an impulse in the derivative. The impulse response is simply exp(-t/tau). The response is delta T, not T. Specific example: if the e-folding time (time constant) is assumed to be 2.6 years, then for each year after the step increase in forcing, the increase in delta T will be about 68% of the increase from the previous year. Exponentially decaying series never go to zero, but including just approximations of the first 6 terms (and a scaling of volcano forcing) resulted in a toy model that emulated the GISS-E model with an error of only about 0.027C rms. An exponential decay with a 2.6 year time constant starts off as 1, 0.68, 0.46, 0.32, 0.21, 0.15. If I truncate the series after those 6 terms and then normalise the sum, the rounded-off coefficients are 0.36, 0.24, 0.16, 0.11, 0.08, 0.05.
      5. This simplified version can easily be hardcoded into a spreadsheet. A. Download the GISS-E forcings for 1880-2003, and then add 10 or 20 rows above for years prior to 1880 and fill the forcings with zeros. B. Set up a column, delta F, which is nothing more than the forcing change from the previous year. C. Set up a cell called lambda. D. The delta T for each year is simply lambda * (0.36 * delta F + 0.24 * delta F of the prior year + 0.16 * delta F for the year before that ... on through the 6 terms). E. Now you have delta T for each year. F. The estimated temperature is the running sum of delta T. That's it. Nothing more is needed to emulate GISS-E model anomaly temps to within about 0.5 C rms error. Before calculating the error, be sure to baseline the two anomaly temperature series, which have about a 0.12C offset.
      6. The plot of this simple model, though, shows excessive response to volcanic events, which indicates that the GISS-E model is less sensitive to the stratospheric aerosol forcings of volcanoes than to other forcings. A 30% reduction in the volcano forcings resulted in a better fit.
      7. A more elegant method of implementation in a spreadsheet would be to set up a coefficient array and then do a dot product array multiplication to get the estimated delta T. Note that, if the forcings are in a vertical column, latest year at the bottom, then for the array multiplication to work properly the coefficients have to decrease going from right to left. I'll put together that spreadsheet today or tomorrow, and hopefully find a place to post it.
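
    For readers who prefer code to spreadsheets, here is a minimal sketch of the recipe in #29. It derives the six rounded coefficients from the 2.6-year e-folding time and convolves them with year-on-year forcing changes; the lambda value and the example forcing series are placeholders you would replace with real numbers (e.g. the GISS forcings mentioned above).

    # Sketch of the toy model described in comment #29: a truncated exponential
    # response (tau = 2.6 yr, 6 terms) applied to year-on-year changes in
    # forcing, then summed to give a temperature anomaly.
    # LAMBDA and the example forcing ramp are placeholders, not fitted values.
    import numpy as np

    TAU = 2.6                     # e-folding time in years (from #29)
    N_TERMS = 6                   # truncate the exponential after 6 terms
    LAMBDA = 0.67                 # K per W/m^2 -- placeholder sensitivity

    # Coefficients: exp(-t/TAU) for t = 0..5, normalised to sum to 1.
    weights = np.exp(-np.arange(N_TERMS) / TAU)
    weights /= weights.sum()
    # Roughly [0.35 0.24 0.16 0.11 0.08 0.05], matching #29 to within rounding.
    print(np.round(weights, 2))

    def toy_anomaly(forcings):
        """forcings: 1-D array of net forcing (W/m^2), one value per year.
        Returns the modelled temperature anomaly for each year."""
        delta_f = np.diff(forcings, prepend=forcings[0])  # year-on-year change
        delta_t = LAMBDA * np.convolve(delta_f, weights)[:len(forcings)]
        return np.cumsum(delta_t)                         # running sum of delta T

    # Example with a made-up forcing ramp (replace with real forcing data):
    example_forcing = np.linspace(0.0, 2.0, 120)
    print(toy_anomaly(example_forcing)[-1])
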
  30. GC "Will similar trends extend over the next 100 years? I would wager $10,000 that it will not but sadly I won't be around to collect my winnings." Well, I would bet that temperatures will follow the total forcings, whatever they actually are, and base that bet on established science. You would be basing your bet on what? Hope? Good luck? Pity - my retirement savings could do with a boost. As to the History Channel - well, firstly I am in NZ and don't have access to such a channel, and frankly I prefer to get my science from published papers, or even my colleague across the passage whose speciality this is, rather than the biases of some TV director. And what on earth is the relevance? The science so far agrees the LIA was indeed a global event (though far less pronounced in the SH than the NH), and the response of climate to the forcings of the time. Are you implying the same forcings have suddenly returned (the deep solar minimum) and somehow overwhelm all the other forcings?
  31. "Clearly some tweaking went on in an effort to make the hindcast similar to instrumental readings." Can you substantiate that please? Hindcasts do have a problem in that proxies have to be used to estimate forcings, but this is a fit to proxy, not to temperature, on the whole. As to "not looking good so far" - we had the hottest year on record in GISS despite the deepest solar minimum since satellite measurements began, but agreed it's not accelerating. Would you expect it to, when you compare the change in CO2 forcing with the change in solar forcing over the same period? Do you seriously expect that temperatures are going to decline as the solar cycle revives, or are you expecting the solar minimum to last till 2100?
  32. Scaddenp, I agree that the climate responds to the forcings of the time. The LIA in the SH was probably mitigated by the vast stretch of ocean. Going forward, the climate will also respond to the strongest forcings. If that becomes CO2, then I would expect to see substantial warming. If we observe a strong solar minimum, then I would expect that to dominate. The fact that the models reflect the past has little to do with the accuracy of the models, but is a reflection of past temperature inputs into the models. After all, who would design a model which did not accurately reflect the past? The true test of the models will come in future temperature projections and observations. Unfortunately, this may take several years (decades). I also agree that temperatures are not accelerating. However, I would add that recently temperatures are neither increasing nor decreasing; 2010 being sandwiched between a (relatively) cold 2009 and start of 2011. Is this the start of a long-term decrease as GC maintains, or simply a blip in a long-term increase? Stay tuned, and watch nature, not TV.
  33. #29 Thanks for the description Charlie. I've actually just coded it up in python, 'cos that's what I know. Here's what I get using a two-box model with unmodified forcings. I optimise 5 parameters: the equilibrium temperature, the scales and the periods of the two exponentials. That looks pretty much identical to Hansen's "Green's function" version in his figure 8b - the difference being that his response function is determined from running a step function in ModelE, and mine is from fitting the 20th century climate with a 2-box model. The formula of the response function is as follows: R(t) = 0.0434*exp(-t/1.493) + 0.0215*exp(-t/24.8) (where t is in years). The temperature as a function of year is then given by the equation: T(t) = Sum_s ( F(t-s) R(t) ) + c where F(t) is the forcing and c is -0.0764641223, which is a constant which fits the equilibrium temperature. The next step is to see if there is enough data to do a realistic cross-validation, and if so to play with different parameterisations of the response function. I'm not sure whether to include an ENSO term like Arthur did (which gives a much better fit at the cost of adding another parameter - that's bad given that overfitting is a concern), or work with an ENSO-removed temperature series. What was the point of all this? It demonstrates my point1 in #1 that if you just want global temperatures, you don't need a complex climate model. You just need the forcings. The 20th century climate provides enough of a constraint on the system behaviour that you can then predict the next few decades with a single equation. So, one answer to the question in the title of this article, "Can we trust climate models?" is "It doesn't matter". We can deduce what will happen to global temperatures over the next few decades for any given set of forcings empirically by looking at the 20th century. Of course if you want to go beyond a few decades, or if you want to know about anything other than temperature, or if you want to know what will happen at a regional or smaller scale, or if the behaviour of the system changes drastically, then you need a climate model. 1 Well, not really my point. Hansen and Held and Tamino and Arthur and Lucia and probably others did it before me.
  34. Sigh. I got the most important equation wrong. It should of course read: T(t) = Sum_s ( F(t-s) R(s) ) + c
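
    A short sketch of the corrected formula from #33/#34: the fitted two-exponential response function R(s) convolved with the forcing history, plus the constant c. The parameter values are those quoted in #33; the forcing series used here is a placeholder (in #50 the GISS NetF.txt file is used instead).

    # Two-box response function from #33, applied with the corrected
    # convolution from #34: T(t) = sum over s of F(t - s) * R(s) + c.
    import numpy as np

    def response(s_years):
        """Fitted response function R(s) from #33 (K per W/m^2 per year)."""
        return 0.0434 * np.exp(-s_years / 1.493) + 0.0215 * np.exp(-s_years / 24.8)

    C_OFFSET = -0.0764641223   # constant fitted in #33

    def temperature(forcings):
        """forcings: net forcing (W/m^2) per year. Returns modelled anomaly (K)."""
        n = len(forcings)
        r = response(np.arange(n, dtype=float))
        temps = np.empty(n)
        for t in range(n):
            # F(t), F(t-1), ..., F(0) dotted with R(0), R(1), ..., R(t)
            temps[t] = np.dot(forcings[t::-1], r[:t + 1]) + C_OFFSET
        return temps

    # Placeholder forcing ramp (replace with the real GISS forcing series):
    forcing = np.linspace(0.0, 1.9, 124)   # e.g. 1880-2003
    print(temperature(forcing)[-1])
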
  35. Aerosol distribution may also have been a factor in the SH. Model design is primarily about accurately representing physics, but this is a complex task. That the models work for paleoclimate validation is at least a sign that they are not radically wrong. So indeed the future will tell - but how much future prediction do you think it needs? The incredibly primitive model that was the basis of Wally Broecker's 1975 prediction still allowed him to predict 2010 temperature pretty well. Okay, CO2 isn't as high as he thought it would be and Manabe's model had sensitivity too low, but note that this prediction was made before GISS existed, before any millennial paleoclimate temperature record was around. In short, pure physics. In 10 years' time, after another solar cycle and the Argo temperatures to test against, will people still be saying that models are unproven and let's see if they can predict the future?
  36. Scaddenp @35, I just had a quick read of Broecker's 1975 paper. Absolutely incredible how well his projections are working out, not only for CO2 but the global SAT as well. But I should not be surprised, as you say, pure physics, and that solid foundation has been understood for a very long time. I'd like to see a comparison between Broecker, Hansen and Lindzen. Broecker's seminal work really does need highlighting more. And look at the 3 C warming for doubling CO2 that is shown in one of Broecker's Tables.....amazing. Did Manabe's model runs from 1991/1992 produce an estimate of global SAT? I can't recall seeing that in the paper, but I have not looked at it in a while.
  37. Manabe's model is a marvel of what could be done at the time but, man, it was primitive. It's worth thinking back to what else was going on at the time. The first ice core was being drilled. d18O thermometry on benthic forams was really setting Milankovitch in concrete. Four years later I would be doing my first finite element modelling on rock deformation - a card stack at 2am on a Burroughs mainframe. The substantial lesson, I think, is that the basics of climate aren't that complicated. It puts the lie to the idea that climate modelling is somehow curve fitting.
  38. Albatross @36, I'm either looking at the wrong article or I simply cannot find that table. Broecker's "Climatic Change: Are we on the brink of pronounced global warming?" (1975) discusses work by Manabe and Wetherald, and by Rasool and Schneider, and concludes that climate sensitivity for doubling CO2 lies between 2 and 4 degrees, but he employs a value of just under 2.4 degrees per doubling (0.32 per 10%, compared to the 0.3 per 10% he used). Using this value, and making no allowance for aerosols or (so far as I can see) thermal inertia, he calculates a temperature increase relative to approximately 1850 of 1.1 degrees C. HadCRUT3v gives 0.9, which is very close considering the limitations of his methods. As the prediction was made while global temperatures were falling, it puts the lie to one of Happer's claims, but that is the subject of another thread.
  39. The Manabe and Wetherald paper is here
  40. Tom @38, I may be reading/interpreting this incorrectly, but the footnote to Broecker's Table 1 says: "Assumes a 0.3 C global temperature increase for each 10% rise in the atmospheric CO2 content". Wow, scientists ahead of their time.
  41. Albatross @40, I assumed that it is a compounding 10%, not a fixed value. On that basis, 1.1^7.28 =~= 2, so the climate sensitivity he uses is 7.28*0.3 =~= 2.2 degrees C per doubling, which is quite close to the 2.4 figure he discusses in the last complete paragraph of the first column on page 461.
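
    A quick check of the arithmetic in #41, as a small sketch: the number of compounding 10% steps in a doubling is ln 2 / ln 1.1, about 7.3, so 0.3 C per step gives roughly 2.2 C per doubling.

    # Arithmetic behind comment #41: how many compounding 10% CO2 increases
    # make a doubling, and what sensitivity that implies at 0.3 C per step.
    import math

    steps_per_doubling = math.log(2) / math.log(1.1)
    print("10%% steps per doubling: %.2f" % steps_per_doubling)      # ~7.27
    print("implied sensitivity: %.2f C per doubling"
          % (steps_per_doubling * 0.3))                              # ~2.2 C
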
  42. I'm still stuck on the Hansen 2010 thing about pulling all the CO2 out of the GISS model and having GMAT drop 6 degrees C in ONE YEAR, and dropping to snowball earth level in a decade. I suspect we all agree that the glaciers to support this could not possibly form this fast. And the thermal inertia of the oceans? If CO2 is only in the models as radiative forcing, how is it that its removal (unforcing?) is 7 times more powerful than its forcing?
  43. trunkmonkey @42, based on calculations by Schmidt et al 2010, removing all of the CO2 from the Earth's atmosphere would reduce radiative forcing by 31 Watts/m^2, globally averaged. That represents a loss of energy of the order of 5*10^23 Joules per annum, ignoring feedbacks. For comparison, according to the NODC the top 700 meters of the world's oceans have gained around 15*10^22 Joules over the last 55 years, or less than a third of that which would be lost in a year with the complete removal of CO2. At that rate it would take just 12 years to lower a volume of water equal to the top 300 meters of the ocean's surface from 14 degrees to 0 degrees. Of course, with feedbacks, the heat would escape at a faster rate initially, but then at a reducing rate while the planet cools. Perhaps you are being confused because you are not taking into account the logarithmic decline of forcing with increasing CO2 concentration. To obtain a reversed forcing of similar magnitude by increasing CO2 concentrations, we would need to instantly increase CO2 levels to around 90 thousand ppm. I assure you that if you modelled that scenario, temperature increases would be suitably rapid to satisfy you. Alternatively, if you merely halved the CO2 content instantly, cooling would closely match the rates of warming obtained for doubling CO2 levels.
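
    The orders of magnitude in #43 are easy to check. Here is a small sketch, using round-number constants that are my own assumptions rather than Schmidt et al's figures, reproducing the roughly 5*10^23 J per year loss and the roughly 12-year cooling estimate.

    # Back-of-envelope check of the numbers in comment #43.
    EARTH_AREA = 5.1e14          # m^2
    SECONDS_PER_YEAR = 3.15e7
    OCEAN_FRACTION = 0.71
    RHO_WATER = 1025.0           # kg/m^3
    CP_WATER = 3990.0            # J/(kg K)

    forcing_loss = 31.0          # W/m^2 lost if all CO2 were removed
    energy_per_year = forcing_loss * EARTH_AREA * SECONDS_PER_YEAR
    print("Energy lost per year: %.1e J" % energy_per_year)          # ~5e23 J

    # Heat needed to cool the top 300 m of ocean from 14 C to 0 C:
    mass_top_300m = EARTH_AREA * OCEAN_FRACTION * 300.0 * RHO_WATER  # kg
    heat_to_remove = mass_top_300m * CP_WATER * 14.0
    print("Years to cool top 300 m by 14 C: %.0f"
          % (heat_to_remove / energy_per_year))                      # ~12 years
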
  44. scaddenp @30 & 31, You disappoint me. Writing off the TV presentation without even bothering to watch it. You ask "What is the relevance?". If you had watched the program you would understand the relevance; the end of the MWP came suddenly and the recovery from the LIA came rapidly too. Climatologists ( -Snip- ) are still trying to identify the smoking gun or guns. A major reason for doubting the predictions based on GCMs is their inability to model these abrupt climate change events that occurred during historic times. At least there is widespread agreement that the coldest period of the LIA occurred during the Maunder minimum leading to the hypothesis that solar activity might be a factor. ( -SNIP- ) Incidentally, you could not classify that History Channel program as "Denialist". It mentions some of the major hypotheses advanced by climatologists without getting judgmental.
    Response:

    [DB] I watched the program (I have it on disc as well).  There is ongoing discussion over the THC, but most agree (as stated in the program) that the GW currently underway is sufficient to overwhelm the cooling forcings now in play.  Implications of dishonesty and link to website of illicit activity snipped.

  45. gallopingcamel @44: From wikipedia: "According to the Los Angeles Times, The Pirate Bay is "one of the world's largest facilitators of illegal downloading" and "the most visible member of a burgeoning international anti-copyright or pro-piracy movement"." So you wish to use Skeptical Science to incite people to illegal activity? I highly recommend, based on that, that your post be deleted and serious consideration be given to revocation of your posting rights.
    Response:

    [DB] I have snipped the relevant portions from GC's comment.  He may not have been aware of the status of the linked website.

  46. Albatross - I have to agree with Tom Curtis; it's a compounding 10%. That came directly from his statements of a logarithmic effect of increasing CO2, and his CO2 doubling sensitivity is stated at 2.4 C. But definitely - it's a fantastic paper, especially for the time.
  47. Tom @41 and KR @46, I thought I was missing something--that is what happens when you speed read and/or try to do too many things at once. Thanks for pointing that out and correcting me.
  48. @Kevin C, #33. Am I correct in assuming that a step forcing of 1 watt into your model (without the extra offset term) results in a final equilibrium temp increase of 0.6329 degrees? With the first few years being 0.0649, 0.1078, 0.1390, 0.1638, 0.1851, 0.2042, 0.2219, and 0.2385? I implemented the formula you posted in a spreadsheet, and came up with similar, but slightly different results. I overlaid the two results in the graph below. Blue is your plot, the red line is mine. Note how my plot rounds off the corners a bit more, particularly in the earliest part of the plot. My plot is without an offset added. You said "c is -0.0764641223, which is a constant which fits the equilibrium temperature." I assume that you really meant to say that c was to match the mean of the GISS observed anomaly temp series and the mean of your model anomalies. Correct?
  49. "Writing off the TV presentation without even bothering to watch it." I don't say that TV can't get it right, but mostly it doesn't, so I don't bother. It did if it is the source of these statements: "the end of the MWP came suddenly and the recovery from the LIA came rapidly too." The MCA varied in extent and timing around the world. See the figure here and here for the SH, but especially the Mann et al 2009 paper. "Climatologists are still trying to identify the smoking gun or guns." "A major reason for doubting the predictions based on GCMs is their inability to model these abrupt climate change events that occurred during historic times." If you believe these, then can I suggest you read the Paleoclimate chapter of AR4 and the papers that would say otherwise. Particularly note this figure showing many model reconstructions of those periods. "Incidentally, you could not classify that History Channel program as "Denialist"." Wouldn't have a clue, but I would doubt it. However, the director will be trying to make a program that people pay money to watch, and I doubt very much his skills at surveying science compared to the IPCC panel.
  50. I confirm the first 8 numbers of the step response match yours. (I actually rounded before posting, so I went back and redid my calc with the rounded values. I get an indistinguishable plot to my original.) Your understanding of c is correct (I didn't match the means, I just threw this into the minimizer as another refineable parameter). For the total temp rise on step forcing I get 0.62925 after 123 years - I didn't go to convergence, I'm guessing you did? (Not that I trust the long tail of the response function.) Could we be using different forcing data? I picked up the NetF.txt file from http://data.giss.nasa.gov/modelforce/. Here's a few sample values: 1880 .0000, 1900 -.0569, 1920 -.0652, 1940 .2839, 1960 .3988, 1980 1.0099, 2000 1.8661. Otherwise, I confess I'm a bit baffled here, but I'll carry on looking at it. I'm getting interesting (non-)results on the cross validation front too. Results are only stable if the data covers at least 1920-2000 or 1900-1990. Truncating before 1990 means you lose the short response completely. I expected to need a volcano to get the short response, but I'm surprised Agung doesn't do it. Fitting exponentials is always tricky, and I'm using a simplex optimizer, which might be the issue.
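
    For anyone wanting to reproduce the step-response comparison in #48 and #50, here is a minimal sketch that feeds a sustained 1 W/m^2 forcing through the response function from #33; it gives roughly 0.065, 0.108, 0.139, ... for the first years and levels off near 0.63 K.

    # Step response of the two-box model from #33, to compare against the
    # values quoted in #48 and #50 (0.0649, 0.1078, 0.1390, ..., ~0.633 K).
    import numpy as np

    def response(s_years):
        return 0.0434 * np.exp(-s_years / 1.493) + 0.0215 * np.exp(-s_years / 24.8)

    r = response(np.arange(200, dtype=float))   # 200 years is effectively converged
    step_response = np.cumsum(r)                # running sum = response to a unit step

    print(np.round(step_response[:8], 4))       # first 8 years
    print("equilibrium: %.4f K per W/m^2" % step_response[-1])
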
