

Contrary to Contrarian Claims, IPCC Temperature Projections Have Been Exceptionally Accurate

Posted on 27 December 2012 by dana1981

There is a new myth circulating in the climate contrarian blogosphere and mainstream media that a figure presented in the "leaked" draft Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report shows that the planet has warmed less than previous IPCC report climate model simulations predicted.  Tamino at the Open Mind blog and Skeptical Science's own Alex C have done a nice job refuting this myth.  We prefer not to post material from the draft unpublished IPCC report, so refer to those links if you would like to see the figure in question.

In this post we will evaluate this contrarian claim by comparing the global surface temperature projections from each of the first four IPCC reports to the subsequent observed temperature changes.  We will see what the peer-reviewed scientific literature has to say on the subject, and show that not only have the IPCC surface temperature projections been remarkably accurate, but they have also performed much better than predictions made by climate contrarians (Figure 1).

Predictions Comparison

Figure 1: IPCC temperature projections (red, pink, orange, green) and contrarian projections (blue and purple) vs. observed surface temperature changes (average of NASA GISS, NOAA NCDC, and HadCRUT4; black and red) for 1990 through 2012.


The IPCC  First Assessment Report (FAR) was published in 1990.  The FAR used simple global climate models to estimate changes in the global mean surface air temperature under various CO2 emissions scenarios.  Details about the climate models used by the IPCC are provided in Chapter 6.6 of the report.

The IPCC FAR ran simulations using various emissions scenarios and climate models. The emissions scenarios included business as usual (BAU) and three other scenarios (B, C, D) in which global human greenhouse gas emissions began slowing in the year 2000.  The FAR's projected BAU greenhouse gas (GHG) radiative forcing (global heat imbalance) in 2010 was approximately 3.5 Watts per square meter (W/m2).  In the B, C, D scenarios, the  projected 2011 forcing was nearly 3 W/m2.  The actual GHG radiative forcing in 2011 was approximately 2.8 W/m2, so to this point, we're actually closer to the IPCC FAR's lower emissions scenarios.
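The forcing figures above can be loosely sanity-checked with the standard simplified expression for CO2 radiative forcing, ΔF = 5.35 ln(C/C0) (Myhre et al. 1998). A minimal sketch; the 390 ppm and 280 ppm concentrations are illustrative round numbers, and the result covers CO2 only, which is why it falls well below the roughly 2.8 W/m2 all-GHG figure quoted above:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0, alpha=5.35):
    """Simplified CO2-only radiative forcing in W/m2 (Myhre et al. 1998).

    Total GHG forcing also includes methane, N2O, and halocarbons, so this
    understates the ~2.8 W/m2 all-greenhouse-gas figure for 2011.
    """
    return alpha * math.log(c_ppm / c0_ppm)

# Roughly 390 ppm CO2 in 2011 vs. an illustrative 280 ppm preindustrial baseline:
print(round(co2_forcing(390.0), 2))  # -> 1.77 W/m2 from CO2 alone
```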

As shown in Figure 2, the IPCC FAR ran simulations using models with climate sensitivities (the total amount of global surface warming in response to a doubling of atmospheric CO2, including amplifying and dampening feedbacks) corresponding to 1.5°C (low), 2.5°C (best), and 4.5°C (high).  However, because climate scientists at the time believed a doubling of atmospheric CO2 would cause a larger global heat imbalance than today's estimates, the actual climate sensitivities were approximately 18% lower (for example, the 'Best' model sensitivity was actually closer to 2.1°C for doubled CO2).

FAR temp projections

Figure 2: IPCC FAR projected global warming in the BAU emissions scenario using climate models with equilibrium climate sensitivities of 1.3°C (low), 2.1°C (best), and 3.8°C (high) for doubled atmospheric CO2
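The adjusted 1.3°C / 2.1°C / 3.8°C values in this caption can be roughly reproduced by rescaling the FAR's stated sensitivities by the ratio of the modern CO2 forcing coefficient to the FAR-era one. A sketch, assuming the FAR used the older expression ΔF = 6.3 ln(C/C0), which Myhre et al. (1998) later revised to 5.35 ln(C/C0); the 6.3 coefficient is an assumption based on the pre-1998 convention:

```python
# FAR-era vs. modern coefficients in the CO2 forcing expression dF = k*ln(C/C0).
OLD_COEFF, NEW_COEFF = 6.3, 5.35

def effective_ecs(stated_ecs):
    """Rescale a stated climate sensitivity by the forcing-coefficient ratio."""
    return stated_ecs * NEW_COEFF / OLD_COEFF

for stated in (1.5, 2.5, 4.5):
    print(stated, "->", round(effective_ecs(stated), 1))
# Prints 1.5 -> 1.3, 2.5 -> 2.1, 4.5 -> 3.8, matching the values in Figure 2.
```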

Figure 3 accounts for the lower observed GHG emissions than in the IPCC BAU projection, and compares its 'Best' adjusted projection with the observed global surface warming since 1990.

FAR vs Obs

Figure 3: IPCC FAR BAU global surface temperature projection adjusted to reflect observed GHG radiative forcings 1990-2011 (blue) vs. observed surface temperature changes (average of NASA GISS, NOAA NCDC, and HadCRUT4; red) for 1990 through 2012.

FAR Scorecard

The IPCC FAR 'Best' BAU projected rate of warming from 1990 to 2012 was 0.25°C per decade.  However, that was based on a scenario with higher emissions than actually occurred.  When accounting for actual GHG emissions, the IPCC average 'Best' model projection of 0.2°C per decade is within the uncertainty range of the observed rate of warming (0.15 ± 0.08°C per decade) since 1990, though a bit higher than the central estimate.


The IPCC Second Assessment Report (SAR) was published in 1995, and improved on the FAR by including an estimate of the cooling effects of aerosols — particulates which block sunlight and thus have a net cooling effect on global temperatures.  The SAR included various human emissions scenarios; so far its scenarios IS92a and b have been closest to actual emissions.

The SAR also maintained the "best estimate" equilibrium climate sensitivity used in the FAR of 2.5°C for a doubling of atmospheric CO2.  However, as in the FAR, because climate scientists at the time believed a doubling of atmospheric CO2 would cause a larger global heat imbalance than current estimates, the actual  "best estimate" model sensitivity was closer to 2.1°C for doubled CO2.

Using global climate models and the various IS92 emissions scenarios, the SAR projected the future average global surface temperature change to 2100 (Figure 4).

IPCC SAR Projections

Figure 4: Projected global mean surface temperature changes from 1990 to 2100 for the full set of IS92 emission scenarios. A climate sensitivity of 2.12°C is assumed.

Figure 5 compares the IPCC SAR global surface warming projection for the most accurate emissions scenario (IS92a) to the observed surface warming from 1990 to 2012.

SAR vs obs

Figure 5: IPCC SAR Scenario IS92a global surface temperature projection (blue) vs. observed surface temperature changes (average of NASA GISS, NOAA NCDC, and HadCRUT4; red) for 1990 through 2012.

SAR Scorecard

The IPCC SAR IS92a projected rate of warming from 1990 to 2012 was 0.14°C per decade.  This is within the uncertainty range of the observed rate of warming (0.15 ± 0.08°C per decade) since 1990, and very close to the central estimate.


The IPCC Third Assessment Report (TAR) was published in 2001, and included more complex global climate models and more overall model simulations than in the previous IPCC reports.  The IS92 emissions scenarios used in the SAR were replaced by the IPCC Special Report on Emission Scenarios (SRES), which considered various possible future human development storylines.

The IPCC model projections of future warming based on the various SRES scenarios and human influences only (both GHG warming and aerosol cooling, but no natural influences) are shown in Figure 6.

IPCC TAR projections

Figure 6: Historical human-caused global mean temperature change and future changes for the six illustrative SRES scenarios using a simple climate model. Also for comparison, following the same method, results are shown for IS92a. The dark blue shading represents the envelope of the full set of 35 SRES scenarios using the simple model ensemble mean results. The bars show the range of simple model results in 2100.

Thus far we are on track with the SRES A2 emissions path.  Figure 7 compares the IPCC TAR projections under Scenario A2 with the observed global surface temperature change from 1990 through 2012.

TAR vs. obs

Figure 7: IPCC TAR model projection for emissions Scenario A2 (blue) vs. observed surface temperature changes (average of NASA GISS, NOAA NCDC, and HadCRUT4; red) for 1990 through 2012.

TAR Scorecard

The IPCC TAR Scenario A2 projected rate of warming from 1990 to 2012 was 0.16°C per decade.  This is within the uncertainty range of the observed rate of warming (0.15 ± 0.08°C per decade) since 1990, and very close to the central estimate.

2007 IPCC AR4

In 2007, the IPCC published its Fourth Assessment Report (AR4).  In the Working Group I (the physical basis) report, Chapter 8 was devoted to climate models and their evaluation.  Section 8.2 discusses the advances in modeling between the TAR and AR4.  Essentially, the models became more complex and incorporated more climate influences.

As in the TAR, AR4 used the SRES to project future warming under various possible GHG emissions scenarios.  Figure 8 shows the projected change in global average surface temperature for the various SRES.

AR4 projections

Figure 8: Solid lines are multi-model global averages of surface warming (relative to 1980–1999) for the SRES scenarios A2, A1B, and B1, shown as continuations of the 20th century simulations. Shading denotes the ±1 standard deviation range of individual model annual averages. The orange line is for the experiment where concentrations were held constant at year 2000 values. The grey bars at right indicate the best estimate (solid line within each bar) and the likely range assessed for the six SRES marker scenarios.

We can therefore again compare the Scenario A2 multi-model global surface warming projections to the observed warming, in this case since 2000, when the AR4 model simulations began (Figure 9).

AR4 vs. obs

Figure 9: IPCC AR4 multi-model projection for emissions Scenario A2 (blue) vs. observed surface temperature changes (average of NASA GISS, NOAA NCDC, and HadCRUT4; red) for 2000 through 2012.

AR4 Scorecard

The IPCC AR4 Scenario A2 projected rate of warming from 2000 to 2012 was 0.18°C per decade.  This is within the uncertainty range of the observed rate of warming (0.06 ± 0.16°C per decade) since 2000, though the observed warming has likely been lower than the AR4 projection.

As we will show below, this is due to the preponderance of natural temperature influences being in the cooling direction since 2000, while the AR4 projection is consistent with the underlying human-caused warming trend.

IPCC Projections vs. Observed Warming Rates

Tamino at the Open Mind blog has also compared the rates of warming projected by the FAR, SAR, and TAR (estimated by linear regression) to the observed rate of warming in each global surface temperature dataset.  The results are shown in Figure 10.

tamino IPCC vs obs

Figure 10: IPCC FAR (yellow) SAR (blue), and TAR (red) projected rates of warming vs. observations (black) from 1990 through 2012.

As this figure shows, even without accounting for the actual GHG emissions since 1990, the warming projections are consistent with the observations, within the margin of uncertainty.
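All of the per-decade rates quoted in this post are ordinary least-squares slopes fit to annual temperature anomalies. A minimal sketch of that calculation; the anomaly series below is a hypothetical, noise-free 0.15°C-per-decade series, not a real dataset:

```python
import numpy as np

years = np.arange(1990, 2013)
anomalies = 0.015 * (years - 1990) + 0.25  # hypothetical linear anomaly series

# Ordinary least-squares slope in deg C per year, converted to per decade.
slope_per_year = np.polyfit(years, anomalies, 1)[0]
print(round(slope_per_year * 10, 2))  # -> 0.15 (deg C per decade)
```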

Rahmstorf et al. (2012) Verify TAR and AR4 Accuracy

A paper published in Environmental Research Letters by Rahmstorf, Foster, and Cazenave (2012) applied the methodology of Foster and Rahmstorf (2011), using the statistical technique of multiple regression to filter out the influences of the El Niño Southern Oscillation (ENSO) and solar and volcanic activity from the global surface temperature data to evaluate the underlying long-term primarily human-caused trend.  Figure 11 compares their results with and without the short-term noise from natural temperature influences (pink and red, respectively) to the IPCC TAR (blue) and AR4 (green) projections.

RFC12 Fig 1

Figure 11: Observed annual global temperature, unadjusted (pink) and adjusted for short-term variations due to solar variability, volcanoes, and ENSO (red) as in Foster and Rahmstorf (2011).  12-month running averages are shown as well as linear trend lines, and compared to the scenarios of the IPCC (blue range and lines from the 2001 report, green from the 2007 report).  Projections are aligned in the graph so that they start (in 1990 and 2000, respectively) on the linear trend line of the (adjusted) observational data.
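The core of the Foster and Rahmstorf approach can be sketched as a multiple linear regression: fit the temperature record to a linear trend plus natural-influence regressors, then subtract the fitted natural terms. Everything below is synthetic; the real analysis uses the actual ENSO, solar, and volcanic aerosol indices and also fits optimal lags for each:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 264                               # months, 1990-2011
t = np.arange(n) / 120.0              # time in decades
enso = rng.standard_normal(n)         # stand-in ENSO index
solar = np.sin(2 * np.pi * t / 1.1)   # stand-in ~11-year solar cycle

# Synthetic temperatures: 0.17 C/decade trend plus natural influences and noise.
temp = 0.17 * t + 0.07 * enso + 0.05 * solar + 0.02 * rng.standard_normal(n)

# Regress on [intercept, trend, ENSO, solar], then remove the natural terms.
X = np.column_stack([np.ones(n), t, enso, solar])
coefs, *_ = np.linalg.lstsq(X, temp, rcond=None)
adjusted = temp - X[:, 2:] @ coefs[2:]

print(round(coefs[1], 2))  # recovered underlying trend, deg C per decade
```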

TAR Scorecard

From 1990 through 2011, the Rahmstorf et al. unadjusted and adjusted trends in the observational data are 0.16 and 0.18°C per decade, respectively.  Both are consistent with the IPCC TAR Scenario A2 projected rate of warming of approximately 0.16°C per decade.

AR4 Scorecard

From 2000 through 2011, the Rahmstorf et al. unadjusted and adjusted trends in the observational data are 0.06 and 0.16°C per decade, respectively.  While the unadjusted trend is rather low as noted above, the adjusted, underlying human-caused global warming trend is consistent with the IPCC AR4 Scenario A2 projected rate of warming of approximately 0.18°C per decade.

Frame and Stone (2012) Verify FAR Accuracy

A paper published in Nature Climate Change, Frame and Stone (2012), sought to evaluate the FAR temperature projection accuracy by using a simple climate model to simulate the warming from 1990 through 2010 based on observed GHG and other global heat imbalance changes.  Figure 12 shows their results.  Since the FAR only projected temperature changes as a result of GHG changes, the light blue line (model-simulated warming in response to GHGs only) is the most applicable result.

FS12 Fig 1

Figure 12: Observed changes in global mean surface temperature over the 1990–2010 period from HadCRUT3 and GISTEMP (red) vs. FAR BAU best estimate (dark blue), vs. projections using a one-dimensional energy balance model (EBM) with the measured GHG radiative forcing since 1990 (light blue) and with the overall radiative forcing since 1990 (green). Natural variability from the ensemble of 587 21-year-long segments of control simulations (with constant external forcings) from 24 Coupled Model Intercomparison Project phase 3 (CMIP3) climate models is shown in black and gray.  From Frame and Stone (2012).

FAR Scorecard

Not surprisingly, the Frame and Stone result is very similar to our evaluation of the FAR projections, finding that they accurately simulated the global surface temperature response to the increased greenhouse effect since 1990.  The study also shows that the warming since 1990 cannot be explained by the Earth's natural temperature variability alone, because the warming (red) is outside of the range of natural variability (black and gray).

IPCC Trounces Contrarian Predictions

As shown above, the IPCC has thus far done remarkably well at predicting future global surface warming.  The same cannot be said for the climate contrarians who criticize the IPCC and mainstream climate science predictions.

Richard Lindzen

In 1989, one year before the FAR was published, Richard Lindzen gave a talk at MIT which we can use to reconstruct what his global temperature prediction might have looked like.  In that speech, Lindzen remarked

"I would say, and I don't think I'm going out on a very big limb,  that the data as we have it does not support a warming...I personally feel that the likelihood over the next century of greenhouse warming reaching magnitudes comparable to natural variability seems small"

The first statement in this quote referred to past temperatures — Lindzen did not believe the surface temperature record was accurate, and did not believe that the planet had warmed from 1880 to 1989 (in reality, global surface temperatures warmed approximately 0.5°C over that timeframe).  The latter statement suggests that the planet's surface would not warm more than 0.2°C over the following century, which is approximately the range of natural variability.  In reality, as Frame and Stone showed, the surface warming already exceeded natural variability two decades after Lindzen's MIT comments.

Don Easterbrook

Climate contrarian geologist Don Easterbrook has been predicting impending global cooling since 2000, based on expected changes in various oceanic cycles (including ENSO) and solar activity.  Easterbrook made two specific temperature projections based on two possible scenarios.  As shown in Figure 1, neither has fared well.

Syun-Ichi Akasofu

In 2009, Syun-Ichi Akasofu (geophysicist and director of the International Arctic Research Center at the University of Alaska-Fairbanks) released a paper which argued that the recent global warming is due to two factors: natural recovery from the Little Ice Age (LIA) and "the multi-decadal oscillation" (oceanic cycles).  Based on this hypothesis, Akasofu predicted that global surface temperatures would cool between 2000 and 2035.  His is the least wrong of the contrarian predictions examined here, but his predicted 0.02°C per decade cooling between 2000 and 2012 has not matched the observed 0.06°C per decade warming trend, despite the fact that, according to Foster and Rahmstorf, natural climate influences have had an approximately 0.1°C cooling effect since 2000.

John McLean

John McLean is a data analyst and member of the climate contrarian group Australian Climate Science Coalition.  He was lead author on McLean et al. (2009), which grossly overstates the influence of the El Niño Southern Oscillation (ENSO) on global temperatures.  Based on the results of that paper, McLean predicted:

"it is likely that 2011 will be the coolest year since 1956 or even earlier"

In 1956, the average global surface temperature anomaly in the three datasets (NASA GISS, NOAA NCDC, and HadCRUT4) was -0.21°C.  In 2010, the anomaly was 0.61°C.  Therefore, McLean was predicting a greater than 0.8°C global surface cooling between 2010 and 2011.  The largest year-to-year average global temperature change on record is less than 0.3°C, so this was a rather remarkable prediction, and not surprisingly turned out to be very wrong.

IPCC vs. Contrarians Scorecard

Figure 1 at the top of this post compares the four IPCC projections and the four contrarian predictions to the observed global surface temperature changes.  We have given Lindzen the benefit of the doubt and not penalized him for denying the accuracy of the global surface temperature record in 1989.  Our reconstruction of his prediction takes the natural variability of ENSO, the sun, and volcanic eruptions from Foster and Rahmstorf (2011) (with a 12-month running average) and adds a 0.02°C per decade linear warming trend.  Note that this was not a specific prediction made by Lindzen, but rather is our reconstruction based on his 1989 comments.  All other predictions are as discussed above.

Not only has the IPCC done remarkably well in projecting future global surface temperature changes thus far, but it has also performed far better than the few climate contrarians who have put their money where their mouth is with their own predictions.

Conservative IPCC Errs on the Side of Least Drama

Although the IPCC climate models have performed remarkably well in projecting average global surface temperature warming thus far, Rahmstorf et al. (2012) found that the IPCC underestimated global average sea level rise since 1993 by 60%.  Brysse et al. (2012) also found that the IPCC has tended to underestimate, or has failed to account for, CO2 emissions, increased rainfall in already rainy areas, continental ice sheet melting, Arctic sea ice decline, and permafrost melting.  Brysse et al. conclude that on the whole the IPCC has been too conservative in its projections, "erring on the side of least drama" — in effect preferring to be wrong on the conservative side in order to avoid criticism.

Note: this post has been incorporated into the rebuttal to the myth "IPCC overestimate temperature rise"



Comments 1 to 35:

  1. Minor niggle: under 'Don Easterbook' you have the typo 'bsaed', instead of 'based'.
    The SAR also maintained the "best estimate" equilibrium climate sensitivity used in the FAR of 2.5°C for a doubling of atmospheric CO2. However, as in the FAR, because climate scientists at the time believed a doubling of atmospheric CO2 would cause a larger global heat imbalance than current estimates, the actual "best estimate" model sensitivity was closer to 2.1°C for doubled CO2.
    I'm confused by this. I thought the current best estimate of equilibrium sensitivity was about 3oC? I gather from the text that, if we take models used in FAR and apply a 2.1oC sensitivity, we come closer to observations. Is that a correct understanding, or am I missing the point? Mods: what is the correct code to render a degree sign?
  2. Thanks Doug @1, typo fixed. Yes, the current best equilibrium climate sensitivity (ECS) estimate is 3°C. However, remember that's equilibrium sensitivity, and we're far out of equilibrium. More relevant would be the transient climate response, but that wasn't reported in earlier IPCC reports. It's not so simple as adjusting for lower ECS, because it could just be that the model transient response is too fast or too slow, irrespective of ECS. So while the 'best estimate' models in the FAR and SAR had roughly 2.1°C ECS, we don't know what their transient climate responses were, so we can't adjust for that.
  3. dana1981 @ 2, I'm still confused, so I am obviously missing something. If I have read the OP correctly, FAR and SAR originally used 2.5o ECS, the revised graphs were based on 2.1o and the real ECS is approx. 3o. To my confused little mind, the revised graphs should use 3o. Where am I going wrong? PS: I still cannot get superscript tags to work, to render the degrees symbol: I am using the HTML codes 'sup' and '/sup' inside pointed brackets, but that is not working.
  4. Doug... I'd suggest just using "C" to denote temp. That's pretty much the standard format used everywhere. Remember, lots of folks are in the US where the degree symbol will be construed as degrees F.
  5. Hm... No sooner did I post that then I noticed Dana able to use the ° symbol. (I just cut and pasted it from Dana's post.)
  6. Rob @ 5, yes, it was because Dana is using it that I thought it must be invoked by using special html code. The standard HTML for superscript is 'sup', but that does not work here. Never thought of cutting and pasting it - doh! "8-\
  7. Well, Dana has special cyborg powers, so creating such symbols is easy for him. ;-)
  8. Doug @3 - no, what I was trying to say is that the FAR and SAR thought they were using models with 2.5°C ECS; however, because they overestimated the forcing associated with doubled CO2, the actual ECS of the models was about 2.1°C. All I've done in my graphs is take actual GHG emissions into account - I haven't done anything with regards to climate sensitivity there. On a Mac the degree symbol is shift-option-8, by the way. On a PC it's Alt-176 I think.
  9. dana1981 @ 8, thanks for the explanation. FAR and SAR were based on models using a lower ECS than the IPCC thought they were using. I take it that the graphs would look different again (worse), if we plugged in the current best estimate for ECS (3°C). As for the degree symbol, I am using Linux. It turns out that I can use the HTML code '& deg;' with no spaces - see this page.
  10. @Doug H "& deg;" is a HTML code which applies to all op systems. See the full list here.
  11. I just noticed that one can drag-and-drop the '©' symbol from the bottom left corner of this page where it says '© Copyright 2012 John Cook'. Can I suggest a degree (°) symbol is placed somewhere near there—or alongside the heading 'Post a Comment'—then anyone can drag and drop it into their comment while typing. Speaking personally I'd also like a '—' character too. I usually have to use two hyphens--like this--but the correct symbol would be useful.
  12. Dana, many thanks for the informative posting. I compared your figure 9 for the A4 multi-model projections with this figure of Gavin Schmidt. Schmidt plotted the observational data together with the A1B model data and due to i.a. the La-Nina's in 2008 and 2011 the observations are running a bit lower than the model average. This is also visible in your figure 9, but due to the different scaling it is less distinct than in the RealClimate image. How would the scaling you use in figure 9 work out for the hindcast as is shown in the RealClimate image? I checked the IPCC A2 model data and calculated a warming trend over 2000-2012 from the average of the A2 model data and got 0.19 °C/decade (ranging from min 0.03 to max 0.35). You got 0.16 °C/decade. Am I using the wrong data?
  13. Doug @9, as I noted @2, it's not as simple as adjusting for different ECS estimates because we're not in equilibrium.
  14. JosHag @12 - if I were to extend Figure 9 back further in time, it would look essentially the same as Gavin's figure, though slightly different as he's using A1B and I'm using A2. Our baselines are probably also different. I just set the model mean value in 2000 equal to the 1998-2002 average temperature anomaly. I'm using the A2 model mean data from here, but you're right that the trend should be 0.18 or 0.19°C per decade to this point. I'm not sure how I got 0.16°C, I'll have to check on that later, but I've revised the post accordingly.
  15. Dana, Excellent post as always. You should consider sending a copy of this post to the lead author of the IPCC chapter. There is some room for misunderstanding their original graph.
  16. dana1981 @ 13 thanks for persevering. I think it has sunk in now. John Russell @ 11, there are codes that provide all the common symbols. Some that we have discussed in this thread are:
    &deg; renders as the degree symbol °. &copy; renders as the copyright sign ©. &ndash; renders as an n dash –. &mdash; renders as an m dash —.
    It would be best to review the pages linked to in earlier comments here, as HTML is off topic for the thread. Thank you, mods, for letting it run.
  17. dana 1981 It seems that the best qualified skeptic Akasofu got closest to the actual 'averaged' temperatures. Presumably these averages are annual without any smoothing?? All the IPCC predictions are trending strongly up (0.19 deg/decade) - while the actual temperatures are slightly up (0.06 deg/decade). Akasofu at 0.02 deg/decade cooling is closer than the IPCC. Why is he then the 'least wrong'? Fig 11 corrects the actual temperature observations to what they would have been without solar, volcanoes and ENSO and gets a better match with IPCC projections. Did the original IPCC projections exclude all these factors? I seem to recall AR4 mentioned volcanoes as an intermittent and unpredictable cooling effect, ENSO is supposed to be neutral over several cycles (20 years?), and solar is dismissed in AR4 as a minor forcing of something like 0.12 W/m2 compared with a total radiative forcing of 1.6 W/m2. 20 years (1990-2010) is roughly 2 solar cycles - so the effect should be neutral over this period in any case unless more is made of longer term solar effects than does the IPCC in AR4. From what we know of ENSO, solar and volcanic effects - why would they aggregate in the 1990-2010 period to a number significantly different to any random 20 year period?
  18. ENSO isn't supposed to be neutral. It's an oscillation, hence over any given interval it will have a maximum possible influence on a trend line. 20 years certainly isn't long enough to be able to read an accurate air temperature gradient without controlling for ENSO.
  19. Tristan ENSO is supposed to be neutral. It is described as a cyclical redistribution of heat around the system according to Trenberth and others. If ENSO is not neutral over time then it becomes an external forcing which has not so far been accounted for in AR4 for example. There are several ENSO cycles over 20 years, so please nominate what time period is required for ENSO to become neutral.
    Moderator Response: [DB] Please take further discussion, including responses, about El Nino / La Nina / ENSO to the "It's El Nino" thread.
  20. Ron King, I think most of your questions are answered in the above post. For the influences of solar, ENSO, and volcanoes, see my post on Foster and Rahmstorf (2011). Akasofu is roughly as close to the measured trend since 2000 as the AR4, but as noted in the post, virtually every natural temperature influence has been in the cooling direction over that timeframe, and when those are filtered out with the Foster and Rahmstorf methodology, he's way off and AR4 is spot on. Climate models include natural variability simulations, but when multiple model runs are averaged together, that natural variability gets averaged out, which is why the multi-model temperature projections are quite smooth.
  21. dana 1981 Looking at your post on FR2011: The authors conclude by averaging all of the data sets together (Figure 4): "Because the effects of volcanic eruptions and of ENSO are very short-term and that of solar variability very small, none of these factors can be expected to exert a significant influence on the continuation of global warming over the coming decades." This says that the effects of ENSO, volcanoes and solar are effectively small to negligible over the coming decades. Yet you conclude that those 3 factors were significant (cooling) effects over the last 2 decades - effectively masking the AGW warming signal with natural cooling factors. Again there are nearly 2 solar cycles in the last 20 years which should neutralize TSI effects, which are small anyway according to FR; the volcanic chart shows Pinatubo (1993?) as the only major volcanic cooling effect (short term), which leaves ENSO which is also short term and not expected to be a decadal factor in the future. So practically the whole case rests on ENSO being a major cooling factor in the past 20 years - so much so that it has flattened the temperature trend from 0.18 down to 0.06 deg/decade. Sorry, I can't see this from the MEI chart from FR 2011 from 1990 - 2010. Neutral to slightly positive over 20 years looks more like it.
  22. Ron, read the paper. You're basically saying "I don't agree with the results of FR11 because my eyeballs disagree". Sorry, eyeballs are subject to bias, statistics are not.
  23. Ron, yes the overall decadally (or multi-decadally) averaged ENSO will be small over the coming decades. However ENSO can make a large contribution over a period of several years to a decade (depending on the length of time ENSO persists in an overall positive or negative mode). Can't really say what solar/volcanic influences will be in the coming decades since their variability is (as far as we can say) stochastic outwith the solar cycle. The solar contribution is expected to be small. However a prolonged solar "downturn" can make a persistent (small) contribution to surface temperature. So inspection of the ENSO index indicates it's been largely negative especially during the last 6 or 7 years. Likewise there has been a rather anomalous progression of solar activity with an extended minimum out of cycle 23. These add up to a significant negative contribution to surface temperature since about 2005/6. So the decadal temperature trend just past is suppressed. That's not difficult to understand I think. However neither of us needs to attempt to characterize these contributions in words since Foster and Rahmstorf (2011) have done the calculations!
  24. Regarding Figure 1, when did it become common to combine whatever combination of temperature records were needed to prove one's point?
  25. cormagh - It's not terribly common to average multiple temperature records in these discussions. But this certainly short-circuits the "You used a data set that (has poorly specified problems that boil down to contradicting my point of view)" arguments commonly heard from 'skeptics'. Personally, I prefer GISTEMP for accuracy, as it avoids the polar holes present in (for example) the HadCRUT and satellite data. And that's because I prefer looking at all the data rather than subsets. But again, averaging multiple data sets is reasonable, and probably preferable to plotting multiple data sets in this already crowded graph.
  26. All I can do is LOL at cormagh @24. No matter what dataset you pick, deniers accuse you of cherrypicking. You average multiple datasets together, and you're accused of 'combining temperature records to prove your point'. There's just no winning with denialists.
  27. The only answer that need be given to cormagh's comment is that his claim is simply false. A combination of temperature records is not "needed" to prove the point. Any of the major temperature records (GISS, NCDC, HadCRUT4, UAH, RSS, BEST, et cetera), or any combination of them, would show the same result: the IPCC projections have been in the ballpark, and all the denier estimates above have turned out to be below observations. The differences between the observational data sets are much smaller than the differences between the IPCC projections and the denier estimates.
  28. The degree sign on a PC can be typed by holding Alt and entering 167 on the numeric keypad: º (Alt 0176 gives °).
  29. In regard to the above comments about multiple data sets, perhaps a clearer case can be made by combining many data sets extending over a long time period and then linearizing the expected relationship between CO2 and temperature: plot the average temperature anomaly versus log base 2 (CO2 concentration by year / CO2 concentration in 1850). I used the averaged temperature anomaly since 1850 from the SkS temperature trend calculator, with CO2 from the Law Dome data plus Keeling, and performed this linearization. Here is the result: the statistical analysis from my data-processing program shows that an uncorrelated relationship between these variables has a probability of less than 1 in 10,000, and the correlation is strong at R = 0.91.
    Moderator Response: [DB] Fixed image.
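  [Editor's note: the linearization described in comment 29 can be sketched in a few lines of Python. This is a minimal illustration of the method only; the CO2 values and anomalies below are placeholder numbers, not the actual Law Dome/Keeling or SkS trend-calculator data, and the function names are ours.]

  ```python
  import math

  def linearize(co2_ppm, temp_anomaly, co2_baseline=285.0):
      """Return (x, y) pairs with x = log2(CO2 / CO2_1850), y = anomaly."""
      return [(math.log2(c / co2_baseline), t) for c, t in zip(co2_ppm, temp_anomaly)]

  def least_squares(points):
      """Ordinary least-squares slope, intercept, and Pearson r."""
      n = len(points)
      sx = sum(x for x, _ in points)
      sy = sum(y for _, y in points)
      sxx = sum(x * x for x, _ in points)
      syy = sum(y * y for _, y in points)
      sxy = sum(x * y for x, y in points)
      slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
      intercept = (sy - slope * sx) / n
      r = (n * sxy - sx * sy) / math.sqrt((n * sxx - sx * sx) * (n * syy - sy * sy))
      return slope, intercept, r

  # Placeholder inputs: CO2 in ppm and matching temperature anomalies in °C.
  co2 = [285.0, 315.0, 355.0, 395.0]
  anom = [0.0, 0.15, 0.45, 0.75]
  slope, intercept, r = least_squares(linearize(co2, anom))
  print(f"response per CO2 doubling ~ {slope:.2f} C, r = {r:.2f}")
  ```

  The slope of this fit has a convenient interpretation: because x is in doublings of CO2, the slope is the temperature response per CO2 doubling implied by the data.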
  30. A simple question on the IPCC predictions, in Figure 12 for example: do they hold GHGs constant, or do they also include a social-policy prediction? If so, what CO2 level are they designed around? No doubt the answer is in the references, but such critical information should be presented with the predictions. I speak as a casual reader with a PhD in biology.
  31. In regard to my comment 29 above, despite John Cook's valiant efforts to help me out in cyberspace, my graph still did not plot! This is a polite website, but perhaps an "Oh piffle!" is o.k. However, I was able to post my plot onto
  32. peggy @30 - all projections are based on a certain GHG emissions scenario. In Figure 12 they used the IPCC FAR 'business as usual' scenario discussed towards the top of the post.
  33. The IPCC FAR 'Best' BAU projected rate of warming from 1990 to 2012 was 0.25°C per decade. However, that was based on a scenario with higher emissions than actually occurred.

    Where does this come from? I'm looking at the IPCC FAR and it says "Under the IPCC Business-as-Usual (Scenario A) emissions of greenhouse gases, the average rate of increase of global mean temperature during the next century is estimated to be about 0.3°C per decade (with an uncertainty range of 0.2°C to 0.5°C) This will result in a likely increase in global mean temperature of about 1°C above the present value by 2025." (page xi). Furthermore, the graph on page xxxiv appears to show almost constant CO2 emissions until 2020 in the BaU scenario, whereas it is reported that CO2 emissions have actually increased since 1990.

    In the IPCC's defense, their second report in 1995 greatly reduced projections (SAR page 39), and the first report had said, in bold, “There are many uncertainties in our predictions particularly with regard to the timing, magnitude and regional patterns of climate change, due to our incomplete understanding” (page xii).

  34. "Where does this come from?" - page xxii of the Summary for Policymakers of the FAR (p. 30 of the PDF version found here).

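  [Editor's note: the per-decade rates discussed in comments 33-34 translate into total warming amounts by simple linear extrapolation. A minimal sketch, using the rates quoted in the thread; the function name and baseline years are illustrative only:]

  ```python
  def implied_warming(rate_per_decade_c, start_year, end_year):
      """Total warming implied by a constant per-decade rate over an interval."""
      return rate_per_decade_c * (end_year - start_year) / 10.0

  # FAR 'Best' BAU rate discussed above: 0.25 C per decade over 1990-2012.
  print(f"{implied_warming(0.25, 1990, 2012):.2f}")  # prints 0.55
  # FAR Scenario A central estimate of 0.3 C per decade over the same span:
  print(f"{implied_warming(0.30, 1990, 2012):.2f}")  # prints 0.66
  ```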
  35. Could this post be updated, particularly the prediction gif, to 2018?  2012 isn't as impressive as that would be. 


