







Comments 46851 to 46900:

  1. Watts Interview – Denial and Reality Mix like Oil and Water

    For what it is worth, from an email to A Scott:

    Scott,

    I also have been a little delayed in responding, the reason being I have been working on a comment at SkS, of which more later.

    I continue to believe you are trying too hard to find a problem. The reason for having linguistic conventions is so that you do not have to continuously restate details. That is, the convention of year zero BP exists so that scientists do not have to restate the start year every time they report on a new proxy or dating. It follows that where such a convention exists (and it does), the absence of further information is itself sufficient reason to consider the ages to be stated relative to 1950.

    However, seeing you want to be obtuse (I'm sorry, but there is no better word), the facts are:

    1) Meese et al 1994 established the standard chronology of the GISP2 ice core, in which "The column Ice Age gives ages in years before 1950 AD, where '0' refers to summer, 1950." The chronology is called the Meese/Sowers chronology because it is correlated with the Sowers et al 1993 chronology of Vostok. That is the standard chronology used thereafter, so a reference to the Meese/Sowers chronology in any publication after 1994 does not represent a revision after publication.

    2) Cuffey and Clow 1997 used the Meese/Sowers chronology, placing their use of the 0 BP = 1950 convention beyond doubt, given that they describe the data as "the age according to Meese/Sowers, given as years before 1950 AD."

    3) Alley 2000 uses the same chronology as Cuffey and Clow. We know this because:
    a) They do not mention using a different chronology for the data obtained from Cuffey and Clow, a severe lapse if they had in fact altered the chronology;
    b) Present = 1950 is the standard convention, and using a different convention without stating so would have been a significant lapse;
    c) Alley has confirmed the use of the Cuffey and Clow dating when inquiries were made;
    d) Comparison of the smoothed Cuffey and Clow accumulation rates with the reported Alley et al accumulation rates provides a best match on the assumption that Alley et al used the convention that 0 BP = 1950;
    e) Comparison of the Alley et al temperature reconstruction with the smoothed GISP2 d18O values gives the best match if you assume Alley et al used the standard convention; and
    f) The youngest gas age (indicating closure of the firn under compaction) for the GISP2 core is 89 BP, corresponding well with Alley's earliest date of 95 BP.

    I believe the last point is relevant because, prior to the closure of the firn, water vapour in the atmosphere can penetrate the firn and contaminate the data. Once the firn is closed, the relative mass of ice and water vapour ensures the data are dominated by the former.

    Against this, the only counter-evidence is a NOAA page, not constructed by Alley, which is known to have at least one wrong date. Specifically, it says the start date of the data is -107,175 AD, whereas the earliest date in the data repository is 49981 BP (or 48,032 BC). Given that the NOAA page is out by over 50 thousand years for the start year, are you really going to rest your case on a 145-year discrepancy for the end year? On top of that, you have Easterbrook's claims, when he is not in a position to know; and the caution of Alley's expression, which is not evidence that it is wrong.
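    The 0 BP = 1950 convention discussed above is easy to mechanize. Here is a minimal sketch (the function name and `datum` default are my own, purely illustrative) that reproduces the two ages mentioned: Alley's youngest sample of 95 BP and the repository's oldest entry of 49981 BP.

```python
def bp_to_calendar(age_bp, datum=1950):
    """Convert an age in years BP to a calendar-year label, using the
    standard convention that 0 BP = 1950 AD.

    There is no year zero in the BC/AD scheme, so astronomical year 0
    becomes 1 BC, -1 becomes 2 BC, and so on.
    """
    year = datum - age_bp            # astronomical year numbering
    if year > 0:
        return f"{year} AD"
    return f"{1 - year} BC"

print(bp_to_calendar(95))      # Alley's youngest sample -> "1855 AD"
print(bp_to_calendar(49981))   # oldest GISP2 entry      -> "48032 BC"
```

    Run against the two ages above, this recovers both the ~1855 terminal date and the 48,032 BC start of the repository data.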

    This brings me to the SkS post which bypasses the whole Easterbrook vs the evidence conundrum. In 2011, Kobashi et al created a new reconstruction of GISP2 temperatures using different data from the same ice core. Importantly, they brought the reconstruction up well beyond 1950, and included a direct comparison with Box 2009's reconstruction of site temperatures from regional weather data, and with the automatic weather station on site for the last two decades. In that post, I overlay Alley 2000 with Kobashi 2011, and the overlay clearly shows that onsite temperatures increased by just under 1.5 C between the end of the Alley 2000 data and the end of the 20th century. It further shows that the terminal date in Alley 2000 is in the 19th century.

    You may want to do that comparison yourself (along with those mentioned in 2 e & f above), but having done so, surely that is the end of the matter.

    Except (a prediction here), it won't be for Easterbrook. He will continue using a single proxy from a highly variable region in preference to using a large number of proxies from across the globe. He will continue to insist that (at best) only the global average temperature increase should be added to his terminal data, rather than the actual temperature increase at the site as shown by regional thermometers and by a new analysis of the GISP2 data. Possibly he'll go one step worse and insist on using the Greenland-average air/firn temperature difference rather than the site-specific difference as determined by data. All in all, he will continue to insist that Alley 2000 shows that modern temperatures (early 21st century) are very low relative to the Holocene average.

    I make the prediction with considerable confidence because he has been making the same refuted argument for many years, even though I know he has been advised of his mistakes. He is not interested in being accurate - he is interested in selling a message.

  2. To frack or not to frack?

    I do note from those reports that in the areas where fracking seems to cause concern (the names mean nothing to me, so I stand to be corrected here), the thermogenic gas is "wet" (it contains higher-chain hydrocarbons: ethane, propane, butane). Their presence is more strongly diagnostic, and easier to measure, than isotope studies, which can get muddied by mixing of biogenic and thermogenic methane.

  3. To frack or not to frack?

    gws - 14C would be no use in distinguishing biogenic coal gas from thermogenic methane - in both sources, 14C would be undetectably low.

    I followed the link but am still scratching my head somewhat. The link I posted refers to an analysis of the aquifer that vroomie is worrying about, and it came up biogenic. I note that the COGCC article does identify a well contaminated by thermogenic gas, but traces it to a leaking cap. There is not a lot of transparency around this data.

    I would be the first to admit that I have next to no information on fracking practice in the States. That extra chemicals (surfactants) are added is known to me, but that is common here too in geothermal. I would say that they pose manageable risks in storage, transport and particularly in seepage from circulation pits. Is the risk from these any higher than that from other chemicals in common use in various industries? I don't see how the new chemicals increase the risk of contamination of aquifers from the fracking process itself, however. The key factors are the casing seal and the eventual well seal.

  4. Making Sense of Sensitivity … and Keeping It in Perspective

    elsa,

    It is one of the few times I have read someone so wildly inaccurate in their assessment that they could not even aspire to being wrong. I do not think you read the same Economist article as everyone else. You are talking about an article whose second sentence is "But the problem is not going to go away.", and which finds the climate sensitivity debate "Hardly reassuring". Perhaps you should have read a bit farther than you did.

    ubrew2, the Economist is usually sound on the science, but is slavish in its promotion of fracking and natural gas as a "bridge fuel". Here, we can see the same trends as the housing bubble emerging. But fracking and bridge fuels are at least more "respectable" ideas than fake scepticism. Obama's new Energy Secretary seems to also hold to those views.

    Moderator Response:

    [JH] I deleted elsa's post because it was nothing more than sloganeering.

    [DB] Fixed typo per request.

  5. citizenschallenge at 08:35 AM on 29 March 2013
    The two epochs of Marcott and the Wheelchair

    I should have written

    ... around this Maginot Line

  6. To frack or not to frack?

    14C is not (yet) used in these studies. It would be challenging on methane anyway.

    @scaddenp: please follow the link in the post re water contamination. Thanks for the comment re 13C isotopes. OTOH, you are recycling a myth in stating "that's been done safely for decades". Today's techniques are different and they do pose different challenges, particularly re fugitive emissions. Safety re water contamination is a different question.

  7. citizenschallenge at 08:27 AM on 29 March 2013
    The two epochs of Marcott and the Wheelchair

    "Magnon Line"

    hmmmm...

    Tragically, we won't be able to do an end run around it.

  8. citizenschallenge at 08:25 AM on 29 March 2013
    The two epochs of Marcott and the Wheelchair

    Thanks for reposting this article by Jos Hagelaars.

    Now I'm able to repost it, yippee ;)

    "Marcott et al. 2013 - A Collection of Examinations and Reviews"

    http://whatsupwiththatwatts.blogspot.com/2013/03/marcott-et-al-2013-collection-of.html

    Thanks for all the hard work you folks do!

    Cheers

  9. Making Sense of Sensitivity … and Keeping It in Perspective

    elsa@12:  Please read the rebuttal to the bogus argument that "CO2 is just a trace gas".  The difference between a CO2 concentration of 0.0% and the pre-industrial concentration of 0.03% is the difference between a frozen Earth and a nice comfortable Earth that can support a population of billions.  Small amounts of lots of things make a big difference.  It's not the percentage that matters, it's the doubling of it that matters.
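    The "doubling" point can be made concrete: because the forcing from CO2 is logarithmic in concentration, each doubling adds roughly the same increment of warming. A short sketch of that relation (the 3 C-per-doubling default is just the canonical best estimate discussed elsewhere in this thread, not a fixed constant, and the function name is mine):

```python
import math

def equilibrium_warming(c_ppm, c0_ppm=280.0, sensitivity=3.0):
    """Equilibrium warming (C) for a CO2 change from c0_ppm to c_ppm,
    using the logarithmic relation dT = S * log2(C / C0), where S is
    the climate sensitivity in C per doubling of CO2."""
    return sensitivity * math.log2(c_ppm / c0_ppm)

print(equilibrium_warming(560))    # one doubling of 280 ppm  -> 3.0
print(equilibrium_warming(1120))   # two doublings            -> 6.0
```

    Note that the absolute percentage of the atmosphere never appears; only the ratio to the starting concentration matters.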

  10. To frack or not to frack?

    "... it seems strange to suddenly have a hue and cry about something thats been done safely for decades."

    What's been done for decades was simple water-and-sand fracking; what is recently being done is the use of many toxic chemicals in the frack fluid, some of which the oil corps *will not* allow information on. To me that is a huge red flag. "Gasland" is a good movie, and you ought to view it; the 'flaming faucet' isn't far from where I live, and I live smack dab in the middle of one of the biggest fracking frenzies in the country. Given the lies we were told, vis-à-vis 'safety' in the Gulf (and in many other despoiled environments), I have little reason to believe fracking is as safe as the oil corps claim it is.

    I am *not* willing to bet my source of fresh water--in a region where it is already scarce--on the possibility of it being ruined. And when you ruin an aquifer--think the Imperial Valley--it's ruined. No more water. An average fracked well uses between 4 and 6 million gallons of fresh water (brackish/salt water won't work), and the vast majority of that is ruined for animal use, human and non-human. If ever there was a call for the precautionary principle, writ large, this is it. There is no substitute for water.

  11. Making Sense of Sensitivity … and Keeping It in Perspective

    I read the Economist's article and all I can say is dana1981 seems to be correct.  For example, the article gives pretty much equal weight to the IPCC's 3 C sensitivity estimate and Berntsen's 1.9 C sensitivity.  It backs up Berntsen's estimate with other studies, and mentions even lower estimates several times.  The casual reader could easily come away with a 'feeling' that the sensitivity is likely 2 C or lower.  What the Economist omits is the many studies that point to a sensitivity of 4 C or higher.  This presentation of false balance should be familiar to watchers of Fox News.

    From the Economist article "[if climate sensitivity is low] more adaptation rather than more mitigation might be the right policy"  It might be if mitigation costs are high, but what if they are low?  Why does a magazine that calls itself 'the Economist' omit the HALF of that equation they would seem most qualified to estimate?  By omission, the article maintains the myth that mitigation costs are high, and hence the 'adaptation' decision is solely dependent on those durned climate scientists and their tea leaves.

    From the Economist article: "If climate scientists were credit-rating agencies, climate sensitivity would be on negative watch"  But actually, if climate scientists were credit-rating agencies, then they would rate Wallstreet mortgage derivatives 'AAA' and sell them to pensioners.  The Economist missed an $11 trillion housing bubble, yet the credit-rating agencies that helped create it are still the 'gold standard' when it comes to judging value?  Such is the world we live in, apparently.

    The uncertainty about climate change is now essentially ALL on the side of evaluating economic impacts versus costs of mitigation.  Given its recent failings, I guess I can't blame the Economist for focusing on climate uncertainty instead.  But it's no service to their readers.  The cost of mitigation is variously estimated at about 1% of GDP, which is pretty trivial.  And the benefits?  What is the benefit of keeping Manhattan and its credit-rating agencies from drowning?  Whatever that is, the Economist won't be talking about it.

  12. Making Sense of Sensitivity … and Keeping It in Perspective

    Climate sensitivity seems to me to be what divides the warmists from the deniers. (-snip-).  Take on board the full implications of the statement "To illustrate that latter point, in the Norwegian study referred to earlier, an estimate of sensitivity using temperature data up to the year 2000 resulted in a relatively high sensitivity of 3.9 C per doubling. Adding in just a single decade of data, from 2000 to 2010, significantly reduces the estimate of sensitivity to 1.9 C." and you ought to see what I mean.  To claim that we know with 95% confidence (which actually the IPCC has never done) that the average temperature will rise by x degrees cannot sit comfortably alongside this statement.

    Moderator Response: [DB] Off-topic sloganeering snipped. Perhaps in the 2 years and 7 months that have elapsed you should have taken the time to actually learn something about the climate models (one of the bases for determining CS) that you are so thoroughly uninformed about.
  13. Most of the last 10,000 years were warmer

    Lanfear @25, I apologize for the delayed response.  It is with good purpose, however.  I have had the good fortune of rediscovering a recent paper (Kobashi et al 2011) showing a reconstruction of GISP2 data using a different method from Alley et al (2000).  That paper, based on the same ice core as used by Alley et al, carries the reconstruction through to about 1990, something they are able to do because of higher-resolution sampling.  Further, they directly compare their reconstruction to the Box 2009 reconstruction of temperatures since 1840 that is used in the main post, and with the AWS temperature record.

    The main post here uses the Box data under the assumption that temperature differences determined by the same method are likely to be correct in terms of relative position, but when compared to records determined by other methods there may be biases (possibly unknown) which distort the result.  Based on this principle, it adds the temperature difference between the decadal average of 1850-59 (corresponding to the end of the Alley et al temperature proxy) and 2000-09 (the present) to the Alley proxy.  Decadal differences are used because of the low resolution of the proxy.

    Kobashi et al discuss the issue directly, saying:

    "Before the two records are combined with the 4000‐year temperature record, it is necessary to make an adjustment, as firn temperatures are colder than air temperatures by 0.2°C to 2.6°C in Greenland [Steffen and Box, 2001] as a result of the surface radiative cooling and inversion as noted above. As the mean difference between the decadal average reconstructed Summit temperatures and the reconstructed Greenland temperature for the 1845–2005 period is 1.75°C, the AWS and reconstructed Summit temperatures are reduced by 1.75°C (Figure 1)."

    (My emphasis)

    So you are correct that the cause of the difference is the difference between firn temperature and the surface air temperature.

    As to the question about GISP2 temperatures and the MWP, that is more complex.  The Kobashi 2011 reconstruction shows greater variability in temperature due to its higher resolution.  As a result it shows some temperatures exceeding modern values in the MWP.  It also shows a significant peak in temperatures around 700 AD which significantly exceeds modern values, although that predates the traditional dating of the MWP:

    (Kobashi et al 2011, figure 1.  Click on image for higher resolution.)

    I have attempted to overlay the Alley 2000 proxy data with the Kobashi 2011 reconstruction for comparison.  Doing so clearly shows the difference in resolution.  It shows some differences in the result as well.  Most notably, the 700 AD peak in Kobashi 2011 corresponds to a trough in Alley 2000.  I am certainly not expert enough to say which is likely to be more accurate where the two disagree.

    Despite the differences, the curves track each other reasonably closely.  The match should put paid to any doubts that the termination of Alley 2000 is around 1855, and that the 1855 temperature in Alley et al is approximately 1.5 C lower than modern temperatures at the same location:

    (Alley 2000 overlaid with Kobashi 2011.  Alignment was achieved based on the axes.  Click on image for higher resolution.)

  14. Making Sense of Sensitivity … and Keeping It in Perspective

    Elsa, given a no-feedback sensitivity of 1.2 C, by what physical process do you propose to get a sensitivity close to zero?

  15. To frack or not to frack?

    Distinguishing biogenic methane (from, say, coal beds) from thermogenic methane is not straightforward when the biogenic source is too old for 14C to be of use. If you only have CH4, then biogenic gas is isotopically lighter (more 13C-depleted), but it can be a difficult call. A better indicator of thermogenic gas is the presence of other hydrocarbons (e.g. ethane, butane), which are unlikely to be biogenic. According to this report, which looked at the contamination discussed in the Gasland movie and did both geochemical and isotope studies, the water contamination was from biogenic sources.

    And then you have the problem that fracking operations could theoretically at least mobilize biogenic methane.

    I haven't seen Gasland but I am suspicious of many anti-fracking claims. While it needs a good regulatory environment (like all mining), it seems strange to suddenly have a hue and cry about something that's been done safely for decades.

  16. The two epochs of Marcott and the Wheelchair

    I'd call it the "Magnon Line" as a play on the Maginot Line, since it embodies the beginning of the range of time over which modern humans took dominance over Neanderthals, and projects out to the end of humanity's pre-eminence on the planet (if we continue on this line).

    I think the A1B data should be dotted, and they should be using A1FI since it is the best fit to current emissions trends.  

  17. michael sweet at 06:07 AM on 29 March 2013
    Enhanced SkS Graphics Provide New Entry Point into SkS Material

    Jeff Masters at Wunderground weather in the US posted a blog that has a lot of Skeptical Science graphs in it.  No new data, but it shows that SkS is helping other scientists out.

  18. The two epochs of Marcott and the Wheelchair

    @Tom Curtis

    I think you want much more from this short post than it was intended to be.
    For me the graph of Marcott et al, as presented in figure 2, makes it visible that the decreasing trend in temperatures starting around ~7000 BP was reversed somewhere before 1900, and this is backed up by the instrumental record. The same applies to the graphs of Marcott et al and the graphs of Tamino you copied here; they also show that this long-term decreasing trend did change.
    That's the main message of my post.

    I had no intention whatsoever to go into the details, like uncertainties, robustness, different algorithms or calibration methods as present or used in the Marcott et al reconstruction. I think I clearly state that the study is about the Holocene and if you don't want to call that graph a hockey stick that's OK with me.
    Other graphs with same message:
    http://www.skepticalscience.com/pics/c4u-chart7.png
    http://www.arcus.org/synthesis2k/synthesis/2k_Arctic_temperature.jpg

    In my opinion the slow decline since ~7000 BP would not have changed without the human influence. According to the CMIP5 data the total anthropogenic forcing in 1940 with reference to 1765 is 0.6 W/m². With a TCR of ~0.5 °C per W/m², this equates to ~0.3 °C. Certainly substantial.
    Data from PIK-Potsdam. See also this SkS post.
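    The arithmetic in that estimate is simply forcing times transient response; a back-of-envelope sketch (values copied from the comment above, not authoritative):

```python
# Reproducing the ~0.3 C estimate: CMIP5 anthropogenic forcing in 1940
# relative to 1765, multiplied by an assumed transient response.
forcing_1940 = 0.6   # W/m^2
tcr = 0.5            # C per (W/m^2), approximate transient response
print(round(forcing_1940 * tcr, 2))   # -> 0.3
```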

    The other message of my post is that if humans continue on this path we will generate a change in temperatures of the same order of magnitude as from the last deglaciation to the Holocene maximum according to the Shakun data and that this change will be very fast on a geological timescale.

    You may be right that my posts (this post is a translation of a Dutch original) at the Dutch version of ourclimatechange do take some, or a lot, scientific knowledge for granted. Something worthwhile discussing with Bart and the other co-bloggers.

    Please write your own post if you are not happy with mine. Reading your last comment #20, I certainly have the impression that your opinion about global warming is about the same as mine, we probably only differ in opinion about the presentation and/or the level of detail needed.

  19. New Research Confirms Global Warming Has Accelerated

    Question: 

    Does this data indicate that Trenberth's modeled global energy budget (http://www.skepticalscience.com/Tracking_Earths_Energy.html)  underestimated surface absorption and consequently underestimated Aerosol-cloud cooling?

    The AR-5 reduced its estimates for the value of negative forcing caused by aerosols.  Does this then indicate that there is a higher uncertainty for these values, potentially stronger negative forcing?

  20. Making Sense of Sensitivity … and Keeping It in Perspective

    For the most part, The Economist has been doing a surprisingly good job covering anthropogenic climate change.  However, this was a very disappointing piece by The Economist.

    As Dana noted, climate sensitivity is complex (as is also evidenced by The Economist's long missive), and it showed in the inability of the article's author to correctly interpret the myriad of papers and data sources.  That they erroneously consider Nic Lewis to be a climate scientist was a huge red flag that the article's author was not informed enough to tell the real science from the chaff.  All in all, The Economist piece is an exercise in wishful thinking.

    Climate sensitivity is the fake skeptics' final fall-back position for inaction, so I can understand why they will cheer anything that they perceive to support their agenda. But even if climate sensitivity to doubling is "only" near 2 C, that does not make anthropogenic warming disappear, nor ocean acidification, nor rising sea levels, the loss of Arctic sea ice, receding glaciers, or increases in certain extreme weather phenomena; nor does it change the fact that on our current emissions path we will likely quadruple CO2 levels before 2100.

    Regardless of what equilibrium climate sensitivity is, we should be reducing GHG as quickly as possible because we have most likely already missed the boat to keeping warming to 2 C or less.

    Odd then, that some people seem to be trying to console themselves that everything will be OK or that aggressive action to reduce GHG emissions is not required by trying to argue for lower climate sensitivity.  That approach is very much backwards and amounts to nothing more than wishful thinking.

    Hopefully The Economist does better next time round.

  21. Making Sense of Sensitivity … and Keeping It in Perspective

    Kevin,

    The recent article is hardly "anti-AGW" either.

    So what does all this amount to? The scientists are cautious about interpreting their findings. As Dr Knutti puts it, “the bottom line is that there are several lines of evidence, where the observed trends are pushing down, whereas the models are pushing up, so my personal view is that the overall assessment hasn’t changed much.”


    But given the hiatus in warming and all the new evidence, a small reduction in estimates of climate sensitivity would seem to be justified: a downwards nudge on various best estimates from 3°C to 2.5°C, perhaps; a lower ceiling (around 4.5°C), certainly. If climate scientists were credit-rating agencies, climate sensitivity would be on negative watch. But it would not yet be downgraded.

    Best estimate is still 3 C...but maybe it will be concluded to be 17% less.

    Both articles have their caveats.  Both also emphasize a skeptical angle with regard to the best-estimate climate sensitivity (the 2011 article describes it as "good news").  Failing to do so would be boring, and the media doesn't like boring.

    Clearly, this isn't the first time The Economist has had a skeptical angle in their articles, although they certainly have had better-quality articles on climate-related issues.

    tobyjoyce,

    It has already been cited by contrarians around the web as though it were somehow supportive of their position.

    What deniers look for is movement towards their position from any realm (media, scientists, politicians, real or perceived), which is ultimately a strong desire not to legislate reductions in greenhouse gas emissions.  Even the slightest perceived movement warrants heavy spin, as they believe it helps with public perception, as in "hey look, the media is starting to figure out we're right".

    My observation is that the media on the whole have moved away from science in general, due in part to cost-cutting measures and the big economic downturn a few years ago.  What remains isn't of as good quality.  At the same time, denial has also been on the decline over the last few years (public polls indicate that as well).  ClimateGate is virtually dead, and I'm not seeing as many contrarian views covered in the media, other than the usual suspects.  Could be the whole ClimateGate thing killed contrarian credibility, and the media felt duped in the end.  Deniers have to stretch these days to find something supportive of their movement.

  22. Arctic methane outgassing on the E Siberian Shelf part 2 - an interview with Dr Natalia Shakhova

    The age of the methane is of interest.  If it is found to be greater than the 50,000 years that can be measured with carbon dating, this would point to a geological source such as outgassing from coal, shale or liquid hydrocarbon deposits.  This would reinforce the idea that such seeps would be trapped under ice sheets as clathrates once the ice sheet had reached about 300m deep.  Over the roughly 100,000 years that glacials last, a lot of methane can accumulate, only to be released when a Milankovitch nudge starts the melt.

    It could be that no methane pulse would be seen in ice bubbles from Antarctic or Greenland cores.  The half-life of methane is about 7 years, and much of the oxidation is due to hydroxyl radicals in the upper atmosphere.  Whether this oxidation would continue inside the ice is a moot point.  However, the top 70 or so meters of a forming ice sheet still exchange gas with the atmosphere.  The bubbles are not yet closed.  This would tend to blur or even eliminate a methane signature in ice bubbles even if it did occur in the atmosphere.
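    To see why a short-lived pulse could vanish before bubble close-off, here is a minimal exponential-decay sketch using the ~7-year half-life quoted above (the half-life value is the comment's figure, not a measured constant, and the function name is illustrative):

```python
def methane_fraction_remaining(t_years, half_life=7.0):
    """Fraction of an initial methane pulse remaining after t_years,
    assuming simple exponential decay with the given half-life."""
    return 0.5 ** (t_years / half_life)

print(methane_fraction_remaining(7))             # -> 0.5 after one half-life
print(round(methane_fraction_remaining(70), 4))  # ten half-lives: ~0.001 left
```

    On this simple model, essentially all of a pulse is gone within a few decades, well inside the ~70 m firn zone's period of open gas exchange.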

    Another unsettling effect of methane release, especially on the continental shelf, is the loosening of the sediment, both by the melting of the "clathrate permafrost" and by the expansion of the gas, turning the sediment into a fluidized bed.  Sudden slumps on continental shelves have been implicated in localized but very severe tsunamis.  Will the coastal people of the Arctic Ocean experience some of these in the not too distant future?

  23. Making Sense of Sensitivity … and Keeping It in Perspective

    I was disappointed overall with the tone of the Economist piece, because they are generally "sound" on the science of climate change. It shows that propaganda about a temperature "hiatus" and lower carbon sensitivity has gained some traction in the more rational media.

    It has already been cited by contrarians around the web as though it were somehow supportive of their position.

    But look on the bright side - David Rose and the Daily Mail it wasn't. Not even close. 

  24. Making Sense of Sensitivity … and Keeping It in Perspective

    NewYorkJ,

    Before you take the SUV out for a celebratory spin, though, it is worth bearing in mind that this is only one study, and, like all such, it has its flaws. The computer model used is of only middling sophistication, Dr Schmittner admits. That may be one reason for the narrow range of his team's results

    That is the statement from the previous article you reference.  Not exactly an anti-AGW statement.  It looks like the other article was just a piece on this one paper.

  25. Making Sense of Sensitivity … and Keeping It in Perspective

    Although deniers are trying to spin it otherwise (part of the message they're trying to sell is that their movement is growing, and any material that they might spin as being skeptical is evidence of that), this isn't the first time The Economist has commented on climate science, or with an arguably skeptical angle.  In 2011, for example:

    The climate may not be as sensitive to carbon dioxide as previously believed

    Another area where The Economist article fails is in being selective about the evidence it presents.  It neglects, for example:

    Fasullo and Trenberth Find Evidence in Clouds for High Climate Sensitivity

  26. Making Sense of Sensitivity … and Keeping It in Perspective

    elsa,

    A zero climate sensitivity is so far beyond reality as to border on insanity.  That's like saying gravity might stop at any moment -- that the effect of gravity could turn out to be near zero.  A zero climate sensitivity is that insane.

    Best estimates for climate sensitivity today are still between 2.5 and 4 C per doubling.  Even the lowest in that range, 2.5 C, will turn out to be very, very bad for civilization, and at this point in time it seems vanishingly improbable that we will not shoot well past a mere doubling.

    People don't seem to recognize how much of a change that minimum reasonable estimate represents.  And heaven help us if it's even just a little higher (like in that most likely case of 3 C or even 3.5 C).

  27. Making Sense of Sensitivity … and Keeping It in Perspective

    I just thought it was strange for The Economist to try and tackle this subject because they clearly lack the expertise to properly interpret the research.  It seemed like they sent an intern to find a bunch of papers to reference, dumped them into an article, and didn't really know what to say about them.

    elsa @2 - given that climate sensitivity is indisputably different from zero, I guess you're agreeing that Zeke's statement is true.

  28. Making Sense of Sensitivity … and Keeping It in Perspective

    Climate Sensitivity - is it rather a strange subject for in-depth analysis by The Economist? Not really.

    Surely economists would do well to understand it. And even if it were not, in its Science & Technology section, The Economist does cover subjects that cannot be relevant to economics. For instance, last week it looked at the evolution of animals 500m years ago. Perhaps then, with all the economic gloom, an occasional diversion is good.

    The Economist did a poor job with the article discussed in this post. They did also run a front-piece editorial which struck a better note, concluding "If the world has a bit more breathing space to deal with global warming, that will be good. But breathing space helps only if you actually do something with it."  Still, I don't think it makes up for the dreadful article.

  29. David Rose Hides the Rise in Global Warming

    I am a UK resident, and have submitted a complaint to the UK Press Complaints Commission.  They informed me they are already investigating the article due to an earlier complaint.  I will let you know the outcome.

  30. Making Sense of Sensitivity … and Keeping It in Perspective

    "Just how warm the world will be in 2100 depends more on how much carbon is emitted into the atmosphere, and what might be done about it, than on what the precise climate sensitivity ends up being."  That would only be true if the precise climate sensitivity was a positive number materially different from zero.  At zero the volume of CO2 in the atmosphere would make no difference to temperature. 

  31. Making Sense of Sensitivity … and Keeping It in Perspective

    "Ultimately it was rather strange to see such a complex technical subject as climate sensitivity tackled in a business-related publication. While The Economist made a good effort at the topic, their lack of expertise showed."

    The Economist has published articles on science and technology for at least 40 years, indeed it usually has an entire section devoted to that topic.  For some years it has also taken the warmist line.  That they are now raising questions for the AGW side marks something of a turning point in my view.

  32. The Big Picture (2010 version)

    tcflood,

    I was hoping to have found a place where I could ask honest questions and have honest discussions.

    That's not how it looks to me, based entirely on your own posts.  You listed 5 common, low-level denial arguments that are very easily rebutted just by using the Search box and following the links.  None of them required interaction with or interjection by people who have a deep understanding of the climate science.

    Yet people engaged you anyway, and gave you a wealth of information that a trained chemist should be able to easily absorb.

    All you did in response was to change the subject and list more and more denial arguments.  I put a lot of time into creating this response for you, directly answering your question, in the fashion you requested (no math), using a frame of reference (chemistry) that should be right up your alley.

    You apparently did not even read it, let alone respond.  Instead, you decided to take your ball and go home in a huff.

    Your behavior suggests another agenda, that you were never really here to ask questions or to learn.  All you did was to clutter up this thread with off-topic denial sloganeering (all of which, if strictly subject to the site's Comments Policy, should be deleted).

    People responded, and you never once provided a counter-argument or delved more deeply into any subject.  You simply got angry, complained, or more often than that rushed to change the subject to yet another silly denial argument.

    Sorry, but based on your own behavior, visible to all, you'll get no sympathy here.  Your understanding of climate science is abysmal.  I know high school students with a deeper grasp of the science than you have demonstrated.  If you wish to be respected for your knowledge and understanding, then it needs to be adequately developed.

    Your decision to stop posting is probably the correct one.  However your decision to leave, without taking the time to learn exactly how much you fail to properly understand, is ill-advised.

    More people need to spend more time reading and learning instead of getting angry because their ignorance is not given "due respect."

    Asking honest questions is always welcome.

    Asking questions with no intention of listening to the answers is not.

  33. 2013 SkS News Bulletin #5: Alberta Tar Sands and Keystone XL Pipeline

    chriskoz:

    I agree completely with your assessment of Andy Skuce's article, Keeping the Cork in the Oil Sands Bottle. In fact, I would like to see it reposted on SkS. 

  34. A Detailed Look at Renewable Baseload Energy

    Citibank recently did an analysis, discussed here, which closely matches my own expectations. This analysis suggests that the most likely scenario for global energy production in the short term is rapid growth of solar and wind power with natural gas as backup generation. This would slowly eliminate the concept of 'baseload' power... you'd have intermittent power from wind and solar and whenever demand exceeded the available intermittent power the natural gas plants would kick in temporarily to make up the difference. As time goes on the need for natural gas backups would then be eliminated by having a large enough smart grid to dispatch intermittent power from areas with excess to areas with a shortfall and/or various methods of power storage. Some areas will continue to rely on relatively steady power sources like hydro, geothermal, and nuclear which are already in place, but most generation will be from solar and wind.

    Barring some major new power innovation this path seems nearly inevitable to me. Wind, solar, and natural gas are all now cheaper in nearly every market than coal, oil, and nuclear, and this price gap will only get greater with time... and indeed wind and solar will soon be cheaper than natural gas as well. Hydro and geothermal are cheaper in some areas, but more expensive or impossible in others. Thus, economics will inevitably drive wind and solar to replace coal, oil, and nuclear. Natural gas compared to other fossil fuels is cheaper, probably less polluting (though more study of methane leakage is needed), and easier to quickly bring online and shutdown without tremendous inefficiency. Thus, it will logically fill the role of backup power source for most of the world until the infrastructure needed for fully renewable power is in place.

    If the future plays out this way then global carbon emissions will decline to safe levels long before we get to scenarios where we burn all available fossil fuels. Maybe we pass 560 ppm atmospheric CO2, but certainly not 1120 ppm. Of course, that could still result in very damaging AGW (especially if large methane feedbacks kick in), but we should be able to avoid the most catastrophic scenarios. Unfortunately, all that is based on current technology. Someone could come up with a cheap way to collect methane hydrates from the ocean in a few years and thus undercut solar and wind prices with a massive new fossil fuel source... at which point we're back to catastrophe unless governments actually respond intelligently and start putting a price on carbon.

  35. To frack or not to frack?

    Chriskoz @3:

    The problem is that there is 'natural' fossil methane 50 meters down in some places, so the expected isotopic fingerprint would be that of fossil carbon. Also, shallow methane is more likely to be present in areas where fracking occurs, because both may have the same source. Therefore, correlation alone cannot prove causation. The only way to know is to take a measurement before drilling, and one after.

    Also, even when you can prove a contamination is from a nearby well, the cause may not be the fracking process but a leak near the top of the well. The latter can be mitigated by better regulation; the former cannot. So it is important to know which is actually the case. 

  36. Dumb Scientist at 21:15 PM on 28 March 2013
    Tung and Zhou circularly blame ~40% of global warming on regional warming

    The article linked to a radiative forcings chart that includes deforestation as a small "land use" component. It's negative because clearing rainforests to plant endless fields of identical crops increases the albedo, reflecting more sunlight and producing a slight cooling effect.

    I've already noted that deforestation also releases CO2 but in much smaller quantities than fossil fuel emissions. If deforestation before 1950 warmed the planet at the same rate as fossil fuel emissions after 1950, then why did atmospheric CO2 increase faster after 1950?

  37. Tung and Zhou circularly blame ~40% of global warming on regional warming

    What if the 0.08/decade increase is not attributable to new emissions but to the effect of deforestation? Deforestation has been occurring at a significant rate since the 1850s, which corresponds to the data set's time period.

  38. Polynomial cointegration refutes AGW

    Hendry & Pretis published a comment on Beenstock et al in February 2013 in Earth System Dynamics.

    http://www.earth-syst-dynam-discuss.net/4/219/2013/esdd-4-219-2013.html

    It constitutes a full refutation of their paper. 

    Abstract:

    In their analysis of temperature and greenhouse gases, Beenstock et al. (2012) present statistical tests that purport to show that those two variables have different integrability properties, and hence cannot be related. The physics of greenhouse gases are well understood, and date from insights in the late 19th century by Arrhenius (1896). He showed that atmospheric temperature change was proportional to the logarithmic change in CO2. Heat enters the Earth’s atmosphere as radiation from the sun, and is re-radiated from the warmed surface to the atmosphere, where greenhouse gases absorb some of that heat. This heat is re-radiated, so some radiation is directed back towards the Earth’s surface. Thus, greater concentrations of greenhouse gases increase the amount of absorption and hence re-radiation. To “establish” otherwise merely prompts the question “where are the errors in the Beenstock et al. analysis?”.

    We will demonstrate several major flaws in their approach, such that none of their claimed conclusions has any evidential basis.

    Section 2 uses an uncontroversial example to highlight the dangers of approaches that fail to address all the complications inherent in statistical analyses of observational data.

    Section 3 applies the reasoning to the apparently more controversial case of the relationship between greenhouse gases and temperature.

  39. The two epochs of Marcott and the Wheelchair

    KR @19, I am certainly not arguing that the original paper did not adequately communicate the lack of robustness in the reconstruction over the last two centuries.  In addition to the passage you quote, they showed several graphs emphasizing the point.

    Nor am I necessarily criticizing the article above.  If that article was written as an introduction to the paper for a scientifically literate audience who could be expected to understand about uncertainty and robustness, and to look up the paper for themselves, it is quite appropriate.

    Here, however, because SkS is trying to reach not just the scientifically literate, SkS needs to explicitly canvass the issues regarding robustness, and also the method actually used by Marcott et al to determine their headline result.  This is particularly the case in that the interpretation of the uptick as representing the twentieth century warming, and of Marcott et al as another hockey stick, is being widely played up around the web.  Both interpretations are false, and easily shown to be false.  Not providing the correct information, therefore, merely allows "skeptics" to score cheap points on the paper without addressing the substantive issues that it raises, ie, that:

    1)  Temperatures now are in the upper range of Holocene temperatures and will rapidly exceed temperatures experienced throughout the Holocene in this century; and

    2) Temperatures at the start of the 20th century were close to minimum Holocene values, such that temperatures have gone from near minimum to close to maximum Holocene values in just one century, reversing 4 thousand years of temperature decline.  That rate of change is likely unprecedented throughout the Holocene.

  40. 2013 SkS News Bulletin #5: Alberta Tar Sands and Keystone XL Pipeline

    Well written piece on Planet 3.0 by our own Andy Skuce.

    I agree with the analysis therein: it's not the CO2 numbers that matter in that debate but the symbolic approval of the investors to continue or increase the financing of these industries. The industries would not go ahead without those investments. KXL itself may be a small step in increased emissions, but it will lead to a "giant leap" when investment monies go behind it.

  41. To frack or not to frack?

    Did they measure the d14C in this elevated level of methane around fracking wells? A lower-than-background d14C would confirm that the source is fossil fuel rather than the active biosphere, and therefore settle the question of leaks that FF industries are obviously denying despite rumours.

    Or are the expected isotopic differences too small to measure?

  42. New Research Confirms Global Warming Has Accelerated

    GrindupBaker - Dr Trenberth was lamenting the inadequate state of global observations. He was not literally saying the heat was missing - just that we lacked the ability to measure where it was i.e the deeper ocean. In essence, he was urging scientists to work harder on solving this problem. Contrarians have predictably twisted the intent of Dr Trenberth's words, but that comes as no surprise. 

    The ARGO network has remedied the inadequacy of the ocean measurements to some extent, but they still only measure down to 2000 metres - whereas the global oceans are over twice that depth on average. 

    Still a lot of work to be done, and a deep-ocean observation system would be desirable but, as Balmaseda (2013) has demonstrated, the observations show that global warming has actually accelerated over the last 16 years.

    This implies that the climate sensitivity (the temperature increase with a doubling of atmospheric CO2 concentration) is at the higher end of the range of estimates.

  43. A Detailed Look at Renewable Baseload Energy

    vroomie,

    I already tried making that point (here and here) but apparently discovering that he was wrong about fuel supply by a factor of 150 was not enough to make JvD question his level of subject knowledge. He never addressed my point about the potential penetration level of nuclear power after I showed France was not the good example he thought it was, either.

    Regarding the "uranium from seawater" idea, "The total amount of uranium recovered in an experiment in 2003 from three collection boxes containing 350 kg of fabric was >1 kg of yellow cake after 240 days of submersion in the ocean." (Wikipedia, citing Seko et al. 2003.) That suggests that you would need 55,000 tonnes of fabric per reactor, submerged in seawater for 2/3 of the year, in order to collect enough uranium for that reactor for one year. Practical? You be the judge.
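    The 55,000-tonne figure can be checked with back-of-envelope arithmetic (a sketch using only the numbers quoted above; the assumed annual fuel requirement of ~157 tonnes of yellowcake per reactor is my inference from the comment's own scaling, not a sourced value):

```python
# Experiment cited above: 350 kg of fabric yielded ~1 kg of yellowcake
# per 240-day submersion cycle (roughly 2/3 of a year in the water).
fabric_per_kg_yellowcake = 350.0  # kg of fabric per kg of yellowcake per cycle

# Assumed yellowcake requirement per reactor per year, in kg
# (hypothetical figure implied by the comment's 55,000-tonne total):
reactor_need_kg = 157_000

fabric_needed_tonnes = reactor_need_kg * fabric_per_kg_yellowcake / 1000
print(fabric_needed_tonnes)  # 54950.0 -- i.e. roughly 55,000 tonnes per reactor
```

    So the comment's scaling holds up: tens of thousands of tonnes of fabric, submerged for most of the year, per reactor.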

    JvD, quoting The Australian, apparently quoting Hansen, said:

    Even in Germany, which pushed renewables heavily, they generated only 7 per cent of the nation's power.

    The thing about true sceptics rather than "fake skeptics" is that we allow facts to change our minds rather than mindlessly accepting arguments from authorities that we happen to like. In this case it's entirely possible that The Australian is accurately reflecting Hansen's opinion (although I wouldn't automatically assume that), but even though I respect Hansen greatly I'm not going to simply take his word for it, especially since it's not an area that he is an actual authority in.

    In fact, in 2011, 20.5% of Germany's electricity supply was produced from renewable energy sources, compared to 17.7% from nuclear, and Germany has far from the highest percentage of renewable power generation of any country in the world, or even the EU!

    JvD has repeatedly tried to portray SkS as holding the "contrarian" position when it comes to renewables and nuclear power, and himself as holding the "scientific" position. Normally, to disabuse someone of that notion, I would point them to a well-written and researched article on SkS. In this case, the article I would point JvD to... is this one! He hasn't argued against any of the reports presented; he has merely stated that it is "per definition" true that "Intermittent renewables cannot provide baseload power". If only SkS realised that it could debunk all those skeptical claims by saying they were wrong "per definition"!

    Since JvD is in Europe, it's surprising that he overlooks one significant benefit that Europe has, which is Norway's massive potential for pumped hydro storage. Fully developed, it could single-handedly power all of Europe for weeks, allowing Europe to easily take advantage of large amounts of intermittent renewables. Indeed, pumped hydro storage construction is booming in Europe, with this decade set to see the largest growth in capacity on record, precisely to accommodate intermittent renewables. 

  44. New Research Confirms Global Warming Has Accelerated

    @grindupBaker:

    Unlike most SkS authors, I am not a climate science wonk and therefore am not able to respond to your question. I suspect that others will do so however. 

  45. New Research Confirms Global Warming Has Accelerated

    mehus@19 dana1981@20 John Hartz@21 I inferred from Kevin Trenberth's lecture on YouTube that the "missing heat" was the discrepancy between the radiation imbalance (KT stated 0.9 W m^-2) and the change in ocean heat expected from it. If so, volcanic aerosols and other factors would not affect the "missing heat". Did I misunderstand what the discrepancy was? Was it rather a discrepancy between the underlying physics, or models, and the prior change in ocean heat measured?

  46. The two epochs of Marcott and the Wheelchair

    Tom Curtis - "The question then becomes, why did you not communicate those uncertainties and lack of robustness?"

    I believe that uncertainty was sufficiently stated in the original paper, with "...this difference is probably not robust."

    Personally, after examining both the paper and in particular Tamino's analysis, my initial response is confirmed - the size of the uptick is to some extent an artifact of the processing. The RegEM processing (which would have been my preference, quite frankly, with infilling rather than increasingly limited spatial data) and the Tamino differencing methods are far more likely to reflect the ground truth. 

    And, of course, the alignment with current temperatures was done over 500-1500AD, not over the last 500 years, with the Mann 2008 and HadCRUT3 data aligned as per overlap. 

    Which leaves the deniers frantically looking for some possibility of current temperature swings being natural (good luck with that!) rather than the result of our actions. 

  47. The Big Picture (2010 version)

    tcflood - Regarding reflection (albedo), in the SW frequencies it's about 30% reflection. Again, a number that is easy to find if any effort is put forth, and one of the components leading to the 240 W/m2 average insolation. Why have you not looked this up???

    I hate to say this, but since you have not actually posed any questions or concerns regarding the greenhouse effect, but rather talked a lot about uncertainties without either quantifying said uncertainties or conveying much of substance, I would at this time regard you as a concern troll.

    I would be more than willing to be proven wrong, mind you - but that's going to take some actual questions or assertions on your part, rather than vague 'concerns' and run-on postings. 

  48. The two epochs of Marcott and the Wheelchair

    KR @11, that is correct.  However, as Marcott et al show (Fig 1E), their results are fairly robust to the choice of recent reconstruction; and as Tamino shows, they are robust even for direct matching to the instrumental record over the period of overlap, provided the method of differences is used rather than simple averages.

  49. The two epochs of Marcott and the Wheelchair

    Eli @14, I agree.  However, when the graph of the reconstruction is not robust (in the last two centuries), we should clearly state that rather than focussing on trivia of shape.  It is a matter of clear science communication.  If we do not do it, we create hostages for the deniers: people whose understanding of the graph is superficial, and hence who are likely to be persuaded by superficial criticisms.

  50. The two epochs of Marcott and the Wheelchair

    Jos Hagelaars @15, I will accept your word that you were aware of the uncertainties, and presumably lack of robustness, of the uptick at the end of the Marcott graph.  The question then becomes, why did you not communicate those uncertainties and lack of robustness?  Given that there are serious issues about the robustness of the reconstruction over the last few centuries, why is that issue never canvassed in your article?  Why is the word "robust" not even mentioned?

    In your article you write:

    "The temperature reconstruction ends mid-20th century, so the rapid temperature rise since 1850 is clearly visible in the graphs presented in their study. And what do we see?  Again something that looks like a hockey stick as in the graph from Mann et al 2008."

    "[S]omething that looks [as much]  like a hockey stick as in the graph from Mann et al 2008."  Really?

    (Source)

    Everything beyond the uptick visible in the RegEM reconstruction is an artifact of the drop out of proxies with time.  Therefore the hockey stick like appearance of the graph that you are focussing on is an illusion - an artifact of a poor statistical technique.

    As it happens the mean uptick in the reconstruction using the method of differences and simple averaging only reaches an anomaly value relative to 1961-90 of -0.04 C.  (That is not clear from Tamino's graph, as he uses a different baseline.)  That lies outside the lower range of the 2 sigma error margin of the Standard 5x5 method shown in the main graph.  The mean uptick using original published ages for the proxies is only -0.1 C relative to 1961-90, or 2.5 standard deviations below the uptick actually shown.  That is probably the best estimate of the mean temperature from 1930-1950 using the Marcott proxies, and is a value exceeded by greater than 50% of Holocene temperatures.  It compares well with the HadCRUT3v value of approximately -0.06 C for the same period.

    Again, this is not an issue of uncertainty, but of robustness.  The Marcott et al uncertainty estimates do not capture the effect of averaging without regard to the drop out of proxies.  More precisely, they show the influence of the reduced number of proxies, but do not account for the influence of the relative temperatures of the proxies that drop out.  That is why Marcott et al indicated the twentieth century temperatures were not robust (ie, that they were likely to change significantly as the result of improved or different methods), rather than that they are uncertain (ie, that improved proxy sets are likely to narrow the estimate within the current estimated uncertainty range).

    This can be seen with the table of values and 1 sigma uncertainties for the different methods tried in Marcott et al:

    Method             1940 Anomaly (Degrees C)   1 sigma (Degrees C)
    Standard 5x5       0.60                       0.28
    Standard 30x30     0.42                       0.16
    Standard 10 lat    0.52                       0.25
    Standard           0.28                       0.23
    RegEM              0.02                       0.63
    RegEM 5x5          0.05                       0.13
    Jack30             0.65                       0.34
    Jack50             0.60                       0.40

    The range of values over which all estimates overlap within 2 sigma is 0.1-0.31 C, well above the probable values for the period.  Further, only half of the methods (Standard, RegEM, RegEM 5x5, and Jack50) have 2 sigma uncertainty ranges that overlap the probable values.
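    That overlap range can be reproduced directly from the table above (a minimal sketch; the values are transcribed from the table, and the overlap is just the intersection of the mean +/- 2 sigma intervals):

```python
# (method, 1940 anomaly in C, 1 sigma in C) -- from the Marcott et al table above
methods = [
    ("Standard 5x5",    0.60, 0.28),
    ("Standard 30x30",  0.42, 0.16),
    ("Standard 10 lat", 0.52, 0.25),
    ("Standard",        0.28, 0.23),
    ("RegEM",           0.02, 0.63),
    ("RegEM 5x5",       0.05, 0.13),
    ("Jack30",          0.65, 0.34),
    ("Jack50",          0.60, 0.40),
]

# Each method's 2-sigma interval is mean +/- 2*sigma; the region where all
# estimates overlap is the intersection of those intervals.
lo = max(m - 2 * s for _, m, s in methods)
hi = min(m + 2 * s for _, m, s in methods)
print(round(lo, 2), round(hi, 2))  # 0.1 0.31
```

    The lower bound comes from Standard 30x30 and the upper bound from RegEM 5x5, the tightest intervals on each side.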

     

    Further, it is not true that "After the year 1850, the influence of man-made emissions is clearly visible in Marcott's figure".  First, this is not true because the rise in temperature to 1940 is primarily (though not exclusively) due to natural causes.  Second, it is not true because the rise to 1940 is well within the range of variability demonstrated by Mann et al 2008's reconstruction over the preceding 2000 years.  Third, it is not true because even the start of the rise is not a robust feature of the analysis carried out by Marcott et al., as seen in Fig 1C of the paper:

    In that figure, depending on which of several plausible methods you use, the rise starts as early as 1650.  That is because the rise in those methods is primarily an artifact of the drop out of proxies, and hence cannot be used to determine the timing of the real rise.

    Using the differencing method, a better estimate can be obtained, but then the rise starts 1750 or 1800 depending on whether you use original published ages or Marcott's redating of the proxies. (See the fourth figure in my post @8.)

     

    Finally, although you did quote some of the passage I quoted from Marcott et al, you did not explain the reasoning behind the quote.  You did not show how the analysis by Marcott et al allowed them to reach their conclusions.

    This is crucial.  By leaving people with the impression that Marcott's conclusion was based on the spurious spike, you also leave them vulnerable to believing McIntyre has refuted Marcott when all he has done is quibble about some fringe issues.  More generally, by not showing the why of the reasoning, you have left people reliant on authority, or their own ability to interpret Marcott et al (which is not the clearest paper when it comes to describing methodology).

    I do not know you or your blog.  You may write for scientifically trained or technically minded people who can be relied on to look up the paper for themselves, and examine the ins and outs of the reasoning.  Skeptical Science, however, is aimed at a general audience.  In principle, most blog posts should be readable and understandable by a person with only ten years of schooling and a moderate facility with science.  For those readers, in cross posting, the basics of Marcott's reasoning and the pitfalls in his method should have been explained.

    They still need to be, in an addendum to the post.

     
