
Comments 86551 to 86600:

  1. CBDunkerson at 21:28 PM on 4 May 2011
    Why 450 ppm is not a safe target
    owl905 wrote: "Why have you written an article about 450ppm CO2 when the issue is 450ppm CO2e?" Which... is one of the points made by the article. As the article notes in the second paragraph, CO2 ppm alone is often used on the grounds that aerosol cooling offsets warming from other greenhouse gases. However, as the remainder of the article then explains, warming from other greenhouse gases is increasing while aerosol cooling is decreasing. It will be interesting to see what sort of limits IPCC 5 suggests. Given that ice loss and sea level rise are progressing much faster than previously estimated I have to wonder if the 2 C 'safe' increase itself isn't out the window... regardless of how we get there. newscrusader's post above makes the same point in greater detail. That said, this may actually be 'good' news in a way. Yes, we have probably already gone beyond the point where we are going to significantly raise sea levels and have planet-wide changes to weather patterns for thousands of years... but this is happening fast enough that it MAY help to wake people up to reality before we get to the point where we vastly decrease agricultural production and/or devastate the ocean ecosystem.
  2. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Chip @ 119 (and before) "As far as 2010 having a greater melt index value when processed through the methodology described in our paper, I have not done the calculation, so I don’t know. I have reasons to believe it is not as cut and dry a situation that many of you all tend to want to make it." Surely the data are now available to do "the calculation" and demonstrate that it is "not as cut and dry a situation"? To me this is the obvious next step in order to prove that the 2010 melt season really "would not have altered the general nature" of the paper.
  3. How climate change deniers led me to set up Skeptical Science website
    JoeRG, I'm sure John Cook can answer for himself if he feels the need to, but I think he makes it clear that the 'ultimate authority' you are referring to (i.e. a single body) is different from the 'ultimate authority' referred to here - the peer-reviewed science. That is not a single defining body which determines the truth - rather, the peer-reviewed science as a whole has converged on a consensus (just as it has with evolutionary theory), and this is what John consulted to determine the facts. This is explained, as far as I can see, in the piece above: I made peer-reviewed science the ultimate authority. There's no higher standard than evidence-based research conducted by experts, which is then rigorously scrutinised by other experts. As I began to piece together the various pieces, a clear picture began to emerge.
  4. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Chip Knappenberger at 06:30 AM on 4 May, 2011 Since I live in Europe and go to bed 5 or more hours ahead of most of you, your post has been partly answered by others, and maybe we've all now learned as much as we need to about this! But concerning statistical analysis I was being a little more specific than is indicated in your point about the IPCC figure; i.e. addressing the assessment of statistical significance in differences between contemporary empirical data and historical data constructed with a model. Your IPCC paleoreconstruction example is a case in point. The 95% confidence range in the annual data in the reconstruction can obviously be defined. However, assessing the difference between the instrumental data (apples) and the reconstruction (oranges) in terms of statistical significance is a little misleading. They don't do this in your example (as your excerpt shows, when addressing comparison of contemporary and paleo data they use a more qualitative, probabilistic statement: "it is likely that the 1990s have been the warmest...."). Obviously one needs to do some sort of analysis, but an apples/oranges stats analysis should be accompanied by some careful thought about what's actually being compared, and in my opinion isn't suitable for a single bald statement in an abstract. In addition, understanding the relationship between contemporary empirical data and a historical reconstruction is deficient without consideration in the context of attribution. With these thoughts in mind I'm mostly curious why the reviewer chose not to consider the stats (perhaps another reviewer did?)! You've indicated why you didn't consider the 2010 melt data, which is quite instructive. It's left a statement that is factually incorrect without qualification in the abstract. Still, it's the job of the reviewer(s) to address these points and you seem to have been given a free pass on that!
  5. newcrusader at 19:15 PM on 4 May 2011
    Why 450 ppm is not a safe target
    At this point 400ppm, sustained over time, is not a safe number - as Jim Hansen has said while in Australia over the winter. In new research just out, Hansen concludes that at the current temperature no "cushion" is left to avoid dangerous climate change, and that the Australian government target goals "… of limiting human-made warming to 2° and CO2 to 450 ppm are prescriptions for disaster". The question Hansen raises is direct and brutal in its implications: is the planet already entering a zone of dangerous climate change? In a draft of a new research paper, Hansen and his collaborator Makiko Sato have opened a new debate about what might be the conditions for a safe climate; that is, one in which people and nations can continue to live where and as they have been, with secure food production, and in a bio-diverse environment.

    The period of human settlement over the past 10,000 years is known as the Holocene, during which time temperatures, and hence sea levels (the two having a close correspondence), have been remarkably stable. Temperatures over the period have not been more than 0.5C warmer or cooler than the mid-line. The warmest part of the Holocene (the "Holocene maximum") was about 8,000 years ago, and according to Hansen, today's temperature is about, or slightly above, the Holocene maximum: that is, we are already a little above it. This matters because Hansen's and Sato's look at climate history (paleoclimatology) in this new research finds that it is around this temperature level that the large polar ice sheets start to behave differently. During the Holocene, the Greenland and Antarctic ice sheets have been relatively stable, as reflected in the stability of sea level. But once substantial melting starts, the loss of heat-reflecting white sea-ice, which is replaced by heat-absorbing dark ocean water, produces an "albedo flip". Their conclusion is that: "… the stability of sea level during the Holocene is a consequence of the fact that global temperature remained just below the level required to initiate the 'albedo flip' mechanism on Greenland and West Antarctica." The implication is clear that "just above" the Holocene maximum lurks real danger. As Hansen and Sato say: "… the world today is on the verge of a level of global warming for which the equilibrium surface air temperature response on the ice sheets will exceed the global mean temperature increase by much more than a factor of two."

    To put it bluntly, we are on the edge of a precipice in terms of large ice-sheet losses and sea-level rises, and there is little "cushion" left: "Polar warmth in prior inter-glacials and the Pliocene does not imply that a significant cushion remains between today's climate and dangerous warming, rather that Earth today is poised to experience strong amplifying polar feedbacks in response to moderate additional warming." "… the fundamental issue is linearity versus non-linearity. Hansen argues that amplifying feedbacks make ice-sheet disintegration necessarily highly non-linear. In a non-linear problem, the most relevant number for projecting sea level rise is the doubling time for the rate of mass loss. Hansen suggested that a 10-year doubling time was plausible, pointing out that such a doubling time from a base of 1 mm per year ice sheet contribution to sea level in the decade 2005-2015 would lead to a cumulative 5-metre sea-level rise by 2095."

    Here Hansen repeats his view, first published in 2007 but widely ignored, that a 5-metre sea-level rise is possible. In fact, recent research by Blancon et al, published in Nature in 2009 and examining the paleoclimate record, shows sea-level rises of 3 metres in 50 years due to the rapid melting of ice sheets 123,000 years ago in the Eemian, when the energy imbalance in the climate system was less than that to which we are now subjecting the planet. We are perhaps already a few tenths of a degree above the Holocene maximum, and the system seems to be in the early stages of rapid change. It is widely expected that Arctic sea-ice will be totally lost in summer within a few years to a decade or so, perhaps at less than 1C of warming. Very few scientists think Greenland would be stable in an Arctic with little or no summer sea-ice, and opinion is split as to whether it is past its tipping point already. It is hard to argue that anything above the Holocene maximum (about 0.5 degrees above the pre-industrial temperature) can preserve a safe climate; it appears we have already gone too far. The notion that 1.5C is a safe target is out the window, and even 1 degree looks like an unacceptably high risk.
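    (A quick arithmetic check of the doubling-time claim quoted above, as a minimal Python sketch; the 1 mm/yr starting rate, the 10-year doubling time and the 2005-2095 window come from the comment, everything else is illustrative.)

        rate_mm_per_yr = 1.0      # average ice-sheet contribution to sea level, 2005-2015 (from the comment)
        total_mm = 0.0

        for decade_start in range(2005, 2095, 10):   # nine decades, 2005 through 2094
            total_mm += rate_mm_per_yr * 10          # millimetres added over this decade
            rate_mm_per_yr *= 2                      # 10-year doubling time

        print(f"Cumulative sea-level rise by 2095: {total_mm / 1000:.1f} m")   # ~5.1 m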
  6. How climate change deniers led me to set up Skeptical Science website
    Dear John, Reading your excerpt from the Guardian, I find it a bit odd. On one hand you say that you take a sceptical view; on the other hand you define something as an 'ultimate authority'. From my point of view there is a conflict there. As I understand it, an ultimate authority means something like the Pope, or a government, or general officers in the military, etc. That means that only people like believers or soldiers have to follow such an institution without questioning it. If not, well, a believer would face the risk of being excommunicated, or a soldier would soon die if he questioned his orders. But is this compatible with science? Shouldn't it be the case that even scientists have to question themselves, to question their own theories and not only those that might be contradictory? Is it really sceptical to declare an ultimate truth - which is what you do if you name an 'ultimate authority' - when no such thing as an 'ultimate truth' can ever exist? To be sceptical means, for me, to question everything - independent of the meaning behind it and regardless of the person who said it. Would you agree with this?
  7. Could global warming be caused by natural cycles?
    johnd, just to be clear, please provide a link for your assertion that "[the El-Nino Index value/La-Nina] is expected to strengthen again and remain negative well into 2012."
  8. How climate change deniers led me to set up Skeptical Science website
    Marcus wrote : "I will say this much, though. Its true that Plimer & Monckton have become slightly *less* important, in the eyes of the Denialist Community, since the start of this year-maybe because their comments are so blatantly embarrassing that even their fellow travelers are beginning to want nothing to do with them." Due, no doubt in large part, to sites like this which pick apart their outpourings and show them up for the propagandist disinformers they really are. Previously, the so-called skeptics used to lap-up whatever they said or wrote because it sounded good (coming as it did from supposed experts, albeit self-described ones) and made their denial seem more science-based. Now that they have been shown to be nothing more than wafflers and purveyors of ill-thought out ideas, some of the so-called skeptics are embarrassed and are now trying to deny all knowledge of these people - especially by pretending that it is sites like these that are giving them publicity, as opposed to the reality that sites like this are reacting to, showing-up and countering the disinformation. Older Australians of a certain political persuasion still seem to lap-up their nonsense, though...
  9. Why 450 ppm is not a safe target
    From the article, it sounds like ensuant drought, famine and flooding should take care of all these problems quite effectively.
  10. How climate change deniers led me to set up Skeptical Science website
    "Other than say - well done John, one could suggest that disagreement is not denial, and that there are valid reasons to doubt some of the supporting evidence for the AGW case." Hmmm, I'd say the only *valid* reasons to doubt the supporting evidence would be if (a) the evidence was obviously flawed or (b) if the Contrarian Camp were able to come up with a better hypothesis to explain the observed warming trend of the past 60 years, in general, & the last 30 years in particular. Neither seems to be the case-the evidence is as strong as it was over 100 years ago & the Contrarians have yet to come up with an opposing hypothesis which is able to stand up to scrutiny. What I see from Contrarians, most often, are attempts to discredit the scientific community, wacky conspiracy theories, attempts to downplay the future seriousness of the problem and attempts to overstate the possible economic consequences of taking action. None of which I'd define as *valid* reasons for doubting scientific *fact*-just a desperate bid to muddy the waters & delay action for as long as possible.
  11. Why 450 ppm is not a safe target
    Why have you written an article about 450ppm CO2 when the issue is 450ppm CO2e? Off the cuff, CO2 is into 'the red zone' at about 425ppm, with the rest coming primarily from increases in CH4.

                Pre-Ind     2009        Increase    Forcing (W/m2)
        CO2     280 ppm     388 ppm     108 ppm     1.46
        CH4     700 ppb     1745 ppb    1045 ppb    0.48

    With CO2 concentrations growing at about 2ppm per year, it's less than 20 years to be 'in the red zone'. But even the 2 ppm growth rate will be surpassed once the current global recession gives way to new global growth.
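    (A back-of-envelope check of the timescale in the comment above, as a small Python sketch; the 388 ppm starting point and ~2 ppm/yr growth rate are the commenter's figures, and the 425 and 450 ppm thresholds are the levels under discussion.)

        co2_2009_ppm = 388.0        # from the table above
        growth_ppm_per_yr = 2.0     # approximate growth rate, from the comment

        for threshold_ppm in (425.0, 450.0):
            years = (threshold_ppm - co2_2009_ppm) / growth_ppm_per_yr
            print(f"{threshold_ppm:.0f} ppm reached in about {years:.0f} years at {growth_ppm_per_yr} ppm/yr")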
  12. Video and podcast about confusing the hockey stick with the 'decline'
    "Baseload power" is a concept that exists only because certain technologies are inflexible, not because it reflects what is actually required. The demand for power fluctuates through a range of about 2.5:1. In other words, the peak power requirement during a hot sunny day in a place like California is 2.5 times greater than in the middle of the night in winter. Coal-fired power stations and nuclear powerplants really like a constant output, both for technical reasons (it's difficult to change the output for both plant types) and economic reasons (coal and nuclear powerplants have relatively high capital costs, relatively low fuel costs, and finite lifetimes, so getting a good cost per kWh generated depends on generating as much as possible during the lifetime of the plant). Because of this mismatch between supply and demand we have "peaking power generators", like gas, that follow the load and produce more power when the demand is high and less when it is low. Peak power generators have much higher fuel costs compared to capital costs and they pay for themselves by only selling power when the price is right. This is why many utilities offer pricing schemes where they charge more for power during peak periods and much less for power during off-peak. The forecast expansion of nuclear was also the driving force behind expanding pumped hydro capacity a few decades ago (see Dinorwig power station for an example) so it's a mistake to think that energy storage is an issue for renewables only. In fact, with solar power, because the generating behaviour more closely correlates with demand, the gap between what is being generated and what is being demanded is less, and peaking power generators are actually required less often when using solar than when using so-called "baseload power generators" like coal and nuclear. Nevertheless, there is no reason why solar and wind can't displace a large percentage of "baseload power" generators with load-following generators filling the gap in exactly the same way they do now, and as storage is added to the system (molten salt, pumped hydro, etc.) peaking power will be used less and less often. Of couse, we have a long way to go before this becomes an issue -- until solar and wind penetration exceeds 20% of the grid, we can just use "grid storage" (i.e., use them to displace other sources of power), which is extremely efficient because you're not really "storing" energy at all, you're simply avoiding the generation of energy you don't need. Here is a report from Germany on the reduction in the cost of electricity that wind power resulted in that more than compensated for the subsidy paid to have that wind power added: Merit Order Effect The reason for the cost reduction was that although wind power wasn't the cheapest energy source available, wind power providers will dump all the power they can generate onto the market whenever they can -- there's no benefit to them in turning off turbines when the price gets too low because there's no "fuel" to save in doing so. The availability of wind power electricity at whatever-price-was-going meant that customers avoided buying power from peak power generators using more expensive technology when there was enough wind for them to do so.
  13. Philippe Chantreau at 15:33 PM on 4 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    I find the paper of seriously limited interest. Since it does not consider calving, it does not address mass balance, which is the really important metric. The main focus appears to be on statistics, while ice sheet dynamics hold many areas of inquiry that are far more interesting than the mere descriptive numbers. It is not very useful for much of anything. As with most statistical approaches, the lack of the most recent data in the analysis makes it even less useful. I'm sure Pat Michaels can throw around a lot of sound bites from it that are technically true within the frame of the paper, and that satisfies him. After all, his main line of work seems to be PR. However, for those who really study ice, like Dr Box or Mauri Pelto, it offers next to nothing toward an improved understanding of the physical reality. Much ado about nothing, really.
  14. Ken Lambert at 15:21 PM on 4 May 2011
    How climate change deniers led me to set up Skeptical Science website
    Albatross #72 I invited all to the Flanner thread at #64. Tom Curtis has not responded to my posts there since 30th April. He then places a post here at #65 which should have been on the Flanner thread. Direct your criticism to Tom Curtis. If this thread has a clearly defined topic, it is about the personal musings of our host John Cook on what motivated him to start up the site. Other than say - well done John, one could suggest that disagreement is not denial, and that there are valid reasons to doubt some of the supporting evidence for the AGW case.
  15. How climate change deniers led me to set up Skeptical Science website
    Marcus @77, You can add Pat Michaels and Christy to that list.
  16. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Hello Chip, Thanks for your post @119. You did not refer me to #95 before, you referred me to #112. Regardless, I'm afraid that you insist on continuing to evade the point. Quite frankly, your "strong belief" that including the 2010 data would not make a difference to your conclusions is irrelevant, and unscientific to boot. What it boils down to is this. The validity of your argument now rests on you demonstrating quantitatively to everyone that the melt stats for 2007 exceeded those for 2010. While what you believe may be correct (albeit unlikely, given other people's findings on 2010 melt versus other years), you have not demonstrated it, despite having ample opportunity to do so. Rather, you admit that you are making a claim without having done the analysis and choose to cite "belief" instead. Not a compelling argument, so thank you for confirming that.
  17. How climate change deniers led me to set up Skeptical Science website
    Marcus, johnd - Here in the States, we don't read the Australian at all. Washington Post? CNN? The Guardian? Yep. But not the Australian. That's "local" news for a locality far from here. That said, Monckton has shown up in front of Congress for the Kabuki theater that is the US Congress (which I state as someone for whom Congress is local news). When he does, we in the States make appropriate noises in response. Usually gagging. --- For me, looking at the world through the clearest glass I can find is important for my personal integrity. Logical fallacies, rhetoric contrary to facts, and lying for self-interest are maddening to me. I have the greatest respect for John Cook for putting so much effort into this website. I believe that it has had an effect - clearly delineating the science, and hopefully killing off some of the counterfactual rhetoric. I hope, in my own small way, to help contribute to this effort for some time to come.
  18. How climate change deniers led me to set up Skeptical Science website
    I will say this much, though. It's true that Plimer & Monckton have become slightly *less* important, in the eyes of the Denialist Community, since the start of this year-maybe because their comments are so blatantly embarrassing that even their fellow travelers are beginning to want nothing to do with them. That certainly shouldn't allow the Denialist Movement to try & rewrite history, & pretend like Monckton & Plimer weren't ever their "star performers". As I said, though, some elements of the Denial community *still* view Monckton & Plimer as heroes of their movement. Last point, John: I personally *rarely* mention Plimer or Monckton unless they're first mentioned by one of the Denialists. Given that it was *you* & Ken who brought them up in this thread, that surely suggests it's *you*-& not us-who are fixated on them.
  19. How climate change deniers led me to set up Skeptical Science website
    By the way Ken Lambert, Your last comment on the Flanner thread was May 2nd while your last Flanner based comment here was May 4th. If you really intended to invite people over there you would post something to the effect of "please see my comments here" with a link to your latest comment. And so I don't add to your feelings of persecution, Tom Curtis should be doing the same.
  20. Models are unreliable
    Slight difference. We can't predict ENSO, PDO, or AMO in models. These internal variabilities exist in models. You can certainly run a model and get a PDO index out of it. However, you can't initialise a model to predict them. I don't know of any paper which puts a case for DO being internally driven. Do you? Everything has physical causes.
  21. How climate change deniers led me to set up Skeptical Science website
    "Is Monckton that important a figure to you that you are now actively recommending others should read certain sections of certain newspapers just to keep up with the latest news about him?" The point, John, is that whether you choose to accept it or not, the majority of your fellow Denialists clearly see him as an important figure-to the extent of viewing him as an "expert" in the field-otherwise, why would they give him so much air-time? Talk-back radio, Right Wing Newspapers (like The Australian), Fox News & the US Republican Party *all* give him & his ilk plenty of lime-light, hardly something they'd do for someone they considered inconsequential. This, of course, speaks volumes about the dearth of *real* experts that exist within the Denialist Camp. JMurphy's link was simply to show that, in spite of your rampant denial, The Australian still views him as a very newsworthy figure. That point, like most others, appears to be completely lost on you however.
  22. Models are unreliable
    trunkmonkey - So, a little uncertainty, and we know nothing? That's not a reasonable statement, trunkmonkey. I don't think that's been established anywhere. Tamino has demonstrated how to look at ENSO and other local variations, and remove their influence. The results? The warming we expect from the CO2 we've added, at the trends we expect from the physics. A wee uncertainty is reason for resolving how much we know despite the uncertainty, not reason for throwing up our hands and giving up. That said, an overturning of the thermohaline circulation due to increased fresh water would be a large change in climate state. But that hasn't happened for a long time, and won't happen unless due to our influence, barring major changes in natural forcings which are well outside what we've seen in millennia.
  23. How climate change deniers led me to set up Skeptical Science website
    Ken Lambert@71 Your comment @2 was an off-topic complaint about moderation and a claim of victory on the Flanner thread. Nothing at all to do with the creation of SkS or why you come here. This was a fairly successful redirection of this thread into a subset of Flanner. Quite frankly I think the moderators should delete *ALL* the Flanner-based comments (including my own) as they are 100% off topic.
  24. How climate change deniers led me to set up Skeptical Science website
    "Marcus at 18:24 PM, you are confirming my point." How so? I don't exactly seek out mentions of Monckton, Plimer or any of those other hard-core Denialists in The Australian, they're there in Black & White in every single Op Ed piece that is devoted to the issue of Climate Change. The Editorials also frequently make mention of these "stars" of the denial movement, as do many of the journalists who work for The Australian. If you've missed these frequent mentions, then I can only suggest that its because you're trying to read the *braille* version of the newspaper.
  25. Chip Knappenberger at 14:32 PM on 4 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Albatross, Since you think that repeating things seems to help clarify things, I'll go ahead and repeat what I wrote in comment 95 concerning the melt extent in 2007 and 2010...
    Granted, incorporating the melt extent for the summer of 2010 into the methodology as described in our paper may have required a few minor tweaks to some of the wording (and a few specific numbers). But, by and large, as I have said many times, I strongly believe (although I have not done the analysis) that the changes would not have altered the general nature of our conclusions (as we explained to the JGR editor).
    and in comment 112 concerning the same thing
    As far as 2010 having a greater melt index value when processed through the methodology described in our paper, I have not done the calculation, so I don’t know. I have reasons to believe it is not as cut and dry a situation that many of you all tend to want to make it.
    So I guess now that we've both repeated ourselves, the topic should be about as settled as it is ever going to get until (when/if) our analysis gets updated. Agreed? -Chip
  26. trunkmonkey at 14:32 PM on 4 May 2011
    Models are unreliable
    363. We established a while back that we know nothing at a decadal scale because the models can't resolve the irregular and powerful influences of the ocean sloshings: ENSO, PDO, AMO, IOD (it's almost like Lake Tahoe must have a dipole as well). These influences are stronger than the expected warming, so we could conceivably have a decade of cooling or a decade of warming much greater than the models predict and it would mean nothing (except politically). The implication of DO/meltwater is profound, because if it is internally driven, we might not know anything at a centennial or millennial scale either.
  27. How climate change deniers led me to set up Skeptical Science website
    Ken @71, "I have already done that at #64" And yet before your comment #71 and after #64 you managed to make two more off topic posts at #68 and #70 :) Now, please either speak to the content of John's post/article, or move on. Thank you so very much.
  28. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Hello Chip, Regarding your claim that "My [Chip's] command of English must be slipping". Perhaps it is. Let me help if I may. And for the record, you have never provided an explanation or specific reply to my original comment @61, so it is not possible for me to not "like" your explanation. Below is my post at 61, repeated here for everyone's convenience. I spelled out the problem with your claim very clearly: "You claimed in your response that: "We would like to note that waiting for one more year of data is not going to materially affect our analysis or conclusions." We know that claim is demonstrably false, because the melt data for 2010 surpassed those for 2007, which renders the following from your abstract obsolete (i.e., including the 2010 data would very much have affected your conclusions): "The melt extent observed in 2007 in particular was the greatest on record according to several satellite-derived records of total Greenland melt extent." Including the 2010 data also renders the first part of this conclusion in your abstract obsolete, while also calling into question the validity of the second part of the following: "The greatest melt extent over the last 2 1/4 centuries occurred in 2007; however, this value is not statistically significantly different from the reconstructed melt extent during 20 other melt seasons, primarily during 1923–1961" So contrary to your claims made here and elsewhere, the 2010 data do very much affect your conclusions and desired narrative." To summarize (again), the greatest melt over the last 2 1/4 centuries was in fact in 2010, not 2007. Your claim that excluding the 2010 data "is not going to materially affect our analysis or conclusions" is demonstrably false, unless you are trying to argue that the 2010 figure was not higher than that observed in 2007. Thank you.
  29. Chip Knappenberger at 11:56 AM on 4 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Albatross, My command of English must be slipping, because upon re-reading your comment 61 I see nothing that I haven't already addressed. Just because you, perhaps, didn't like my explanation doesn't mean that I haven't offered one. -Chip
  30. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Hi Chip, No, #112 does not address the issue, and your post @115 simply evaded the issue. Please read my comment @61 again -- I provided a link to that comment for your convenience in my post @114 above. I was very specific in that post concerning the claim that you made @58 and why it was demonstrably false. Thank you.
  31. muoncounter at 11:38 AM on 4 May 2011
    Why 450 ppm is not a safe target
    dorlomin#2: "methane in the atmosphere exceeds 2ppm." Shakova and Semiletov 2007 report: "the surface layer of shelf water was supersaturated up to 2500% relative to the present average atmospheric methane content of 1.85 ppm, pointing to the rivers as a strong source of dissolved methane" (emphasis added).
  32. Chip Knappenberger at 11:30 AM on 4 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Hi Albatross, I think I discussed the general topic you were interested in at #61 regarding the impact of 2010 (which remains unknown), in my comments posted above at #112. If not, maybe you can ask them again. Thanks, -Chip
  33. Lindzen Illusion #3 and Christy Crock #5: Opposing Climate Solutions
    "And in a competitive economy what measures would those be? Bankruptcy??" Of course, this is a text book argument from the Contrarian Movement. As has already been pointed out, the additional cost of carbon-rich energy, when taxed, will be applied to the *unit* cost of the energy (in MJ or kw-h), not to the total bill. In the last 10 years, I've seen my electricity tariffs rise by 8c/kw-h (due almost entirely to privatization & inflation), yet I'm currently paying about $40 per month *less*-on my total electricity bill-than what I was 10 years ago, simply by halving my daily use (purchasing energy efficient appliances & light globes, less reliance on stand-by modes & swapping my electric hot water system for a continuous flow gas hot water system). Now it's true that I spend about $20/month for my gas hot water, but that still leaves me $20/month better off than 10 years ago-& I can probably improve my position further by getting insulation installed to reduce my energy use for heating & cooling-so claims that reducing our CO2 emissions will send us *bankrupt* are just complete nonsense. Businesses & Industry also have plenty of room for reducing their CO2 emissions *and* reducing their total energy use. For example, the cement industry generates about 1t of CO2 for every tonne of cement made. Now I've seen evidence to suggest that this can be significantly reduced by measures like (a) recycling of old cement, (b) use of aluminium silicate instead of Calcium Carbonate, (c) capturing the CO2 from baking Calcium Carbonate & converting it to algal biomass, (d) using bio-gas, rather than natural gas, to bake the calcium carbonate. Of course, if the industry also captured its waste heat & converted it to electricity, then they could offset the costs of a carbon tax via the sale of electricity (co-generation). So we see that, yet again, a Carbon Tax represents an *opportunity*, more than it does a *burden*.
  34. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Hello Chip, Back at #61, I showed that you made a demonstrably false statement/claim in your post @58. I was wondering whether or not you have any thoughts on that matter? I also have some other comments/questions, but have to take care of some other matters right now.
  35. Why 450 ppm is not a safe target
    mandas: Aim high and underachieve, rather than aim low and underachieve by even more.
  36. Why 450 ppm is not a safe target
    I both agree and disagree with your position re the target of 450 ppm. I agree that it is dangerous, and that the effects on the environment are likely to be significant. However, I disagree that it should not be an aspirational target, because given the current state of politics and the pitiful efforts to reduce current emissions, 450 ppm is likely to be at the low end of what we can realistically achieve. I am very concerned by this, and I despair about what the state of the environment is going to be at the end of this century because of it. But what can you do?
  37. Chip Knappenberger at 09:40 AM on 4 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Tom@111, Thanks for the questions. Our ice melt index is an average of the ice melt extent reported by three different research teams (using different methodologies) studying ice melt across Greenland. We didn't feel that we were in a position to pass judgment on which of the methods was better, so we took the data provided to us by the researchers, standardized it, and averaged the three standardized values together for each year. That is why we refer to our value as an ice melt extent "index" (it is unitless). So, the zero value of our index is really just close to the average of the index for the period 1979-2009. Regarding calving, see my comment @112; I think it will answer your question. And regarding the relationship between our reconstructed ice melt index and Greenland temperatures, remember that we also incorporate winter NAO along with summer temperatures into our reconstruction model. NAO doesn't have a large effect, but it does have some effect, and its influence likely explains the situation that you describe. I hope this helps! -Chip
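    (The index construction Chip describes - standardise each team's melt-extent series and average them into a unitless index - can be sketched as follows in Python; the three input series and their scales are placeholders, not the paper's data.)

        import numpy as np

        def melt_index(series_list):
            """Standardise each melt-extent series (zero mean, unit variance) and average them."""
            standardized = [(np.asarray(s, float) - np.mean(s)) / np.std(s) for s in series_list]
            return np.mean(standardized, axis=0)

        # Hypothetical annual melt-extent estimates from three teams, 1979-2009, in different units.
        years = np.arange(1979, 2010)
        rng = np.random.default_rng(0)
        trend = np.linspace(0.0, 1.0, years.size)
        team_a = 10 + 5 * trend + rng.normal(0, 0.5, years.size)
        team_b = 200 + 80 * trend + rng.normal(0, 8.0, years.size)
        team_c = 3 + 1.5 * trend + rng.normal(0, 0.2, years.size)

        index = melt_index([team_a, team_b, team_c])
        print("Index mean over 1979-2009 (about 0 by construction):", round(float(index.mean()), 3))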
  38. Ken Lambert at 09:39 AM on 4 May 2011
    How climate change deniers led me to set up Skeptical Science website
    pbjamm #66 "Ken Lambert@everywhere Please stop trying to derail this thread and take the discussion back where it belongs." I have already done that at #64 - come on over to the Flanner thread yourself.
  39. An Even Cloudier Outlook for Low Climate Sensitivity
    KR, "RW1 - Looks like both SW and LW; see the bottom of column 1, page 1524, where Dessler 2010 discusses combining the uncertainties of SW and LW measurements to determine total uncertainties." Yes, I know about that. That would seem to indicate it is the net SW and LW flux, but I should probably clarify this with Dessler himself just to be sure.
  40. Ken Lambert at 09:37 AM on 4 May 2011
    How climate change deniers led me to set up Skeptical Science website
    #69 Tom Curtis Dr Trenberth is estimating the 'annual' contribution of Sea Ice loss in the Arctic to the global energy imbalance budget. If you want to look at the Sea Ice loss relative to 1979, you need to look at the 'cumulative' loss over 32 years.
  41. Chip Knappenberger at 09:28 AM on 4 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    skywatcher@110, Thanks for the good questions. Let me try to answer them.

    You are correct that our uncertainty bounds represent statistical uncertainty in our model (which has some contribution from measurement error, in that the model was built from likely non-perfect observations). So the better the model performs, the tighter the error bounds. Or, put another way, the worse our model is, the wider the error bars, and voila, nothing is statistically different from anything else (I would imagine that this is a property shared by most statistical models). All I can say is that we tried to produce the best model that we could and reported the statistics associated with it. I believe this is the same thing as was done in the study I referred to that the IPCC highlighted.

    We did not report the error bars associated with our observed melt index, nor did we consider them in our determination of whether 2007 was or was not statistically different from any of our reconstructed values (including error bars). If we had considered errors in our observed melt index, I imagine that the number of reconstructed values which were not found to be statistically different from 2007 would have increased. And, just as in the example I pointed out from the IPCC, our error bars do indeed increase as predictor variables drop out as we go back in time. This is clearly stated in our paper, where we wrote: "It should be noted that the confidence of the reconstruction, as indicated by the error bars in Figure 2, degrades in the period prior to 1840 as the amount of independent data is reduced."

    As far as 2010 having a greater melt index value when processed through the methodology described in our paper, I have not done the calculation, so I don't know. I have reasons to believe it is not as cut and dry a situation that many of you all tend to want to make it. Further, I don't see it as a factual error. If the sentence that everyone has the most concerns about is placed back into its proper context in the abstract, it should be plainly clear that our period of record, and thus the sentence in question, only goes through 2009. Heck, the title of our paper is "A reconstruction of annual Greenland ice melt extent, 1784–2009." So I don't think that anyone who reads the title and/or the abstract thinks that we have secretly left out 2010. If you picked our paper up in the year 2025 and started reading it, I don't think you would be under the impression that this sentence in the abstract, "The greatest melt extent over the last 2 1/4 centuries occurred in 2007; however, this value is not statistically significantly different from the reconstructed melt extent during 20 other melt seasons, primarily during 1923–1961", was referring to the period 1800-2025. So there is not a factual error in that sentence as written in context, nor has it been established that 2010 has a greater melt index (using our methodology) than 2007. But, as we said in our paper, "…preliminary estimates place the ice melt of 2010 at their highest level since at least 1958", so the possibility exists that 2010 may exceed 2007 (but even so, it doesn't affect the correctness of that sentence in our abstract in context).

    Regarding calving… our paper was about surface ice melt, so our comments on other dynamic changes were mere speculation under the assumption that surface melt and other dynamic processes were correlated. As we stated in the paper, "The forces acting in concert with ice melt across Greenland to produce higher global sea levels currently, should also have been acting during the extended high‐melt conditions from the mid‐1920s to the early 1960s." If this assumption is invalid, or is becoming invalid, then our suppositions following therefrom may need reassessment. I hope this helps explain where we were coming from. -Chip
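    (A generic Python sketch of the kind of comparison described above: checking whether an observed value falls inside each reconstructed year's 95% interval, with error bars on the reconstruction only. The numbers are placeholders, and this is not claimed to be the paper's exact procedure.)

        import numpy as np

        def indistinguishable(observed, reconstructed, stderr, z=1.96):
            """Flag reconstructed years whose interval (value +/- z*stderr) contains the observed value."""
            reconstructed = np.asarray(reconstructed, float)
            stderr = np.asarray(stderr, float)
            return np.abs(observed - reconstructed) <= z * stderr

        # Hypothetical reconstructed melt-index values and standard errors (not the paper's data).
        recon  = np.array([0.2, 0.9, 1.1, 0.4, 1.3, 0.8])
        stderr = np.array([0.3, 0.3, 0.4, 0.3, 0.5, 0.4])
        obs_2007 = 1.5

        print("Reconstructed years not statistically different from the observation:",
              int(indistinguishable(obs_2007, recon, stderr).sum()))
        # Widening the error bars (a poorer model) can only increase this count, as noted above.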
  42. Why 450 ppm is not a safe target
    Are you sure about this? "In parts of the East Siberian Arctic, methane in the atmosphere exceeds 2ppm. This is partly responsible for temperatures in the Arctic rising 2-3 times faster than in the tropics" The Zeppelin station in Svalbard shows a slight decline in year-on-year methane concentrations. http://www.esrl.noaa.gov/gmd/dv/iadv/graph.php?code=ZEP&program=ccgg&type=ts I would have thought Arctic amplification is far more likely to be down to factors such as increased water vapor in the air and changes in ice albedo, rather than methane.
  43. How climate change deniers led me to set up Skeptical Science website
    Ken Lambert @68, so you don't see it as an error to use the previous lowest year before 2007 as the benchmark, when the discussion concerns the net increase in incoming flux relative to 1979?
  44. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Chip Knappenberger, I am having difficulty coming to grips with your paper, in large part because I am unable to access the original (given my limited means). I would appreciate it if you would answer some questions for me. With regard to the Ice Melt Extent Index in your paper, does the zero value represent total ice melt equaling total snow accumulation, such that for negative values the GIS gains mass (if we ignore glacial calving) and for positive values it loses mass? If not, what value does represent an equality of mass gain and loss (ignoring calving)? Further, although your paper does not deal with glacial calving, do you think it is reasonable to treat the Ice Melt Index as a proxy for total mass loss from the GIS, as Michaels is doing and as you appear to do here? Finally, your Ice Melt Index shows similar values in the periods 1840-1920 and 1960-1990 even though Greenland temperatures were apparently half a degree colder in the former period. Is there any particular reason for this?
  45. Ken Lambert at 09:01 AM on 4 May 2011
    How climate change deniers led me to set up Skeptical Science website
    Tom Curtis #65 I have already invited everyone to the Flanner thread, where Tom Curtis has not responded to my latest piece, even though he has quoted numbers in detail above in #65. This gross bit I cannot let go elsewhere: "To cap the later of you then divided the result by two again apparently because the measured 2 million square kilometer reduction in sea ice did not fit your prejudice. Again, you have refused to acknowledge that that mistake was a mistake." I got that 'prejudice' from Dr Trenberth, Tom - Dr Trenberth says in his "Tracking the Earth's Energy" Aug09 paper: Quote "Sea ice is important where it forms. Record losses of Arctic sea ice of about 10^6 km2 occurred in summer of 2007 relative to the previous lowest year [25], although the thickness and volume of the ice is quite uncertain. To melt 10^6 km2 of ice 1 m thick and raise the temperature of the water by 10 degC requires 3.4 x 10^20 J, or globally 0.02 W/m2. For 2004–2008 this is about 0.9 x 10^20 J/yr." Endquote Am I to be allowed to answer the rest of Tom Curtis' post at #65?
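    (The quoted Trenberth figures can be checked with standard physical constants; a Python sketch, with the 10^6 km^2 area, 1 m thickness and 10 degC warming taken from the quote.)

        area_m2   = 1e6 * 1e6    # 10^6 km^2 expressed in m^2
        thickness = 1.0          # m
        rho_ice   = 917.0        # kg/m^3
        L_fusion  = 3.34e5       # J/kg, latent heat of fusion of ice
        c_water   = 4186.0       # J/(kg K), specific heat of liquid water
        dT        = 10.0         # K of warming applied to the meltwater

        mass_kg  = area_m2 * thickness * rho_ice
        energy_J = mass_kg * (L_fusion + c_water * dT)
        print(f"Energy to melt and warm the ice: {energy_J:.2e} J")     # ~3.4e20 J, matching the quote

        earth_surface_m2 = 5.1e14
        seconds_per_year = 3.156e7
        print(f"Spread globally over one year: {energy_J / seconds_per_year / earth_surface_m2:.3f} W/m^2")  # ~0.02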
  46. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    So far as I can see, though, Chip, your calculation of error is a purely statistical construct from the modelled time-series values (the RMSE of the series), rather than a physically based estimate of the uncertainty, which might be generated in this case by, say, comparing observed melt values with modelled melt values where the series overlap. I'm not saying it's necessarily an incorrect way of estimating uncertainty, but it is not similar to the IPCC example you refer to - as seen by the fact that your uncertainties appear constant, while in the example they change (increase with age in that case). Please correct me if I am wrong! Your conclusion fundamentally depends on the uncertainties being large enough to encompass the recent values in 2007 (and of course should have included 2010). Your factual error, surely, is that 2010 is the greatest melt season, not 2007, which also means that the two largest melts occurred in the past four years. Any further thoughts on my comments regarding melt values in comparison to mass loss (calving, recently accelerated flow rates)? I feel your paper inadequately deals with that, and this largely leads to the erroneous statements on sea level. And why would sea level rise have had to have jumped to 3mm p.a. in the 1960s to confirm an accelerated contribution from Greenland alone? You never talk of all the relative contributions to sea level (thermal expansion etc), so this statement is made without context and is seriously in error.
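    (The alternative suggested above - estimating the reconstruction's uncertainty by comparing observed and modelled melt values where the two series overlap - can be sketched generically in Python; the series below are placeholders.)

        import numpy as np

        def overlap_rmse(observed, modelled):
            """RMSE of (observed - modelled) over the overlap period; a simple observation-based
            stand-in for a 1-sigma uncertainty on the reconstructed values."""
            residuals = np.asarray(observed, float) - np.asarray(modelled, float)
            return float(np.sqrt(np.mean(residuals ** 2)))

        # Hypothetical melt-index values over an overlap period such as the satellite era.
        obs   = np.array([0.1, 0.4, 0.2, 0.9, 1.1, 0.7, 1.4, 1.6])
        model = np.array([0.2, 0.3, 0.4, 0.7, 1.0, 0.9, 1.2, 1.5])

        print(f"Overlap RMSE: {overlap_rmse(obs, model):.2f} index units (approx. 1-sigma uncertainty)")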
  47. Bibliovermis at 08:14 AM on 4 May 2011
    Why 450 ppm is not a safe target
    For comparison, the global mean passed 350 ppm in 1988 and is currently at 391 ppm. NOAA/Earth System Research Laboratory: Global CO2 Data
  48. Models are unreliable
    "Isn't meltwater, meltwater," Volume matters. At end of glacial, you have large amount of ice to melt below the arctic circle. The issue of interest for now wrt to sealevel rates, is that rate of warming is far higher than exiting a glacial but amount of ice available to melt is far less. As to BP, look at the data yourself. As to role of meltwater - this is unsettled science. There is good evidence of disturbance to thermahaline cycle and ditto for solar forcing. What causes the disturbance, relationships and timing is not settled. If you are really interested, read Wally Broecker on the subject. Relevance to now? Well no solar forcing and no disturbance to thermahaline cycle detected as possible causes of warming.
  49. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Chip Knappenberger at 05:39 AM, fear not being seen as dense. Far from it, your contributions have provided an unusual degree of clarity and logic.
  50. Chip Knappenberger at 06:30 AM on 4 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Chris@107, If I may join your conversation… I am not following your assertion that we are somehow unique in comparing an observed data value with reconstructed data values, and assessing whether or not they differ from one another by examining the error range that we determined about the reconstructed values. As a prominent example, I direct you to page 2 and 3 of the IPCC Third Assessment Report, particularly Figure 1b (I hope I am not falling for some sort of bait here). It shows a 1,000 year reconstruction of northern hemisphere temperatures along with the observed temperatures since 1861 and includes, among other statements, “The 95% confidence range in the annual data is represented by the grey region. These uncertainties increase in more distant times and are always much larger than in the instrumental record due to the use of relatively sparse proxy data. Nevertheless the rate and duration of the warming of the 20th century has been much greater than in any of the previous nine centuries. Similarly, it is likely that the 1990s have been the warmest decade and 1998 the warmest year of the millennium.” This figure, procedure and conclusion are similar in nature to ours. And as to our offending sentence in the abstract, I don’t think any portion of it has been firmly established to be factually in error. -Chip
