
Comments 86601 to 86650:

  1. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    And last, but not least, in his CATO missive Michaels argues a strawman about the work of Rignot et al. (2011) and in doing so misrepresents their work. But first the quote: "And, this is important, the period of the lowest ice melt extent across Greenland for more than a century occurred from the early 1970s through the late 1980s – or very near the beginning the time period analyzed by Rignot et al." And another version here: "There was another paper on Greenland ice published by my nefarious research team at the same time as Rignot's. Instead of looking at recent decades (satellite monitoring of polar ice only began in 1979), we estimated the Greenland ice melt using a remarkable 225-year record from weather stations established there by the Danish colonists. We found that about the time that the satellites started sending back data the ice melt was the lowest it had been for nearly a century. In other words, Greenland was unusually icy when Rignot et al. started their analysis." Now here is what Rignot et al. (2011) actually did. From their abstract: "Here, we present a consistent record of mass balance for the Greenland and Antarctic ice sheets over the past two decades, validated by the comparison of two independent techniques over the last 8 years." Specifically, all their figures (and trend lines) are for the period 1992 through 2009, much later than suggested by Michaels. Michaels has misrepresented Rignot et al.'s work. Also, the lowest SMB loss in the FKM study (the minimum of the running mean) was in the 70s, with the lowest three data points for that era in FKM's Fig. 2 occurring in a cluster in the late sixties and early seventies.
    Response:

    [DB] Fixed Link tag.

  2. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Yes, in a world of unbiased models and data the earth is younger than we thought, a few thousand years maybe ... I don't know what it is but don't call it science please.
  3. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    KR @172, Wow...so Pat's emotional appeal for unbiased reporting and balance rings very hollow indeed.
  4. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    And yet another nugget from Pat Michaels' CATO missive, in which he misrepresents the science again: "But, as we have noted previously in this Wisdom, many of the proposed mechanisms for such a rapid rise — which is caused by a sudden and massive loss of ice from atop Greenland and/or Antarctica — don't seem to operate in such a way as to produce a rapid and sustained ice release." I'm not sure which CATO propaganda piece he is referring to, but that is certainly not what the paleo data suggest. A post here at SkS on what happened the last time CO2 was this high discusses the work of Csank et al. (2011) and Dwyer and Chandler (2008). Specifically, Dwyer and Chandler note that: "These results indicate that continental ice volume varied significantly during the Mid-Pliocene warm period and that at times there were considerable reductions of Antarctic ice." Additionally, Rohling et al. (2009) conclude that: "Reconstructions indicate fast rates of sea-level rise up to 5 cm yr-1 during glacial terminations, and 1–2 cm yr-1 during interglacials and within the past glacial cycle." They also conclude that, "On the basis of this correlation, we estimate sea level for the Middle Pliocene epoch (3.0–3.5 Myr ago)—a period with near modern CO2 levels—at 25 +/- 5m above present, which is validated by independent sea-level data. Our results imply that even stabilization at today’s CO2 levels may cause sea level rise over several millennia that by far exceeds existing long-term projections." And from Pfeffer et al. (2008): "We consider glaciological conditions required for large sea-level rise to occur by 2100 and conclude that increases in excess of 2 meters are physically untenable. We find that a total sea-level rise of about 2 meters by 2100 could occur under physically possible glaciological conditions but only if all variables are quickly accelerated to extremely high limits. More plausible but still accelerated conditions lead to total sea-level rise by 2100 of about 0.8 meter." And from Grinsted et al. (2010): "Over the last 2,000 years minimum sea level (−19 to −26 cm) occurred around 1730 ad, maximum sea level (12–21 cm) around 1150 ad. Sea level 2090–2099 is projected to be 0.9 to 1.3 m for the A1B scenario, with low probability of the rise being within IPCC confidence limits." And looking forward, from Horton et al. (2008): "Our results produce a broader range of sea level rise projections, especially at the higher end, than outlined in IPCC AR4. The range of sea level rise results is CGCM and emissions-scenario dependent, and not sensitive to initial conditions or how the data are filtered temporally. Both the IPCC AR4 and the semi-empirical sea level rise projections described here are likely to underestimate future sea level rise if recent trends in the polar regions accelerate." Also, from Vermeer and Rahmstorf (2009): "For future global temperature scenarios of the Intergovernmental Panel on Climate Change's Fourth Assessment Report, the relationship projects a sea-level rise ranging from 75 to 190 cm for the period 1990–2100." Indications are that the loss of ice in the Polar regions may indeed be accelerating. And remember sea levels will continue to rise beyond 2100, and also that these estimates are higher than the upper range cited in the IPCC's latest report.
  5. Rob Honeycutt at 07:53 AM on 7 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Albatross @ 170... That has to be one of the most insane statements I've read in a while. I had to read it twice to make sure he was saying what he was really saying. Let's see if we can apply this thinking to something else... "The number of papers coming out saying 'there are far more dinosaurs with feathers in the Cretaceous than we thought' vastly outnumber the papers saying there are fewer than we thought. In a world of unbiased science these two should be equal." Hm... I'm thinking that Michaels is maybe just bitter because his own biased position is not panning out in the full body of research.
  6. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Folks - Keep in mind that Knappenberger and Michaels are principals of New Hope Environmental Services, "...an advocacy science consulting firm that produces cutting-edge research and informed commentary on the nature of climate, including patterns of climate change, U.S. and international environmental policy, seasonal and long-range forecasting targeted to user needs...". (Emphasis added.) New Hope Environmental Services also puts out the World Climate Report blog, which ties them closely to the Cato Institute. As such, it is to be expected that they will produce primarily advocacy position papers, i.e. PR papers, for their clients. Oddly enough, I have no objection to that as long as I know what the source is about, as I can then rely upon/discount the work accordingly. Much as I treated the work of the Tobacco Institute, or now treat any of the PAC's for "Clean Coal" and the like. I do wish that these advocacy groups would be called on their paid positions in public discussions, however, and not treated as unbiased science. Note: I consider it worthwhile to look at the reasoning behind work presented in the sphere of science, and paid advocacy definitely has an effect - I don't intend this as an ad hominem in any fashion. But this is the elephant in the room, the selective attention test.
  7. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Sorry, failed to provide a link for the quote @170. Here it is.
  8. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Michaels also has said this, here: "But the rapid sea level rise beat goes on. In global warming science, we note, the number of scientific papers with the conclusion "it's worse than we thought" vastly outnumbers those saying "new research indicates things aren't so dire as previous projections." In a world of unbiased models and data, they should roughly be in balance." Whaaat? The science has to present a 50/50 balance, regardless of the reality? The overwhelming evidence points to a problem; that is not indicative of bias, but a reality presented by the data. There is just no getting away with that. This is an incredibly disingenuous and/or misguided belief that Michaels holds. I suppose then that, using Pat's bizarre logic, the same should apply to HIV/AIDS research suggesting that there is no link between AIDS and HIV, or research into linkages between tobacco and cancer....That a scientist of Pat's alleged stature would hold the belief in the quoted text beggars belief.
  9. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Re the comments made at @167 by Chip: instead of being an apologist for Michaels (why I care I don't know, it is your reputation that you are throwing away by doing so), how about you please read the actual State of the Arctic Coast report. Rising sea levels do not operate in a vacuum as some apologists would try and have us believe-- there are cumulative impacts at play here and rising sea levels are one of them. From the report's conclusions: "Sea-level rise in the Arctic coastal zone is very responsive to freshening and warming of the coastal ocean (leading to increased sea level at the coast) and is highly susceptible to changing large-scale air pressure patterns." Also, "Many Arctic coastal communities are experiencing vulnerabilities to decreased or less reliable sea ice, greater wave energy, rising sea levels, changes in winds and storm patterns, storm-surge flooding or coastal erosion, with impacts on travel (on ice or water), subsistence hunting, cultural resources (e.g. archaeological remains, burial sites) and housing and infrastructure in communities. In some places, this has necessitated community relocation, which in some cases increased vulnerability." And from Page 6 of the report: "The response to climate warming is manifest in a succession of other changes, including changes in precipitation, ground temperatures and the heat balance of the ground and permafrost, changes in the extent, thickness, condition, and duration of sea ice, changes in storm intensity, and rising sea levels, among other factors." Also from page 6: "There is evidence from some areas for an acceleration in the rate of coastal erosion, related in part to more open water and resulting higher wave energy, in part to rising sea levels, and in part to more rapid thermal abrasion along coasts with high volumes of ground ice. This directly threatens present-day communities and infrastructure as well as cultural and archaeological resources such as cemeteries and former settlement sites, particularly in areas of rising relative sea level (where postglacial uplift is limited or regional subsidence is occurring)." And I'm only on page 6 of the report at this point.....sea-level rise features prominently. Michaels is making fun of people losing their homes and having their lives affected by AGW. Shame on him, and shame on those who elect to uncritically apologize on his behalf.
  10. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    In this CATO Institute missive, Michaels makes this extraordinary claim: “Our simple computer model further indicated that there were several decades in the early and mid-20th century in which the ice loss was greater than in the last (ballyhooed) ten years. The period of major loss was before we emitted the balance of our satanic greenhouse gases. So, about half of the observed change since 1979 is simply Greenland returning to its normal melt rate for the last 140 years or so, long before there was global warming caused by dreaded economic activity.” Really, "there were several decades in the early and mid-20th century in which the ice loss was greater than in the last (ballyhooed) ten years..."? Their research demonstrates no such thing, and they even state in their abstract that the melt in 2007 was the highest on record, "The greatest melt extent over the last 2 1/4 centuries occurred in 2007; however, this value is not statistically significantly different from the reconstructed melt extent during 20 other melt seasons, primarily during 1923–1961." and we also know that the 2007 record was surpassed in 2010. Michaels is publicly misrepresenting and exaggerating his own work, and that of his co-authors. If I were Knappenberger or Frauenfeld, I would be incredibly uneasy and unhappy about that. Also, I was unaware that Greenland had a "normal" melt rate. The above quote from Michaels also (predictably) plays into the “skeptic” myth that what we are experiencing now, and what we will experience under business as usual, is nothing out of the ordinary. It also suggests that the recent acceleration of ice loss from Greenland is attributable to natural variability. As the GISTEMP maps below show, the warming in the vicinity of the Greenland Ice Sheet (GIS) between 1923 and 1953 (based on the “warm” period noted by Wake et al. 2009) was highly regional in nature. Now compare that with what has been observed since 1980 (the most recent 30-yr period). Quite the striking difference. Something very different is happening, and this is still relatively early in our rather bizarre experiment that we have decided to undertake. Now, of course, internal climate modes and regional climate regimes can amplify or mute the underlying long-term trend. No denying that. But, oscillations cannot generate a long-term trend as we are witnessing (see here and here). As greenhouse gases increase, the greatest warming is expected to occur at high latitudes, and we are already observing polar amplification of the warming over the Arctic (Flanner et al. (2010), Screen and Simmonds (2010)) in response to the long-term warming trend.
    Response:

    [DB] Fixed images.  I note that it was hotter in Mongolia way back when...

  11. Chip Knappenberger at 07:03 AM on 7 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Albatross@166, The climate topic of Pat's Cato article about buying beach houses was sea level rise (notably absent from the list you bolded). I imagine that he would take different things into consideration if he were looking into buying coastal property in Alaska (rather than say, the Outer Banks of North Carolina). -Chip
  12. Trenberth can't account for the lack of warming
    Indeed. One of many things that would change my mind. 5-10 years on though, and with measurements improving, if the "missing heat" is accounted for, are you changing YOUR mind?
  13. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Chip, I asked if you and Frauenfeld agreed with Pat's juvenile challenge. It seems then that you agree with Pat and are also interested in buying some coastal property..... Re your political blog citation, yes there has "always" been coastal erosion, just as there have "always" been forest fires, yet we know that people cause fires too. You seem to be intent on missing the point entirely. This excerpt from the Arctic report referenced above: "There is growing evidence that accelerated erosion may be attributed to retreating sea ice, changes in storm wave energy, and increased sea-surface temperature (Jones et al., 2009b; Overeem et al., 2010; Barnhart et al., 2010)." I really do not know what compels you to be an apologist for the disingenuous actions of Pat Michaels. Anyhow, I have only gotten started with Pat's abuse of FKM11.....more soon.
  14. Chip Knappenberger at 06:11 AM on 7 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Pete@163, The terms of the AGU's usage policy are unclear to me, so I am not sure whether I can post the reprint or not. Maybe someone can clarify this. If allowed, I'd gladly make it available. -Chip
  15. Bob Lacatena at 06:03 AM on 7 May 2011
    Why 450 ppm is not a safe target
    36, Jesús,
    I cannot link to silence, because it just means they've ignored it.
    This is B.S. You can't attack someone because they failed to say what you want to hear. RC has many, many posts on sea level. They're all rather objective and thorough. Their "failure" to explicitly come out and attack what was never even a projection is hardly an offense.
    Hansen & Sato's suggestion of 5m SLR is not supported by any other scientist.
    Perhaps because it doesn't exist? Where in the paper do you see this stated? And if you haven't looked yourself, then why are you talking about it, and linking to the paper, when you haven't read it yourself?
    Caution when assuming more than 1 meter SLR is my personal opinion...
    The key words there being "my personal opinion."
    ...then you should go against this blog post summary, not against my comments
    But you're spreading it as if it's true, without proper citation, and without even having followed up yourself. In fact, you accused RC of "silence" without evidence, and you've accused Hansen and Sato of making a prediction which they didn't make. And yet the fault isn't yours, you were just blindly parroting what you read on some blog somewhere (and yet you can't provide a link to what you read, and a Google search finds nothing). We can hardly blame you, now, can we?
    ...now you seem more interested in discussing the semantics...
    No, I'm more interested in paying attention to what people are actually saying, instead of distorting their position to create a strawman. Your desire to equate "meters" with "5 meters" goes hand in hand with your own misunderstanding of what is being presented. Once again, you are taking the BAU comment out of context, and exaggerating what it says. Hansen's basic point is that sea level rise is unlikely to be linear, and most linear estimates wind up at 1m or 2m, meaning any non-linear increase must be over 2m, and therefore on the order of "meters." This is reasonably well supported by the fact that ice loss is accelerating. Is it guaranteed? No. Is it a viable position? Yes. But in the end, you attack RC, you attack Hansen, and you come back with bluster and hand waving. You can't support your statements, you won't even take the blame (it's the blogger's fault), but you still dig in and refuse to budge. Typical.
  16. Chip Knappenberger at 06:03 AM on 7 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Albatross@161, Maybe you'll find this article to be of interest: Settling on an unstable Alaskan shore: A warning unheeded. -Chip
  17. Pete Dunkelberg at 05:58 AM on 7 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Chip Knappenberger, would you please make the paper available online?
  18. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    An addendum to my post at 161. Michaels also says this, here: "So should you sell your beach house because of the impending doom? I say yes. You need to beat the rush, put it on the market at a bargain-basement price, and sell it to me. And then I will keep it until the cows come home." Well, maybe there is a loophole for Michaels et al., because it seems that the cows have already "come home" as per the evidence cited @161 above. Then again, perhaps he should approach the residents of Tuktoyaktuk for some ocean front property.
  19. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Daniel Bailey, While much attention has rightfully been placed on the decision by the authors to exclude the record 2010 data, as a number of posters here have noted, another big story here is how FKM 2011 is being used by Pat Michaels to play down and to belittle the seriousness of the situation that future generations face down the road. In his CATO Institute missive, Michaels openly taunts and belittles the situation with this juvenile title “The Current Wisdom: Please Sell Me Your Beach House”. Fortuitously, a new report has just been released, titled “State of the Arctic Coast 2010: Scientific Review and Outlook", which contains some inconvenient truths for Pat Michaels. In the report they state that: “Regions with frozen unlithified sediments at the coast show rapid summer erosion, notably the Beaufort Sea coast in Alaska, Yukon, and the Northwest Territories and large parts of the Siberian coast. The ACD compilation (Lantuit et al., 2011) showed that the Beaufort Sea coast in Canada and the USA had the highest regional mean coastal erosion rates in the Arctic (1.15 and 1.12 m/year in Alaska and Canada, respectively).” You see, rising sea levels are only part of the bigger picture that Pat Michaels unwisely chooses to ignore to advance his agenda. Again from the Arctic report: “Trends of decreasing sea ice and increased open-water fetch, combined with warming air, sea and ground temperatures, are expected to result in higher wave energy, increased seasonal thaw, and accelerated coastal retreat along large parts of the circum-Arctic coast.” As a result some communities have already had to be evacuated, with more on alert. From the report: ”The US Army Corps of Engineers (2006) report provides a synopsis of the situation for threatened Alaskan communities. The situation in some communities is sufficiently dire that they are considering immediate relocation (e.g. Shishmaref (http://www.shishmarefrelocation.com/). In other cases (e.g. Tuktoyaktuk – Johnson et al., 2003; Catto and Parewick, 2008), phased retreat to a new location is an option which is now being considered (http://www.cbc.ca/technology/story/2009/09/08/climate-changetuktoyaktuk-erosion.html; http://hosted.ap.org/specials/interactives/_science/tuktoyaktuk/)" So a challenge to Pat Michaels. There are a few communities in Alaska alone who would probably very much like for you to buy their land and beach-front property. In fact, someone ought to let them know that Pat Michaels (and the CATO institute?) has publicly expressed a sincere interest in purchasing their land. I’m also interested to know whether or not his co-authors Frauenfeld and Knappenberger unequivocally stand by the bizarre challenge made by Michaels.
  20. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Grypo @157, The same misinformation and juvenile tone are being presented at the CATO site. More soon. The egotistical and conceited nature of the title of their series, "Current Wisdom", beggars belief.
  21. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Oh, I was looking at this oops.
  22. Chip Knappenberger at 04:34 AM on 7 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    grypo, Pat's Daily Caller article has always been a trimmed down version of his Current Wisdom piece for Cato. Here is the link to the Current Wisdom piece, which is where more detail is given about our latest paper: http://www.cato.org/pub_display.php?pub_id=13010 I don't think anything has been changed in either article. -Chip
  23. Jesús Rosino at 04:25 AM on 7 May 2011
    Why 450 ppm is not a safe target
    Sphaerica, I think there's some misunderstanding here. I just said that RC didn't say anything about Hansen & Sato's 21st century multimeter [if you like it more than 5m] SLR suggestion. Of course I cannot link to silence, because it just means they've ignored it. The citations you provide are, of course, previous to Hansen & Sato 2011 and I think they just support my point: that Hansen & Sato's suggestion of 5m SLR is not supported by any other scientist. Caution when assuming more than 1 meter SLR is my personal opinion, which I've tried to back up without mentioning RC at all. I cannot imagine where you get the idea that I've said that there's any "comment at RC suggesting that a 5m rise by 2100 is at all likely" nor what makes you think I should provide such a link. On the other hand, if you think Hansen made no such projection of 5m SLR, then you should go against this blog post summary, not against my comments. In any case, my comments apply the same to the "I find it almost inconceivable that BAU climate change would not yield a sea level change of the order of meters on the century timescale". So now you seem more interested in discussing the semantics of "projection". Sorry, I'm not. Regardless of the label of your choice for Hansen & Sato's multimeter fantasy, it's way off the peer reviewed literature numbers and they provide weak evidence (I'd be more open to the idea if they meant something like 300 years rather than one century). As I said before, if they can get that through peer review, I will take it more seriously. Regarding scientists' opinions on Hansen 2007, you can see William Connolley here or here, or James Annan here. Alexandre, yes, I refer to the same IPCC projection. See your link where they say: "They include a contribution from increased Greenland and Antarctic ice flow at the rates observed for 1993-2003, but this could increase or decrease in the future". I don't think we know what the cause of the IPCC underestimate is (I may be wrong, it's just my impression from what I've read in blog posts), but I guess it can only be ice sheets or thermal expansion. Given that thermal expansion seems easier to calculate and there's a lot of uncertainty about ice sheet dynamics, the latter turns up as a more likely culprit. However, I think this is rather speculative, especially considering this is a short-term comparison (we have just a couple of decades of data to compare with projections). See, for example, "Deep ocean warming solves the sea level puzzle", about Song & Colberg 2011. Don't trust me on the consensus matter, I'm not an expert, but that's my impression. I think the +1m is rather based on empirical data, which is quite compelling, but, lacking a physical understanding of the underlying causes, I think it's difficult to call it a consensus yet. Anyway, I may be too influenced by this discussion I had with Zorita and its later blog post. Cheers.
  24. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Well it appears the Beach House article on Cato and the Daily Caller has changed to exclude any specifics to this research. I thought I'd gone nuts, but the original was reposted elsewhere, so I was able to recheck my sanity.
  25. Lindzen Illusion #2: Lindzen vs. Hansen - the Sleek Veneer of the 1980s
    angusmac, you're misrepresenting the facts. Maybe Hansen described Scenario A as the result if we continued with "business as usual", but the fact is that the radiative forcing has been nowhere near that in Scenario A. Hansen is not in the business of predicting how GHG emissions will change, he's in the business of projecting for a given GHG change, how much temps will change. That's what the adjusted Scenario B represents. Your claimed "dramatic drop in temperature projections" is purely imagined. That's why it's not mentioned.
  26. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Well Eli Rabett has something of potential interest to this issue. But I'm sure the "skeptics" will probably again bend over backwards to try and defend this too. "Now look at the legend, notice that the red line is a ten year trailing average. Now, some, not Eli to be sure, might conjecture, and this is only a conjecture, that while using a trailing average is a wonderfully fine thing if Some Bunny, not Chip Knappenberger to be sure, is looking at smoothing the data in the interior of the record, but, of course, Chip understands that if you are comparing the end of the data record to the middle, this, well, underweighs the end. The rising incline of the trailing average at the end is depressed with respect to the data. Some Bunny, of course, could change to a five year moving average, which would make the last point not the average of the melt between 1999 and 2009 but the average between 2003 and 2009, a significantly larger average melt. Of course, this effect would be even clearer if someone, not Eli to be sure, knew that the melt in 2010 was a record."
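    Eli's point about trailing averages is easy to check numerically. A minimal sketch in Python, using made-up, steadily rising melt-index values (not the actual FKM series), shows how the last point of a 10-year trailing mean is pulled down relative to a shorter window:

        import numpy as np

        # Made-up, steadily rising melt-index values for 1999-2009 (illustrative only,
        # not the FKM reconstruction).
        years = np.arange(1999, 2010)
        melt_index = np.linspace(1.0, 2.0, len(years))

        trailing_10 = melt_index[-10:].mean()  # last point of a 10-year trailing mean (2000-2009)
        trailing_5 = melt_index[-5:].mean()    # last point of a 5-year trailing mean (2005-2009)

        print(f"10-year trailing mean ending 2009: {trailing_10:.2f}")  # 1.55
        print(f" 5-year trailing mean ending 2009: {trailing_5:.2f}")   # 1.80
        # With a rising series the shorter window gives a larger final value, so a
        # 10-year trailing average visually depresses the end of the record relative to the data.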
  27. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Tom Curtis at 10:47 AM on 6 May, 2011 In response to:
    1) Ice melt extent in the 2000's would be statistically indistinguishable from that for the highest period in the reconstruction. (This is true as it stands, but apparently was not worthy of comment in the paper.)
    Huh? I thought what everyone is upset about is that the paper basically says the melt during the 2000s is statistically indistinguishable from that for the prior reconstruction. But if you agree that the ice melt extent in the 2000s is statistically indistinguishable from that of the highest period in the reconstruction, I'm sure Michaels will be happy to report that Tom Curtis decrees this is so.
    2) The two years with the greatest ice melt extent would have occurred in the last ten years, and five of the most extensive melts would have occurred in the last ten years. In no other ten year period would more than two of the ten most extensive melts have occured. 3) The year with the highest melt extent would be the most recent year, with just eleven reconstructed values having 95% confidence intervals encompassing that value.
    So what if the record happens to fall in the most recent year? This summer will probably have a lower melt than 2010. I'm unaware of any scientific rule that says papers discussing reconstructions can't be published if they happen to end with a melt index that is not a record. Moreover, if 2010 is a record it will still be a record until it is broken. The fact that it might not have been broken again in 2011 won't prevent anyone from pointing out that a record was broken in 2010. (And of course, if 2010 is not a record, it's not a record.) On the claim that two years with the greatest ice melt would have occurred in the past ten years: How are you concluding this with any certainty? It's true that the satellite measurements would indicate that the melts for 2007 and 2009 are greater than all reconstructed melt indices. But the reconstructed melt indices have uncertainties associated with them. Based on how the 2007 melt index compares to the ±95% uncertainty intervals of the reconstruction, there is at most a 60% probability that the 2007 melt is greater than all melts during the previous period. (The probability is actually much lower: I did a quick calculation which assumed the 2007 melt index exactly equaled the upper 95% confidence value for the 20 overlapping cases, giving 0.975^20 = 0.60 as the probability that all previous 20 are lower. This is only an upper bound, because in each individual case the probability that a particular value from the previous period is lower is < 0.975.) So with respect to 2007-- you can't know for sure its melt exceeded those during the 30s. That certainly makes your claim that two years with the greatest melt occurred in the past 10 years tenuous. (I suspect if we got the more detailed data from Chip and did the full analysis, we'd find the probability your claim is true is less than 1/2.) But even your claim that one year with the greatest melt occurred in the past 10 years is tenuous. Assuming your estimate that 2010 would fall inside the uncertainty intervals for 11 years, there is at least a 24% probability that the 2010 value is not a record. With a 24% chance of being wrong, very few people would say with confidence that even 2010 was a record. So you can't even say 2010 must be a record. (Though of course it might be. If we got the data from Chip, we could do a better calculation, and the probability that it's a record is even lower.) If a reviewer had been thoughtful, they might have asked FKM to do this calculation to firm up the numbers-- but given the traditions for making statistical calls, no one looking at the probabilities would decree that a record can be called even with only 11 overlapping intervals, even making the simplifying assumption I made above. But the reviewers didn't do that. The consequence is the text in FKM doesn't discuss this at all, and text that might have needed more extensive modification showing that the probability of a record is "x%" isn't included in the paper. As the abstract stands in the published paper, beyond changing "20" to "11" and "2007" to "2010", it does not need to be modified. (So, yeah, assuming your '11' is correct, I missed one edit.) As a practical matter, the abstract only needs a "tweak" and you would have been no happier with it. Note: When the observed index falls inside fewer than 5 ±95% uncertainty intervals, a more refined calculation will be needed to figure out if we can 'call' a record. At some point-- I'm SWAGing when the observed melt index falls inside fewer than 2 ±95% uncertainty intervals-- it will be impossible to say that there is any realistic probability that the melt falls within the range experienced during the 20s-40s. I suspect this will happen during the next El Nino. Since FKM's reconstruction is now published, you'll be able to do this using the FKM reconstruction, and they will need to admit it. (I don't think this notion has sunk in.)
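    For concreteness, the back-of-envelope bound described above can be reproduced in a few lines of Python (a sketch only, under the same simplifying assumptions: the observed value sits exactly at the upper 97.5% point of each overlapping reconstructed interval, and the comparisons are treated as independent):

        p_single = 0.975  # per-case probability that a reconstructed value is lower, under the stated assumption

        n_2007 = 20  # reconstructed seasons whose 95% intervals encompass the 2007 value (per the FKM abstract)
        n_2010 = 11  # Tom's estimate of intervals encompassing the 2010 value

        p_2007_record_upper = p_single ** n_2007          # ~0.60: upper bound that 2007 exceeds all 20
        p_2010_not_record_lower = 1 - p_single ** n_2010  # ~0.24: lower bound that 2010 is not a record

        print(f"Upper bound, P(2007 exceeds all 20 overlapping values): {p_2007_record_upper:.2f}")
        print(f"Lower bound, P(2010 is not a record): {p_2010_not_record_lower:.2f}")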
    4) The relatively low ice melt extents in the early satellite period are due in large part to major tropical volcanic eruptions, eruptions which were absent in the 1930s. In the absence of these eruptions, the longest and period of extensive melting would be that straddling the end of the 20th century, not that in the middle. Clearly natural forcings have favored extensive ice melt in the mid 20th century, while acting against it towards the end. (True also in 2009, and surely worth a mention in the paper.)
    First: If this is intended to engage or rebut anything I wrote, it's a bit misplaced. I wrote about changes to the existing manuscript that would be required if 2010 was incorporated. Second: I don't disagree with your explanation of why the data look as they do. Given the nature of this paper, I even think the paper would be stronger with this sort of discussion inserted. However, the reviewer (Box) who made a rather vague suggestion to this effect while simultaneously requesting inclusion of data that was not available (and is still unavailable more than 8 months later) bowed out because that not-yet-available data were not incorporated. Evidently, whatever happened, neither the editors, the other reviewers nor the authors thought to incorporate this sort of thing. It's worth noting that not every paper showing a time series or reconstructions discusses why the time series looks the way it does-- for example, "Surface mass-balance changes of the Greenland ice sheet since 1866" (Wake, Huybrechts, Box, Hanna, Janssens and Milne) doesn't discuss volcanism when explaining what they reported:
    "Higher surface runoff rates similar to those of the last decade were also present in an earlier warm period in the 1920s and 1930s and apparently did not lead to a strong feedback cycle through surface lowering and increased ice discharge. Judging by the volume loss in these periods, we can interpret that the current climate of Greenland is not causing any exceptional changes in the ice sheet."
    So, while I agree both the Wake paper and the FKM paper -- both decreeing that this century appears more or less similar to the previous melt period -- might have benefited from inclusion of a few sentences mentioning causal factors for the previous high and low melt periods, neither did. It seems the editors' and reviewers' standards are consistent in this regard.
    A paper drawing these conclusions, IMO, would be substantially different from the paper actually produced. Certainly it would have been very hard for Michaels to place the spin on it that he as been doing.
    As I understand it, his "spin" amounts to your conclusion (1) above, which is "Ice melt extent in the 2000's would be statistically indistinguishable from that for the highest period in the reconstruction. (This is true as it stands, but apparently was not worthy of comment in the paper.)" Since other conclusions you make are unsupportable based on the data, your suggesting that his including them would prevent him from "spinning" as he is seems a bit odd. It would be rather silly to suggest that FKM are required to include incorrect or tenuous conclusions to avoid whatever you, Tom, happen to consider "spin". Other issues that puzzle me in your comment:
    Tedesco shows mass loss, while FK&M show melt extent.
    The graph I inserted, figure 1c from Tedesco, shows the "standardized melting index anomaly". The caption reads "(c) standardized melting index (the number of melting days times area subject to melting) for 2010 from passive microwave data over the whole ice sheet and for different elevation bands." Tedesco also shows a graph of SMB (surface mass balance); it's labeled figure 3a in Tedesco. Since FKM use melt extent, incorporating data for 2010 would involve 2010 melt extent data, not SMB data.
    Second, this analysis has been done graphically, and has all the consequent uncertainties (ie, the numbers might be out by one or two in either direction).
    Of course it's done graphically and your numbers might be out one or two.... I believe I said I was discussing an estimate and I assume you are too. We could add: Done in blog comments. Not even at the top of a blog post where many people could see it. Using Tedesco data as a proxy for the data that would really be used by FKM and so on. I've suggested before that this will be worth doing when the melt data used by FKM do become available. I see in your later comment you speculate that if only FKM had used a different proxy to reconstruct, they would get different answers, and you speculate as to what those results would be based on eyeballing graphs. Ok... but if their choice of proxy was tenuous, or the reviewers had wanted to see sensitivity to choice of proxy, then that was the reviewers' call. They didn't make that call. Also: The fact that choice of proxy makes a difference widens the true uncertainty intervals on the reconstruction relative to those shown in FKM. So it would take an even longer time to decree that we feel confident that the current melt index is a record. When the melt index data are available, would you recommend doing the analysis to determine how much to widen the uncertainty intervals on the reconstruction? It seems to me that may well be justified.
  28. Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Dikran
    steven mosher@152 If you want papers published with turn key code and data, then I would agree that would be great, but who is going to provide the (quite considerable) additional funding to support that? Are you going to fund that just for climatology, or for all science? Sadly it is just unrealistic.
    I've often told Steve the same thing. I do however agree with the principle that authors should grant access to key data on request. So, for example, if someone asked Chip for the data points underlying FKM figure 2 or asked Tedesco for the data points underlying figures 1(c) in his paper, I think both authors should grant that sort of request and fairly promptly. Ideally there would be some sort of formal archive for this sort of thing possibly funded by NSF/DOE office of science or something. People whose projects were funded would be required to either deposit it there or say where the data are deposited. But turnkey code? With all data in a nice neat little folder? I think Mosher is insisting on something that goes beyond what is practical.
  29. Ken Lambert at 23:56 PM on 6 May 2011
    Trenberth can't account for the lack of warming
    "So Ken, do I understand by "missing heat", that you mean that you will accept AGW if better measurements can close the energy budget, but in the meantime you will choose to believe that "missing heat" means that energy imbalance isnt real and we are not warming?" The energy imbalance is as real as the reality of our measurement. I have never argued that we have not had warming (0.75 degC surface since AD1750). The energy absorbed to produce that temperature increase is in the past. If surface temperature rise is flattening and heat increase in the oceans is also flattening with better measurement then a reasonable conclusion is that heat imbalance is reducing. The missing heat might stay missing because it was never there.
  30. Ken Lambert at 23:44 PM on 6 May 2011
    A Flanner in the Works for Snow and Ice
    muoncounter I assume that your attempt to inject a little humour means that you can't really disagree with my parting comment at #149 MC? Tom seems to have vacated the field for other threads.
  31. Ken Lambert at 23:37 PM on 6 May 2011
    Brisbane book launch of 'Climate Change Denial'
    Why did you not invite me, John? A good chance for us to catch up and see if I could get an autographed copy.
    Response: Sorry Ken, there were a few Brisbanites I forgot to invite (I've gotten some cross emails).
  32. Ken Lambert at 23:33 PM on 6 May 2011
    Lindzen Illusion #3 and Christy Crock #5: Opposing Climate Solutions
    Marcus #84, #85 "It took fossil fuels around 60 years, & massive State support, to reach the relatively cheap prices they are today" That type of comment betrays a lack of understanding of the history of electricity generation in Australia and most of the first world. What technologies were available 100 years ago for large scale central generation? Answer - fossil fuel and hydro. In the absence of hydro resources and the availability of relatively cheap coal - the choice was one of necessity - coal. Such large investments by State Utilities (ie owned by the taxpayer) with 25-40 year lives were bound to remain a mainstay of our generation until nuclear arrived after WW2. You all know what has happened to nuclear. Oil gas coal hydro and nuclear are there because of economics. If there were better cheaper technologies, they would take over. When you talk of PV Solar being economical in boutique applications now - that is not new. PV Solar has been the best choice for powering remote area small scale applications for many years. I was playing around with Solar brine ponds in 1984 - the technology looked simple and effective but did not fly for cost reasons, chiefly maintenance in a very corrosive environment.
  33. Why 450 ppm is not a safe target
    Jesús Rosino at 17:15 PM on 6 May, 2011 When I said IPCC projections exclude "glacier and ice sheet mechanics" I was referring to the IPCC caveat "Model-based range excluding future rapid dynamical changes in ice flow". As I understand it (feel free to correct me), this is the main cause of the IPCC underestimate. I agree that, at least at this point, the 5m SLR projection for 2100 is a big outlier. OTOH I don't know how far a consensus goes here, but I thought the 1+ m was already considered to be quite plausible under the BAU scenario.
  34. Bob Lacatena at 21:54 PM on 6 May 2011
    Why 450 ppm is not a safe target
    33, Jesús, I was expecting a link to a RealClimate blog post. You did not provide a link, and a Google search shows no hits anywhere for the text that you have posted. A search for "sea level" rise at realclimate.org, however, provides any number of reasoned, scientific posts at RC explaining upper and lower boundaries for sea level rise, and the reasons behind each. In particular, reasonable estimates recognized by RC fall between 0.5 and 2.0 meters. From RC 9/4/2008:
    We stress that no-one (and we mean no-one) has published an informed estimate of more than 2 meters of sea level rise by 2100.
    From RC 11/15/2010:
    ...and Gillis shows that most of the experts now assume a considerably higher rise until 2100 than IPCC: about one meter, potentially even more.
    I see nothing remotely close to a comment at RC suggesting that a 5m rise by 2100 is at all likely, and must insist that if you cannot produce a link to such a statement, you must openly and loudly withdraw the statement that "RealClimate silence about this ... is somewhat telling". From the Hansen and Sato 2011 paper you posted:
    Alley (2010) reviewed projections of sea level rise by 2100, showing several clustered around 1 m and one outlier at 5 m, all of which he approximated as linear. The 5 m estimate is what Hansen (2007) suggested was possible, given the assumption of a typical IPCC's BAU climate forcing scenario.
    Also:
    However, the fundamental issue is linearity versus non-linearity. Hansen (2005, 2007) argues that amplifying feedbacks make ice sheet disintegration necessarily highly non-linear. In a non-linear problem, the most relevant number for projecting sea level rise is the doubling time for the rate of mass loss. Hansen (2007) suggested that a 10-year doubling time was plausible, pointing out that such a doubling time from a base of 1 mm per year ice sheet contribution to sea level in the decade 2005-2015 would lead to a cumulative 5 m sea level rise by 2095. Non-linear ice sheet disintegration can be slowed by negative feedbacks. Pfeffer et al. (2008) argue that kinematic constraints make sea level rise of more than 2 m this century physically untenable, and they contend that such a magnitude could occur only if all variables quickly accelerate to extremely high limits. They conclude that more plausible but still accelerated conditions could lead to sea level rise of 80 cm by 2100.
    You will note that Hansen is not, with this, projecting a 5m rise by 2095. He's doing math. He's saying if ice sheet disintegration is non-linear, and if doubling time for rate of mass loss is the best predictive factor, and if that doubling rate is 10 years (which is plausible, but in no way predicted), then the math says that a doubling of 1 mm/year from 2005-2015 would arrive at a cumulative 5 m sea level rise by 2095. He never says this is going to happen. He never suggests that it will. The paper in many places is actually a rather dispassionate discussion of all of the various estimates of others (as well as Hansen 2007) on sea level rise. From Hansen's 2007 paper, where the proposal was presented (given that Hansen and Sato 2011 is merely summarizing the current state of the literature):
    Of course I cannot prove that my choice of a ten-year doubling time for nonlinear response is accurate, but I am confident that it provides a far better estimate than a linear response for the ice sheet component of sea level rise under BAU forcing.
    His point is clearly that linear estimates are overly simplistic, and that a non-linear mechanism could produce a dangerously higher value. The point here is not the value. It's not a prediction. It's a demonstration of the importance of not taking a purely linear approach to a non-linear problem, much in the way that exponential growth is taught to school children by pointing out that if they start with a penny, and double it every day, by the end of a month they have over 10 million dollars. From the same paper, a few sentences later:
    The nonlinearity of the ice sheet problem makes it impossible to accurately predict the sea level change on a specific date. However, as a physicist, I find it almost inconceivable that BAU climate change would not yield a sea level change of the order of meters on the century timescale.
    Suggestion: If something bothers you, take the time to actually read the source material, in the proper context and perspective, and pay attention to the words, not your emotional reaction to the words. And never, ever base anything on a "blog post."
    Moderator Response: [Dikran Marsupial] s/2010/2100/
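    For reference, the doubling-time arithmetic quoted above is easy to verify; a minimal sketch in Python (this simply reproduces Hansen's illustrative arithmetic, it is not a projection):

        # A 1 mm/yr ice-sheet contribution over 2005-2015 whose rate doubles every decade thereafter.
        rate_mm_per_yr = 1.0
        total_mm = 0.0

        for decade_start in range(2005, 2095, 10):  # decades 2005-2015 through 2085-2095
            total_mm += rate_mm_per_yr * 10         # contribution over this decade
            rate_mm_per_yr *= 2                     # rate doubles each decade

        print(f"Cumulative ice-sheet contribution by 2095: {total_mm / 1000:.2f} m")  # about 5.1 m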
  35. CBDunkerson at 21:40 PM on 6 May 2011
    10 Indicators of a Human Fingerprint on Climate Change
    neil, night time temperatures increasing faster than day time temperatures is indicative of enhanced greenhouse warming because of the way that greenhouse warming operates. That is, increasing the concentration of greenhouse gases in the atmosphere (which is being caused by human industry) decreases the rate at which the planet cools. Consider a 100 degree day in Miami vs a 100 degree day in the Arizona desert. Once the Sun goes down the temperature starts dropping... but Miami is very humid (lots of atmospheric water vapor, a greenhouse gas) and can actually stay warm all night. The desert on the other hand gets very cold very fast because it has almost no water vapor and thus the daytime heat escapes quickly. Thus, if we increase the level of atmospheric carbon dioxide and other greenhouse gases which have much lower geographic variance than water vapor we are decreasing the rate of night-time cooling for the entire planet... nights stay warmer longer and thus the average night time temperature increases faster than the day time temperature. As to ocean heat content... a strong indication of warming, but not of anthropogenic causes. Any warming forcing would result in most of the energy going into the oceans. For instance, if the observed global warming were being caused by increased solar radiation we would expect to see days warming faster than nights (because there is no sunlight at night) and ocean heat content rising as most of the solar forcing went into heating the upper ocean. Ergo, we'd have the ocean warming either way, but the day vs night warming speed would be different.
    Response:

    [DB] Additionally, what was predicted by models and subsequently confirmed by observational studies is summarized here.  For example:

    Nights warm more than days: predicted by Arrhenius (1896), confirmed by Dai et al. (1999) and Sherwood et al. (2005)
  36. alan_marshall at 19:24 PM on 6 May 2011
    Brisbane book launch of 'Climate Change Denial'
    I would like to see copies of your book on the shelves of the Parliament Shop in Parliament House, Canberra. Maybe you will find an MP to suggest it. They already stock "Requiem for a Species" by Clive Hamilton. Located in the main Foyer, the Shop is open 7 days a week 9:30 am - 5:00 pm (on sitting days extended to 5:45 pm). Ph: (02) 6277 5050 Fax: (02) 6277 5068
    Response: Haydn and I will actually be heading down to Canberra on May 16 and will be delivering a copy of the book to every federal MP in Australia (more news on this shortly). I'll check whether NewSouth Books are distributing to Parliament shop.
  37. CO2 is plant food? If only it were so simple
    It seems the evidence indicates that climate change is already negatively impacting global cereal crop yields over the period 1980 to the present. The paper published in Science yesterday is behind a paywall, but Scientific American has a summary here.
  38. Lindzen Illusion #2: Lindzen vs. Hansen - the Sleek Veneer of the 1980s
    Angusmac, I have no problem acknowledging Hansen's 1988 predictions were out because the primitive model had a sensitivity that was too high. However, given the uncertainties in determining climate sensitivity, I think it was a pretty good effort for its time. Both history and better models show a convergence on a sensitivity of 3.
  39. Lindzen Illusion #2: Lindzen vs. Hansen - the Sleek Veneer of the 1980s
    Scaddenp @68, you misrepresent me by stating that, "You [I] want real airplanes to be designed from a high school flight model, because it's simple?"

    Real engineers and scientists use simple models every day to check the validity of their more complex models. They usually use simple physics to check if small changes in their assumptions lead to large changes in outcomes.

    Dana in this blog used a simple spreadsheet to adjust Hansen's Scenario B without the need to re-run the whole computer model.

  40. Jesús Rosino at 17:15 PM on 6 May 2011
    Why 450 ppm is not a safe target
    Alexandre, Rahmstorf & Vermeer base their projections on empirical correlation between 20th Century SLR and temperature. However, 20th Century SLR doesn't have a big contribution from ice sheets (thermal expansion plays a big role), so I don't think it can reflect future contribution from ice sheets. I think that warmer deep ocean would be a more likely explanation of the IPCC underestimation (and this would hopefully help to close the global heat budget). On the other hand, the IPCC projections do include a contribution from increased ice flow from Greenland and Antarctica at the rates observed for 1993-2003. The point by Rahmstorf and Vermeer being that it may accelerate. My comment was motivated by Martin #13:
    "[...] Until now, I was under the impression that predictions for a sea level rise by the end of this century were between 0.5 and 2m. That Hansen should predict 5m and so soon (by 2095!! not 2195 or 3095) surprises me. Do you know what the reaction to Hansen is among climate experts (the kind that publish peer reviewed papers on climate change)? Is this a consensus view?"
    I'm trying to answer that by saying that this projection by Hansen & Sato cannot be considered anything close to a scientific consensus (besides, it's not peer reviewed and it's quite different to what peer reviewed papers say), and highlight that even those latest figures we are recently more used to (+1 meter) must be taken with some caution, as we always look for physical explanations, rather than empirical correlations. Yes, I do know that current data show that IPCC projections are underestimating SLR, but between 19-59 cm (IPCC) and more than 1 meter there's a big gap. SLR is, of course, a serious threat with any of those numbers. Sphaerica, regarding the citation, I'm trusting this blog post:
    "Hansen and Sato (2011) using paleoclimate data rather than models of recent and expected climate change warn that “goals of limiting human made warming to 2°C and CO2 to 450 ppm are prescriptions for disaster”. They predict that pursuit of those goals will result in an average global temperature exceeding those of the Eemian, producing decadal doubling of the rate polar ice loss, resulting in sea level rise of up to 5m by the end of this century."
  41. Lindzen Illusion #2: Lindzen vs. Hansen - the Sleek Veneer of the 1980s
    Dana@67, I am not arguing for higher sensitivity.

    I just find it ironic that SkS can praise the accuracy of Hansen's 1988 projections without mentioning the dramatic drop in temperature projections. Here is a timeline derived from SkS blogs for Hansen's 2019 temperature anomaly:

    • Hansen (1988): Scenario A is "business as usual." Anomaly = 1.57°C.
    • Hansen (2006): Scenario B is ''most plausible.'' Anomaly = 1.10°C.
    • Dana (2011) in this blog: Adjusted Scenario B "matches up very well with the observed temperature increase." Dana is silent on the anomaly but the blog's chart indicates that Anomaly = 0.69°C.

    In summary, SkS have presented many blogs praising the accuracy of Hansen (1988) but have neglected to mention that Hansen's original projection for the 2019 temperature anomaly has plummeted from 1.57°C in 1988 to 0.69°C in Dana (2011). Why is this not mentioned?

    Dana, I find it difficult to believe that these plummeting temperatures have escaped your attention. Do you have a problem with declaring that your adjusted Scenario B is only slightly above Hansen's zero-emissions Scenario C?

  42. Dikran Marsupial at 16:58 PM on 6 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    steven mosher@152 If you want papers published with turnkey code and data, then I would agree that would be great, but who is going to provide the (quite considerable) additional funding to support that? Are you going to fund that just for climatology, or for all science? Sadly it is just unrealistic. Also, the idea that this would allow a reviewer to see whether additional data would change the result only applies to a limited range of studies. I am working on a project at the moment that has several processor-centuries of computation behind it!
  43. Lindzen Illusion #2: Lindzen vs. Hansen - the Sleek Veneer of the 1980s
    Moderator@66, your Eyecrometer sounds like a very useful device. Can I buy one on Amazon?

    You already have the post-2000 figures. Scenarios B and C both show a warming trend of 0.26 °C/dec for the 1958-2000 period, however, they diverge post-2000. I show the trends for 2000-2011 here and I summarise them below:

    • Scenario B = 0.38 °C/decade
    • Scenario C = 0.12 °C/decade
    • Real-world (GISS LOTI) = 0.15 °C/decade

    There is a definite dogleg in real-world temperatures post-2000.

    I request that you publish the figures for the adjusted Scenario B so that I can incorporate them in my models.
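    For reference, here is a minimal sketch of how trends like those above can be computed: an ordinary least-squares slope over annual anomalies, scaled to °C per decade. The anomaly values in the code are placeholders, not actual GISS LOTI data:

```python
# Minimal sketch: ordinary least-squares trend in deg C per decade.
# The anomaly values below are placeholders only; substitute the actual
# GISS LOTI annual means for 2000-2011 to reproduce the figures quoted above.
import numpy as np

years = np.arange(2000, 2012)
anoms = np.array([0.40, 0.54, 0.63, 0.62, 0.54, 0.68,
                  0.64, 0.66, 0.54, 0.64, 0.72, 0.60])  # placeholder values

slope_per_year = np.polyfit(years, anoms, 1)[0]
print(f"Trend 2000-2011: {10 * slope_per_year:.2f} C/decade")
```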

    Response:

    [DB] "There is a definite dogleg in real world temperatures post-2000."

    The warmest decade in the instrumental record is a "dogleg"?  Where's the dogleg:

    [Figure: the most recent decade across the surface temperature records]

    Well, it's not evident in any of the temperature records.  How about since 1975 (removing the cyclical noise & filtering out volcanic effects):

    [Figure: adjusted global temperatures since 1975]

    No dogleg there.  How about the warming rate estimated from each starting year (2 sigma error):

    [Figure: warming rate estimated from each starting year, with 2-sigma error bars]

    The warming since 2000 is significant.  Still no dogleg.

    Your focus on statistically insignificant timescales is misplaced.  While a world-class time-series analyst (like Tamino) can sometimes narrow down a window of significance to decadal scales, typically in climate science 30 years or more of data is used.
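    For readers who want to check this themselves, here is a minimal sketch of how a trend and its roughly 2-sigma uncertainty can be estimated from each starting year. The anomaly series is synthetic (a linear trend plus noise) and is used purely as a placeholder; substitute a real annual temperature record:

```python
# Minimal sketch: trend and ~2-sigma uncertainty from each starting year.
# The anomaly series is synthetic (trend + noise), used purely as a placeholder.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)
years = np.arange(1975, 2011)
anoms = 0.017 * (years - 1975) + rng.normal(0.0, 0.1, years.size)

for start in (1975, 1985, 1995, 2000):
    mask = years >= start
    fit = linregress(years[mask], anoms[mask])
    lo = 10 * (fit.slope - 2 * fit.stderr)
    hi = 10 * (fit.slope + 2 * fit.stderr)
    # The 2-sigma range widens as the fitting window shortens.
    print(f"{start}-2010: {10 * fit.slope:.2f} C/decade "
          f"(2-sigma range {lo:.2f} to {hi:.2f})")
```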

  44. Lindzen Illusion #3 and Christy Crock #5: Opposing Climate Solutions
    "You mean they used to *burn* this stuff. Burn? Seriously? Talk about primitive!" Yep, they'll say that the way people today say "they used to think the Earth was *flat*? Seriously?"
  45. Lindzen Illusion #3 and Christy Crock #5: Opposing Climate Solutions
    Tom " ...actually I think it will be a very easy sell to our grandchildren simply because they will be able to see the consequences of our inaction." I think it will be more than that. Our grandchildren, and their descendants, will have quite a lot of "What the ..!" responses. In a society which values carbon fibre products as a necessity (along with other products we've not yet developed), there'll be a lot of head scratching along the lines of "You mean they used to *burn* this stuff. Burn? Seriously? Talk about primitive!" (Or brainless, or silly, or stupid, or choose your own word.)
  46. steven mosher at 13:18 PM on 6 May 2011
    Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?
    Tom, I have no issue with your characterization. My point is rather simple. As you probably know, I've advocated for open science from day one. I think papers should be published with turnkey code and data. If that were the case, any reviewer or subsequent reader could see for themselves whether newer data changed the answer. And that would be that, whatever the answer is. Instead we have a situation where, on all sides, people make inferences about the motives of authors, reviewers, and editors. I'm suggesting that one way past some of this is to push for reproducible results. I don't have any issue calling out Michaels. When he and Santer testified together, I thought that Santer cleaned his clock. And I said so.
  47. 10 Indicators of a Human Fingerprint on Climate Change
    Also, a suggestion: I would suggest that you add in the increase in Ocean Heat Content Anomaly (OHCA). We expect about 90% of the anthropogenic forcing to go into heating the upper ocean, and this is indeed what is observed; this close link between the TOA radiative imbalance and the OHCA increase is, I believe, some of the strongest evidence of anthropogenic warming. There are good recent updates on ocean heat content, e.g. Lyman et al. (2010). And there are other papers connecting the TOA radiation with the oceanic warming: you have some Trenberth ones, but Levitus principally talks about this; see Levitus et al. (2005) and Levitus et al. (2009). Apologies if this was already obvious and listed elsewhere.
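    As a rough back-of-envelope illustration of that TOA-to-OHC link (the 0.9 W/m2 imbalance and the 90% ocean fraction below are round illustrative numbers, not values taken from the papers cited above):

```python
# Back-of-envelope: convert a TOA radiative imbalance into annual ocean heat uptake.
# The imbalance and the ocean fraction are round illustrative numbers only.
EARTH_SURFACE_M2 = 5.1e14   # Earth's surface area, ~5.1 x 10^14 m^2
SECONDS_PER_YEAR = 3.156e7
toa_imbalance_w_m2 = 0.9    # illustrative top-of-atmosphere imbalance
ocean_fraction = 0.9        # roughly 90% of the excess heat goes into the ocean

joules_per_year = (toa_imbalance_w_m2 * ocean_fraction
                   * EARTH_SURFACE_M2 * SECONDS_PER_YEAR)
print(f"Implied ocean heat uptake: ~{joules_per_year:.1e} J per year")  # ~1.3e22 J/yr
```

    An uptake of order 10^22 joules per year is the scale against which the published OHCA estimates can be compared.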
  48. Lindzen Illusion #3 and Christy Crock #5: Opposing Climate Solutions
    "heir democratically elected governments back to 1900 who started the electricity generation industries with (except for hydro), the only feasible, abundant and cheap fuel available - fossil fuel - chiefly coal." What's your point here Ken? I never argued that fossil fuels should *never* have received government support-in their infancy. I was merely pointing out how fossil fuels *continue* to enjoy State subsidies, in spite of their apparent maturity. Yet mention *any* kind of government funding for renewable energy-especially ones which cut into the profits of the big energy suppliers, like rooftop solar-& the usual suspects cry foul. I call that *hypocrisy*!
  49. Lindzen Illusion #3 and Christy Crock #5: Opposing Climate Solutions
    "Marcus, I will be the first to buy your PV Solar panel when it can re-produce itself without the help of relatively cheap fossil fuels." What a completely bogus argument. It took fossil fuels around 60 years, & massive State support, to reach the relatively cheap prices they are today-& even then only with ongoing, often well hidden, government "subsidies". Most renewable energy technologies are already approaching parity with fossil fuels, in about half the time & with far less State support than what fossil fuels received-& continue to receive. Also, as scaddenp rightly points out, future PV's probably *will* increasingly be made without the help of fossil fuels, as relatively cheap *alternative* energy sources-like bio-diesel, solar-thermal, geo-thermal, bio-gas, tidal & wind power-become the mainstay of the manufacturing industry (as is already the case in California, for instance). So again, Ken, your argument really is quite weak, & getting weaker with each new posting.
  50. 10 Indicators of a Human Fingerprint on Climate Change
    Hi John, Thanks for your useful site. About #7: The decreasing diurnal temperature range (DTR) is listed as a fingerprint of anthropogenic warming, with references. However, you do not say what the basis is for this claim: why would the DTR be expected to decrease under greenhouse warming rather than under solar warming or some other forcing? Initially it seems intuitive, because obviously the sun shines in the day and not at night, so increasing night-time temperatures "should" be due to something else. However, upon more thought I started doubting that this is the case. Indeed, going back over the references you gave, plus further back into, for example, the Stone and Weaver (2002, 2003) papers, and even the Easterling papers, I don't see anywhere a definitive statement that a reduced DTR is indeed a fingerprint of additional greenhouse forcing. There is some mention of this in the intros but never anything further. It seems to come down to a matter of clouds and soil moisture in the Stone and Weaver papers and in Braganza (2004). All of them say it is a good index of recent climate change, but I don't see any of them actually saying it provides evidence of anthropogenic warming.
