Comments 80501 to 80550:

  1. OA not OK part 1
    Pete, Open source has the problem of what data is available, especially without scrounging. HSC has over 20,000 compounds and elements in its database, which is extensible. I'm sure however that open source is just fine for this case. This is rather trivial stuff. The hard part is the air/water volume assumptions you make, if you're really trying to model fluxes... and then do you partition water phases into pressure zones? My copy at this point is legacy from past employment. I keep it and some statistics and graphics packages around for nostalgia's sake. I rarely use them anymore in my current roles.
  2. Philippe Chantreau at 02:36 AM on 2 July 2011
    Websites for Watching the Arctic Sea Ice Melt
    Eric, Warm waters are not nearly as much of a factor for determining the minimum extent as weather patterns. The extent drops when ice is fragmented and exported South. Winds and currents do that.
  3. A Detailed Look at Renewable Baseload Energy
    BBD wrote: "Two terawatts of photovoltaic would require installing 100 square meters of 15-percent-efficient solar cells every second, second after second, for the next 25 years. (That’s about 1,200 square miles of solar cells a year, times 25 equals 30,000 square miles of photovoltaic cells.)" '100 square meters every second, second after second, for the next 25 years' Wow. That sounds bad. Except, of course, there is no logical reason that the installation would have to proceed sequentially. Instead, let's say that 1% of the people on the planet go out and install 100 square meters (i.e. 10 m x 10 m) of panels. Let's further say the installation is really slow and takes a full day. ~700,000,000 people / 86,400 seconds in a day = 8,101 panels per second Huh. That one per second thing doesn't seem quite as impossible anymore. '30,000 square miles' and 'Renewistan' Gee. That sounds bad too. Except, of course, that it is entirely possible to put solar panels on the roofs of buildings, over parking lots, on telephone poles, et cetera. Human beings are already using a lot more than 30,000 square miles of land in ways that could be dual-purposed to also hold solar panels.
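A quick Python check of the back-of-envelope arithmetic above; the installer count (about 1% of world population) and the 100 m² panel size are the commenter's assumptions, so this is only an illustration of the numbers, not a deployment estimate.

```python
# Check of the panels-per-second figure and how one day of this effort compares
# with the 25-year installation target quoted from Brand/Griffith.
installers = 700_000_000          # ~1% of world population (commenter's assumption)
panel_area_m2 = 100               # one 100 m^2 installation per person, over one day
seconds_per_day = 86_400

print(installers / seconds_per_day)              # ~8,102 panels completed per second

seconds_per_25_years = 25 * 365.25 * 86_400
target_area_m2 = 100 * seconds_per_25_years      # "100 square meters every second" for 25 years
one_day_area_m2 = installers * panel_area_m2
print(one_day_area_m2 / target_area_m2)          # ~0.89: one such day covers ~89% of the target
```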
  4. Stephen Baines at 02:31 AM on 2 July 2011
    Roy Spencer on Climate Sensitivity - Again
    So when will we see Spencer submit this paper to Nature, I wonder? He should be challenged to do so. If he doesn't, then I don't know why I should bother with anything he writes at this point. He's entered the realm of von Daniken, IMO -- only he seems to have a cynical political motivation.
  5. A Detailed Look at Renewable Baseload Energy
    BBD @69, I'm not going to go through all those numbers. Instead, I will introduce a few of my own. You indicate in 72 that energy infrastructure has a life of "thirty years plus". The plus is not too large, in that by the time a power station is thirty years old, it requires significant and ongoing refurbishment to remain efficient. So, for the sake of discussion, I will assume an average life span of thirty five years. That means that in 25 years time, 70% of our existing generation capacity will be replaced, or upgraded at a similar cost to replacement. In terms of generation capacity, that is 11.2 terawatts, or about the capacity that you indicate needs to be replaced by renewables. At this point I don't see why I should bother going through your numbers on scale. Evidently, the scale of the task is similar to the scale of the task of ongoing business as usual. Yes, that is a huge task, but we are a busy, productive, and numerous species, and the task is not greater in scale than ones we are already committed to.
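A minimal check of the turnover arithmetic in the comment above, using its own assumptions (16 TW of world capacity, a 35-year average plant life, a 25-year horizon):

```python
world_capacity_tw = 16          # commenter's assumed global generation capacity
average_life_years = 35         # assumed average plant lifespan
horizon_years = 25

fraction_replaced = horizon_years / average_life_years
print(fraction_replaced)                          # ~0.71, rounded to 70% in the comment
print(fraction_replaced * world_capacity_tw)      # ~11.4 TW (the comment rounds to 11.2 TW)
```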
  6. A Detailed Look at Renewable Baseload Energy
    5. Conclusion
    - Renewables are not the correct choice for the rapid displacement of coal from the global energy mix.
    - Promoting renewables as the ‘solution’ to global warming is mistaken and misleading.
    - Advocacy pushing global energy policy in the wrong direction is dangerous.
    - Studies advocating a substantial expansion of renewables within the energy mix must be subject to close critical scrutiny.
    - A far better understanding of the limitations of renewables is required for a balanced evaluation of what contribution they can make to progressive decarbonisation of the global energy supply.
  7. A Detailed Look at Renewable Baseload Energy
    4. The UK experience so far The UK is already embarked on a huge shift toward on- and offshore wind, mandated by emissions reduction commitments set out in the UK Climate Change Act 2008. We are in the unhappy position of front-row observers as unworkable policy begins to break. For example, there is this:
    2010 Renewables Target Missed by Large Margin
    The Renewable Energy Foundation (REF) today published an Information Note on the performance of the UK renewables sector in 2010 based on analysis of new DECC and Ofgem data (see www.ref.org.uk). The work shows that the 2010 target for renewable electricity has been missed by a large margin, and confirms longstanding doubts as to the feasibility of this target, and the still more ambitious target for 2020. The key findings are:
    • The UK failed to reach its 10% renewable electricity target for 2010, producing only 6.5% of electricity from renewable sources, in spite of a subsidy to renewable generators amounting to approximately £5 billion in the period 2002 to 2010, and £1.1 billion in 2010.
    • Onshore wind Load Factor in 2010 fell to 21%, as opposed to 27% in 2009, while offshore fared better declining from 30% in 2009 to 29% in 2010.
    • Although low wind in 2010 accounts for some part of the target shortfall, it is clear that the target would have been missed by a large margin even if wind speeds had exceeded the highest annual average in the last 10 years.
    • The substantial variation in annual on-shore wind farm load factors is significant for project economics, particularly Internal Rate of Return (IRR), and future cost of capital.
    • Planning delays do not appear to have been responsible for the missed target, with large capacities of wind farms, both on and offshore, consented but unbuilt.*
    • The failure to meet the 2010 target confirms doubts as to the UK’s ability to reach the 2020 EU Renewable Energy Directive target for 15% of Final Energy Consumption, a level requiring at least 30% of UK electricity to be generated from renewable sources.
    This is the sort of thing that has prompted Professor Roger Kemp to write an ominous letter to the Guardian newspaper (see original for links):
    What is missing is recognition of the scale of technical challenges involved in decarbonising Britain's energy supply infrastructure. The fourth carbon budget said that 60% of new cars should be electric by 2030, a figure far higher than industry's most optimistic projections. The document also planned for gas boilers to be replaced by heat pumps in 25% of houses in the same time. These represent huge engineering programmes where the solutions have to be tailored to households and geographical areas. We also need to rebuild our electricity generation and transmission infrastructure, subject of the CCC's [Committee on Climate Change] December 2010 report. In the next 20 years, the coal-fired power stations, which provided more than half our electricity last winter, will be closed. All but one of the existing nuclear stations will expire. The CCC's plans say that, by 2030, renewable energy should supply 45% of our needs, compared with 3% today. Given that energy infrastructure is designed for a life of 30 plus years, this is a massive engineering challenge. We have not run large fleets of offshore wind turbines long enough to understand maintenance needs; our experience of wave energy is restricted to a few prototypes; carbon capture has, so far, been limited to a few megawatt prototypes, not the tens of gigawatts that will be required. The CCC should be more upfront about the challenges it is creating.
    Professor Roger Kemp, Institution of Engineering and Technology
    And with Diesendorf (2010) in mind, there is more worrying news in this recent study conducted by energy consultancy Pöyry:
    The creation of an offshore 'super grid' and a major upgrade of energy interconnections are not the silver bullet solutions to Europe's energy needs, an independent study published by Pöyry has found. The report has found that the introduction of improved connectivity would only partially alleviate the volatility of increased renewable energy generation. In the North European Wind and Solar Intermittency Study (NEWSIS) Pöyry conducted detailed market analysis of the future impacts wind and solar energy have on the electricity markets across Northern Europe as it heads towards the 2020 decarbonisation targets and beyond. The study also concluded that weather is going to play a major role in determining how much electricity is generated and supplied to homes and businesses throughout Europe, with electricity prices much lower when it is very windy, but unfortunately higher when it is still.
    I could, literally, go on all day, but this should give you an idea of the way things are already starting to look rather less rosy than the advocates for wind would have us believe. As I said, we in the UK have front-row seats. You can see very clearly from them.
  8. A Detailed Look at Renewable Baseload Energy
    3. Renewable scenarios in the UK David MacKay’s Sustainable Energy – Without the Hot Air (full text; html) is excellent, although focussed on the UK. MacKay is an ardent advocate of both renewables and decarbonisation, but a critical reading of his book shows, once again, just what we are up against. Note how conservative MacKay’s still extremely optimistic scenarios look next to Jacobson & Delucchi (2010). You can get a handle on the possible ways the UK might increase the proportion of renewables here, here and here. You can decide for yourself how politically, socially and technically feasible you find the various scenarios. MacKay’s take on the bigger picture is here. MacKay is the chief scientific advisor to the UK Department for Energy and Climate Change (DECC).
  9. A Detailed Look at Renewable Baseload Energy
    2. Renewable limits Jacobson & Delucchi’s WWS proposal is a much-cited example of a number of studies claiming that very substantial contributions to the energy mix are possible from renewables. As such, it needs close critical scrutiny - under which it fails dramatically. Professor Barry Brook finds much at fault with Jacobson & Delucchi. Be sure to follow the links in Brook's review to further critiques by Charles Barton and Gene Preston. Brook is unsparing, and rightly so:
    They make a token attempt to price in storage (e.g., compressed air for solar PV, hot salts for CSP). But tellingly, they never say HOW MUCH storage they are costing in this analysis (see table 6 of tech paper), nor how much extra peak generating capacity these energy stores will require in order to be recharged, especially on low yield days (cloudy, calm, etc). Yet, this is an absolutely critical consideration for large-scale intermittent technologies, as Peter Lang has clearly demonstrated here. Without factoring in these sorts of fundamental ‘details’ — and in the absence of crunching any actual numbers in regards to the total amount of storage/backup/overbuild required to make WWS 24/365 — the whole economic and logistical foundation of the grand WWS scheme crumbles to dust. In sum, the WWS 100% renewables by 2030 vision is nothing more than an illusory fantasy. It is not a feasible, real-world energy plan.
    Power transmission consultant Dr Preston is equally sceptical:
    In sum, I do not believe this is achievable at all. Therefore the concept envisioned in the SA [Scientific American] article [summarising the J&D paper in Energy Policy] is not a workable plan because the transmission problems have not been addressed. The lines aren’t going to get built. The wind is not going to interconnect. The SA article plan is not even a desirable plan. The environmental impact and cost would be horrendous. Let's get realistic.
    A summary of the constraints on a rapid increase of renewables in the energy mix is provided here: Renewables and efficiency cannot fix the energy and climate crises (part 1) and (part 2). Brook and others examine the limits to renewables in greater detail in a series of twelve articles here. If you really want to understand why renewables are not going to displace coal from the global energy mix to a significant extent, the above is essential reading.
  10. A Detailed Look at Renewable Baseload Energy
    1. Context and scale Let’s start off with a reminder of what it was that James Hansen said in his letter to President Obama (emphasis added):
    Energy efficiency, renewable energies, and an improved grid deserve priority and there is a hope that they could provide all of our electric power requirements. However, the greatest threat to the planet may be the potential gap between that presumption (100% “soft” energy) and reality, with the gap filled by continued use of coal-fired power. Therefore it is important to undertake urgent focused R&D programs in both next generation nuclear power and carbon capture and sequestration. These programs could be carried out most rapidly and effectively in full cooperation with China and/or India, and other countries. Given appropriate priority and resources, the option of secure, low-waste 4th generation nuclear power (see below) could be available within a decade. If, by then, wind, solar, other renewables, and an improved grid prove that they are capable of handling all of our electrical energy needs, then there may be no need to construct nuclear plants in the United States. Many energy experts consider an all-renewable scenario to be implausible in the time-frame when coal emissions must be phased out, but it is not necessary to debate that matter. However, it would be exceedingly dangerous to make the presumption today that we will soon have all-renewable electric power. Also it would be inappropriate to impose a similar presumption on China and India. Both countries project large increases in their energy needs, both countries have highly polluted atmospheres primarily due to excessive coal use, and both countries stand to suffer inordinately if global climate change continues.
    With Hansen’s cautionary words in mind, it’s time to start thinking about scale. Here’s Stewart Brand writing on Saul Griffith and the scale problem. Welcome to Renewistan:
    The world currently runs on about 16 terawatts (trillion watts) of energy, most of it burning fossil fuels. To level off at 450 ppm of carbon dioxide, we will have to reduce the fossil fuel burning to 3 terawatts and produce all the rest with renewable energy, and we have to do it in 25 years or it’s too late. Currently about half a terawatt comes from clean hydropower and one terawatt from clean nuclear. That leaves 11.5 terawatts to generate from new clean sources. That would mean the following. (Here I’m drawing on notes and extrapolations I’ve written up previously from discussion with Griffith):
    “Two terawatts of photovoltaic would require installing 100 square meters of 15-percent-efficient solar cells every second, second after second, for the next 25 years. (That’s about 1,200 square miles of solar cells a year, times 25 equals 30,000 square miles of photovoltaic cells.) Two terawatts of solar thermal? If it’s 30 percent efficient all told, we’ll need 50 square meters of highly reflective mirrors every second. (Some 600 square miles a year, times 25.) Half a terawatt of biofuels? Something like one Olympic swimming pool of genetically engineered algae, installed every second. (About 15,250 square miles a year, times 25.) Two terawatts of wind? That’s a 300-foot-diameter wind turbine every 5 minutes. (Install 105,000 turbines a year in good wind locations, times 25.) Two terawatts of geothermal? Build three 100-megawatt steam turbines every day, 1,095 a year, times 25. Three terawatts of new nuclear? That’s a 3-reactor, 3-gigawatt plant every week, 52 a year, times 25.”
    In other words, the land area dedicated to renewable energy (“Renewistan”) would occupy a space about the size of Australia to keep the carbon dioxide level at 450 ppm. To get to Hansen’s goal of 350 ppm of carbon dioxide, fossil fuel burning would have to be cut to ZERO, which means another 3 terawatts would have to come from renewables, expanding the size of Renewistan further by 26 percent.
    Meanwhile for individuals, to keep the world’s energy budget at 16 terawatts, while many of the poorest in the world might raise their standard of living to 2,200 watts, everyone now above that level would have to drop down to it. Griffith determined that most of his energy use was coming from air travel, car travel, and the embodied energy of his stuff, along with his diet. Now he drives the speed limit (and he has passed no one in six months), seldom flies, eats meat only once a week, bikes a lot, and buys almost nothing. He’s healthier, eats better, has more time with his family, and the stuff he has he cherishes.
    Can the world actually build Renewistan? Griffith said it’s not like the Manhattan Project, it’s like the whole of World War II, only with all the antagonists on the same side this time. It’s damn near impossible, but it is necessary. And the world has to decide to do it. Griffith’s audience was strangely exhilarated by the prospect.
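A quick sanity check of the photovoltaic land-area figures quoted above (the 100 m²-per-second rate is Brand/Griffith's figure; this only verifies that the square-mile numbers follow from it):

```python
seconds_per_year = 365.25 * 86_400
pv_rate_m2_per_s = 100                           # "100 square meters ... every second"

pv_area_per_year_m2 = pv_rate_m2_per_s * seconds_per_year
m2_per_sq_mile = 1609.344 ** 2                   # square metres in one square mile

print(pv_area_per_year_m2 / m2_per_sq_mile)          # ~1,220 sq mi/yr ("about 1,200")
print(25 * pv_area_per_year_m2 / m2_per_sq_mile)     # ~30,500 sq mi over 25 years ("30,000")
```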
  11. Websites for Watching the Arctic Sea Ice Melt
    Dikran Marsupial no doubt that just two purported cycles do not make a real pattern, more so when an arbitrary underlying trend is added. We've already seen this a while ago. No physics no party, I'd say as a physicist ;)
    Moderator Response: [Dikran Marsupial] Absolutely, statistics can be very helpful in trying to identify the data generating process, but at the end of the day, the data generating process is what we want to understand, rather than the data. For that you need physics.
  12. A Detailed Look at Renewable Baseload Energy
    dana1981 Thanks for linking to this article and inviting me to comment. There is much to say. I dislike comments which set lots of ‘homework’ but in this case it’s unavoidable. I’ve split my response up, but I don’t know how sensitive your spamometer is to links so some of it might get filtered.
  13. Eric the Red at 00:26 AM on 2 July 2011
    2010 - 2011: Earth's most extreme weather since 1816?
    Dikran, While I agree with most of your post regarding records and extremes, I maintain that record rainfall during monsoons or tropical cyclones is tenuous. Two reasons for this: 1) the amount of rainfall is highly variable such that only slight variations in conditions (atmospheric and oceanic) are needed to create the extreme, and 2) rainfall is highly variable within the measurement area, creating a larger spread in the data. These aspects are absent in other readings such as temperature.
  14. Pete Dunkelberg at 00:26 AM on 2 July 2011
    OA not OK part 1
    HSC 6.15? How about open source? I'm pretty sure Dr Mackie knows of various sources for the original HCO3- in Eq 1. The point of the equation is what happens in the water column: the reaction increases dissolved CO2. Chemware @6, before saying Eq 1 is wrong think about the equilibrium constants for all the steps including CO2 exchange with air. The author probably teaches the details in Chem 1. Meanwhile think on the point of the equation.
  15. Eric the Red at 00:14 AM on 2 July 2011
    Websites for Watching the Arctic Sea Ice Melt
    michael, Sounds like a fun bet. I believe that the 2007 record is safe for another year. All the other years cluster together near the end of July before separation during September. A stronger than average melt could bring 2011 to 2nd lowest, but a weaker melt could land it at 6th. Therefore, I will bet on 4th lowest coming in somewhere between 2009 and 2010.
  16. thepoodlebites at 00:11 AM on 2 July 2011
    Increasing CO2 has little to no effect
    #47 Have there been any updated model runs since Meehl et al. (2004)? It would be interesting to see how the temperature plots (1890-2000) would look including data from the last decade, using the same model assumptions. I'm wondering if all of the natural components have been properly accounted for. Model parameters can be adjusted to match any set of temperature observations. The statement that "late-twentieth-century warming can only be reproduced in the model with anthropogenic forcing" is a bit too strong as a conclusion without including the need for follow-up studies.
  17. 2010 - 2011: Earth's most extreme weather since 1816?
    DB @ 189 Because Excel does have an easy-to-use trending tool, I did apply it to the Perth heatwave data set. I am not sure how to transfer the data to this webpage but I can at least describe the results, and if anyone doubts them they can add the data to an Excel sheet and use the trend line tool available. I was searching for historical data on heatwaves on Google and Yahoo. Mostly the only search results that came up were of specific droughts in various parts of the world, or how global warming is increasing heatwaves, or models that show heat waves will increase in the future. All good and well, but I wanted specific historical data, especially one with individual data points I could use on an Excel spreadsheet to satisfy your demand for more statistical analysis of data as opposed to using an "eyecrometer" or using common sense interpretation of a data set. I finally found what I was looking for on a page that describes the heat waves in Perth Australia over a longer period of time (81 years of data). The webpage is good as it not only lists various heat wave data but it also gives a reasonable explanation of what causes heat waves in Perth (it is a blocking pattern). One could use that information to determine if a warming globe (ocean and land) would generate more of these blocking patterns. Here is the link to Perth Heatwaves. Perth Australia heat waves. I used the Perth Airport temperature readings for my temp graphs as the article gives an * for the city temperatures. On the duration of the 15 listed heat wave events, the trend line was flat. There was no trend in the duration of the heat waves and one severe anomaly of 64 days in 1978 (this did not change the trend line as it took place in the mid area of the heat waves). The trend line for the average temp of the heat waves was negative, down from 38C to 37C at the end of the series. The trend line for the maximum temp was also negative, from just below 44C at the start of the series to just below 43C at the end. About a 1C difference. The frequency of the heat waves was strongly positive. The time between the first few heat waves was 13 years and then 23 years. After 1978 the time between heat waves is close to one every three years. My own conclusion: heat wave frequency has shown a marked increase, the duration of each heat wave is flat, and the temperatures of each heat wave are decreasing. DB, is that what you were requesting? (A minimal trend-fit sketch in Python follows the moderator response below.)
    Response:

    [DB] Time series analysis is a tricksy thing that even those who have done it for years can often make mistakes on.  This goes more than double when performing single-point trend analyses.  By focusing on one station's worth of data only, how do you know the results of your analysis mean anything?  As Albatross points out below, that's essentially cherry-picking.  You need something to compare it to, either as a control or to make it meaningful.  Without context, there is no meaning.

    As for a good resource for you, since you appear to prefer Excel's analytical tools, D Kelly O'Day has an excellent primer on climate analysis using Excel here.  He even supplies spreadsheet examples of his work.  Highly recommended, as it will save you endless hours of false starts and shinnying down rabbit holes.
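For readers who want to reproduce this kind of trend check outside Excel, here is a minimal sketch in Python. The year and duration arrays below are placeholders, not the actual Perth data; substitute the real series (and, as the moderator suggests, series from more than one station) before drawing conclusions.

```python
import numpy as np
from scipy import stats

# Hypothetical heat-wave event years and durations (days) -- placeholders only.
years = np.array([1930, 1943, 1966, 1978, 1981, 1984, 1987, 1990,
                  1993, 1996, 1999, 2002, 2005, 2008, 2011])
durations = np.array([5, 6, 4, 64, 5, 7, 6, 5, 8, 6, 7, 5, 6, 7, 6])

fit = stats.linregress(years, durations)
print(f"slope = {fit.slope:.3f} days/year, p-value = {fit.pvalue:.3f}")
# A slope whose p-value is large cannot be distinguished from "no trend", which
# is one reason an eyeball fit to a single short series proves very little.
```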

  18. Dikran Marsupial at 00:01 AM on 2 July 2011
    2010 - 2011: Earth's most extreme weather since 1816?
    Eric@193 "Can we really call a period of record rainfall an extreme event when it occurs in the monsoon season in an area which typically experiences heavy rainfall?" Yes. If the rainfall was an extreme event in the context of a monsoon season at that location, then it is an extreme event regardless of the season, as the maximal rainfall occurs in the monsoon. If an event is a one in a hundred year event, and extreme rainfall only happens in monsoons and there is one monsoon a year, then a "once in 100 monsoons" event is very likely also a "once in 100 years" event and vice versa. If the extreme rainfall occurred outside the monsoon season, it would be extremely extreme ;o) Eliminating areas that have extreme rainfall is a neat way of simplifying statistical analysis of extreme rainfall! ;o) The penultimate paragraph is getting into multiple hypothesis test territory, for which there are standard methods. Of course there will be extreme rainfall happening somewhere every year. The question is: has the propensity for extreme rainfall changed (i.e. have the return periods shortened)? There is a branch of statistics called "extreme value theory". You can bet that climatologists, and statisticians that have worked with climatologists, will be familiar with this (there is a very good book by Stuart Coles sitting on my bookshelf), will have applied it, and understand the caveats well.
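A minimal sketch of the extreme value approach mentioned above: fit a Generalised Extreme Value (GEV) distribution to a series of annual maxima and read off a return level. The rainfall numbers are synthetic placeholders generated for the example; a real analysis would use observed annual (or monsoon-season) maxima, and would let the parameters vary with time to test whether return periods are shortening.

```python
import numpy as np
from scipy import stats

# Synthetic "annual maximum rainfall" series (mm) -- placeholder data only.
rng = np.random.default_rng(0)
annual_max_mm = stats.genextreme.rvs(c=-0.1, loc=80, scale=20, size=100, random_state=rng)

shape, loc, scale = stats.genextreme.fit(annual_max_mm)

# 100-year return level: the value exceeded with probability 1/100 in any one year.
level_100yr = stats.genextreme.ppf(1 - 1/100, shape, loc=loc, scale=scale)
print(f"estimated 100-year return level: {level_100yr:.1f} mm")
```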
  19. Eric the Red at 23:51 PM on 1 July 2011
    2010 - 2011: Earth's most extreme weather since 1816?
    adelady, I liked your post, as it added reason and perspective to extreme events. Can we really call a period of record rainfall an extreme event when it occurs in the monsoon season in an area which typically experiences heavy rainfall? If it occurs outside of the monsoon area, then yes. We get into a gray area when the heavy rain falls outside the area but is associated with typical rains. For example, I had an argument a year ago concerning record rains in Texas as a result of a tropical storm moving inland. Texas is not immune to tropical activity, although the area in question had been spared such storms in the past. Consequently, I prefer to eliminate these areas from extreme weather discussions. I do not know how many weather reporting stations exist in the world, but let us say 10,000 for this exercise, and for simplicity that they have data for 100 years. Statistically, about 99% of the stations will fall within the extremes for the 100 years, but 1% (half above, half below) will fall outside the range. All things being equal, 50 stations would report a record high, while 50 report a record low. However, we know all things are not equal, such that warm years will produce more highs and fewer lows, and cool years, the converse. Also, areas in which stations are clustered could report a greater abundance of records during such a period. When you get down to daily records, statistically every station should produce at least one daily record high and low every year. I may have rambled a little, but my point is that a certain amount of records (or extreme weather) can be expected every year, and oftentimes the records are clustered together (in the U.S. most states set their record highs in the 1930s, while record lows are evenly divided before and after 1940). When the extremes start occurring year after year, then we have a problem. (A short simulation of the expected record counts under a stationary climate follows the moderator response below.)
    Moderator Response: (DB) Your comment contains specious logic (ie, "Gilles-isms") and calls out for correction; no doubt one of the regulars here will oblige you.
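As a check on the base-rate argument above: for a station with N prior years of exchangeable (stationary, trend-free) data, the probability that the newest year sets a record high is 1/(N+1), and likewise for a record low. With 10,000 stations and 100 years of history that gives roughly 99 expected new record highs and 99 new record lows per year, in the same ballpark as the rough 1% figure in the comment. A small simulation confirming this (synthetic data, no trend):

```python
import numpy as np

rng = np.random.default_rng(42)
n_stations, n_years = 10_000, 101        # 100 years of history plus the current year

data = rng.normal(size=(n_stations, n_years))      # stationary climate, no trend
new_highs = (data[:, -1] > data[:, :-1].max(axis=1)).sum()
new_lows = (data[:, -1] < data[:, :-1].min(axis=1)).sum()

print(new_highs, new_lows)               # each close to 10_000 / 101, i.e. about 99
# A warming trend skews this balance towards many more record highs than lows,
# which is the kind of signal attribution studies look for.
```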
  20. OA not OK part 1
    I have a copy of HSC 6.15. (A thermodynamics program.) It can produce some very nice graphs of equilibrium concentrations, and effects of changing the amounts of a given component on the concentration of equilibrium species. Happy to help if requested.
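For anyone without HSC (Pete asked about open-source options above), the basic carbonate speciation curve can be sketched in a few lines of Python. The pK values below are the standard freshwater constants at 25 °C, used here only as placeholders; seawater constants, and a proper package such as CO2SYS, would be needed for real ocean chemistry.

```python
import numpy as np

pK1, pK2 = 6.35, 10.33                 # freshwater carbonic acid constants at 25 C (illustrative)
K1, K2 = 10.0 ** -pK1, 10.0 ** -pK2

pH = np.linspace(4, 12, 9)
H = 10.0 ** -pH

denom = H**2 + K1 * H + K1 * K2
f_co2 = H**2 / denom                   # fraction as CO2(aq) + H2CO3
f_hco3 = K1 * H / denom                # fraction as bicarbonate, HCO3-
f_co3 = K1 * K2 / denom                # fraction as carbonate, CO3(2-)

for row in zip(pH, f_co2, f_hco3, f_co3):
    print("pH %4.1f   CO2 %.3f   HCO3- %.3f   CO3-- %.3f" % row)
```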
  21. Dikran Marsupial at 23:31 PM on 1 July 2011
    Websites for Watching the Arctic Sea Ice Melt
    Eric@182 "sine components" is not a theory. There is no physical explanation of why there should be a sine component, just a suggestion that one exists (which could be used to make a prediction). History tells us that when predictions are made that are not based on physics, all that happens when the data refute the prediction is that the prediction is altered (e.g. epicycles in Ptolemaic astronomy) until it fits again. In this case, all that will happen when the data refutes the current view of cycles is that some new natural cycle will be introduced acting on a longer time scale to "explain" the discrepancy. If you want an example of that, just look at the "no warming since 1998, no err 2002" canard. Skeptics were claiming that the lack of warming since 1998 was the start of the cooling trend. When it was clear that wasn't happening, they just shifted the start point and made the same claims all over again. Those of us who look at the physics understood why the claim was bogus in the first place. I would venture to suggest that like many statisticians you are paying too much attention to the data and too little to the data generating process (the whole point of statistics is to learn about the data generating process). If there is a plausible physical model for the last cycle and a half, it is bogus to claim that there are three cycles and hence a low probability that it occurred by chance alone (nobody is claiming it was chance alone BTW, but due to physics). However, if you want to give a formal statistical analysis of the cycles, then go ahead, I for one would be interested in discussing it (although this is probably not the right thread).
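A small illustration of the point about apparent cycles: smooth "red" noise with no periodic physics in it at all routinely shows what the eye (or even a periodogram peak) reads as a multidecadal cycle. The sketch below generates AR(1) noise and reports the dominant apparent period; only a physical mechanism tested against independent data can distinguish such an artefact from a real cycle.

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1(n, phi=0.9):
    """Generate an AR(1) ('red noise') series of length n."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

n = 150                                   # e.g. 150 "years" of data
series = ar1(n)
power = np.abs(np.fft.rfft(series - series.mean())) ** 2

k = np.argmax(power[1:]) + 1              # skip the zero-frequency term
print(f"dominant apparent period: {n / k:.0f} 'years'")
# Red noise concentrates power at low frequencies, so long "cycles" appear
# by chance even though nothing periodic was put into the data.
```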
  22. Websites for Watching the Arctic Sea Ice Melt
    184 skywatcher - Tamino has a good phrase for that, as used in his recent chaos thread.
  23. Websites for Watching the Arctic Sea Ice Melt
    #182 Eric: What physics was in Camburn's statement? The, ah, physics of the sine wave? I prefer the physics that tells us about the absorption of infrared radiation by particular kinds of small molecules, like CO2 and CH4. Pattern matching tells us nothing about the underlying processes. On one hand here, we have a good pattern match (using solar, GHG, aerosol and volcanic forcing) that is underpinned by sound physics, of the kind that allows mobile phones, microwave ovens and CO2 lasers to work, and is verified by a large amount of observation. On the other hand, we have an apparent pattern less than two cycles long with no predictive power whatsoever. People see patterns and sine waves everywhere but, without a sound physical basis, they're garbage.
  24. michael sweet at 23:23 PM on 1 July 2011
    Websites for Watching the Arctic Sea Ice Melt
    A brief recap of June for those who do not follow sea ice: IJIS Graph for June 30, 2011 2011 ran neck and neck with 2010 for the lowest amount of sea ice for the entire month of June. The last two days of June 2011 fell off a little so 2010 is ahead. 2007 started its amazing two week drop at the end of June. Everyone expects 2007 to be in lowest place in a few days. 2010 fades a little in July. The weather in July will determine which year is lowest at the end of July. Even WUWT has stopped claiming that the ice is recovering this year. Where will 2011 end up?? Place your bets. The NSIDC monthly report should be out by next Friday. They always have insightful reporting. Hopefully it will be a long summary. Antarctic sea ice is running slightly below average.
  25. OA not OK part 1
    17 - good link. Some more voices:
    Modeling demonstrates that if CO2 continues to be released on current trends, ocean average pH will reach 7.8 by the end of this century, corresponding to 0.5 units below the pre-industrial level, a pH level that has not been experienced for several million years (1). A change of 0.5 units might not sound like a very big change, but the pH scale is logarithmic, meaning that such a change is equivalent to a threefold increase in H+ concentration. All this is happening at a speed 100 times greater than has ever been observed during the geological past. Several marine species, communities and ecosystems might not have the time to acclimate or adapt to these fast changes in ocean chemistry.
    I'm looking forward to the rest of this series... having been thrown out of chemistry at a very young age!
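The "threefold" figure in the quote above follows directly from the definition of pH as -log10[H+]; a one-line check:

```python
# A 0.5 unit drop in pH multiplies the hydrogen ion concentration by 10**0.5.
print(10 ** 0.5)     # ~3.16, i.e. roughly a threefold increase in [H+]
```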
  26. 2010 - 2011: Earth's most extreme weather since 1816?
    Tom Curtis and Albatross, On an earlier thread "Linking Extreme weather and Global Warming" at post 148 for Albatross (a link to US CEI index at NOAA) and then 149 for Tom Curtis. To look at the CEI graph for Extremes in 1-day precipitation in the US was suggested by Tom Curtis. US CEI 1-day precipitation extremes. At post 151 on the same thread Albatross comments "Tom @149, Indeed. Up." The graph shows a clear upward trend. However, that is one of the points I am making and trying to demonstrate. There could be longer patterns that need more time to determine. That is why I like to look back as far as data will take. The longer the data trend the better (provided it does not fuzz too greatly looking back; generally some type of proxy has to be selected for really long trends as there are no direct measurements available to study). Here is another version of the extreme 1-day precipitation for the US (they use a different index to measure but the graphs follow the same trend over the parallel time scale). Note: I could not get the graph itself to link in its format. I will link to the article and state that the graph of interest is on page 297 of the article. Page 297 has a 1-day US extreme precipitation graph. Abstract for the article linked above. "Abstract. An analysis of extreme precipitation events indicates that there has been a sizable increase in their frequency since the 1920s/1930s in the U.S. There has been no discernible trend in the frequency of the most extreme events in Canada, but the frequency of less extreme events has increased in some parts of Canada, notably in the Arctic. In the U.S., frequencies in the late 1800s/early 1900s were about as high as in the 1980s/1990s. This suggests that natural variability of the climate system could be the cause of the recent increase, although anthropogenic forcing due to increasing greenhouse gas concentrations cannot be discounted as another cause. It is likely that anthropogenic forcing will eventually cause global increases in extreme precipitation, primarily because of probable increases in atmospheric water vapor content and destabilization of the atmosphere. However, the location, timing, and magnitude of local and regional changes remain unknown because of uncertainties about future changes in the frequency/intensity of meteorological systems that cause extreme precipitation."
  27. Climate half-truths turn out to be whole lies
    dawsonjg, before your long diatribe is deleted (along with this comment), I suggest you do some reading at Skeptical Science : Newcomers Start Here The Big Picture Then check out The Most Used Skeptic Arguments That should clear up a lot of your obvious confusion.
    Response:

    [DB] Thank you for setting a good example!

  28. Climate half-truths turn out to be whole lies
    dawsonjg@53: I got as far as your first point before finding a major misconception:
    Absolutely right. For example, the 20 years and billions of dollars spent looking for a man made cause of climate change proves nothing until as much effort is exerted looking for alternative explanations.
    So, your suggestion is that the scientific method demands that we assemble a list of every possible hypothesis, devoting equal time and funding to all of them, and then having done that we will be in a position to pick between them? I have a few problems with that:
    1. It's not how science has ever been done in the past. You are asking climate scientists to adopt a novel and untried approach to science on your unsupported advice.
    2. The universe of possible hypotheses is sufficiently large as to make science economically and practically un-viable if done according to your approach.
    3. Those who have offered credible opinions on how science should be done in the past (Bacon, Popper, Kuhn, Feyerabend and so on) have done so on the basis of years of study of the history of science, and have argued their case on the basis of that history. You have failed to do that.
    Instead, the practical approach to science is very different, and far more effective in terms of cost and effort. First there is a search for pertinent facts. Then a hypothesis is created. The hypothesis is tested against the known data. If the hypothesis fits the data, it may either be accepted provisionally or additional predictions may be made on the basis of the hypothesis and new data obtained to further test it. If the hypothesis is considered credible, others will start building new hypotheses on top of it. Each of those will have consequences which can be tested. Every time we do a test, we get a chance to show one of our hypotheses is wrong. The problem might be the most recent hypothesis. It might be one from a while back, which has had lots of other hypotheses built on top of it. But the more we build on a wrong hypothesis, the sooner we find it is wrong. So actually, it is rather rare to go too far along a false direction. In other words, the way science is practised has several important features:
    - It is self-correcting, since we keep going back to the data, and the more fundamental a hypothesis is, the more it gets tested.
    - It is efficient, because we don't demand that a hypothesis be absolutely cut and dried before we start exploring possible consequences.
    - It is economical, because testing a subsidiary hypothesis provides additional tests of all the parent hypotheses.
    However your aim in suggesting how science should be done is a noble one. If you wish to pursue it further, then a good starting point would be to spend the next decade reading history and philosophy of science. I suggest you focus first on non-climate related areas, which are less clouded by political influences.
  29. Roy Spencer on Climate Sensitivity - Again
    If anyone can answer my question in post #1, could you also direct me to a comprehensive definition of "transient thermal response"? I see multiple uses of the term in scientific literature, but I haven't been able to puzzle out the full context.
  30. Eric the Red at 22:53 PM on 1 July 2011
    Websites for Watching the Arctic Sea Ice Melt
    Dikran, While your summary is plausible based on the data, Camburn's theory is also plausible based on the same data. It could just be coincidence that the two previous warming cycles aligned in both magnitude and duration, while the cooling cycles align in duration, but differ slightly in magnitude. Both theories are based on physics, but are using different principles. The physics in Camburn's sine component tells us that temperatures should decline in the next decade or two, while in Dikran's explanation, temperatures should resume a large warming trend. That is where the data enters. The data will tell us whether the temperature changes were cyclic or spurious. As a fellow statistician, I can say that three cycles (if they were to occur) would be very unlikely to be due to chance alone.
    Response:

    [DB] To further Dikran's point, a focus on mystical cycles without a grounding in the physical principles underlying and explaining the cycles is termed mathturbation.  It is the statistical version of saying "I see Dead People", and little removed from poking at chicken entrails with a stick.

    This post is on Websites for Watching the Arctic Sea Ice Melt.  Other threads here at SkS deal with cycles and such can be located via the Search function.

  31. OA not OK part 1
    DB @ 13 Will do.
  32. Arctic icemelt is a natural cycle
    This St. Roch nonsense has been done to death many times before, even on SkS. Any way of searching, so that reference can be made to the last thread that went through all this?
    Response:

    [DB] Would you care to write up and contribute a guest post on that topic?  :)

  33. Climate half-truths turn out to be whole lies
    JC = John Cook’s Age piece. JD = my comments.
    JC: Half the truth on emissions. John Cook, June 28, 2011. Cherry-picking the evidence to suit a pseudo-scientific argument misses the alarming reality. A Yiddish proverb states ''a half truth is a whole lie''. By withholding vital information, it's possible to lead you towards the opposite conclusion to the one you would get from considering the full picture.
    JD: Absolutely right. For example, the 20 years and billions of dollars spent looking for a man made cause of climate change proves nothing until as much effort is exerted looking for alternative explanations.
    JC: In Bob Carter's opinion piece on this page yesterday, this technique of cherry-picking half-truths is on full display, with frequent examples of statements that distort climate science.
    JD: Read: “that contradict our man-made climate change narrative”. ( - Snip - )
    Response:

    [DB] Long Gish Gallop snipped.  Please see JMurphy's kind and instructive advice to you below.

    Please also read and comply with this site's Comments Policy.  Comments that comply with it do not get moderated.

  34. 2010 - 2011: Earth's most extreme weather since 1816?
    The dismissal of evidence and search for other explanations, reminds me of an analogy where acceptance of the facts is also too much to bear : Shortness of breath ? "I've been short of breath in the past. In fact, I've had worse episodes after running." Wheezing ? "Must be related to the shortness of breath, as it has been in the past." Nausea ? "It's nothing. I've had worse nausea in the past, especially after several beers." Vomiting ? "Obviously related to the nausea, which has easily explicable causes." Feeling light headed ? "Means nothing. Had it before. It's nothing." Pain in the left arm ? "I've had worse pain before and in the other arm too. Simple explanation is that it could be anything." Pain in the chest ? "I had worse pain last year and 10 years ago. So what ? Doesn't mean anything." "Hang on, put them all together and it could be a heart attack ! Why did I look at them all in isolation and dismiss the symptoms separately as nothing untoward or more serious than anything else I've experienced previously ? I must do something." Too late...
  35. alan_marshall at 20:30 PM on 1 July 2011
    OA not OK part 1
    Doug Mackie @9 One of the skeptic arguments SkS seeks to rebut is "Ocean Acidification Isn't Serious". On 19 March 2011, SkS posted the intermediate rebuttal under the title "Examining the Impacts of Ocean Acidification". This article, which I authored, describes the ocean chemistry in a set of three equations, as well as in additional notes in comment 44. While I appreciate there is more chemistry to come in this series, and different ways to write such equations, I draw your attention to these earlier articles in the interests of overall consistency of presentation of the science on this site. The material from your series will not only form the basis of a booklet, but will also be welcome input to an advanced version of the rebuttal.
  36. Roy Spencer on Climate Sensitivity - Again
    Now that is a very interesting critique, thanks Chris! Let me make a fool of myself, having produced my own very simple model to estimate climate sensitivity, described in this post. In my defence:
    - Unlike Spencer I make no claim to authority in this field.
    - I think I do a better job of assessing the limitations of my own work than he does.
    - I am also, I hope, willing to learn from others' critiques. Test me on this!
    Rationale: I like to test things for myself. I had already reproduced the ITR from the GHCN data, and the next project seemed to be to try and make my own estimate of climate sensitivity. I realised that I didn't have time to learn enough about the system to use a physics based approach, and so was limited to an empirical approach. Tamino's 'two box model' (also implemented by Lucia and Arthur Smith) has a bare minimum of physics - two heat reservoirs driven by the external forcing. However they had to pick values for the heat capacities. Refining the time constants of the exponentials is tricky and likely to be unstable. (Indeed when I tried it, it was.) Arthur Smith also tried simply fitting an empirical response function composed of Gaussian basis functions in this post, which was my primary inspiration. I didn't really like the Gaussians though, because the long tails are not constrained by the short temperature record and yet contribute significantly to the sensitivity estimate. Additionally, Arthur's model is probably overparameterised. So I pared the model down to 5 quadratic B-spline basis functions on a log_{2.5} time axis, plus a constant temperature offset (total 6 params), and the constraint that the B-spline coefficients must be positive. The result is a climate sensitivity over ~60 years of just over 0.6C/(W/m2), or 2.1C/x2CO2. That's a lower bound because of the short period, assuming there isn't some mysterious negative feedback on a century timescale, and so fits in well with the IPCC estimates. That doesn't mean it's right. (To do: a transient sensitivity calculation.) However we know from Roy's previous exploits that 6 parameters is enough to not only fit the elephant but make it wiggle its trunk. I tried to address this by the cross-validation experiments I showed (fit the model on 62 years, predict 62). I've more recently recognised that I'm also subject to the equilibration problem, so those results are probably invalid (unless the model is in approximate equilibrium by chance at 1880 and 1942). Stability is another interesting question, which can also give an indication of overfitting. Is the model stable? Sort of. If I remember correctly it gives fairly stable results for the response function for a start date of 1880 and end dates between 1940 and 1980, but with a lower sensitivity of 0.44 (1.6C/2x) over ~60 years. With a later end date this increases towards the value I quoted earlier. Of course the recent data contains the strongest signal due to the big increase in both CO2 and temperature, and some nice volcanoes. But that doesn't explain the stability of the shorter run values. That's interesting and needs further investigation. But the elephant in the room is the forcings. If the forcings are bigger, the sensitivity to give the same temperature change is smaller, and vice versa. So I am completely dependent on accepting Hansen's forcing data. If the aerosol forcing were much smaller (less negative), and thus the total forcing more positive, the resulting sensitivity would be lower. Hansen actually suggests that the aerosol forcings should be greater (more negative) than they are, however. So the outstanding problems I can see at the moment with my DIY approach are:
    1. I can't prove at the moment that the model is not over-fit.
    2. The stable lower sensitivity of shorter runs, with a change in sensitivity when including the recent data, needs explaining.
    3. I should probably try varying the aerosol effects and see what happens.
    So, if nothing else I've confirmed that it is hard. I may have produced a credible sensitivity estimate, but I can't prove it to my own satisfaction. I'll carry on playing with it. But maybe you can spot some more fundamental problems?
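A rough sketch of the fitting machinery the comment describes, with two stand-ins flagged up front: triangular bumps on a log-spaced lag axis take the place of the quadratic B-splines, and the forcing and temperature arrays are synthetic placeholders where the real analysis used Hansen's forcings and the observed temperature record. The non-negativity constraint on the response coefficients is handled with non-negative least squares, with the unconstrained offset split into two columns.

```python
import numpy as np
from scipy.optimize import nnls

n_years, n_basis = 130, 5
lags = np.arange(n_years)

# Triangular basis functions centred at log-spaced lags (a stand-in for B-splines).
centres = np.geomspace(1, 64, n_basis)
basis = np.maximum(0.0, 1.0 - np.abs(lags[None, :] - centres[:, None]) / centres[:, None])

# Placeholder forcing (W/m^2) and temperature (C) series -- synthetic only.
rng = np.random.default_rng(0)
forcing = np.linspace(0.0, 2.5, n_years) + 0.1 * rng.normal(size=n_years)
toy_response = 0.5 * np.exp(-lags / 10.0) / 10.0            # used only to fake "observations"
temperature = np.convolve(forcing, toy_response)[:n_years] + 0.05 * rng.normal(size=n_years)

# Design matrix: column k is the forcing convolved with basis function k.
X = np.column_stack([np.convolve(forcing, b)[:n_years] for b in basis])
# nnls forces every coefficient >= 0, so model the offset as (c+) - (c-).
X = np.column_stack([X, np.ones(n_years), -np.ones(n_years)])

coeffs, _ = nnls(X, temperature)
response = basis.T @ coeffs[:n_basis]        # fitted impulse response, per year of lag

sensitivity = response.sum()                 # C per (W/m^2) for a sustained forcing
print(f"fitted sensitivity: {sensitivity:.2f} C/(W/m2), "
      f"~{3.7 * sensitivity:.1f} C per doubling of CO2")
```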
  37. alan_marshall at 19:53 PM on 1 July 2011
    OA not OK part 1
    Doug Mackie @9 I'm with mb in being a bit puzzled about the formation of CaCO3 being described as a source of CO2 rather than a sink. That statement may be true in relation to equation 1 in isolation, but if the precipitation of CaCO3 removes carbon atoms from the water, where did those carbon atoms ultimately come from? Isn't the main source the CO2 absorbed from the atmosphere? I am looking forward to this series so perhaps we will read the answer down the track.
  38. 2010 - 2011: Earth's most extreme weather since 1816?
    #183 Stevo - very interesting insight into the human psyche there! Is it possible to say that more of your audience gave the link between extreme weather and climate some thought because you presented the data before suggesting humans might be the cause? I suspect that if you had said something like "this human-caused global warming is driving severe weather, see this graph", more of your audience might have been inclined to avoid the evidence, their minds having shut up and gone home on mention of AGW. But by seeing the evidence first and acknowledging the trend, it might be rather harder for them to back out and pretend nothing is happening. It's a sad state of the politicisation of a scientific issue, but I have hope that people will observe the world changing around them, then wonder why. Harder to avoid the truth, then...
  39. Dikran Marsupial at 19:17 PM on 1 July 2011
    Roy Spencer on Climate Sensitivity - Again
    Roy Spencer's article on whether warming causes carbon dioxide increases clearly demonstrates he is unaware of the limitations of regression analyses. In that case we know 100% of the long term rise in CO2 is anthropogenic as natural emissions are always exceeded by natural uptake, yet Spencer's simple model says that the rise is 20% anthropogenic and 80% natural. It is sad that Spencer seems not to have learned from his previous errors; fitting a model to data does not mean the model is correct.
  40. Dikran Marsupial at 19:11 PM on 1 July 2011
    Roy Spencer on Climate Sensitivity - Again
    If Spencer's model has some validity, then he will publish it in a reputable journal, if not, it will remain as a blog post and eventually be forgotten by all but the die-hard deniers. Much like his post on the rise in CO2 being due to ocean temperatures (the mathematical error in which was very obvious). In a way it is good that Roy is willing to share his ideas openly; in a less contentious area of science it would be a very good thing. Sadly in climate science it is likely that there are those who will uncritically cling onto the bad ideas as well as the good.
  41. Dikran Marsupial at 18:19 PM on 1 July 2011
    Websites for Watching the Arctic Sea Ice Melt
    Camburn wrote: "One of the main reasons for this is the sine components within the measured and proxy temp record." Human beings are very good at detecting patterns where no pattern exists. Just because there appears to be a cyclical pattern in the data, doesn't mean that it is indicative of some underlying physical process. This is especially true where the quantity of interest is affected by a number of different independent processes. In this case, temperature is affected by solar forcing, CO2 radiative forcing, aerosol cooling, ... and on top of that you have the effects of the various feedbacks. So if there is an apparent cycle, how can you tell whether there really is some underlying cyclical physical process? Just looking at the data can't tell you that; statistics can't tell you that (correlation is not causation); what you need is physics. Climatologists call this sort of work an "attribution study"; for instance the IPCC attribute the warming of the first half of the 20th century to solar forcing, aerosol cooling causing a levelling off of temperatures mid-century, and CO2 radiative forcing dominant from 1970 or so. These three independent events give the appearance of a centennial-scale cycle, but that is entirely spurious. As a statistician, I know that physics trumps stats, so I go with the climatologists who have explanations of why the climate record looks like it does, rather than statisticians/eye-ballers who attribute the climate record to unexplained "climate cycles".
  42. Oceans are cooling
    Hansen is not so coy - the 2011 paper analyses the "flat" OHC. (And 0-2000 is not flat.) Within that framework, he also makes a prediction for what OHC will imply for global energy balance over the full cycle (a prediction of less than 10 years, and yes, he expects OHC to increase and within given bounds). If Hansen is right (ie the GCMs are right), then we have an energy imbalance. You can't blame ocean cycles moving heat around because the whole ocean is warming. TSI is flat (do you mean this Svalgaard paper? Low early 20C solar would imply higher CO2 sensitivity wouldn't it?). Enough to change your position? On the other hand, if OHC decreases over that period, I would say climate theory needs a serious overhaul.
  43. 2010 - 2011: Earth's most extreme weather since 1816?
    Albatross @ 182 "curious coincidence? Unlikely. And I for one am incredibly tired of people alleging to post here in "good faith" when all they appear to be doing is regurgitating stuff from a highly questionable political document prepared in the guise of science. And that said regurgitations do not even accurately represent the findings of the original paper or are not applied in context." Sorry to disappoint but I have not heard of NIPCC report until I went to your Tamino link. It is not that curious of a conincidence. I go on Google and put in key phrases looking for historical data. I am seeking peer-reviewed publications. It is an easy coincidence. The claim made is that droughts are increasing and intensifying. So I put in the History of droughts and other search titles looking for .pdf material (seems a better chance for peer-review or at least well explained data, usually good graphs and detailed explanations of how data was collected and compiled and maybe good statistical analytical tools). The compilers of NIPCC report would probably be doing the same thing...looking for .pdf files for their material on droughts, flooding, hurricanes, tornadoes, all the typical events counted as weather extremes. They would be able to find the same articles. I do not understand your point ("And that said regurgitations do not even accurately represent the findings of the original paper or are not applied in context") based upon my posts. I post a link to the article, read the material and then look at the graphs. I then ask you to tell me where is an obvious trend that indicates an increase in frequency, intensity or duration of historical droughts in North America. You know Tamino uses his "eyecrometer" with his graphs on arctic ice loss. Here is his posted arctic ice graph Tamino Arctic ice graph. What he says about this graph. "Identifying a change in Arctic sea ice extent that can be attributed to temperature is as easy as looking at this graph:" Note also in this post by Tamino, he goes on to show how statistics and trend lines can easily be distorted to show a false picture of reality. That is why I am seeking very long trends and just posting the graphs. Because of the nature of my posts I still would like to understand "And that got me thinking about the fairly steady stream of papers being posted by "skeptics" (e.g., Norman)on this thread trying to convince people that there is nothing un-towards or unusual going on with the climate system." I am not trying to convince you of this, I am posting the information and asking for you to explain to me an upward trend in the data provided. If it is there, I am fine with it. A simple request I would think. Last one. Can you clarify this statement as I think the last sentence is about my posts.... "I will also note that one of the papers that they (NIPCC; Idso and Singer) cite in reference to drought in N. America is being used for purposes not intended by the authors. I happen to know the authors of the paper in question and I know for a fact that they are not "skeptics" or in denial about AGW. So these it is worrisome to see Idso and Singer to misrepresent the science in papers that actually do not go against the theory of AGW. And worse yet, to see uncritical "skeptics" perpetuate the misinformation and distortion." What misinformation and distortion am I perpetuating? Here is the content of my post @140: "Here is one with droughts across North America. 
In the text they explain that the causes of drought in North America were also responsible for Global Climate patterns (more rain in some areas droughts in others). From this study it states there were much worse droughts in the past than today. They also have graphs at the end of the article which show 1000 years of droughts. I would challenge you to find an increase in frequency of droughts today as compared to the long 1000 year history." Here is a direct quote from their conclusion (paper on 1000-year droughts posted @ 140): "Many of these reconstructions cover the last 1000 years, enabling us to examine, in detail, how the famous droughts of modern times compare to their predecessors during a time of quite similar boundary conditions (e.g. orbital configuration). Upon examination, what becomes apparent, is that the famous droughts of the instrumental era are dwarfed by the successive occurrence of multi-decade long ’mega-droughts’ in the period of elevated aridity between the eleventh and fourteenth century A.D. Whilst these mega-droughts stand out in terms of persistence, they share the severity and spatial distribution characteristics of their modern-day counterparts." My question asked for evidence of increased frequency, intensity or duration. The mega-droughts were of the same intensity but had a much longer duration.
    Response:

    [DB] "You know Tamino uses his "eyecrometer" with his graphs on arctic ice loss."

    Note also that Tamino always performs background analyses to check for significance (which, sadly, he does not always show). But he always performs them anyway.

  44. 2010 - 2011: Earth's most extreme weather since 1816?
    Norman "... a list of extreme weather events of 2010 without at least investigating other years and long term climate patterns of given regions?" What you seem to be overlooking is that 'extreme weather events' include the 'long term climate patterns' of all the places mentioned, by definition. Everyone would probably accept that my local, record-breaking 1-in-3000-year heatwave event of a few years ago counts as hot by almost anyone's notion of hot weather. But more recently, Sydney had a record heatwave, with temperatures nothing like ours, but devastating nevertheless. (It's probably a fairly subjective judgment whether 40C+ with literally breath-taking <15% humidity feels worse than 32C at >70%. After a week, nobody really cares.) Heatwaves are defined as x degrees above local climate norms for the time of year. Excessive rainfall likewise. After all, monsoons are 'normal' in some places. Here normal monsoon rainfall would be unbelievable. In Adelaide at any time of the year, an overnight minimum of -1C would set a record - Canadians would laugh themselves sick at that idea. Queensland flooded in the Gulf country around the same time as the devastating floods in SE Queensland. Only the SE floods count as extreme. The floods in the north might have been a bit on the high side of normal (I don't even know) - but they don't count as extreme because flooding is normal for that area in that season. Extreme events are anomalous - by definition. If you want to check meteorologists' reports for those areas for 100+ years, you're setting yourself a large, and largely futile, task.
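    To make the "anomalous relative to local norms" point concrete, here is a minimal Python sketch of how heatwave days might be flagged against a local day-of-year climatology. It assumes a pandas Series of daily maximum temperatures indexed by date; the baseline period, the 5-degree excess and the 3-day run length are illustrative assumptions, not any agency's official definition.

        import pandas as pd

        def flag_heatwave_days(tmax, baseline=("1961", "1990"), excess=5.0, min_run=3):
            """Flag days sitting in a run of >= min_run days that are more than
            `excess` degrees above the local day-of-year climatological norm."""
            # Local norm: mean Tmax for each day of year over the baseline period
            clim = (tmax.loc[baseline[0]:baseline[1]]
                        .groupby(lambda d: d.dayofyear).mean())
            # Departure from what is normal for that place at that time of year
            anomaly = tmax - tmax.index.dayofyear.map(clim).values
            hot = anomaly > excess
            # Length of the consecutive spell each day belongs to
            runs = hot.groupby((hot != hot.shift()).cumsum()).transform("size")
            return hot & (runs >= min_run)

    On a definition like this the same absolute temperature can be extreme in one place and unremarkable in another, which is exactly the point being made above.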
  45. 2010 - 2011: Earth's most extreme weather since 1816?
    Norman @184, the information you are after can be found in Loew and Wirtz, "Structure and needs of global loss databases of natural disasters". Loew and Wirtz describe the classification system used by Munich Re. They categorise by disaster type; by the quality of the source information, with events documented from low-quality sources not included in the database for analysis but retained in case of confirmation; and by the "disaster category". The disaster category is a scale from 1 to 6, with 1 being 1 to 9 deaths and/or minor damage; 4 being a Major Disaster, more or less as defined by Neumayer and Barthel, but with stepped scaling for property damage; and 6 being a Great Natural Disaster, ie, a Great Natural Catastrophe as defined by the UN. More information can be found in this Munich Re document, which includes a category 0 for no fatalities/losses, ie, a natural hazard. I presume category 0 events are retained on record but not included in the database.
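    For what it is worth, the stepped scale described above can be sketched in Python as follows. Only categories 0, 1, 4 and 6 are characterised in the comment; the death threshold for the Major Disaster band and the handling of the intermediate bands are placeholders, not Munich Re's actual criteria (which also weigh property damage on a stepped scale).

        def disaster_category(deaths, major_damage=False, great_catastrophe=False):
            """Rough Munich Re-style category (0-6); thresholds are placeholders."""
            if great_catastrophe:
                return 6   # Great Natural Disaster / UN "Great Natural Catastrophe"
            if major_damage or deaths >= 100:
                return 4   # "Major Disaster" band (threshold here is a placeholder)
            if deaths == 0:
                return 0   # natural hazard: no fatalities or losses
            if deaths <= 9:
                return 1   # 1-9 deaths and/or minor damage
            return 2       # intermediate bands 2-3, criteria not given above

    The point of the sketch is simply that per-category counts depend entirely on where these thresholds sit, which is why a clear definition matters when reading the disaster graphs.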
  46. Bibliovermis at 16:25 PM on 1 July 2011
    The greenhouse effect is real: here's why
    HenryP (#49), From your site: if an increase in green house gases is to blame for the warming, it should be minimum temperatures (that occur during the night) that must show the increase (of modern warming). In that case, the observed trend should be that minimum temperatures should be rising faster than maxima and mean temperatures. That is what would prove a causal link. --- From this site: post-1970 results show monthly-minimum temperatures rising faster than monthly maximum temperatures. These results are entirely consistent with greater CO2 forcing later in the century than earlier. --- A Quick and Dirty Analysis of GHCN Surface Temperature Data (paragraph above Figure 3)
  47. 2010 - 2011: Earth's most extreme weather since 1816?
    Stevo @ 183 "Norman @178 The trouble with the eyecrometer is that it is entirely subjective and prone to cherry picking." Is that really the case? Here is a graph of all the major temperature records of the globe: Major global temp graphs combined. Would your "eyecrometer", looking at this graph, not conclude the globe is warming? For large and obvious trends statistical analysis is not necessary; the eyecrometer does fine. How could one subjectively claim that the graph does not indicate global warming? If you look at many of my posts, I am attempting to find peer-reviewed articles, or at least ones with in-depth explanations of large regions; they are sometimes difficult to find. Why would this qualify as cherry-picking? Cherry picking: "Choosing to make selective choices among competing evidence, so as to emphasize those results that support a given position, while ignoring or dismissing any findings that do not support it, is a practice known as "cherry picking" and is a hallmark of poor science or pseudo-science." – Richard Somerville, Testimony before the U.S. House of Representatives Committee on Energy and Commerce Subcommittee on Energy and Power, March 8, 2011 [1] "Good science looks at all the evidence (rather than cherry picking only favorable evidence), controls for variables so we can identify what is actually working, uses blinded observations so as to minimize the effects of bias, and uses internally consistent logic." I have stated what my goals are with my many posts. Would it not be cherry-picking to accept a list of extreme weather events of 2010 without at least investigating other years and the long-term climate patterns of the given regions? Jeff Masters did good research work to compile his list of extreme weather events for 2010. Now it is my research work to see if the extreme weather events of 2010 are really so extreme in a historical context. If you agree that you can see global warming in the temperature graphs using your eyecrometer, then why is it subjective to look for obvious trends in other data? Lastly, I am still in the process of figuring out the Munich Re disaster list. I need a clear definition of a "disaster" as counted in the graphs.
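    On the eyecrometer point: the background check the moderator refers to above is, at its simplest, an ordinary least-squares trend with a significance test. A minimal Python sketch, using a synthetic series purely for illustration (the numbers are invented, not real data):

        import numpy as np
        from scipy.stats import linregress

        rng = np.random.default_rng(0)
        years = np.arange(1975, 2011)
        # Fake annual temperature anomalies: a small imposed trend plus noise
        temps = 0.017 * (years - 1975) + rng.normal(0.0, 0.1, years.size)

        fit = linregress(years, temps)
        print(f"trend = {10 * fit.slope:.3f} C/decade, p-value = {fit.pvalue:.4f}")

    A large, obvious trend will pass such a test easily, so the eyecrometer and the statistics agree; the test earns its keep on short or noisy records, and a fuller analysis would also allow for autocorrelation, which plain OLS ignores.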
  48. Climate half-truths turn out to be whole lies
    Nicely done, John Cook. Carter should be ashamed.
  49. MoreCarbonOK at 16:04 PM on 1 July 2011
    The greenhouse effect is real: here's why
    Well, you clever guys. If I were you I would still have a peek at my new and updated pool tables. http://www.letterdash.com/HenryP/henrys-pool-table-on-global-warming No junk science there. No hypothesis. Actual results from actual measurements. Quite interesting results, too. I have split up the NH and the SH. Any ideas on the differences between NH and SH?
    Moderator Response: [Dikran Marsupial] IIRC, it has already been pointed out to you that the difference in NH and SH trends is probably due to the unequal distribution of land mass between the two hemispheres. The problems with your pool table have also been pointed out to you, repeatedly. Please do not repeat that discussion here; if you want to discuss your pool table, please read the excellent series of posts by Glen Tamblyn (starting here), and if you still want to argue that your method is best, then first you need to explain why the methods used by the climatologists, who have looked into the issues in great detail, are deficient. That discussion should be posted on one of the articles in Glen's series, not here.
  50. 2010 - 2011: Earth's most extreme weather since 1816?
    Rob Honeycutt @179 I read your train analogy. It is similar to the one posted by Sphaerica @ 88 about falling out of the window of a tall building and assuming everything is all right until you hit the ground. I did post a response to Sphaerica @ 96. If you read my post, it will demonstrate that I do indeed want to move away from fossil fuels and find replacements that can maintain a high standard of living not only for industrial nations but for all the people on Earth. Fossil fuels (even without the CO2 byproduct) could not meet the need of raising the standard of living for all people. Your quote: "Norman @ 178... Your comment here contains a common theme that I see in those who wish to dismiss climate change as man made and a serious issue." I do not believe any of my posts suggested that climate change is not man-made, nor is that a position I advocate. Man is releasing a large amount of CO2 while destroying some carbon sinks; that is factual information. CO2 has been empirically demonstrated to redirect IR in its absorption bands (measurements of downwelling longwave radiation, and a lower amount of IR in these bands being emitted to space as measured by orbiting satellites). It could even be a serious issue in the future. My posts are a basic challenge to the linking of extreme weather events to global warming without providing the actual mechanisms that would verify this conclusion. An extract from Jeff Masters, quoted in Tom Curtis' post @181: "But the ever-increasing amounts of heat-trapping gases humans are emitting into the air puts tremendous pressure on the climate system to shift to a new, radically different, warmer state, and the extreme weather of 2010 - 2011 suggests that the transition is already well underway." I am questioning whether 2010 was really that extreme. The only way to do this is to look at historical weather information, to have something to compare it to. I am questioning whether global warming is causing a noticeable change in climate or weather patterns. Climate defined here.
