


Comments 46401 to 46450:

  1. The Fool's Gold of Current Climate

    The Serendipity graphs above are also on the hopeful side, because they consider the atmosphere-ocean (A-O) CO2 exchange only. They do not consider the full earth-system response. So far, what is known about that response is that we can expect only positive feedbacks: methane release from clathrates and permafrost, decreased albedo from melting Arctic ice, and warmer oceans degassing CO2 because warm water can hold less of it.

    The only problem is that the magnitudes of those feedbacks are unknown (maybe with the exception of ice albedo). I expect those figures (abstract, and outdated already; we need to update the starting level to 400 ppm) to become more pessimistic (more warming in the pipeline) once those positive feedbacks are quantifiable.

  2. Making Sense of Sensitivity … and Keeping It in Perspective

    Just remember that the formula is post hoc. You get sensitivity out of a climate model run by solving for k from deltaT/deltaF, which gives you a useful way to estimate temperature for a given forcing. However, GCMs do not derive temperature from that formula internally.
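    That post-hoc bookkeeping can be sketched in a few lines (all numbers here are illustrative assumptions, not output from any actual GCM run):

    ```python
    # Hypothetical end-of-run numbers from a model experiment (illustrative only).
    delta_T = 3.0   # K: simulated temperature change at the end of the run
    delta_F = 3.7   # W/m^2: forcing applied (roughly a doubling of CO2)

    # Sensitivity diagnosed after the fact -- the model never uses this internally.
    k = delta_T / delta_F   # ~0.81 K per (W/m^2)

    # The diagnosed k can then estimate the response to some other forcing,
    # e.g. a solar forcing of 1 W/m^2:
    estimate = k * 1.0
    print(round(estimate, 2))  # prints 0.81
    ```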

  3. Bob Lacatena at 14:28 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    engineer --

    Spencer Weart has this reference to the first such calculation in 1967 by Manabe and Wetherald.

    You might want to look over this timeline.

    I'd also very strongly suggest reading Spencer Weart's The Discovery of Global Warming.  It's interesting reading, and it adds a lot of depth to both an understanding of the science and how old and broadly based climate science is.

  4. Bob Lacatena at 14:22 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    [engineer -- Your other post just went onto the next page.]

  5. Bob Lacatena at 14:21 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    ∆T = k log2(CO2final/CO2initial)

    Where k is the climate sensitivity in degrees C per doubling of CO2.
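    A minimal numeric sketch of that formula (the value k = 3 is only an assumed illustrative figure here, within the commonly cited range):

    ```python
    import math

    def delta_t(k, co2_initial, co2_final):
        """Warming implied by the logarithmic formula: dT = k * log2(Cf / Ci)."""
        return k * math.log2(co2_final / co2_initial)

    k = 3.0  # assumed sensitivity, degrees C per doubling of CO2
    print(delta_t(k, 280, 560))  # one full doubling: exactly k, i.e. 3.0 C
    print(delta_t(k, 280, 400))  # pre-industrial to ~400 ppm: about 1.54 C
    ```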

    I myself have never found the derivation for that, either. We at SkS should probably make a concerted effort to find it, as it would be well worth looking at and referencing.

    It may have arisen primarily from experimental observations, or else through "experimentation" using the MODTRAN line-by-line radiative transfer computations (developed by the US Air Force, one of the pioneers in this stuff, due to their interest in making infrared missiles work properly in the atmosphere).  If it was determined through physical principles, it would need to take into account the varying density of the atmosphere (with altitude), as well as the resulting variations in IR absorption and emission as balanced against the number of collisions per second with non-GHG molecules like O2 and N2 (and of course the number of collisions is affected by both density and temperature, i.e. the average velocity of each molecule).  Then there are other complications such as bandwidth overlaps with other greenhouse gases (like H2O), and broadening of the absorption spectrum (pressure broadening and doppler broadening).

    All in all, it's pretty complicated.

    I'll ask and see what people can turn up.

  6. engineer8516 at 14:17 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    I read through the link. The formula I was referring to was dT = climate sensitivity * dF.

    Hopefully this isn't a double post. I'm not sure what happened to my other one.

  7. Bob Lacatena at 14:12 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    engineer,

    You might also want to look at this page, courtesy of Barton Paul Levenson.  I don't think it's been updated since 2007, so it lacks a good 5 years worth of further research, but it gives you some idea of the breadth of the work that's been done in the area, and how much the end results give basically the same answer.

    [Be wary of any study that gives too high or too low a climate sensitivity.  Like anything else, the outcome depends on underlying assumptions, and not all papers that are published withstand scrutiny forever.  In fact many are quickly refuted.  Peer review is only the first hurdle.  A good example is Schmittner et al. (2011), which found a lower climate sensitivity than many, but also assumed a lower temperature change from the last glacial to the current interglacial -- a lower temperature change obviously will yield a lower sensitivity, so the question shifts more towards establishing the actual temperature change in order to arrive at the correct sensitivity... as well as recognizing that this was the sensitivity of the planet exiting a glacial phase.]

  8. engineer8516 at 14:10 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    I looked at the link. I was referring to the formula dT = climate sensitivity * dF.

  9. Bob Lacatena at 14:05 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    It's not derived through a formula -- that would be like having a single derived formula that computes the expected age of a species of animal, based on the animal's biology.  It's just too complex for that. 

    The link I already gave you ("many methods of estimating climate sensitivity") gives some (not all -- in particular, that link skips over modeling, which is a very important and valuable technique) of the methods of computing sensitivity.  To really understand it you'd need to find copies of and read many of the actual papers.

    Another approach is to use the search box up top, and search for "climate sensitivity".

    The best thing you can do with climate sensitivity is to learn a lot about it.  Make it a multidimensional thing that you understand from many angles.

  10. engineer8516 at 13:32 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    Thanks for the replies, @sphaerica. I wasn't trying to insult climate scientists. I was trying to figure out the basis for the assumption, and I wasn't implying that it was arbitrary.

    Also, do you guys know of any good links that go into the details of the derivation of climate sensitivity? Not how the value is estimated, but the derivation of the formula. I couldn't find any good sites on Google. Thanks again.

  11. The two epochs of Marcott and the Wheelchair

    Forgive me if I haven't been following this closely enough, but surely no spike in global temperatures is possible unless there are spikes of roughly the same magnitude in individual proxies. So one question: are there any spikes of centennial scale, with the size of modern warming, evident in individual proxies (like ice cores) that have sub-century age resolution? If there are none, then it hardly matters about the niceties of the multi-proxy processing, surely?

  12. Bob Lacatena at 12:58 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    engineer,

    One last thing.  You said:

    ...we're assuming that climate sensitivity behaves nicely...

    No, we're not.  Scientists aren't stupid, and they don't work from arbitrary assumptions.  There are reasons for believing the climate sensitivity behaves a certain way, based on physics, past climate and present observations.  It's not just some assumption that has been wantonly adopted because it makes life easier.  Nobody in any field or profession gets to do things that way.  Why would climate scientists?

  13. Bob Lacatena at 12:55 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    engineer,

    That's a good question, with a complex answer.

    It is absolutely true that climate sensitivity is not and would not be exactly constant. Climate sensitivity is a result of a wide variety of feedbacks which individually have different impacts.

    There are fast feedbacks, which are physical mechanisms that are somewhat predictable through physical modeling (for instance, the fact that warmer air will hold more moisture, thus adding the greenhouse gas effect of H2O to the air).

    There are also slow feedbacks that depend on physical initial conditions.  The ice sheets, for example, during a glacial period contribute a lot to keeping the planet cool by reflecting large amounts of incoming sunlight.  When temperatures warm and the ice sheets retreat, that results in a positive feedback.  If you imagine the ice sheets spread over the globe, it is easy to see that those ice sheets are larger when they extend further south.  As the ice sheets retreat, they get smaller and smaller, and each further latitude of melt reduces them by less, so the feedback is not continuously proportional.

    CO2 as a feedback instead of a forcing is also a diminishing feedback.  As you add more and more CO2 to the atmosphere, the additional CO2 has less and less of an effect, so you need even more CO2 to accomplish the same amount of warming.

    So for any particular feedback, the initial climate state is important.

    But there are a lot of different, intermixed feedbacks.  CO2 and CH4 can be added to the atmosphere due to major ecosystem changes (forest die-offs or permafrost melt).  There is the melting of ice sheets.  Ocean circulation and stratification patterns can change.  The list goes on.

    As a result, given all of the varying feedbacks with varying effects under different conditions... it all averages out.

    There are many methods of estimating climate sensitivity.  Some look at past climates, to see what has happened before.  Some work with models that try to emulate the physical mechanisms.  Some directly observe how the climate changes in the very short term due to known and measured forcing changes.

    The thing is that all of these methods produce varying results in a broad range, but most converge on the more narrow range of 2 to 4.5C, and most converge on the same value of about 3C.  Taken individually, nothing is exactly the same as the current climate, but since most studies, past and present, seem to fall into the same range, it suggests that there is validity to the broad assumption (Occam's Razor) that the climate generally behaves in about the same way.

  14. Making Sense of Sensitivity … and Keeping It in Perspective

    Engineer - that was done more or less by Broecker for his remarkably accurate 1975 prediction, but that is not how any modern climate model works. Instead, climate is emergent from the interaction of forcings with the various equations in the model. If you want to know what the climate sensitivity of a model is, then you work backwards from the temperature at the end point as calculated by the model, compared to the CO2 forcing. You can run the model with various forcings to see what the sensitivity to, say, a solar forcing of the same magnitude is (see for instance the ModelE results). Over a very big temperature range, there would be good reason to suppose sensitivity would change. E.g., when all ice is melted from both poles, the only albedo feedback would be weak ones from land-cover change. Preserve us from having to worry about that for the next 100 years.

  15. The two epochs of Marcott and the Wheelchair

    Tom Curtis - To clarify my earlier comments regarding Monte Carlo analysis: 

    Given a set of proxies with date uncertainties, if there is a large (50 sigma?) signal involved, even if the initial dates are incorrect, at least some of the realizations in the perturbation space will accurately reflect that spike, being shifted to reinforce each other. And some will show more than that spike, due to unrelated variations being date-shifted into the same time period. The number, and the density of the Monte Carlo probability function, will depend on the size of the various date errors - but there will be a density function including the spike and more in the full realization space. This is very important - if an alignment that shows such a spike is possible under the uncertainty ranges, it is part of the Monte Carlo realization space.

    1000 runs is going to sample that space very thoroughly. And nowhere in the 1000 Marcott runs pre-19th century do you see a density function excursion of this nature. 

    [ Side note - pruning the Monte Carlo distribution would be completely inappropriate - the entire set of realizations contributes to the density function pointing to the maximum-likelihood mean, and pruning induces error. Even extreme variations are part of the probability density function (PDF). The mean in the Marcott case is clearly plotted; "Just what were the mean values for the thousand years before and after that spike in that realization? Clearly you cannot say as its hidden in the spaghetti." is thereby answered. ]

    With very high probability, the real temperature history does not follow the mean of the Monte Carlo reconstructions...

    I disagree entirely. That is indeed the maximum-likelihood history, based upon the date and temperature uncertainties. And the more data you have, the closer the Monte Carlo reconstruction will be to actuality. Exactly? No, of course not; reconstructions are never exact. But quite close. Monte Carlo (MC) reconstructions are an excellent and well-tested method of establishing the PDF of otherwise ill-formed or non-analytic functions.

    It would certainly be possible for somebody to go through and make a maximum-variability alignment of individual proxies, as constrained by the temporal uncertainties. If that were done, and no convincing, global, Tamino-style spikes existed in that record, that would be convincing evidence that no such spikes existed.

    That is exactly what an MC exploration of the date and temperature uncertainties performs. No such excursions are seen in any of the 1000 realizations. Meanwhile, the rather smaller +/- excursions over the last 1000 years are quite visible, as is the 8.2 Kya event of 2-4 centuries (and yes, it's visible as a ~0.1-0.15 C drop in Marcott). I believe it is clear that the Marcott data would show a larger, 0.9 C, global two-century spike event if it existed - and none shows.
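    As a toy illustration of why independent date errors smear a short common spike in a stacked reconstruction (entirely synthetic numbers and a deliberately simplified setup; this is not the Marcott procedure itself, just the general idea under discussion):

    ```python
    import random

    def spike_survival(n_proxies=20, n_realizations=1000, date_sigma=150,
                       spike_amp=0.9, spike_halfwidth=100):
        """Toy Monte Carlo: a 0.9 C, two-century spike common to all proxies,
        with each proxy carrying an independent Gaussian age error. For each
        realization, compute the stacked (averaged) anomaly at the spike's
        true centre. Because the date errors are independent, the stack
        rarely recovers the spike's full amplitude."""
        stack_values = []
        for _ in range(n_realizations):
            total = 0.0
            for _p in range(n_proxies):
                err = random.gauss(0, date_sigma)
                # This proxy still samples the spike centre only if its date
                # error is smaller than the spike's half-width.
                if abs(err) < spike_halfwidth:
                    total += spike_amp
            stack_values.append(total / n_proxies)
        return stack_values

    random.seed(0)  # fixed seed so the sketch is reproducible
    vals = spike_survival()
    print(max(vals))  # less than the true 0.9 C amplitude
    ```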

  16. Glenn Tamblyn at 12:34 PM on 6 April 2013
    Food Security - What Security?

    Agnostic, you mention one impact of warming in the summary but don't expand on it any further; warming of the oceans. I have never actually seen an assessment of the negative impact on biological productivity in the oceans of increased water temperatures. Have you any references on that?

  17. engineer8516 at 12:26 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    I have some questions regarding climate sensitivity.

    Basically, temp data and changes in forcing are used to calculate climate sensitivity, which is then used to calculate the rise in temp that would result from the 3.7 W/m^2 of forcing from doubled CO2.

    My problem with this calculation is that we're assuming that climate sensitivity behaves nicely (ie. is almost constant). My question is how do we know that this a reasonably valid assumption? Thanks.

  18. Food Security - What Security?

    Villabolo @ 4:

    Da Google sent me to this link...

    http://www.skepticalscience.com/North-Carolina-Lawmakers-Turning-a-Blind-Eye-to-Sea-Level-Reality.html

  19. Food Security - What Security?

    Concerning food storage, it is possible to store wheat and other grains for 10 to 30 years depending on the method.

    One method is conventional cans (about gallon size and larger) in an oxygen-free environment - nitrogen is used to replace the oxygen. So long as it's stored out of the heat, at room temperature, it will last for 20-30 years. Brown rice will only keep for 10 years.

    Abrupt changes in climate will give us a 'feast or famine' cycle where we'll be able to grow adequate amounts some years but suffer extensive loss on other years. The only way to alleviate that is to store large quantities of grains during the 'good' years to make up for the bad years.

    Eliminating biofuel production will also give us a large safety margin.

    Going vegetarian will help even more but it's not going to happen voluntarily.

  20. The two epochs of Marcott and the Wheelchair

    Sphaerica @48, I assume you do not think I am following the steps to denial.

    The fact is that some people, most notably Tamino, who at least has made a check of the claim, are claiming that the Marcott et al. reconstruction excludes Tamino-style spikes.  That is more than is claimed by the authors of the paper.  Before we can accept that Marcott et al. does exclude such spikes, we need to check the evidence.  So far the evidence is tantalizing but far from conclusive.  Ergo we should not use that claim as evidence in rebutting AGW skepticism.  When "skeptics" get to step two, we need only point out (truthfully) that they are over-interpreting Marcott et al.'s remark about frequency, and that they have not shown that such a spike could exist and not show up in the Marcott et al. reconstruction.  In short, they have no evidence for their claim at step two.  Although many AGW "skeptics" like to treat an absence of refutation of their position as proof their position is true, we need not accept such irrationality, and cannot reach those who continue to accept it once it is pointed out to them.

    However, this discussion is indeed a distraction and absent a significantly new argument or evidence that changes my opinion, I shall withdraw from further discussion of this point.

  21. The two epochs of Marcott and the Wheelchair

    KR @43:

    "Hence I would consider that a single realization at that point represents a value that is too high for the data, an outlier, that consists of perturbations that happen to overlay variations in a reinforcing fashion at that point."

    I think you are missing the point of the Monte Carlo reconstructions.  We know that there are errors in the temperature reconstruction and dating of the proxies.  That means at some points the unperturbed reconstruction will be too low, or too high, relative to reality.  At those points, particular realizations in the Monte Carlo reconstructions which are low or high (respectively) are closer to reality.  Not knowing the precise course of reality, we cannot know whether or not a particular spike in a particular realization, whether high or low, maps reality closely or not.

    One way you could approach the general problem would be to do 20,000 Monte Carlo realizations and to prune away all realizations with an overall probability, relative to the original data and errors, of less than 5%.  That will leave you with approximately 1000 realizations, all of which could plausibly be the real history of temperature over the Holocene.  You could then examine those thousand plausible realizations to see if any contained Tamino-style spikes.  If not, then the data excludes such spikes.  If they do, it does not.

    As it happens, Marcott et al. did not prune their realizations based on global probability. Consequently about 50 of their realizations are not plausible temperature reconstructions.  Importantly, though, even plausible reconstructions will contain short intervals that are implausible given the data.  Indeed, about 5% of their length, on average, will be implausible based on the data.  Therefore we cannot look at a single peak and conclude, from the fact that the realization is clearly an outlier in that time period, that the realization is an outlier over the entire Holocene.  You need the entire realization to come to that conclusion, and due to the nature of 1000-realization spaghetti graphs, we do not have the entire history of any realization.

    So, for all we know from the spaghetti graph, there are plausible realizations containing Tamino style spikes within  it.  In fact, what we can conclude from the Marcott et al realizations is that:

    1) With very high probability, the real temperature history does not follow the mean of the Monte Carlo reconstructions;

    2) With very high probability, the 300 year mean of the real temperature history lies within 0.27 C of the mean approximately 95% of the time; but that

    3) With high probability the 300 year mean of the real temperature history is further than 0.27C from the mean about 5% of the time.

    The question is whether the approximately 275 years over which we expect the 300 year mean of the real temperature to be more than 0.27 C above the mean are the result of sustained multicentury warmth, or whether they could be from a Tamino-style spike.  (Note: a Tamino-style spike has a 300 year mean temperature increase of 0.3 C.)  On the information to date, the latter is improbable, but not excluded.

    "But this is all really a side-show. We have proxies down to decadal and near-annual resolution (ice cores and speleotherms, in particular), and none of them show a global 'spike' signal of this nature. The only reason the question was raised in the first place, IMO and as noted here, is as obfuscation regarding global warming. Current warming is unprecedented in the Holocene, and it is due to our actions - it's not a 'natural cycle'."

    Well, yes, they don't show global anything, because they are regional proxies.  I assume you mean stacks of such proxies.  Well, consider the only two high resolution (20 year) proxies used by Marcott et al. over the period 8 to 9 Kya:

    In particular, consider the spike of 3 C at 8.39 Kya in the Dome C ice core, which is matched by a 1.5 C spike in the Agassiz-Renland ice core.  That looks suspiciously like a Tamino-style spike.  Or consider the 3 C trough at 8.15 Kya in the Agassiz-Renland ice core and the 2 C trough at 8.23 Kya in Dome C.  Although the resolution of these ice cores is actually annual, at ages beyond 8 Kya the 1 sigma error of the age estimate relative to 1950 is greater than 150 years.  Ergo it is quite possible those two troughs should align, creating a Tamino-style trough.

    These two possible candidates do not show up in the mean of the Marcott et al reconstruction.

    It would certainly be possible for somebody to go through and make a maximum-variability alignment of individual proxies, as constrained by the temporal uncertainties.  If that were done, and no convincing, global, Tamino-style spikes existed in that record, that would be convincing evidence that no such spikes existed.  But until it is done, there is sufficient variability in individual regional proxies that we cannot claim that the individual proxies exclude the possibility.

  22. Food Security - What Security?

    @ Matt #1

    "Remember North Carolina's 2012 bill that would have required people to only use past trends to predict sea level rise?"

    Can you provide me a reference to that bill?

     

  23. Daniel Bailey at 09:20 AM on 6 April 2013
    The Fool's Gold of Current Climate

    It is the mark of wisdom to also consider the Temperature Change portion of Andy's graphic above.  And note that this is the result of a complete cessation of human-sourced emissions (brought to zero and held there for 300 years).

    Questions, anyone?  Bueller?

  24. The Fool's Gold of Current Climate

    william: What comes out of his talk is that if we stop putting carbon dioxide into the atmosphere, it could reduce in the atmosphere rather rapidly.

    Actually, if we had stopped emissions dead, in 2010, the atmospheric concentration of CO2 would have declined to about 340 ppm by 2300. For comparison, that's the amount it was in 1980. The CO2 will reduce, over 190 years, at approximately one-sixth of the rate that we are currently increasing it. That's the most drastic case imaginable. Graph below is from Serendipity.

  25. Food Security - What Security?

    PS: Lester Brown has *lots* of raw data for his book 
    "Full Planet, Empty Plates; The new geopolitics of food scarcity"
    on his website (as excel files): http://www.earth-policy.org/data_center 

  26. Food Security - What Security?

    Not everybody thinks we will have 10 billion humans: Jorgen Randers (Club of Rome) predicts a peak of 8 billion due to lower fertility in cities; see the short introduction to his report to the Club of Rome, "2052: A Global Forecast for the Next 40 Years", picking up after 40 years of "Limits to Growth": http://www.clubofrome.org/?p=703 . Randers says that this lower-than-generally-assumed population will cause lower growth than expected, and will push the more catastrophic climate change effects back to the second half of the century (if nothing is changed, which is what he explicitly assumes after 40 years of environmental activity with limited success ...).

    It is also interesting to view the three videos given at the Smithsonian Institution for the 40th anniversary of "Limits to Growth", with each of the three speakers (Meadows, Randers, Brown) giving a different priority to the three dangers from "Limits to Growth": resource scarcity (oil, ...), pollution (climate change), and population (food, water, ...).

    Dennis Meadows (Oil; Resource Scarcity):
    http://www.youtube.com/watch?v=f2oyU0RusiA
    Jorgen Randers (Climate Change; Pollution):
    http://www.youtube.com/watch?v=ILrPmT6NP4I
    Lester Brown (Food+Water; see his book http://www.earth-policy.org):
    http://www.youtube.com/watch?v=KPfUqEj5mok

  27. Matt Fitzpatrick at 07:26 AM on 6 April 2013
    Food Security - What Security?

    The "10 billion by 2065"(pdf) projection appears to have been made by the U.S. Census Bureau under the Bush administration in 2004, based on 2002 data, in a report with zero mentions of climate change and zero references to climate change publications. This leads to eyebrow-raising predictions, like Chad being among the fastest growing nations on Earth through 2050, more than tripling its population, even as Lake Chad shrinks to a record minimum in the west and desertification creeps into the east.

    Remember North Carolina's 2012 bill that would have required people to only use past trends to predict sea level rise? This Census Bureau report reminded me a lot of that.

    So it'll be interesting to see what effects climate change predictions will have on the Census Bureau's next world population projection revision - assuming it doesn't ignore climate change this time. Surely projected birth and mortality rates should change, at least on a regional basis. Combined with migration triggered by climate change, I'd expect the distribution of population growth to be on a different track, and perhaps even the total population curve.

  28. The two epochs of Marcott and the Wheelchair

    Thanks KR, I will have a look at them.

  29. The two epochs of Marcott and the Wheelchair

    Dissident - NOAA has a great number of climate reconstructions here, most with the data tables. Others can be found with a simple Google Scholar search, such as on "Holocene temperature reconstruction". 

    Available Holocene reconstructions include those based on ice cores, pollen deposition, speleothems, alkenones, benthic foraminifera, corals, and so on. 

  30. grindupBaker at 06:21 AM on 6 April 2013
    The Fool's Gold of Current Climate

    @Michael Whittemore (20): "...a little warming from induced CO2 could be a good thing, which Dana also seems to suggest" - so there's the obvious corollary that humans this century using every last drop of coal and hard-to-recover unconventionals is a tad selfish, now that we know future humans might have used it to mitigate a glacial period when it started, rather than having it dissolved uselessly in the oceans after a long hot spell.

  31. The Fool's Gold of Current Climate

    It is far too flippant to call Riley's comments greenwash.  If one can take his points as being valid, and I see no reason why he would lie about them, this is the other side of fossil fuels.  We have increased growth of plant life due to increased carbon dioxide in the atmosphere, and a reduced use of resources from nature which were to the detriment of the natural environment.  It is still probably a valid point that if we continue to put ever-increasing amounts of carbon dioxide into the atmosphere, we will very likely cause a climate flip, and that would likely be disastrous.  What comes out of his talk is that if we stop putting carbon dioxide into the atmosphere, it could decline in the atmosphere rather rapidly.  The other point is that the more we switch to wind, solar, and other energy sources such as wave and tidal currents, the better off we will be all around.  We won't be testing the theory of sudden climate change with gay abandon, and we won't be putting a strain on the woodpeckers.  I think we should take his comments seriously, examine them as scientists should, and combine them into a bigger picture.

  32. The two epochs of Marcott and the Wheelchair

    KR @ comment 43 gave the best debunking of these 'magic spikes' in temperature: the fact that there are other proxies, from ice cores, which show that (in polar regions) there have been no such variations - if there were, where are the peer-reviewed papers demonstrating them? Or would a 'magic spike' occur everywhere else except the polar regions? That, from my relatively limited yet growing understanding of the Earth's climate, would require the total decoupling of the polar regions from the rest of the Earth's climate systems (even down to a physical barrier reaching to the mesosphere, since those proxies are based upon the isotope ratios of oxygen, carbon dioxide, etc.). It would be nice to see a comparison between them. Are there any that can be cross-linked to?

  33. Bob Lacatena at 04:16 AM on 6 April 2013
    The two epochs of Marcott and the Wheelchair

    Steps to Denial:

    1. Marcott does not prove that there could not have been previous spikes equivalent to current warming
    2. We don't know enough about the climate system, so there could have been some magical climate force that did cause a previous spike, just like today's
    3. CO2 is not causing the current warming, even though they can't find an alternative explanation (and this is presumably supported by the fact that magical, invisible past spikes in warming do, in fact, maybe-exist).
    4. CO2 would not cause any warming, even though our understanding of physics, the atmosphere, and evidence of past climates all show definitively that it would.

    So, the deniers deny:

    1. That Marcott demonstrated a lack of similar episodes
    2. That we know enough to recognize why such episodes are at best unlikely
    3. That CO2 is causing the current warming (it's due to mumble-mumble-mumble).
    4. That CO2 should cause any warming.

    For deniers to support their position, they must prove:

    1. That the Marcott paper is invalid.
    2. That at least one previous, similar spike exists
    3. That there is an alternative, physical cause for current warming
    4. That CO2 is not causing current warming
    5. That there is some reason why CO2 would not cause any warming
    6. That there is some mechanism that would bring temperatures back down as fast as they have risen

    And yet they seem to be having a rather hard time with proof #1.

    Does anyone else see how ludicrous this is?

  34. Philippe Chantreau at 04:10 AM on 6 April 2013
    The two epochs of Marcott and the Wheelchair

    Actually KR, the argument from the fake skeptic side seems rather to be that there could be spikes at any time that were due to yet unknown forcings such as leprechauns, ballybogs or grogokhs...

    These of course are powerful enough to make temps rise very fast. Then they are also of such nature as to suppress feedbacks so temperatures can also come down very fast when they decide to go to bed after a hard half century of work, thus leaving no trace whatsoever in the proxy records. Sha-zam...

    You can't make this stuff up...

  35. Rob Honeycutt at 04:01 AM on 6 April 2013
    The two epochs of Marcott and the Wheelchair

    KR...  They also lack any sort of mechanism that might actually cause such spikes.  

    I've only been watching this discussion in a passing manner but the arguments against Marcott seem to me to be something akin to fanatical navel gazing (i.e., rather meaningless).

  36. Daniel J. Andrews at 03:40 AM on 6 April 2013
    The Fool's Gold of Current Climate

    Andy Skuce already posted the article I was going to post. And CBDunkerson just highlighted the same point I wanted to highlight. And mandas gave the seasoned experienced wildlife biologist perspective, which is what I wanted to do. Then dhogaza trumped me with a couple of comments regarding Terranova's graph, and what Terranova will learn when he starts his new Masters.

    I'm feeling quite redundant and rather useless here---think I'll go comment on a "skeptic" site where I'm outnumbered but at least can post relevant scientific information before anyone else. :)

  37. The two epochs of Marcott and the Wheelchair

    Rob Honeycutt - "If there were such spikes, would that not be an indication of extremely high climate sensitivity?"

    Yes, that would. Which is one more reason I find the 'skeptic' postulation of such spikes very odd - extremely high sensitivity means AGW would be even worse than predicted. But hey - "A foolish consistency is the hobgoblin of little minds" - given the dozens of posts denigrating this paper on the 'skeptic' blogs, a wee bit 'o self-contradiction is apparently no impediment to trying to undermine a paper they don't like. One clearly and graphically showing that we've changed the climate...

  38. Rob Honeycutt at 03:27 AM on 6 April 2013
    The two epochs of Marcott and the Wheelchair

    Here's a question.  Sorry if someone else brought this up already.  So, part of the idea here is whether or not there could be spikes in global temperature over the course of the Holocene, somewhat proportionate to what we see in the 20th century, that do not show up in the Marcott graphs due to methodology.  Right?

    If there were such spikes, would that not be an indication of extremely high climate sensitivity?

  39. The two epochs of Marcott and the Wheelchair

    Tom Curtis - I see a single 0.45C spike at ~5.75 Kya, a few more at ~7 Kya; a Monte Carlo perturbation of proxies with an embedded spike of 0.45C would show (in the space of all possible realizations) a distribution of realizations with spikes at (nominal unperturbed realization), below (blurred by perturbations that reduced overlay) and above (where unrelated short term spikes get perturbed under the larger one) that value. 

    Hence I would consider that a single realization at that point represents a value that is too high for the data, an outlier, that consists of perturbations that happen to overlay variations in a reinforcing fashion at that point. 

    A more interesting period IMO is the range of ~1.8-0.2 Kya; showing a distribution of realizations that result in an approximately 0.15 C rise and a following drop. That is the kind of blurred pattern (although rather small) I would expect from previous Holocene reconstructions indicating a variation of ~0.3 C at that point - encompassing the MWP and LIA. 

    Holocene Temperature Variations (GlobalWarmingArt)

    Again: In perturbed Monte Carlo reconstructions the possible space of realizations must include original level excursions, plus realizations below (many of these due to blurring) and above (a few due to stacking, which in fact permits full level realization in the presence of date errors) - a distribution. Again, I do not see any possible century-level 0.9 C spikes in the Marcott realization set. 

    The only possible way for such a spike to be missed is if it wasn't in the data at all - and given the data (proxy sampling minima: 20 years, maxima: 530, median: 120) a two century excursion would have been sampled. 

    ---

    But this is all really a side-show. We have proxies down to decadal and near-annual resolution (ice cores and speleothems, in particular), and none of them show a global 'spike' signal of this nature. The only reason the question was raised in the first place, IMO and as noted here, is as obfuscation regarding global warming. Current warming is unprecedented in the Holocene, and it is due to our actions - it's not a 'natural cycle'. 

  40. The two epochs of Marcott and the Wheelchair

    KR @39, I can accept your first point.  However, consider the spike above the cloud at about 5.75 Kya on the 1000-realization spaghetti graph.  Just what were the mean values for the thousand years before and after that spike in that realization?  Clearly you cannot say, as it's hidden in the spaghetti.  So, for all we know it could be a spike of about 0.45 C that would be all that is shown of a 0.9 C spike, based on Tamino's analysis.  Indeed, given the attenuation of the spike simply from the low resolution of some proxies, if that spike is a fluctuation from the mean of the series, it reproduces Tamino's "unperturbed" example.

    That is the problem.

    Unfortunately, I still do not know why Tamino's examples produce greater variability than do Marcott et al's actual reconstruction; and not knowing that, I do not know that the difference which causes it would not also smooth away a brief, high amplitude spike.

  41. The two epochs of Marcott and the Wheelchair

    MrPete - The fact that our current temperature rise will take thousands of years to reset is not from the Marcott et al paper, but rather from basic physics and the atmospheric concentration lifetime of CO2 (Archer 2005, as but one resource). Even once the atmosphere and oceans equilibrate in CO2, it will take thousands of years for CaCO3 reactions and silicate weathering to draw down the excess. 

    The only circular arguments being made in regards to this paper are those claiming that short-term spikes could have been missed by the Marcott analysis (despite there being no physical basis for such up/down spikes, nor evidence for them, and despite high-resolution proxy evidence against such spikes), and that therefore current warming is natural and nothing to worry about. That's entirely circular, unsupported, and nonsensical. 

  42. The two epochs of Marcott and the Wheelchair

    CBDunkerson, you wrote:

    Rather, temperatures will continue their precipitous rise until we get greenhouse gas emissions under control and then they will stay at that high temperature, decreasing only very slowly, for thousands of years.

    This statement is not so easily proven. The fact that it is not proven is why so much is being invested to discover the answer. We certainly can't make that assumption in a paper that's supposed to help us understand whether that statement is true or not... that would be presuming the conclusion, i.e. circular logic.

  43. The two epochs of Marcott and the Wheelchair

    Tom Curtis - Tamino added the spike to a single instance of the dates (best estimate, I expect), then the Monte Carlo procedure perturbed those dates causing the smearing he saw in his analysis. If the time of the spike in each proxy did not change then neither would the spikes change from the shape initially introduced - the +/- uncertainties in the data would average out entirely. He did not add the spike after perturbations. 

    One very important point about the Marcott et al discussion on resolution is that they calculated that resolution from a frequency point of view - evaluating power spectra at various frequencies. 

    The gain function is near 1 above ~2000-year periods, suggesting that multi-millennial variability in the Holocene stack may be almost fully recorded. Below ~300-year periods, in contrast, the gain is near-zero, implying that proxy record uncertainties completely remove centennial variability in the stack. Between these two periods, the gain function exhibits a steady ramp and crosses 0.5 at a period of ~1000 years.

    [Marcott et al 2013 supplemental]

    This does mean that a frequency-limited signal such as a 300-yr sinusoidal variation would be removed entirely. However, a spike of 200 years duration (0.9 C in the Tamino analysis) contains many frequencies - from the 0 frequency average value added to the full signal by the spike, down to the ~decadal sharp edge transitions that are completely lost. Their Monte Carlo analysis is in effect a low-pass filter, which would not have removed that large a spike - just blurred it. In that regard I feel the authors are under-rating their own procedures; a spike like modern warming, even if reversed by non-physical gremlins, would still show in their final data. 
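    This frequency argument can be checked with a toy numerical sketch. Everything below is illustrative (the sampling step, amplitudes, and ideal cutoff are my assumptions, and this is not the Marcott et al procedure): an ideal low-pass filter that deletes all periods shorter than 300 years removes a sub-300-year sinusoid entirely, but leaves a reduced, blurred bump from a 200-year-wide, 0.9 C triangular spike, because much of the spike's energy sits at longer periods.

    ```python
    import numpy as np

    # Illustrative sketch only (assumed numbers, not the Marcott et al code):
    # apply an ideal low-pass that deletes all periods shorter than 300 years
    # to (a) a 250-year sinusoid and (b) a 200-year-wide, 0.9 C triangular spike.
    dt = 10.0                                  # years per sample
    t = np.arange(0, 20000, dt)
    sinusoid = 0.9 * np.sin(2 * np.pi * t / 250)
    spike = 0.9 * np.clip(1 - np.abs(t - 10000) / 100, 0, 1)

    def lowpass(x, min_period=300.0):
        """Zero out every Fourier component with period < min_period years."""
        X = np.fft.rfft(x)
        f = np.fft.rfftfreq(x.size, d=dt)
        X[f > 1.0 / min_period] = 0.0
        return np.fft.irfft(X, n=x.size)

    # The sinusoid is a single frequency below the cutoff period: it vanishes.
    # The spike's energy is spread across many frequencies, so a clearly
    # nonzero (though attenuated) bump survives the filtering.
    print(np.abs(lowpass(sinusoid)).max() < 0.01)  # True
    print(lowpass(spike).max() > 0.2)              # True
    ```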

    What's more - given the nature of Monte Carlo analysis, some of the perturbed realizations would include such a spike in full (if it existed), others would not, resulting in blurring. Some perturbations, by shifting unrelated data under the spike, would actually increase it in that realization. 

    But if you examine the full 1000 realizations that Marcott et al ran:

    Marcott full realizations

    There are no such spikes visible in any realization. It is my opinion that Marcott et al have down-played their own work - that a warming like the one we are currently experiencing, if it had occurred in the Holocene, would have shown up in this analysis. 

  44. The two epochs of Marcott and the Wheelchair

    Tom Curtis, obviously it would be nice to be able to 'conclusively prove' that there were no brief temperature spikes, comparable to the current rate of warming, over the entire period of the Marcott study... but I have to ask whether that is really even necessary?

    I think you would agree that both of the following are easily proven;

    1: The recent warming will not be a brief spike. Rather, temperatures will continue their precipitous rise until we get greenhouse gas emissions under control and then they will stay at that high temperature, decreasing only very slowly, for thousands of years.

    2: The Marcott data does provide enough detail to conclude that there has been no similar 'new high temperature plateau' over the period of the study.

    Who cares if it is theoretically possible that some never observed or imagined phenomena could have caused a temperature spike comparable to current warming, which then immediately reversed back down to roughly the pre-warming temperature such that the change could be undetected by the Marcott study? Aside from being rampant speculation with absolutely no basis in evidence and seeming highly unlikely to even be possible... whatever might cause such a brief spike would also be completely irrelevant to what is happening today.

  45. The two epochs of Marcott and the Wheelchair

    Further to Tamino's attempt to show that spikes in temperature would show up in Marcott et al's reconstruction, it turns out that high-energy physicist Clive Best has attempted to refute Tamino.  He introduces Tamino-like spikes into his own replication of Marcott et al, and finds a detectable but small response:

    He writes:

    "The peaks are indeed visible although smaller than those claimed by Tamino. In addition, I believe we have been too generous by displacing the proxy data linearly upwards since the measurement standard deviation should be properly folded in. I estimate this would reduce the peaks by ~30%.

    What I find very interesting is that there actually do appear to be smaller but similar peaks in the real data (blue arrows), one of which corresponds to the Medieval Warm Period !"

    Unfortunately, he is over-optimistic.  His replication of Marcott et al proceeds by converting all proxy data to anomalies, and then dividing the time span into 50 year bins.  Then, for each bin, if a proxy date falls in that bin it is included, but excluded otherwise.  The anomaly temperature for each 50 year interval is found by taking the "geographic average", by which he possibly means an area weighted average for 5x5 cells.  Crucially, he does not introduce any interpolation between data points.  Thus, proxies with 300 year time resolution will only be found, on average, in every sixth bin.  That is, the number of proxies in each successive bin varies considerably.  As he takes a (possibly area weighted) mean of each bin rather than using the method of differences, the result is that the average jumps around a lot every fifty years, introducing a large amount of spurious short term variability. 

    Because he introduces so much spurious short term variability by his method, he is certainly mistaken to claim examples of that variability as possible large spikes in the temperature data as smoothed by Marcott et al.
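    The bin-occupancy problem described above is easy to see numerically. This is a hypothetical sketch (the 20-year and 300-year sampling steps are assumptions chosen for illustration, not Best's actual proxy data):

    ```python
    import numpy as np

    # Hypothetical sketch of the binning issue (not Best's actual code):
    # two proxies, sampled every 20 and every 300 years, dropped into
    # 50-year bins with no interpolation between data points.
    hi_res = np.arange(0, 3000, 20)    # high-resolution proxy dates
    lo_res = np.arange(0, 3000, 300)   # low-resolution proxy dates
    bins = np.arange(0, 3001, 50)      # 60 bins of 50 years

    hi_counts, _ = np.histogram(hi_res, bins)
    lo_counts, _ = np.histogram(lo_res, bins)

    # The fast proxy lands in every bin, 2-3 times over (extra weight);
    # the slow proxy reaches only about one bin in six, so the set of
    # proxies being averaged changes from bin to bin - spurious variability.
    print(hi_counts.min(), hi_counts.max())    # 2 3
    print(round((lo_counts > 0).mean(), 3))    # 0.167
    ```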

    Importantly, the failure to linearly interpolate is not the cause of Best's reduced spike.  Had his bins included linearly interpolated values of proxies that did not have a date within the bin range (and included a mean of the data of proxies with multiple data points in the bin range, instead of all data points), the result would be to introduce additional data that was unperturbed by the spike, thereby reducing even the small spike he shows.

    Best's method will also include some proxies with high resolution multiple times in each bin.  In his first attempt at this method he used 100 year bins, which would have included some proxies five times in each bin, and others only once every fifth bin.  The inclusion of multiple data points from a single proxy in the bin has the effect of giving that proxy additional weight.  This probably accounts for the slight difference in the overall shape of his reconstruction when compared to that of Marcott et al, and is definitely inferior to Marcott et al in that regard.  His crude method does confirm, however, that the overall shape of the Marcott et al reconstruction is a product of the data, not an artifact:

    Returning to the issue at hand, what of Best's reduced peak?  I believe that is because he did not add the full spike to each data point in the relevant bins.  Specifically, he "simply increased all proxy temperatures within 100 years of a peak by DT = 0.009*DY, where DY=(100-ABS(peak year-proxy year))".  That is a reasonable procedure.  I have asked Tamino whether he did the same in a comment; but as he declined to post that comment or an answer, I do not know whether he did likewise, or added the full value to each proxy within the timespan.

    Assuming Tamino did not make so egregious an error (a wise assumption), the second relevant factor is the temporal perturbing of proxy dates.  If Tamino added a spike linearly adjusted by year to each proxy within 100 years of the peak of the spike, and then temporally perturbed the proxies before linearly interpolating etc, that would tend to smear the spike out.  If instead he temporally perturbed the proxy dates, then introduced the spike based on linear interpolation, he would produce a very distinct spike.  That is because the time of the spike would remain fixed across all temporal perturbations.  After taking the mean, a strong signal would emerge.  Indeed, if this is what he has done, the method is analogous to a process used by amateur astronomers to take pictures with much finer resolution than the cameras they use to take the pictures. 

    If this is his procedure, it is, however, a mistake.  We are not in the position of amateur astronomers, who can take a 1000 photo stack and convert it into a single high resolution photograph.  Rather, we have a single low resolution photo (the 73 proxies) and cannot get a greater resolution than provided by those 73 proxies.

    The correct method to test whether a short, high amplitude spike would be detectable is to first perturb the proxies temporally once, then introduce the linearly interpolated spike to the perturbed proxies that fall within the 200 year history of the spike.  Then, taking the resulting pseudoproxies, apply the full Marcott procedure to them.  That is, for each pseudoproxy, make one thousand perturbed proxies, perturbed both in date and anomaly temperature.  From these thousand realizations of each proxy, create a thousand reconstructions, then take the mean of those thousand reconstructions.  Only if the introduced spike is visible and distinct at the end of that process have you proved that a large short term spike would be visible in the Marcott et al reconstruction. 
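    That proposed test can be sketched numerically. Everything in this toy is an illustrative assumption (flat pseudo-proxies, the chosen sampling steps, the 100-year date uncertainty and 0.1 C temperature uncertainty), not the actual Marcott et al code:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    grid = np.arange(0, 10000, 20.0)   # common 20-year grid

    def add_spike(t, T, center=5000.0, half_width=100.0, height=0.9):
        """Add a linearly ramped 200-year-wide, 0.9 C spike at each proxy date."""
        return T + height * np.clip(1 - np.abs(t - center) / half_width, 0, 1)

    # Step 1: three flat pseudo-proxies; perturb each proxy's dates ONCE,
    # then add the spike at the perturbed dates (its calendar timing is fixed).
    pseudo = []
    for step in (20, 120, 300):
        t0 = np.arange(0, 10000, float(step))
        t1 = np.sort(t0 + rng.normal(0, 100, t0.size))
        pseudo.append((t1, add_spike(t1, np.zeros_like(t1))))

    # Step 2: full Monte Carlo on the pseudo-proxies - 1000 realizations,
    # each re-perturbing dates and temperatures, interpolated and stacked.
    recons = []
    for _ in range(1000):
        realization = []
        for t, T in pseudo:
            tp = t + rng.normal(0, 100, t.size)
            Tp = T + rng.normal(0, 0.1, T.size)
            order = np.argsort(tp)
            realization.append(np.interp(grid, tp[order], Tp[order]))
        recons.append(np.mean(realization, axis=0))
    stack = np.mean(recons, axis=0)

    # The spike survives, attenuated and blurred, near year 5000, while the
    # rest of the stack averages out to roughly zero.
    print(stack.max() > 0.05)                       # True
    print(np.abs(stack[grid < 3000]).max() < 0.05)  # True
    ```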

    Moderator Response:

    [RH] Fixed image width.

  46. The Fool's Gold of Current Climate

    The article Andy Skuce linked to also includes a bit about the fact that we know from fossil remains that polar bears used to live in the Baltic sea off Sweden and Finland... but now they don't. Still plenty of ringed seals there, but no polar bears. Why? Because there is no longer sea ice in that region for polar bears to hunt from. The presence of their prey doesn't mean a thing if they can't catch it.

  47. climatelurker at 22:54 PM on 5 April 2013
    The Fool's Gold of Current Climate

    As someone else suggested, it seems likely that polar bears will continue only as a hybrid species crossbred with other bears if they lose their habitat. I'm not sure how a polar bear could 're-emerge' as a distinct species during ice ages, though. Once the DNA has been mixed, it can't exactly be un-mixed.

    However, there are a lot more animals in the crosshairs of climate change than polar bears and penguins. How is climate change going to alter disease trajectories in different species? Consider the white nose fungus that's decimating bat populations (is it possible, even if not yet evident, that it's linked to climate change?), or whether climate change has a hand in bee colony collapse (aside from the neonicotinoid pesticide link). What will climate change do with bird or swine flu? Ebola? Rabies? Strep throat? Are fish and cetacean populations experiencing mass die-offs as a result of climate change already? What if previous mass extinctions from climate change had disease explosions as their vectors? Is that possible to study in the fossil records?

    To try to tie this back to the blog topic, it seems like these underlying things may already be happening, hardly what I would call benign or beneficial (and that's without even bringing the physical effects of rising water and weather pattern changes into the discussion).

  48. Klotzbach Revisited and John Christy's response, part 2

    I started reading the Klotzbach paper and notice they reference surfacestations.org several times, which is maintained by Anthony Watts. It reminded me of this sad exchange: Watts uses a picture to 'prove' that antarctic surface stations are influenced by urban heat island. The picture turns out to be from the wrong end of the continent, and the weather station on the picture isn't even used for climate data. And Watts never acknowledges his errors.

    Why anyone even wants to associate with that guy is beyond me, but if they do that's an instant mark against their credibility.

  49. CO2 limits will hurt the poor

    Apologies Ray - your comment was clearly engendered by my comments about whether you were in the ideological camp or not. It is my response to you that would have been off topic, and so I have brought it here.

    Ray, the climate negotiations go nowhere because the US in particular doesn't commit to reductions. Without that happening (and in Europe too), obviously no progress is made, but the intent of Doha is reductions by rich nations without restricting the growth of poorer nations. Not even the US denies this aim. And Kyoto clearly gives the lie to your assertion that the western powers are trying to restrict the growth of the poor. Rehman's statement says that it is the failure by the west to reduce their emissions (thus inflicting climate change on the poor) that is the problem.

    However, I now realise I might have misread you. Your statement was "my political views are that I find it difficult to accept that the major western powers are trying to enforce, on countries which are much poorer than they are, actions that will disadvantage the citizens of those countries in their efforts to attain the standards of living approaching those of the developed world."

    By this I understood you to mean that you thought the west was trying to restrict the growth of emissions in the 3rd world (actions that will disadvantage), whereas I now realise that you might have meant that they are forced to accept inaction by the west and are thus limited by climate change in trying to improve their standard of living. If this was your meaning, then I apologise.

    If you agree that western powers need to drastically reduce emissions so that poor nations can grow without harming their climate, then we have no disagreement.

  50. The two epochs of Marcott and the Wheelchair

    Many thanks for the explanations CBDunkerson, Tom Curtis and Tom Drayton. I thought random spikes would have made a visible trace in the record of past temperatures, but didn't know how it could be shown.

    Philip Shehan, why bother going on WUWT? Except perhaps as a comedy venue...

    Moderator Response:

    [JH] Phillip Shehan's most recent comment was deemed to be "off topic" and hence was deleted.






© Copyright 2024 John Cook