







Comments 46401 to 46450:

  1. The two epochs of Marcott and the Wheelchair

    Tom Curtis - "That is the reason Marcott et al compare modern temperatures to the PDF of temperatures in the realizations rather than the mean."

    Comparing PDFs is indeed the appropriate method - and comparing the means of those PDFs is part of that analysis.  

    It may be, looking at his results, that Tamino failed to apply the averaging of sampling resolutions when inserting his spike into the proxy data - but given the median 120 year sampling, that would at most reduce such 200-year spikes by a factor of ~2; still large enough to be visible in the full realization.

    WRT Monte Carlo analysis - the PDF of the full MC perturbed realization space in the presence of noise must include the raw data, and beyond, as at least some of the realizations will shift uncorrelated variations under such a spike. The majority will blur a short spike by shifting proxies so they do not coincide, but will still pull up the mean in that area. That's true even given dating errors, as some of the perturbations will undo such errors. In the 1000 realization set (which should be a good exploration of the MC space) as shown by the 'spaghetti graph' - the outer bounds of those realizations do not include any such spikes.

    Now, it may be that 1000 realizations is not a sufficient exploration of the MC set (unlikely, though), or that the combination of proxy smearing and MC low-pass filtering might make a small spike difficult to distinguish. I would disagree, but I haven't gone through the full exercise myself.
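    The blurring-but-not-erasing behaviour described above can be sketched with a toy Monte Carlo (a minimal illustration with synthetic proxies on a 120-year grid and invented noise and date uncertainties - none of these numbers are Marcott's):

```python
import random
import math

def toy_realizations(n_real=1000, n_points=200, dt=120, date_sigma=60,
                     spike_center=12000, spike_width=100, spike_amp=0.9):
    """Toy model: proxies sampled every `dt` years, with dates perturbed
    by `date_sigma`, and a Gaussian warm spike injected at `spike_center`.
    All parameters are illustrative only."""
    times = [i * dt for i in range(n_points)]
    realizations = []
    for _ in range(n_real):
        series = []
        for t in times:
            t_pert = t + random.gauss(0, date_sigma)  # date uncertainty
            # true signal: flat background plus a narrow warm spike
            signal = spike_amp * math.exp(-((t_pert - spike_center) / spike_width) ** 2)
            series.append(signal + random.gauss(0, 0.1))  # proxy noise
        realizations.append(series)
    return times, realizations

times, reals = toy_realizations()
# Mean across realizations: date perturbation blurs the spike
# (amplitude reduced) but the elevated mean remains clearly visible.
mean = [sum(r[i] for r in reals) / len(reals) for i in range(len(times))]
peak = max(mean)
```

    Run this and the spike survives in the realization mean at roughly three-quarters of its original amplitude - blurred, as argued above, but nothing like invisible.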

    However - Isn't this all just a red herring? One initially raised by 'skeptics' in an attempt to dismiss the Marcott paper?

    • Events like the 8.2 Kya cooling show up quite strongly in multiple proxies (which is how we know of it); it even appears to be visible in the Marcott reconstruction as perhaps a 0.1 C global cooling.
    • If a 0.9 C, 25×10^22 Joule warming spike occurred in the Holocene we should have proxy evidence for it - and we don't.
    • There is no plausible physical mechanism for such an up/down spike.
    • There is a known physical mechanism for current warming (which won't be a short spike, for that matter).
    • There is therefore no support for the many 'skeptic' claims that "current warming might be natural" and out of our control.

    The Marcott et al paper is very interesting, it reinforces what we are already fairly certain of (that there is a lack of evidence for large spikes in temperature over the Holocene, that there is no physical basis for such spikes), and I expect their methods will be extended/improved by further work. But arguing about the potential existence of mythic and physically impossible spikes is an irrelevant distraction from the issues of current warming. 

  2. Food Security - What Security?

    Don't worry, WWIII is around the corner......it will solve your problems.

  3. Bob Lacatena at 00:21 AM on 7 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    engineer,

    After a brief exchange with Dr. Ray Pierrehumbert at the University of Chicago, he directed me to his 2007 post at Real Climate titled What Ångström didn’t know, wherein he basically presents the derivation in plain English (no math).  To supplement that, I'd also suggest doing some research on optical thickness and the Beer Lambert Law.  If you have the chops for it, the Science of Doom website has some very good explanations (warning: math!) of a lot of things.

  4. Bob Lacatena at 23:04 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    engineer,

    I'd just like to add that Neal King has (offline) pointed out that this was previously discussed on this same thread, at comments 73, 76 and 78.

    Offline, he also pointed out that:

    ...the explanation from Pierrehumbert is that the radiative forcing is due to the change in flux when the critical point (at which the optical depth, as measured from outer space downward, reaches the value 1: Photons emitted upward from this point will escape, so this defines an effective photosphere for the given frequency.) changes its altitude.

    This would greatly simplify the calculation problem.

    I may pursue this further myself, if I can find the time... it's a very interesting question.  In particular, it's about time I plunked down the cash on Ray Pierrehumbert's text book Principles of Planetary Climate, and perhaps John Houghton's The Physics of Atmospheres.

  5. Making Sense of Sensitivity … and Keeping It in Perspective

    Glenn has answered a lot of your questions, but the confusion is about how to use it. Once you know (or have estimated) a climate sensitivity, then you can use it to calculate deltaT directly. However, you need the full blown GCM to derive the climate sensitivity in the first place. This is the reason behind the debate on CS. Estimates can be made empirically from paleoclimate or more commonly from the models, but you have a range of values coming from those, with most clustering between 2.5 and 3. The key to CS is the feedbacks. By itself 3.7 W/m2 TOA forcing gives you 1.2 C of temperature. However, with a temperature rise you immediately have feedback from increased water vapour. In the slightly longer term you get feedback from albedo (particularly change in ice) and on longer timescales you have temperature-induced increases in CO2 and CH4 from a variety of sources. Add into the equation change in cloudiness with temperature (and whether this is low level cloud or high level cloud) and you start to get a feel for the complexity of GCMs.
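    To make those numbers concrete (nothing here beyond arithmetic on the figures quoted in the comment above):

```python
# No-feedback (Planck) response: ~1.2 C for the 3.7 W/m^2 of a doubling
forcing_2x = 3.7             # W/m^2 for doubled CO2
dT_no_feedback = 1.2         # C, the bare response quoted above
lambda_planck = dT_no_feedback / forcing_2x   # ~0.32 C per W/m^2

# With feedbacks, estimates cluster near 3 C per doubling:
dT_equilibrium = 3.0
lambda_total = dT_equilibrium / forcing_2x    # ~0.81 C per W/m^2

# Net amplification from the feedbacks:
gain = dT_equilibrium / dT_no_feedback        # 2.5x
```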

  6. Food Security - What Security?

    Jonas@2, thanks for the links. It awakens memories. I think Forrester's world dynamics model had an unusual first "public" appearance. So far as I know, the results of his "World 1" simulation model first appeared in Playboy magazine. Dennis Meadows presented a preliminary version of the "limits to growth" model at our institute. In 1971, I was invited to speak to the Ann Arbor chapter of the Sierra Club on these modeling efforts. I focused mostly on the Forrester model, with which I was intimately familiar, because the Meadows work had not yet been completed.

    Donella H. Meadows article "System dynamics meets the press" might have some useful suggestions for those interested in improving the communication of climate change and global warming issues to the public.

  7. Glenn Tamblyn at 18:00 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    engineer

    There are two parts to this. Calculating the change in Radiative Forcing at the Top of Atmosphere (TOA) due to a change in GH gases etc - essentially the change in the Earth's energy balance. Then calculating  the temperature change expected to result as a consequence of that.

    The standard formula used for the radiative imbalance change is

    Delta F = 5.35 ln(C/C0), where C0 is your reference CO2 concentration and C is the concentration you are comparing it to. The usual C0 chosen is Pre-Industrial, around 275 ppm. This formula is from Myhre et al 1998 and was included in the IPCC's Third Assessment Report (TAR).

    So a doubling is 5.35 ln(2) or 3.7 W/m2.
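    That formula is easy to evaluate directly (a minimal sketch; the 275 ppm default and the 400 ppm figure are just the values discussed in this thread):

```python
import math

def radiative_forcing(C, C0=275.0):
    """Myhre et al. (1998) fit: Delta F = 5.35 ln(C/C0), in W/m^2."""
    return 5.35 * math.log(C / C0)

# A doubling of CO2, regardless of the starting concentration:
dF_doubling = radiative_forcing(550.0, 275.0)   # ~3.7 W/m^2
# Pre-industrial (275 ppm) to 400 ppm:
dF_today = radiative_forcing(400.0)             # ~2.0 W/m^2
```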

    This formula is in turn a regression curve fit to the results from a number of Radiative Transfer Codes. These are programs that perform numerical solutions of the Equation of Radiative Transfer. Essentially they divide the atmosphere up into lots of layers and calculate the radiation fluxes between each layer, taking into account the properties of each layer - temperature, pressure, gas composition etc. - and the transmission, absorption, emission and scattering of EM radiation in each layer, based on detailed spectroscopic data for each of the gases present from databases such as HITRAN. They perform this calculation, summing what is happening to each layer. And they do this either for each single spectral line - a large computational task - or by dividing the spectra up into small bands. The accuracy of these programs has been well demonstrated since the early 1970's.

    It is important to understand that these are not climate models. They perform a single, although large, calculation of the radiative state of a column of air at one instant, based on the physical properties of that air column at that instant.

    The second stage of the problem is to work out how temperatures change based on the radiative change. Back-of-the-envelope calculations can get you into the ball park, which is what people did up until the 1960's. The very first, extremely simple climate models assumed a CS value. Current climate models, which are far from simple, now actually derive the CS as a result of the model. The radiative changes are fed into the model, along with lots of known physics - conservation of energy, mass & momentum; thermodynamics; cloud physics; meteorology; ice behaviour; atmospheric chemistry; carbon cycle chemistry; ocean models etc. These are then left to run, to see how the system evolves under the calculations. The result then, among other things, indicates the CS value.

    Climate models, however, are not the only way to estimate CS. The Wiki entry you cite gives another example, of a class of estimates that are probably better than the climate models - the behaviour of past climates. In order to determine CS you don't have to have just a CO2 change. Anything that will produce a forcing change - volcanic activity, dust, changes in solar output - will provide data points to amass a broad estimate of what CS actually is.

    One trap to watch out for is that CS isn't always expressed the same way. Usually it is expressed as 'Deg C per doubling of CO2' but sometimes in the literature it is expressed as 'Deg C per W/m2 of forcing'.

    So what we are looking for is multiple evidence streams indicating similar values for CS. And broadly they do. Although these estimates often have longer tails of possible outlier values, the central point of the probability distribution of the results from most sources, the majority of them derived from observations of present and past climate, is fairly strongly in the 3-3.5 range.

    Hope this helps.

  8. engineer8516 at 16:38 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    thanks for the replies and links.

    @scaddenp

    I'm not sure if I'm understanding you correctly...so the climate models estimate the temp increase from double CO2. Then taking the est. temp increase and dividing it by 3.7 gives climate sensitivity. So that equation is just the equation for slope i.e. rise over run, and it isn't directly used to calculate climate sensitivity from historical data. The reason I'm confused is because I think the wikipedia article on climate sensitivity says that the equation can be used directly, which would imply that there is a physical foundation for it.

    "The change in temperature, revealed in ice core samples, is 5 °C, while the change in solar forcing is 7.1 W/m2. The computed climate sensitivity is therefore 5/7.1 = 0.7. We can use this empirically derived climate sensitivity to predict the temperature rise from a forcing of 4 W/m2, arising from a doubling of the atmospheric CO2 from pre-industrial levels. The result is a predicted temperature increase of 3 °C...Ganopolski and Schneider von Deimling (2008) infer a range of 1.3 to 6.8 °C for climate sensitivity determined by this approach." - wikipedia
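    The arithmetic in that quoted passage is just a linear sensitivity applied to a new forcing (reproducing the Wikipedia numbers, nothing more):

```python
# Empirical sensitivity from the glacial-interglacial numbers quoted above
dT_ice_age = 5.0      # C, temperature change from ice cores
dF_ice_age = 7.1      # W/m^2, change in solar forcing
sensitivity = dT_ice_age / dF_ice_age     # ~0.7 C per W/m^2

# Applied to the 4 W/m^2 quoted for a CO2 doubling:
dT_2xCO2 = sensitivity * 4.0              # ~2.8 C, rounded to 3 C in the quote
```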

  9. The two epochs of Marcott and the Wheelchair

    scaddenp @55, are there large spikes in individual proxies?  Yes, and there are also large spikes in multiple individual proxies simultaneously to within temporal error (see my 53).

    KR @54:

    1)  A 0.9 C spike is approximately a 2 sigma spike, not a fifty sigma spike.  That is, the 300 year average of an interval containing that spike will be 0.3 C (approx 2 SD) above the mean.  If you want to argue it is more than that you actually have to report the frequency of such spikes in the unpruned Monte Carlo realizations.  Marcott et al did not report it (although I wish they had), and nobody else has reproduced it and reported it so we just don't know.

    2)  We don't see any density function excursions in the Monte Carlo realizations because:

    a)  Marcott et al did not plot the PDF of centenial trends in the realizations (or their absolute values); and

    b) In the spaghetti graph you cannot see enough to track individual realizations over their length to determine their properties.

    Perhaps you are forgetting that the realizations are reconstructions with all their flaws, including the low resolution in time.  That means a Tamino style spike in the realization will be equivalent in magnitude to his unperturbed reconstruction, not the full 0.9 C spike.   As such, such a spike starting from the mean temperature for an interval would not even rise sufficiently above other realizations to be visible in the spaghetti graph.

    3)  Pruning the realizations is a statistical blunder if you are plotting the PDF for any property.  It is not a blunder, or wrong in any way if you want to see if a statistically defined subset of realizations have a particular property.

    4)  If I throw two fair dice the maximum likelihood result of the sum of the dice is seven.  That does not mean I will always get a seven each time over one hundred throws.  In fact, with high probability, over one hundred throws I will get a 7 only 17 times (16.66%).  Also with significant probability, I will get a 12 about 3 times.  As it happens, the probability of a realization lying on the mean of the realizations at any time is about 5%. Ergo, about 95% of the time for any particular realization it will not lie on the mean, but be above it or below it.  Most realizations will lie on the mean more frequently than at any other temperature, but at no temperature relative to the mean very often at all.
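    The dice figures are easy to check numerically (a quick simulation; 100,000 throws are used so the frequencies settle near the theoretical 1/6 and 1/36 that give ~17 sevens and ~3 twelves per hundred throws):

```python
import random

random.seed(42)
n_throws = 100_000
counts = {}
for _ in range(n_throws):
    total = random.randint(1, 6) + random.randint(1, 6)  # sum of two fair dice
    counts[total] = counts.get(total, 0) + 1

freq_7 = counts[7] / n_throws     # ~1/6  = 0.1667 -> ~17 per 100 throws
freq_12 = counts[12] / n_throws   # ~1/36 = 0.0278 -> ~3 per 100 throws
```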

    That is the reason Marcott et al compare modern temperatures to the PDF of temperatures in the realizations rather than the mean.  Directly comparing with the mean is, unfortunately, tempting, but wrong.  So also is treating the mean as the course of temperature over the Holocene.  Rather it is a path that statistically constrains what Holocene temperatures could have been given what we know.

  10. The Fool's Gold of Current Climate

    The Serendipity graphs above are also on the hopeful side, because they consider the A-O CO2 exchange only. They do not consider the earth system response. So far, what is known about it is that we can expect only positive feedbacks: methane release from clathrates and permafrost, decreased albedo from melting arctic ice, and warmer oceans degassing CO2 because warm water can hold less of it.

    The only problem is that the quantities of those feedbacks are unknown (maybe with the exception of ice albedo). I expect those figures (abstract and outdated already; we need to update the starting level to 400 ppm) to become more pessimistic (more warming in the pipeline) once those positive feedbacks are quantifiable.

  11. Making Sense of Sensitivity … and Keeping It in Perspective

    Just remember that formula is post-hoc. You get sensitivity out of a climate model run by solving for k from deltaT/deltaF, which gives you a useful way to estimate temperature for a given forcing. However, the GCMs do not derive temperature from that formula internally.

  12. Bob Lacatena at 14:28 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    engineer --

    Spencer Weart has this reference to the first such calculation in 1967 by Manabe and Wetherald.

    You might want to look over this timeline.

    I'd also very strongly suggest reading Spencer Weart's The Discovery of Global Warming.  It's interesting reading, and it adds a lot of depth to both an understanding of the science and how old and broadly based climate science is.

  13. Bob Lacatena at 14:22 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    [engineer -- Your other post just went onto the next page.]

  14. Bob Lacatena at 14:21 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    ∆T = k log2(CO2final/CO2initial)

    Where k is the climate sensitivity in degrees C per doubling of CO2.

    I myself have never found the derivation for that, either. We at SkS should probably make a concerted effort to find it, as it would be well worth looking at and referencing.
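    For what it's worth, evaluating the formula itself is trivial once a value of k is assumed (k = 3 here is only an illustrative mid-range sensitivity, not a derived one):

```python
import math

def delta_T(co2_final, co2_initial, k=3.0):
    """Delta T = k * log2(C_final / C_initial), where k is the climate
    sensitivity in C per doubling (3.0 is an illustrative value)."""
    return k * math.log2(co2_final / co2_initial)

# A full doubling gives back exactly k:
dT_doubling = delta_T(560, 280)      # 3.0 C
# Pre-industrial 280 ppm to 400 ppm:
dT_so_far = delta_T(400, 280)        # ~1.5 C at equilibrium
```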

    It may have arisen primarily from experimental observations, or else through "experimentation" using the MODTRAN line-by-line radiative transfer computations (developed by the US Air Force, one of the pioneers in this stuff, due to their interest in making infrared missiles work properly in the atmosphere).  If it was determined through physical principles, it would need to take into account the varying density of the atmosphere (with altitude), as well as the resulting variations in IR absorption and emission as balanced against the number of collisions per second with non-GHG molecules like O2 and N2 (and of course the number of collisions is affected by both density and temperature, i.e. the average velocity of each molecule).  Then there are other complications such as bandwidth overlaps with other greenhouse gases (like H2O), and broadening of the absorption spectrum (pressure broadening and doppler broadening).

    All in all, it's pretty complicated.

    I'll ask and see what people can turn up.

  15. engineer8516 at 14:17 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    I read through the link. The formula I was referring to was dT = climate sensitivity * dF.

    Hopefully this isn't a double post. I'm not sure what happened to my other one.

  16. Bob Lacatena at 14:12 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    engineer,

    You might also want to look at this page, courtesy of Barton Paul Levenson.  I don't think it's been updated since 2007, so it lacks a good 5 years worth of further research, but it gives you some idea of the breadth of the work that's been done in the area, and how much the end results give basically the same answer.

    [Be wary of any study that gives too high or too low a climate sensitivity.  Like anything else, the outcome depends on underlying assumptions, and not all papers that are published withstand scrutiny forever.  In fact many are quickly refuted.  Peer-review is only the first hurdle.  A good example is Schmittner et al. (2011), which found a lower climate sensitivity than many, but also assumed a lower temperature change from the last glacial to the current interglacial -- a lower temperature change obviously will yield a lower sensitivity, so the question shifts more towards establishing the actual temperature change in order to arrive at the correct sensitivity... as well as recognizing that this was the sensitivity of the planet exiting a glacial phase.]

  17. engineer8516 at 14:10 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    I looked at the link. I was referring to the formula dT = climate sensitivity * dF.

  18. Bob Lacatena at 14:05 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    It's not derived through a formula -- that would be like having a single derived formula that computes the expected age of a species of animal, based on the animal's biology.  It's just too complex for that. 

    The link I already gave you ("many methods of estimating climate sensitivity") gives some (not all -- in particular, that link skips over modeling, which is a very important and valuable technique) of the methods of computing sensitivity.  To really understand it you'd need to find copies of and read many of the actual papers.

    Another approach is to use the search box up top, and search for "climate sensitivity".

    The best thing you can do with climate sensitivity is to learn a lot about it.  Make it a multidimensional thing that you understand from many angles.

  19. engineer8516 at 13:32 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    thanks for the replies. @sphaerica, I wasn't trying to insult climate scientists. I was trying to figure out the basis for the assumption, and I wasn't implying that it was arbitrary.

    Also do you guys know of any good links that goes into the details of the derivation of climate sensitivity? Not how the value is estimated, but the derivation of the formula. I couldn't find any good sites on Google. Thanks again.

  20. The two epochs of Marcott and the Wheelchair

    Forgive me if I haven't been following this closely enough, but surely no spike in global temperatures is possible unless there are spikes of appropriately the same magnitude in individual proxies. So one question: are there any spikes of centennial scale with the size of modern warming evident in individual proxies (like ice cores) that have sub-century age resolution? If there are none, then it hardly matters about the niceties of the multi-proxy processing, surely?

  21. Bob Lacatena at 12:58 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    engineer,

    One last thing.  You said:

    ...we're assuming that climate sensitivity behaves nicely...

    No, we're not.  Scientists aren't stupid, and they don't work from arbitrary assumptions.  There are reasons for believing the climate sensitivity behaves a certain way, based on physics, past climate and present observations.  It's not just some assumption that has been wantonly adopted because it makes life easier.  Nobody in any field or profession gets to do things that way.  Why would climate scientists?

  22. Bob Lacatena at 12:55 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    engineer,

    That's a good question, with a complex answer.

    It is absolutely true that climate sensitivity is not and would not be exactly constant. Climate sensitivity is a result of a wide variety of feedbacks which individually have different impacts.

    There are fast feedbacks, which are physical mechanisms that are somewhat predictable through physical modeling (for instance, the fact that warmer air will hold more moisture, thus adding the greenhouse gas effect of H2O to the air).

    There are also slow feedbacks that depend on physical, initial conditions.  The ice sheets, for example, during a glacial period contribute a lot to keeping the planet cool by reflecting large amounts of incoming sunlight.  When temperatures warm and the ice sheets retreat, that results in a positive feedback.  If you imagine the ice sheets spread over the globe, it is easy to see that those ice sheets are larger when they are further south.  As the ice sheets retreat, they get smaller and smaller, and each further latitude of melt reduces them by less, so that the feedback is not continuously proportional.

    CO2 as a feedback instead of a forcing is also a diminishing feedback.  As you add more and more CO2 to the atmosphere, the additional CO2 has less and less of an effect, so you need even more CO2 to accomplish the same amount of warming.

    So for any particular feedback, the initial climate state is important.

    But there are a lot of different, intermixed feedbacks.  CO2 and CH4 can be added to the atmosphere due to major ecosystem changes (forest die-offs or permafrost melt).  There is the melting of ice sheets.  Ocean circulation and stratification patterns can change.  The list goes on.

    As a result, given all of the varying feedbacks with varying effects under different conditions... it all averages out.

    There are many methods of estimating climate sensitivity.  Some look at past climates, to see what has happened before.  Some work with models that try to emulate the physical mechanisms.  Some directly observe how the climate changes in the very short term due to known and measured forcing changes.

    The thing is that all of these methods produce varying results in a broad range, but most converge on the more narrow range of 2 to 4.5C, and most converge on the same value of about 3C.  Taken individually, nothing is exactly the same as the current climate, but since most studies, past and present, seem to fall into the same range, it suggests that there is validity to the broad assumption (Occam's Razor) that the climate generally behaves in about the same way.

  23. Making Sense of Sensitivity … and Keeping It in Perspective

    Engineer - that was done more or less by Broecker for his remarkably accurate 1975 prediction, but that is not how any modern climate model works. Instead, climate is emergent from the interaction of forcings with the various equations in the model. If you want to know what the climate sensitivity of a model is, then you work backwards from the temperature at the end point as calculated by the model compared to the CO2 forcing. You can also run the model with various forcings to see what the sensitivity to, say, a solar forcing of the same magnitude is (see for instance ModelE results). Over a very big temperature range, there would be good reason to suppose sensitivity would change. E.g. when all ice is melted from both poles, the only albedo feedbacks would be weak ones from land cover change. Preserve us from having to worry about that for the next 100 years.

  24. The two epochs of Marcott and the Wheelchair

    Tom Curtis - To clarify my earlier comments regarding Monte Carlo analysis: 

    Given a set of proxies with date uncertainties, if there is a large (50 sigma?) signal involved, even if the initial dates are incorrect at least some of the realizations in the perturbation space will accurately reflect that spike, being shifted to reinforce each other. And some will show more than that spike, due to unrelated variations being date-shifted into the same time period. The number, and the density of the Monte Carlo probability function, will be dependent on the distance of the various date errors - but there will be a density function including the spike and more in the full realization space. This is very important - if an alignment that shows such a spike is possible under the uncertainty ranges, it is part of the Monte Carlo realization space.

    1000 runs is going to sample that space very thoroughly. And nowhere in the 1000 Marcott runs pre-19th century do you see a density function excursion of this nature. 

    [ Side note - pruning the Monte Carlo distribution would be completely inappropriate - the entire set of realizations contributes to the density function pointing to the maximum likelihood mean, and pruning induces error. Even extreme variations are part of the probability density function (PDF). The mean in the Marcott case is clearly plotted; "Just what were the mean values for the thousand years before and after that spike in that realization? Clearly you cannot say as it's hidden in the spaghetti." is thereby answered. ]

    With very high probability, the real temperature history does not follow the mean of the monte carlo reconstructions...

    I disagree entirely. That is indeed the maximum likelihood of the history, based upon the date and temperature uncertainties. And the more data you have, the closer the Monte Carlo reconstruction will be to the actuality. Exactly? No, of course not, reconstructions are never exact. But quite close. Monte Carlo (MC) reconstructions are an excellent and well tested method of establishing the PDF of otherwise ill-formed or non-analytic functions. 

    It would certainly be possible for somebody to go through and make a maximum variability alignment of individual proxies as constrained by uncertainties in temporal probability. If that were done, and no convincing, global Tamino-style spikes existed in that record, that would be convincing evidence that no such spikes existed.

    That is exactly what a MC exploration of date and temperature uncertainties performs. No such excursions are seen in any of the 1000 realizations. Meanwhile, the rather smaller +/- excursions over the last 1000 years are quite visible, as is the 8.2 Kya event of 2-4 centuries (and yes, it's visible as a ~0.1-0.15 C drop in Marcott). I believe it is clear that the Marcott data would show a larger 0.9 C global two-century spike event if it existed - and it does not show. 

  25. Glenn Tamblyn at 12:34 PM on 6 April 2013
    Food Security - What Security?

    Agnostic, you mention one impact of warming in the summary but don't expand on it any further; warming of the oceans. I have never actually seen an assessment of the negative impact on biological productivity in the oceans of increased water temperatures. Have you any references on that?

  26. engineer8516 at 12:26 PM on 6 April 2013
    Making Sense of Sensitivity … and Keeping It in Perspective

    I have some questions regarding climate sensitivity.

    Basically temp data and changes in forcing are used to calculate climate sensitivity, which is then used to calculate the rise in temp that would result from 3.7W/m^2 of forcing from double CO2.

    My problem with this calculation is that we're assuming that climate sensitivity behaves nicely (i.e. is almost constant). My question is: how do we know that this is a reasonably valid assumption? Thanks.

  27. Food Security - What Security?

    Villabolo @ 4:

    Da Google sent me to this link...

    http://www.skepticalscience.com/North-Carolina-Lawmakers-Turning-a-Blind-Eye-to-Sea-Level-Reality.html

  28. Food Security - What Security?

    Concerning food storage, it is possible to store wheat and other grains for 10 to 30 years depending on the method.

    One method is conventional cans (about gallon size and larger) in an oxygen-free environment - nitrogen is used to replace the oxygen. So long as it's stored out of the heat, at room temperature, it will last for 20-30 years. Brown rice will only keep for 10 years.

    Abrupt changes in climate will give us a 'feast or famine' cycle where we'll be able to grow adequate amounts some years but suffer extensive loss on other years. The only way to alleviate that is to store large quantities of grains during the 'good' years to make up for the bad years.

    Eliminating biofuel production will also give us a large safety margin.

    Going vegetarian will help even more but it's not going to happen voluntarily.

  29. The two epochs of Marcott and the Wheelchair

    Sphaerica @48, I assume you do not think I am following the steps to denial.

    The fact is that some people, most notably Tamino who at least has made a check of the claim, are claiming that the Marcott et al. reconstruction excludes Tamino style spikes.  That is more than is claimed by the authors of the paper.  Before we can accept that Marcott et al. does exclude such spikes, we need to check the evidence.  So far the evidence is tantalizing but far from conclusive.  Ergo we should not use that claim as evidence in rebutting AGW skepticism.  When "skeptics" get to step two, we need only point out (truthfully) that they are over-interpreting Marcott et al.'s remark about frequency, and that they have not shown that such a spike could exist and not show up in the Marcott et al. reconstruction.  In short, they have no evidence for their claim at step two.  Although many AGW "skeptics" like to treat an absence of refutation of their position as proof that their position is true, we need not accept such irrationality, and cannot reach those who continue to accept it once it is pointed out to them.

    However, this discussion is indeed a distraction and absent a significantly new argument or evidence that changes my opinion, I shall withdraw from further discussion of this point.

  30. The two epochs of Marcott and the Wheelchair

    KR @43:

    "Hence I would consider that a single realization at that point represents a value that is too high for the data, an outlier, that consists of perturbations that happen to overlay variations in a reinforcing fashion at that point."

    I think you are missing the point of the Monte Carlo reconstructions.  We know that there are errors in the temperature reconstruction and dating of the proxies.  That means at some points the unperturbed reconstruction will be too low, or too high, relative to reality.  At those points, particular realizations in the Monte Carlo reconstructions which are low or high (respectively) are closer to reality.  Not knowing the precise course of reality, we cannot know whether or not a particular spike in a particular realization, whether high or low, maps reality closely.

    One way you could approach the general problem would be to do 20,000 Monte Carlo realizations and to prune away all realizations with an overall probability, relative to the original data and errors, of less than 5%.  That will leave you with approximately 1000 realizations, all of which could plausibly be the real history of temperature over the Holocene.  You could then examine those thousand plausible realizations to see if any contained Tamino style spikes.  If not, then the data excludes such spikes.  If they do, the data does not.
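    One reading of that procedure - keeping the ~5% most probable realizations, which matches the ~1000 figure - can be sketched with synthetic data (all series, errors, and counts here are illustrative, not Marcott's):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for the unperturbed reconstruction: a flat series
    # with a 0.25 C (1 sigma) uncertainty at each of 100 time steps.
    n_points = 100
    base = np.zeros(n_points)
    sigma = 0.25 * np.ones(n_points)

    # Generate Monte Carlo realizations by perturbing every point.
    n_real = 20000
    realizations = base + sigma * rng.standard_normal((n_real, n_points))

    # Score each realization by its overall log-likelihood given the data,
    # then keep only the most plausible ~5% - the pruning step described above.
    log_lik = -0.5 * np.sum(((realizations - base) / sigma) ** 2, axis=1)
    cutoff = np.quantile(log_lik, 0.95)
    kept = realizations[log_lik >= cutoff]

    print(kept.shape[0])   # about 1000 of the 20000 survive the pruning
    ```

    The surviving set could then be searched for Tamino style spikes, which is the test proposed above.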

    As it happens, Marcott et al did not prune their realizations based on global probability. Consequently about 50 of their realizations are not plausible temperature reconstructions.  Importantly, though, even plausible reconstructions will contain short intervals which are implausible given the data.  Indeed, about 5% of their length, on average, will be implausible based on the data.  Therefore we cannot look at a single peak and conclude, from the fact that the realization is clearly an outlier in that time period, that the realization is an outlier over the entire Holocene.  You need the entire realization to come to that conclusion, and due to the nature of 1000-realization spaghetti graphs, we do not have the entire history of any realization.

    So, for all we know from the spaghetti graph, there are plausible realizations containing Tamino style spikes within it.  In fact, what we can conclude from the Marcott et al realizations is that:

    1) With very high probability, the real temperature history does not follow the mean of the monte carlo reconstructions;

    2) With very high probability, the 300 year mean of the real temperature history lies within 0.27 C of the mean approximately 95% of the time; but that

    3) With high probability the 300 year mean of the real temperature history is further than 0.27C from the mean about 5% of the time.

    The question is whether the approximately 275 years over which we expect the 300 year mean of the real temperature to be more than 0.27 C above the mean are the result of sustained multicentury warmth, or whether they could be from a Tamino style spike.  (Note, a Tamino style spike has a 300 year mean temperature increase of 0.3 C.)  On information to date, the latter is improbable, but not excluded.
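    To check that parenthetical note: taking a Tamino style spike, for illustration, as a linear 0.9 C rise over a century followed by a symmetric fall (the shape is my assumption here), its 300 year mean does come out at about 0.3 C:

    ```python
    import numpy as np

    # Illustrative Tamino-style spike: 0.9 C linear rise over 100 years,
    # a symmetric fall over the next 100, flat for the remaining 100.
    years = np.arange(300)
    spike = np.where(years < 100, 0.009 * years,
                     np.where(years < 200, 0.9 - 0.009 * (years - 100), 0.0))

    print(round(spike.mean(), 2))   # 0.3 C mean over the 300-year window
    ```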

    "But this is all really a side-show. We have proxies down to decadal and near-annual resolution (ice cores and speleotherms, in particular), and none of them show a global 'spike' signal of this nature. The only reason the question was raised in the first place, IMO and as noted here, is as obfuscation regarding global warming. Current warming is unprecedented in the Holocene, and it is due to our actions - it's not a 'natural cycle'."

    Well, yes, they don't show global anything because they are regional proxies.  I assume you mean stacks of such proxies.  Well, consider the only two high resolution (20 year) proxies used by Marcott et al over the period 8 to 9 Kya:

    In particular, consider the spike of 3 C at 8.39 Kya in the Dome C icecore, which is matched by a 1.5 C spike in the Agassiz-Renland icecore.  That looks suspiciously like a Tamino style spike.  Or consider the 3 C trough at 8.15 Kya in the Agassiz-Renland icecore and the 2 C trough at 8.23 Kya in Dome C.  Although the resolution of these ice cores is actually annual, at 8 Kya plus years the 1 sigma error of age estimate relative to 1950 is greater than 150 years.  Ergo it is quite possible that those two troughs should align, creating a Tamino style trough.

    These two possible candidates do not show up in the mean of the Marcott et al reconstruction.

    It would certainly be possible for somebody to go through and make a maximum variability alignment of individual proxies, as constrained by their dating uncertainties.  If that were done, and no convincing global Tamino style spikes existed in that record, that would be convincing evidence that no such spikes existed.  But until it is done, there is sufficient variability in individual regional proxies that we cannot make the claim that the individual proxies exclude that possibility.
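    As a toy illustration of such an alignment exercise (entirely synthetic series, not the actual ice core data), one can shift one record within its dating uncertainty to maximize the overlap of candidate spikes:

    ```python
    import numpy as np

    # Two synthetic proxies at 20-year resolution sharing one spike, with the
    # second mis-dated by 120 years (well within a ~150 year 1-sigma age error).
    t = np.arange(0, 2000, 20)
    proxy_a = np.where(np.abs(t - 1000) <= 100, 1.0, 0.0)
    proxy_b = np.where(np.abs(t - 1120) <= 100, 1.0, 0.0)

    # Try shifts of +/- 160 years (in 20-year steps) and keep the one that
    # maximizes the coincidence of the two spikes.
    def overlap(steps):
        return np.sum(proxy_a * np.roll(proxy_b, steps))

    best = max(range(-8, 9), key=overlap)
    print(best * 20)   # -120: the best alignment undoes the dating offset
    ```

    With real proxies the "spikes" are noisy and the shifts must be weighted by the dating error distribution, but the principle is the same: within the age uncertainty, apparently separate excursions may in fact coincide.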

  31. Food Security - What Security?

    @ Matt #1

    "Remember North Carolina's 2012 bill that would have required people to only use past trends to predict sea level rise?"

    Can you provide me a reference to that bill?

     

  32. Daniel Bailey at 09:20 AM on 6 April 2013
    The Fool's Gold of Current Climate

    It is the mark of wisdom to also consider the Temperature Change portion of Andy's graphic above.  And note that this is the result of a complete cessation of human-sourced emissions (brought to zero and held there for 300 years).

    Questions, anyone?  Bueller?

  33. The Fool's Gold of Current Climate

    william: What comes out of his talk is that if we stop putting carbon dioxide into the atmosphere, it could reduce in the atmosphere rather rapidly.

    Actually, if we had stopped emissions dead in 2010, the atmospheric concentration of CO2 would have declined to about 340 ppm by 2300. For comparison, that's the amount it was in 1980. The CO2 will reduce, over 190 years, at approximately one-sixth of the rate at which we are currently increasing it. That's the most drastic case imaginable. Graph below is from Serendipity.

  34. Food Security - What Security?

    PS: Lester Brown has *lots* of raw data for his book 
    "Full Planet, Empty Plates; The new geopolitics of food scarcity"
    on his website (as excel files): http://www.earth-policy.org/data_center 

  35. Food Security - What Security?

    Not everybody thinks we will have 10 billion humans: Jorgen Randers (Club of Rome) predicts a peak of 8 billion, due to lower fertility in cities; see the short introduction to his report to the Club of Rome, "2052, a global forecast for the next 40 years", picking up after 40 years of "Limits to Growth": http://www.clubofrome.org/?p=703 . Randers says that this lower-than-generally-assumed population will cause lower growth than expected, and a push back of the more catastrophic climate change effects to the second half of the century (if nothing is changed, which is what he explicitly assumes after 40 years of environmental activity with limited success ...).

    It is also interesting to view the three videos given at the Smithsonian Institution for the 40 years of "Limits to Growth", with each of the three speakers (Meadows, Randers, Brown) giving a different priority to the three dangers from "Limits to Growth": resource scarcity (oil, ...), pollution (climate change), and population (food, water, ...).

    Dennis Meadows (Oil; Resource Scarcity):
    http://www.youtube.com/watch?v=f2oyU0RusiA
    Jorgen Randers (Climate Change; Pollution):
    http://www.youtube.com/watch?v=ILrPmT6NP4I
    Lester Brown (Food+Water; see his book http://www.earth-policy.org):
    http://www.youtube.com/watch?v=KPfUqEj5mok

  36. Matt Fitzpatrick at 07:26 AM on 6 April 2013
    Food Security - What Security?

    The "10 billion by 2065"(pdf) projection appears to have been made by the U.S. Census Bureau under the Bush administration in 2004, based on 2002 data, in a report with zero mentions of climate change, and zero references to climate change publications. This leads to eyebrow raising predictions, like Chad being among the fastest growing nations on Earth through 2050, more than tripling its population, even as Lake Chad shrinks to a record minimum in the west and desertification creeps into the east.

    Remember North Carolina's 2012 bill that would have required people to only use past trends to predict sea level rise? This Census Bureau report reminded me a lot of that.

    So it'll be interesting to see what effects climate change predictions will have on the Census Bureau's next world population projection revision - assuming it doesn't ignore climate change this time. Surely projected birth and mortality rates should change, at least on a regional basis. Combined with migration triggered by climate change, I'd expect the distribution of population growth to be on a different track, and perhaps even the total population curve.

  37. The two epochs of Marcott and the Wheelchair

    Thanks KR, I will have a look at them.

  38. The two epochs of Marcott and the Wheelchair

    Dissident - NOAA has a great number of climate reconstructions here, most with the data tables. Others can be found with a simple Google Scholar search, such as on "Holocene temperature reconstruction". 

    Available Holocene reconstructions include those based on ice cores, pollen deposition, speleothems, alkenones, benthic foraminifera, corals, and so on. 

  39. grindupBaker at 06:21 AM on 6 April 2013
    The Fool's Gold of Current Climate

    @Michael Whittemore(20) "...a little warming from induced CO2 could be a good thing, which Dana also seems to suggest", so there's the obvious corollary that humans this century using every last drip of coal & problem-recovery unconventionals is a tad selfish now we know that future humans might have used it to mitigate a glacial when it started rather than having it dissolved uselessly in oceans after a long hot spell.

  40. The Fool's Gold of Current Climate

    It is far too flippant to call Riley's comments greenwash.  If one can take his points as being valid, and I see no reason why he would lie about them, this is the other side of fossil fuels.  We have increased growth of plant life due to increased carbon dioxide in the atmosphere, and a reduced use of resources from nature which was to the detriment of the natural environment.  It is still probably a valid point that if we continue to put ever increasing amounts of carbon dioxide into the atmosphere, we will very likely cause a climate flip, and that would likely be disastrous.  What comes out of his talk is that if we stop putting carbon dioxide into the atmosphere, it could reduce in the atmosphere rather rapidly.  The other point is that the more we switch to wind, solar and other energy sources such as wave and tidal currents, the better off we will be all around.  We won't be testing the theory of sudden climate change with gay abandon, and we won't be putting a strain on the woodpeckers.  I think we should take his comments seriously, examine them as scientists should, and combine them into a bigger picture.

  41. The two epochs of Marcott and the Wheelchair

    KR @ comment 43 gave the best debunking of these 'magic spikes' in temperature: there are other proxies, from ice cores, which show that (in polar regions at least) there have been no such variations - if there were, where are the peer reviewed papers demonstrating them? Or would a 'magic spike' occur everywhere else except the polar regions? From my relatively limited yet growing understanding of the Earth's climate, that would require the total decoupling of the polar regions from the rest of the Earth's climate systems (even down to a physical barrier reaching to the mesosphere, since those proxies are based upon the isotope ratios of oxygen, carbon dioxide, etc.). It would be nice to see a comparison between the proxies. Are there any that can be cross linked to?

  42. Bob Lacatena at 04:16 AM on 6 April 2013
    The two epochs of Marcott and the Wheelchair

    Steps to Denial:

    1. Marcott does not prove that there could not have been previous spikes equivalent to current warming
    2. We don't know enough about the climate system, so there could have been some magical climate force that did cause a previous spike, just like today's
    3. CO2 is not causing the current warming, even though they can't find an alternative explanation (and this is presumably supported by the fact that magical, invisible past spikes in warming do, in fact, maybe-exist).
    4. CO2 would not cause any warming, even though our understanding of physics, the atmosphere, and evidence of past climates all show definitively that it would.

    So, the deniers deny:

    1. That Marcott demonstrated a lack of similar episodes
    2. That we know enough to recognize why such episodes are at best unlikely
    3. That CO2 is causing the current warming (it's due to mumble-mumble-mumble).
    4. That CO2 should cause any warming.

    For deniers to support their position, they must prove:

    1. That the Marcott paper is invalid.
    2. That at least one previous, similar spike exists
    3. That there is an alternative, physical cause for current warming
    4. That CO2 is not causing current warming
    5. That there is some reason why CO2 would not cause any warming
    6. That there is some mechanism that would bring temperatures back down as fast as they have risen

    And yet they seem to be having a rather hard time with proof #1.

    Does anyone else see how ludicrous this is?

  43. Philippe Chantreau at 04:10 AM on 6 April 2013
    The two epochs of Marcott and the Wheelchair

    Actually KR, the argument from the fake skeptic side seems rather to be that there could be spikes at any time that were due to yet unknown forcings such as leprechauns, ballybogs or grogokhs...

    These of course are powerful enough to make temps rise very fast. Then they are also of such nature as to suppress feedbacks so temperatures can also come down very fast when they decide to go to bed after a hard half century of work, thus leaving no trace whatsoever in the proxy records. Sha-zam...

    You can't make this stuff up...

  44. Rob Honeycutt at 04:01 AM on 6 April 2013
    The two epochs of Marcott and the Wheelchair

    KR...  They also lack any sort of mechanism that might actually cause such spikes.  

    I've only been watching this discussion in a passing manner but the arguments against Marcott seem to me to be something akin to fanatical navel gazing (i.e., rather meaningless).

  45. Daniel J. Andrews at 03:40 AM on 6 April 2013
    The Fool's Gold of Current Climate

    Andy Skuce already posted the article I was going to post. And CBDunkerson just highlighted the same point I wanted to highlight. And mandas gave the seasoned experienced wildlife biologist perspective, which is what I wanted to do. Then dhogaza trumped me with a couple of comments regarding Terranova's graph, and what Terranova will learn when he starts his new Masters.

    I'm feeling quite redundant and rather useless here---think I'll go comment on a "skeptic" site where I'm outnumbered but at least can post relevant scientific information before anyone else. :)

  46. The two epochs of Marcott and the Wheelchair

    Rob Honeycutt - "If there were such spikes, would that not be an indication of extremely high climate sensitivity?"

    Yes, that would. Which is one more reason I find the 'skeptic' postulation of such spikes very odd - extremely high sensitivity means AGW would be even worse than predicted. But hey - "A foolish consistency is the hobgoblin of little minds" - given the dozens of posts denigrating this paper on the 'skeptic' blogs, a wee bit 'o self-contradiction is apparently no impediment to trying to undermine a paper they don't like. One clearly and graphically showing that we've changed the climate...

  47. Rob Honeycutt at 03:27 AM on 6 April 2013
    The two epochs of Marcott and the Wheelchair

    Here's a question.  Sorry if someone else brought this up already.  So, part of the idea here is whether or not there could be spikes in global temperature over the course of the Holocene, somewhat proportionate to what we see in the 20th century, that do not show up in the Marcott graphs due to methodology.  Right?

    If there were such spikes, would that not be an indication of extremely high climate sensitivity?

  48. The two epochs of Marcott and the Wheelchair

    Tom Curtis - I see a single 0.45 C spike at ~5.75 Kya, and a few more at ~7 Kya. A Monte Carlo perturbation of proxies with an embedded spike of 0.45 C would show (in the space of all possible realizations) a distribution of realizations with spikes at that value (the nominal unperturbed realization), below it (blurred by perturbations that reduce the overlay), and above it (where unrelated short term spikes get perturbed under the larger one). 

    Hence I would consider that a single realization at that point represents a value that is too high for the data, an outlier, that consists of perturbations that happen to overlay variations in a reinforcing fashion at that point. 

    A more interesting period IMO is the range of ~1.8-0.2 Kya, showing a distribution of realizations with an approximately 0.15 C rise and a following drop. That is the kind of blurred pattern (although rather small) I would expect from previous Holocene reconstructions indicating a variation of ~0.3 C at that point - encompassing the MWP and LIA. 

    Holocene Temperature Variations (GlobalWarmingArt)

    Again: In perturbed Monte Carlo reconstructions the possible space of realizations must include original level excursions, plus realizations below (many of these due to blurring) and above (a few due to stacking, which in fact permits full level realization in the presence of date errors) - a distribution. Again, I do not see any possible century-level 0.9 C spikes in the Marcott realization set. 

    The only possible way for such a spike to be missed is if it wasn't in the data at all - and given the data (proxy sampling minima: 20 years, maxima: 530, median: 120) a two century excursion would have been sampled. 
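    A quick synthetic check of that sampling argument - a hypothetical 0.9 C, two-century spike, 120-year sampling, and ~150 year (1 sigma) dating errors; none of this is the actual proxy data:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical "true" annual temperatures: a 0.9 C spike lasting 200 years.
    years = np.arange(0, 2000)
    true_T = np.where(np.abs(years - 1000) <= 100, 0.9, 0.0)

    # Mimic a proxy stack: sample each record at 120-year resolution with a
    # random phase, shift its age scale by a ~150-year (1 sigma) dating error,
    # interpolate back to annual steps, and average all the records.
    n_proxies = 1000
    stack = np.zeros_like(true_T)
    for _ in range(n_proxies):
        phase = int(rng.integers(0, 120))
        sample_t = np.arange(phase, 2000, 120)
        age_err = rng.normal(0, 150)
        stack += np.interp(years, sample_t + age_err, true_T[sample_t])
    stack /= n_proxies

    print(round(stack.max(), 2))   # stacked peak is roughly half of 0.9 C
    ```

    The spike is attenuated in the stack, but it does not vanish: a two-century, 0.9 C excursion would still be clearly visible in the realizations.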

    ---

    But this is all really a side-show. We have proxies down to decadal and near-annual resolution (ice cores and speleothems, in particular), and none of them show a global 'spike' signal of this nature. The only reason the question was raised in the first place, IMO and as noted here, is as obfuscation regarding global warming. Current warming is unprecedented in the Holocene, and it is due to our actions - it's not a 'natural cycle'. 

  49. The two epochs of Marcott and the Wheelchair

    KR @39, I can accept your first point.  However, consider the spike above the cloud at about 5.75 Kya on the 1000-realization spaghetti graph.  Just what were the mean values for the thousand years before and after that spike in that realization?  Clearly you cannot say, as it is hidden in the spaghetti.  So, for all we know, it could be a spike of about 0.45 C that would be all that is shown of a 0.9 C spike, based on Tamino's analysis.  Indeed, given the attenuation of the spike simply from the low resolution of some proxies, if that spike is a fluctuation from the mean of the series, it reproduces Tamino's "unperturbed" example.

    That is the problem.

    Unfortunately, I still do not know why Tamino's examples produce greater variability than do Marcott et al's actual reconstruction; and not knowing that, I do not know that the difference which causes it would not also smooth away a brief, high amplitude spike.

  50. The two epochs of Marcott and the Wheelchair

    MrPete - The fact that our current temperature rise will take thousands of years to reset is not from the Marcott et al paper, but rather from basic physics and the atmospheric concentration lifetime of CO2 (Archer 2005, as but one resource). Even once the atmosphere and oceans equilibrate in CO2, it will take thousands of years for CaCO3 reactions and silicate weathering to draw down the excess. 

    The only circular arguments being made in regards to this paper are those claiming that short-term spikes could have been missed by the Marcott analysis (despite there being no physical basis for such up/down spikes, nor evidence for them, and despite high-resolution proxy evidence against such spikes), and that therefore current warming is natural and nothing to worry about. That's entirely circular, unsupported, and nonsensical. 

© Copyright 2024 John Cook