Recent Comments


Comments 20201 to 20250:

  1. SkS Analogy 1 - Speed Kills

    " Slam on the breaks"  should be " Slam on the brakes"

    Moderator Response:

    [JH] Glitch corrected. Thank you for pointing it out.

  2. Increasing CO2 has little to no effect

    Is this post system working? My last three tries have not produced a post after Tom Curtis @ 300.

    Moderator Response:

    [JH] Yes, the system is working. All of your comments have appeared.

    [RH] What's probably happening is, he's posting at the end of one page and the comment is showing up on the new following page. That one trips people up from time to time.

  3. Increasing CO2 has little to no effect

    Tom Curtis @ 300  I've been trying to post a physics question, but the page won't update with my submission, so I'll just use English. Does the forcing equation arise from fundamental physics to your knowledge, or is it something from curve-fitting efforts?  Is it possible to put partial differential equations in this text-box?

    Moderator Response:

    [JH] All of your prior posts were visible. 

  4. Science of Climate Change online class starting next week on Coursera

    curiousd: In the context of your commentary, the following caught my eye... 

    39-year-old drawing hints at what the Event Horizon Telescope may have just captured: the true shape of a black hole

    What Does a Black Hole Really Look Like? by  Amanda Montañez on April 17, 2017

  5. Increasing CO2 has little to no effect

    Tom Curtis @300. Thank you. Can you tell me if the formula for radiative forcing is a curve-fitting equation (ie, without the sensitivity factor), or is it derived from fundamental physics? I can't seem to locate something that ought to be related, say, to the partial derivative of enthalpy wrt T at constant p. Have you seen something of the sort? My apologies if this is a duplicate. The page did not update with my submission, until I took the symbols out.

    Moderator Response:

    [JH] They all appeared. Two of the duplicate posts have been deleted.

  6. Increasing CO2 has little to no effect

    Tom Curtis @300.  Thank you.  Can you tell me if the basic equation is a curve-fitting equation (ie, without the sensitivity factor), or is it derived from fundamental physics?  I can't seem to locate something that ought to be related, say, to the partial derivative of enthalpy wrt T at constant p, (∂H/∂T)p.  Have you seen something of the sort?  My apologies if this is a duplicate.  The page did not update with my submission.

    Moderator Response:

    [JH] Your prior two duplicate posts have been deleted.

  7. Yes, we can do 'sound' climate science even though it's projecting the future

    Lamar Smith: "Anyone stating what the climate will be... at the end of the century is not credible"  Then Chairman Smith is not credible, since the argument for doing nothing about fossil emissions is based on a prediction that climate in 2100 will be unaffected by it.  This 'pushback' argument is not made often enough: the 'do nothing' alternative is still a course of action based on a prediction (that despite doing nothing, everything will be OK).  On what is that prediction based?  History?  Intuition?  Madam Costanza's crystal ball?  No rational course of action, or inaction, is made without an estimate of its future impact.  Since all courses of action, or inaction, require such future predictions, why are only the predictions of the climate scientists being questioned?

    Deniers, when questioned on this, will often appeal to history: 'climate has always changed naturally over the course of Earth's history'.  This is a non sequitur: if Smith shot his neighbor's dog, his defense can't rest on the observation that most dogs throughout history died naturally.  Besides: name something that hasn't changed naturally over the course of Earth's history.

    Rarely, deniers will reveal something closer to the heart of their objection, as when Dr. Roy Spencer said "Earth and its ecosystems — created by God’s intelligent design and infinite power and sustained by His faithful providence — are robust, resilient, self-regulating, and self-correcting, admirably suited for human flourishing, and displaying His glory."  So, that's a prediction of future climate based on 'Everything is going to be OK, because God told me so'.  Personally, I prefer Madam Costanza's crystal ball.

  8. Humans on the verge of causing Earth’s fastest climate change in 50m years

    Even if humans don't burn all the fossil fuels, the rapidly thawing permafrost could boost CO2 to that higher level, or to the GHG equivalent since that would add a lot of methane to the mix.

  9. Science of Climate Change online class starting next week on Coursera

    Another thing....I should call my corrections one and two "suggested supplements" to MILA. The problem of the user deducing an emissivity of 0.92 would be solved simply if, someplace on the screen that the user sees when the website is opened, a statement were made that the emissivity used is 0.98.

    I first brought this up at Science of Doom, and Steve Carson was the one who suggested working up a revised MILA estimate for the clear sky OLR. The other S of D responder posted that although he had used MILA he had never thought of investigating the button for the underlying program.

    I will acknowledge Steve Carson when I contact David Archer. Again, S of D explains the diffusivity approximation at length, and recommends it in the section that derives the symbolic solution for the upward stream of the two-stream solution to Schwarzschild's equation.

    Finally, in post 38 above, the statement is made "I have tracked down this Schwarzschild Equation and, rather than the one that explains the wobbles of the planet Mercury, it is a rather mundane equation that I didn't appreciate had a name and which is explained here."

    1. The S.E. used in the present context was first used in elucidating the physics of the Sun; it is not:

    2. The first solution to the general theory of relativity, i.e. the "Schwarzschild radius" of a black hole. And of course general relativity is what describes the precession of the perihelion of Mercury, and that discovery was independent of Schwarzschild.

    3. I guess this thread is the first time you have heard of the Schwarzschild Equation, M.A. Rodger? The equation may look simple, but I found it not so simple to apply in practice. There is a subtlety in the application of the boundary conditions, IMO.

    Moderator Response:

    [JH] A reminder, MILA = "Modtran Infrared Light in the Atmosphere"

  10. Yes, we can do 'sound' climate science even though it's projecting the future

    Thank you Kevin for that op-ed. Gavin's TED talk is an excellent complementary explanation, in case you haven't watched it yet. Be wise and source reality from the relevant experts rather than from fake news makers.

  11. Science of Climate Change online class starting next week on Coursera

    In the above, I state mistakenly "Note that my 1.7 factor is 1.666 or 2.3 rounded off." That is a mistake: my 1.7 factor is 1.666 rounded off. One over 1.666 equals the cosine of 53.96 degrees, and 53.96 degrees rounds off to 54 degrees. One over 1.7 equals 0.5882, and the arccosine of 0.5882 is 53.9 degrees, which rounds off to 54 degrees, which is the angle I chose.

    By going through this exercise I have come to realize that Pierrehumbert's statement, that cos theta = 2/3 would be an equally valid choice as cos theta = 1/2, implies an upper limit for the angle of 60 degrees and a lower limit of 48.1 degrees.

    Grant Petty's text suggests cos theta equals 3/5, whose arccosine is 53.1 degrees.

    Houghton also chooses 3/5 as the ratio, whose arccosine is 53.1 degrees, the same angle recommended by Petty.

    My angle of 54 degrees lies within the range from the 60 degree upper limit to the 48.1 degree lower limit corresponding to page 191 of Pierrehumbert, and is only ~1 degree greater than the 53.1 degrees used by both Petty and Houghton.
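
    For anyone wanting to check these angles, here is a minimal Python sketch, using only the cos theta values quoted above, that recovers them:

        import numpy as np

        # Angles implied by the various cos(theta) choices quoted above.
        for label, cos_theta in (("1/1.7 (this comment)", 1 / 1.7),
                                 ("3/5 (Petty, Houghton)", 3 / 5),
                                 ("1/2 (Pierrehumbert)", 1 / 2),
                                 ("2/3 (Pierrehumbert alternative)", 2 / 3)):
            print(f"cos theta = {label}: theta = {np.degrees(np.arccos(cos_theta)):.2f} degrees")
        # Prints roughly 53.97, 53.13, 60.00 and 48.19 degrees respectively.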

  12. Increasing CO2 has little to no effect

    DrBill @299, the formula for the radiative forcing of CO2 is 5.35  x ln(CO2current/CO2initial).  That is equivalent to 12.32 x log(CO2current/CO2initial).  NOAA gives this, and formulas for the radiative forcing of other greenhouse gases here.

    To determine the equilibrium response to a given radiative forcing, you need to multiply the forcing by the climate sensitivity factor.  That is approximately equal to 0.8 +/- 0.4 °C/(W/m^2), which is what Wikipedia says.
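
    For concreteness, a minimal Python sketch using just the two numbers quoted above (the 5.35 ln(ratio) forcing formula and the approximate 0.8 °C per W/m^2 sensitivity factor), applied to a doubling of CO2:

        import numpy as np

        def co2_forcing(c_now, c_initial):
            """Radiative forcing in W/m^2, using the 5.35 * ln(ratio) formula quoted above."""
            return 5.35 * np.log(c_now / c_initial)

        sensitivity = 0.8  # approximate climate sensitivity factor, C per (W/m^2)

        dF = co2_forcing(800.0, 400.0)   # a doubling of CO2
        print(dF)                        # ~3.7 W/m^2
        print(sensitivity * dF)          # ~3.0 C equilibrium response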

  13. Humans on the verge of causing Earth’s fastest climate change in 50m years

    Tom@7,

    Thanks for the pointer to the source publication of your claim. It was an interesting read. And it confirms that the amount of FF reserves can be higher than I thought beforehand, and that the Wink12K scenario of releasing a 12 EgC anthropogenic CO2 slug is at least theoretically possible, though constraints other than the resource limit can make it unrealistic. I would add that with a CO2 slug that big, the natural feedback of thawing permafrost, and unknown mechanisms that can turn the ocean into a CO2 source, can contribute to even more CO2 release.

    BC@8,

    The conversion of 0.75 when going from forcing to temperature comes from the fact that the Equilibrium Climate Sensitivity used by Hansen is 3 K per doubling of CO2, while doubling CO2 creates 4 W/m^2 of forcing. Hence 3/4 = 0.75.
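
    As a quick check of that conversion, a minimal Python sketch applying 0.75 K per W/m^2 to the RCP forcing labels, reproducing the 3.4 degC and 4.5 degC figures BC arrives at @8:

        # Hansen's conversion described above: 3 K per doubling / 4 W/m^2 per doubling = 0.75 K/(W/m^2).
        ecs_per_doubling = 3.0       # K
        forcing_per_doubling = 4.0   # W/m^2
        conversion = ecs_per_doubling / forcing_per_doubling   # 0.75 K per (W/m^2)

        for name, forcing in (("RCP4.5", 4.5), ("RCP6", 6.0)):
            print(name, conversion * forcing)   # ~3.4 K and 4.5 K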

    But I think not just ECS should be used to determine the temperature evolution on the timescale of the OP graph. More appropriate is Earth System Sensitivity, which takes into account millennial-scale feedbacks such as melting ice sheets and the permafrost thaw I mentioned above. ESS is larger than ECS. So, having not read the OP study, I don't understand why the temps in fig 4 are smaller than ECS. From your eyeballing, it looks more like Transient Climate Sensitivity figures - 2 K/double CO2, which is 50% of the forcing number in W/m^2 - e.g. 3.0 degC for RCP6. Maybe figure h shows just the amount of CO2 released in each scenario and disregards the ocean CO2 uptake, optimistically assuming the strength of the ocean CO2 sink will not be affected till the end of the period shown and the ESS feedback will not happen. With such assumptions you can say that ocean CO2 uptake will largely counterbalance the warming progressing from TCS to ESS on decadal to century timescales, and postulate only a TCS level of warming on that timescale.

  14. Increasing CO2 has little to no effect

    I read a lot of this thread, but not all of it, and found that the discussion had moved on from the basic question of its title to a temperature rise due to a concentration change. This is similar to something from Wikipedia that says the forcing resulting from CO2 is according to 5.65K*log CO2(1)/CO2(0) = change in temperature. Is this the understanding here, or has wiki got it wrong? Just trying to understand, and if this is the wrong thread, I apologize in advance.

  15. Science of Climate Change online class starting next week on Coursera

    Note to Tom Curtis, comment 37. I had always assumed, for two years or so, that I could never understand anything that button revealed, as it would have something to do with the program source code. It was only when I integrated one of the outputs using a digitized result whose underlying width by accident extended from 5 wn to 2000 wn, got a significantly larger OLR than was given in the MILA output, and then found I was within 2% of the MILA output if the underlying interval was limited to 100 wn to 1500 wn, that I realized something was wrong.

    By user output I meant what the user sees in the window that appears when you click on the URL for MILA. Then I looked at that button and found that the actual integration was limited to 100 to 1500 wn and that the emissivity was 0.98. The way I found this out, and that indeed the information was revealed by that button, was described earlier in the thread, Tom Curtis, which you might have looked at more carefully before accusing me of a false claim.

    Regarding comment 38. Here is how I get the upward intensity as a function of emissivity and altitude.

    For the zero altitude in-band intensity I go to the SpectralCalc black body calculator APP and calculate the in-band radiance for 288.2 degrees and the emissivity I put in. I use a wave number range of 500 to 850 wn, which completely includes the bending mode band of CO2.

    For CO2 the bending mode range is contained within the 500 to 850 wn range. The band from 100 wn to 500 wn is completely transparent if CO2 is the only greenhouse gas, which I also put into SpectralCalc. The same is true for the window between 850 wn and 1500 wn. Therefore, since CO2 is completely transparent from 100 wn to 500 wn and from 850 wn to 1500 wn, I also obtain those intensities using the SpectralCalc Black Body Calculator as described above in this post. The upward intensity for a CO2-only atmosphere will be the same at all altitudes for those two outer windows as observed at ground level, and therefore I just add them at higher altitudes to the output I get for the 500 to 850 band.

    For the 500 to 850 band I use the "Atmospheric Paths" APP of SpectralCalc. This gives either transmittance or radiance outputs. Here I use the radiance output. The way this works is that there is a virtual source at ground level, for which you can put in a temperature of 288.2 degrees and an emissivity of choice. Again, I choose an atmosphere with only CO2, major isotopologue. A complication is that the U.S. Standard Atmosphere is used in the scale factors. Therefore a scale factor of one does not correspond to 400 ppm of CO2, since back in the 70s the CO2 concentration was less than this. For 400 ppm you must therefore use a scale factor of 1.212. The U.S. Standard Atmosphere is the default for the SpectralCalc atmospheric path sections.

    Here is my only non-standard step: instead of 400 ppm I use a concentration of 1.7 times 400 ppm, or 680 ppm. This corresponds to a scale factor of 1.212 times 1.7, or 2.06. Why do I do this? I am in effect using the "diffusivity approximation" as described at great length in both Pierrehumbert's text and Grant Petty's text. The idea is that if one does not wish to integrate the output radiance to obtain diffuse flux, one can approximate the result of integrating that radiance by multiplying the in-band radiance by pi and simultaneously replacing all paths involved by a straight-line path going at angle theta relative to vertical. Here I quote from the pages within Pierrehumbert that are not available in the URL http://cips.berkeley.edu/events/rocky-planets-class09/ClimateVol1.pdf

    Quoting from Pierrehumbert p. 191: "...if the radiation field remains approximately isotropic, the decay rate is the same as for unidirectional radiation propagating at an angle theta such that cos theta = 1/2, i.e. 60 degrees..." Then "...the choice of cos theta = 1/2 is by no means a unique consequence of the assumption of isotropy....(under certain conditions) cos theta = 2/3 and this would be an equally valid choice within the limits of the isotropic approximation..."

    Petty states on page 214: "The most commonly used value of (symbol for one over cos theta) is 5/3."

    Note that my 1.7 factor is 1.666 or 2.3 rounded off and my angle is 54 degrees, not 60 degrees.

    Furthermore, in The Physics of Atmospheres by John Houghton, third edition, he states on pp. 11 - 12 that for Schwarzschild's Equation, to a good approximation the intensity may be replaced by the diffuse upward flux if B(T) [B(T) is the symbol for the black body emission per unit solid angle per unit area of a surface at temperature T] is replaced by pi B and the incremental increase in altitude dz is replaced by 5/3 dz.

    At one time I went through the procedure of replacing vertical paths by paths at 54 degrees using a spreadsheet. But I realized that exactly the same result is obtained by keeping the path vertical and replacing the concentration q by 1.7 q. Mathematically this must be the case, since all the expressions for transmittance in Pierrehumbert involve the set of symbols F(q, theta) = q/cos theta. At the bottom of page 229 of Pierrehumbert, his equation for the transmittance between pressures p1 and p2 is one minus the equivalent width, where the equivalent width of a single Lorentzian line is one over delta (delta is the range of wave number considered) times 2 times the square root of the HITRAN line strength for a Lorentzian at the base of the atmosphere times a different "strong line strength" LS. The strong line strength contains the set of symbols F = q/cos theta. This can be expanded to give the Curtis-Godson approximation. I have tested whether the transmittance values I get from SpectralCalc and transform to angled paths at 60 degrees using a spreadsheet are identical to the transmittance of a vertical path with q' = 2q. The agreement is exact!

    Say I chose theta to be 60 degrees. Since for a vertical path cos theta is one, and 2q/1 equals q/cos 60 degrees, what I do is exactly equivalent to the "diffusivity approximation".
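
    As an illustration of what the diffusivity approximation is doing, here is a minimal Python sketch for the idealized case of a grey, non-scattering slab (a simplifying assumption, not what SpectralCalc computes): the exact angle-integrated flux transmittance is compared to exp(-1.7 tau), i.e. a single vertical path with the absorber amount scaled by 1.7.

        import numpy as np

        D = 1.7                                  # the 1/cos(theta) factor discussed above (theta ~ 54 degrees)
        mu = np.linspace(1e-4, 1.0, 200001)      # direction cosines, avoiding mu = 0
        dmu = mu[1] - mu[0]

        for tau in (0.1, 0.5, 1.0, 2.0):         # illustrative optical depths (assumed values)
            exact = np.sum(2.0 * np.exp(-tau / mu) * mu) * dmu   # exact flux transmittance of the slab
            approx = np.exp(-D * tau)                            # diffusivity approximation
            print(f"tau = {tau}: exact = {exact:.3f}, approx = {approx:.3f}")

    The match in this idealized grey case is only approximate, which is the nature of the diffusivity approximation; the exact equivalence described in the comment is between a slant path at theta and a vertical path with the absorber amount scaled by 1/cos(theta).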

    Bottom line....what MILA does is actually integrate over all angles. I use this approximation instead. The technical consultant at SpectralCalc tells me that what I am doing is an approximation to doing this a better way, where he would have used a "fourth order quadrature", whatever that is. He also told me that the same thing is true for the radiant atmospheric path APP as for the transmittance paths in the Schwarzschild Equation, i.e. by multiplying the path length by 1.7, or by multiplying the concentration by 1.7, I can approximate the best way of going from intensity to diffuse flux, which would use a "fourth order quadrature".

    It works!! See post 8 above. The Science of Doom describes the method in their "Greenhouse Effect" section, "The Equations".

    You should also know that if you use atmospheric paths that are too long, you get an error message in SpectralCalc to the effect that you have exceeded the one million point limit. Also, there are angled paths already available in SpectralCalc, but they are real paths which are strongly refracted, and you need idealized paths that go in straight lines to use this approximation.

  16. Yes, we can do 'sound' climate science even though it's projecting the future

    It truly gets a bit absurd. 

    Think about the real reason for doing science.  The evolutionary advantage reason for doing it.  It is ENTIRELY about projecting the future.

    All the explaining, the theory, the curiosity, the analysis, the data gathering, is about understanding what the real world does well enough to predict what the real world will do with enough time and certainty to change things.

    Understanding "why" (which is what science is) allows preparation  and changes to "what" ultimately happens.  It is the root of every benefit of human civilization.    

    Moderator Response:

    [JH] The use of "all-caps" is prohibited by the SkS Comments Policy. The use of bold face font for emphasis is acceptable.  

  17. Daniel Bailey at 23:08 PM on 18 April 2017
    Arctic icemelt is a natural cycle

    As Tom notes, both the Arctic sea ice and Antarctic sea ice are more than 2 standard deviations below the long-term average.  So that point bears repeating.

    Arctic Sea Ice (per NSIDC):

    [Figure: NSIDC Arctic sea ice extent]

    Antarctic Sea Ice (per NSIDC):

    [Figure: NSIDC Antarctic sea ice extent]

  18. Daniel Bailey at 23:01 PM on 18 April 2017
    Arctic icemelt is a natural cycle

    NASA's position on land-based ice sheet mass losses:

    "Data from NASA's GRACE satellites show that the land ice sheets in both Antarctica and Greenland are losing mass. The continent of Antarctica has been losing about 118 billion metric tons of ice per year since 2002, while the Greenland ice sheet has been losing an estimated 281 billion metric tons per year. (Source: GRACE satellite data through 2016)

     

    Greenland Land-based Ice Sheet Mass Losses, per GRACE:

    NASA_GRACE_Greenland

     

    Antarctica Land-based Ice Sheet Mass Losses, per GRACE:

    Antarctica Mass Losses NASA GRACE

  19. Daniel Bailey at 22:50 PM on 18 April 2017
    Arctic icemelt is a natural cycle

    To sum, the earth is losing a trillion tons of ice per year:

    - 159 Gt Antarctic land ice, McMillan et al, GRL (2014)

    + 26 Gt Antarctic sea ice, Holland et al, J Climate (2014)

    - 261 Gt Arctic sea ice, PIOMAS

    - 378 Gt Greenland, Enderlin et al, GRL (2014)

    - 259 Gt other land based glaciers, Gardner et al. Science (2013)

    Total = - 1,031 Gt

    Losses outnumber gains by a ratio of 40:1
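
    For those checking the arithmetic, a minimal Python tally of the figures listed above:

        # Ice budget components listed above, in Gt/yr (negative = loss, positive = gain).
        components = {
            "Antarctic land ice (McMillan et al. 2014)": -159,
            "Antarctic sea ice (Holland et al. 2014)": +26,
            "Arctic sea ice (PIOMAS)": -261,
            "Greenland (Enderlin et al. 2014)": -378,
            "Other land-based glaciers (Gardner et al. 2013)": -259,
        }
        total = sum(components.values())
        losses = -sum(v for v in components.values() if v < 0)
        gains = sum(v for v in components.values() if v > 0)
        print(total)            # -1031 Gt/yr
        print(losses / gains)   # ~40, i.e. losses outnumber gains roughly 40:1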

  20. Humans on the verge of causing Earth’s fastest climate change in 50m years

    Thanks Tom. (BTW - I live in Brisbane too)

  21. Humans on the verge of causing Earth’s fastest climate change in 50m years

    BC @8, the gap appears compressed because of the use of a log scale on the y-axis.  Further, the scenarios are defined by their forcing as at 2100.  RCP 8.5 continues to expand atmospheric concentration long after that, so that its final forcing is significantly greater than 8.5 W/m^2.  RCP 6.0, in contrast, maintains a near-constant forcing after 2100.  Finally, Twink12 is defined by the number of teratonnes of carbon emitted rather than by forcing (as I understand it).

  22. Humans on the verge of causing Earth’s fastest climate change in 50m years

    Digby Scorgie @6, nobody knows, and the time will depend on the rate of fossil fuel burning.  Further, to a certain extent, increased energy resources can be used to counter much of the economic effect of AGW, particularly in highly industrialized areas.  A sufficiently irrational person could greatly extend the time before it became impossible to maintain the technological civilization needed to burn fossil fuels, by burning fossil fuels faster and faster.  (This strategy requires a callous disregard for those whose economic situation isn't so favoured.)

    I do not think the OP argues for so high a benchmark on disruption, ie, that it will significantly impair our ability to burn fossil fuels.  I think it is arguing that at some point the level of economic harm and natural disasters will be catastrophic, potentially to the point of negative economic growth and declining population.  Our civilization can survive low-percentage declines in both; and with it our ability to burn fossil fuels will also be preserved, should we be mad enough.

  23. Humans on the verge of causing Earth’s fastest climate change in 50m years

    Good article. It gives a lot of pertinent info about our situation in some fairly simple graphs. I was surprised to see the RCP4.5 and RCP6 lines so close together, especially in the forcing graph (the second one), so I went back to the SkS post below. And fig 4 in this RCP guide shows that the numbers 4.5 and 6 actually refer to the forcing level (no doubt a lot of readers already knew this). So the gap between 4.5 and 6 seems too narrow when compared to the gap between 0 and 4.5, or am I interpreting this wrong? Perhaps of more interest is that fig 4 also gives temperature anomalies of 2.4 degC for RCP4.5 and 3.0 degC for RCP6, and the CO2 eq figures are 650 and 850 ppm. These temperature figures, while bad, aren't as high as I expected, and give a glimmer of hope. James Hansen in Storms of My Grandchildren estimates a conversion of 0.75 when going from forcing to temp, which gives 3.4 degC and 4.5 degC, which are more concerning.

    rcp-guide-part3-post.html

  24. Humans on the verge of causing Earth’s fastest climate change in 50m years

    chriskoz @5, the figure is from the IEA's Resources and Reserves 2013.  With regard to different estimates, this pyramid of US resources and reserves from the EIA is helpful:

    Converting from short tons to tonnes (ie, metric tons), we have 232 gigatonnes of estimated recoverable reserves, 1514 Gt of identified resources, and a total resource base of 3544 Gt.  They further estimate that the US has 26% of the world's coal reserves, which would indicate global recoverable reserves of 892 Gt.

    For comparison, IEA 2013 estimates 8,130 Gt as the total resource for hard coals and lignite in North America (ie, the US, Canada and Mexico).  As that is more than double the US estimate, and the US has more coal than either Canada or Mexico, clearly the IEA 2013 estimate is larger - as will happen given that it is in part an estimate.  If we assume all North American coal is in the US, and scale the IEA 2013 global estimate so that the North American estimate matches that by the EIA, then the global resource would be 7,500 Gt.  Alternatively, if we scale the EIA US resource using the percentage of global share of the reserve, we get a total global resource of 13,630 Gt. Even on the low value, once you throw in tar sands, oil shale, deep sea oil and gas resources, arctic oil and gas resources, antarctic oil and coal (not included in any of the above), etc, a civilization determined to "burn, baby, burn" regardless of consequences could far exceed the RCP 8.5 scenario; and the Twink12 scenario is a reasonable scenario for such a strategy.
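
    A minimal Python sketch of the two scalings just described, using the EIA US figures above (232 Gt reserves, 3544 Gt total resource base, 26% of world reserves), the IEA 2013 North American figure (8,130 Gt), and the 17,204 Gt IEA global coal resource figure cited at comment 31:

        eia_us_reserves = 232.0        # Gt, EIA estimated recoverable US reserves
        eia_us_resource = 3544.0       # Gt, EIA US total resource base
        us_share_of_reserves = 0.26    # EIA: US holds 26% of world coal reserves

        iea_na_resource = 8130.0       # Gt, IEA 2013 hard coal + lignite, North America
        iea_global_resource = 17204.0  # Gt, IEA 2013 global coal resource (see comment 31)

        # World recoverable reserves implied by the 26% US share:
        print(eia_us_reserves / us_share_of_reserves)                     # ~892 Gt

        # Scale the IEA global estimate so that North America matches the EIA US figure
        # (treating all North American coal as US coal, as in the text):
        print(iea_global_resource * eia_us_resource / iea_na_resource)    # ~7,500 Gt

        # Alternatively, scale the EIA US resource by the 26% global share:
        print(eia_us_resource / us_share_of_reserves)                     # ~13,630 Gt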

    As to the large differences in estimates, that will be in part because they are estimates when we are talking about the total resource.  More importantly, many estimates of the amount of fossil fuels remaining in the ground restrict themselves to reserves, and/or reserves plus identified resources.  I once did a spreadsheet of all the publicly available estimates across coal, oil and gas.  Only a few of the estimates included any oil sands, shale oil or unconventional gas (and even they not all of it), and a range of criteria were used.  Of 14 estimates across 9 sources, only 3 provided estimates that may have represented the Total Resource, with estimates of 3000 GtC (World Energy Council 2010, "possible"), 11,000 GtC (S&W 2011, "TRB") and 16,000 GtC (IEA 201, "estimated").  (Note, the values quoted are my estimate of the carbon content, allowing for amounts not oxidized due to spills, etc, and for the carbon content of the fuel, based on the original resource and/or reserve estimates.)  I do not think the WEC "possible" estimate is an estimate of the TRB, but rather an estimate of what part of that base could become economically accessible.  The other estimates nearly match (S&W) or significantly exceed the Twink12 requirements - and all with limited examination of unconventional sources, and without allowing for LUC and cement emissions.

  25. Digby Scorgie at 17:43 PM on 18 April 2017
    Humans on the verge of causing Earth’s fastest climate change in 50m years

    Is the following a fair summary?

    On the one hand, we have a continuous increase in the burning of fossil fuels.

    On the other hand, we have the increasingly damaging effects of climate change.

    At some time in the future, climate effects will damage human civilization sufficiently to disrupt the burning of fossil fuels, resulting in a rapid decline in such burning.

    If this summary is valid, the obvious question is: When?

  26. Humans on the verge of causing Earth’s fastest climate change in 50m years

    Tom@2,

    Apology for my typo @1. I typed 5Pg and 10Pg (petagram == gigatonne), but I meant 5Eg and 10Eg (exagram), which is 1000 times more.

    Where did you find your number "17,204 Gt of coal resources" and 728 Gt reserves? I searched various IEA publications but cannot find your numbers.

    From Wikipedia, the world's proven C reserves are 909,064 Mt. Such numbers obviously change as new discoveries are made and extraction methods (e.g. hydraulic fracking) improve.

    The resources number can be very fluid and change depending on the definition of "resources". The one from IPCC AR4, e.g. as shown among other C reservoirs in the OA not OK series, shows only 3700 GtC of all fossil fuel resources.

    Archer 2005, 2007 etc, which we both know very well, considers only a 5000 PgC slug in their model. Accordingly, David teaches 5000 PgC to be the most likely slug if all FF are burned (a version of your "world where Trump is President of the US"). That number is still nowhere near your number of 17 Eg+ of coal alone.

    So I wonder where these large differences between the various estimates of FF reservoir size come from.

  27. Humans on the verge of causing Earth’s fastest climate change in 50m years

    green tortoise @3, even in a wink12 scenario, GMST will rise by about 10 C (the temperature at which wet bulb temperature emergencies become endemic in the tropics) only around 2200.  By then, the vast majority of the fossil fuels will have been burned, and if on that pathway, it would still take several decades to convert to renewable sources.  Further, even then, a nation determined to "Put America First", or "Put Europe First", rather than putting the globe first, and which was determined to adapt to climate change by massively increased energy use powered by fossil fuels, could still power on regardless.

    Do I think it likely that governments will be persistently that stupid?  No!  But while a large number of influential think tanks, and several major governments, continue to push "Burn, baby, burn", it is a scenario that ought to be included to show the real consequences of their policies.  That is, in addition to scenarios that undershoot BAU (RCP 6) to guide those who want a sensible response to AGW, we need scenarios that overshoot it as a warning to those who do not (or those who do, but might be tempted by the massive PR campaign for fossil fuels being run by the likes of WUWT).

    As an aside, I suspect you were tongue in cheek, but the quote was:

    "You can fool all of the people some of the time, and some of the people all of the time, but you can't fool all of the people all of the time."  (Quoted from memory.)  The author was Abraham Lincoln, the first Republican President.  To that, pro-fossil fuel lobby and the most recent Republican President has added the addendum, you can fool enough of the people enough of the time that you can sideline those who see through your nonsense.

  28. Arctic icemelt is a natural cycle

    jfrantz @64, the linked source is describing sea ice for both Antarctica and the Arctic.  Antarctica has a fringe of sea ice that is preserved even in summer, though with minimal extent (about 4 million km squared).  In winter it becomes very extensive, exceeding the Arctic sea ice maximum in extent, mostly because it can extend into open ocean.  In terms of sea ice extent, the reduction in Arctic sea ice has been far greater, as shown in the graph in the section on Antarctic sea ice extent in the linked source.  That graph, however, only extends to Dec 2012.  A more recent graph shows the Antarctic sea ice extent anomaly to have declined astonishingly over the last two years, reaching Arctic (and hence negative) levels by Dec 2016.

    The low extent anomaly has continued into 2017.

    With regard to sea ice volume, we are primarily dependent on models, as there are insufficient depth measurements of the ice to provide a region-wide, continuous time series.

    A region-wide model for Antarctic sea ice volume was reported in 2014, showing sea ice volume to 2010. The trend of 28.7 km^3/yr compares to Arctic trends of -260 km^3/yr in April and -320 km^3/yr in September.

    Obviously the recent massive retreat in Antarctic sea ice extent will also have been reflected in a retreat in Antarctic sea ice volume, but as we do not know to what extent it has been matched by a reduction in sea ice thickness, we do not know by how much.

    Finally, if Al Gore did say that "arctic ice is floating and antarctic ice is on land", that is misleading (at least out of context).  The peak Arctic sea ice volume was about 33,000 km^3 in April (the time of maximum volume).  That is dwarfed by the 2,900,000 km^3 volume of the Greenland Ice Sheet.  Looked at differently, the 14,000,000 km^2 area of the Antarctic continent, while five times the minimum Antarctic sea ice extent, is about 80% of the maximum Antarctic sea ice extent.  Whether we look at either volume or extent, both polar regions are a story of land ice and sea ice.  However, Antarctica is land ice surrounded by sea, while the Arctic is sea ice surrounded by land.  That makes a very large difference with regard to the rapidity of temperature responses, and the rapidity of albedo changes with warming or cooling, both being much faster in the Arctic.  With regard to sea level rise, however, both poles are a land ice story.

  29. green tortoise at 15:25 PM on 18 April 2017
    Humans on the verge of causing Earth’s fastest climate change in 50m years

    I also don't think that 12000 Pg of fossil carbon could be liberated, mainly because, even including the Trumps, Putins et al. pro-fossil fuel politicians and regimes, the human devastation would be so huge that humanity would be destroyed well before that happens.

    As an example, most of the Middle East, which has some of the biggest (and cheapest) oil & gas resources, would cross the 35°C (wet bulb temperature) boundary of human survival. Most people die of high fever above that temperature. Imagine hundreds of millions of Arabs, Indians, Iranians and Africans storming Russia, China and Europe, escaping the uninhabitable landscapes behind them.

    Left behind would be disaster zones, surely taken over by extremist groups that would certainly not do good maintenance of the legacy oil & gas facilities, just as has happened in the areas occupied by ISIS/ISIL/DAESH today, especially if bombed by the Putins and Trumps.

    Given that human component, I don't think we can pass 6°C without destroying the fossil fuel infrastructure (either by climate disasters, political violence or global war).

    This is a negative feedback: given some climate change, the planet becomes deadly for most people, and the ensuing collapse prevents further fossil fuel burning. It is, however, the worst kind of negative feedback imaginable, because the biosphere is saved by killing us off as if we were some kind of deadly virus or bacteria.

    There is a catch, unfortunately: given the huge amount of carbon stored in shallow soil, permafrost and gas hydrates that could be destabilised by warming, maybe once the human emissions (and the human population) approach zero, the "natural" emissions thus triggered could push the planet to a state not seen since the Snowball Earth meltdown (i.e. Tmean = 50°C) in the Precambrian, killing everything that is not a microorganism.

    If the Sun has warmed enough since the Snowball Earth meltdown, maybe even a moist greenhouse could be triggered, which would end only when the carbon is sequestered by the flash cap-carbonate reaction. If that is not enough to compensate for the increasing water vapor greenhouse effect, the greenhouse state will last until Earth has lost most of its water to space, leaving a desert planet behind (I doubt, however, that Earth is vulnerable to a runaway greenhouse like Venus).

    I however have strong hope that nothing like that will happen, because "people can be all stupid sometimes, always there is some stupid people, but people cannot be all stupid alltimes" ( I don't remember who wrote that, any idea?).

    After all, hope is what moved a lot of brave people against very adverse events in the past, and hope is badly needed to face this crisis.

    By the way, I hope you have had a nice Easter.

  30. Arctic icemelt is a natural cycle

    Serious and, I think, well-founded question here.

    In the source linked, http://nsidc.org/cryosphere/sotc/sea_ice.html, they describe that Antarctic sea ice has been increasing during this time period while Arctic sea ice is decreasing. In terms of standard deviations from the mean, the Arctic decrease has been twice as sharp, but:

    1. In terms of volume of ice, are we net gaining or losing ice at the poles?

    2. Since arctic ice is floating and antarctic ice is on land (pulling this from Inconvenient Truth), should the antarctic ice be the main concern when it comes to rising sea levels? Or is that outweighed by the specific areas growing or shrinking (e.g. growing areas are sea ice, shrinking are land based)?

  31. Humans on the verge of causing Earth’s fastest climate change in 50m years

    chriskoz @1, while the IEA only estimated 728 Gt (= 728 Pg) of coal reserves in 2010, they estimated 17,204 Gt of coal resources.  The difference is that, for reserves, they are recoverable given current mining technology and they are also economic to recover at current prices.  Resources in excess of reserves are also recoverable given current technology, but are not economic to recover at current prices.  The transition from resource to reserve can be quite rapid given changes in technology and/or increased demand.  It follows that the Twink12K (ie, 12,000 GtC) is well within the limits of fossil fuel resources, and whether it is within the limit of fossil fuel reserves is a matter to be proven by future technological and economic developments.

    That leaves aside emissions from LUC and cement manufacture.

    It is unlikely that we would be able to burn that much fossil fuel before the effects of climate change made further burning of fossil fuels politically infeasible (if not necessarily destroying the infrastructure, which is no more vulnerable than any other part of our civilization's infrastructure).  In a world where Trump is President of the US, however, there are no guarantees.

  32. Humans on the verge of causing Earth’s fastest climate change in 50m years

    The " Wink12K scenario" is based on this article, speculating the realease of 10Pg anthropogenic C, is the first such scenario I've seen. Apart from being unrealistic (only 5PgC of recoverable FF reserves have been estimated), I think homo sapiens would be technically unable to burn it before the transit climate change effects wiped off or seriously crippled the whole FF burning infrastructure. Then, it comes the increasing awareness that will put more pressure to curb the burning in the future, with possibly other/renewable energy sources replacing it. So, we can safely cross that Wink12K scenario as impossible.

  33. Human CO2 is a tiny % of CO2 emissions

    Pattio: The airborne fraction of CO2 has been fairly constant, despite the growth in the rate of anthropogenic CO2 emissions. Therefore the natural sinks are not static. That determination has been made by scientists who, therefore, do not in reality believe the sinks are static.

  34. CO2 effect is saturated

    Hope this is the right place to point out that Figs 1 & 2 have disappeared from the 'Advanced' article, apparently after Altervista suspended their hosting. They're still available in the PDF and via the Wayback Machine.

    Moderator Response:

    [DB]  Updated.  Thanks for the heads-up!

  35. 2017 SkS Weekly Climate Change & Global Warming News Roundup #15

    Please don't misunderstand what India is doing. We have a religion-based fascist government in power. They are painting the color of religion everywhere, so all who are affected by that propaganda react with a religious theme. Making just one river sacred is a game of divide and rule and of killing innocent people.

  36. Human CO2 is a tiny % of CO2 emissions

    Pattio: as Michael says, please do provide a reference to support your claim that others hold the position that sinks are static.

    The sources that I am familiar with (e.g., the IPCC) pretty clearly recognize that about half of what is emitted to the atmosphere (by burning fossil fuels) is absorbed by the oceans and biosphere (the "sinks"), which directly contradicts two of the claims you made in your opening paragraph:

    1. ...that others claim the sinks are static (unsupported because others feel that sinks have increased to absorb half of what is emitted)
    2. ...that your argument counters the claim that the sinks cannot process the increase in emissions (they can't, as evidenced by the fact that they can only process half, with the other half still residing in the atmosphere).
  37. Science of Climate Change online class starting next week on Coursera

    curiousd @32.

    I have tracked down this Schwarzschild Equation and, rather than the one that explains the wobbles of the planet Mercury, it is a rather mundane equation that I didn't appreciate had a name and which is explained here. Frankly, I do not see how it could be used with the UoC MODTRAN model in a way that would lead to a misapprehension that emissivity=0.92. As Tom Curtis points out @37, the emissivity value is set out clearly in the output file which would be required reading for any "serious user."

    Concerning the effect of using a single MODTRAN calculation as a global average, further rough calculations suggest to me that the impact of latitude and the annual & diurnal cycles does appear to provide enough additional flux above that single average figure to add 1% to 3% to the total. While these rough weighted calculations remain very sensitive to the assumed weightings, the outcome appears quite clear. Tom Curtis @35 adds weight to this finding. Add in the truncated frequency range used in the UoC model analysis and the flux values become very close to the Chen et al (2013) values.

    Of course this does not mean there is no room for other approximations with significant effect on the result within the UoC model, or even errors. But if you wish to establish the potential of your "Correction Number Two" and your "Correction Number Three" you will need to present due reason. So far, that has not been done. And failing to explain your ideas, for instance the derivation of the "One km IR Radiance" values in the graph you presented up-thread (ie this graph), is not winning you any favours.

  38. 2017 SkS Weekly Climate Change & Global Warming Digest #15

    Joe: I posted the abstract of the paper you had linked to in hopes that you would actually read it. If you had, you would see that the research focused on a specific location — Bunaken Island (North Sulawesi), Indonesia — with unique sea level rise conditions and for a specific type of coral. For you to suggest that the results of this single study can be extrapolated worldwide is absurd.

    The first of the two articles that I linked is not an advocacy article. You actually might learn something by carefully reading it.

  39. michael sweet at 01:34 AM on 18 April 2017
    Human CO2 is a tiny % of CO2 emissions

    Pattio,

    Can you provide a reference for your claim that someone says sinks are static?  I am under the impression that most of the sinks and sources of carbon respond to changes in the environment around them.

    While your article is interesting, it is clear from the measured increase in CO2 in the atmosphere that natural sinks have not been able to absorb all the CO2 humans release.  That may change in the future, although it is not clear if the sinks will increase or decrease.

  40. 2017 SkS Weekly Climate Change & Global Warming News Roundup #15

    @Joe: I should have been more circumspect with my terminology, and avoided "personhood" which has been a more contentious term. Notwithstanding, corporations and other legal entities are considered "legal persons" by the law and are the subject of legal agency, and it is on the basis of this legal agency and the First Amendment rights that inhere in it, that the Supreme Court granted them the right to political contributions and action.

    Common sense would dictate that a legal person has "personhood", but this is clearly not the case, as everybody recognizes that a "legal person" is no more than a legal artifice. Although universally recognized and employed, the concept of legal "persons" has been criticized in recent decades because it often enables the persons behind such social organizations "to get away with murder."

  41. Human CO2 is a tiny % of CO2 emissions

    The following study published in Nature, April 5th 2017, shows a 31% ± 5% plant growth since the beginning of the industrial revolution. This would counter the claim that "sinks" are static and cannot process the comparatively tiny increase in carbon emissions due to human activity.

    Large historical growth in global terrestrial gross primary production http://dx.doi.org/10.1038/nature22030

    Large historical growth in global terrestrial gross primary production:  "Growth in terrestrial gross primary production (GPP)—the amount of carbon dioxide that is ‘fixed’ into organic material through the photosynthesis of land plants—may provide a negative feedback for climate change. It remains uncertain, however, to what extent biogeochemical processes can suppress global GPP growth. As a consequence, modelling estimates of terrestrial carbon storage, and of feedbacks between the carbon cycle and climate, remain poorly constrained. Here we present a global, measurement-based estimate of GPP growth during the twentieth century that is based on long-term atmospheric carbonyl sulfide (COS) records, derived from ice-core, firn and ambient air samples. We interpret these records using a model that simulates changes in COS concentration according to changes in its sources and sinks—including a large sink that is related to GPP. We find that the observation-based COS record is most consistent with simulations of climate and the carbon cycle that assume large GPP growth during the twentieth century (31% ± 5% growth; mean ± 95% confidence interval). Although this COS analysis does not directly constrain models of future GPP growth, it does provide a global-scale benchmark for historical carbon-cycle simulations."

    Moderator Response:

    [DB] As others have noted, you will need to furnish a source citation for this claim:

    "This would counter the claim that "sinks" are static"

    Hotlinked DOI.  An openly accessible copy is here.

  42. Science of Climate Change online class starting next week on Coursera

    curiousd @36:

    "But as is, ModtranChicago gives in its output no information, either for the assumed emissivity or about the restricted underlying wave number range of 100 wn to 1500 wn.

    That user will deduce an emissivity of 0.92, which I now know, but did not always know, is entirely too low."

    Anybody using the UChicago version of Modtran will see in the user interface a button labelled "Show raw model output".  If they press that button, a new tab will open showing that output.  Scrolling through it, they will see first the details about the atmospheric profile, absorber amounts etc, and will then come to a section labeled "RADIANCE(WATTS/CM2-STER-XXX)", which is then shown for wave numbers from 100 to 1500 cm^-1.  At the bottom, the output shows the "INTEGRATED ABSORPTION FROM 100 TO 1500 CM-1".  A little below that it informs us that "BOUNDARY EMISSIVITY = 0.980".  In short, all the information you claim is not available, for which according to you there is "no information" in its output, is available just by pressing a button of mysterious function with the mystical label "raw model output" /sarc.

    Beyond the false claim about what was not available in the Modtran output, there is nothing in your comment not dealt with by my comment to which you were responding.  

  43. Science of Climate Change online class starting next week on Coursera

    Hi Tom Curtis,

    ModtranChicago is designed primarily for aid in teaching basic environmental science, and is an excellent tool for doing so as is.

    It is, as far as I know, also the only freely available source of plots of outgoing OLR values as a function of altitude and GHG composition. Therefore, for someone such as myself, who wishes to learn about at least the one-dimensional version of atmospheric science on his own with no available guidance, the temptation will be to use the ModtranChicago output; there is no other choice.

    But as is, ModtranChicago gives in its output no information, either for the assumed emissivity or about the restricted underlying wave number range of 100 wn to 1500 wn. 

    That user will deduce an emissivity of 0.92, which I now know, but did not always know, is entirely too low. See the post 31 above.

    That user may then be applying the Schwarzschild Equation correctly and still will find very poor agreement with his "standard", which is the output of ModtranChicago, for the region near 1 km. The user will then go back to the drawing board repeatedly to find his mistake.

  44. 2017 SkS Weekly Climate Change & Global Warming Digest #15

    The NY Times is reporting that this year may bring another El Nino, which would be pretty devastating for the GBR.

  45. Science of Climate Change online class starting next week on Coursera

    curiousd @various, it was established in 1997 by Myhre and Stordal that using a single atmospheric profile in LBL and broadband radiative models will introduce inaccuracies into the calculation.  That result was confirmed by Freckleton et al (1998).  Myhre and Stordal state:

    "The averaging in time and space reduce the radiative forcing in the clear sky case by up to 2%. This is due to the fact that blackbody emissions are proportional to T4 and that averaging reduce or remove spatial or temporal variations."

    It follows that if you really want to test Modtran for bias, you would need to use (ideally) 2.5° x 2.5° cells based on weekly averages, and take the means.  Failing to do so will introduce a bias which, based on Myhre and Stordal, is approximately equal to 50% of the bias you claim to have detected.

    The University of Chicago version of Modtran does not permit that, restricting choices of atmospheric profiles to a Tropical zone, two Mid Latitude zones (summer and winter) and two Subarctic zones (summer and winter).  Using default values for a clear sky, and with no GHGs, I found the difference between the OLWR at 70 km for each case, for areally and temporally weighted zonal means, and for the US standard atmosphere (at default temperatures, and at a temperature adjusted to match OLWR to the average incoming radiation) to have a mean bias of 5.35 +/- 3.69%.  Tellingly, the least bias (2.33%) was found with the weighted means.  The US standard atmosphere with surface temperature set to 254.5 K showed a bias of 4.19%.  Given this, the case for any significant bias in the calculation of OLWR over the range of wave numbers covered by the model is unproven.  Given the wide range of biases in different scenarios, it is not clear a single correction factor would work in any case.

    Further, the idea that Modtran should be adjusted to determine a single OLWR value seems wrongheaded.  Modtran is intended to predict observed IR spectra given a knowledge of surface conditions, and trace gas and temperature profiles.  Here is an example of such a prediction (strictly a retrodiction).

    Clearly the University of Chicago version of Modtran is capable of reasonable but imperfect predictions of such observed spectra.  Given the limited ability to reproduce actual conditions (ie, site-specific surface emissivity, specific temperature profiles, density profiles, etc) we do not expect anything else, in what is simply a teaching tool.  Nor is any explicit bias obvious from the example above, with OLWR at specific wave numbers sometimes being overestimated and sometimes underestimated by the model.  If you truly wanted to test Modtran 6 for bias (or for error margin), you would need to compare by wavenumber across a large but representative range of such site-specific profiles.

    I have not been following the technical discussion above at any depth, but it seems to me that before you get to that discussion, you need to allow for the known constraints on any radiative transfer model, regardless of its accuracy line by line.  Further, you would be better directing the technical discussion to the actual use of radiative transfer models rather than their use (and potential misuse) in testing zero-dimensional first approximations of the greenhouse effect.

  46. Science of Climate Change online class starting next week on Coursera

    Regarding:

    "What you demonstrate here is that yor correction for emissivity ("Correction Number Three") is duplicating your correction resulting from the limited spectrum under analysis ("Correction Number One"). When you say "(a) You know nothing about the limited wavelengh range" you correct by adjusting the emissivity value ("Correction Number Three"). But we do know about the "the limited wavelengh range" and when we correct for it ("Correction Number One") the flux calculations match blackbody conditions for emissivity=0.98 as the UoC model says it uses."

    The 0.92 emissivity result is a problem whether or not one obtains better values for the OLR at 250 K by correction one, or an improved clear sky OLR from correction two. It results in a solution to the Schwarzschild Equation (SE), assuming no scattering (two-stream approximation), that has entirely the wrong OLR at the 1 km distance from the ground if the emissivity is really 0.98. To the point that instead of a 1 km OLR that is slightly less than the surface OLR, as is the case for 0.98 emissivity, at 0.92 the OLR at 1 km is slightly greater than the surface OLR.

    So consider the "serious user" of MILA who:

    1. Assumes that 0.92 is the emissivity instead of 0.98, by doing the simple calculation above, comparing the MILA emission from the ground to the Stefan-Boltzmann law.

    2. Correctly numerically applies the SE expression for the outgoing stream in the two-stream SE, perhaps applying the SE to only the difference between the 800 ppm and 400 ppm cases for CO2 to get the climate sensitivity. He/she is (unknowingly) using a drastically too low surface emissivity.

    3. Since the upstream solution is I^(0) t* plus the atmospheric contribution, if I^(0) is a factor of (0.92/0.98) too small relative to what one gets from MILA, then, since the atmospheric contribution term is independent of the surface, both the 400 ppm and 800 ppm results will be too small relative to what one obtains by just running MILA for 400 ppm and then 800 ppm and subtracting; the discrepancy is not much at, say, 20 km but really serious at 1 km.

    4. The user will therefore conclude that his/her application of SE is incorrect even though it is not. The user is just (unknowingly) using the wrong emissivity.

    Grant Petty states, on his page 215, that the I^(0) term is the "....only term whose value cannot be directly computed from knowledge of (symbol for wavelength-specific linear absorption coefficient as a function of z) and (symbol for transmittance as a function of z) alone (for given wavelength and viewing direction mu)." He then states:

    "The expression we supply for I^(0) depends on what we assume about the nature of the surface. But regardless of those assumptions, there are two contributions that must be considered: (1) Emission by the surface itself, and (2) upward reflection of atmospheric radiation incident on the surface."

    I am certain that MILA does not take reflection at the surface into account, which is fine for an illustration for students, but then the value of the emissivity of the surface strongly influences both the OLR and the value of intensity 400 minus intensity 800 as a function of altitude z.

    So, back to the serious user of MILA: he/she will think the emissivity is 0.92 instead of 0.98 and will therefore get something that disagrees strongly with MILA at 1 km, even though that serious user has been applying the SE correctly, but whilst (unknowingly) using the wrong emissivity.

  47. Science of Climate Change online class starting next week on Coursera

    Curiousd @29.
    The 255K / 239.7W/sqm figure is a standard blackbody calculation for emissivity=1, with the outward IR flux matching the absorbed solar radiation; thus the albedo from cloud & surface remains while GHG=0, because it is a blackbody. In terms of accurate measurement/assessment of the flux, today there is an energy imbalance due to AGW.
    The UoC model gives 226.363W/sqm for 255K (and altitude=0) because emissivity=0.98 and the model only considers part of the spectrum, the value 226.363W/sqm being not greatly different from the 226.52W/sqm value calculated for the part-spectrum using SpectralCalc.com. (The difference is insignificant and could result from many understandable approximations.) Note the implications of this @31.
    Putting #31 to one side, and assuming your “corrections to Modtran for 0.98 emissivity” are not what is meant but rather the +8W/sqm you argue for @7 (I miscalculated this as 9W/sqm @28): I set out @18 why I consider this +8W/sqm to be too high and why a +4W/sqm would be more appropriate. You present no reason for the +4W/sqm being incorrect.
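
    For reference, the blackbody arithmetic behind those figures can be checked in a couple of lines (a sketch only; the UoC model's part-spectrum value cannot be reproduced this simply):

    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    T = 255.0
    print(SIGMA * T**4)          # ~239.7 W/sqm, the standard blackbody figure
    print(0.98 * SIGMA * T**4)   # ~234.9 W/sqm for emissivity 0.98, full spectrum
    # The UoC model's ~226 W/sqm is lower again because it also restricts the
    # calculation to part of the IR spectrum.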

    The impact of altitude on the flux magnitude has been considered up-thread, but as my misinterpretation of your “Correction Number Two”. (Note that I did not indicate the sign +/- @14. In truth I had in mind a positive correction, but it would actually be a negative correction.) The UoC model set with GHG inputs at zero does demonstrate a change in flux with altitude, but it alters by less than the variation of area with altitude (which goes as the square of the effective radius). Thus with all GHGs reduced to zero in the UoC model, we should see a 2.2% reduction from surface to 70km. The actual reduction (it varies with surface temperature) is roughly 0.5%. This is probably explained by the continued presence of minor GHGs in the UoC model, which are not zeroed but remain unchanged by user input. If this is not the case, if there is no alternative explanation, and if it does require correction, this would comprise a reduction of some 4W/sqm for a 255K surface and an IR flux measured at 70km.
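
    The geometric part of that altitude effect is just an inverse-square area correction; a quick sketch (using an approximate Earth radius) reproduces the 2.2% figure quoted above:

    R_EARTH_KM = 6371.0  # approximate mean Earth radius

    def area_dilution(altitude_km, radius_km=R_EARTH_KM):
        # Ratio of flux at altitude to flux at the surface for a fixed total
        # outgoing power: (R / (R + z))**2.
        return (radius_km / (radius_km + altitude_km)) ** 2

    print(1.0 - area_dilution(70.0))   # ~0.022, i.e. the ~2.2% reduction quoted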

    curiousd @31.

    What you demonstrate here is that your correction for emissivity ("Correction Number Three") is duplicating your correction resulting from the limited spectrum under analysis ("Correction Number One"). When you say "(a) You know nothing about the limited wavelength range" you correct by adjusting the emissivity value ("Correction Number Three"). But we do know about the "limited wavelength range", and when we correct for it ("Correction Number One") the flux calculations match blackbody conditions for emissivity=0.98, as the UoC model says it uses.

    curiousd @32.

    I am pretty hazy about Schwarzschild but I would not expect there to be a significant gravitational effect (?! I am pretty hazy!) when calculating the radiation flux through a planet's atmosphere.

  48. New study shows worrisome signs for Greenland ice

    Haze @7

    I am surprised that you feel "discussions at Skeptical Science are becoming increasingly constrained, and thus decreasingly informative, due to the current relative paucity of comments on most topics" (unquote)

    My own impression (admittedly personal and anecdotal), over the past 4 years of frequent observation of SkS, is that there is a good flow of informative discussion in the comments columns.  If anything, I feel there is even more such discussion than 3 or 4 years ago.  On top of that, I must say (from reading many comments columns going back to the previous decade) that I have noticed fluctuations in activity in most of the lengthier columns, but that is only to be expected, as new developments and new data come along from time to time.   Perhaps, Haze, you are interested in a narrow range of climate topics, and that may have given your observations a certain bias.

    Possibly the most important factor in discussions is whether there is any real need for discussion in the comments column at all!  A good many of the articles posted are so straightforward that little discussion is required, save for the rebuttal of erroneous or peevish disputations that may appear in the comments column. 
    Moderator Response:

    [DB] Thank you for your fine response, but as the comment you were responding to was moderated, it became necessary to also moderate your comment's response to the off-topic material.

  49. michael sweet at 11:41 AM on 17 April 2017
    Tuvalu sea level isn't rising

    The link Rob P posted to the Tuvalu post explains a lot about local sea level rise.

    A comment on the OP here: I visited Funafuti in December 1988.  Only a very small portion of the atoll is 3 meters above sea level, as mentioned in the OP; most of the island is much lower.  According to this Wikipedia article, the average height of Funafuti is less than 2 meters (no date given).   This climate report shows knee-deep water in the center of town.  The island is so small that rainwater runs off immediately, so the ground water is essentially at sea level.   Even in 1988 there were large areas in the village that regularly flooded at high tide.  A rise of even a few centimeters would have been immediately noticeable.  Since they are dependent on ground water for many uses, flooding would ruin their water supply.

    When you combine that with the fact that the El Nino cycle can raise sea level 20-30 cm (as described in the OP), and consider similar storm surge increases in sea level, it is no wonder that the people of Tuvalu are concerned.

  50. michael sweet at 10:18 AM on 17 April 2017
    2017 SkS Weekly Climate Change & Global Warming Digest #15

    Joe,

    The sea level fall in Indonesia is unrelated to Global Warming.  It is caused by the El Nino cycle.  When an El Nino appears, the trade winds lessen.  The water piles up in the Eastern Pacific near Peru, and sea levels in Indonesia fall.  During La Nina the trade winds blow harder, and sea levels in Indonesia rise (and fall in Peru).  This animation clearly shows that sea level in 2016 fell in Indonesia while it rose in Peru.

    According to AGW theory, the sea level for the entire globe will rise.  Local effects are not relevant.  In fact, global sea level did rise in 2015-2016 as this graph shows:

    graph of sea level rise

    Deniers often confuse local sea level effects with global effects.  Local effects do not contradict predicted global effects.  I note that the article you cite claims that sea levels were at a 12-year low.  They were not lower than that because ongoing sea level rise has raised the baseline.
