Recent Comments
Comments 20251 to 20300:
-
bjchip at 04:50 AM on 21 April 2017 | SkS Analogy 1 - Speed Kills
The BBC also reported on the rate back in 2006
"The "scary thing", he added, was the rate of change now occurring in CO2 concentrations. In the core, the fastest increase seen was of the order of 30 parts per million (ppm) by volume over a period of roughly 1,000 years."... "The last 30 ppm of increase has occurred in just 17 years. We really are in the situation where we don't have an analogue in our records,"
http://news.bbc.co.uk/2/hi/science/nature/5314592.stm
-
DrBill at 03:42 AM on 21 April 2017 | Increasing CO2 has little to no effect
The free energy change of a process combines an energy term and an entropy×temperature term, and the process is generally regarded as happening spontaneously when that change is <0. One version of free energy is that of Helmholtz, ΔF (or ΔA) = ΔE - TΔS; another is the Gibbs ΔG = ΔH - TΔS. The difference between the two expressions is that ΔE and ΔH are not the same thing: ΔE, as I've been used to thinking of it, is now named ΔU, while ΔH = ΔU + Δ(pV). In short, Gibbs energy includes pV work, while Helmholtz does not, when evaluating the possibility of a spontaneous process. A view of this, which I hold, is that the ΔE portion emphasizes the radiative changes, while the ΔH portion includes pV work.
Since the atmosphere not only radiates energy in trying to come to equilibrium, but also does substantial pV work, I believe a model that seems to rely on radiative transfers is not sufficient to explain the climate.
In support of this, I refer the reader to the 1962 and 1976 editions of The U.S. Standard Atmosphere. Google links lead to PDFs, and I don't know if anyone wants to wade through dozens of derivations and perhaps 60 pages of notes and explanations, so I won't burden this post with the links themselves unless someone wants them. In summary, NASA, NOAA and a host of other sponsors and contributors determine a model of the atmosphere based on gravity and Cp ((∂H/∂T)p; see @301), and regard CO2, methane and NOx as "trace gases" with no significant impact on temperature. Similarly, the lapse rate -g/Cp suggests the derivation from Gibbs.
The dichotomy is strong enough, imo, to get out the old slide rule and attempt to recognize something fundamental like Cp in the forcing equation. So far I have not.
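For concreteness, a minimal sketch of the -g/Cp lapse rate mentioned above, using common textbook values for dry air rather than figures taken from the Standard Atmosphere documents:

```python
# Dry adiabatic lapse rate: Gamma = -dT/dz = g / Cp.
# The values below are common textbook figures for dry air, not numbers
# quoted from the Standard Atmosphere documents referenced above.
g = 9.81      # gravitational acceleration, m/s^2
cp = 1004.0   # specific heat of dry air at constant pressure, J/(kg K)

lapse_rate = g / cp  # K per metre
print(f"dry adiabatic lapse rate ~ {lapse_rate * 1000:.1f} K/km")  # ~9.8 K/km
```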
-
kmoyd at 03:15 AM on 21 April 2017 | SkS Analogy 1 - Speed Kills
This is a very useful rebuttal to "The climate is always changing." argument.
For those who don't care about nature, it would be helpful if there were a mention that agriculture is also affected.
Is there an explanation for the three very high points?
-
Evan at 23:49 PM on 20 April 2017 | SkS Analogy 1 - Speed Kills
The post-1950s data is a simple linear fit in Excel. Nothing fancy. Although one can easily make the case for CO2 rising faster than a linear trend (as pointed out in the responses), in this analogy the point was to simply introduce the idea of just how much faster we are moving than a "base rate" defined by deglaciation cycles.
-
DrBill at 12:59 PM on 20 April 2017 | Increasing CO2 has little to no effect
Just a quick thanks to Tom Curtis, JH and RH. FWIW, I tried several ways to make the page show up, and not until this evening did I try quitting and logging back in, which seemed to work. I agree with chriskoz about courtesy and thank Tom for recognizing a typo. If JH/RH don't mind, I wouldn't mind seeing my comments 302, 303 and 304 deleted; 301 had all I had in mind to ask, as it had reference to the partial derivatives that result in Cp. It's late here, for me, and I'll post a new submission tomorrow.
-
Tom Curtis at 11:04 AM on 20 April 2017 | Increasing CO2 has little to no effect
DrBill @301, the formula for radiative forcing was not directly derived from fundamental physics. Rather, the change in Outgoing Long Wave Radiation at the tropopause, as corrected for radiation from the stratosphere after a stratospheric adjustment (which is technically what the formula determines), was calculated across a wide range of representative conditions for the Earth using a variety of radiation models, for different CO2 concentrations. Ideally, the conditions include calculations for each cell in a 2.5° x 2.5° grid (or equivalent) on an hourly basis, with a representative distribution and type of cloud cover, although a very close approximation can be made using a restricted number of latitude bands and seasonal conditions. A curve is then fitted to the results, which provides the formula. The same thing can be done, with less accuracy, with Global Circulation Models (ie, climate models).
The basic result was first stated in the IPCC FAR 1990. That the CO2 temperature response (and hence forcing) has followed basically a logarithmic function was determined in 1896 by Arrhenius from empirical data. The current version of the formula (which uses a different constant) was determined by Myhre et al (1998). They showed this graph:
The formula breaks down at very low and very high CO2 concentrations.
-
Digby Scorgie at 10:59 AM on 20 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
Tom Curtis @9
When the "level of economic harm and natural disasters" becomes sufficiently catastrophic, I can't see anything less than a collapsing global civilization, with huge numbers of people dying of starvation and strife (Syria many times over) and economic activity plummeting as a consequence. Local civilizations will still survive but far less fossil fuel will perforce be burnt. Anyway, that's my gut-feeling — but then I'm a cynical old man.
-
amhartley at 10:48 AM on 20 April 2017 | Antarctica is gaining ice
Should this discussion be updated to account for the recent record lows in Antarctica's sea ice extent?
jlsoaz: Did you look at the National Snow & Ice Data Center's website, nsidc.org?
-
amhartley at 10:38 AM on 20 April 2017 | SkS Analogy 1 - Speed Kills
The analogy does make some good points, & clearly. Does anyone besides me worry, though, about using the period 1851-1950 as the comparator? For that period to be meaningful, it would have needed to last as long as we expect our current warming rate to last (probably many centuries, under a business-as-usual scenario), & we would need to have observed the warming's impacts on food supplies, sea level rise, weather & so on.
Since 1851-1950 was not that long, maybe a more meaningful comparator would be the most recent previous period of warming lasting >1,000 years (ignoring any periods with other factors present that would grossly exacerbate extinctions & so on, such as asteroid hits).
-
Tom Curtis at 10:19 AM on 20 April 2017 | Increasing CO2 has little to no effect
chriskoz @305, while I agree that correct spelling of names is a matter of courtesy, DrBill spelled my name correctly in three posts prior to that @304, so it is reasonable to assume that the misspelling was entirely inadvertent. Further, "most accomplished" is a compliment that suggests significant achievement in a scientific field, whereas I lack even a BSc. I take it as a compliment to my depth of understanding of the topic, for which thank you. However, while I think it is deserved, that depth is limited, especially relative to anybody with a PhD in atmospheric physics.
-
curiousd at 09:00 AM on 20 April 2017 | Science of Climate Change online class starting next week on Coursera
Hello All,
No, the moderator is wrong. I have learned all I can learn, you folks have helped me, and I am on my way.
Cheers.
Curiousd
Moderator Response:[JH] May the Force be with you. :)
-
Cedders at 08:44 AM on 20 April 2017 | Corals are resilient to bleaching
A note that even after another hot summer of coral bleaching, this time without an El Niño, Jim Steele doesn't seem to have learned much. He's written an article on the web entitled "Falling Sea Level: The Critical Factor in 2016 Great Barrier Reef Bleaching!" attacking Hughes et al (2017). Global warming and recurrent mass bleaching of corals. Steele's article has been referred to by James Delingpole and others, presumably because it tries to blame anything other than sea temperatures for coral mortality despite their evident close relationship.
The article he references about coral mortality in Sulawesi would not explain bleaching, and its authors support the overall picture of thermally induced bleaching and mortality. One of his suggestions is that Hughes's demonstration of mass mortality on the northern GBR was based entirely on aerial surveys. That is false, as can be seen in reports in SciAm or New Scientist, for example. Although this has been pointed out to him, Steele has, as of 19 April, yet to correct these errors.
-
chriskoz at 08:20 AM on 20 April 2017 | Increasing CO2 has little to no effect
DrBill@304,
Please show a bit more respect to Mr Curtis by taking care to spell his name correctly. The fact that Mr Curtis is one of the most accomplished commenters on this site calls for even more care.
Then people will show more respect to you and your comments, reciprocally.
-
chriskoz at 08:05 AM on 20 April 2017 | SkS Analogy 1 - Speed Kills
Typo in my post@2.
"dppm/dy" should read "dppm/dt". Sorry.
-
chriskoz at 08:02 AM on 20 April 2017 | SkS Analogy 1 - Speed Kills
If I read the plot correctly, the linear trend since 1950 does not look to fit the data: the data rises faster than the line in the figure. And the data is more accurate than the pre-1950 part (I'm assuming the data comes from Mauna Loa and rapid-accumulation ice cores respectively).
So, since the y-axis is already a first derivative (dppm/dy), while the x-axis is linear, the actual CO2 trend looks faster than quadratic since 1950. This is the first time I've realised it. I don't need to explain to people on this site what a faster-than-quadratic trend means in the context of the "Elevator Statement" above: more than freefall, more than gt²/2. Those politicians who still have such a basic understanding of maths & kinetics (sadly none associated with the GOP in the US) should be given this post to read.
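To illustrate the point with made-up numbers rather than the actual Mauna Loa record: if the growth rate on the y-axis itself accelerates, a straight-line fit will sit below the most recent points, and the concentration (the integral of the growth rate) rises faster than quadratically. A minimal sketch:

```python
import numpy as np

# Illustrative only: a made-up CO2 growth-rate series (ppm/yr), not the Mauna Loa record.
years = np.arange(1960, 2020)
t = years - 1960
growth = 0.8 + 0.025 * t + 0.0004 * t ** 2   # a growth rate that itself accelerates

# A straight line fitted to an accelerating growth rate sits below the latest points...
lin = np.polyfit(years, growth, 1)
print("mean residual, last 10 yr:", (growth[-10:] - np.polyval(lin, years[-10:])).mean())

# ...and integrating the growth rate shows the concentration rising faster than quadratically.
co2 = 317.0 + np.cumsum(growth)   # ppm, starting from an arbitrary 1960-ish value
print("implied CO2 in final year:", round(float(co2[-1]), 1), "ppm")
```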
-
JWRebel at 05:00 AM on 20 April 2017 | Yes, we can do 'sound' climate science even though it's projecting the future
@ubrew12: Good points. The third point is also a giant misapprehension even if you grant the premise. I happen to agree with what Spencer says about the Earth, but this does not give us a free pass — his conclusion is a total non sequitur. You cannot cross the street without looking and say nothing happens without God's will, so there's no danger. Not looking is part of what is happening; nothing anybody believes about the Earth or Mother Nature can suspend cause and effect. Balance is always restored. What we do and don't do matters a great deal. The Israelites are told to plant trees, to allow the land rest (fallow), etc., if they want to keep the land as their inheritance. Nowhere does it state that it doesn't matter what you do. Quite the opposite. There is no religion that states that you endanger your life and soul if you kill a fellow human being, but that pouring infinite amounts of CO2 into the atmosphere has no consequences, the indulgences having already been provided in advance at no cost.
-
MA Rodger at 04:41 AM on 20 April 2017 | Science of Climate Change online class starting next week on Coursera
curiousd @39.
So, concerning the graph you introduced @25, it is incorrect to say that "This image has to do with the last correction, "Correction Three" to MILA." In truth it has no bearing on the development of your "Correction Number Three", the graph being the result of your use of SpectralCalc. As we continue to have great difficulty getting to grips with your use of MILA, adding the output of a different model into the discussion is a step too far.
Concerning your "Correction Number Three", I repeat my comment @33; it is only duplicating your correction resulting from the limited spectrum under analysis within MILA (your "Correction Number One"). And I still see no reason for your "Correction Number Two", because there is no reason to believe that the MILA calculator is not already adjusting using a "diffusivity approximation".
And may I be so bold as to suggest an alternative to this tortuous discourse we engage in here: you could approach UoC concerning their MILA, asking for their understanding of the different approximations their model encompasses and their relative impact on the output.
Moderator Response:[JH] As long as you and/or Tom Curtis respond to his posts, curiousd will likely keep pursuing this matter on this website.
-
Quidam at 03:15 AM on 20 April 2017 | SkS Analogy 1 - Speed Kills
" Slam on the breaks" should be " Slam on the brakes"
Moderator Response:[JH] Glitch corrected. Thank you for pointing it out.
-
DrBill at 02:42 AM on 20 April 2017 | Increasing CO2 has little to no effect
Is this post system working? My last three tries have not produced a post after Tom Curtis @ 300.
Moderator Response:[JH] Yes, the system is working. All of your comments have appeared.
[RH] What's probably happening is, he's posting at the end of one page and the comments is showing up on the new following page. That one trips people up from time to time.
-
DrBill at 02:39 AM on 20 April 2017 | Increasing CO2 has little to no effect
Tom Curtis @ 300 I've been trying to post a physics question, but the page won't update with my submission, so I'll just use English. Does the forcing equation arise from fundamental physics to your knowledge, or is it something from curve-fitting efforts? Is it possible to put partial differential equations in this text-box?
Moderator Response:[JH] All of your prior posts were visible.
-
John Hartz at 02:17 AM on 20 April 2017 | Science of Climate Change online class starting next week on Coursera
curiousd: In the context of your commentary, the following caught my eye...
39-year-old drawing hints at what the Event Horizon Telescope may have just captured: the true shape of a black hole
What Does a Black Hole Really Look Like? by Amanda Montañez on April 17, 2017
-
DrBill at 02:07 AM on 20 April 2017 | Increasing CO2 has little to no effect
Tom Curtis @300. Thank you. Can you tell me if the formula for radiative forcing is a curve-fitting equation (ie, without the sensitivity factor), or is it derived from fundamental physics? I can't seem to locate something that ought to be related, say, to the partial derivative of enthalpy wrt T at constant p. Have you seen something of the sort? My apologies if this is a duplicate. The page did not update with my submission until I took the symbols out.
Moderator Response:[JH] They all appeared. Two of the duplicate posts have been deleted.
-
DrBill at 02:04 AM on 20 April 2017 | Increasing CO2 has little to no effect
Tom Curtis @300. Thank you. Can you tell me if the basic equation is a curve-fitting equation (ie, without the sensitivity factor), or is it derived from fundamental physics? I can't seem to locate something that ought to be related, say, to the partial derivative of enthalpy wrt T at constant p, (∂H/∂T)p. Have you seen something of the sort? My apologies if this is a duplicate. The page did not update with my submission.
Moderator Response:[JH] Your prior two duplicate posts have been deleted.
-
ubrew12 at 01:46 AM on 20 April 2017 | Yes, we can do 'sound' climate science even though it's projecting the future
Lamar Smith: "Anyone stating what the climate will be... at the end of the century is not credible" Then Chairman Smith is not credible, since the argument for doing nothing about fossil emissions is based on a prediction that climate in 2100 will be unaffected by it. This 'pushback' argument is not made often enough: the 'do nothing' alternative is still a course of action based on a prediction (that despite doing nothing, everything will be OK). On what is that prediction based? History? Intuition? Madam Costanza's crystal ball? No rational course of action, or inaction, is made without an estimate of its future impact. Since all courses of action, or inaction, require such future predictions, why are only the predictions of the climate scientists being questioned?
Deniers, when questioned on this, will often appeal to history: 'climate has always changed naturally over the course of Earth's history'. This is a non sequitur: if Smith shot his neighbor's dog, his defense can't rest on the observation that most dogs throughout history died naturally. Besides: name something that hasn't changed naturally over the course of Earth's history.
Rarely, deniers will reveal something closer to the heart of their objection, as when Dr. Roy Spencer said "Earth and its ecosystems — created by God’s intelligent design and infinite power and sustained by His faithful providence — are robust, resilient, self-regulating, and self-correcting, admirably suited for human flourishing, and displaying His glory." So, that's a prediction of future climate based on 'Everything is going to be OK, because God told me so'. Personally, I prefer Madam Costanza's crystal ball.
-
BBHY at 20:36 PM on 19 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
Even if humans don't burn all the fossil fuels, the rapidly thawing permafrost could boost CO2 to that higher level, or to the GHG equivalent since that would add a lot of methane to the mix.
-
curiousd at 19:03 PM on 19 April 2017 | Science of Climate Change online class starting next week on Coursera
Another thing....I should call my corrections one and two "suggested supplements" to MILA. The potential for the user to assume his/her own emissivity of 0.92 would be avoided simply if, someplace on the screen that the user sees when the website is opened, the statement were made that the emissivity used is 0.98.
I first brought this up at Science of Doom, and Steve Carson was the one who suggested working up a revised MILA estimate for the clear sky OLR. The other S of D responder posted that although he had used MILA he had never thought of investigating the button for the underlying program.
I will acknowledge Steve Carson when I contact David Archer. Again, S of D explains the diffusivity approximation at length, and recommends it in their section that derives the symbolic solution to the upward stream of the two-stream solution to Schwarzschild's equation.
Finally, in post 38 above, the statement is made "I have tracked down this Schwarzchild Equation and, rather than the ones that explains the wobbles of the planet Mercury, it is a rather mundane equation that I didn't appreciate had a name and which is explained here."
1. The S.E. that is used in the present context was first used in elucidating the physics of the Sun; it is not:
2. The first solution to the general theory of relativity, that is, the "Schwarzschild radius" of a black hole. And of course general relativity is what described the precession of the perihelion of Mercury, and that discovery was independent of Schwarzschild.
3. I guess this thread is the first time you have heard of the Schwarzschild Equation, MA Rodger? The equation may look simple, but I found it not so simple to apply in practice. There is a subtlety in the application of the boundary conditions, IMO.
Moderator Response:[JH] A reminder, MILA = "Modtran Infrared Light in the Atmosphere"
-
chriskoz at 17:25 PM on 19 April 2017 | Yes, we can do 'sound' climate science even though it's projecting the future
Thank you Kevin for that oped. Gavin's TED talk is an excellent complementary explanation, in case you haven't watched it yet. Be wise and source reality from the relevant experts rather than from fake news makers.
-
curiousd at 16:55 PM on 19 April 2017 | Science of Climate Change online class starting next week on Coursera
In the above, I stated mistakenly: "Note that my 1.7 factor is 1.666 or 2.3 rounded off." That was a mistake; my 1.7 factor is 1.666 rounded off. One over 1.666 equals 0.600, which is the cosine of 53.1 degrees. One over 1.7 equals 0.5882, and the arccosine of 0.5882 is 53.97 degrees, which rounds off to 54 degrees, the angle I chose.
By going through this exercise I have come to realize that Pierrehumbert's statement that cos theta = 2/3 would be an equally valid choice as cos theta = 1/2 implies an upper limit for the angle of 60 degrees and a lower limit of 48.1 degrees.
Grant Petty's text suggests cos theta equals 3/5, which corresponds to an angle of 53.1 degrees.
Houghton also chooses 3/5 as the ratio, which corresponds to the same 53.1 degrees recommended by Petty.
My angle of 54 degrees lies within the range of 60 degrees upper limit to 48.1 degrees lower limit corresponding to page 191 of Pierrehumbert, and is only ~1 degree greater than the 53.1 degrees used by both Petty and Houghton.
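These angles are easy to verify; a minimal check using only the numbers already quoted:

```python
import math

# Angles implied by the choices of cos(theta) quoted above.
for label, cos_theta in [
    ("Pierrehumbert, cos theta = 1/2", 1 / 2),
    ("Pierrehumbert, cos theta = 2/3", 2 / 3),
    ("Petty / Houghton, cos theta = 3/5", 3 / 5),
    ("the 1.7 factor, cos theta = 1/1.7", 1 / 1.7),
]:
    print(f"{label}: {math.degrees(math.acos(cos_theta)):.1f} degrees")
# Expected: 60.0, 48.2, 53.1 and 54.0 degrees respectively.
```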
-
Tom Curtis at 13:55 PM on 19 April 2017 | Increasing CO2 has little to no effect
DrBill @299, the formula for the radiative forcing of CO2 is 5.35 x ln(CO2current/CO2initial). That is equivalent to 12.32 x log(CO2current/CO2initial). NOAA gives this, and formulas for the radiative forcing of other greenhouse gases here.
To determine the equilibrium response to a given radiative forcing, you need to multiply the forcing by the climate sensitivity factor. That is approximately equal to 0.8 +/- 0.4 °C/(W/m^2), which is what Wikipedia says.
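For concreteness, a minimal sketch applying the formula and the sensitivity factor above to an illustrative doubling of CO2 (280 to 560 ppm):

```python
import math

def co2_forcing(c_now, c_initial):
    """Simplified CO2 radiative forcing (W/m^2), using the expression quoted above."""
    return 5.35 * math.log(c_now / c_initial)

sensitivity = 0.8                    # deg C per (W/m^2), the central value quoted above
forcing = co2_forcing(560.0, 280.0)  # an illustrative doubling of CO2
print(f"forcing ~ {forcing:.2f} W/m^2")                        # ~3.71 W/m^2
print(f"equilibrium warming ~ {sensitivity * forcing:.1f} C")  # ~3.0 C
```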
-
chriskoz at 12:08 PM on 19 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
Tom@7,
Thanks for the pointer to the source publication of your claim. It was an interesting read. And it confirms that the amount of FF reserves can be higher than I thought beforehand, and that the Wink12K scenario of releasing a 12 EgC anthropogenic CO2 slug is at least theoretically possible, though constraints other than the resource limit can make it unrealistic. I would add that with a CO2 slug that big, the natural feedback of thawing permafrost, and unknown mechanisms that can turn the ocean into a CO2 source, can contribute to even more CO2 release.
BC@8,
A conversion of .75 when going from forcing to temp reflects the fact that the Equilibrium Climate Sensitivity used by Hansen is 3 K per doubling of CO2, while doubled CO2 creates 4 W/m^2 of forcing. Hence 3/4 = .75.
But I think not just ECS should be used to determine temp evolution on the timescale in the OP graph. More appropriate is Earth System Sensitivity, which takes into account millennial-scale feedbacks such as melting ice sheets and the permafrost thaw I mentioned above. ESS is larger than ECS. So, having not read the OP study, I don't understand why the temps in fig 4 are smaller than ECS. From your eyeballing, it looks more like Transient Climate Sensitivity figures - 2 K per doubling of CO2, which is 50% of the forcing number in W/m^2 - e.g. 3.0 degC for RCP6. Maybe figure h shows just the amount of CO2 released in each scenario and disregards the ocean CO2 uptake, optimistically assuming that the strength of the ocean CO2 sink will not be affected till the end of the period shown and that the ESS feedbacks will not happen. With such assumptions you can say that ocean CO2 uptake will largely counter-balance the warming progressing from TCS to ESS within a decadal to century timescale, and postulate only a TCS level of warming on that timescale.
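A minimal check of that conversion, applied to the nominal year-2100 forcings of the two RCPs discussed in BC's comment @8:

```python
# The 0.75 conversion discussed above: ECS of 3 K per doubling of CO2
# divided by ~4 W/m^2 of forcing per doubling.
ecs_per_doubling = 3.0       # K
forcing_per_doubling = 4.0   # W/m^2
k_per_wm2 = ecs_per_doubling / forcing_per_doubling   # 0.75 K per (W/m^2)

for name, forcing in (("RCP4.5", 4.5), ("RCP6", 6.0)):  # nominal year-2100 forcings
    print(f"{name}: ~{k_per_wm2 * forcing:.1f} K")
# ~3.4 K and ~4.5 K, the figures quoted in BC's comment @8.
```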
-
DrBill at 11:45 AM on 19 April 2017 | Increasing CO2 has little to no effect
I read a lot of this thread, but not all of it, and found that the discussion had moved on from the basic question of its title to a temperature rise due to a concentration change. This is similar to something from Wikipedia that says the forcing resulting from CO2 is according to 5.65K*log CO2(1)/CO2(0) = change in temperature. Is this the understanding here, or has wiki got it wrong? Just trying to understand, and if this is the wrong thread, I apologize in advance.
-
curiousd at 09:18 AM on 19 April 2017 | Science of Climate Change online class starting next week on Coursera
Note to Tom Curtis, comment 37. I had always assumed, say for two years or so, that I could never understand anything that button revealed, as it would have something to do with the program source code. It was only when I integrated one of the outputs using a digitized result whose underlying width by accident extended from 5 wn to 2000 wn, got a significantly larger OLR than was given in the MILA output, and then found I was within 2% of the MILA output if the underlying interval was limited to 100 wn to 1500 wn, that I realized something was wrong.
By user output I meant what the user sees in the window that appears when you click on the URL for MILA. Then I looked at that button and found that the actual integration was limited to 100 to 1500 wn and that the emissivity was 0.98. The way I found this out, and that indeed the information was revealed by that button, was described earlier in the thread, Tom Curtis, which you might have looked at more carefully before accusing me of a false claim.
Regarding comment 38. Here is how I get the upward intensity as a function of emissivity and altitude.
For the zero-altitude in-band intensity I go to the SpectralCalc black body calculator app and calculate the in-band radiance for 288.2 degrees and the emissivity I put in. I use a wavenumber range of 500 to 850 wn, which completely includes the bending mode band of CO2.
For CO2 the bending mode range is contained within the 500 to 850 wn range. The band from 100 wn to 500 wn is completely transparent if CO2 is the only greenhouse gas, which I also put into SpectralCalc. The same is true for the window between 850 wn and 1500 wn. Therefore, since CO2 is completely transparent for 100 wn to 500 wn and 850 wn to 1500 wn, I also obtain those intensities using the SpectralCalc black body calculator as described above in this post. The upward intensity for a CO2-only atmosphere will be the same at all altitudes for those two outer windows as observed at ground level, and therefore I just add them at higher altitudes to the output I get for the 500 to 850 band.
For the 500 to 850 band I use the "Atmospheric Paths" app of SpectralCalc. This gives either transmittance or radiance outputs. Here I use the radiance output. The way this works is that there is a virtual source at ground level, for which you can put in a temperature of 288.2 degrees and an emissivity of choice. Again, I choose an atmosphere with only CO2, major isotopologue. A complication is that the U.S. Standard Atmosphere is used in the scale factors. Therefore a scale factor of one does not correspond to 400 ppm of CO2, since back in the 70s the CO2 concentration was less than this. For 400 ppm you must therefore use a scale factor of 1.212. The U.S. Standard Atmosphere is the default for the SpectralCalc atmospheric path sections.
Here is my only non-standard step: instead of 400 ppm I use a concentration of 1.7 times 400 ppm, or 680 ppm. This corresponds to a scale factor of 1.212 times 1.7, or 2.06. Why do I do this? I am in effect using the "diffusivity approximation" as described at great length both in Pierrehumbert's text and in Grant Petty's text. The idea is that if one does not wish to integrate the output radiance to obtain diffusive flux, one can approximate the result of integrating that radiance by multiplying the in-band radiance by pi and simultaneously replacing all paths involved by a straight-line path going at angle theta relative to vertical. Here I quote from the pages within Pierrehumbert that are not available in the URL http://cips.berkeley.edu/events/rocky-planets-class09/ClimateVol1.pdf
Quoting from Pierrehumbert p. 191: "...if the radiation field remains approximately isotropic, the decay rate is the same as for unidirectional radiation propagating at an angle theta such that cos theta = 1/2, i.e. 60 degrees..." Then "...the choice of cos theta = 1/2 is by no means a unique consequence of the assumption of isotropy....(under certain conditions) cos theta = 2/3 and this would be an equally valid choice within the limits of the isotropic approximation..."
Petty states on page 214: "The most commonly used value of (symbol for one over cos theta) is 5/3".
Note that my 1.7 factor is 1.666 or 2.3 rounded off and my angle is 54 degrees, not 60 degrees.
Furthermore, in The Physics of Atmospheres by John Houghton, third edition, he states on pp. 11-12 that for Schwarzschild's Equation, to a good approximation the intensity may be replaced by the diffuse upward flux if B(T) [B(T) is the symbol for the black body emission per unit solid angle per unit area of a surface at temperature T] is replaced by pi B and the incremental increase in altitude dz is replaced by 5/3 dz.
At one time I went through the procedure of replacing vertical paths by paths at 54 degrees using a spreadsheet. But I realized that exactly the same result is obtained by keeping the path vertical and replacing the concentration q by 1.7 q. Mathematically this must be the case, since all the expressions for transmittance in Pierrehumbert involve the set of symbols F(q, theta) = q/cos theta. At the bottom of page 229 of Pierrehumbert, his equation for the transmittance between pressures p1 and p2 is one minus the equivalent width, where the equivalent width of a single Lorentzian line is one over delta (delta is the range of wave number considered)
times 2 times the square root of the HITRAN line strength for a Lorentzian at the base of the atmosphere times a different "strong line strength LS". The strong line strength contains the set of symbols F = q/cos theta. This can be expanded to give the Curtis-Godson approximation. I have tested whether the transmittance values I get from SpectralCalc and transform to angled paths at 60 degrees using a spreadsheet are identical to the transmittance of a vertical path with q' = 2q. The agreement is exact!
Say I chose theta to be 60 degrees. Since for a vertical line cos theta is one, and 2q/one equals q/cos 60 degrees, what I do is exactly equivalent to the "diffusivity approximation".
Bottom line: what MILA does is actually integrate over all angles. I use this approximation instead. The technical consultant at SpectralCalc tells me that what I am doing is an approximation to doing this a better way, where he would have used a "fourth order quadrature", whatever that is. He also told me that the same thing is true for the radiant atmospheric path app as for the transmittance paths in the Schwarzschild Equation, i.e. by multiplying the path length by 1.7, or by multiplying the concentration by 1.7, I can approximate the best way of going from intensity to diffuse flux, which would use a "fourth order quadrature".
It works!! See post 8 above. The Science of Doom describes the method in their section on the "Greenhouse Effect" "The Equations"
You should also know that if you use atmospheric paths that are too long you get an error message in SpectralCalc to the effect that you have exceeded the one million point limit. Also, there are angled paths already available in SpectralCalc, but they are real paths which are strongly refracted, and you need idealized paths that go in straight lines to use this approximation.
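A minimal sketch of the equivalence described above, in the simplest monochromatic (Beer-Lambert) case rather than the band-averaged Lorentzian treatment, and with an illustrative optical depth rather than a SpectralCalc output:

```python
import math

# Beer-Lambert transmittance of a purely absorbing path: T = exp(-tau / cos(theta)).
# tau below is an illustrative number only, not a SpectralCalc value.
tau_vertical = 0.6
theta = math.radians(54.0)          # 1/cos(54 deg) ~ 1.70

t_slant = math.exp(-tau_vertical / math.cos(theta))  # slant path at 54 degrees
t_scaled = math.exp(-tau_vertical * 1.7)             # vertical path, absorber amount x 1.7
print(t_slant, t_scaled)   # nearly identical, which is the equivalence used above
```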
-
bjchip at 04:47 AM on 19 April 2017 | Yes, we can do 'sound' climate science even though it's projecting the future
It truly gets a bit absurd.
Think about the real reason for doing science. The evolutionary advantage reason for doing it. It is ENTIRELY about projecting the future.
All the explaining, the theory, the curiosity, the analysis, the data gathering, is about understanding what the real world does well enough to predict what the real world will do with enough time and certainty to change things.
Understanding "why" (which is what science is) allows preparation and changes to "what" ultimately happens. It is the root of every benefit of human civilization.
Moderator Response:[JH] The use of "all-caps" is prohibited by the SkS Comments Policy. The use of bold face font for emphasis is acceptable.
-
Daniel Bailey at 23:08 PM on 18 April 2017 | Arctic icemelt is a natural cycle
As Tom notes, both the Arctic sea ice and Antarctic sea ice are more than 2 standard deviations below the long-term average. So that point bears repeating.
Arctic Sea Ice (per NSIDC):
Antarctic Sea Ice (per NSIDC):
-
Daniel Bailey at 23:01 PM on 18 April 2017 | Arctic icemelt is a natural cycle
NASA's position on land-based ice sheet mass losses:
"Data from NASA's GRACE satellites show that the land ice sheets in both Antarctica and Greenland are losing mass. The continent of Antarctica has been losing about 118 billion metric tons of ice per year since 2002, while the Greenland ice sheet has been losing an estimated 281 billion metric tons per year. (Source: GRACE satellite data through 2016)”
Greenland Land-based Ice Sheet Mass Losses, per GRACE:
Antarctica Land-based Ice Sheet Mass Losses, per GRACE:
-
Daniel Bailey at 22:50 PM on 18 April 2017 | Arctic icemelt is a natural cycle
To sum, the earth is losing a trillion tons of ice per year:
- 159 Gt Antarctic land ice, McMillan el al, GRL (2014)
+ 26 Gt Antarctic sea ice, Holland et al, J Climate (2014)
- 261 Gt Arctic sea ice, PIOMAS
- 378 Gt Greenland, Enderlin et al, GRL (2014)
- 259 Gt other land based glaciers, Gardner et al. Science (2013)
Total = - 1,031 Gt
Losses outnumber gains by a ratio of 40:1
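As a quick check of the arithmetic, the total and the ratio follow directly from the figures listed (a minimal sketch):

```python
# Annual ice mass changes (Gt/yr) as listed above; negative = loss, positive = gain.
changes = {
    "Antarctic land ice":  -159,
    "Antarctic sea ice":    +26,
    "Arctic sea ice":      -261,
    "Greenland":           -378,
    "Other land glaciers": -259,
}

total = sum(changes.values())
losses = -sum(v for v in changes.values() if v < 0)
gains = sum(v for v in changes.values() if v > 0)
print("net change:", total, "Gt/yr")                  # -1031 Gt/yr
print(f"losses : gains ~ {losses / gains:.1f} : 1")   # ~40.7 : 1, i.e. roughly 40:1
```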
-
Humans on the verge of causing Earth’s fastest climate change in 50m years
Thanks Tom. (BTW - I live in Brisbane too)
-
Tom Curtis at 19:07 PM on 18 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
BC @8, the gap appears compressed because of the use of a log scale on the y-axis. Further, the scenarios are defined by their forcing as at 2100. RCP 8.5 continues to expand atmospheric concentration long after that, so that its final forcing is significantly greater than 8.5 W/m^2. RCP 6.0, in contrast, maintains a near constant forcing after 2100. Finally, Twink12 is defined by the number of teratonnes of carbon emitted rather than by forcing (as I understand it).
-
Tom Curtis at 19:01 PM on 18 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
Digby Scorgie @6, nobody knows, and the time will depend on the rate of fossil fuel burning. Further, to a certain extent, increased energy resources can be used to counter much of the economic effect of AGW, particularly in highly industrialized areas. A sufficiently irrational person could greatly extend the time before it became impossible to maintain the technological civilization needed to burn fossil fuels, by burning fossil fuels faster and faster. (This strategy requires callous disregard for those whose economic situation isn't so favoured.)
I do not think the OP argues for so high a benchmark on disruption, ie, that it will significantly impair our ability to burn fossil fuels. I think it is arguing that at some point the level of economic harm and natural disasters will be catastrophic, potentially to the point of negative economic growth and declining population. Our civilization can survive low-percentage declines in both; and with it our ability to burn fossil fuels will also be preserved, should we be mad enough.
-
Humans on the verge of causing Earth’s fastest climate change in 50m years
Good article. It gives a lot of pertinent info about our situation in some fairly simple graphs. I was surprised to see the RCP4.5 and RCP6 lines so close together, especially in the forcing graph (the second one), so went back to the SkS post below. And fig 4 in this RCP guide shows that the numbers 4.5 and 6 actually refer to the forcing level (no doubt a lot of readers already knew this). So the gap between 4.5 and 6 seems too narrow when compared to the gap between 0 and 4.5, or am I interpreting this wrong? Perhaps of more interest is that fig 4 also gives temperature anomalies of 2.4 degC for RCP4.5 and 3.0 degC for RCP6, and the CO2 eq figures are 650 and 850 ppm. These temperature figures, while bad, aren't as high as I expected, and give a glimmer of hope. James Hansen in Storms of My Grandchildren estimates a conversion of .75 when going from forcing to temp, which gives 3.4 degC and 4.5 degC, which are more concerning.
-
Tom Curtis at 18:48 PM on 18 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
chriskoz @5, the figure is from the IEA's Resources and Reserves 2013. With regard to different estimates, this pyramid of US resources and reserves from the EIA is helpful:
Converting from short tons to tonnes (ie, metric tons), we have 232 gigatonnes of estimated recoverable reserves, 1,514 Gt of identified resources, and a total resource base of 3,544 Gt. They further estimate that the US has 26% of the world's coal reserves, which would indicate global recoverable reserves of about 892 Gt.
For comparison, IEA 2013 estimates 8,130 Gt as the total resource for hard coals and lignite in North America (ie, the US, Canada and Mexico). As that is more than double the US estimate, and the US has more coal than either Canada or Mexico, clearly the IEA 2013 estimate is larger - as will happen given that it is in part an estimate. If we assume all North American coal is in the US, and scale the IEA 2013 global estimate so that the North American estimate matches that by the EIA, then the global resource would be 7,500 Gt. Alternatively, if we scale the EIA US resource using the US percentage share of global reserves, we get a total global resource of 13,630 Gt. Even on the low value, once you throw in tar sands, oil shale, deep sea oil and gas resources, arctic oil and gas resources, antarctic oil and coal (not included in any of the above) etc, a civilization determined to "burn, baby, burn" regardless of consequences could far exceed the RCP 8.5 scenario; and the Twink12 scenario is a reasonable scenario for such a strategy.
As to the large differences in estimates, that will be in part because they are estimates when we are talking about the total resource. More importantly, many estimates of the amount of fossil fuels remaining in the ground restrict themselves to reserves, and/or reserves plus identified resources. I once did a spreadsheet of all the publicly available estimates across coal, oil and gas. Only a few of the estimates included any oil sands, shale oil or unconventional gas (and they not all of it), and a range of criteria were used. Of 14 estimates across 9 sources, only 3 provided estimates that may have represented the Total Resource, with estimates of 3000 GtC (World Energy Council 2010, "possible"), 11,000 GtC (S&W 2011, "TRB") and 16,000 GtC (IEA 201, "estimated"). (Note, the values quoted are my estimate of the carbon content, allowing for amounts not oxidized due to spills, etc, and for the carbon content of the fuel, based on the original resource and/or reserve estimates.) I do not think the WEC "possible" estimate is an estimate of the TRB, but rather an estimate of what part of that base could become economically accessible. The other estimates nearly match (S&W) or significantly exceed the Twink12 requirements - and all with limited examination of unconventional sources, and without allowing for LUC and cement emissions.
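The scalings earlier in this comment follow directly from the quoted figures; a minimal check using only those numbers:

```python
# Figures quoted above, all in Gt (gigatonnes).
us_reserves = 232            # EIA estimated recoverable reserves, US
us_resource = 3544           # EIA total resource base, US
us_share_of_reserves = 0.26  # US share of world coal reserves
iea_na_resource = 8130       # IEA 2013 hard coal + lignite resource, North America
iea_world_resource = 17204   # IEA 2013 global coal resource

print(round(us_reserves / us_share_of_reserves))                  # ~892 Gt global reserves
print(round(iea_world_resource * us_resource / iea_na_resource))  # ~7,500 Gt global resource
print(round(us_resource / us_share_of_reserves))                  # ~13,630 Gt global resource
```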
-
Digby Scorgie at 17:43 PM on 18 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
Is the following a fair summary?
On the one hand, we have a continuous increase in the burning of fossil fuels.
On the other hand, we have the increasingly damaging effects of climate change.
At some time in the future, climate effects will damage human civilization sufficiently to disrupt the burning of fossil fuels, resulting in a rapid decline in such burning.
If this summary is valid, the obvious question is: When?
-
chriskoz at 17:11 PM on 18 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
Tom@2,
Apology for my typo @1. I typed 5Pg and 10Pg (petagrams == gigatonnes), but I meant 5Eg and 10Eg (exagrams), which is 1000 times more.
Where did you find your number "17,204 Gt of coal resources" and 728 Gt reserves? I searched various IEA publications but cannot find your numbers.
From the Wikipedia, the world's proven C reserves are 909,064 Mt. Such numbers obviously change as new discoveries are made and extraction methods (e.g. hydraulic fracking) improve.
The resources number can be very fluid and change depending on definition of "resources". The one from IPCC AR4, e.g. as shown among other C reservoirs in OA not OK series:
shows only 3700GtC of all fossil fuel resources.
Archer 2005, 2007 etc, that we both know very well, considers only a 5000PgC slug in their model. Accordingly, David teaches 5000PgC to be the most likely slug if all FF are burned (a version of your "world where Trump is President of the US"). That number is still nowhere near your number of 17Eg+ of coal only.
So I wonder where these large differences between the various estimates of FF reservoir size come from.
-
Tom Curtis at 17:09 PM on 18 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
green tortoise @3, even in a wink12 scenario, GMST will rise by about 10 C (the temperature at which wet bulb temperature emergencies become endemic in the tropics) only around 2200. By then, the vast majority of the fossil fuels will have been burned, and, if on that pathway, it would still take several decades to convert to renewable sources. Further, even then, a nation determined to "Put America First", or "Put Europe First", rather than putting the globe first, and which was determined to adapt to climate change by massively increased energy use powered by fossil fuels, could still power on regardless.
Do I think it likely that governments will be persistently that stupid? No! But while a large number of influential think tanks, and several major governments, continue to push "Burn, baby, burn", it is a scenario that ought to be included to show the real consequences of their policies. That is, in addition to scenarios that undershoot BAU (RCP 6) to guide those who want a sensible response to AGW, we need scenarios that overshoot it as a warning to those who do not (or those who do, but might be tempted by the massive PR campaign for fossil fuels being run by the likes of WUWT).
As an aside, I suspect you were tongue in cheek, but the quote was:
"You can fool all of the people some of the time, and some of the people all of the time, but you can't fool all of the people all of the time." (Quoted from memory.) The author was Abraham Lincoln, the first Republican President. To that, pro-fossil fuel lobby and the most recent Republican President has added the addendum, you can fool enough of the people enough of the time that you can sideline those who see through your nonsense.
-
Tom Curtis at 16:51 PM on 18 April 2017 | Arctic icemelt is a natural cycle
jfrantz @64, the linked source is describing sea ice for both Antarctica and the Arctic. Antarctica has a fringe of sea ice that is preserved even in summer, though with minimal extent (about 4 million km squared). In winter it becomes very extensive, exceeding in extent the Arctic sea ice maximum, mostly because it can extend into open ocean. In terms of sea ice extent, the reduction in Arctic sea ice has been far greater, as shown in the graph in the section on Antarctic sea ice extent in the linked source. That graph, however, only extends to Dec 2012. A more recent graph shows the Antarctic sea ice extent anomaly to have declined astonishingly over the last two years, reaching Arctic (and hence negative) levels by Dec 2016:
(Source)
The low extent anomaly has continued into 2017:
(Source)
With regard to sea ice volume, we are primarily dependent on models, as there are insufficient depth measurements of the ice to provide a region-wide, continuous time series.
A region-wide model for Antarctic sea ice volume was reported in 2014, and showed sea ice volume to 2010:
The Antarctic trend of +28.7 km^3/yr compares to Arctic trends of -260 km^3/yr in April and -320 km^3/yr in September:
Obviously the recent massive retreat in Antarctic sea ice extent will also have been reflected in a retreat in Antarctic sea ice volume, but as we do not know to what extent it has been matched by a reduction in sea ice thickness, we do not know by how much.
Finally, if Al Gore did say that "arctic ice is floating and antarctic ice is on land", that is misleading (at least out of context). The peak Arctic sea ice volume was about 33,000 km^3 in April (the time of maximum volume). That is dwarfed by the 2,900,000 km^3 volume of the Greenland Ice Sheet. Looked at differently, the 14,000,000 km^2 area of the Antarctic continent, while five times the minimum Antarctic sea ice extent, is about 80% of the maximum Antarctic sea ice extent. Whether we look at volume or extent, both polar regions are a story of land ice and sea ice. However, Antarctica is land ice surrounded by sea, while the Arctic is sea ice surrounded by land. That makes a very large difference with regard to the rapidity of temperature responses, and the rapidity of albedo changes with warming or cooling, both being much faster in the Arctic. With regard to sea level rise, however, both poles are a land ice story.
-
green tortoise at 15:25 PM on 18 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
I also don't think that 12,000 Pg of fossil carbon could be liberated, mainly because, even including the Trumps, Putins et al. pro-fossil fuel politicians and regimes, the human devastation would be so huge that humanity would be destroyed long before that happens.
As an example, most of the Middle East, which has some of the biggest (and cheapest) oil & gas resources, would cross the 35°C (wet bulb temperature) boundary of human survival. Most people die of high fever above that temperature. Imagine hundreds of millions of Arabs, Indians, Iranians and Africans storming Russia, China and Europe, escaping the uninhabitable landscapes behind.
Left behind would be disaster zones surely taken by extremist groups that would certainly not do good maintenance on the legacy oil & gas facilities, just as has happened in the areas occupied by ISIS/ISIL/DAESH today, especially if bombed by the Putins and Trumps.
Given that human component, I don't think we can pass 6°C without destroying the fossil fuel infrastructure (either by climate disasters, political violence or global war).
This is a negative feedback: given some climate change, the planet becomes deadly for most people, and the ensuing collapse prevents further fossil fuel burning. This is, however, the worst kind of negative feedback imaginable, because the biosphere is saved by killing us, as if we were some kind of deadly virus or bacteria.
There is a catch, unfortunately: given the huge amount of carbon stored in shallow soil, permafrost and gas hydrates that could be destabilised by warming, maybe once the human emissions (and the human population) approach zero, the "natural" emissions triggered could push the planet to a state not seen since the Snowball Earth meltdown (i.e. Tmean = 50°C) in the Precambrian, killing everything that is not a microorganism.
If the Sun has warmed enough since the Snowball Earth meltdown, maybe even a moist greenhouse could be triggered, one that would end only when the carbon is sequestered by the flash cap-carbonate reaction. If that is not enough to compensate for the increasing water vapor greenhouse effect, the greenhouse state will last until Earth has lost most of its water to space, leaving a desert planet behind (I doubt, however, that Earth is vulnerable to a runaway greenhouse like Venus).
I nevertheless have strong hope that nothing like that will happen, because "people can be all stupid sometimes, always there is some stupid people, but people cannot be all stupid all times" (I don't remember who wrote that, any idea?).
After all, hope is what moved a lot of brave people against very adverse events in the past, and hope is badly needed to face this crisis.
By the way, I hope you have had a nice Easter.
-
jfrantz at 13:59 PM on 18 April 2017 | Arctic icemelt is a natural cycle
Serious and, I think, well-founded question here.
In the source linked, http://nsidc.org/cryosphere/sotc/sea_ice.html, they describe that Antarctic ice has been increasing during this time period while Arctic ice is decreasing. In terms of standard deviations from the mean the Arctic decrease has been twice as sharp, but:
1. In terms of volume of ice, are we net gaining or losing ice at the poles?
2. Since Arctic ice is floating and Antarctic ice is on land (pulling this from An Inconvenient Truth), should the Antarctic ice be the main concern when it comes to rising sea levels? Or is that outweighed by the specific areas growing or shrinking (e.g. growing areas are sea ice, shrinking areas are land based)?
-
Tom Curtis at 10:04 AM on 18 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
chriskoz @1, while the IEA only estimated 728 Gt (= 728 Pg) of coal reserves in 2010, they estimated 17,204 Gt of coal resources. The difference is that reserves are not only recoverable given current mining technology, they are also economic to recover at current prices. Resources in excess of reserves are also recoverable given current technology, but are not economic to recover at current prices. The transition from resource to reserve can be quite rapid given changes in technology and/or increased demand. It follows that Twink12K (ie, 12,000 GtC) is well within the limits of fossil fuel resources, and how much of that is within the limit of fossil fuel reserves is a matter to be proven by future technological and economic developments.
That leaves aside emissions from LUC and cement manufacture.
It is unlikely that we would be able to burn that much fossil fuel before the effects of climate change made further burning of fossil fuels politically infeasible (if not necessarily destroying the infrastructure, which is no more vulnerable than any other part of our civilization's infrastructure). In a world where Trump is President of the US, however, there are no guarantees.
-
chriskoz at 09:08 AM on 18 April 2017 | Humans on the verge of causing Earth’s fastest climate change in 50m years
The " Wink12K scenario" is based on this article, speculating the realease of 10Pg anthropogenic C, is the first such scenario I've seen. Apart from being unrealistic (only 5PgC of recoverable FF reserves have been estimated), I think homo sapiens would be technically unable to burn it before the transit climate change effects wiped off or seriously crippled the whole FF burning infrastructure. Then, it comes the increasing awareness that will put more pressure to curb the burning in the future, with possibly other/renewable energy sources replacing it. So, we can safely cross that Wink12K scenario as impossible.
-
Tom Dayton at 05:14 AM on 18 April 2017 | Human CO2 is a tiny % of CO2 emissions
Pattio: The airborne fraction of CO2 has been fairly constant, despite the growth in the rate of anthropogenic CO2 emissions. Therefore the natural sinks are not static. That determination has been made by scientists who, therefore, do not in reality believe the sinks are static.
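For readers new to the term, the airborne fraction is the ratio of the annual atmospheric CO2 increase to annual anthropogenic emissions; a minimal sketch with round illustrative numbers (not measured values):

```python
# Airborne fraction = annual atmospheric CO2 increase / annual anthropogenic emissions.
# Round illustrative numbers only, not measured values.
gtc_per_ppm = 2.12            # approximate GtC per ppm of atmospheric CO2
annual_increase_ppm = 2.3     # illustrative annual rise in atmospheric CO2, ppm
annual_emissions_gtc = 10.0   # illustrative annual anthropogenic emissions, GtC

airborne_fraction = annual_increase_ppm * gtc_per_ppm / annual_emissions_gtc
print(f"airborne fraction ~ {airborne_fraction:.2f}")  # ~0.49
# The remainder is taken up by ocean and land sinks, which therefore grow
# roughly in step with emissions rather than staying static.
```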