Enter a term in the search box to find its definition.
Use the controls in the far right panel to increase or decrease the number of terms automatically displayed (or to completely turn that feature off).
All IPCC definitions taken from Climate Change 2007: The Physical Science Basis. Working Group I Contribution to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, Annex I, Glossary, pp. 941-954. Cambridge University Press.
Sorry. When I said "initial", and RomanEmpire @115 said "beginning", this was not the very beginning @3, which is pretty incoherent stuff, but the Rosco comment @7. This fits with the description @115: "What Rosco was saying in the beginning is that Venus surface is far too hot for the current state of the affairs." I must admit that on re-reading @7 I did manage to misinterpret the comment, somehow reading into it the idea that the sun only heats the outer atmosphere (thus the 250 K limit), an idea which is actually absent. But in my defence, the actual argument/question presented @7 is entirely self-defeating: there is no greenhouse effect (thus the temperature is that of a black body), so how can there be a greenhouse effect?
MA Rodger @117, this is Rosco's original post upthread (@3):
"Venus is nothing like the earth - it is (-snip-) to claim it is. I have seen claims that the "greenhouse" effect on Venus is responsible for heating the planet by ~500 k. This is clearly impossible given the albedo of Venus reflects most incoming solar radiation.
If such an effect were possible it could easily solve Earth's energy problems - simply collect all the hot exhaust gases from a coal fired power station and force it into a chamber under 92 bar pressure, add sunlight and the runaway greenhouse would raise the temperature to over 700 K - and we could use this heat to drive turbines and eventually shut down the coal fired power station.
Yeah right - the whole idea is "beyond absurd"."
In successive paragraphs he shows repeatedly that he does not think there is any such thing as a greenhouse effect, that he does not know how it works, and (at the end of the second paragraph) that he does not understand the laws of thermodynamics.
The greenhouse effect is sufficiently well understood, and sufficiently well evidenced that the probability of it not existing is not meaningfully distinguishable from the probability that geocentrism is true. Ergo, his initial contribution is very much on a level with geocentrism.
There are things that are reasonably disputable, and even controversial, in climate science. That the origin of the 20th century increase in CO2 is anthropogenic, and that a greenhouse effect and an enhanced greenhouse effect exist, are not among them. Any denial of those facts merely shows an abysmal scientific ignorance. The lack of a civilized discussion that RomanEmpire points to is a direct consequence of that fact. It is not possible to have a civilized discussion defending the thesis that black is white (or geocentrism; or rejecting the existence of a greenhouse effect) because one participant must lack an essential of civilized discussion in any such case: the desire or ability to be rational. Those who sheet home the failure of civilized discussion to the rational side of the discourse need to be reminded of this fact in no uncertain terms.
I think it is wrong to characterise the initial position of Rosco up-thread as being "on a level of geocentrism". RomanEmpire @115 is persuaded that there was something in Rosco's initial position, and thus it would be wrong to dismiss it entirely off-handedly.
The basic idea that seemed to confound Rosco was that he held that the sun (less albedo) should warm the insulating outer atmosphere of Venus to some 250 K, and then he was perplexed that the surface temperature of Venus is some 750 K. How could this be? Addressing this point was not helped up-thread as Rosco arrived with a heavy load of misconceptions, but let us ignore them. What Rosco simply failed to grasp was that when a planet gains a powerful greenhouse atmosphere, it takes very little energy to raise the temperature at its surface. So the volcanism within Venus, if it had a similar heat output to Earth's (which is likely), would require only a few million years (a mere blink of an eye in the evolution of a planet's climate) to warm its thick lower atmosphere from 250 K to today's 750 K.
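A rough sanity check of that timescale is easy to do. Every number below is an assumed round value, not from the comment: Venus's atmospheric mass ~4.8×10^20 kg, a CO2 specific heat of ~850 J/(kg·K), an Earth-like geothermal flux of ~0.09 W/m², and Venus's surface area of ~4.6×10^14 m².

```python
# Rough check: how long would Earth-like geothermal heat take to warm
# Venus's atmosphere by 500 K?  All values are assumed round numbers.
mass_atm = 4.8e20      # kg, approximate mass of Venus's atmosphere
cp_co2 = 850.0         # J/(kg K), rough specific heat of CO2
delta_t = 500.0        # K, warming from 250 K to 750 K
geo_flux = 0.09        # W/m^2, Earth-like geothermal heat flux
area = 4.6e14          # m^2, approximate surface area of Venus

energy_needed = mass_atm * cp_co2 * delta_t   # J to warm the atmosphere
heating_power = geo_flux * area               # W supplied by volcanism
years = energy_needed / heating_power / 3.15e7
print(f"{years:.1e} years")
```

With these assumed values the answer comes out around 10^5 years, even faster than the "few million years" above, so the point (a geological blink of an eye) stands comfortably.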
I thought to consider whether the Zwally contention could require snowfall to be maintained over a shorter period than the whole Holocene. Could it be a more recent increase, and thus reduce the impact on sea level? The Vostok ice core gives ice depth and age from 5,679 y BP, suggesting that it takes half the length of the Holocene for high-Antarctic snow to compact completely into ice. And much of the compaction would occur in a far shorter time.
But I came away from the Vostok data with more of an appreciation of Zwally's contention. Note in the Vostok data that the annual ice thickness (which Zwally contends is the full cause of surface elevation rise) is 21 mm/year through the early Holocene but 12 mm/year through much of the last ice age (over a 60 ky period; 13 mm/y over a 100 ky period, so presumably achieving a steady-state condition), providing the sort of additional ice (1 cm/year) Zwally is arguing for. So does it not come down to the question of how the multi-kilometre-thick ice sheet will react to a multi-millennial increased rate of snowfall? Would the extra snowfall (which the Vostok data suggests has resulted in an extra 120 m of ice-equivalent added to the top of the sheet since the LGM) significantly increase the rate of ice flow? Are there other effects at play? Or does Zwally's contention need addressing? Okay, it comes with problems for the Holocene SLR record, and the measured elevation change may not be all ice, but with an extra 120 m of ice-equivalent on the ice sheet over 18,000 y, can it be argued that there should be a significant ice component within the measured surface elevation rise?
"Averaged over the planet, about 17 W/m² are absorbed at the ground (some 2.5% of the total solar energy incident on the planet)."
This is definitely inconsistent with the claim that there "... is no evidence that even 1% of solar radiation reaches Venus surface", which is revealed as hyperbole at best.
2) Chris Colose writes above:
"note Venus may never actually encountered a true runaway, there is still debate over this"
It follows that when you write "Chris essentially said ... that Venus surface is far too hot for the current state of the affairs ..., so there MUST HAVE BEEN a runaway greenhouse effect..." you are clearly misrepresenting his argument. His argument is that while the TOA insolation on Venus is sufficient to drive a runaway greenhouse effect, it is not sufficient on Earth. Venus may have reached its current conditions either by never having cooled down sufficiently from its initial heat of formation (due to a strong greenhouse effect) or by having experienced a runaway greenhouse after cooling down as the Earth did.
3) Roscoe espoused absurd theories (on the level of geocentrism). He refused to be convinced by either clear argument or the evidence provided. Scientists, no matter how respectful, cannot be expected to persuade those who come into the discussion with a closed mind, as Roscoe clearly did. Nor, if you abuse their patience by continuing to espouse nonsense rather than learn something new, can you expect the patience of scientists to persist.
My understanding is that what Chris Colose's piece (the origin of this debate) said is not totally inconsistent with what Rosco was saying (Rosco was, unfortunately, mobbed out of this thread; shame on us, scientists, for being unable to conduct a civilized discussion with a well-meaning outsider without patronizing, antagonizing, provoking, name calling, etc.).
So, Chris essentially said (again, this is my understanding) that Venus surface is far too hot for the current state of the affairs (insolation, albedo, atmosphere composition, etc.), so there MUST HAVE BEEN a runaway greenhouse effect in some (unspecified) past that heated it up, and the current state of the affairs does not let it cool too quickly.
What Rosco was saying in the beginning is that Venus surface is far too hot for the current state of the affairs. I don't understand why he had to be chased out of this thread for this, even though here Rosco and Chris seem to agree.
Where I disagree with Chris is when he says that "Less than 10% of the incident solar radiation reaches the surface." There is no evidence that even 1% of solar radiation reaches the Venus surface, so dense is the Venusian atmosphere (if we live under the equivalent of 10 m of water, our atmosphere compressed, then on Venus the equivalent depth is >900 m, and this is without taking into account the dense clouds). The light observed by the Russian station was likely due to the lightning that constantly illuminates the Venusian atmosphere.
[PS] Rosco came to debate with a history of trying the moderator's patience and a strong dislike for either reading or comprehending information that contradicted his/her preconceptions. Please note that moderation complaints are always offtopic.
Jimlj - that's a very good point. Ocean volume has been static for the last 4-5000 years - with relative sea level fall occurring in the farfield due to the effect of ocean siphoning. Zwally et al not only contradict the bulk of other Antarctic research, but Holocene sea level research too.
As noted at HotWhopper and now Deltoid, the RealClimate difficulty has a, I hope, temporary solution, as RC has resorted to a backup server. It may be useful to update the link in the heading to point to:
billthefrog @13, as you can see in the video below, the melt of the Laurentide Ice Sheet was essentially complete by 8 thousand years ago
It appears the Fennoscandian ice sheet had also essentially disappeared by 5 kya. Further, since about 8 kya, global temperatures have been slightly falling or constant, so that there is no additional sea level rise due to thermal expansion, and possibly some slight fall. It is not obvious, therefore, that there is a basis to assume compensation for an increase in Antarctic ice mass over that period.
My background is not in Paleo (although the wife keeps calling me an old git), but the long drawn out retreat of both the Laurentide and Fennoscandian ice sheets would certainly have emptied more than a few gallons into the oceans.
One would also have to factor in the delayed effects of thermal expansion. Even the surface layers of the oceans clearly show this, with oceanic temps reaching their annual maximum several months after the relevant hemispheric summer solstice. Think how long it is going to take for the deep ocean to equilibrate following the Last Glacial Maximum of about 20 - 21 kya.
My limited understanding is that thermosteric effects (i.e. thermal expansion) and eustatic effects (i.e. caused by the addition of extra water) are currently of comparable magnitude. Taken together, these would account for the sea level rise of around 120+ metres since the LGM. The linked NASA article discusses this effect.
According to Zwally's paper, not only is Antarctica gaining ice now, but as I understand it, it has been for the last 10 millennia. If everything else were equal, that would have meant at least a couple of metres of sea level drop. But we haven't seen that, so what is the source of the water that has kept sea level relatively constant?
If I've misunderstood something, or gotten some facts wrong, I'd appreciate learning where I went wrong.
Tor B @1, the indices you reference are simple temperature indices. That is, they plot the temperature in a small area of the tropical Pacific and use that as an index of ENSO states. However, because they simply take the temperature of a small region, any increases in temperature due to global warming generates a bias towards stronger and more frequent El Nino states (and weaker and less frequent La Nina states). Given that, we do not know to what extent the record values are a consequence of a strong El Nino, and to what extent they are directly due to global warming.
As you can see, it is only the third strongest El Nino by the MEI since 1950. The SOI shows the current ENSO value (-20.2) as weaker than five months of the 97-98 El Nino (peak at -28.5), and six months of the 82-83 El Nino (peak at -33.3).
There is no guarantee as to how El Ninos will develop. The current El Nino may collapse and fall away towards neutral conditions, but current predictions are that it will remain strong for several more months.
If it were to do so, it would not challenge either 97/98 or 82/83 as the strongest El Nino since 1950. It is, however, possible that it will strengthen again. If it does so, it may follow a similar pattern to the 82-83 El Nino and end up as the strongest since 1950. Or not. If it did, that would be grim news indeed.
As it stands, the big news about this El Nino is that it is so hot globally relative to 97-98, even though the El Nino event is not as strong as 97-98.
The most recent weekly Nino data (released today?) hit a new all-time record of 3.1. The baseline for the Nino datasets was reset early in 2015, raised from the previous baseline (I think by about 0.1); this makes me think this and last week's record values are all the more significant.
From the Arctic Sea Ice Forum's Consequences/El Nino? thread: "... NOAA data show[s] that the Nino 3.4 is now +3.1, that both the Nino 1+2 and the Nino 4 increased, and the Nino 3 remained constant at +3.0." Included is a table of recent weekly values for the several Nino regions.
I see that some of my questions have actually been raised and answered already above. Sorry about that.
Considering that my own calculation has essentially been verified through the replication of Foster & Rahmstorf's original numbers, and that I get quite close (about rounding-error level) to the SkS calculator with well-defined time series, I think I have no urgent questions anymore.
However, it still would be quite nice to have a clear definition of the exact data sources used, and the timestamp when the SkS calculator retrieved refreshed datasets. But I well understand the issue of time in a volunteer setup.
It is not a "redistribution" Zwally is arguing for. Rather it is a millennia-long accumulation he is proposing. The argument is that the snowfall today is not much changed from the snowfall over recent millennia. Prior to this, in the distant past, there was both less snow and less ice. The snow compacts to ice that then, with time, flows off the summits of Antarctica towards the oceans. But the arriving snowfall, while constant over the millennia, is greater than the ice flow. So ice continues to build up as it has over all those millennia.
I'm having a hard time comprehending the argument that an increase in precipitation 18,000 to 12,000 years ago could be seen as an increase in mass in recent years. Once a net gain has occurred then we're only talking about redistribution of the mass by ice movement.
Yet there is no mention of the satellite data, which does not agree with the tone of this article, particularly since Spencer & Christy rolled out UAH 6.0. Now both RSS and UAH seem to show there is no warming to speak of in the lower troposphere. Right? So long as that is the case, deniers see it as confirmation that this is a big hoax. I did find the comparison of RSS to RATPAC data in this link interesting:
The chart Andy posted at #13 conspicuously rules out nuclear a priori. We are going to need all the tools available to meet this challenge, including new generations of nuclear technology. Again, it's political ...
The concentration of water vapour, a powerful greenhouse gas, at the earth's surface is usually tens of thousands of parts per million, which completely dwarfs CO2's 400 ppm, let alone changes in its concentration of the order of 100 parts per million. So say the skeptics.
But this isn't the case at high altitudes. At 40,000 feet, at the top of the troposphere, the temperature is around -50 C, and so the water vapour is frozen out, with a vapour pressure of about 4 Pa (1 Pa at -60 C). Atmospheric pressure at sea level is about 100,000 Pa, and about 20,000 Pa at 40,000 feet. If I assume that the concentration of CO2 remains about the same, with a bit of turbulence stopping the heavy CO2 molecules settling out to lower altitudes, then the pressure of CO2 would be about 8 Pa = 20,000 × (400/1,000,000) Pa, which is twice the pressure of the water vapour. So maybe an increase in the CO2 concentration does matter. We shall all get warmer, regardless of the height at which the heat flow into space is restricted.
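The partial-pressure arithmetic above is easy to check in a few lines. Note the ~4 Pa saturation vapour pressure over ice at -50 C is the value quoted in the comment, not computed here.

```python
# Partial pressure of CO2 near the tropopause, assuming a well-mixed 400 ppm
p_total = 20_000                 # Pa, approximate pressure at 40,000 ft
co2_ppm = 400

p_co2 = p_total * co2_ppm / 1_000_000
print(p_co2, "Pa")               # 8.0 Pa

# Saturation vapour pressure over ice at -50 C, as quoted in the comment
p_h2o = 4.0                      # Pa
print(p_co2 / p_h2o)             # 2.0: CO2 exerts twice the pressure of water vapour
```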
Read part 1B for more on the distances over which one can interpolate. And as scaddenp said, Hansen & Lebedeff 1987 is important to read.
Let me address why anomalies give better accuracy mathematically. This article at Wikipedia on Accuracy & Precision is worth reading, particularly for the difference between the two terms.
For any reading from an instrument, a thermometer for example, we know that the reading will consist of the true value of what we are interested in plus some error in the measurement. In turn this error is made up of the accuracy of the reading and the precision of the reading. What is the difference between the two ideas?
Accuracy is the intrinsic, built-in error of the instrument. For example, a thermometer might be mis-calibrated and read 1 degree warmer than the real value. In many situations the accuracy of a device is constant: built into the device.
Precision is how precisely we can take the reading. So by eye we might only be able to read a mercury thermometer to 1/4 of a degree. A digital thermometer might report temperature to 1/100th of a degree.
So if we take many readings with our instrument each reading will be:
Reading = True Value + Accuracy + Precision.
This image might illustrate that:
So if we take the sum of many readings we will get:
And the Average of the Readings is the Average of the True values + the Average of the Accuracy + the Average of the precisions.
But the more readings we take, the more the average of the precisions will tend towards zero. The average of a random set of values centered around zero is zero. So the more readings we take, the better the precision of the average. Mathematically, the precision of an average is the precision of a single reading divided by the square root of the number of readings.
So if an instrument has a precision of +/- 0.1, an average of 100 readings has a precision of 0.1 / sqrt(100) = 0.01, i.e. 10 times more precise.
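A quick simulation illustrates the 1/sqrt(N) improvement; the true value of 20.0 and the 0.1 precision are arbitrary choices for the sketch.

```python
import random
import statistics

random.seed(42)
true_value = 20.0
precision_sd = 0.1   # random (precision) error of a single reading

# Take the mean of n = 100 noisy readings, many times over, and see how
# much those means scatter: the scatter should be precision_sd / sqrt(n).
n = 100
trial_means = [
    statistics.mean(true_value + random.gauss(0, precision_sd) for _ in range(n))
    for _ in range(2000)
]
sd_of_mean = statistics.stdev(trial_means)
print(round(sd_of_mean, 3))   # ≈ 0.01 = 0.1 / sqrt(100)
```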
And the average of the accuracy (for an instrument with a fixed accuracy) is just equal to the accuracy.
So the more readings we have, the more the average tends towards being: Average of Readings = Average of True Values + Accuracy.
Now let us take an anomaly. When we take the difference between 1 reading and the average we get:
Single Anomaly = (Single True Value + Accuracy + Precision) - (Average True Value + Accuracy)
which gives us:
Single Anomaly = (Single True Value - Average True Value) + Precision.
So by taking an anomaly against its average we have cancelled out the fixed part of the error, the accuracy. We have effectively been able to remove the influence of built-in errors in our instrument from our measurement.
But we still have the precision from our single reading left.
So if we now take the average of many anomalies we will get:
Average of Read Anomalies = Average of True Anomalies.
Accuracy has already been removed, and now the average of the precisions will tend towards zero the more anomalies we have.
So an average of the anomalies gives us a much better measure. And the remaining error in the result is a function of the precision of the instruments, not their accuracy. Manufacturers of instruments can usually give us standardised values for their instrument's precision. So we can then calculate the precision of the final result.
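The bias-cancellation argument can be demonstrated with a toy simulation (all numbers arbitrary): two mis-calibrated thermometers watching the same warming disagree in absolute terms, but their anomalies agree closely.

```python
import random
import statistics

random.seed(0)

def readings(true_series, bias, noise_sd=0.1):
    """Simulated readings: true value + fixed bias (accuracy) + noise (precision)."""
    return [t + bias + random.gauss(0, noise_sd) for t in true_series]

true_series = [15.0 + 0.02 * yr for yr in range(50)]   # slow warming trend

a = readings(true_series, bias=+1.0)   # thermometer reading 1.0 warm
b = readings(true_series, bias=-0.7)   # thermometer reading 0.7 cold

# Absolute averages disagree by the difference in biases...
print(round(statistics.mean(a) - statistics.mean(b), 2))   # ≈ 1.7

# ...but anomalies (reading minus each instrument's own baseline) agree:
anom_a = [x - statistics.mean(a[:30]) for x in a]   # baseline = first 30 years
anom_b = [x - statistics.mean(b[:30]) for x in b]
diff = statistics.mean(abs(x - y) for x, y in zip(anom_a, anom_b))
print(round(diff, 2))   # small: only the random (precision) error remains
```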
But we have to abandon using absolute temperatures. They would be much less accurate. Since the topic is climate change, not climate, anomalies let us measure the change more accurately than we could measure the static value.
There is one remaining issue, which is the topic of parts 2A and 2B. What if our accuracy isn't constant? What if it changes? One thermometer might read 1 degree warm. But if that thermometer is later replaced by another one that reads 1 degree cold, then our calculation is thrown out; the accuracies don't cancel out.
Detecting these changes in bias (accuracy) is what the temperature adjustment process is all about. Without detecting them our temperature anomaly record contains undetected errors.
PaulG - the way you do it is to take lots of thermometers at varying distances from each other and see exactly how the correlation of anomalies varies with distance. That would be the Hansen and Lebedeff 1987 paper referenced in the main articles.
Okay, I read the explanation that averaged anomalies at nearby locations will typically show less change than will average measured temperatures. That sort of makes sense. I am not sure how less change means more correct.
And I understand the idea of teleconnection. What I don't understand is how you can generalize, or interpolate, in areas where you don't have any measurements.
It seems to me that you won't be able to determine the correlation coefficients resulting from teleconnection unless you have all of the temperature measurements in the first place.
How can you determine the error in your averaging if you don't have all of the measurements to determine that error? The average of anomalies approach may intuitively feel better, but how is it mathematically justified?
To add, bear in mind that the denier mantra "satellites are true and NOAA lies" is something I increasingly have trouble arguing against, even considering that the 10 positive indicators of a warming world show the idea that we have not really warmed in 20 years makes no sense. But satellites don't lie, right?
[Rob P] - It's informative to look at the satellite trend (for RSS) for the lower troposphere:
So the satellite data does in fact show a long-term warming trend. If one selects the super El Nino of 1997/1998 as a starting point it's possible to fool the uninformed, but that deceptive practice may soon come to an end with the current super El Nino likely to anomalously warm the lower troposphere in the next 4-5 months.
GregCharles is correct about UAH 6.0. It also points out the need for several popular web sites to "get with the times". Whatever the reason UAH 6.0 was adjusted, the effect was to make it nearly identical to RSS. This improves "consistency", but raises the question of why UAH thought RSS right and everyone else wrong. Regardless, all the popular sites I visit still rely on the older UAH 5.6 or thereabouts, and we are now left with the satellite data showing nearly no warming for the past 20 years, and the surface data sets with more significant warming. The clear discrepancy between RSS & UAH 6.0 and the surface data (GISS, HadCRUT4, Berkeley, etc.) is something this site should address, and I'm not seeing it.
The Karl et al 2015 paper that deniers are objecting to is linked to in Tom's post. Another link here.
Full citation: Karl, Thomas R., Anthony Arguez, Boyin Huang, Jay H. Lawrimore, James R. McMahon, Matthew J. Menne, Thomas C. Peterson, Russell S. Vose, and Huai-Min Zhang. "Possible artifacts of data biases in the recent global surface warming hiatus." Science 348, no. 6242 (2015): 1469-1472.
I found the code that Tamino used for their 2011 paper and the way they calculated standard error. It looks like I am now able to replicate quite well the slopes and confidence intervals produced by the SkS calculator, at least for those series that are clearly identifiable, like Berkeley Land, UAH (5.6), RSS (TLT 3.3), GISTEMP and HadCRUT4. Unfortunately I still have some challenges with references like NOAA and Karl 2015, which I was not able to identify unambiguously. What I think they may mean (NCDC V3?) is not as close as I would have hoped for. Furthermore, Berkeley global is a challenge for me, as my numbers there deviate too much from SkS. It would be extremely helpful if the exact data sources (and last update time) were visible someplace on the calculator page. (To avoid issues related to possible missing data points in 2015, I have made my calculations for the period January 1979 to December 2014.)
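For anyone attempting the same replication: I can't speak for the calculator's exact method, but a minimal sketch of the usual approach on synthetic data is an OLS trend whose naive standard error is inflated for lag-1 autocorrelation of the residuals (an AR(1) approximation; Foster & Rahmstorf's 2011 paper used an ARMA(1,1) refinement of this idea).

```python
import random

random.seed(1)

# Synthetic monthly anomaly series: linear trend + AR(1) noise
# (a stand-in for a real temperature dataset, 1979-2014 = 432 months)
n, trend = 432, 0.015 / 12          # 0.015 °C/yr expressed per month
y, e = [], 0.0
for t in range(n):
    e = 0.6 * e + random.gauss(0, 0.1)
    y.append(trend * t + e)

# OLS slope
x = list(range(n))
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx

# Naive (white-noise) standard error of the slope
resid = [yi - my - slope * (xi - mx) for xi, yi in zip(x, y)]
s2 = sum(r * r for r in resid) / (n - 2)
se_naive = (s2 / sxx) ** 0.5

# Inflate for lag-1 autocorrelation of the residuals (AR(1) approximation)
r1 = sum(resid[i] * resid[i - 1] for i in range(1, n)) / sum(r * r for r in resid)
se_corrected = se_naive * ((1 + r1) / (1 - r1)) ** 0.5

print(round(slope * 120, 3), "°C/decade")
print(round(se_corrected / se_naive, 2), "x wider than the naive error bar")
```

The point of the correction is that autocorrelated residuals carry fewer independent data points than the raw count suggests, so the naive confidence interval is too narrow.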
Michael: Yes, that is one of the articles that made me think it is possible to replace fossil fuels completely. Then I read other articles saying in response that it's not so easy! One thing glossed over is fuel for aircraft and ships. I've heard of emission-free synfuels (synthetic fuels) for aircraft — whether or not they work for ships I don't know.
Just a note about the Zwally paper. Whilst it has been recently published, the research is much older and I first saw it as a conference paper listed on a NASA page, in 2012. It was also referenced in a Nature article that year. The conference paper had the same title and the same authors (apart from one).
This article shows that it is possible to build a completely renewable energy supply by 2050. If a high carbon fee was instituted so that carbon paid for all the damage it causes, investors would build out the renewable energy we need. It would save everyone money and result in better health. The primary block is political due to the influence of fossil fuel companies who stand to lose trillions of dollars.
Fossil fuels are only cheap because they make everyone else pay for the damage they cause. Obama's clean power plan is completely justified by the health savings, the climate benefits are not considered.
There are actually two aspects to my "subversive thought". Firstly, is it physically possible to find ways to cope with declining supplies of fossil fuel over a thirty-year period? I don't know. From what I read, sometimes I think it is, sometimes I think it isn't. I just don't know. That's why I asked.
Secondly, how should declining supplies of fossil fuel be allocated between countries? This is the aspect you've latched onto, but this is a different problem altogether. It is a problem the world would've had to face if we'd really passed peak-oil and supplies really were declining. My subversive thought was to suppose the decline was imposed, not natural.
Another way to look at the second aspect is this: We're aiming to induce a decline in fossil-fuel use in the face of a plentiful supply. Well, good luck with that! Whatever methods are used, I'll believe them only when I see fossil-fuel production declining in reaction to declining demand. I fear that, as you imply, the threat needs to be immediate and life-threatening before this will happen.
If you examine NOAA OHC data, you'll see a strong El Nino signal in the 0-100m Pacific record with OHC rising during El Nino. The signal reverses for 0-700m data but becomes far less obvious. So when global 0-2000m OHC data is examined there is nothing really left to see.
The 0-2000m OHC data does indeed record OHC reduced from 2015Q2 to 2015Q3. But the same thing happened in both 2014 and 2013, when it showed even greater reductions but without an El Nino. Indeed, in the post the "400 million atomic bomb detonations (27 zettajoules, or 27 billion trillion Joules)" figure for ΔOHC in 2015 appears to be the annual figure for 2015Q3 - 2014Q3, which is 27.6 ZJ. Over the last 8 quarters the annual ΔOHC averages 22 ZJ.
One comparison not made in the post, but which I consider more interesting than numbers of bomb-equivalents, is the global energy imbalance required to add, say, 22 ZJ/year to the oceans. That would be 1.37 W/m², or, according to IPCC AR5 Table AII.1.2, equal in size to the increase in all the positive human climate forcings 1978-2011, which, with these OHC figures, would thus still be up there warming away despite the level of AGW we have witnessed since that date. That's a rather scary equivalent.
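The 1.37 W/m² figure is easy to verify, using a whole-Earth surface area of about 5.1×10^14 m²:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600
EARTH_SURFACE_M2 = 5.10e14     # whole-Earth surface area, m^2

delta_ohc = 22e21              # J/year: 22 ZJ added to the oceans annually
imbalance = delta_ohc / (SECONDS_PER_YEAR * EARTH_SURFACE_M2)
print(round(imbalance, 2), "W/m^2")   # 1.37
```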
Digby: Suppose fossil-fuel producers were "forced" to cut production at the above rate. Would the free market coupled with human ingenuity find ways to cope?!
One question that springs to mind: how would the cuts be allocated, between countries and among fuel types? Also, this would drive prices up, so who would benefit, the producers or would governments have to institute some kind of windfall profits tax? It would also be regressive, so some form of compensation to the poor would be needed.
Perhaps a less regressive policy would be to ration fossil fuel use at the consumer level. But until there is a widespread consensus that we are facing an urgent existential crisis, I can't see that being politically feasible.
Far better would be a cap-and-trade policy. After all, the goal should be to lower emissions, not necessarily to reduce fossil fuel use. Even better than that, in my opinion anyway, would be a steadily rising carbon-tax-and-dividend policy.
It appears the rate of rise in ocean heat content is trending up. Since 1990 (25 years), it appears to have increased ~21x10^22 J (15,668 atom bombs/hr), based on an atom bomb = 17x10^9 watt-hrs. In contrast, since 2008 (7 years), heat content appears to have increased ~8.5x10^22 J (22,650 atom bombs/hr). The article notes 2015's El-Nino-driven spike in heat content amounting to 50,400 atom bombs/hr (14/sec). I'm a bit perplexed: I had thought that in El-Nino years the stored-up ocean heat content was expelled (at more than average rates) to the surface, causing spikes in land and ocean surface temps, and also causing a slight decrease in the rate of ocean heat content rise (compared to average) due to this more aggressive expulsion of heat from the ocean, with the opposite effect in La-Nina years, where the rate of rise of ocean heat content would be more than average. So I'm surprised to see the OHC rate of rise increase in 2015's El-Nino year. I obviously have something wrong here.
[Rob P] This years super El Nino has yet to really leave its mark on ocean heat content. The data for ocean heat content end in September and we're only reaching the peak of El Nino now. The next two quarters of OHC may tell a different story.
Time-series of globally-averaged (60°South to 60°North) ocean temperature anomaly from the monthly mean, versus depth in metres (pressure in dbar). (b) Time-series of globally-averaged sea surface temperature anomaly (black, °C), ocean temperature at 160 metre depth (blue), and the Nino3.4 regression estimate for SST (red). From Roemmich & Gilson (2011)
I would have thought that higher surface temperatures would have resulted in more infrared radiation out to the atmosphere and space, and less accumulation of heat in the oceans. That is, wouldn't there be more accumulation in cooler surface temp / La Nina years and less in El Nino years? Maybe that's why there's a downward turn at the end of the graph, which only goes to June 2015. That downward turn may continue while the El Nino unfolds.
The scenario linked to is interesting, thanks. It seems to match the back-of-envelope estimate I've made in the meantime: if fossil-fuel use remains at 30 Gt per year until 2020 and thereafter declines at 1 Gt per year for almost 30 years, one uses 585 Gt of the budget. That leaves a residue of 15 Gt to play with. Being so precise is a bit silly really, considering all the uncertainties involved. However, the overall effect does seem to match the "WWF and Ecofys" scenario, even to the small tail of fossil-fuel use from 2050 onwards.
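For what it's worth, the back-of-envelope sum above checks out. The 600 Gt total budget is inferred from the 585 + 15 split in the comment, and the flat phase is taken as the 5 years 2015-2020.

```python
budget = 600                 # Gt: inferred from the 585 used + 15 residue
use = 30 * 5                 # 30 Gt/yr held flat for 5 years (2015-2020)
rate = 30
while rate > 0:              # then decline by 1 Gt/yr until zero
    rate -= 1
    use += rate              # emit 29, 28, ..., 1, 0 Gt in successive years

print(use)                   # 585 Gt used
print(budget - use)          # 15 Gt left to play with
```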
As you point out, feasibility is a different story altogether. In this regard, let me offer a subversive thought: Suppose fossil-fuel producers were "forced" to cut production at the above rate. Would the free market coupled with human ingenuity find ways to cope?!
Morgan01944 @5, the full context of your final quote reads:
"One other thing is certain: West Antarctica has been losing mass at an increasing rate since the 1990s and, irrespective of what is happening further East, that trend looks set to continue. Going to the other end of the Earth, the Greenland ice sheet has also been losing mass at an accelerating rate since around 1995. Greenland is now the single biggest source of mass to the oceans. These trends at both poles are huge signals that are unequivocal and uncontested."
West Antarctica is indeed at one pole, as Greenland is at another. Further, their trends in ice loss are indeed both huge and uncontested (including by Zwally et al). That it is not clear as to the trend in the East Antarctic Ice Sheet has no bearing on that claim. Ergo, your juxtaposing this quote with the two others, which discuss the reasons for lack of clarity in East Antarctica, shows at best careless reading, not a problem with the OP.
"If we knew the snowfall history perfectly then there wouldn’t be any controversy!"
"So what is really happening? One thing that Zwally’s study does highlight is how difficult it is to nail what is happening in East Antarctica because the signal is small and contaminated by unwanted effects that are as large or even large"
" trends at both poles are huge signals that are unequivocal and uncontested."
[DB] Note that this article is a repost. Please refrain from arguments from your personal incredulity, like you have done here.
If you have substantive issues with the content of the article, please cite those specific issues along with a link to a published analysis or examples from the primary literature that support your contentions.
So what's really happening in Antarctica?
Looks like the RealClimate DNS is down or subverted; I have informed Prof. Schmidt. The majority of nameservers that I checked claim no authoritative nameserver exists; a minority redirect to IP 126.96.36.199, which is also returned by directly querying the listed authoritative nameservers, and where a spoof site resides claiming the domain has expired and inviting one to buy it. A whois check is more illuminating; I have sent the results on to Prof. Schmidt also.
The "issue" with RC manifests itself primarily on weekends, as also appears to be the case here.
So it's simply regular server maintenance, or they don't run a server on weekends. No surprise; climate scientists usually don't have much volunteer money to spare to maintain their ageing server. RC has been running for over 10 years with the same graphical appearance, which is quite impressive in these days of constant (and not necessarily useful) technology changes.
Ed Wiebe @ 3 correctly points out that superconductivity of graphene coated with lithium is achieved at 5.9 K, not room temperature.
The sentence: ‘When coated with Lithium it becomes a superconductor, having no resistance to an electric current at room temperature’ is wrong and the text has been modified to show deletion of the sentence.
Here's a scenario by WWF and Ecofys that does the job of getting rid of fossil fuels by 2050. You should note that the total energy consumed in 2050 is about one-third of the amount in the RCP2.6 scenario I referred to in my post. Also, the amount of renewables and non-BECCS bioenergy looks (eyeball) similar between the van Vuuren and WWF cases, somewhere around 200-250 EJ/year.
This would be a huge challenge over 35 years and relies very heavily on energy efficiencies as well as aid to developing countries to allow them to grow their economies without increasing emissions. Much of the capital stock we have like power stations would have to be scrapped before its economic lifetime was up.
To me, this scenario looks even less feasible than the one that relies on CCS and BECCS.