Recent Comments
Comments 57801 to 57850:
-
curiousd at 12:30 PM on 26 June 2012 - Hansen and Sato Estimate Climate Sensitivity from Earth's History
Input wanted on Hansen's year 2000 article in Proc. Nat. Acad. Sci. 97, pp. 9874-9880. Please! Here he says "...rapid warming in recent decades has been driven mainly by non-CO2 greenhouse gases (GHGs), such as chlorofluorocarbons, CH4, and N2O, not by the products of fossil fuel burning, CO2 and aerosols." Wow folks, doesn't this contradict the seminal 1981 paper? What happened here? -
Tom Curtis at 09:08 AM on 26 June 2012 - Hansen and Sato Estimate Climate Sensitivity from Earth's History
curiousd @23, relative humidity tends to be greater over ocean than over land, so increasing the surface area of the Earth's oceans would indeed increase the relative strength of the water vapour feedback. It would also increase the strength of the negative lapse rate feedback, which is the reduction of the lapse rate with increased humidity. The two effects do not cancel out, so if the Earth were a 100% water world it would be warmer, all else being equal. However, it would not be sufficiently warm to initiate a runaway greenhouse effect. The reasons why are discussed by Chris Colose here. The essence of the argument is that positive feedbacks reduce the outgoing radiation for a given surface temperature. Only if the positive feedback reduces the OLR so much that arbitrarily large increases of surface temperature are required to match the incoming solar radiation will you get a runaway feedback. For the Earth, with the water vapour/lapse rate feedback, effective insolation (insolation*(1-albedo)) would need to be just over 320 W/m^2 rather than the current 240 W/m^2. The effective insolation needed to reach runaway greenhouse is called the Kombayashi-Ingersoll limit. Put another way, the Earth will only achieve a runaway greenhouse effect with current insolation if its albedo is reduced to 0.05 (give or take a bit for uncertainties). -
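A minimal back-of-the-envelope sketch of that albedo threshold, taking the 320 W/m^2 Kombayashi-Ingersoll figure quoted above at face value; the solar constant and present albedo used below are assumed round numbers, not model output.

```python
# Rough check of the albedo needed to reach the Kombayashi-Ingersoll limit,
# using the round numbers quoted in the comment above (assumed values).
S = 1366.0                 # solar constant, W/m^2 (approximate)
insolation = S / 4.0       # globally averaged top-of-atmosphere insolation, ~341 W/m^2

current_albedo = 0.30
absorbed_now = insolation * (1 - current_albedo)   # ~239 W/m^2, the "current 240"

ki_limit = 320.0           # effective insolation at the runaway threshold (from the comment)
runaway_albedo = 1 - ki_limit / insolation         # ~0.06

print(f"absorbed solar today: {absorbed_now:.0f} W/m^2")
print(f"albedo needed for runaway at present insolation: {runaway_albedo:.2f}")
```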
Bob Loblaw at 08:56 AM on 26 June 2012 - Mercury rising: Greater L.A. to heat up an average 4 to 5 degrees by mid-century
daisym: Well, UHI is usually a function of the logarithm of population (Oke, 1967), so the effect tapers off as growth continues, and it's usually caused by changing land use (pavement replacing natural surfaces, heat retention in artificial structures, etc.). How much more growth can the LA basin handle, and how much of it is still left in a natural state to pave over? -
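To illustrate the logarithmic tapering mentioned above, a toy sketch of an Oke-style scaling; the coefficients a and b below are hypothetical and chosen only to show the shape, not fitted values from any study.

```python
import math

# Illustrative only: maximum urban heat island intensity scaling with the
# logarithm of population, dT = a * log10(P) + b. Coefficients are assumed.
a, b = 2.0, -4.0

for population in (1e5, 1e6, 1e7, 2e7):
    duhi = a * math.log10(population) + b
    print(f"population {population:>12,.0f}: UHI ~ {duhi:.1f} C")

# Each tenfold jump in population adds the same a degrees, so doubling an
# already-huge city adds only a*log10(2) ~ 0.6 C here: the effect tapers off.
```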
Eric (skeptic) at 07:50 AM on 26 June 2012 - Mercury rising: Greater L.A. to heat up an average 4 to 5 degrees by mid-century
The heat around the LA area depends almost entirely on offshore versus onshore winds. The global model cannot predict that pattern, so they used downscaling. Reading through the study here http://c-change.la/pdf/LARC-web.pdf I see in the appendix that the offshore vs onshore parameters (alpha and beta) are determined by minimizing the error between the global and regional models. Thus, alpha and beta are a function of three parameters from the global model (lapse rate, along with warming and ocean-to-desert contrast). The most obvious flaw is that the onshore vs offshore regime will depend on the larger climate pattern. For example, the past winter's La Nina created more offshore winds. It does not appear that there is any connection from such climate regimes in the global model to the local model, as they state in the paper that the connections are mostly through temperature, contrast and lapse rate. Here's an alternative scenario: [LINK]
Moderator Response: [RH] Shortened link that was breaking page formatting. -
daisym at 06:46 AM on 26 June 2012 - Mercury rising: Greater L.A. to heat up an average 4 to 5 degrees by mid-century
Los Angeles is one of the largest urban heat islands in the U.S. Is Alex Hall's predicted temperature increase for the L.A. region NET of UHI temp increases arising from projected population growth? What did Hall use as the projected L.A. temperature increase due to population growth through mid-century? -
cynicus at 06:37 AM on 26 June 2012 - Adding wind power saves CO2
FYI, Fred Udo is great friends with De Groot and Le Pair, so don't be surprised when his 'science' turns out to be as cherry-picked and deeply flawed as the others'. He also trots out the false 'OCGT stations balance wind so CO2 savings are nil' argument. -
funglestrumpet at 06:23 AM on 26 June 2012 - Mercury rising: Greater L.A. to heat up an average 4 to 5 degrees by mid-century
They are lucky, the U.S.A. being one of the places in the world that will be least affected by climate change. Some poor sods stand little or no chance of adapting to it other than by burying their dead, of course. It would be nice if, when planning their adaptation strategies, these American experts could offer advice to their less fortunate fellow humans who live in parts of the world that will be most affected by it. Especially seeing as these people often have neither the expertise nor the time, what with subsistence farming to cope with, and AIDS and water shortages and so on. -
miffedmax at 06:14 AM on 26 June 2012 - Response to Vahrenholt and Luning
One can only hope there is a special place in academic hell for those who repeatedly cherry-pick the work of others to cite conclusions that are the exact opposite of what the original researchers conclude. -
Jim Powell at 03:04 AM on 26 June 2012 - Review of new iBook: Going to Extremes
It should show up in the Dutch store soon, and the price should be US$0.99 or the equivalent in each currency. It will only work on an iPad, not on Linux. -
Bob Lacatena at 02:21 AM on 26 June 2012 - Hansen and Sato Estimate Climate Sensitivity from Earth's History
Here's the image from that link, changed to also include the atmosphere rolled into a ball alongside the oceans. Note that all of the biomatter on Earth is an almost invisible speck, the size of one pixel, in this image: -
Bob Lacatena at 02:14 AM on 26 June 2012 - Hansen and Sato Estimate Climate Sensitivity from Earth's History
curiousd, No. It's really, really hard to get to a runaway. It has nothing to do with the surface of the Earth, or how much water. Surface area means nothing (everything is in units per square meter; it doesn't matter how many square meters there are in total). There's more than enough water on Earth, too; the problem isn't that the system runs out of water. But there are negative feedbacks in place as well, the largest of which is the Planck effect... the hotter it gets, the more it radiates (as the fourth power of temperature), so it gets harder and harder to push that envelope. The "doubling" rule of CO2 applies to water vapor as well. Then there's the lapse rate feedback... as the Earth warms, the lapse rate changes, and it becomes easier to radiate energy to space (i.e. more of the temperature change is higher up, where it can escape more easily to space). In the end, achieving a runaway is really, really hard. On how much water there is on Earth... enough, but just for fun look at this. -
curiousd at 00:48 AM on 26 June 2012 - Hansen and Sato Estimate Climate Sensitivity from Earth's History
O.K., I think I kind of dig it; special thanks to scaddenp, who points out that at each time step the computer models respond to the current temperature, so the kind of effect I am talking about is IMPLICITLY calculated. One follow-up question... I get the feeling from the old 1981 Hansen paper that his water vapor feedback on the initial CO2 temperature increase is positive but less than unity, which is a really good thing! The temperature of the Earth with no greenhouse gases would be given by (S is the solar constant) S/4 = emissivity x Stefan-Boltzmann constant x T^4, so the surface area of the Earth cancels out. But if the surface area of the Earth were larger, and it contained proportionally more water, then once the greenhouse effect were taken into account I suspect you could get closer to a runaway condition, even if the distance from the Sun is the same as in the real case. Yes? No? (Here I am trying to see the kinds of things that go into the modeling... probably one of them is how much water there is on Earth?) -
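For concreteness, a minimal sketch of that zero-greenhouse estimate; the albedo factor left implicit in the comment is written out here, and the solar constant, albedo, and emissivity are assumed round numbers.

```python
# Zero-greenhouse effective temperature from S/4 * (1 - albedo) = emissivity * sigma * T^4.
sigma = 5.670e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1366.0              # solar constant, W/m^2 (approximate)
albedo = 0.3            # planetary albedo (assumed)
emissivity = 1.0        # treat the surface as a blackbody for this estimate

T = (S / 4 * (1 - albedo) / (emissivity * sigma)) ** 0.25
print(f"effective temperature with no greenhouse effect: {T:.0f} K")   # ~255 K
```

As the comment notes, the planet's surface area never enters: both absorbed sunlight and emitted radiation are per square meter, so it cancels.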
Bob Lacatena at 00:04 AM on 26 June 2012 - Response to Vahrenholt and Luning
renewable guy, Until the link is properly fixed in the post, you can find this excellent talk by the American geologist Richard Alley here.
Moderator Response: [DB] All links fixed. -
Bob Lacatena at 23:40 PM on 25 June 2012 - Hansen and Sato Estimate Climate Sensitivity from Earth's History
19, curiousd, The short answer to your question is "yes." The long answer is that climate sensitivity (in addition to Tristan's distinction between transient, fast-feedback and Earth System) is a lot of different things, because it doesn't really exist as some fixed universal constant. As explained previously, every system configuration will respond differently. In casual conversation and for simple box models we use a single scalar to represent sensitivity relative to a doubling of CO2. But in a physics-based climate model, for instance, you don't tell it the sensitivity; you tell it how atmospheric water vapor responds to a change in temperature, and how temperature responds to a change in water vapor... and a hundred other things. Then the model plays out and tells you what the climate sensitivity is in that scenario, as a (grossly oversimplified) scalar value. So to answer your question more directly: if you are talking about a climate model, the non-runaway-vapor-thing is included as a matter of physics; if you're talking about a simple scalar value from paleoclimate or another observational study, the non-runaway-vapor-thing is included because you're looking at the "end result difference"; and if it's a value computed in some other way it includes the non-runaway-vapor-thing unless the person who put the number together was stupid. -
macoles at 23:34 PM on 25 June 2012 - An exponential increase in CO2 will result in a linear increase in temperature
I was curious to see if global warming trend acceleration was noticeable in the Foster and Rahmstorf Trend Calculator. I used the GISS dataset option and Excel to plot 17-year trend values over 16 consecutive years (i.e. from 1979-1996 to 1994-2011). Here it is: Excel's TREND function calculated the increase of these trends to be 0.0579C per decade per decade. Another way to think of that acceleration is that every 21 months the per-decade rate increases by 0.01C. Extrapolating that rise to a 2012 midpoint indicates that the current rate of warming is around 0.26C per decade, which compared to the often quoted average of 0.17C per decade (over the period 1979-2011) is rather alarming. Now I'm just a humble engineer with only a basic grasp of statistics and climate science, so I'll concede that maybe 17-year trends over 16 consecutive years is a bit too short or not enough data to pass a significance test, but the Foster and Rahmstorf datasets are far less noisy than the raw datasets that require 30-year trends. -
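A sketch of the same rolling-trend calculation outside Excel; `years` and `temps` are assumed placeholders for the annual GISS anomalies for 1979-2011, not the actual dataset, and the 17-year window follows the comment above.

```python
import numpy as np

def rolling_trends(years, temps, window=17):
    """Least-squares trend (C per decade) of each consecutive `window`-year span."""
    years = np.asarray(years, dtype=float)
    temps = np.asarray(temps, dtype=float)
    mids, trends = [], []
    for start in range(len(years) - window + 1):
        y, t = years[start:start + window], temps[start:start + window]
        slope = np.polyfit(y, t, 1)[0]      # C per year
        mids.append(y.mean())
        trends.append(slope * 10.0)         # C per decade
    return np.array(mids), np.array(trends)

# Acceleration = slope of a line fitted through the trends themselves:
# mids, trends = rolling_trends(years, temps)
# accel = np.polyfit(mids, trends, 1)[0] * 10.0   # C per decade per decade
```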
renewable guy at 23:28 PM on 25 June 2012 - Response to Vahrenholt and Luning
Is it possible to get the link to Richard Alley working? -
ajki at 21:49 PM on 25 June 2012 - Response to Vahrenholt and Luning
Thanks for keeping up the tedious work of myth rebuttal. German readers might be interested in a wiki especially dedicated to the book (KalteSonneCheck) by Vahrenholt, Lüning and their ilk, which tries to show the many gross misrepresentations chapter by chapter (often referring to SkS pages along the way) - though this may be next to impossible to do since there are so many of them. -
Tristan at 21:19 PM on 25 June 2012 - Hansen and Sato Estimate Climate Sensitivity from Earth's History
curiousd, What do you mean by 'real climate sensitivity'? There are three different 'lengths' of climate sensitivity. A) Transient Climate Response (TCR) is the warming due to CO2-equivalent at the point at which the CO2 concentration has doubled, assuming a 1%/yr increase. The central estimate is around 2.0°C. B) Fast-Feedback/Equilibrium Climate Sensitivity (ECS), the most used definition, takes quite a while to realise (hundreds of years after doubling) and includes the diminishing positive feedbacks due to water vapor and sea ice albedo. The central estimate is around 3.0°C. C) Earth System Sensitivity includes the slow feedbacks resulting from large-scale glacial retreat and the long-term ocean responses. It could be as much as twice the ECS and takes thousands of years to realise. -
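A rough illustration of how a fast-feedback sensitivity translates a CO2 change into equilibrium warming via dT = ECS * ln(C/C0) / ln(2); the 3.0 C value is the central estimate quoted above, the concentrations are example round numbers, and this ignores other forcings and the slow feedbacks in (C).

```python
import math

ECS = 3.0               # C per doubling of CO2 (central estimate quoted above)
C0, C = 280.0, 400.0    # pre-industrial and roughly present-day CO2, ppm (examples)

dT = ECS * math.log(C / C0) / math.log(2)
print(f"equilibrium warming for {C0:.0f} -> {C:.0f} ppm CO2: {dT:.1f} C")   # ~1.5 C
```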
scaddenp at 20:05 PM on 25 June 2012 - Hansen and Sato Estimate Climate Sensitivity from Earth's History
Electrical engineers see feedback and try to construct climate in terms of something they know. With multiple non-linear feedbacks operating on different time-scales, it doesn't work well. What you ask would happen - but implicitly - as at each time-step in the model the systems respond to the current temperature. Note also that the AR4 models did not include carbon-cycle feedback (which would be big for ice ages) as far as I know. This is because that feedback is assumed to be too slow to have much effect in the next 100 years. I believe some AR5 models are full Earth-system models with a carbon cycle. (I am echoing comments heard from climate modellers, sorry, which isn't the most reliable source.) -
scaddenp at 19:53 PM on 25 June 2012 - Simply Wrong: Jan-Erik Solheim on Hansen 1988
Not to forget also that there are robust and not-so-robust predictions from climate models. Sensitivity, especially short-term sensitivity, is not so robust, especially in older models. If you want to talk about model/data comparisons, then it is better to talk about model skill (a model's predictive power compared to a naive prediction). "Eventually they must diverge, and we will have to wait to see which line the actual temperatures follow." So at what point would you say that data has changed your mind? -
kampmannpeine at 19:47 PM on 25 June 2012 - Review of new iBook: Going to Extremes
I do not have iTunes because I am working under Linux ... any idea how to download the book? (Wine produces an error message under Debian Lenny Linux.) -
curiousd at 18:27 PM on 25 June 2012 - Hansen and Sato Estimate Climate Sensitivity from Earth's History
O.K. I waded into one of the threads here on runaway greenhouse etc. and found an incredible hornets' nest of advanced electrical engineering analogues. I also found - I think - versions of my argument above, which involves the simplest feedback model, maybe, with a positive feedback that is less than unity. So let me sharpen my question... does the real climate sensitivity include this effect of adding up increasingly smaller positive feedback terms in the water vapor feedback? Do real climate modelers include this effect? -
Paul D at 18:20 PM on 25 June 2012 - It's the sun
test comment -
Glenn Tamblyn at 14:26 PM on 25 June 2012 - Simply Wrong: Jan-Erik Solheim on Hansen 1988
Fred. To talk about 'falsifying' something, to the extent that the concept of falsifiability can be applied, one needs to define what it is one is seeking to falsify. Are we going to totally falsify something if it doesn't behave exactly as predicted? Are we going to say that we may have falsified the extent to which it occurs, rather than whether it occurs at all? Where there is a range of science involved in a 'hypothesis', do we need to falsify all those aspects? Or just one of them? And to what extent? If a theory makes multiple predictions and most of them are validated but a few aren't, does that mean the entire theory is wrong? Or that we don't understand certain aspects of it?

Consider what AGW 'predicts'.
1. Rising GH gases will cause the Earth to go into energy imbalance wrt space.
2. As a consequence we expect heat to accumulate in various parts of the climate system.
3. This will have some distribution over space and time.
4. These components of the system will then interact in ways that may redistribute some of this additional energy.
5. These different components of the system are of quite different magnitudes, so interactions between the system components can have significant impacts on the smaller components compared to the larger components.

So if the smallest component of the system happens not to behave quite as we expect for a moderate period of time, what are we to conclude? Exactly what has been falsified? So what is happening to the climate system?
- The oceans are still absorbing most (90%) of the heat; several Hiroshima bombs' worth per second.
- Ice is still melting, 500 GTonnes/year, which requires more heat - that's about 3% of the extra heat.
- Temperatures within the Earth's surface crust are still rising slightly - that's about 4% of the extra heat.
- Atmospheric temperatures have risen as well (around 3% of the extra heat) but over the last decade have relatively plateaued.
- Over the same period, thanks to the ARGO float array network, we know that warming in the surface layers has plateaued as well because heat is being drawn down to deeper levels.
- Simple thermodynamics says that if the upper layer of the oceans hasn't warmed much, the atmosphere won't warm much either.

So with all this, what might possibly have been falsified (even allowing for the fact that a decade or so still isn't long enough to make that judgement, let alone any statistical arguments)? Have the GH properties of those gases suddenly turned off? Is the Earth no longer in an energy imbalance? No! Heat is still accumulating unchanged. Just that most of it is happening, as it has for the last half century, in the oceans. And the amount of extra heat is too great to have originated from somewhere else on Earth. To have a lack of warming that might be statistically significant after some years to come, first you have to have a lack of warming in the first place. And we don't! 97% of the climate system has continued accumulating heat unabated. And the other 3% accumulated it for much of that period but has slowed recently, for understandable reasons.

So is there even a prospect, from the data we have available to date, that the basic theory of AGW might be falsified? Nope! No evidence for that. Is there a prospect, from the data we have available to date, that the aspects of the theory that tell us how much heat will tend to go into which parts of the system might be falsified? A small one perhaps. Heat is largely going where we expect it to.
What about the possibility that the aspects of the theory that deal with the detailed distribution of heat within different parts of the ocean might be falsified? That although we have a good understanding of the total amount of heat the oceans are likely to absorb, our understanding of its internal distribution in the oceans, spatially and temporally, might not be perfect? Yep, a reasonable prospect of that. Although at least one GCM-based study - Meehl et al 2011 - has reported the very behaviour we are observing. So is that what you mean by the falsification of AGW theory? That our understanding of how flow patterns in the ocean might change isn't 100% reliable? If that is your definition then I agree with you. We can probably already say that the statement that we can model ocean circulation with 100% accuracy has already been falsified. However, the statement that we can model ocean circulation with reasonable accuracy and can model total heat accumulation in the ocean very well has definitely not been falsified; so far there is no prospect of that. And certainly the claim that our account of the underlying causes of the Earth's energy imbalance has any prospect of being falsified soon is simply unsupported by the evidence.

If you want to investigate evidence that might confirm or falsify the core theory of AGW, focus on the total heat content of the ocean. If that levels off then there really is something to talk about. But there has been absolutely no sign of that. If we need to wait x years for that key data to become significant wrt any 'lack of warming', then we are at year zero now. The 'lack of warming' hasn't even started yet. Believe me, if total OHC data showed the sort of pattern we have seen in the merely atmospheric data (the 3%) over the last decade, that really would be BIG NEWS. And we would report it here, believe me! Unfortunately, it just ain't happening! -
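A back-of-the-envelope check of the "Hiroshima bombs per second" figure in the comment above; the energy imbalance and bomb yield used here are assumed round numbers, not measured values from the comment.

```python
# Rough arithmetic behind the "several Hiroshima bombs' worth per second" figure.
imbalance = 0.6                 # global energy imbalance, W/m^2 (assumed)
earth_area = 5.1e14             # Earth's surface area, m^2
bomb_yield = 6.3e13             # energy of the Hiroshima bomb, J (~15 kt TNT)

heating_rate = imbalance * earth_area            # J per second, ~3e14 W
bombs_per_second = heating_rate / bomb_yield     # ~5 bombs/s for the whole system
ocean_share = 0.9 * bombs_per_second             # ~90% of the heat goes into the oceans

print(f"total: {bombs_per_second:.1f} bombs/s, oceans: {ocean_share:.1f} bombs/s")
```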
scaddenp at 13:50 PM on 25 June 2012 - Ten Things I Learned in the Climate Lab
Which means the question can't be answered till PluviAL tells us more. Embedding a WRF model into a GCM is a cool idea for resolving processes. -
Bob Loblaw at 12:36 PM on 25 June 2012 - Ten Things I Learned in the Climate Lab
Actually, PluviAL's question isn't that unreasonable. Do a Google search on "Regional Climate Model" - there has been a lot of work on using a fine-mesh local grid in the region of interest (say, the continental U.S.) embedded in a coarser-grid GCM. -
Bob Lacatena at 09:54 AM on 25 June 2012 - Ten Things I Learned in the Climate Lab
PluviAL, Like scaddenp asks... what is the point of your questions? Climate models often work with lots of grid sizes (different for ocean, land, atmosphere). The world is also three-dimensional. Grid choices are made primarily for execution time (twice as many cells along the grid's width and height means four times as many calculations, four times as many means 16 times the calculations, etc.). To go to a scale of 10 meters instead of 10 km, you have 1,000 times as many cells across, and 1,000,000 times as many cells in total. Climate models don't even run at scales of 10 km... more like hundreds of kilometers, so your scale change would blow things way out of the water. No, you're not likely to find a computer on Earth that will even get things done at the 10 km scale (although that's really not necessary, either). Sometimes what you are looking at doesn't require better resolution, and can even be confounded by it (it requires even more complex and detailed modeling of physical processes to resolve the interactions at the higher resolution -- things that can easily be dispensed with at coarser resolutions). The CMMAP project is particularly interesting. One of the great problems in climate models is cloud behavior, because the scales needed to properly model clouds are far too small to be simulated efficiently, and most climate attributes do not need that small a scale. Their approach is to model the climate on an achievable scale, and to model clouds for just one small grid cell within each larger cell, and then to apply that result throughout the cell (effectively assuming that their single result will apply, on average, throughout the larger cell). -
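A quick sketch of the cell-count arithmetic in the comment above; the 10 km and 10 m figures come from the exchange, and this only counts cells per horizontal layer, before the extra vertical levels and shorter time steps a finer grid would also need.

```python
# Scaling of cell counts when refining a horizontal grid from 10 km to 10 m.
coarse = 10_000.0   # metres: a 10 km grid cell (already finer than most GCMs)
fine = 10.0         # metres: the 10 m cell suggested in the question

per_axis = coarse / fine          # 1,000x more cells along each horizontal axis
per_layer = per_axis ** 2         # 1,000,000x more cells in each 2-D layer

print(f"{per_axis:,.0f}x more cells per axis, {per_layer:,.0f}x more per layer")
```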
piloot at 09:23 AM on 25 June 2012 - Review of new iBook: Going to Extremes
I can't find it in the Dutch iBook store? -
peter prewett at 08:37 AM on 25 June 2012 - Review of new iBook: Going to Extremes
It may be $3.99 as per the download link - then again, I do live in Australia, which is costly!! -
scaddenp at 08:20 AM on 25 June 2012 - Ten Things I Learned in the Climate Lab
pluvial - what do you hope this will achieve? I don't work on climate code, but for the models I work with the immediate issues would be:
1/ Halve the resolution and you double the processing time. This limits what we can do with our models. Resolution has improved as the number of CPUs goes up and they get faster.
2/ The numerical method being used may have issues. It's of little help to understanding reality if fine scales simply amplify rounding error in copying boundary conditions to nodes.
3/ Parameterization is often done to account for subscale processes. Increasing the resolution should theoretically allow for eliminating this, but only by actually directly modelling those processes. Doing that is likely to be at least as complex as the large-scale models and so massively increases CPU time requirements. If you don't model these processes, then you potentially lose the reason for the smaller scale in the first place. -
shoyemore at 07:23 AM on 25 June 2012 - Adding wind power saves CO2
Paul D #37, I noted the data was very sparse on the right, relative to the left. The loess is probably inappropriate to show the trend, and an estimator should be used with error bars. The error bars could be so large as to render the fit after 600 MW or so meaningless. It is also not clear what is meant by "instantaneous CO2 emissions". PS I chose 1000 MW arbitrarily, could as easily have said 600 MW. PPS Ireland has about 2000 MW of installed wind power capacity. -
Paul D at 06:01 AM on 25 June 2012 - Adding wind power saves CO2
Shoyemore@36 Why on Earth would 1000 MW be special?? That's about two average-sized power stations in the UK (or anywhere). It should be noted that the population of Ireland is about a tenth that of the UK. Also, on that article a number of commenters have pointed out the cherry-picked nature of the data, and a reference is made to Dr Udo. It seems that research has been thoroughly rebutted. The commenter from AWEA points out that a cold period seems to have been chosen, which boosted fuel burnt for heating. Also found this. http://coloradoenergynews.com/2011/09/fact-check-fred-udos-bogus-numbers-on-wind-and-emissions-savings/
Moderator Response: TC: Made link live. -
dana1981 at 04:20 AM on 25 June 2012 - Simply Wrong: Jan-Erik Solheim on Hansen 1988
Fred, short and simple, you're focusing on short-term noise and missing the long-term signal. If you want to deny that the planet is still warming, please go to one of the appropriate threads, like this one. Bottom line is that there is a warming trend, and that trend corresponds to ~3°C sensitivity. People really need to get over Scenario C. Scenario C didn't happen - it's a moot point, a red herring, a distraction. Just pretend it's not there. Kevin C is also going to have a very interesting post on surface temperatures in a couple of weeks which will put another nail in the 'no warming' myth. -
Bob Lacatena at 02:40 AM on 25 June 2012 - Simply Wrong: Jan-Erik Solheim on Hansen 1988
Fred,
"We have three independent sources all showing flat temperature trends..."
False. Foster and Rahmstorf (2012); Another view.
"To quote Gavin: 'The basic issue is that for short time scales (in this case 1979-2000), grid point temperature trends are not a strong function of the forcings...'"
You appear to have completely misunderstood what he was saying. The quote is about "grid points" -- spatial temperature trends, applied locally. Your follow-on assessment that this somehow means "CO2 emissions had little to do with 20th Century temperature increases" is utterly wrong.
"His 'projections' test the science behind the models (CO2 et al forcing), not the statistics. They were intended to contrast what would happen if CO2 emissions continued to increase after year 2000, and what would happen if they did not."
Wrong. This is a strawman that you have constructed so you can choose to interpret things as you wish. This has been explained to you multiple times, and you keep trying to find ways to ignore the facts. The simple truth is that 24 years ago Hansen used a simple climate model and three specific scenarios out of countless possibilities to make a series of projections. His model was not as good as those today, his climate sensitivity was too high, none of his scenarios came to pass (and none is truly close), and a number of confounding factors in the past decade have all combined to cause current events to fail to match any of those predictions. This is the simple truth. So what, exactly, is the point that you are trying to make out of all of this? -
Fred Staples at 02:02 AM on 25 June 2012 - Simply Wrong: Jan-Erik Solheim on Hansen 1988
Most of these posts miss the point of the Hansen models. His "projections" test the science behind the models (CO2 et al forcing), not the statistics. They were intended to contrast what would happen if CO2 emissions continued to increase after year 2000, and what would happen if they did not. It does not matter if Hansen's sensitivities were accurate, as long as they were non-zero. Why not? Because the CO2 concentration in the C line after year 2000 is assumed to be constant. It is a baseline against which the impact of the actual CO2 can be assessed. I would plead with everyone who wishes to debate this to look at the Hansen chart at the Real Climate update or post 48 here. Up to 2006 the B line, the C line, and the actuals moved together. Gavin Schmidt could reasonably claim, as he did, that the science behind the model reflected reality. (By that time the exponential A line had been disowned and "we are moving up the B line".) Most of the 20th century increase occurred from 1985 to 2006, which is why the overall trend lines (B, C and actuals) are similar, post 37. CO2 concentrations increased from 350 to 380 ppm over the period, and to just short of 400 ppm from 2006 to date. But after 2006 the B and C lines diverged sharply, and they must continue to do so. The actuals, on all measurements, followed the C line, as Dana points out. As of today, 24th June 2012, on any measurement, the global average temperature is more than 0.5 degrees below the Hansen projection. Now to explain that discrepancy we can invoke measurement error, model error, or short-run statistics. We have three independent sources all showing flat temperature trends following the C line since year 2000 - satellite, radiosondes, and land stations. Random fluctuation in the real global temperature is a possible explanation. Purely by chance, we might be observing a run of increasingly depressed temperatures. To quote Gavin: "The basic issue is that for short time scales (in this case 1979-2000), grid point temperature trends are not a strong function of the forcings - rather they are a function of the (unique realisation of) internal variability and are thus strongly stochastic." (Incidentally, if true, this means that CO2 emissions had little to do with the 20th Century temperature increases.) Finally, of course, there are countervailing forces - aerosols, La Nina preponderance, "heat" transfer to the deep oceans, reductions in other greenhouse gases. All these explanations of the Hansen error must eventually reverse, and it is possible that the actual trend will move sharply up to the B line, and stay there. But while we wait the gap (B to C) will grow. And it is pointless to lower the temperature sensitivity to force B into line with C. Eventually they must diverge, and we will have to wait to see which line the actual temperatures follow. In the meantime surely we have to accept that the Hansen models offer no corroboration whatsoever for the CO2 forcing theory of AGW. If temperatures remain flat, eventually falsification will loom. -
shoyemore at 01:19 AM on 25 June 2012 - Adding wind power saves CO2
MarkR, Tom, & Anyone Else: Could you take a look at this chart for me? Information is here: Eirgrid Ireland "instantaneous" CO2 emissions & loess fit. The author (who is not a climate science denier or possessed of hangups about renewables) suggests this shows CO2 emissions increase with wind power capacity above 1000 MW. I have my own idea about what is wrong, but would like to hear some other opinions. Thanks. -
les at 23:55 PM on 24 June 2012 - Review of new iBook: Going to Extremes
Ahhh - I tried from iPhone. -
Jim Powell at 23:34 PM on 24 June 2012 - Review of new iBook: Going to Extremes
Les: I show it for sale in the UK store. Not sure what the problem is. Others: I have added the missing hyphen and updated the iBook. Try that in a print book! As a reminder, because of the software used to accomplish the interactivity, these Apple iBooks only work on an iPad. I hope that will change and believe it will. -
Tom Curtis at 22:53 PM on 24 June 2012 - Simply Wrong: Jan-Erik Solheim on Hansen 1988
dana @56, the important paragraph in the Google-translated version at WUWT reads: "The CO 2 emissions since 2000 to about 2.5 percent per year has increased, so that we would expect according to the Hansen paper a temperature rise, which should be stronger than in model A. Figure 1 shows the three Hansen scenarios and the real measured global temperature curve are shown. The protruding beyond Scenario A arrow represents the temperature value that the Hansen team would have predicted on the basis of a CO 2 increase of 2.5%. Be increased according to the Hansen's forecast, the temperature would have compared to the same level in the 1970s by 1.5 ° C. In truth, however, the temperature has increased by only 0.6 ° C."
(My emphasis.) The 1.5 C anomaly compared to the 1960-70s mean compares well with the size of the arrow. Hence I take this to be Solheim's real "prediction". The 1.9 C increase mentioned in the caption to Solheim's figure makes little sense. It may refer to the increase in Solheim's projection out to 2050 relative to temperatures assumed not to increase further, or perhaps even to decrease. As a comparison between even Solheim's inflated projections and observations, it is not true over any period up to and including 2012. -
PluviAL at 19:10 PM on 24 June 2012 - Ten Things I Learned in the Climate Lab
How hard is it to set up a model with two grid patterns, one with inputs on the grid scale available from commonly available data, and one on a very fine scale, say in grid cells of 10 meters instead of tens of kilometers? Is that doable? What kind of head count would you need? How big a system? How long to build a working model? How long to test it and make it reliable? -
chuck101 at 18:20 PM on 24 June 2012 - Greenhouse gases are responsible for warming, not the sun
Hmmm, it's difficult to stay away from political comments when Heartland is, at heart (sorry), a political institution; set up by big business and big oil to look after their interests, and with the stated aim of casting doubt in the minds of the public about AGW. In order to do this, they hire scientists willing to plead their cause. Sadly, since the science is so rock solid, the only way they can do this is by twisting the science itself, by a combination of 'errors', 'mistakes', misrepresentation, and in some cases, outright falsification. These aren't accidental mistakes or errors. They are deliberate, so there is no way these guys are going to retract. They would lose their funding from Heartland. Hence the same scientists keep making the same sorts of 'errors'. They get called on it, refuse to admit it, then get called on it again in some other context, refuse to admit it, etc., and so the merry 'Gish Gallop' continues. For example Bob Carter, the serial cherry picker, gets called on it time after time, but still we get 'GW stopped in 1998', or 'slight decline since 2002', or 'no warming since (insert future date here)'. Still he continues, because he is not out to persuade fellow scientists, he is out to bamboozle the general public, who are, in the main, scientifically challenged and do not even see the various cunning tricks that have been played on them. So while these 'scientists' need to be held to account scientifically (and you guys are doing a great job), the actual battle is in the political arena. The scientific battle was won long ago. -
BaerbelW at 18:15 PM on 24 June 2012 - Review of new iBook: Going to Extremes
Thanks D_C_S for the heads-up! Re-reading the statement, I realised that the "14" is actually the number of "billion-dollar" weather disasters happening in the U.S. in 2011 and not the amount they incurred. I have updated the post accordingly. -
curiousd at 16:51 PM on 24 June 2012 - Hansen and Sato Estimate Climate Sensitivity from Earth's History
Hi, newbie physics guy (non-climate guy) interested in teaching this stuff here, pointing out something that might possibly relate the 1981 Hansen et al. Science paper to this recent determination of sensitivity by fitting to past data. In the 1981 paper, CO2 alone gives 1.2 degrees with no feedbacks (Model 1). By holding relative humidity constant (water vapor feedback), sensitivity increases to 1.9 C (Hansen Model 2). So 1.9 - 1.2 is 0.7 degrees, which is 0.58 times the 1.2. Then if you just take these two effects alone, why don't you get the real sensitivity = 1.2 (1 + .58 + .58^2 + etc.), which, this being a geometric series, gives 1.2 / (1-.58) = 2.9 degrees C? Which is about the same as the fitted climate sensitivity from Hansen and Sato? Or is this non-runaway water vapor thing already taken into account by the method of Hansen and Sato? Also, does not this way of looking at it mean that it is a darned good thing that the "water vapor constant relative humidity assumption" gives 0.7 extra degrees, instead of twice that, which would be more than the original 1.2 degrees? 1.2 / (1-.. -
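A sketch of the geometric-series argument in the comment above, using only the two numbers quoted from the 1981 paper; everything else follows from them.

```python
# Geometric-series feedback sum from the 1981 figures quoted above.
dT0 = 1.2     # no-feedback warming from doubled CO2, C (Model 1)
dT1 = 1.9     # warming with the constant-relative-humidity feedback, C (Model 2)

f = (dT1 - dT0) / dT0                                   # feedback fraction, ~0.58
partial = [dT0 * sum(f ** k for k in range(n + 1)) for n in range(6)]
limit = dT0 / (1 - f)                                   # closed-form series limit, ~2.9 C

print([round(x, 2) for x in partial])   # 1.2, 1.9, 2.31, ... successive terms shrink
print(f"series limit: {limit:.1f} C")   # finite only because f < 1 (no runaway)
```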
dana1981 at 14:19 PM on 24 June 2012 - Simply Wrong: Jan-Erik Solheim on Hansen 1988
Tom - Solheim's figure was entirely unclear to me. I couldn't tell if his arrow was intended to stop at the supposed actual forcing-corresponding temperature change, or if it was just pointing in that direction. Given that he said the model overestimates the temperature by 1.9°C, I just couldn't figure out what he was trying to show in that figure. The arrow didn't indicate a 1.9°C discrepancy unless it was simply pointing upwards toward a much higher temperature. Likewise the arrow in my Figure 1 is simply intended to point upwards in a non-specific manner. Regardless, Solheim royally screwed up, the only question is exactly how royally. If you're correct that he applied a 2.5% CO2 growth rate starting in 1951, well, that would indeed be a very royal screw-up. -
D_C_S at 11:01 AM on 24 June 2012 - Review of new iBook: Going to Extremes
Make that $55 billion. -
D_C_S at 10:27 AM on 24 June 2012 - Review of new iBook: Going to Extremes
The article states that the US experienced $14 billion in weather disasters in 2011. There were 14 billion-dollar-plus weather-related disasters in the US in 2011, for a total cost of $54 billion. See the January 26 post here: http://www.wunderground.com/blog/JeffMasters/archive.html?year=2012&month=01 -
Tom Curtis at 09:59 AM on 24 June 2012 - Simply Wrong: Jan-Erik Solheim on Hansen 1988
dana, by pixel count, the temperature anomaly that Solheim claims would have been predicted with actual forcings is 18% higher than that for scenario A. In contrast, you only depict him as showing a forcing 9% greater than that in scenario A. Curiously, if you apply a 2.5% increment to the growth of CO2 from 1951 in a manner similar to that described in Appendix B of Hansen et al 88, the result is an 18% greater increase of CO2 concentration from 1981 to 2011 than is shown in Scenario A. This strongly suggests that Solheim has not only applied the 2001-2011 growth rate of CO2 over the entire period of the projection, despite the well known fact that growth in CO2 concentrations was much reduced in the 1990s, but that he has also treated CO2 as the only forcing. It is not certain that that is what he has done, but it is the simplest explanation of his error. As it happens, in assuming that he took a more reasonable approach, you appear to have underestimated his error. -
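A toy illustration of what compounding the annual CO2 increment by 2.5% per year from 1951 does to the concentration path; the 1951 baseline and starting increment below are assumed round numbers, not Hansen's actual Appendix B values, so this only shows the direction of the effect described above.

```python
# Compound a hypothetical annual CO2 increment by 2.5%/yr from 1951 onward.
def co2_path(start_year, end_year, c0=311.0, inc0=0.7, growth=0.025):
    """c0 (ppm) and inc0 (ppm/yr) are assumed starting values, not Hansen's."""
    conc, inc = c0, inc0
    series = {start_year: conc}
    for year in range(start_year + 1, end_year + 1):
        conc += inc
        inc *= 1 + growth        # the yearly increment itself grows 2.5%/yr
        series[year] = conc
    return series

path = co2_path(1951, 2011)
print(round(path[1981], 1), round(path[2011], 1))
# Compounding from 1951 overstates the 1981-2011 rise, because real-world
# growth in CO2 concentration slowed during the 1990s.
```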
Steve L at 07:47 AM on 24 June 2012 - Arctic sea ice takes a first nosedive
The solstice occurred on the 20th (where I am) this year, but in some years it occurs on the 21st or 22nd. So could comparing to the same day of year be adding noise? Should the comparison be made instead to days relative to the solstice? -
les at 06:20 AM on 24 June 2012 - Review of new iBook: Going to Extremes
Doesn't seem to be available from the UK iTunes book shop. -
dana1981 at 05:56 AM on 24 June 2012 - Christy Exaggerates the Model-Data Discrepancy
Bernard @15 - wow, assuming population density is uniform over entire countries? It doesn't get much sloppier than that - unless you consider McKitrick's previous work, I suppose. Mosher suspects that doing the analysis properly will strengthen McKitrick's conclusions. Let's just say I'm skeptical, but until he does it properly, his conclusions are wholly unsupported, and certainly don't deserve consideration by the IPCC.