
Andy Skuce

Andy Skuce unfortunately passed away on September 14, 2017 after a years-long battle against cancer, which he wrote about in his final blog post, published a couple of weeks earlier. Andy was an independent geoscience consultant based in British Columbia, Canada. He earned an MSc in Applied Geophysics from the University of Leeds and a BSc in Geology from the University of Sheffield, and was registered as a Professional Geoscientist in British Columbia. Andy worked for the British Geological Survey and in a variety of technical and managerial roles for oil companies in Canada, Austria and Ecuador. Over the years, Andy published a handful of papers in tectonics and structural geology that can be viewed here. He described how his views on climate change evolved in this blog post.

The Skeptical Science community is thankful for all the contributions Andy made to our efforts over the years and we bid him a heartfelt farewell here.

 

Publications

Google Scholar profile.

 

Recent blog posts


Exit, Pursued by a Crab

Posted on 29 August 2017 by Andy Skuce

This is a re-post from Critical Angle

Participating in social media creates a wide and diverse network of acquaintances. Often, these people become “friends”, even though direct personal contact may never be made with them. It can be hard to establish traditional friendships without face-to-face encounters. Before the Internet, reading body language, voice inflections and facial expressions was as big a part of communication as speech itself. For many of us who spend a disproportionate amount of time in front of screens, much of our communication has become disembodied. But we still have bodies and, unfortunately, bodies break down.

I never wanted to write this post, but I feel that I owe it to the people I have come to know as online friends. They deserve to know that I’m suffering from a fatal illness. However, I hate the idea of now being treated differently because of this disclosure. I am not fishing for compliments or looking for moral support.


In 2002, at age 48, I was diagnosed with aggressive prostate cancer. I had a prostatectomy, but, despite the entire removal of the gland, there were small amounts of metastatic disease detected in nearby lymph nodes. The cancer had not been cured. Progression of the disease was slowed for many years by intermittent hormone treatment. I experienced no physical symptoms of the disease for twelve years, although the consequences of surgery and hormone treatment were no fun. But life continued and it was good.

As Hemingway remarked about going bankrupt, my cancer progressed gradually at first and then suddenly. About two years ago, my body’s plumbing and scaffolding started to show signs of trouble. More aggressive hormonal drugs were prescribed, which brought me back to good health for a year. Then, as the effectiveness of those drugs failed, chemotherapy beat back the worst symptoms for most of another year. Chemotherapy side-effects can often be managed quite well these days and it is not the horror that many imagine.

You become aware that the treatment options are running out when the oncologists start talking about maximizing quality, rather than quantity, of life. That’s where I am now. My life expectancy has been reduced from years to months. There still may be a few tricks left in my doctors’ books that may help extend my life beyond current expectations, but they are long shots and may not be available.

Read more...

20 comments


New publication: Does it matter if the consensus on anthropogenic global warming is 97% or 99.99%?

Posted on 3 May 2017 by Andy Skuce

The 97% consensus on anthropogenic global warming (AGW) reported by Cook et al. (2013) (C13) is a robust estimate. Alternative methods, such as James Powell's, that identify only explicit rejections of AGW and assume that all other instances are endorsements, miss many implicit rejections and overestimate the consensus.

The C13 method can be modified and used to estimate the consensus on plate tectonics in a sample of the recent geological literature. This limited study produces, not surprisingly to anyone familiar with the field, a consensus estimate of 100%. However, among the abstracts that expressed an opinion, only implicit endorsements of the theory were found: this underlines the importance in consensus studies of identifying implicit statements. The majority of the abstracts expressed no position. Assuming that these cases suggest either uncertainty about, or rejection of, plate tectonics (as some critics of C13 have claimed about AGW) would lead to the absurd conclusion that the theory is not widely accepted among geologists.

Opinion surveys show that the public is misinformed about the true state of consensus among climate scientists, with only a minority aware that it is greater than 90%. The difference between 97% and 99.99% is tiny compared to this gap. The 3% of published papers that reject AGW are contradictory and have invariably been debunked. They therefore do not provide a coherent alternative account of climate science. Despite this, dissenters are awarded disproportionate attention by ideologically and commercially motivated interests, as well as by false balance in the media. Countering this public confusion requires both communicating the consensus and debunking bad science.

For a shorter summary than the one below, read Dana Nuccitelli's Guardian piece: Is the climate consensus 97%, 99.9%, or is plate tectonics a hoax?

In 2016, James Powell published an article, Climate Scientists Virtually Unanimous: Anthropogenic Global Warming Is True, in the Bulletin of Science, Technology & Society. The paper was critical of the 2013 paper by John Cook and members of the Skeptical Science team (C13): Quantifying the consensus on anthropogenic global warming in the scientific literature. Powell’s article is pay-walled, but readers can read the details of his argument on his blog and in a Skeptical Inquirer piece from 2015. There is also a free version of the BST&S article hosted here, found through the Unpaywall Browser Extension.

We have previously responded to Powell’s arguments. John Cook published an article for Skeptical Inquirer and I wrote a piece on my blog Critical Angle. Yesterday, a team of eight co-authors (Andy Skuce, John Cook, Mark Richardson, Bärbel Winkler, Ken Rice, Sarah Green, Peter Jacobs and Dana Nuccitelli) published a more formal, peer-reviewed rebuttal: Does it matter if the consensus on anthropogenic global warming is 97% or 99.99%? in the Bulletin of Science, Technology & Society. The article is also paywalled, but the publisher allows us to post a version of the revised submitted manuscript, which can be read here.

In his BST&S paper, Powell makes four main arguments against C13:

  1. Consensus studies by Doran & Zimmerman (2009), the Pew Research Center (2015) and Anderegg et al. (2010) fail to reveal a near-unanimous consensus on AGW “…because of some combination of small sample size, reliance on fallible opinion, and inclusion of nonexperts…”
  2. The C13 study of 11,944 articles in the peer-reviewed literature from 1991-2011 failed to adequately measure the consensus because it ignored the abstracts and papers that did not express an opinion on AGW.
  3. Applying the C13 methodology to a universally accepted theory like plate tectonics would yield misleading or absurd results.
  4. If it were true that 3% of scientists rejected or were uncertain about AGW, then the case for action to prevent global warming would be weakened, since, by comparison with now-settled scientific questions like continental drift and the origin of lunar craters, the existence of dissenters might suggest that AGW theory was about to be overthrown.

We respond in detail to these objections in the paper. I will summarize the main points below. Complete references are provided in the paper.

Read more...

7 comments


Justin Trudeau approves two big oil sands pipeline expansions

Posted on 1 December 2016 by Andy Skuce

In an announcement on November 29, 2016, Canadian Prime Minister Justin Trudeau approved two new major pipeline expansions for Canadian bitumen. Altogether, the two projects will add over a million barrels per day to Canada's export capacity.

At the same press conference, Trudeau rejected the application for the Northern Gateway pipeline, which would have provided 525,000 barrels per day of transportation from Alberta to the Pacific Ocean through the northern British Columbia coast, near Kitimat. 

Northern Gateway (map by Enbridge)

The proposed export route would have involved tanker transport through fjords and treacherous seas in an area of protected wilderness known as the Great Bear Rainforest. Trudeau promised a legislated ban on all oil tankers on the BC Coast north of Vancouver Island. The Northern Gateway project was fiercely resisted by First Nations.

Kinder Morgan Trans Mountain Expansion Project (TMX)

The Trans Mountain Expansion project involves the twinning and expansion of an existing pipeline that runs from Edmonton, through Jasper National Park, to the Pacific coast at Vancouver.

The project currently has a capacity of 300,000 barrels per day and will be expanded to have a total capacity of 890,000 barrels per day. Around 400 Aframax tankers per year will transport diluted bitumen from the Westridge Marine Terminal, through Vancouver's Burrard Inlet, then down narrow passages, with strong tidal currents, between the Gulf Islands, and finally through the busy shipping lane of the Strait of Juan de Fuca to the open ocean and markets around the Pacific. The project should be completed in 2019.

Chart by Doug Bostrom. Green line shows outbound tanker course, red overlay shows areas with strong tidal currents, in knots. High resolution PDF (big file)

Read more...

24 comments


Sensitivity training

Posted on 30 September 2016 by Andy Skuce

This article was originally published online at Corporate Knights Magazine and will appear in the publication's Fall 2016 hard-copy magazine.

 

Climate scientists are certain that human-caused emissions have increased carbon dioxide in the atmosphere by 44 per cent since the Industrial Revolution. Very few of them dispute that this has already caused average global temperatures to rise roughly 1 degree. Accompanying the warming is disruption to weather patterns, rising sea levels and increased ocean acidity. There is no doubt that further emissions will only make matters worse, possibly much worse. In a nutshell, that is the settled science on human-caused climate change.

What scientists cannot yet pin down is exactly how much warming we will get in the future. They do not know with precision how much a given quantity of emissions will lead to increased concentrations of greenhouse gases in the atmosphere. For climate impact it is the concentrations that matter, not the emissions. Up until now, 29 per cent of human emissions of carbon dioxide has been taken up by the oceans, 28 per cent has been absorbed by plant growth on land, and the remaining 43 per cent has accumulated in the atmosphere. Humans have increased carbon dioxide concentrations in the atmosphere from a pre-industrial level of 280 parts per million to over 400 today, a level not seen for millions of years.

There’s a possibility that the 43 per cent atmospheric fraction may increase as ocean and terrestrial carbon sinks start to become saturated. This means that a given amount of emissions will lead to a bigger increase in concentrations than we saw before. In addition, the warming climate may well provoke increased emissions from non-fossil fuel sources. For example, as permafrost thaws, the long-frozen organic matter contained within it rots and oxidizes, giving off greenhouse gases. Nature has given us a major helping hand, so far, by the oceans and plants taking up more than half of our added fossil carbon, but there’s no guarantee that it will continue to be so supportive forever. These so-called carbon-cycle feedbacks will play a big role in determining how our climate future will unfold, but they are not the largest unknown. 

Feedbacks

Atmospheric physicists have long tried to pin down a number to express what they refer to as climate sensitivity, the amount of warming we will get from a certain increase in concentration of greenhouse gases. Usually, this is expressed as the average global warming, measured in degrees Celsius, that results from a doubling of carbon dioxide concentrations. The problem is not so much being able to calculate how much warming the doubling of carbon dioxide alone will cause – that is relatively easy to estimate and is about 1 degree C. The big challenge is in figuring out the range of sizes of the feedbacks. These are the phenomena that arise from warming temperatures and that amplify or dampen the direct effects of the greenhouse gases that humans have added to the atmosphere.

The biggest feedback is water vapour, which is actually the most important single greenhouse gas in the atmosphere. Warm air holds more water vapour. As carbon dioxide increases and the air warms, there is plenty of water on land and in the sea available to evaporate. The increased amount of vapour in the air, in turn, provokes more warming and increased evaporation. If temperatures go down, the water vapour condenses and precipitates out of the atmosphere as rain and snow. Water vapour goes quickly into and out of the air as temperatures rise and fall, but the level of carbon dioxide stays around for centuries, which is why water vapour is considered a feedback and not a forcing agent. Roughly speaking, the water vapour feedback increases the sensitivity of carbon dioxide alone from 1 to 2 degrees C.
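The article gives only the rounded endpoints (about 1 °C without feedbacks, about 2 °C with water vapour included), but the standard feedback-factor relation makes the arithmetic explicit. Below is a minimal sketch that assumes the textbook form ΔT = ΔT0 / (1 − f) and a net feedback fraction of 0.5; both are assumptions, not figures stated in the text.

```python
# Illustrative only: the standard feedback relation dT = dT0 / (1 - f),
# with an assumed net feedback fraction f = 0.5.
def warming_with_feedback(dT0, f):
    """Warming after feedbacks, given the no-feedback warming dT0 (deg C)
    and a net feedback fraction 0 <= f < 1."""
    return dT0 / (1.0 - f)

print(warming_with_feedback(1.0, 0.5))  # 2.0 deg C: roughly the water-vapour-only case
```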

Read more...

6 comments


The Madhouse Effect, a review

Posted on 9 September 2016 by Andy Skuce

This is a re-post from Critical Angle

Climate scientist Michael Mann has teamed up with cartoonist Tom Toles to write The Madhouse Effect: How Climate Change Is Threatening Our Planet, Destroying Our Politics and Driving Us Crazy. It’s an excellent book, well-written, authoritative on the science, revealing on the politics and laced with the wit of many superb cartoons. Buy a copy for the climate science doubter in your family. They will be drawn in by the cartoons and may well be unable to resist dipping into the text.


Michael Mann has previously written The Hockey Stick and the Climate Wars: Dispatches from the Front Lines, about how he was hounded for writing a paper that featured the striking hockey-stick graph. He also authored Dire Predictions: Understanding Climate Change with scientist Lee Kump. At the same time that he turns out first-class books, Mann is a prolific research scientist and has an active presence on social media. You can only wonder how he does it all.

Tom Toles is a Pulitzer Prize-winning cartoonist who works for the Washington Post. His main focus is politics, but his cartoons have often featured climate science and the absurd lengths to which many American politicians go to avoid facing up to the reality of global change.

Writing about scientific subjects like climate change for the non-specialist is not easy and authors have to walk a fine line. Many readers expect scientists to be detached about the implications of their work, but that would make their message less engaging, less human. The science needs to be explained in ways that the average person can understand, but oversimplification can gloss over some of the important complications. And treatments of the topic can so easily be depressing and dull. The Mann/Toles team have succeeded in bringing their talents together to overcome these problems. The writing is excellent and the cartoons add a much-needed satirical perspective.


Read more...

7 comments


Consensus on consensus

Posted on 5 May 2016 by Andy Skuce

Originally published in Corporate Knights Magazine

In 1998 and 1999, American scientists Michael Mann, Raymond Bradley and Malcolm Hughes published two papers that reconstructed the average temperatures of the northern hemisphere back to the year 1000. The articles showed a temperature profile that gently declined from 1000 to 1850, fluctuating a little along the way, with a sudden increase in the late nineteenth and the twentieth centuries. The graph was nicknamed “the Hockey Stick”, with its long, relatively straight handle showing the stable pre-industrial climate and the blade representing the sudden uptick in the last 150 years.

The diagram was a striking depiction of the abrupt warming that had occurred since the Industrial Revolution compared to what happened before. For those opposed to the scientific consensus on Anthropogenic Global Warming (AGW), the Hockey Stick posed a threat and had to be broken.

As detailed in Mann’s 2013 book The Hockey Stick and the Climate Wars: Dispatches from the Front Lines, his critics employed a variety of tactics to try to break the hockey stick. They disputed the statistical methods that Mann and his colleagues used, although they never produced new results of their own. Stolen private conversations were quote-mined for damning phrases. Senior US politicians and the right-wing press denounced the work as a fraud.

Mann and other scientists were subjected to numerous investigations, all of which exonerated the Hockey Stick authors. Most importantly, other researchers, using alternative methods and new data, produced additional temperature curves that closely matched the original results of Mann et al. Nevertheless, the attacks on the original Hockey Stick continued, as has the harassment of Mann by right-wing pundits. If you need to deny the consensus on AGW, you have to keep repeating that the “Hockey Stick is Broken”. Never mind that it is intact and that there are enough new sticks to equip an NHL team.

There are parallels with the reception given to the paper Quantifying the consensus on anthropogenic global warming in the scientific literature, published in 2013 by University of Queensland researcher John Cook and eight volunteers associated with the website Skeptical Science (including me). The paper, published in the journal Environmental Research Letters (ERL), has been deemed a hoax and a fraud by contrarian bloggers as well as by Republican presidential hopefuls such as Ted Cruz and Rick Santorum.

Read more...

5 comments


James Powell is wrong about the 99.99% AGW consensus

Posted on 12 April 2016 by Andy Skuce

This is reposted from Critical Angle with slight modifications and updates.

In a recent article in Skeptical Inquirer, geologist and writer James Lawrence Powell claims that there is a 99.99% scientific consensus on Anthropogenic Global Warming (AGW). You might think that, after all of the harsh criticism the 2013 Cook et al. paper (C13) has received from climate contrarians, we would be pleased to embrace the results of a critique that claims we were far too conservative in assessing the consensus. While it certainly does make a nice change from the usual rants and overblown methodological nit-picks from the contrarians, Powell is wrong to claim such a very high degree of agreement.

He makes many of the same errors that contrarian critics make: ignoring the papers self-rated by the original authors, and making unwarranted assumptions about what the “no-position” abstracts and papers mean.

Powell’s methodology was to search the Web of Science to review abstracts from 2013 and 2014. He added the search term “climate change” to the terms “global climate change” and “global warming” that were used by C13. He examined 24,210 papers co-authored by 69,406 scientists and found only five papers written by four authors that explicitly reject AGW. Assuming the rest of the abstracts endorsed AGW, this gives consensus figures of 99.98% (by abstract) and 99.99% (by author).

His definition of explicit rejection would align roughly with the seventh level of endorsement used in C13: “Explicitly states that humans are causing less than half of global warming”. In the abstracts from 1991-2011, C13 found 9 out of 11,914 that fit level 7, which using Powell’s consensus calculation assumptions, would yield 99.92%. So, there is probably not much difference between the two approaches when it comes to identifying an outright rejection paper. It’s what you assume the other abstracts say—or do not say—that is the problem.
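The arithmetic behind these percentages is trivial; here is a minimal sketch using only the counts quoted above (the two-decimal rounding is an assumption):

```python
# Consensus share under Powell's assumption that every abstract or author
# not explicitly rejecting AGW counts as an endorsement.
def consensus_pct(total, rejections):
    return 100.0 * (1 - rejections / total)

print(f"Powell, by abstract: {consensus_pct(24210, 5):.2f}%")   # ~99.98%
print(f"Powell, by author:   {consensus_pct(69406, 4):.2f}%")   # ~99.99%
print(f"C13, level 7 only:   {consensus_pct(11914, 9):.2f}%")   # ~99.92%
```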

C13 also counted as “reject AGW” abstracts that: “Implies humans have had a minimal impact on global warming without saying so explicitly, e.g., proposing a natural mechanism is the main cause of global warming”. These are more numerous than the explicit rejections and include papers by scientists who consider that natural causes are more important than human causes in recent warming, but who do not outright reject some small human contribution.


Competing Climate Consensus Pacmen. Cook on the left, Powell on the right.

Read more...

10 comments


Temperature tantrums on the campaign trail

Posted on 24 March 2016 by Andy Skuce

Originally published at Corporate Knights on March 17, 2016.

Sorry Ted Cruz. There's no conspiracy among scientists to exaggerate global warming by fudging the numbers.

Last year was the warmest year recorded since the measurement of global surface temperatures began in the nineteenth century. The second-warmest year ever was 2014. Moreover, because of the persisting effects of the equatorial Pacific Ocean phenomenon known as El Niño, many experts are predicting that 2016 could set a new annual record. January and February have already set new monthly records, with February half a degree Celsius warmer than any February in history.

This news is deeply unsettling for those who care about the future of the planet. But it is even more upsetting for people opposed to climate mitigation, since it refutes their favourite talking point – that global warming has stalled in recent years.

U.S. Congressman Lamar Smith claims there has been a conspiracy among scientists to fudge the surface temperature records upwards and has demanded, by subpoena, to have scientists’ emails released.

Senator and presidential candidate Ted Cruz recently organized a Senate hearing on the temperature record in which he called upon carefully selected witnesses to testify that calculations of temperature made by satellite observations of the upper atmosphere are superior to measurements made by thermometers at the Earth’s surface.

It’s easy to cherry-pick data in order to bamboozle people. The process of making consistent temperature records from surface measurements and satellite observations is complicated and is easy to misrepresent.

But the fact remains that there are no conspiracies afoot. Here’s why.

Read more...

12 comments


The Quest for CCS

Posted on 13 January 2016 by Andy Skuce

This article was originally published online at Corporate Knights and will appear in the hard copy Winter 2016 Edition of the Corporate Knights Magazine, which is to be included  as a supplement to the Globe and Mail and Washington Post later in January 2016. The photograph used in the original was changed for copyright reasons.

Human civilization developed over a period of 10,000 years during which global average surface temperatures remained remarkably stable, hovering within one degree Celsius of where they are today.

If we are to keep future temperatures from getting far outside that range, humanity will be forced to reduce fossil fuel emissions to zero by 2050. Halving our emissions is not good enough: we need to get down to zero to stay under the 2 C target that scientists and policy makers have identified as the limit beyond which global warming becomes dangerous.

Shell boasting about its government-funded Quest CCS project, on a Toronto bus. (Photo: rustneversleeps) "Shell Quest captures over one-third of our oil sands upgrader emissions"

Many scenarios have been proposed to get us there. Some of these involve rapid deployment of solar and wind power in conjunction with significant reductions in the amount of energy we consume.

However, many of the economists and experts who have developed scenarios for the Intergovernmental Panel on Climate Change (IPCC) believe that the only way to achieve the two-degree goal in a growing world economy is to invest in large-scale carbon capture and storage (CCS) projects. These technologies capture carbon dioxide from the exhausts of power stations and industrial plants and then permanently store it, usually by injecting it into underground rock layers.

Even with massive deployment of CCS over coming decades, most scenarios modelled by the IPCC overshoot the carbon budget and require that in the latter part of the century, we actually take more carbon out of the atmosphere than we put into it. Climate expert Kevin Anderson of the Tyndall Centre for Climate Change Research at the University of Manchester recently reported in Nature Geoscience that, of the 400 IPCC emissions scenarios used in the 2014 Working Group report to keep warming below two degrees, some 344 require the deployment of negative emissions technologies after 2050. The other 56 models assumed that we would start rapidly reducing emissions in 2010 (which, of course, did not happen). In other words, negative emissions are required in all of the IPCC scenarios that are still current.

One favoured negative emissions technology is bioenergy with carbon capture and storage (BECCS). This involves burning biomass – such as wood pellets – in power stations, then capturing the carbon dioxide and burying it deep in the earth. The technology has not yet been demonstrated at an industrial scale. Using the large amounts of bioenergy envisioned in such scenarios will place huge demands on land use and will conflict with agriculture and biodiversity needs.

Read more...

78 comments


Alberta's new carbon tax

Posted on 31 December 2015 by Andy Skuce

 

On Sunday, November 22nd, 2015, Alberta's new centre-left Premier, Rachel Notley, announced that the province would be introducing an economy-wide carbon tax priced at $30 per tonne of CO2 equivalent, to be phased in over 2016 and 2017. Observers had been expecting new efforts to mitigate emissions since Notley's election in May 2015, but the scope and ambition of this policy took many by surprise.

Alberta, of course, is the home of the Athabasca oil sands and is one of the largest per-capita GHG emitters of any jurisdiction in the world. The new plan was nevertheless endorsed by environmental groups, First Nations and by the biggest oil companies, an extraordinary consensus that many would not have thought possible.

How was this done? I will try and explain the new policy as far as I can (the details are not all available yet), but the short answer is that a huge amount of credit is due to the panel of experts led by University of Alberta energy economist Andrew Leach and his fellow panelists. Not only did they listen to what all Albertans had to say, but they were thoughtful in framing a policy that is acceptable to almost everyone. 

The background

Alberta is the wealthiest province in Canada, with a population of 4.1 million. In 2013, greenhouse gas emissions were 267 Mt CO2 equivalent, about 65 tonnes per capita, which compares with an average of about 15 tonnes for the rest of Canada. Among US states, only North Dakota and Wyoming are worse. Alberta's fugitive emissions of methane alone amount to 29 Mt CO2e, about 7 tonnes per person, a little more than the world's average per-capita emissions of all greenhouse gases combined.
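Those per-capita figures follow directly from the totals; a quick back-of-the-envelope check using only the numbers quoted above:

```python
# Back-of-the-envelope check of Alberta's per-capita emissions (2013)
population = 4.1e6      # people
total_ghg = 267e6       # tonnes CO2e
fugitive_ch4 = 29e6     # tonnes CO2e of fugitive methane

print(round(total_ghg / population))        # ~65 t CO2e per person
print(round(fugitive_ch4 / population, 1))  # ~7 t CO2e per person
```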

From the Climate Leadership Report (http://alberta.ca/documents/climate/climate-leadership-report-to-minister.pdf). The 2030 emissions do not consider savings from the new policy.

Read more...

11 comments


The Road to Two Degrees, Part Three: Equity, inertia and fairly sharing the remaining carbon budget

Posted on 9 December 2015 by Andy Skuce

In the first part of this series, I examined the implications of relying on CCS and BECCS to get us to the two degree target. In the second part, I took a detailed look at Kevin Anderson's arguments that IPCC mitigation scenarios aimed at two degrees are biased towards unproven negative-emissions technologies and that they consequently downplay the revolutionary changes to our energy systems and economy that we must make very soon. In this last part, I'm going to look at the challenges that the world faces in fairly allocating future emissions from our remaining carbon budget and raising the money needed for climate adaptation funds, taking account of the very unequal past and present.

Until now, economic growth has been driven and sustained largely by fossil fuels. Europe and North America started early with industrialization and, from 1800 up to around 1945, this growth was driven mainly by coal. After the Second World War there was a period of rapid (~4% per year) economic growth in Europe, N America and Japan, lasting about thirty years, that the French refer to as Les Trente Glorieuses, The Glorious Thirty. This expansion was accompanied by a huge rise in the consumption of oil, coal and natural gas. After this there was a thirty-year period of slower growth (~2%) in the developed economies, with consumption fluctuations caused by oil-price shocks and the collapse of the Soviet Union. During this time, oil and coal consumption continued to grow, but not as steadily as before. Then, at the end of the twentieth century, economic growth took off in China, with a huge increase in the consumption of coal.

The emissions data are from the CDIAC. See the SkS post The History of Emissions and the Great Acceleration for further details.

If we are to achieve a stable climate, we will need to reverse this growth in emissions over a much shorter time period, while maintaining the economies of the developed world and, crucially, allowing the possibility of economic growth for the majority of humanity that has not yet experienced the benefits of a developed-country middle-class lifestyle.

Here are the annual emissions sorted by country and region:

Read more...

15 comments


The Road to Two Degrees, Part Two: Are the experts being candid about our chances?

Posted on 26 November 2015 by Andy Skuce

The first part of this three-part series looked at the staggering magnitude and the daunting deployment timescale available for the fossil fuel and bioenergy carbon capture and storage technologies that many 2°C mitigation scenarios assume. In this second part, I outline Kevin Anderson's argument that climate experts are failing to acknowledge the near-impossibility of avoiding dangerous climate change under current assumptions of the political and economic status quo, combined with unrealistic expectations of untested negative-emissions technologies.

In plain language, the complete set of 400 IPCC scenarios for a 50% or better chance of meeting the 2 °C target work on the basis of either an ability to change the past, or the successful and large-scale uptake of negative-emission technologies. A significant proportion of the scenarios are dependent on both. (Kevin Anderson)

Kevin Anderson has just written a provocative article titled Duality in climate science, published in Nature Geoscience (open access text available here). He contrasts the upbeat pronouncements in the run-up to the Paris climate conference in December 2015 (e.g. “warming to less than 2°C” is “economically feasible” and “cost effective”; “global economic growth would not be strongly affected”) with what he sees as the reality that meeting the 2°C target cannot be reconciled with continued economic growth in rich societies at the same time as the rapid development of poor societies. He concludes that “the carbon budgets associated with a 2 °C threshold demand profound and immediate changes to the consumption and production of energy”.

His argument runs as follows: Integrated Assessment Models, which attempt to bring together physics, economics and policy, rely on highly optimistic assumptions, specifically:

  • Unrealistic early peaks in global emissions;
  • Massive deployment of negative-emissions technologies.

He notes that of the 400 scenarios that have a 50% or better chance of meeting the 2 °C target, 344 of them assume the large-scale uptake of negative emissions technologies and, in the 56 scenarios that do not, global emissions peak around 2010, which, as he notes, is contrary to the historical data.

I covered the problems of the scalability and timing of carbon capture and storage and negative emissions technologies in a previous article.

From Robbie Andrew, adjusted for non-CO2 and land-use emissions. Note that these mitigation curves assume no net-negative emissions technologies deployed in the latter part of the century.

Read more...

52 comments


The Road to Two Degrees, Part One: Feasible Emissions Pathways, Burying our Carbon, and Bioenergy

Posted on 16 November 2015 by Andy Skuce

This post looks at the feasibility of the massive and rapid deployment of Carbon Capture and Storage and negative-emissions Bioenergy Carbon Capture and Storage technologies in the majority of IPCC scenarios that avoid dangerous global warming. Some observers question whether the deployment of these technologies at these scales and within the required time frames is achievable. This is Part One of a three-part series on the challenge of keeping global warming under 2 °C.

The various emissions models that have been used to produce greenhouse gas concentration pathways consistent with 2 °C vary considerably, but the majority of them require huge deployment of Carbon Capture and Storage (CCS) as well as net-negative global emissions in the latter part of the twenty-first century. The only negative-emissions methods generally considered in these scenarios are bioenergy with carbon capture and storage (BECCS) and land-use changes, such as afforestation. For there to be net-negative emissions, positive emissions have to be smaller than the negative emissions.

Kevin Anderson (2015) (open-access text) reports that of the 400 scenarios that have a 50% chance or greater of no more than 2 °C of warming, 344 assume large-scale negative emissions technologies. The remaining 56 scenarios have emissions peaking in 2010, which, as we know, did not happen.

Sabine Fuss et al. (2014) (pdf) demonstrate that of the 116 scenarios that lead to concentrations of 430-480 ppm of CO2 equivalent, 101 of them require net negative emissions. Most scenarios that have net-negative emissions have BECCS providing 10-30% of the world’s primary energy in 2100.

 

From Fuss et al. (2014), showing the historical emissions (black), the four RCPs (heavy coloured lines) and 1089 scenarios assigned to one of the RCPs (light coloured lines).

Read more...

22 comments


The thermometer needle and the damage done

Posted on 6 November 2015 by Andy Skuce

Rising temperatures may inflict much more damage on already warm countries than conventional economic models predict. In the latter part of the twenty-first century, global warming might even reduce or reverse any earlier economic progress made by poor nations. This would increase global wealth inequality over the century. (This is a repost from Critical Angle.)

A recent paper published in Nature by Marshall Burke, Solomon M. Hsiang and Edward Miguel, Global non-linear effect of temperature on economic production, argues that increasing temperatures will cause much greater damage to economies than previously predicted. Furthermore, this effect will be distributed very unequally, with tropical countries getting hit very hard and some northern countries actually benefitting.

Let me attempt a highly simplified summary of what they did. I’m not an economist and this analysis is not straightforward, so beware. If I confuse you, try Dana Nuccitelli’s take or Seth Borenstein’s or Bloomberg’s or The Economist's.

Firstly, Burke et al. looked at factors like labour supply, labour performance and crop yields and how they relate to daily temperature exposure. Generally these show little variation up to temperatures in the high twenties Celsius, at which point they fall off quickly. Secondly, those trends were aggregated to predict the relationship between annual average temperatures and the annual impact on economic output. Thirdly, they looked at annual economic output and average annual temperatures for individual countries for the period 1960-2010. Note that they only compared the economic effects of temperature change on individual countries; they did not correlate one country with another. Using these observations, they were able to see how the observations compared with their predicted aggregate curve.


All figures from Burke et al. (2015).

This work showed that the GDP of countries with an annual average temperature of 13°C was the least sensitive to temperature changes. Colder countries on the left side of the hump would benefit from an increase in temperature, whereas warmer countries would see their output suffer as temperature increases. Note that the figure does not show that a certain temperature predetermines the level of wealth of a country (China, despite recent rapid growth, is poorer than the US and Japan even though average annual temperatures are similar). Rather, it illustrates how susceptible countries are to increases or decreases in productivity relative to their annual average temperature.

There is some evidence that rich countries are slightly less affected by changes in temperature (the curve is a little flatter for them). There are few hot and wealthy countries examined in the study, so any general conclusions about them cannot be certain, but the evidence still points to them being more prone to damage from rising temperature than rich, cooler countries. No matter how rich you are, extra heat hurts the warm lands more than it does the temperate and the cool. You can’t buy your way out of the effects of global warming, except by moving away from the Equator or up into the highlands.
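To make the shape of that relationship concrete, here is a purely illustrative sketch: a concave response peaking at 13°C with an invented curvature. It is not the function Burke et al. actually fitted, just a stand-in with the same qualitative behaviour.

```python
# Illustrative only: a concave productivity response peaking at 13 C.
# The quadratic form and its coefficient are invented; they are not the
# curve estimated by Burke et al. (2015).
def relative_output(avg_temp_c, optimum=13.0, curvature=0.0005):
    """Fractional change in annual output relative to the optimum temperature."""
    return -curvature * (avg_temp_c - optimum) ** 2

for t in (5.0, 13.0, 25.0):                                 # cool, temperate, hot country
    change = relative_output(t + 2.0) - relative_output(t)  # effect of 2 C of warming
    print(f"{t:>4} C mean temperature: {change:+.4f}")
```

A cool country comes out slightly ahead, a temperate one barely changes, and a hot one loses the most, which is the asymmetry described above.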

Read more...

15 comments


B.C. lowballing fugitive methane emissions from natural gas industry

Posted on 8 October 2015 by Andy Skuce

This article was first published in the Corporate Knights Magazine.

There is a supplementary article at my blog Critical Angle that has more detail, links and references, along with an estimation of the GHG emissions (excluding end-use) associated with one liquefied natural gas project and the effect this will have on the feasibility of BC reaching its emissions targets.

There is a further piece at DeSmog Canada, where I compare the situation in BC's gas industry with the Volkswagen emissions reporting scandal, in which a corporation cheats on its emissions tests, with the tacit approval of industry-friendly regulators and governments, only to be exposed by independent researchers performing tests in real-world situations.

The push by British Columbia to develop a new liquefied natural gas (LNG) export industry raises questions about the impact such activities would have on greenhouse gas emissions, both within the province and globally.

One of the single most important factors relates to the amount of methane and carbon dioxide that gets released into the atmosphere, either deliberately through venting or by accident as so-called fugitive emissions. Fugitive emissions are the result of valves and meters that release, by design, small quantities of gas. But they can also come from faulty equipment and from operators that fail to follow regulations.

Photo by Jesús Rodríguez Fernández (creative commons)

According to the B.C. Greenhouse Gas Inventory Report 2012, there were 78,000 tonnes of fugitive methane emissions from the oil and natural gas industry that year. B.C. produced 41 billion cubic metres of gas in 2012. This means about 0.28 per cent of the gas produced was released into the atmosphere.

By North American standards, this is a very low estimate. The U.S. Environmental Protection Agency (EPA) uses a figure of 1.5 per cent leakage, more than five times higher. Recent research led by the U.S. non-profit group, Environmental Defense Fund (EDF), shows that even the EPA estimates may be too low by a factor of 1.5. B.C.’s estimate, in other words, would be about one-eighth of what has been estimated for the American gas industry.

Although the amounts of methane released are small compared to carbon dioxide emissions, methane matters because it packs a much bigger global warming punch. Determining the effect of methane emissions is complicated because molecules of methane only last in the atmosphere for a decade or so and the warming effect from its release depends on the time interval it is measured over. Compared to a given mass of carbon dioxide, the same mass of methane will produce 34 times as much warming over 100 years, or 86 times as much over 20 years.
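The 0.28 per cent figure, and what the leaked methane means in carbon dioxide terms, can be checked with a few lines of arithmetic. In the sketch below the methane density is an assumed value; the other numbers come from the text.

```python
# Rough check of B.C.'s reported leakage rate and its CO2-equivalent impact
fugitive_ch4_t = 78_000        # tonnes of fugitive methane reported for 2012
gas_produced_m3 = 41e9         # cubic metres of gas produced in 2012
ch4_density_kg_m3 = 0.68       # assumed density of methane at standard conditions

gas_produced_t = gas_produced_m3 * ch4_density_kg_m3 / 1000.0       # ~28 Mt of gas
print(f"leak rate: {100 * fugitive_ch4_t / gas_produced_t:.2f}%")   # ~0.28%

# Global warming potentials quoted above: 34 (100-year) and 86 (20-year)
print(f"CO2e over 100 years: {fugitive_ch4_t * 34 / 1e6:.1f} Mt")   # ~2.7 Mt
print(f"CO2e over 20 years:  {fugitive_ch4_t * 86 / 1e6:.1f} Mt")   # ~6.7 Mt
```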

Read more...

1 comment


Are we overestimating our global carbon budget?

Posted on 15 July 2015 by Andy Skuce

The latest research suggests that natural sinks of carbon on land may be slowing or even turning into sources, creating climate consequences potentially worse than first thought.

Nature has provided humans with a buffer against the worst effects of our carbon pollution. Since 1750, we have emitted about 580 billion tonnes of carbon into the atmosphere by burning fossil fuels, cutting down forests and making cement. If those emissions had simply accumulated in the air, the concentration of carbon dioxide would have increased from 280 parts per million (ppm), as it was before the Industrial Revolution, to about 550 ppm today. Instead, we currently measure around 400 ppm, which is still a whopping 40 per cent above the planet’s pre-industrial atmosphere, but much less than a doubling.
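The counterfactual is simple to reproduce if you assume the usual conversion of roughly 2.12 billion tonnes of carbon per ppm of atmospheric CO2 (the conversion factor is an assumption; the emissions and concentration figures come from the paragraph above):

```python
# How high would CO2 be if every tonne we emitted had stayed in the air?
emitted_gtc = 580          # cumulative emissions since 1750, GtC
gtc_per_ppm = 2.12         # assumed conversion: GtC per ppm of atmospheric CO2
preindustrial_ppm = 280
observed_ppm = 400

no_sink_ppm = preindustrial_ppm + emitted_gtc / gtc_per_ppm
print(f"with no sinks: ~{no_sink_ppm:.0f} ppm")                         # ~550 ppm

retained_gtc = (observed_ppm - preindustrial_ppm) * gtc_per_ppm
print(f"carbon actually retained in the air: ~{retained_gtc:.0f} GtC")  # ~250 GtC
```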

Some 60 per cent of our emissions have been taken up in natural sinks, in roughly equal parts by dissolving into the ocean and by being taken up by plants growing faster on land. Were it not for these natural carbon sinks, we would by now be much deeper into dangerous climate change.

As we continue to burn fossil fuels, our climate troubles will become worse should those sinks start to falter. And the outlook will be worse still if those sinks turn into sources of carbon. 

New research

According to the latest research, the carbon sink on land is unfortunately starting to show signs of trouble. Instead of providing a brake on human emissions, the land carbon sink could instead soon be giving our emissions a boost. (The ocean sink appears to be relatively safe for now, although there is a price to pay: the consequence of the process of carbon dioxide dissolving into seawater, ocean acidification, has been called climate change’s evil twin with its own, non-climate related consequences for marine life.)


Plants don’t thrive solely on carbon dioxide.

Read more...

10 comments


Carbon cycle feedbacks and the worst-case greenhouse gas pathway

Posted on 7 July 2015 by Andy Skuce

The worst-case emissions pathway, RCP8.5, is a scenario that burns a huge amount of fossil fuels, especially coal. The scenario has sometimes been criticized as implausible because of its huge resource consumption and emissions of ~1700 billion tonnes of carbon (GtC) over the century. Those emissions are based in part on carbon-cycle model assumptions, which recent work suggests may be too optimistic. New research shows that future plant growth may be restricted by nutrient availability, turning the land carbon sink into a source. Permafrost feedbacks (not considered in IPCC CMIP5 models) may also add significant emissions to the atmosphere under the RCP8.5 pathway. In addition, the latest research on the Amazon Basin reveals that the tropical forest carbon sinks may already be diminishing there. Together, these feedbacks suggest that the greenhouse gas concentrations in the RCP8.5 case could be achieved with ~400 GtC smaller human emissions, making the RCP8.5 worst-case scenario more plausible.

The climate models referred to  in the recent IPCC Fifth Assessment Report (AR5) are founded on one of four Representative Concentration Pathways or RCPs. The key word in RCP is concentration. In the RCPs, the concentration of greenhouse gases is fixed at different times in the future and the climate model (or general circulation model or GCM) uses those atmospheric concentrations to calculate future climate states. Underpinning the concentration pathways are socio-economic and emissions scenarios. There can be more than one underlying emissions scenario capable of producing the concentration pathway.

If you are unfamiliar with RCPs, check out the great guide that Graham Wayne wrote in August 2013 for Skeptical Science.

This way of modelling differs from previous approaches in which the starting point was a story or scenario about economic and social development that led to emissions. These emissions are run through a carbon-cycle model (which may be simple or complex) to produce atmospheric concentrations over time. 

The schematic illustrates the differences in approach. The elements in red boxes are the prescribed inputs into the models, whereas the elements in blue ellipses are outputs. The advantage of the RCP prescribed-concentration approach is that the climate model outputs do not depend to the same degree on carbon-cycle models as they did in the emissions scenario method. The disadvantage is that there is no unique link between concentrations and emissions. The schematic is simplified in that there are feedbacks and loops in the processes that are not illustrated. 

The worst-case scenario among the four Representative Concentration Pathways (RCPs) is known as RCP8.5. The number “8.5” refers to the radiative forcing level, measured in W/m2, in the year 2100. RCP8.5, despite often being called “business-as-usual”, has been criticized as an unlikely outcome. While true, that’s more feature than bug, since, as one of the two extreme pathways, it is designed to provide climate modellers with an unlikely, but still just plausible, “how bad could it be” scenario.

Let’s look briefly at some of the underlying socio-economic assumptions behind RCP8.5, then we’ll examine how the latest research on the terrestrial carbon cycle makes the GHG concentrations in the RCP8.5 model easier to reach.

Read more...

19 comments


Why the 97 per cent consensus on climate change still gets challenged

Posted on 18 May 2015 by Andy Skuce

Here are some excerpts from an article I wrote for the magazine Corporate Knights, published on May 14, 2015. Some references and links have been added at the end.

In 2004, science historian Naomi Oreskes published a short paper in the journal Science concluding there was an overwhelming consensus in the scientific literature that global warming was caused by humans.

After the paper’s release, there was some unexpectedly hostile reaction. This prompted Oreskes and her colleague Erik Conway to go even deeper with their research, leading to the publication of the book Merchants of Doubt. It documents how a small group of scientists with links to industry were able to sow doubt about the scientific consensus and delay effective policy on DDT, tobacco, acid rain and, now, global warming.

Fast forward to two years ago: a team of volunteer researchers (myself included) associated with the website Skeptical Science decided to update and extend Oreskes’ research. Led by University of Queensland researcher John Cook, we analyzed the abstracts of about 12,000 scientific papers extracted from a large database of articles, using the search terms “global warming” and “global climate change.” The articles had been published over a 21-year period, from 1991 to 2011.

As an independent check on our results, we also sent emails to the more than 8,500 scientist authors of these articles. (These were the scientists whose e-mail addresses we were able to track down). We asked them to rate their own papers for endorsement or rejection of man-made global warming.

Both approaches yielded a very similar result: 97 per cent of the scientific literature that expresses an opinion on climate change endorses the expert consensus view that it is man-made. The results were published in May 2013 in the journal Environmental Research Letters.

We were astonished by the positive reception. Mention of the paper was tweeted by U.S. President Barack Obama, Al Gore and Elon Musk, among others. Obama later referenced it in a speech at the University of Queensland, while U.S. Secretary of State John Kerry has referred to the 97 per cent consensus in recent speeches. John Oliver based an episode of his HBO comedy show Last Week Tonight around it, a clip viewed online more than five million times.

Read more...

32 comments


Permafrost feedback update 2015: is it good or bad news?

Posted on 20 April 2015 by Andy Skuce

We have good reason to be concerned about the potential for nasty climate feedbacks from thawing permafrost in the Arctic. Consider:

  • The Arctic contains huge stores of plant matter in its frozen soils. Over one-third of all the carbon stored in all of the soils of the Earth is found in this region, which hosts just 15% of the planet's soil-covered area.
  • The Arctic is warming at twice the rate of the rest of the planet. The vegetable matter in the soils is being taken out of the northern freezer and placed on the global kitchen counter to decompose. Microbes will take full advantage of this exceptional dining opportunity and will convert part of these plant remains into carbon dioxide and methane.
  • These gases will add to the already enhanced greenhouse effect that caused the Arctic warming, providing a further boost to warming. There's plenty of scope for these emissions to cause significant climatic mischief: the amount of carbon in the permafrost is double the amount currently in the air. 

But exactly how bad will it be, and how quickly will it cause problems for us? Does the latest research bring good news or bad?

Ted Schuur and sixteen other permafrost experts have just published a review paper in Nature: Climate change and the permafrost carbon feedback (paywalled). This long and authoritative article (7 pages of text, plus 97 references) provides a state-of-the-art update on the expected response of permafrost thawing to man-made climate change. Much of the work reported on in this paper has been published since the 2013 IPCC AR5 report. It covers new observations of permafrost thickness and carbon content, along with laboratory experiments on permafrost decomposition and the results of several modelling exercises.

The overall conclusion is that, although the permafrost feedback is unlikely to cause abrupt climate change in the near future, the feedback is going to make climate change worse over the second half of this century and beyond. The emissions quantities are still uncertain, but the central estimate would be equivalent to adding, for at least the rest of the century, another country with unmitigated emissions the size of the United States' current emissions. This will not cause a climate catastrophe by itself, but it will make preventing dangerous climate change that much more difficult. As if it wasn't hard enough already.

Observations

There's a lot of information in this paper and, rather than attempt to describe it all in long form, I'll try to capture the main findings in bullet points. 

  • The top three metres of permafrost contain about 1035 PgC (billion tonnes of carbon). This is similar to previous estimates, but is now supported by ten times as many observations below the top 1 m depth. Very roughly, the deepest deposits richest in carbon are near the Russian, Alaskan and Canadian Arctic coasts, with the poorest in mountainous regions and in areas close to glaciers and the Greenland ice sheet.

The carbon content in the top three metres of permafrost soils. From Hugelius et al (2013).

Read more...

23 comments


The history of emissions and the Great Acceleration

Posted on 7 April 2015 by Andy Skuce

 This is a repost from the Critical Angle blog.

One of my pastimes is downloading data and playing around with it on Excel. I’m not kidding myself that doing this means anything in terms of original research, but I do find that I learn quite a lot about the particularities of the data and about the science in general by doing some simple calculations and graphing the numbers. There’s even occasionally a small feeling of discovery, a bit like the kind that you experience when you follow a well-trodden path in the mountains for the first time:

We were not pioneers ourselves, but we journeyed over old trails that were new to us, and with hearts open. Who shall distinguish? J. Monroe Thorington

Anyway, I downloaded some historical emissions data from the CDIAC site and played around with it. To repeat, there’s nothing new to science here, but there were a few things that I found that were new to me. First, let’s look at historical emissions of CO2 from man-made sources from 1850 to 2010. Note that for all of these graphs there are no data shown for 2011-2015.
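(For anyone who prefers a scriptable route to the same picture, a rough Python equivalent of that Excel exercise is sketched below; the file name and column labels are assumptions about how a downloaded copy of the table might be laid out, not a description of the actual CDIAC file.)

```python
# A sketch of the same exercise in Python, assuming a local CSV with one
# row per year and one column per emissions source.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("global_emissions.csv")             # hypothetical local copy of the data
df = df[(df["Year"] >= 1850) & (df["Year"] <= 2010)]

sources = ["Solids", "Liquids", "Gas", "Cement", "Flaring"]   # assumed column names
df.plot(x="Year", y=sources, kind="area", stacked=True)
plt.ylabel("CO2 emissions (million tonnes of carbon per year)")
plt.title("Fossil-fuel and cement emissions by source, 1850-2010")
plt.show()

# Each source's share of the cumulative total over the period
print((df[sources].sum() / df[sources].sum().sum()).round(2))
```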

What immediately struck me—something I hadn’t fully appreciated before—was how small oil consumption was before 1950. Both world wars were carried out without huge increases in oil use, despite the massive mobilizations of armies, navies and air forces. You can make out some downward blips in coal consumption for the Great Depression (~1930) and around the end of WW2 (~1945).

It wasn’t until after 1950 that fossil-fuel consumption went nuts. Some people have taken to calling this inflection point The Great Acceleration; there’s more on this later.

What do these emissions from different sources look like as a proportion of all human emissions over this time period?

Read more...

29 comments


Shell: internal carbon pricing and the limits of big oil company action on climate

Posted on 24 March 2015 by Andy Skuce

Shell evaluates all of its projects using a shadow carbon tax of $40 per tonne of carbon dioxide. That's great. But why is the company still exploring in the Arctic and busy exploiting the Alberta oil sands?

Of all of the big fossil-fuel companies, Shell has adopted perhaps the most constructive position on climate change mitigation. Recently, the company's CEO, Ben van Beurden, told an industry conference:

You cannot talk credibly about lowering emissions globally if, for example, you are slow to acknowledge climate change; if you undermine calls for an effective carbon price; and if you always descend into the ‘jobs versus environment’ argument in the public debate.

Shell employs engineer David Hone as their full-time Climate Change Advisor. Hone has written a small ebook Putting the Genie Back: 2°C Will Be Harder Than We Think, priced at just 99¢ and he writes a climate change blog that should be part of every climate-policy geek's balanced diet.

Shell also has a position they call Vice President CO2, currently occupied by Angus Gillespie. Here's Gillespie talking recently at Stanford on the company's internal shadow carbon pricing strategy (hat-tip to John Mashey). It's worth watching if only for Gillespie's vivid example of the limitations of looking at averages. The slides can be downloaded here.

Read more...

1 comment


Does providing information on geoengineering reduce climate polarization?

Posted on 4 March 2015 by Andy Skuce

Dan Kahan of Yale University and four colleagues have just published an article in the Annals of the AAPSS titled Geoengineering and Climate Change Polarization: Testing a Two-Channel Model of Science Communication, which investigates the effect that reading an article about geoengineering has on study participants' attitudes to climate change. In their abstract, they write:

We found that cultural polarization over the validity of climate change science is offset by making citizens aware of the potential contribution of geoengineering as a supplement to restriction of CO2 emissions.

I will argue here that this experiment achieved no such result because the premise was wrong. Specifically, the information on geoengineering that was presented to the study participants (in the form of a fictional newspaper article) bears no relation to mainstream scientific opinion on geoengineering nor, even, to the opinions of advocates of geoengineering. Geoengineering is portrayed in the fictional newspaper article as a strategy with no uncertainty about how well it might work and, it is claimed, will "spare consumers and businesses from the heavy economic costs associated with the regulations necessary to reduce atmospheric CO2 concentrations to 450 ppm or lower". This is hardly depicting geoengineering as a "potential solution" or "a supplement" to the restriction of emissions, as is claimed in the abstract of the paper.

In fact, what Kahan et al. have demonstrated is that presenting misinformation dressed up as fact can affect people's opinions about climate change. That may be interesting as a social science experiment conducted on consenting adults, but it is not much use as a guide to effective public science communication, constrained as it is to tell the truth.

The Kahan et al 2015 paper is paywalled, but there is a 2012 version of it (updated in 2015), with the same title, similar figures, but different text, that is available online here. The study looked at two representative samples, one from the USA and one from England, of 1500 individuals each. The two samples were further split into three groups that were each asked to read one of three fictional newspaper articles. One article, used as a control, had nothing to do with climate change. The second was an article that advocated tighter limits on atmospheric concentrations of CO2 (although this article contained what surely must be a typo, calling for CO2 concentrations of 175 ppm, which would send us back to the depths of the last ice age). The third piece called for geoengineering on the grounds that "limiting emissions is a wasteful and futile strategy". Articles two and three both quoted a (fictional) Dr Williams of Harvard University, the spokesman of the (fictional) "American Association of Geophysical Scientists". Both of these articles contained a couple of pictures designed to appeal to or to repel people at either end of the political spectrum.

Read more...

20 comments


Andy Skuce's AGU Fall Meeting 2014 poster presentation

Posted on 29 December 2014 by Andy Skuce &

This is a re-post from Critical Angle

I gave a poster presentation on December 16th  at the 2014 Fall Meeting of the American Geophysical Union in San Francisco. The title is: Emissions of Water and Carbon Dioxide from Fossil-Fuel Combustion Contribute Directly to Ocean Mass and Volume Increases.

You can read the abstract here and I have uploaded a pdf of the poster here. There is a picture of the poster below, click on it to make it readable, although you will need to download the pdf to make out some of the fine print.

Poster2014a

Some of the numbers changed a little bit between the time I submitted the abstract in August and now. I found one or two small errors and recalculated the uncertainty range using Monte Carlo analysis. “Min” and “Max” values bracket the 90% confidence interval. In the title I used “directly” to distinguish the physical effects of emissions on ocean volumes from the more “indirect” (and bigger and better-known) contributions of emissions to sea-level rise via the effect of emissions on global warming.
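For anyone curious what "recalculated the uncertainty range using Monte Carlo analysis" looks like in practice, here is a bare-bones sketch. The input quantities and their spreads are invented for illustration; they are not the numbers behind the poster.

```python
# Bare-bones Monte Carlo bracketing of a 90% confidence interval.
# All means and standard deviations below are placeholders, not the poster's inputs.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical uncertain inputs, each drawn from a normal distribution
co2_to_ocean = rng.normal(10.0, 1.5, n)        # placeholder quantity and spread
h2o_from_combustion = rng.normal(5.0, 1.0, n)  # placeholder quantity and spread

total = co2_to_ocean + h2o_from_combustion     # the derived quantity of interest

p5, p50, p95 = np.percentile(total, [5, 50, 95])
print(f"median {p50:.1f}; 'Min'-'Max' (90% CI) {p5:.1f} to {p95:.1f}")
```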

Read more...

2 comments


Keystone XL: Oil Markets and Emissions

Posted on 1 September 2014 by Andy Skuce &

  • Estimates of the incremental emission effects of individual oil sands projects like the Keystone XL (KXL) pipeline are sensitive to assumptions about the response of world markets and alternative transportation options.

  • A recent Nature Climate Change paper by Erickson and Lazarus concludes that KXL may produce incremental emissions of 0-110 million tonnes of CO2 per year, but the article has provoked some controversy.

  • Comments by industry leaders and the recent shelving of a new bitumen mining project suggest that the expansion of the oil sands may be more transportation constrained and more exposed to cost increases than is sometimes assumed.

  • Looking at the longer-term commitment effects of new infrastructure on cumulative emissions supports the higher-end incremental estimates.

President Obama (BBC) has made it clear that the impact of the Keystone XL (KXL) pipeline on the climate will be critical in his administration’s decision on whether the pipeline will go ahead or not.  However, different estimates of the extra carbon emissions that the pipeline will cause vary wildly. For example, the consultants commissioned by the US State Department estimated that the incremental emissions would be 1.3 to 27.4 million tonnes of CO2 (MtCO2) annually. In contrast, John Abraham, writing in the Guardian (and again more recently), estimated that the emissions would be as much as 190 MtCO2 annually, about seven times the State Department’s high estimate (calculation details here).

The variation in the estimates arises from the assumptions made. The State Department consultants assumed that the extra oil transported by the pipeline would displace oil produced elsewhere, so that we should only count the difference between the life-cycle emissions from the shut-in light oil and those of the more carbon-intensive bitumen. In addition, they estimated that not building KXL would mean that bitumen would instead be transported by rail, at slightly higher transportation costs. Abraham simply totted up all of the production, refining and consumption emissions of the 830,000 barrels per day (bpd) pipeline capacity and did not consider any effect of the extra product on world oil markets.
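To see where a number like Abraham's comes from, here is the back-of-envelope version of the "tot up everything" approach. The per-barrel emissions factor is an assumed, illustrative figure, not one taken from his calculation.

```python
# Rough sketch of the "count all the barrels" estimate. The emissions factor is an
# assumption for illustration; Abraham's own inputs may differ.
CAPACITY_BPD = 830_000     # pipeline capacity in barrels per day (from the post)
T_CO2_PER_BARREL = 0.6     # assumed well-to-wheels tonnes CO2 per barrel of bitumen-derived crude

annual_mt = CAPACITY_BPD * 365 * T_CO2_PER_BARREL / 1e6
print(f"{annual_mt:.0f} MtCO2 per year")   # roughly 180 Mt, the same order as the 190 Mt figure
```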

Neither set of assumptions is likely to be correct. Increasing the supply of any product will have an effect on a market, lowering prices and stimulating demand (consumption) growth. Lower prices will reduce supply somewhere.  The question is: by how much?

An interesting new paper in Nature Climate Change (paywalled, but there is an open copy of an earlier version available here) by Peter Erickson and Michael Lazarus attempts to answer this question. The authors are based in the Seattle office of the Stockholm Environment Institute (SEI).

Read more...

3 comments


Athabasca Glacier: a tragic vanishing act

Posted on 26 August 2014 by Andy Skuce &

The Athabasca Glacier in the Canadian Rocky Mountains is probably the easiest glacier in the world to access by car. It's just a few hundred metres' stroll from the nearest parking lot on the magnificent Icefields Parkway in Alberta. The problem is, the stroll keeps getting longer by about 10 metres every year. Since 1992, the snout of the glacier has retreated about 200 metres, requiring tourists anxious to set foot on the glacier to walk a little further. The glacier has lost about 2 km of its length since 1844 (Geovista PDF). 


The Athabasca Glacier seen from the access trail. This point is about halfway between the parking lot and the current snout of the glacier, which is about 200 metres away. In the centre background is the ice-fall from the Columbia Icefield. The marker shows where the glacier snout was in 1992, coincidentally the year of the Rio Earth Summit. It is just possible to make out some people walking on the glacier on the left-hand side. Click for big.

Read more...

29 comments


The Carbon Bubble - Unburnable Fossil Fuels - Seminar and Discussion

Posted on 26 March 2014 by Andy Skuce &

The British Columbia Sustainable Energy Association (BCSEA) organizes a series of free seminars on climate change and sustainability issues. BCSEA was founded by Guy Dauncey. On February 11th, 2014, BCSEA held a webinar on the recent work done by the Carbon Tracker Initiative. Guy has written a detailed summary of their recent work on the BCSEA webpage.

The seminar starts at the 8:30 mark and a very good Q&A session begins at 39 minutes. The slides that accompany the seminar can be downloaded here.

The presenter is Mark Campanale, the founder and executive director of the Carbon Tracker Initiative.

Read more...

6 comments


The Editor-in-Chief of Science Magazine is wrong to endorse Keystone XL

Posted on 3 March 2014 by Andy Skuce &

An editorial by the Editor-in-Chief of Science Magazine, Marcia McNutt, conditionally endorses the Keystone XL (KXL) pipeline. Her argument is that:

  • the absence of the pipeline has not stopped oil sands development and the building of the pipeline will not accelerate oil sands development;
  • President Obama can extract concessions from the Canadians to reduce emissions and upgrade the bitumen in Canada.

Both of these arguments are wrong; let me explain why.

Pipelines promote production

The Mildred Lake oil-sands plant in Alberta. Note the tailings pond behind the huge yellow piles of sulphur, a by-product of bitumen upgrading. The sulphur may come in handy later for use in solar radiation management. Photo Wikipedia

It should be obvious from the intense lobbying and advertising efforts of Canada's Federal Government, the Alberta Provincial Government and the Canadian Association of Petroleum Producers that the KXL pipeline is a very big deal indeed for those with a stake in expanding oil sands production. Federal Natural Resources Minister Joe Oliver accuses his domestic political opponents of putting tens of thousands of Canadian jobs at risk by urging Washington not to approve KXL. At least on this matter, he is right; without new transportation infrastructure, the massive investments that result in growth in oil sands production will be postponed or cancelled. But that's the message provided to a Canadian audience.

Read more...

22 comments


Talking Trash on Emissions

Posted on 7 January 2014 by jg & Andy Skuce

While attending the recent AGU conference, some of us were struck by a statistic presented by Professor Richard Alley: On average, a person's contribution of carbon dioxide waste to the atmosphere is forty times greater than their production of solid trash to landfills when measured as mass.

It can be difficult to grasp the huge quantities of CO2 that we emit. It’s an invisible gas with no odour and we are not used to thinking about amounts of gas in terms of mass. But we do have a good sense of how much solid waste we throw out, since we all have to lug our garbage to the curb. If we had to do the same with our greenhouse gases, instead of one can a week, we would have to haul forty.
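The arithmetic behind the image is simple. The two per-person figures below are rough, assumed values for a typical North American, not Professor Alley's own inputs.

```python
# Illustrative ratio of CO2 emitted to solid waste sent to landfill, per person per year.
# Both figures are assumptions chosen only to show the shape of the calculation.
co2_tonnes_per_year = 17.0        # assumed per-capita CO2 emissions
landfill_tonnes_per_year = 0.45   # assumed per-capita solid waste actually landfilled

print(round(co2_tonnes_per_year / landfill_tonnes_per_year))   # roughly 40
```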

Every time we see a garbage truck, let’s imagine forty others following it, all taking our carbon dioxide to a dump site. When we hear of municipal politicians struggling to find new landfill sites, imagine the problems we would have finding forty subterranean landfill sites if we ever tried to dispose of our CO2 in the subsurface instead of dumping it freely into the air.

Read more...

20 comments


Hans Rosling: 200 300 years of global change

Posted on 31 October 2013 by Andy Skuce &

We think we have done more than we have done and we haven't understood how much we have to do. Hans Rosling

Hans Rosling is a Swedish medical doctor and statistician who is determined (in his own words) "to fight devastating ignorance with a fact-based worldview that everyone can understand".

Here is a video of him giving a talk on September 28th, 2013 at a public forum that introduced the latest IPCC report. The meeting was hosted by the International Geosphere-Biosphere Programme in Stockholm.

During the talk he asks a couple of questions, one on how many more children there will be in the year 2100 compared to today and another on what percentage of world energy is produced by solar and wind. I was in the minority that got the first one correct, but only because I had already seen one of Rosling's earlier talks. On the second question, I was among the majority that got the answer wrong. How will you do?

Read more...

19 comments


Update on BC’s Effective and Popular Carbon Tax

Posted on 25 July 2013 by Andy Skuce &

  
Stewart Elgie and Jessica McClay of the University of Ottawa have a peer-reviewed article in press in a special issue of the journal Canadian Public Policy. The article is summarized in the report BC’s Carbon Tax shift after five years: Results. An environmental (and economic) success story. The report can be downloaded here and is summarized here.
  
The results are similar to a previous report that I wrote about in the article BC’s revenue-neutral carbon tax experiment, four years on: It’s working, but updated, with one more year of data.  The new data show that the carbon tax is working even better than reported previously.
 
Fuel consumption per capita has fallen in BC by nearly 19% relative to the rest of Canada; these are just the fuels that are subject to the carbon tax. (Note that the years in these tables begin on July 1, in the previous report, they were calendar years, so the numbers do not match exactly.)
 
 
Note that per-capita use of every fuel type fell faster in BC than in the rest of Canada. The one exception is aviation fuel, which is mostly exempt from the carbon tax and showed no differential fall in use in BC.
 
 

Read more...

41 comments


BC’s revenue-neutral carbon tax experiment, four years on: It’s working

Posted on 27 June 2013 by Andy Skuce &

Carbon taxes get the market to tell the environmental truth. Stewart Elgie

British Columbia is the only jurisdiction in North America with a revenue-neutral carbon tax that taxes greenhouse gas emissions (GHGs) from individuals and businesses alike. The tax was announced in February 2008 and was implemented in July 2008 at a rate of $10 per tonne of CO2, rising in $5 annual increments to the current price of $30/tonne. It is designed as a revenue-neutral tax, meaning that all carbon-tax proceeds collected by the government are returned in the form of income tax cuts and rebates. The tax is now raising over C$1.2 billion per year, about C$270 per person, and the proceeds are distributed roughly equally between personal and business tax reductions.
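The schedule and the per-person figure are easy to check. The sketch below simply encodes the schedule as described; the population number used for the per-capita check is an approximation.

```python
# The BC tax schedule as described above: C$10/tonne from July 2008, rising C$5 a year to C$30.
def bc_carbon_tax_rate(year: int) -> int:
    """Tax rate in C$ per tonne CO2 for the fiscal year beginning July 1."""
    if year < 2008:
        return 0
    return min(10 + 5 * (year - 2008), 30)

for y in range(2008, 2014):
    print(y, bc_carbon_tax_rate(y), "C$/tonne")

# Per-capita revenue check; BC's population of ~4.5 million is an approximation.
print(round(1.2e9 / 4.5e6))   # roughly C$270 per person per year
```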

People on low incomes get a per-person payment of C$115 annually, and homeowners who live outside the SW of the province can get additional rebates of up to $200 annually. The personal income tax reductions are focussed on earnings below C$75,000. The allocation of carbon tax revenue has to be reported in the annual budget.

Read more...

14 comments


A Miss by Myles: Why Professor Allen is wrong to think carbon capture and storage will solve the climate crisis

Posted on 11 June 2013 by Andy Skuce &

(This post was co-written by rustneversleeps and Andy Skuce).

A recent opinion piece in the British newspaper Mail on Sunday by University of Oxford climate scientist Myles Allen argues that the best way to combat climate change is to pass laws requiring fossil fuel producers to capture and sequester a rising proportion of the carbon dioxide emissions that the fuels produce. We argue here that such a policy, with its emphasis on carbon sequestration, would not be successful in achieving the carbon emission reductions that Allen himself advocates—for a variety of political, economic, technological and logistical reasons. A more recent article by Allen in The Guardian covers the same ground.

Nevertheless, Allen’s prescription does succeed in focussing the mind on the scale of the problem that we face in mitigating climate change.

Summary/Index

This is a very long post, so here is a clickable summary.

A good starting framework, then... Allen's diagnosis is clear and his framing of targets in terms of cumulative emissions is unambiguous. But his prescription is flawed.

Politics There is no reason to assume a fixed emissions cap schedule would be easier to sell to the public than a carbon tax. Caps would produce greater certainty of longer-term emission reductions at the cost of uncertain economic consequences.

Economics (i): Efficiency Imposing emissions caps without allowing trading through brokers would be very inefficient. It is not clear whether Allen supports or opposes trading.

Economics (ii): Innovation by fiat? Prescribing one form of technology as the principal solution is risky. Nobody can predict how technology will evolve and what problems may emerge in the future.

Economics (iii): The information conveyed by prices The cost of one technology should not be used as a basis for carbon pricing. There is a wide range of mitigation options, with highly variable prices, all with variable and uncertain potential to contribute to solutions. Experience in British Columbia shows that even a modest carbon tax can reduce emissions significantly without harming the economy.

Scaling it up to climate relevance Even promoters of aggressive deployment of carbon capture and storage (CCS) do not envision it as more than a partial contribution to mitigating climate change by 2050.

Timing and feasibility The mass of the CO2 to be sequestered is about double the mass of the fossil fuels themselves. To develop a new industry, from scratch, to capture, transport and dispose of these quantities will involve vast amounts of capital and many decades, even if it were technically possible.

Hazards The magnitude of the CO2 to be sequestered in the subsurface is such that environmental risks from leakage, aquifer contamination and induced earthquakes are likely to be much larger than those from the already contentious shale gas industry. Getting a public licence for CCS projects in inhabited areas is likely to be very difficult and time-consuming.

Summing up The climate crisis is so vast that we need to throw everything we have at it. Claiming that any single technology will solve the problem can lead to complacency that the fix is simple. It isn't.

Read more...

19 comments


Global Warming: Not Reversible, But Stoppable

Posted on 19 April 2013 by Andy Skuce &

Let's start with two skill-testing questions:

1. If we stop greenhouse gas emissions, won't the climate naturally go back to the way it was before?
2. Isn't there "warming in the pipeline" that will continue to heat up the planet no matter what we do?

The correct answer to both questions is "no".

Global warming is not reversible but it is stoppable.

Many people incorrectly assume that once we stop making greenhouse gas emissions, the CO2 will be drawn out of the air, the old equilibrium will be re-established and the climate of the planet will go back to the way it used to be; just like the way the acid rain problem was solved once scrubbers were put on smoke stacks, or the way lead pollution disappeared once we changed to unleaded gasoline. This misinterpretation can lead to complacency about the need to act now. In fact, global warming is, on human timescales, here forever. The truth is that the damage we have done—and continue to do—to the climate system cannot be undone.

The second question reveals a different kind of misunderstanding: many mistakenly believe that the climate system is going to send more warming our way no matter what we choose to do. Taken to an extreme, that viewpoint can lead to a fatalistic approach, in which efforts to mitigate climate change by cutting emissions are seen as futile: we should instead begin planning for adaptation or, worse, start deliberately intervening through geoengineering. But this is wrong. The inertia is not in the physics of the climate system, but rather in the human economy.

This is explained in a recent paper in Science Magazine (2013, paywalled but freely accessible here, scroll down to "Publications, 2013") by Damon Matthews and Susan Solomon: Irreversible Does Not Mean Unavoidable

Since the Industrial Revolution, CO2 from our burning of fossil fuels has been building up in the atmosphere. The concentration of CO2 is now approaching 400 parts per million (ppm), up from 280 ppm prior to 1800. If we were to stop all emissions immediately, the CO2 concentration would also start to decline immediately, with some of the gas continuing to be absorbed into the oceans and smaller amounts being taken up by carbon sinks on land. According to the models of the carbon cycle, the level of CO2 (the red line in Figure 1A) would have dropped to about 340 ppm by 2300, approximately the same level as it was in 1980. In the next 300 years, therefore, nature will have recouped the last 30 years of our emissions.
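A toy calculation shows the shape of this decline. This is emphatically not the carbon-cycle model used by Matthews and Solomon; the two parameters below were simply chosen so that the curve roughly reproduces the ~340 ppm figure quoted above.

```python
# Toy two-parameter sketch of how the excess CO2 above pre-industrial might relax
# after emissions stop. Parameters are assumptions tuned to match the ~340 ppm figure,
# not values taken from the paper.
import numpy as np

excess_2010 = 400 - 280        # ppm above the pre-industrial 280 ppm
persistent_fraction = 0.4      # share of the excess that lingers for millennia (assumption)
tau_years = 150.0              # e-folding time for the rest of the excess (assumption)

t = np.arange(0, 291)          # years from 2010 to 2300
excess = excess_2010 * (persistent_fraction +
                        (1 - persistent_fraction) * np.exp(-t / tau_years))
print(f"CO2 in 2300 ~ {280 + excess[-1]:.0f} ppm")   # close to 340 ppm with these toy numbers
```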

 

Figure 1 CO2 concentrations (A); CO2 emissions (B) ; and temperature change (C). There are two scenarios: zero emissions after 2010 (red) and reduced emissions producing constant concentrations (blue). From a presentation by Damon Matthews, via Serendipity

Read more...

116 comments


Living in Denial in Canada

Posted on 1 March 2013 by Andy Skuce &

In an earlier article, I reviewed sociologist Kari Norgaard’s book Living in Denial: Climate Change, Emotions and Everyday Life in which she records the response of rural Norwegians to climate change. She analyzes the contradictory feelings Norwegians experience in reconciling their life in a wealthy country that is at once a major producer and consumer of fossil fuels and, at the same time, has a reputation of being a world leader in its concern for the environment, human development, and international peace.

Canada shares many characteristics with Norway; they are both northern lands that distinguish themselves from larger southern neighbours by their cold climates and progressive social policies. Both countries are wealthy, thanks in large part to exploitation of their abundant natural resources. In this article, I will try to look at Canada through the same lens that Norgaard used in her study of Norway. Because I am not aware of any kind of field study in Canada similar to the kind that Norgaard did in Norway, I will rely on how socially organised denial expresses itself through Canadian political discourse on climate change.

According to polling by Environics, a majority of Canadians in 2012 (57%) are convinced that the science is conclusive that global warming is happening and is caused mostly by humans. Only 12% believe that the science of global warming is not yet conclusive. Majorities are also in favour of carbon taxes in most areas of the country. It is also worth noting that none of Canada's political leaders take a denialist stance on climate change. In a speech in Berlin in 2007, Prime Minister Stephen Harper affirmed his government's commitment to "...the fight against climate change, [is] perhaps the biggest threat to confront the future of humanity today." Later in the same speech he acknowledged: "But frankly, up to now, our country has been engaged in a lot of "talking the talk" but not "walking the walk" when it has come to greenhouse gases". It is that disengagement between thought and action that is at the heart of implicatory denial.

Read more...

13 comments


Living in Denial in Norway

Posted on 28 February 2013 by Andy Skuce &

Norway is one of the wealthiest countries on Earth, with the very highest levels of human development. It is among the most generous donors of foreign aid and, for a country of its size, makes enormous efforts to promote peace. A former Prime Minister of Norway, Gro Harlem Brundtland, has done as much as anyone to promote global sustainable development and public health. The world would surely be a better place if everyone on Earth behaved like Norwegians.

Norway, on the other hand, is also the largest per capita oil producer outside of the Middle East, producing more oil per capita even than Saudi Arabia, about 150 barrels per person per year from its fields in the North Sea. Five million Norwegians also emit 11 tonnes of greenhouse gasses each per year, a little higher than the European mean and twice as high as the global average. The world would surely become uninhabitable if everyone on Earth behaved like Norwegians.

Every other country and community has its own contradictions, of course.  Despite the fact that the majority of Americans and Canadians believe that climate change is a concern, no progress has been made at national levels to introduce the carbon pricing policies that institutions like the International Energy Agency and the World Bank believe to be an essential step in reducing emissions. And neither, generally speaking, have many people voluntarily made the lifestyle changes—like giving up non-essential air travel—that are necessary if we are to achieve a low-carbon future. Concern about climate change is broad, but often shallow. We mostly carry on in our daily lives as if climate change was not happening.

Among the majority of us who recognize the threat of climate change, there’s clearly a disconnection between thought and action. We know that things have to change, but we have a lot of reasons why change is not up to us, or why now is not the time, or why our inaction is somebody else’s fault. The exact reasons will vary from person to person and from country to country, but they all serve the same purpose, to help ease the anxiety and helplessness we all feel, while doing nothing substantial to alleviate the problem. Kari Norgaard, an American and a sociology professor at the University of Oregon, calls these responses “socially organized denial” and has written several articles on the subject (available here) and a book, Living in Denial: Climate Change, Emotions and Everyday Life (MIT Press, 2011) based on sociological field work that she conducted in rural Norway. 

Read more...

23 comments


Subcap Methane Feedbacks. Part 4: Speculations

Posted on 23 January 2013 by Andy Skuce &

Previous articles in this series have reviewed recent research on methane sources from beneath permafrost and ice sheets. Part 1 looked at subcap fossil methane seeps in Alaska; Part 2 provided a perspective for the size of these seeps in relation to other natural and human sources; and Part 3 looked at potential methane sources resulting from the withdrawal of glaciers and ice sheets. In this final section, I will try to make estimates of what subcap methane emissions may mean for future climate change, more as a speculative basis for discussion than as an authoritative prediction. First, though, I will argue for a role for subcap methane emissions on the East Siberian Arctic Shelf (ESAS).

Subcap methane on the East Siberian Arctic Shelf?

In their 2005 paper Indications for an active petroleum system in the Laptev Sea, NE Siberia, Cramer and Franke (CF5) presented seismic data and analyses of gas samples showing that there are deep sedimentary basins in this part of the Arctic Ocean that are actively seeping fossil methane. They conclude:

• Seepages of thermal gas from the sedimentary column into the sea water were detected at two locations at the northern margin of the shelf, indicating the rather uneven distribution of petroleum seeps in the Laptev Sea.

• This uneven distribution probably reflects the occurrence of sub-sea permafrost. The south and the centre of the Laptev Sea are underlain by a permafrost layer several hundred metres thick, generally preventing petroleum from seeping into the water. In the north, the permafrost is absent, allowing the migration of hydrocarbons into the water column.

 

Figure 1. Seismic reflection section from the Laptev Sea (location shown by green star on the map). The interpretation shows faulted and tilted sedimentary rocks, a weak reflection at ~0.5 seconds that may mark the base of the permafrost, and indications of what may be gas migration chimneys. The section comes from Cramer and Franke, 2005 (their Figure 3); and the map is adapted slightly from Franke and Hinz, 2009, (their Figure 3). The map's contours and colours represent two-way time to basement rocks; this is the time taken by an acoustic wave to get from the surface to a reflector and back, and is proportional to depth for a given velocity.
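For readers unfamiliar with seismic sections, converting the two-way times on the map to rough depths is straightforward; the velocity used below is an illustrative value, since the real velocity varies with the rock.

```python
# Two-way travel time to approximate depth. The 2000 m/s velocity is illustrative only.
def twt_to_depth(twt_seconds: float, velocity_m_per_s: float = 2000.0) -> float:
    """Depth = velocity x one-way time, and one-way time is half the two-way time."""
    return velocity_m_per_s * twt_seconds / 2.0

print(twt_to_depth(0.5))   # the ~0.5 s reflector corresponds to roughly 500 m at 2000 m/s
```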

Read more...

6 comments


Subcap Methane Feedbacks. Part 3: Methane from beneath the ice

Posted on 21 December 2012 by Andy Skuce &

The previous two parts (one here and two here) of this series examined the evidence for seeps of geological methane through the cap provided by permafrost and looked at estimates of the magnitude of such seeps in relation to other sources of atmospheric methane. In this section, we will look at the emerging evidence for actual and potential methane releases related to glaciers and ice sheets. The sources of this methane are varied: fossil methane released through reactivated fractures; organic matter once buried and now exhumed; and possible methane hydrates lurking under the big ice sheets.

 The Southern Alaska subcap seeps

There is a cluster of subcap seeps in southern Alaska in areas where there is only sporadic permafrost. This is an area that was covered with a major ice sheet during the last ice age. Today, it is a region where nearby modern glaciers are rapidly receding. These seeps are underlain by a sedimentary basin that contains coal beds and an active petroleum system. Katey Walter Anthony and her colleagues (WA12) showed that major seeps in this part of Alaska are within 10 km of the nearest glacier and close to large, previously mapped faults. Seeps of gas and oil are common above active petroleum systems and are often used in hydrocarbon exploration to highlight attractive areas to explore.

Figure 1. Map and satellite image showing the location of the Eyak Lake and Katalla Bay seeps in South Central Alaska. Source: Google Maps.

Read more...

2 comments


Subcap Methane Feedbacks. Part 2: Quantifying fossil methane seepage in Alaska and the Arctic

Posted on 8 December 2012 by Andy Skuce &

The previous article in this series looked at the recent discovery of significant releases of fossil methane through the thawing permafrost in Alaska. In this second instalment we will look at the potential of the rest of the Arctic to produce subcap methane, and will compare the size of these seeps to other global methane-producing mechanisms. 

Pan-Arctic extrapolation

Much of the Arctic is underlain by sedimentary basins (Figure 1) that have potential to generate fossil methane gas. Onshore areas that are both gas- and permafrost-prone include Northern Alaska, Canada (Mackenzie Delta and Arctic Islands), Svalbard and Western Siberia.

In the Mackenzie Delta of Arctic Canada, for example, large gas seeps have been observed in small lakes in the delta close to large discovered gas fields. These seeps have isotope and chemistry signatures similar to local deep gas fields and to the Mallik gas hydrate accumulation, which contain fossil, thermogenic methane. Permafrost is up to 500 metres thick in this area but oil industry seismic reflection data reveal the presence of what may be gas conduits through the permafrost, from the deeper fossil gas accumulations (personal communication from the Geological Survey of Canada). It seems highly probable that similar situations will exist elsewhere in the Arctic, especially, in Siberia. Cramer and Franke (2006) show examples of gas conduits, some of them through permafrost, in the Laptev Sea (see also part four of this series, for more discussion).

 

 

Read more...

15 comments


Subcap Methane Feedbacks, Part 1: Fossil methane seepage in Alaska

Posted on 28 November 2012 by Andy Skuce &

As permafrost thaws, methane is released as the vegetable matter in the soils decomposes. This methane bubbles to the surface in lakes and ponds and accumulates under the ice in the wintertime. New research has shown that the most vigorous methane seeps in Alaska are fed also by methane emitted by thermal decomposition of organic matter in deeper and much older sediments. Continuous permafrost acts as a top seal to this fossil methane, preventing it from reaching the surface and, as global warming melts and perforates this cap, we can expect the pent-up gas to be released more quickly. This source of methane, released from traps under the permafrost, is a potential third source of methane feedback in the Arctic, in addition to permafrost soils and methane hydrates. 

One of the big unknowns in forecasting the course of climate change is anticipating how the Earth’s carbon cycle will respond to the coming man-made increase in temperature. The carbon cycle describes the many processes through which carbon flows between stocks of the element in the atmosphere, the oceans, the rocks, the soil, and in plants and animals. The Earth's most sensitive place to changes in the carbon cycle is the Arctic. Not only is this the region with the fastest changing climate and some of the largest stores of carbon, but even small temperature changes there will produce large effects as ice and soils that have been frozen for thousands of years begin to thaw.

Figure 1. Schematic diagram of the many ways that greenhouse gasses can be released as a warming climate in the Arctic leads to collapse of the cryosphere. Most of these will be covered in this series of articles. Graphic by John Garrett.

Read more...

25 comments


Book review: Rising Sea Levels: An Introduction to Cause and Impact by Hunt Janin and Scott Mandia

Posted on 7 November 2012 by Andy Skuce &

"My other piece of advice, Copperfield," said Mr. Micawber, "you know. Sea-wall height twenty feet, maximum storm surge nineteen feet six inches, result happiness. Sea-wall height twenty feet, maximum storm surge twenty feet six inches, result misery. The blossom is blighted, the leaf is withered, the god of day goes down upon the dreary scene, and — and in short you are for ever flooded."

With apologies to Charles Dickens for the paraphrasing.

Wilkins Micawber knew from his own experience that a small but persistent excess of spending over income eventually leads to disaster; in his case, debtors' prison. Similarly, a small and sustained rise in sea level—once it is combined with unusual weather and high tides—can push ocean waters, quite literally, over a tipping point; as the people in New York and New Jersey, in the wake of Hurricane Sandy, have just witnessed.

Michael Mann has remarked that sea levels around New York are about a foot higher than they were a century ago. One foot may seem small compared to everyday waves and tides, but as we have seen, this sustained change to the baseline can make all the difference between a bad storm surge and a disastrous one. And, of course, the effect of an extra foot of seawater, compared to no water at all, is a very big deal indeed when that water is lying on a farmer’s field, your living room floor or an airport runway. The best estimates of future sea levels predict several additional feet of sea-level rise in New York over the next few decades: the recent flooding in the US north-east is just a taste of things to come.

New York's La Guardia Airport, October 29, 2012. Source 

Read more...

10 comments


Big Oil and the Demise of Crude Climate Change Denial

Posted on 26 October 2012 by Andy Skuce &

From 1989 to 2002, several large US companies, including the oil companies Exxon and the US subsidiaries of Shell and BP, sponsored a lobbying organisation called the Global Climate Coalition (GCC), to counter the strengthening consensus that human carbon dioxide emissions posed a serious threat to the Earth’s climate. As has been documented by Hoggan and Littlemore and Oreskes and Conway, the GCC and its fellow travellers took a leaf out of the tobacco industry’s playbook and attempted to counter the message of peer-reviewed science by deliberately sowing doubt through emphasizing uncertainties and unknowns. The climate scientist Benjamin Santer accused the GCC of deliberately suppressing scientific information that supported the IPCC consensus.

In the late 1990s, the oil industry’s response to the climate question started to change, when BP and Shell decided to abandon the GCC and instead embrace the scientific consensus. According to the account by ex-BP geologist Bryan Lovell in his book Challenged by Carbon, BP’s change of mind was triggered by a memo sent in 1997 by then Chief Geologist David Jenkins to the managing directors of the company, which maintained that it was time for BP to be prepared to respond to the climate crisis in a constructive manner. Jenkins’ argument—which is detailed in Lovell’s book—was framed to appeal to BP’s corporate self-interest, pointing to future opportunities to employ the company’s subsurface expertise for carbon sequestration, and also to shift the climate mitigation focus away from the oil industry onto the coal-fired power sector. (Experienced corporate insiders know that appeals to ethics gain less traction than appeals to self-interest.) Jenkins’ recommendation was well received and over the following years the company changed its name to “BP plc”, adopted a new sunflower logo (at a cost of $211 million) and took on the slogan “Beyond Petroleum”. In 2002, the chief executive of BP, Lord John Browne, gave a speech in which he said:

Read more...

9 comments


Modelling the permafrost carbon feedback

Posted on 4 October 2012 by Andy Skuce &

A recent modelling experiment shows that climate change feedbacks from thawing permafrost are likely to increase global temperatures by one-quarter to a full degree Celsius by the end of this century. This extra warming will be in addition to the increase in temperature caused directly by emissions from fossil fuels.  Even in the unlikely event that we were to stop all emissions in the near future, this permafrost climate feedback would likely continue as a self-sustaining process, cancelling out any future natural draw-down in atmospheric carbon dioxide levels by the oceans or vegetation. Avoiding dangerous climate change by reducing fossil-fuel emissions becomes more difficult once permafrost emissions are properly considered. 

Many papers have looked at the expected contribution of thawing permafrost to climate change. For example, Schaeffer et al. (2011) and Schuur and Abbott (2011) have both published estimates of the effect that the thawing and decomposition of organic matter in Arctic soils will have on future climates. Aspects that these studies neglected were the feedback that the permafrost carbon release would have on causing further permafrost degradation and the varying response that the carbon release would have on the climate in different emission scenarios and for a range of climate sensitivities.

To explore this matter further, a recent paper in Nature Geoscience (paywalled) by Andrew MacDougall, Christopher Avis and Andrew Weaver couples together climate and carbon-cycle models. Using the University of Victoria Earth System Climate Model adapted to include a permafrost response module, the researchers calculated the contribution to climate warming of thawing permafrost over a range of varying parameters.

Figure 1. Taken from MacDougall et al. (2012), showing the additional warming induced by permafrost thawing for four diagnosed emissions pathways (DEP, see text below for explanation). The coloured areas show the likely ranges of additional temperature and the black lines show the median responses. The uncertainty within each DEP run results from uncertainties in the density of carbon in the permafrost and the climate sensitivity (the temperature effect of a given rise in carbon dioxide concentration in the air). Figure with original caption here.

Read more...

39 comments


Rally for Canadian Science in Victoria, BC

Posted on 28 September 2012 by Andy Skuce &

Skeptical Science readers may already be familiar with the dismal performance of the Canadian Federal Government on climate change. The Canadian contributors to Skeptical Science expressed our concerns about the erosion of our country's science for political ends in a blog post here in March of this year: PMO Pest Control: Scientists. We have also run a number of posts on the rapid development of the oil sands, for example Tar Sands Oil - An Environmental Disaster and Alberta’s bitumen sands: “negligible” climate effects, or the “biggest carbon bomb on the planet”?. This summer, Canadian scientists have been taking their protest to the street and last week there was a rally in Victoria, British Columbia.

In an event organized by Ken Wu, Canadian scientists and concerned citizens rallied outside a Federal Government building in Victoria on Friday September 14th in protest against the Federal government's policies that have been cutting science budgets, shutting down vital projects (e.g., PEARL, ELA) and muzzling government scientists. People jammed the sidewalks in downtown Victoria to hear speeches by climate scientist Andrew Weaver, Canadian Green Party leader Elizabeth May and "Dr X", a marine biologist working for the Department of Fisheries and Oceans who appeared in disguise for fear of losing his job.

Green Party Leader Elizabeth May and marine biologist "Dr X" (hiding behind a false moustache) speaking at the Victoria rally. (All photographs by the author).

Read more...

30 comments


The Continuing Denial of the Scientific Consensus on Climate Change

Posted on 16 August 2012 by Andy Skuce &

One of the perennial Skeptical Science top ten climate myths is “There is no consensus” (currently at number 4 in popularity). Consensus means the elements of knowledge that research scientists tend not to discuss or actively investigate any more. Consensus is the stuff that fills textbooks and is the established knowledge that teachers try to cram into high school and undergraduate students’ heads. It doesn’t mean an impregnable bastion of knowledge—there are many well-known examples of consensus-changing revolutions in the history of science—and even school textbooks have to get updated every now and then.

Consensus doesn’t mean unanimity, either. There is always a minority of gadfly scientists who decide to take on the consensus: scientists who challenge the biotic origin of oil or medical researchers who doubt HIV as a cause of AIDS. In such cases, the contrarian scientists don’t typically deny the existence of the consensus; they just think that the content of it is wrong.

Nor does consensus mean that everybody is happy with every single element that others believe to be settled. Consensus in any field has a hard core but fuzzy edges.

There have been a few studies that have attempted to measure the degree of scientific consensus on climate change. Naomi Oreskes in 2004; Doran and Zimmerman in 2009; Anderegg et al in 2010; and the Vision Prize in 2012. All found evidence for a very strong consensus among climate scientists for the idea that recent climate change can mostly be attributed to human activities (see the recently updated rebuttal written by Dana Nuccitelli for details).  Most of the world’s scientific academies have made explicit affirmations of the consensus on climate, along with numerous scientific associations.  The IPCC reports are a major effort to define the extent of general agreement and to identify the areas of remaining uncertainty.

Read more...

24 comments


Scientific literacy and polarization on climate change

Posted on 15 June 2012 by Andy Skuce &

It is not news that people are polarized over their assessment of the risks posed by climate change. But is it true that the most polarized people are those who are more scientifically literate? Counter-intuitive though it may seem, the answer is: Yes, it is. This is the result of a recent article by Dan Kahan and six colleagues in Nature Climate Change (henceforth, the Kahan Study).  This study has received a lot of attention, with blog articles, for example in The Economist, Mother Jones and by David Roberts at Grist.

At Skeptical Science, our goal is to debunk false arguments and explain the science behind climate change. In the light of this peer-reviewed research, we have to ask ourselves: if we are striving to increase scientific literacy, won’t we just be making the polarization that exists around climate change worse?  We will come back to that question at the end of this piece, but first, we’ll look in some detail at the Kahan Study itself.

Testing two hypotheses

Kahan et al identified two contrasting hypotheses that seek to explain the polarization in the public’s appreciation of the risks posed by climate change. (Note that the Kahan Study did not look at the public’s perception of the truth or reliability of climate science but, rather, the public’s assessment of the risks that climate change poses.) These hypotheses are:

Read more...

33 comments


Alberta’s bitumen sands: “negligible” climate effects, or the “biggest carbon bomb on the planet”?

Posted on 28 April 2012 by Andy Skuce &

*The climate effects of bitumen development are significant once viewed in the perspective of probable emissions over the rest of this century.

*The climate impact of coal consumption is greater than that of bitumen, particularly when non-mineable coal is considered.

*Accelerated expansion of bitumen extraction will make climate mitigation efforts much more difficult.

*Because of its high carbon emissions and high extraction costs, further bitumen development would not be viable if stringent global emissions policies were adopted.

The accelerating development of the huge bitumen* resources in Alberta has produced a great deal of recent public interest, due mainly to controversial proposals to build two big new pipelines: one connecting Alberta to the US Gulf Coast (Keystone XL) and the other to the Pacific coast of British Columbia (Northern Gateway). Much of the discussion has revolved around the dangers of leaks from the pipelines themselves and, in the case of the Northern Gateway proposal, the risks of tanker accidents in the narrow fjords of BC’s pristine northern coast and the turbulent Hecate Strait. Recent publications have also drawn attention to the massive damage to peatlands caused by bitumen mines and to the pollution of the Athabasca River. However, for the purposes of this article, I will focus only on the effect of bitumen sand exploitation on climate change.

The general topic was discussed previously at Skeptical Science in Tar Sands Impact on Climate Change.

Read more...

31 comments


DeConto et al: Thawing permafrost drove the PETM extreme heat event

Posted on 11 April 2012 by Andy Skuce &

Sudden spikes in global temperatures that occurred 50-55 million years ago were caused by thawing of permafrost in Antarctica and northern high latitudes, according to recent research. The trigger for this sudden destabilization was a variation in orbital configurations that resulted in warmer polar summers. This model also provides an analogue for the releases of carbon from modern permafrost caused by current man-made global warming. Modern permafrost volumes are smaller than the estimates for those of 55 million years ago, but will nevertheless amplify the climatic effect of fossil fuel consumption and will provide continuing warming feedbacks for centuries after human emissions cease.

The Paleocene-Eocene hyperthermal events

The Paleocene-Eocene Thermal Maximum (PETM) is an extreme global warming event or hyperthermal that occurred 55.5 million years ago when a sudden “burp” of a huge quantity of carbon was emitted into the atmosphere, causing global temperatures to rise by five degrees Celsius and the oceans to become more acidic. Because of the rapidity of the carbon burp, the event has often been considered an analogue of what might happen as humans release a comparable slug of CO2 into the atmosphere. The PETM was discovered in 1991 by Kennett and Stott and since then over 400 scientific papers have been written on the subject. There is a good review of the PETM by McInerney and Wing (2011). See also Rob Painting's post CO2 Currently Rising Faster Than The PETM Extinction Event and the comments in the discussion that follows it.

There are uncertainties about where the carbon in the PETM came from and what triggered its sudden release. One of the leading hypotheses (e.g. Dickens, 2003) has been that the carbon was released from methane hydrates, ice-like accumulations of methane and water that were present in sediments below the deep ocean floor. According to this hypothesis, the hydrates became destabilized as sea water temperatures gradually rose. The release of the hydrates into the atmosphere increased the greenhouse effect, which led to more hydrate destabilization. See also Lunt et al (2011).

Read more...

10 comments


Eocene Park: our experiment to recreate the atmosphere of an ancient hothouse climate

Posted on 6 April 2012 by Andy Skuce &

Fifty million years ago, during the Eocene Epoch, the world had a very different climate, with temperatures much higher than today's, especially at the poles. This hothouse climate was caused mainly by CO2 levels that were twice as high as now, or more. On our current emissions trajectory, we could recreate the chemistry of the hothouse atmosphere before the end of this century, with potentially drastic consequences for our climate.

The Hothouse

Imagine a world where crocodiles swim in an Arctic ocean among blooms of freshwater ferns, and where palm trees grow in Alaska and the high valleys of the Rocky Mountains. In this world, a great forest covers the continent of Antarctica and there are no large ice sheets. Sea levels are 50 metres or more higher than today’s. This world was the hothouse Earth in the time of the Eocene Epoch, 56 to 34 million years ago.  Estimates of the concentration of carbon dioxide in the Eocene air show that they were two times or more higher than today. Such a CO2-rich atmosphere has not been seen since the Eocene, but it is quite possible that we will create an atmosphere like it again soon, perhaps later in this century.

The frost-intolerant palm trees that grew in Montana show that Eocene winter temperatures, even at high elevations in mid-latitude continental interiors, must have stayed above zero most of the time. The variation of average temperature with latitude—the poles in the Eocene were 30°C warmer than today but the tropics were only a few degrees warmer—is another characteristic of the Eocene sauna. Trying to model this climate presents a challenge known as The Equable Climate Problem. Higher CO2 levels correctly predict a much warmer world, but there are some mismatches in the regional predictions. In essence, the difficulty is that if the boundary conditions in the climate models are set to values that produce mild winters in continental interiors and high latitudes, the modelled tropics turn out to be hotter than our best proxies show. However, the discovery of a one-ton fossil snake in Colombia suggests that Eocene and Paleocene tropical temperatures may have been underestimated. Many innovative solutions (see page 243 of Huber and Caballero (2011) for a referenced list) have been proposed to resolve the Equable Climate problem but there is, as yet, no consensus on a resolution.

The Eocene hothouse and the elevated CO2 levels came to an end in the Oligocene Epoch and the Earth’s climate changed into its current icehouse state.

Read more...

34 comments


Changing Climates, Changing Minds: The Great Stink of London

Posted on 11 March 2012 by Andy Skuce &

Effective action for solving Victorian London's sewage crisis was put off for decades, due to chaotic governance, concerns about financing, the interference of vested interests and the complacency and inertia of central government. Once the ill effects appeared underneath the politicians’ noses, a lasting solution was quickly deployed. The modern challenge of finding the political will to deal with climate change is analogous, although there are additional factors that make fixing the climate problem much more difficult.

Richard Alley, in his recent book Earth, the Operator’s Manual, devotes part of Chapter 16, Toilets and the Smart Grid, to a comparison between the infrastructure challenges caused by the sewage problems of Victorian Edinburgh and London and the obstacles we face today in acting to solve the climate crisis. In this essay, I cover some of the same ground as Alley, but use as my main reference the book The Great Stink of London: Sir Joseph Bazalgette and the Cleansing of the Victorian Capital (GSOL), written by Stephen Halliday in 1999. Here's a link to a pdf of pages 58-76 of the book.

London’s sanitation before the Victorians

In medieval London, human waste was deposited into household cesspools, some of which spilled into streams and ditches. By the mid-fourteenth century, pollution from sewage was already becoming a serious problem, and at the end of that century—during the time that Richard Whittington was Lord Mayor of London—laws were proclaimed that said:

Read more...

22 comments


Changing Climates, Changing Minds: The Personal

Posted on 11 March 2012 by Andy Skuce &

Nobody comes into this world with a fully-formed opinion on anthropogenic climate change. As we learn about it, we change our minds. Sometimes, changing your mind can be easy and quick; sometimes it’s hard and slow. This is an anecdotal and subjective account of the author’s changes of mind.

A goal of Skeptical Science is to change people’s minds, especially the minds of people who doubt the reality of man-made climate change. John Cook and Stephan Lewandowsky’s The Debunking Handbook provides a how-to and how-not-to resource for debunking myths and misinformation: a guide to changing people’s minds, based on published research in psychology.

 In this article I’ll examine my own history and sketch out how I twice changed my mind on climate change; I’ll speculate on why one step was easy yet memorable while the other was hard work but forgettable.

Read more...

45 comments


Michael Mann, hounded researcher

Posted on 30 December 2011 by Andy Skuce &

Here is a translation of a recent article (December 25th, 2011) in the French newspaper Le Monde by science journalist Stéphane Foucart. He reports on a talk that Michael Mann gave at the 2011 AGU Fall Meeting in San Francisco, in which Mann introduces his forthcoming book The Hockey Stick and the Climate Wars: Dispatches from the Front Lines. Foucart interviews Mann and discusses the background of the Hockey Stick and Climategate controversies. What is refreshing is the absence of the false-balance, both-sides-of-the-story style of reporting that is found so often in English-language newspapers.

Original article (in French) from Le Monde

In early December, at the Fall Meeting of the American Geophysical Union (the annual grand gathering of the bigwigs of the geoscience world), Michael Mann introduced his forthcoming book to his peers. The lecture was entertaining and the audience laughed heartily.  The American climatologist, Director of the Earth System Center at Pennsylvania State University, cracked numerous jokes and made many witty asides. He scoffed at the anti-science of the Republican politicians and mocked their ridiculous statements on climate change; everybody laughed out loud.

But this, surely, is no laughing matter. Michael Mann’s forthcoming book, The Hockey Stick and the Climate Wars: Dispatches from the Front Lines (Columbia University Press), is not really a science book; rather, as its title suggests, it deals instead with the war on climate science, which has at times turned into a manhunt, frequently with Mann as the quarry.

Read more...

62 comments


Changing the Direction of the Climate

Posted on 1 December 2011 by dana1981 & Andy Skuce

As Andy recently discussed, the International Energy Agency (IEA) has published the World Energy Outlook 2011 (WEO11), which incorporates the most recent data on global energy trends and policies, and investigates the economic and environmental consequences of three scenarios over the 2010 to 2035 time period:

  • The New Policies Scenario. This is used as a reference case. It assumes that governments will follow through on the (non-binding) pledges that they have made to reduce emissions and deploy renewable energy sources.

  • The 450 Scenario. This is an outcome-driven scenario, which lays out an energy pathway designed to limit the long-term concentration of greenhouse gases to 450 ppm CO2 equivalent. Achieving this would provide a 50% chance of limiting global temperature increases to 2°C. Some climatologists, such as James Hansen, argue that a more aggressive 350 ppm target is required to avoid the possibility of “seeding irreversible catastrophic effects”.

  • The Current Policies Scenario.  This projection models a future in which only those climate and energy policies actually adopted in mid-2011 are incorporated.

The IEA focuses on CO2 emissions from energy production, as we'll see below. Their 450 Scenario proposes to keep CO2-equivalent concentrations (including all atmospheric greenhouse gases) in the vicinity of 450 ppm by very quickly reducing both non-fossil-fuel CO2 emissions (e.g. from deforestation) and non-CO2 greenhouse gas emissions (e.g. methane), such that their emissions in 2020 are lower than today's. However, the main focus of the report is on fossil fuel CO2 emissions.

Read more...

37 comments


World Energy Outlook 2011: “The door to 2°C is closing”

Posted on 16 November 2011 by Andy Skuce

If we don’t change direction soon, we’ll end up where we’re heading

These words come from the Executive Summary of the World Energy Outlook 2011 (WEO11), just published by the International Energy Agency (IEA). The study incorporates the most recent data on global energy trends and policies, and investigates the economic and environmental consequences of three scenarios over the 2010 to 2035 time period. This is an important document that should be widely read but, unfortunately, the full report costs €120 for a single-user 650-page pdf.  Some key graphs and fact sheets are provided for free.

The WEO11 report is a commentary on the assumptions and output of the World Energy Model (WEM), in the IEA's words "a large-scale mathematical construct designed to replicate how energy markets function" and the principal tool used to generate detailed sector-by-sector and region-by-region projections for the various scenarios. A detailed description of the WEM is available here. The IEA updates its model and analysis every year, since important and unpredictable developments (a tsunami in Japan, the Arab Spring, new technologies in natural gas production) change the model's boundary conditions significantly. In 2012, we can look forward to: an election in which the world's largest economy may choose a government that denies the urgency and even the reality of climate change; further unrest in the Middle East; developments in renewable and fossil-fuel technology; and an unfolding economic crisis in Europe. And those are just some of the more foreseeable events that will make a new WEO study needed next year. In comparison, climate modellers have it easy.

In the 2011 Outlook, the IEA explored three scenarios.

  • The New Policies Scenario. This is used as a reference case. It assumes that governments will follow through on the (non-binding) pledges that they have made to reduce emissions and deploy renewable energy sources.

  • The 450 Scenario. This is an outcome-driven scenario, which lays out an energy pathway designed to limit the long-term concentration of greenhouse gases to 450 ppm CO2 equivalent. Achieving this would provide a 50% chance of limiting global temperature increases to 2°C. Some climatologists, such as James Hansen, argue that a more aggressive 350 ppm target is required to avoid the possibility of “seeding irreversible catastrophic effects”.

  • The Current Policies Scenario.  This projection models a future in which only those climate and energy policies actually adopted in mid-2011 are incorporated.

Read more...

75 comments


Berkeley Earth Surface Temperature Study: “The effect of urban heating on the global trends is nearly negligible”

Posted on 21 October 2011 by Andy Skuce

A paper submitted for peer review by the Berkeley Earth Surface Temperature study (BEST) finds that the influence of urban heating on global temperature trends is “nearly negligible”, and that what effect has been observed is actually slightly negative: temperature trends in urban areas are marginally cooler than the trends measured at rural sites. The paper also finds that the Earth's land surface has warmed by approximately 1°C on average since 1950.

The Urban Heat Island Effect

It has long been observed that temperatures in cities are higher than in the surrounding countryside, caused in part by human structures that reduce albedo and evapo-transpiration, as well as by waste heat emissions (McCarthy et al. 2010). Even though most (99%) of the Earth’s surface is not urbanized, some 27% of the Monthly Global Historical Climatology Network (GHCN-M) temperature stations are located in cities with populations of more than 50,000. Since urbanization has grown dramatically over the past few centuries, it seems reasonable to ask how much of the observed rise in global temperatures is due to urbanization. For example, McKitrick and Michaels claimed in 2007 that about half of the recent warming over land is due to the urban heat island effect, although this result was disputed by Schmidt in 2009.

Several studies have looked in depth at possible heat-island contamination of land temperature records. For example, Hansen et al. in 2010 used satellite measurements of night lights to remove possibly affected urban sites from the temperature records; allowing for urban effects reduced the global temperature trend over the period 1900-2009 by 0.01°C. Menne et al. performed an analysis of US surface temperature records and found only very small effects due to poor siting of temperature stations. They reported that poorly-sited stations showed a small positive bias in recorded minimum temperatures but a slightly larger negative bias in recorded maximum temperatures, resulting in a small residual cool bias overall.
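
At heart, these comparisons amount to fitting temperature trends to different groups of stations and differencing them. Here is a minimal toy sketch of that idea in Python; the anomaly series are made-up placeholders, not real station data, and the method is far simpler than what BEST, Hansen et al. or Menne et al. actually did.

```python
import numpy as np

# Toy illustration of an urban-vs-rural trend comparison.
# The anomaly series below are made-up placeholders, not real station data.
rng = np.random.default_rng(0)
years = np.arange(1950, 2011)

rural = 0.010 * (years - 1950) + rng.normal(0, 0.1, years.size)   # ~1.0 degC/century trend
urban = 0.011 * (years - 1950) + rng.normal(0, 0.1, years.size)   # assumed slightly warmer

rural_trend = np.polyfit(years, rural, 1)[0] * 100   # degC per century
urban_trend = np.polyfit(years, urban, 1)[0] * 100

print(f"Rural trend:       {rural_trend:+.2f} degC/century")
print(f"Urban trend:       {urban_trend:+.2f} degC/century")
print(f"Urban minus rural: {urban_trend - rural_trend:+.2f} degC/century")
```

A difference close to zero, as in the studies described above, would indicate that urban heating contributes little to the global land trend.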

Read more...

44 comments


Heat from the Earth’s interior does not control climate

Posted on 17 September 2011 by Andy Skuce

This blog post is the intermediate-level rebuttal to the climate myth “Underground temperatures control climate”. 

The myth:

"There are other possible causes for climate change which could be associated with solar activity or related to variations in the temperature of the liquid core of the Earth, which is about 5,400 degrees Celsius.  We don't need a high heat flow - just a high temperature for the core to affect the surface climate.  There is massive heat inside the Earth." Link. See here, also.

Consider:

  • The center of the Earth is at a temperature of over 6000°C, hotter than the surface of the Sun.
  • We have all seen pictures of rivers of red-hot magma pouring out of volcanoes.
  • Many of us have bathed in natural hot springs.
  • There are plans to exploit geothermal energy as a renewable resource.

Common sense might suggest that all that heat must have a big effect on climate. But the science says no: the amount of heat energy coming out of the Earth is actually very small and the rate of flow of that heat is very steady over long time periods. The effect on the climate is in fact too small to be worth considering.
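
The scale difference is easy to check with a back-of-the-envelope calculation. The sketch below uses round-number values commonly cited for the Earth's average geothermal heat flux and for absorbed sunlight; these figures are assumptions for illustration, not numbers taken from this post.

```python
import math

# Back-of-the-envelope comparison of geothermal heat flow with absorbed sunlight.
# Both flux values are round-number estimates used for illustration.
EARTH_RADIUS_M = 6.371e6                          # mean radius of the Earth
SURFACE_AREA_M2 = 4 * math.pi * EARTH_RADIUS_M**2

GEOTHERMAL_FLUX_W_M2 = 0.09                       # average heat flow from the interior
ABSORBED_SOLAR_W_M2 = 240                         # average absorbed solar radiation

total_geothermal_tw = GEOTHERMAL_FLUX_W_M2 * SURFACE_AREA_M2 / 1e12
ratio = ABSORBED_SOLAR_W_M2 / GEOTHERMAL_FLUX_W_M2

print(f"Total geothermal heat flow: about {total_geothermal_tw:.0f} TW")
print(f"Absorbed sunlight is roughly {ratio:.0f} times larger per square metre")
```

On these assumed values, the Sun delivers a few thousand times more energy to the surface than the Earth's interior does, which is why the internal heat flow barely registers in the planet's energy budget.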

The Earth’s heat flow

Read more...

71 comments


The Ridley Riddle Part Three: Like a Northern Rock

Posted on 12 August 2011 by Andy Skuce

This is the third and final article in the series on climate contrarian Matt Ridley. Here are Parts One and Two. 

This article will look at Matt Ridley's involvement in the collapse of the British bank Northern Rock in 2007. I am not the first to attempt to link this business disaster with his views on climate change; George Monbiot wrote an article, The Man Who Wants to Northern Rock the Planet, remarking, among other things, on the contradiction between Ridley's small-government libertarianism and his begging the Treasury for a bailout of his company.

The rise and fall of Northern Rock

Matt Ridley was the non-executive Chairman of Northern Rock, which in 2007 became the first British bank in over a century and a half to experience a run on its deposits. British banks had all survived two World Wars, the Great Depression, and the end of the British Empire, until Northern Rock failed. Ridley had served on the Northern Rock board of directors since 1994 and was appointed Chairman in 2004. A previous Chairman of the bank was his father, Viscount Matthew Ridley.

Northern Rock’s depositors responding rationally but not optimistically to market signals. Source

Northern Rock’s business model was a very aggressive one, centered on rapid growth of its mortgage business. Before 1997, Northern Rock was a building society, a co-operative savings and mortgage institution. Like many other British building societies, it transformed itself into a bank and was listed on the stock exchange. This led to rapid growth for Northern Rock, which grew its assets at an annual rate of more than 23% from 1998 to 2007. Before its crisis, Northern Rock had assets of about $200 billion and was the fifth-largest bank in Britain.  The bank’s retail deposits did not grow at the same rate as its mortgage assets; the difference was made up with funding from capital markets. When the credit crisis hit in 2007, Northern Rock saw its funding vanish. Northern Rock’s debts were more than fifty times its shareholder common equity, making the bank an outlier even among the many other highly-levered financial institutions at that time. This made the bank particularly vulnerable to changes in the credit markets. The bank was unable to pay its creditors and had to turn to the Bank of England for help in September 2007. These events led to panic among its depositors, who formed huge queues outside its branches to withdraw their savings.

Read more...

12 comments


The Ridley Riddle Part Two: The White Queen

Posted on 7 August 2011 by Andy Skuce

This is the second of three articles on the climate contrarian Matt Ridley. Part One is here.

In 2010, David MacKay wrote a letter to Matt Ridley in response to an op-ed article by Ridley published in The Times. MacKay's letter and Ridley's reply to it are both posted on Ridley's Rational Optimist blog. In this article, I am going to focus on Ridley's reply, not because it is a particularly interesting or original addition to the skeptical canon, but because I believe it is revealing about the mindset of a climate contrarian.

David MacKay is a physicist but not a climate scientist. He is the author of the book Sustainable energy - without the hot air, which examines the daunting challenge that the United Kingdom faces in decarbonizing its energy supply. It’s a must-read, entertainingly written, easy-to-understand work, and can be downloaded for free.  MacKay was appointed in 2009 to be Chief Scientist at Britain’s Department of Energy and Climate Change. His letter to Ridley raises points that will be familiar to regular readers of Skeptical Science; he cites recent evidence of climate change and discusses the analogue of the Paleocene-Eocene Thermal Maximum and the central question of the likely range of climate sensitivity. MacKay also mentions that his personal and professional contact with climate scientists bears no relation to the way they are often negatively depicted on contrarian blogs.

There are three main elements of Ridley’s reply that I want to focus on in this article but, first, I’ll simply list in point form some of the other arguments that comprised the rest of Ridley’s Gish Gallop, along with rebuttal links and brief comments.

Read more...

7 comments


The Ridley Riddle Part One: The Red Queen

Posted on 30 July 2011 by Andy Skuce

This is a three-part series on science writer, businessman and climate contrarian Matt Ridley. The first section looks at his science books and is critical of his latest book, The Rational Optimist; the second scrutinizes one of his blog posts on climate change and shows that his avowed lukewarmer stance is built on shaky scientific foundations; the final part examines Ridley’s history as a businessman, drawing parallels between his role in the credit crunch and his approach to climate change. 

Sometimes, it’s easy to dismiss climate change contrarians as being a little slow when it comes to truly understanding science, but it’s not possible to do that with Matt Ridley, who has a well-earned reputation as a first-class science writer. He is on the Academic Advisory Council of the contrarian Global Warming Policy Foundation, along with Robert Carter, William Happer, Richard Lindzen, and Ian Plimer. As well as being the author of several excellent science books, he’s a journalist and a businessman, and he holds a D.Phil. in Zoology from Oxford. He also runs a blog that, among other things, takes a skeptical stance on the mainstream science of climate change.

Why such a talented science writer should have come to reject the scientific consensus on climate change is the Ridley Riddle that this series of posts will attempt to answer.

Read more...

24 comments


Thinning on top and bulging at the waist: symptoms of an ailing planet

Posted on 14 July 2011 by Andy Skuce

Like many an aging baby boomer, the Earth is starting to bulge at the waist and is getting thinner on top. In the Earth’s case this isn’t due to a weakness for drinking beer or due to an inherited tendency towards male pattern baldness. Rather, it is because of climate change. As the big ice sheets in Greenland and Antarctica thaw, the melt water is distributed throughout the world’s oceans, causing mass to move away from the poles.

The Earth is not exactly round: it is slightly flattened at the poles and bulges at the Equator. The radius of the Earth measured from its center to sea level at the poles (or to where sea level would be at the South Pole if there were no continent there) is about 21 km shorter than the radius measured to sea level at the Equator. This means that the point on the Earth’s surface furthest from its center is not the summit of Everest but the top of the volcano Chimborazo in Ecuador: although Chimborazo’s summit is about 2,500 meters lower above sea level than Everest’s, it is much closer to the Equator and hence further from the center of the Earth. The reason the Earth is not quite round is mainly that it spins on its axis through the poles; the centrifugal forces acting on the rocks of the Earth, which are somewhat ductile, tend to push out the parts furthest from the axis.
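
That 21 km figure can be checked against a standard reference ellipsoid. A minimal sketch, using the WGS84 equatorial and polar radii (values assumed from that standard, not quoted in the post):

```python
# Difference between the Earth's equatorial and polar radii, using the WGS84
# reference ellipsoid (values from that standard, assumed for illustration).
equatorial_radius_km = 6378.137   # semi-major axis
polar_radius_km = 6356.752        # semi-minor axis

difference_km = equatorial_radius_km - polar_radius_km
flattening = difference_km / equatorial_radius_km

print(f"Equatorial minus polar radius: {difference_km:.1f} km")    # about 21.4 km
print(f"Flattening f = 1/{1 / flattening:.0f}")                    # about 1/298
```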

The amount of flattening (or oblateness) of the Earth can be represented by a gravitational parameter known as J2 (see here for a definition). This parameter has been measured since the mid-1970s by satellite laser ranging (SLR) and is found to vary from year to year due, among other things, to ENSO, the El Niño/La Niña oscillation in the Pacific. See Figure 1 below and here, also.

Figure 1. Measurements of changes in J2 (the Earth’s flatness or oblateness) from satellite laser ranging (SLR, not to be confused with sea-level rise), from Nerem and Wahr (2011). Note the declining trend in J2 from the mid-1970s until the mid-1990s.

Read more...

10 comments


Carbon Cycle Feedbacks

Posted on 24 February 2011 by Andy Skuce

The document entitled 'Carbon Dioxide and Earth's Future: Pursuing the Prudent Path', referenced in the "skeptic" scientist letter to US Congress, makes the claim that rising CO2 concentrations have "actually been good for the planet" because of the fertilization effect of CO2. Although it is true that there has been a measurable CO2 fertilization effect, particularly in the tropics (see this video seminar), this is only one factor that will influence the response of the global carbon cycle to climate change. It's instructive to look at some important factors that are not mentioned in The Prudent Path.

Read more...

9 comments


The Global Carbon Cycle by David Archer—a review

Posted on 15 February 2011 by Andy Skuce

"The climate system is an angry beast and we are poking it with sticks" - Wallace Broecker

The belly of Broecker’s climate beast is surely the global carbon cycle. Currently, the beast is digesting about half of our fossil-fuel CO2 emissions, acting as a major stabilizing influence on the climate, a negative feedback. But as Professor David Archer warns us in his latest book, The Global Carbon Cycle, the carbon cycle behaves unpredictably in different circumstances and over different timescales. For example, during the Paleocene-Eocene Thermal Maximum (PETM), it acted to bring things back to the then-hothouse normal after a massive belch of carbon into the atmosphere. At other times, it amplified small temperature perturbations into climate events that changed the face of the planet, as it did in the ice-age cycles, the period we are still in. Humans are administering a perhaps PETM-scale dose of carbon into the atmosphere during a time when the global carbon cycle has acted as an amplifier rather than a damper. Ominously, Archer tells us, the planet’s reservoirs of carbon in the form of peat, gas hydrates and rain-forest carbon are now probably as charged up as they can be. That's some beast; some stick.

Read more...

10 comments


The 2010 Amazon Drought

Posted on 6 February 2011 by Andy Skuce

A short paper by Simon Lewis, Paulo Brando and three co-authors, just published in Science Magazine, reports on the 2010 drought in the Amazon Basin. This drought occurred only a few years after the exceptional drought of 2005, which was supposed to have been a one-in-a-hundred-year event. The paper presents evidence that last year's drought was both more severe and more extensive than the earlier one.

Read more...

32 comments


The Hitchhiker’s Guide to the AGU Fall Meeting

Posted on 26 December 2010 by Andy Skuce

"Space," it says, "is big. Really big."
The Hitchhiker's Guide to the Galaxy, by Douglas Adams

If the AGU Fall Meeting had to be summed up in one word, that word would surely have to be big. There were over 19,000 people attending and many thousands of talks and poster presentations. The Scientific Program guide is 560 pages long with each page listing about thirty presentation titles. I doubt that anyone (except maybe some hapless AGU proof-reader) has been able to read this mostly verb-free text from beginning to end. Fortunately, the AGU provides an online scheduling tool that allows you to search the program for keywords and set up a personalized itinerary; but the process of presentation selection remains rather random. Everyone experiences a different meeting.

Read more...

12 comments


Measuring CO2 levels from the volcano at Mauna Loa

Posted on 25 October 2010 by Andy Skuce

The observatory near the summit of the Mauna Loa volcano in Hawaii has been recording the amount of carbon dioxide in the air since 1958. This is the longest continuous record of direct measurements of CO2, and it shows a steadily increasing trend from year to year, combined with a saw-tooth effect caused by seasonal changes in the rate of plant growth. This curve is commonly known as the Keeling Curve, named after Charles Keeling, the American scientist who started the project.
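
One way to see the two components of the curve, the long-term growth and the seasonal saw-tooth, is to fit a smooth trend plus an annual cycle to monthly data. The sketch below does this on a synthetic series; the numbers are illustrative assumptions only, not the observatory's actual record.

```python
import numpy as np

# Synthetic stand-in for a monthly CO2 record (illustrative values only,
# not the Mauna Loa data): a slowly accelerating trend plus an annual cycle.
rng = np.random.default_rng(0)
t = np.arange(0, 52, 1 / 12)                       # years since 1958, monthly steps
co2 = (315 + 0.8 * t + 0.012 * t**2
       + 3.0 * np.sin(2 * np.pi * t + 0.5)
       + rng.normal(0, 0.2, t.size))

# Least-squares fit: quadratic long-term trend plus one annual harmonic.
X = np.column_stack([np.ones_like(t), t, t**2,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coeffs, *_ = np.linalg.lstsq(X, co2, rcond=None)

trend = X[:, :3] @ coeffs[:3]       # the year-to-year growth
seasonal = X[:, 3:] @ coeffs[3:]    # the saw-tooth component
growth_now = coeffs[1] + 2 * coeffs[2] * t[-1]
print(f"Fitted growth rate near the end of the series: {growth_now:.2f} ppm/yr")
```

Subtracting the fitted seasonal component from the data leaves the underlying growth curve; subtracting the trend leaves the saw-tooth driven by seasonal plant growth.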

Read more...

28 comments


Comparing volcanic CO2 to human CO2

Posted on 27 August 2010 by Andy Skuce

The solid Earth contains a huge quantity of carbon, far more than scientists estimate is present in the atmosphere or oceans. As an important part of the global carbon cycle, some of this carbon is slowly released from the rocks in the form of carbon dioxide, through vents at volcanoes and hot springs. Published reviews of the scientific literature by Moerner and Etiope (2002) and Kerrick (2001) report a minimum-maximum range of emissions of 65 to 319 million tonnes of CO2 per year. Counter-claims that volcanoes, especially submarine volcanoes, produce vastly greater amounts of CO2 than these estimates are not supported by any papers published by the scientists who study the subject.
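
For scale, the comparison with human emissions comes down to simple arithmetic. The sketch below assumes a round-number figure of roughly 30 billion tonnes of CO2 per year for fossil-fuel emissions around 2010; that figure is an assumption for illustration, not taken from the post.

```python
# Compare the published volcanic CO2 range with human emissions (rough figures).
volcanic_low_mt = 65        # Mt CO2/yr, low end of the published range
volcanic_high_mt = 319      # Mt CO2/yr, high end of the published range
human_mt = 30_000           # ~30 Gt CO2/yr of fossil-fuel emissions (assumed round number)

low_factor = human_mt / volcanic_high_mt
high_factor = human_mt / volcanic_low_mt
print(f"Human emissions exceed volcanic emissions by a factor of "
      f"roughly {low_factor:.0f} to {high_factor:.0f}")
```

Even taking the high end of the published volcanic range, human emissions are on the order of a hundred times larger.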

Read more...

24 comments


