

Latest Posts


Explaining climate change science & rebutting global warming misinformation

Scientific skepticism is healthy. Scientists should always challenge themselves to improve their understanding. Yet this isn't what happens with climate change denial. Skeptics vigorously criticise any evidence that supports man-made global warming and yet embrace any argument, op-ed, blog or study that purports to refute global warming. This website gets skeptical about global warming skepticism. Do their arguments have any scientific basis? What does the peer reviewed scientific literature say?


2015 SkS Weekly News Roundup #35

Posted on 29 August 2015 by John Hartz

A chronological listing of the news articles posted on the Skeptical Science Facebook page during the past week. 




How to make sense of 'alarming' sea level forecasts

Posted on 28 August 2015 by Guest Author

This is a re-post from The Conversation by Andrew Glikson, Australian National University

You may have read recent reports about huge changes in sea level, inspired by new research from James Hansen, NASA’s former Chief Climate Scientist, at Columbia University. Sea level rise represents one of the most worrying aspects of global warming, potentially displacing millions of people along coasts, low river valleys, deltas and islands.

The Intergovernmental Panel on Climate Change, the UN’s scientific climate body, forecasts rises of approximately 40 to 60 cm by 2100. But other studies have found much greater rises are likely.

Hansen and 16 co-authors found that with warming of 2C, sea levels could rise by several metres. Hansen’s study was published in the open-access journal Atmospheric Chemistry and Physics Discussions and has not yet been peer-reviewed. It received much media coverage for its “alarmist” findings.

So how should we make sense of these dire forecasts?

What we’re pretty sure about

According to the IPCC, sea level rise has accelerated from 0.05 cm each year during 1700-1900 to 0.32 cm each year during 1993-2010. Over the next century the IPCC expects an average rise of 0.2 to 0.8 cm each year.

Observed and projected sea level rise. IPCC AR5
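As a quick back-of-the-envelope check, the rates quoted above translate into an acceleration factor and a cumulative rise like this (all figures are the ones cited in the text):

```python
# Sea level rates quoted above, in cm per year (as cited from the IPCC)
rate_1700_1900 = 0.05
rate_1993_2010 = 0.32
projected_range = (0.2, 0.8)   # expected average rise over the next century

# How much faster is the recent observed rate than the historical one?
print(round(rate_1993_2010 / rate_1700_1900, 1))   # 6.4

# Cumulative rise implied over 100 years, in cm
low, high = (round(r * 100) for r in projected_range)
print(low, high)   # 20 80
```

The implied 20-80 cm range brackets the approximately 40-60 cm central forecast mentioned earlier in the post.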



You can’t rush the oceans (why CO2 emission rates matter)

Posted on 27 August 2015 by howardlee

Four of the “big five” mass extinctions, and several more minor environmental crises in Earth’s past, were associated with abrupt global warming and ocean acidification triggered by Large Igneous Province (LIP) eruptions. The most recent of the “big five” - the end-Cretaceous - was complicated by a large asteroid impact, but at least the marine extinction was triggered by the Deccan Traps LIP, as explained in this post. Yet not all LIPs were associated with environmental catastrophes. Some, like the early Cretaceous Paraná-Etendeka LIP, seem to have had very little effect on global climate.

So what made some LIPs destructive and others not? And, you may be asking, what possible relevance do these ancient events have for us today?

The answer is that they confirm what scientists have modeled from ocean chemistry: CO2 emission rates mattered back then just as they do now. In fact they are key.

Why are the oceans so important in climate change?

I am focusing on the oceans because the oceans are a far, far larger reservoir of carbon than either the terrestrial biosphere or the atmosphere. The atmosphere today contains about 830 billion tonnes of carbon (GtC) (up from about 600 GtC before the industrial revolution), the terrestrial biosphere contains about 560 Gt, the surface layer of the ocean about 900 Gt and the deep ocean about 37,100 Gt. Humans have emitted about 600 Gt since the industrial revolution, and if we continue business-as-usual we might end up emitting as much as 1700 Gt by the end of the century, more than the terrestrial biosphere and surface ocean inventories combined. The oceans currently absorb about 30% of human CO2 emissions and 90% of the excess heat generated by modern climate change, but as terrestrial “sinks” for our CO2 emissions dwindle and even switch to becoming sources of CO2 this century (see this post), the oceans are fast becoming the main sink for absorbing our CO2 emissions.
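Those inventory figures can be tallied directly to check the claim that business-as-usual emissions would exceed the terrestrial biosphere and surface ocean inventories combined (all numbers in billions of tonnes of carbon, GtC, as quoted above):

```python
# Carbon inventories quoted above, in GtC
reservoirs = {
    "atmosphere": 830,
    "terrestrial biosphere": 560,
    "surface ocean": 900,
    "deep ocean": 37_100,
}
emitted_so_far = 600        # human emissions since the industrial revolution
business_as_usual = 1700    # possible cumulative emissions by 2100

# Terrestrial biosphere + surface ocean, for comparison
combined = reservoirs["terrestrial biosphere"] + reservoirs["surface ocean"]
print(combined)                       # 1460
print(business_as_usual > combined)   # True
```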

Oceans are dynamic, allowing them to change in response to alterations in climate, but there are limits to that dynamism.

You can’t rush the oceans

To understand why CO2 emission rates matter, you have to wet your feet in some basic ocean chemistry.

Rates of main ocean carbonate processes

Figure 1: Rates of key aspects of ocean carbon chemistry in a time of rising atmospheric CO2 levels.

In a time of rising CO2 levels in the atmosphere, the surface layer of the ocean absorbs most (but not all) of that excess CO2 to regain balance with the atmospheric levels (Henry’s Law). Once dissolved in seawater, that CO2 reacts with water to make carbonic acid, a process that takes about a minute to achieve equilibrium, and then the fun starts.
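The Henry's Law step can be sketched numerically: the amount of CO2 dissolved at the ocean surface scales with the partial pressure of CO2 in the overlying air. The solubility constant and CO2 partial pressures below are illustrative round figures (not values from this post), so treat this as a sketch of the proportionality only:

```python
# Henry's Law sketch: dissolved CO2 concentration = k_H * partial pressure.
# k_H here is an assumed, illustrative solubility constant for CO2 in
# water near 25 C; real seawater values vary with temperature and salinity.
k_H = 0.034          # mol / (L * atm), assumed
p_co2_pre = 280e-6   # rough preindustrial CO2 partial pressure (atm)
p_co2_now = 400e-6   # rough modern value (atm)

dissolved_pre = k_H * p_co2_pre   # mol/L
dissolved_now = k_H * p_co2_now   # mol/L

# Dissolved CO2 rises in proportion to the atmospheric increase (~1.43x)
print(round(dissolved_now / dissolved_pre, 2))
```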



Tracking the 2C Limit - July 2015

Posted on 26 August 2015 by Rob Honeycutt

Following my last post on The 1C Milestone I've decided to build my own chart to track our progress relative to the 2C Limit. This will be a monthly post that I make here on Skeptical Science once the GISS monthly data come out. 

In conjunction with these updates I'll try to throw in additional information that might relate to the 2C limit as it becomes available (or as I happen upon it). With this first post there are a few things that I'd like to point out.

Data Source

I'm choosing to use NASA - GISTemp LOTI v3 data mostly because the data are published in a manner that makes it easy for me to manage. I've played around with the GISS data before and am more familiar with it than the other data sets. As new versions of GISTemp come out I will switch over to those, hopefully without too much pain. 

If anyone is interested in checking out the data, you can find it here:


This is a point that can trip up folks who are new to climate change: it can be confusing why the surface temperature data sets use "anomalies" relative to a baseline instead of actual temperatures. NOAA has a very good short explanation of why.

What I'm doing with these data is, essentially, re-baselining them. The original data are baselined to the 1951-1980 average, making the mean of that period the zero axis. But we want to know the temperature anomalies relative to a "preindustrial" baseline, so we can see how much warming has taken place since humans started pumping CO2 into the atmosphere.
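Re-baselining amounts to subtracting one constant from the whole series: the mean anomaly over the new reference window. A minimal sketch, using made-up numbers (the linear trend and the 1880-1909 window below are hypothetical stand-ins, not the actual GISTemp values):

```python
# Hypothetical annual anomalies (deg C) relative to the 1951-1980 mean --
# toy stand-in values for the GISTemp LOTI series, not real data.
years = list(range(1880, 2016))
anoms_5180 = [0.008 * (y - 1880) - 0.3 for y in years]

# Re-baseline: subtract the mean anomaly over a chosen "preindustrial"
# window (here 1880-1909) so that window averages to zero instead.
window = [a for y, a in zip(years, anoms_5180) if 1880 <= y <= 1909]
offset = sum(window) / len(window)
anoms_preind = [a - offset for a in anoms_5180]

# Each value now reads as warming since the preindustrial window;
# year-to-year differences are unchanged, only the zero line moves.
```

The choice of window is the judgment call; the arithmetic itself is just a constant shift.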



Here’s what happens when you try to replicate climate contrarian papers

Posted on 25 August 2015 by dana1981

Those who reject the 97% expert consensus on human-caused global warming often invoke Galileo as an example of when the scientific minority overturned the majority view. In reality, climate contrarians have almost nothing in common with Galileo, whose conclusions were based on empirical scientific evidence, supported by many scientific contemporaries, and persecuted by the religious-political establishment. Nevertheless, there’s a slim chance that the 2–3% minority is correct and the 97% climate consensus is wrong.

To evaluate that possibility, a new paper published in the journal Theoretical and Applied Climatology examines a selection of contrarian climate science research and attempts to replicate their results. The idea is that accurate scientific research should be replicable, and through replication we can also identify any methodological flaws in that research. The study also seeks to answer the question: why do these contrarian papers come to a different conclusion than 97% of the climate science literature?

This new study was authored by Rasmus Benestad, myself (Dana Nuccitelli), Stephan Lewandowsky, Katharine Hayhoe, Hans Olav Hygen, Rob van Dorland, and John Cook. Benestad (who did the lion’s share of the work for this paper) created a tool using the R programming language to replicate the results and methods used in a number of frequently-referenced research papers that reject the expert consensus on human-caused global warming. In using this tool, we discovered some common themes among the contrarian research papers.



Adapting to air pollution with clean air stands in China

Posted on 24 August 2015 by John Abraham

To adapt or to mitigate? That is the question that faces governments and industries across the globe as the impacts of climate change and pollution become ever clearer. It turns out, we will need to do both. The longer we allow pollution to be freely emitted, the fewer and more expensive will be the choices remaining to us.

Pollution adaptation can take many forms, but it generally means dealing with a pollutant after it has been emitted, or it can mean changing infrastructure to make it more resilient to heavy rains, floods, or more intense storms.

One great example of adaptation is being developed in Hong Kong and elsewhere in Asia by a major engineering company (Arup Engineering) and the CSR arm of a Hong Kong property developer (Sino Green). Arup and Sino Green are dealing with the environmental problem of localized airborne pollution. 

In many parts of the world, airborne pollution levels are very high and can remain elevated for long periods of time. These high levels of pollution can pose health problems to people and animals, particularly the very young, the elderly, and those with pre-existing health conditions. In some cases, airborne pollution levels can be high for 8,000 hours (more than 90%) of a single year.

There are many sources of pollution; for Arup Engineering, whose East Asia headquarters is in Hong Kong, much of the pollution is from nearby heavy industries across the border in mainland China and from vehicle emissions. At other locations, high levels of airborne pollution may be caused by burning of wood or dung for fire, slash-and-burn agricultural practices (particularly for countries near Indonesia), or from other causes. But, regardless of the cause, companies such as Arup are trying to find ways to reduce human exposure even when the airborne pollution levels are high.

Arup is embarking on an effort to provide filtered air zones for people who are street side, perhaps waiting for public transportation. Much like a bus stop, the proposed structure (called City Air Purification System) provides clean air flow to create a cocoon around bystanders, shown in the following photograph.


Dr. Jimmy Tong (right) and colleagues from Sino Group. From right to left, Mr. David Ng, Executive Assistant to Chairman, Mr. Daryl Ng, Executive Director, and Mr. Vincent Lo, General Manager, showcasing a patent-pending clean-air stand in Hong Kong.



2015 SkS Weekly Digest #34

Posted on 23 August 2015 by John Hartz

SkS Highlights... El Niño Watch... Toon of the Week... Quote of the Week... He Said What?... SkS in the News... SkS Spotlights... Poster of the Week... Coming Soon on SkS... SkS Week in Review... and 97 Hours of Consensus

SkS Highlights

Michael Sweet's post, New paper shows that renewables can supply 100% of all energy (not just electricity) garnered, by a wide margin, the most comments of the articles posted on SkS during the past week — demonstrating once again that our readers feel more comfortable expressing their opinions about energy matters than they do about the science of climate change. 

El Niño Watch

It is a long way from the western Pacific Ocean to the flooded streets of Buenos Aires where, this month, the city’s Good Samaritans have been distributing food and candles by kayak after some unseasonably heavy rain. But there is a link. Its name is El Niño.

Bringing up baby, The Economist, Aug 22, 2015 

Toon of the Week

 2015 SkS Toon 34

Hat tip to I Heart Climate Scientists 



2015 SkS Weekly News Roundup #34

Posted on 22 August 2015 by John Hartz

A chronological listing of the news articles posted on the Skeptical Science Facebook page during the past week. 




World Bank rejects energy industry notion that coal can cure poverty

Posted on 21 August 2015 by Guest Author

The World Bank said coal was no cure for global poverty on Wednesday, rejecting a main industry argument for building new fossil fuel projects in developing countries.

In a rebuff to coal, oil and gas companies, Rachel Kyte, the World Bank climate change envoy, said continued use of coal was exacting a heavy cost on some of the world’s poorest countries, in local health impacts as well as climate change, which is imposing even graver consequences on the developing world.

“In general globally we need to wean ourselves off coal,” Kyte told an event in Washington hosted by the New Republic and the Center for American Progress. “There is a huge social cost to coal and a huge social cost to fossil fuels … if you want to be able to breathe clean air.”

Coal, oil and gas companies have pushed back against efforts to fight climate change by arguing fossil fuels are a cure to “energy poverty”, which is holding back developing countries.

Peabody Energy, the world’s biggest privately held coal company, went so far as to claim that coal would have prevented the spread of the Ebola virus.

However, Kyte said that when it came to lifting countries out of poverty, coal was part of the problem – and not part of a broader solution.

“Do I think coal is the solution to poverty? There are more than 1 billion people today who have no access to energy,” Kyte said. Hooking them up to a coal-fired grid would not on its own wreck the planet, she went on.



New paper shows that renewables can supply 100% of all energy (not just electricity)

Posted on 20 August 2015 by michael sweet

A new paper, 100% clean and renewable wind, water, and sunlight (WWS) all-sector energy roadmaps for the 50 United States by Jacobson et al. (2015), describes the wind, solar and other renewable technologies needed to supply all the energy used in the USA. That is all the energy, not just the electricity. They find that using wind to generate 50% of energy, solar photovoltaic (PV) for 38%, concentrated solar power (CSP) for 13%, and a combination of hydro, geothermal, tide and wave power for the remainder (5%) allows all energy in the USA to be supplied at a lower cost than using fossil fuels. (The total exceeds 100% because extra capacity is required to stabilize the power grid, since the wind does not always blow and the sun does not always shine.)

figure 5 from Jacobson et al

Fig. 5 (from Jacobson et al 2015) Time-dependent change in U.S. end-use power demand for all purposes (electricity, transportation, heating/cooling, and industry) and its supply by conventional fuels and WWS generators based on the state roadmaps proposed here. Total power demand decreases upon conversion to WWS due to the efficiency of electricity over combustion and end-use energy efficiency measures. The percentages on the horizontal date axis are the percent conversion to WWS that has occurred by that year. The percentages next to each WWS source are the final estimated penetration of the source. The 100% demarcation in 2050 indicates that 100% of all-purpose power is provided by WWS technologies by 2050, and the power demand by that time has decreased. 



Corrected sunspot history suggests climate change not due to natural solar trends

Posted on 19 August 2015 by Guest Author

This is a re-post from Astronomy Now

The Sunspot Number, the longest scientific experiment still ongoing, is a crucial tool used to study the solar dynamo, space weather and climate change. It has now been recalibrated and shows a consistent history of solar activity over the past few centuries.

The new record has no significant long-term upward trend in solar activity since 1700, as was previously indicated. This suggests that rising global temperatures since the industrial revolution cannot be attributed to increased solar activity.

The analysis, its results and its implications for climate research were made public on 7 August at a press briefing at the International Astronomical Union (IAU) XXIX General Assembly, currently taking place in Honolulu, Hawai`i, USA.

The Maunder Minimum, between 1645 and 1715, when sunspots were scarce and the winters harsh, strongly suggests a link between solar activity and climate change. Until now there was a general consensus that solar activity has been trending upwards over the past 300 years (since the end of the Maunder Minimum), peaking in the late 20th century — called the Modern Grand Maximum by some.

In this 1677 painting by Abraham Hondius, “The Frozen Thames, looking Eastwards towards Old London Bridge,” people are shown enjoying themselves on the ice. In the 17th century there was a prolonged reduction in solar activity called the Maunder Minimum, which lasted roughly from 1645 to 1700. During this period, only about 50 sunspots were recorded, instead of the usual 40,000-50,000. Image credit: Museum of London.



2015 SkS Weekly Digest #33

Posted on 18 August 2015 by John Hartz

SkS Highlights, Toon of the Week, Quote of the Week, He Said What?, SkS Spotlights, Poster of the Week, Coming Soon on SkS, and 97 Hours of Consensus

SkS Highlights

There’s long been a troubling disconnect in climate science communication where we discuss the temperature charts we see from all the surface station data sets and the importance of keeping global mean surface temperature below 2C. This creates a challenge for anyone who wants to understand where we currently are relative to a 2C rise in temperature over preindustrial times.

For the average person, who might only now be becoming interested in climate change, how are they to comprehend this? How can we make the communication of this critical data point clearer and more concise? How do we make it more relevant to the issue of climate change?

Rob Honeycutt, The 1C Milestone 

El Niño Watch

El Niño Could Rank Among Strongest on Record by Andrea Thompson, Climate Central, Aug 13, 2015 

Toon of the Week

 2015 Toon 33

Hat tip to I Heart Climate Scientists 



Fox News' inner struggle with climate misinformation

Posted on 17 August 2015 by dana1981

Research has shown that Fox News is a major driving force behind climate denial, decreasing viewer trust in scientists and the existence of global warming. In 2013, only 28% of Fox News’ climate science segments were accurate, although that was an improvement over its 7% accuracy in 2012.

Fox News anchor Shepard Smith has been one of the few voices on the network willing to accept the scientific reality of human-caused climate change. On the August 10 edition of Fox News’ Shepard Smith Reporting, Smith reported on biased industry-funded science by Coca Cola, and made the connection to fossil fuel-funded climate denial studies.

Lisa Kennedy Montgomery: It’s actually very brilliant marketing on the part of Coca Cola, because they realize that if someone hears that there’s a scientific study behind a reported fact, then they take that, they internalize it and take it to be true … So, what Coca Cola has decided to do is use that “science” in their favor. And if only they could find a few scientists willing to report that it’s not the calories but the lack of exercise that’s making people obese, then they can use this as a sort of an underground marketing strategy.

Shepard Smith: Well this reminds me of two things. The article in the New York Times this weekend pointed out, it reminds you of exactly what the tobacco industry did back in the day, and more recently it also reminds you of what the climate deniers, the climate change deniers are doing as well.

 August 10, 2015 segment on Shepard Smith Reporting.

In fact, just two days later, the Fox Business News show Varney & Co. used that strategy in an interview with Roy Spencer. Spencer is among the fewer than 3% of climate scientists whose research rejects or minimizes the human contribution to global warming; he infamously made comments about “global warming Nazis.”



2015 SkS Weekly News Roundup #33

Posted on 15 August 2015 by John Hartz

A chronological listing of the news articles posted on the Skeptical Science Facebook page during the past week. 




The Rap Guide to Climate Chaos

Posted on 14 August 2015 by Guest Author

What's Beef?, from the Rap Guide to Climate Chaos, a new hip-hop theatre production now playing at the Edinburgh Fringe.



A Powerful El Niño in 2015 Threatens a Massive Coral Reef Die-off

Posted on 13 August 2015 by Rob Painting

Key Points:
  • A powerful El Niño event continues to strengthen in the Pacific Ocean. During El Niño the poleward transport of warm surface water out of the tropics slows down dramatically, generally resulting in anomalous short-term heating of the tropical ocean - home to the world's coral reefs.
  • Because of the long-term warming of the oceans by industrial emission of greenhouse gases, the temporary surge in tropical sea surface temperatures associated with El Niño now threatens large-scale coral bleaching episodes - times when the maximum summer water temperatures become so warm that coral die in large numbers.
  • The powerful El Niño now forming, combined with the ongoing ocean warming, suggests that we are likely to see a mass coral bleaching episode that approaches, or exceeds, the worldwide bleaching that came with the Super El Niño years of 1982/1983 and 1997/1998. The 1997/1998 Super El Niño saw 16% of the world's coral bleach, the largest die-off ever observed, and some of this coral has never recovered.


Figure 1 - Coral bleaching outlook for August-November 2015 based on climate model forecasts. The figure is from a new experimental product at NOAA's Coral Reef Watch.

Some Don't Like it Hot

The great irony of ocean warming and coral is that, until the late 20th century, it was actually beneficial to coral growth rates. Now, however, warming of the tropical ocean has progressed to a point where natural fluctuations of water temperature over the summer months can exceed an upper thermal tolerance threshold, often resulting in the death of coral reef communities. This is commonly known as mass coral bleaching, and it takes place when coral have been subjected to water temperatures 1-3 degrees Celsius above the normal summer maximum for several weeks or more.

Figure 2 - an example of coral bleaching. The loss of coloured pigments produced by the symbiotic algae makes the white coral skeleton visible beneath the coral polyp's translucent skin tissue.  Image from NOAA's Coral Reef watch.

Coral reefs consist of colonies of individual coral polyps which build their skeletons together so that they form massive structures capable of providing habitat for hundreds of thousands of marine species. Symbiosis is the term which describes the mutually beneficial relationship between the coral polyp and the photosynthetic algae that live within its skin tissue. Through photosynthesis the algae provide food, in the form of sugars, to the polyp and also boost its immune system. In return the polyps provide safe lodgings for the algae. When this relationship breaks down, as it can do for a number of reasons but especially when water temperature becomes too warm, the coral polyps expel the algae. Because the algae produce the pigments which give coral its colour, the loss of the algae results in the white coral skeleton becoming visible through the polyp's transparent skin tissue - the coral appears to have been bleached.

Given sufficient time, most reefs can re-colonize areas that have been killed through bleaching, although some areas in the Galapagos Islands have never done so. But the issue with a warming ocean is that, eventually, the frequency and severity of bleaching will become so intense that coral reefs never recover (Hooidonk et al. 2013).



Earth Overshoot Day

Posted on 13 August 2015 by Guest Author

In less than eight months, humanity exhausts Earth's budget for the year

Earth Overshoot Day 2015 lands on August 13. Please see the new website for Earth Overshoot Day at

Below is information from Earth Overshoot Day 2014:

August 19 is Earth Overshoot Day 2014, marking the date when humanity has exhausted nature’s budget for the year. For the rest of the year, we will maintain our ecological deficit by drawing down local resource stocks and accumulating carbon dioxide in the atmosphere. We will be operating in overshoot.

Just as a bank statement tracks income against expenditures, Global Footprint Network measures humanity’s demand for and supply of natural resources and ecological services. And the data is sobering. Global Footprint Network estimates that in approximately eight months, we demand more renewable resources and CO2 sequestration than the planet can provide for an entire year.

Read our 2014 press release in your language:

Click here to learn more about Earth Overshoot Day, and how it has changed over time.

Click here for an economics-focused press release on 2014 Earth Overshoot Day.

Click here for media coverage of Earth Overshoot Day 2014.

Earth Overshoot Day is the annual marker of when we begin living beyond our means in a given year. While only a rough estimate of time and resource trends, Earth Overshoot Day is as close as science can be to measuring the gap between our demand for ecological resources and services, and how much the planet can provide.


The Cost of Ecological Overspending

Throughout most of history, humanity has used nature’s resources to build cities and roads, to provide food and create products, and to absorb our carbon dioxide at a rate that was well within Earth’s budget. But in the mid-1970s, we crossed a critical threshold: Human consumption began outstripping what the planet could reproduce.

According to Global Footprint Network’s calculations, our demand for renewable ecological resources and the services they provide is now equivalent to that of more than 1.5 Earths. The data shows us on track to require the resources of two planets well before mid-century.

The fact that we are using, or “spending,” our natural capital faster than it can replenish is similar to having expenditures that continuously exceed income. In planetary terms, the costs of our ecological overspending are becoming more evident by the day. Climate change—a result of greenhouse gases being emitted faster than they can be absorbed by forests and oceans—is the most obvious and arguably pressing result. But there are others—shrinking forests, species loss, fisheries collapse, higher commodity prices and civil unrest, to name a few. The environmental and economic crises we are experiencing are symptoms of looming catastrophe. Humanity is simply using more than what the planet can provide.



2015 global temperatures are right in line with climate model predictions

Posted on 12 August 2015 by John Abraham

In an earlier post, I wrote about some research that compared ocean temperature measurements to climate model predictions. It turns out, the models have done a great job estimating the increase in ocean heat although they have slightly under-predicted the change.

What about other components of the Earth’s climate? For instance, how have the models done at predicting the changes in air temperatures? With recent data now available, we can make an assessment. I communicated with NASA GISS director Dr. Gavin Schmidt, who provided the following data. 

The graph shows the latest computer model simulations (from the CMIP project), which were used as input to the IPCC, along with five different temperature datasets. The comparison to be made is between the heavy dashed line (annotated in the graph just below the solid black line) and the colored lines. The heavy dashed line is the average predicted temperature including updated influences from a decrease in solar energy, human-emitted heat-reflecting particles, and volcanic effects.

The dashed line is slightly above the colored markers in recent years, but the results are quite close. Furthermore, this year’s temperature to date is running hotter than 2014. To date, 2015 is almost exactly at the predicted mean value from the models. Importantly, the measured temperatures are well within the spread of the model predictions.

models v data

Comparison of the most recent climate model simulations with actual global surface temperature measurements. Created by Gavin Schmidt.



Geoengineering is ‘no substitute’ for cutting emissions, new studies show

Posted on 11 August 2015 by Guest Author

This is a re-post from Robert McSweeney at Carbon Brief

Attempts to limit climate change by removing carbon dioxide directly from the atmosphere would not prevent the irreversible damage to the oceans, according to a new study.

While a second study finds that brightening clouds to reflect more of the Sun's radiation could help boost crop yields in parts of China and Africa.

Speaking to Carbon Brief, authors from both studies highlight the importance of reducing carbon emissions now, rather than trying to engineer the climate later.


Geoengineering is the deliberate large-scale intervention into the Earth's climate system to try and limit human-caused climate change, and it can be divided into two main methods.

Removing carbon dioxide from the atmosphere, often described as Carbon Dioxide Removal (CDR), is one approach. The other is reflecting some sunlight away from the Earth before it can be trapped by greenhouse gases, commonly known as Solar Radiation Management (SRM).

The two new studies explore the implications of each of these methods, and the results are decidedly mixed.

Ocean acidification

We start with the oceans. About a quarter of the carbon dioxide emitted by human activity is taken up by the world's oceans. There it reacts with water to form carbonic acid, reducing the pH level and making the oceans less alkaline.

This process is known as ocean acidification, and it can have serious implications for marine life, says Sabine Mathesius from the Helmholtz Centre for Ocean Research in Kiel in Germany. She explains to Carbon Brief:

"The resulting increase in the ocean's acidity disturbs important biological processes, like the build-up of calcium carbonate shells. If ocean acidification continues at the current rate, many species at the bottom of the food chain, as well as corals, could face extinction. In consequence, species that depend on them, including fish and ultimately humans, would be affected too."

Mathesius is the lead author of a study, published in Nature Climate Change, which investigates whether CDR could help stave off ocean acidification. Her results suggest that continuing to emit carbon dioxide in the hope of being able to remove it from the atmosphere later could consign our oceans to changes that are irreversible on human timescales.

Carbon dioxide removal

The study focuses on two scenarios, or representative concentration pathways, for how we deal with rising carbon emissions.

RCP8.5 is the highest of the emissions scenarios used by the Intergovernmental Panel on Climate Change (IPCC), and is described in this paper as "business-as-usual". RCP2.6 is the lowest IPCC scenario, where emissions are curbed to keep the global average temperature rise within 2C above pre-industrial levels.

The researchers simulated applying three different levels of CDR to the RCP8.5 high emissions scenario: none ("CDR0"), five billion tonnes per year ("CDR5") and 25bn tonnes per year ("CDR25").

The paper notes that CDR25 is "probably unfeasible" as it is way beyond the potential of technologies that currently exist. Even CDR5 would require "gigantic efforts", but is technically possible, says Mathesius. Five billion tonnes of carbon is equivalent to 18.3bn tonnes of carbon dioxide - around half of global annual emissions in the present day.
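
The conversion between tonnes of carbon and tonnes of carbon dioxide in the paragraph above follows from the ratio of molar masses (CO2 is roughly 44 g/mol, carbon roughly 12 g/mol). A minimal sketch of that arithmetic, with the function name our own invention:

```python
# Ratio of molar masses: CO2 (~44.01 g/mol) over C (~12.01 g/mol) ≈ 3.66.
MOLAR_MASS_RATIO = 44.01 / 12.01

def carbon_to_co2(tonnes_carbon):
    """Return the tonnes of CO2 containing the given tonnes of carbon."""
    return tonnes_carbon * MOLAR_MASS_RATIO

# CDR5: five billion tonnes of carbon removed per year.
print(round(carbon_to_co2(5e9) / 1e9, 1))  # prints 18.3
```

This reproduces the article's figure: 5bn tonnes of carbon corresponds to roughly 18.3bn tonnes of carbon dioxide.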

The chart below shows what happens to total atmospheric carbon dioxide concentrations if these scenarios were played out over the centuries ahead. Only under CDR25 (orange and dashed blue lines) do carbon dioxide levels return to anything close to the RCP2.6 low emissions scenario (green dashed line), and not for at least two centuries.

Figure: Projections out to year 2700 of global atmospheric carbon dioxide concentrations under RCP8.5 with no CDR (black line), CDR5 from 2250 (red), CDR25 from 2250 (orange), CDR5 from 2050 (purple dashed), and CDR25 from 2150 (blue dashed). RCP2.6 shown as a green dashed line. (Mathesius et al. 2015, Fig. 1b)



The 1C Milestone

Posted on 10 August 2015 by Rob Honeycutt

From Here to There and (Hopefully) Back Again

There’s long been a troubling disconnect in climate science communication: we present temperature charts from the various surface station data sets while stressing the importance of keeping global mean surface temperature below 2C. This creates a challenge for anyone who wants to understand where we currently stand relative to a 2C rise in temperature over preindustrial times.

For the average person, who might just now be becoming interested in the climate change issue, how are they to comprehend this? How can we make the communication of this critical data point clearer and more concise? How do we make it more relevant to the issue of climate change?

Baselining and Anomalies

It’s important to understand that all the data sets are estimates of global mean temperature. Multiple international groups independently examine the question of global temperature. Each has its own methods, and each has its strengths and limitations. We also have hybrid estimates that attempt to combine different methods to improve coverage and give a more accurate answer.

The surface temperature data sets (GISS, NOAA, HadCRUT4, Berkeley Earth and Cowtan & Way) are presented on different baselines and thus give us different relative temperature anomaly figures which are not specifically related to the 2C limit. For instance, HadCRUT4 baselines to 1961-1990, whereas GISS data baselines to 1951-1980. These baseline periods merely establish a zero axis for the data. Changing the baseline does not change the data, it only changes where the zero axis falls. (Tamino has a great explanation of this here.)

There is no perfect answer for the true global mean temperature of Earth, but that's rather inconsequential, since we are primarily interested in understanding the change in global temperature.

To make things even more confusing for the climate newbie, many tools out there enable us to adjust the baseline we're looking at. All the data sets publish their data relative to a set baseline, but for researchers it can be important to test differing baselines to reveal aspects of warming. For a newbie it just makes things all the more confusing, and can easily play into the hands of people for whom confusion is a desired outcome.



© Copyright 2015 John Cook
Home | Links | Translations | About Us | Contact Us