Uncertainty is Exxon's friend, but it's not ours

Stephan Lewandowsky is a cognitive scientist at the University of Bristol who studies uncertainty in climate-human interactions.

Richard Pancost is a biogeochemist who studies ancient climates and is Director of the Cabot Institute at the University of Bristol.

Timothy Ballard is a cognitive scientist at the University of Queensland.

Beginning in the late 1970s, Exxon scientists told their top executives about the risk from climate change. By the 1980s, Exxon scientists shared the consensus view that the global climate is sensitive to greenhouse gases such as carbon dioxide.

However, when the issue of cuts to carbon emissions became prominent in the early 1990s, Exxon assumed a different position in public and embarked on a campaign against the increasingly clear fact that the Earth is warming from greenhouse gas emissions. This campaign was centered on scientific uncertainty: time and again, Exxon executives expressed their doubts about the science, turning their backs on their own scientists’ research.

This was no accident: Appeals to uncertainty to preclude or delay political action are so pervasive in political and lobbying circles that they have attracted scholarly attention under the name “Scientific Certainty Argumentation Methods”, or “SCAMs” for short. SCAMs are politically effective because they equate uncertainty with the possibility that a problem may be less serious than anticipated, while ignoring the often greater likelihood that the problem may be more deleterious.

In the case of climate change, analyses of the role of scientific uncertainty in the climate system have repeatedly revealed that greater uncertainty about the climate’s sensitivity to carbon emissions means that there is greater, not lesser, risk.

Potential climate surprises are more likely to be calamitous than benign, because the probability of adverse climate events, such as flooding resulting from sea level rise, increases with increasing uncertainty, all other factors being equal. A sea wall that might cost $1,000,000 if we knew the extent of sea level rise with great precision may cost $2,000,000 if the same expected sea level rise is known with lesser precision, because the wall must be built high enough to guard against the wider range of plausible outcomes.
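The sea-wall arithmetic can be sketched with a toy calculation. Assume, purely for illustration, that projected sea level rise is normally distributed, that the wall must be built to the 95th percentile of that distribution to give acceptable protection, and that construction cost scales with wall height. The distribution parameters and cost figure below are hypothetical, not from the article:

```python
import math

# 95th-percentile z-score of the standard normal distribution
Z95 = 1.6449

def wall_cost(mean_rise_m: float, sd_m: float, cost_per_metre: float) -> float:
    """Cost of a wall built to the 95th percentile of projected rise.

    The wall must cover mean + Z95 * sd metres of rise; cost is assumed
    to scale linearly with the required height (an illustrative choice).
    """
    required_height = mean_rise_m + Z95 * sd_m
    return required_height * cost_per_metre

# Same expected rise (0.5 m), different precision of the projection.
precise   = wall_cost(0.5, sd_m=0.1, cost_per_metre=1_500_000)
imprecise = wall_cost(0.5, sd_m=0.3, cost_per_metre=1_500_000)

print(f"precise projection:   ${precise:,.0f}")
print(f"imprecise projection: ${imprecise:,.0f}")
```

Because the safety margin grows with the standard deviation, the less precise projection forces a taller, and therefore more expensive, wall even though the expected sea level rise is identical.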

Given that the extent of uncertainty translates into the magnitude of the risk and hence into the price tag of adaptation measures, uncertainty—ironically—is a source of actionable knowledge rather than an indicator of ignorance.

Uncertainty can be a source of knowledge in a number of ways.

First, in many cases uncertainty is a mathematical expression of the knowledge we do have. Concerning climate change, our knowledge is extensive and firm: we know that doubling the concentration of carbon dioxide in the atmosphere relative to pre-industrial levels will most likely cause warming of between 1.5°C and 4.5°C. That range is itself an expression of our established knowledge of the climate system.

A troubling aspect of this range is that it has a “fat upper tail”, implying that deleterious surprises are more likely than benign surprises. Related to this is that if the range of estimates widens, even through a lowering of the lower bound, the seemingly good news may in fact be bad news. When the IPCC lowered the bottom of its estimated range of warming from 2°C to 1.5°C, this intuitively appeared to signal that things may not be so bad after all. In fact, mathematics developed by Mark Freeman and colleagues shows otherwise: the lowered lower boundary (from 2°C to 1.5°C), combined with an unchanged upper boundary (4.5°C), has actually increased the expected risk.
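The Freeman result can be illustrated with a small calculation; this is a sketch under strong simplifying assumptions, not their actual model. Suppose climate sensitivity follows a lognormal distribution whose 17th and 83rd percentiles match the stated range, and that damages grow with the square of warming (an arbitrary convex damage function). Widening the range downward then raises expected damages even though the median estimate falls:

```python
import math

Z83 = 0.9542  # 83rd-percentile z-score of the standard normal

def lognormal_params(low: float, high: float) -> tuple:
    """Fit a lognormal so its 17th/83rd percentiles equal (low, high)."""
    mu = (math.log(low) + math.log(high)) / 2
    sigma = (math.log(high) - math.log(low)) / (2 * Z83)
    return mu, sigma

def expected_damage(low: float, high: float) -> float:
    """E[T^2] under the fitted lognormal: exp(2*mu + 2*sigma^2)."""
    mu, sigma = lognormal_params(low, high)
    return math.exp(2 * mu + 2 * sigma ** 2)

old = expected_damage(2.0, 4.5)   # earlier likely range
new = expected_damage(1.5, 4.5)   # lower bound reduced to 1.5 °C

print(f"range 2.0-4.5 °C: expected damage {old:.2f}")
print(f"range 1.5-4.5 °C: expected damage {new:.2f}")
# Lowering only the lower bound widens sigma, which fattens the upper
# tail enough that expected (convex) damage goes up, not down.
```

The median warming under the wider range is lower, but the larger spread inflates the upper tail, and for any convex damage function that tail dominates the expectation.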

Even deep systemic uncertainties, often described as “unknown unknowns”, can—ironically—be a source of knowledge. Wendy Parker and James Risbey have recently shown that although unknown unknowns, by definition, cannot be anticipated individually, the likelihood that some surprise or other may arise can be anticipated. We cannot specify what the unknown unknowns are, but it does not follow that we cannot recognize the risk that such features exist in climatic-ecological-social systems.

The potential for future surprises is particularly large in systems in which surprises have arisen in the past: The climate system has revealed its potential instability through Dansgaard–Oeschger events during the last glacial period and the rapid carbon release and warming of the Paleocene–Eocene Thermal Maximum. Moreover, the risk of surprises is known to increase when complex systems are driven outside the conditions under which they have operated in the past. Within the past year, CO2 concentrations exceeded 400 ppm, probably for the first time in 3 million years.

We therefore know that we are driving a complex system that is prone to sudden surprises outside the conditions in which it has been operating for millennia. We know, in other words, that we are increasing the chance of encountering unknown unknowns; this knowledge, arising out of deep uncertainty, provides an impetus for climate mitigation.

Another way in which uncertainty about the future can be a source of knowledge is indirectly, through its adverse psychological consequences. Those consequences have been largely overlooked to date. For example, perceived uncertainty increases people’s response to aversive stimuli if they occur: we respond more emotionally to adverse outcomes when they are uncertain. Uncertainty also comes with a price tag concerning our ability to think: under conditions of uncertainty, people have been shown to be biased against creativity and to prefer mundane functionality instead. The uncertainty that inevitably arises in times of crisis may therefore produce a bias against the very creativity that would be needed to solve the crisis.


Posted by Guest Author on Tuesday, 1 December, 2015


The Skeptical Science website by Skeptical Science is licensed under a Creative Commons Attribution 3.0 Unported License.