We are heading for the warmest climate in half a billion years
Posted on 27 June 2017 by Guest Author
Gavin Foster, Professor of Isotope Geochemistry, University of Southampton; Dana Royer, Professor of Earth and Environmental Sciences, Wesleyan University, and Dan Lunt, Professor of Climate Science, University of Bristol
This article was originally published on The Conversation. Read the original article.
Carbon dioxide concentrations are heading towards values not seen in the past 200m years. The sun has also been gradually getting stronger over time. Put together, these facts mean the climate may be heading towards warmth not seen in the past half a billion years.
A lot has happened on Earth since 500,000,000BC – continents, oceans and mountain ranges have come and gone, and complex life has evolved and moved from the oceans onto the land and into the air. Most of these changes occur on very long timescales of millions of years or more. However, over the past 150 years global temperatures have increased by about 1°C, ice caps and glaciers have retreated, polar sea-ice has melted, and sea levels have risen.
Some will point out that Earth’s climate has undergone similar changes before. So what’s the big deal?
Scientists can seek to understand past climates by looking at the evidence locked away in rocks, sediments and fossils. What this tells us is that yes, the climate has changed in the past, but the current speed of change is highly unusual. For instance, carbon dioxide hasn’t been added to the atmosphere as rapidly as today for at least the past 66m years.
In fact, if we continue on our current path and exploit all conventional fossil fuels, then as well as the rate of CO₂ emissions, the absolute climate warming is also likely to be unprecedented in at least the past 420m years. That’s according to a new study we have published in Nature Communications.
In terms of geological time, 1°C of global warming isn’t particularly unusual. For much of its history the planet was significantly warmer than today, and in fact more often than not Earth was in what is termed a “greenhouse” climate state. During the last greenhouse state 50m years ago, global average temperatures were 10-15°C warmer than today, the polar regions were ice-free, palm trees grew on the coast of Antarctica, and alligators and turtles wallowed in swamp-forests in what is now the frozen Canadian Arctic.
In contrast, despite our current warming, we are still technically in an “icehouse” climate state, which simply means there is ice on both poles. The Earth has naturally cycled between these two climate states every 300m years or so.
Just prior to the industrial revolution, for every million molecules in the atmosphere, about 280 of them were CO₂ molecules (280 parts-per-million, or ppm). Today, due primarily to the burning of fossil fuels, concentrations are about 400ppm. In the absence of any efforts to curtail our emissions, burning of conventional fossil fuels will cause CO₂ concentrations to be around 2,000ppm by the year 2250.
This is of course a lot of CO₂, but the geological record tells us that the Earth has experienced similar concentrations several times in the past. For instance, our new compilation of data shows that during the Triassic, around 200m years ago, when dinosaurs first evolved, Earth had a greenhouse climate state with atmospheric CO₂ around 2,000-3,000ppm.
So high concentrations of carbon dioxide don’t necessarily make the world totally uninhabitable. The dinosaurs thrived, after all.
That doesn’t mean this is no big deal, however. For a start, there is no doubt that humanity will face major socio-economic challenges dealing with the dramatic and rapid climate change that will result from the rapid rise to 2,000 or more ppm.
But our new study also shows that the same carbon dioxide concentrations will cause more warming in future than in previous periods of high carbon dioxide. This is because the Earth’s temperature does not just depend on the level of CO₂ (or other greenhouse gases) in the atmosphere. All our energy ultimately comes from the sun, and due to the way the sun generates energy through nuclear fusion of hydrogen into helium, its brightness has increased over time. Four and a half billion years ago, when the Earth was young, the sun was around 30% less bright.
So what really matters is the combined effect of the sun’s changing strength and the varying greenhouse effect. Looking through geological history, we generally found that as the sun became stronger over time, atmospheric CO₂ gradually decreased, so on average the two changes cancelled each other out.
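This cancellation can be illustrated with a back-of-the-envelope sketch (not the study's own calculation). It combines Gough's (1981) approximation for past solar luminosity with the standard simplified CO₂ forcing expression of Myhre et al. (1998), ΔF = 5.35 ln(C/C₀) W/m²; the TSI and albedo values are assumptions chosen for illustration.

```python
import math

SOLAR_AGE = 4.57  # assumed age of the Sun, Gyr
TSI = 1361.0      # modern total solar irradiance, W/m^2 (assumed)
ALBEDO = 0.3      # planetary albedo, assumed constant over time

def solar_luminosity_fraction(t_gyr_ago):
    """Gough (1981) approximation: past solar luminosity relative
    to today, for a time t_gyr_ago billion years before present."""
    age_then = SOLAR_AGE - t_gyr_ago
    return 1.0 / (1.0 + 0.4 * (1.0 - age_then / SOLAR_AGE))

def absorbed_insolation(t_gyr_ago):
    """Global-mean absorbed solar flux (W/m^2): TSI spread over the
    sphere (divide by 4) and reduced by the albedo."""
    return solar_luminosity_fraction(t_gyr_ago) * TSI / 4.0 * (1.0 - ALBEDO)

def co2_forcing(ppm, ppm_ref=280.0):
    """Simplified CO2 radiative forcing (Myhre et al. 1998), W/m^2,
    relative to a pre-industrial baseline of 280 ppm."""
    return 5.35 * math.log(ppm / ppm_ref)

# 420m years ago the dimmer sun delivered several W/m^2 less energy;
# high CO2 back then compensated. Today's sun is at full strength,
# so 2,000 ppm of CO2 would be forcing on top of that deficit.
solar_deficit = absorbed_insolation(0.0) - absorbed_insolation(0.42)
print(f"solar deficit 420 Myr ago: {solar_deficit:.1f} W/m^2")
print(f"forcing of 2,000 ppm CO2:  {co2_forcing(2000.0):.1f} W/m^2")
```

With these assumed values, the extra forcing from 2,000ppm CO₂ (~10.5 W/m²) exceeds the solar deficit of the early Phanerozoic (~8.5 W/m²), which is the sense in which the same CO₂ level now buys more total forcing than it did then.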
But what about the future? We found no past time period in which the drivers of climate, or climate forcing, were as high as they will be in the future if we burn all the readily available fossil fuel. Nothing like it has been recorded in the rock record for at least 420m years.
A central pillar of geological science is the uniformitarian principle: that “the present is the key to the past”. If we carry on burning fossil fuels as we are at present, by 2250 this old adage is sadly no longer likely to be true. It is doubtful that this high-CO? future will have a counterpart, even in the vastness of the geological record.
A swamp-like hot, wet climate will be an incubator for all sorts of tropical diseases, just at a point when we are having problems with antibiotics. And don't kid yourself that technology will solve this, as new antibiotics will develop the same problems.
Some people argue a warmer climate is a good thing, but it has plenty of downsides. Heatwaves combined with higher humidity don't sound like a good combination, and they are costly to adapt to.
My country will get more rainfall, and has already seen an increase in rainfall. The trouble is this is mainly falling in the one geographical location where it's of no use to either electricity generation or agriculture.
This giant, unintended experiment with the climate is high risk, and solutions like geoengineering would be high-risk, ambulance-at-the-bottom-of-the-cliff approaches. But that's typical of humanity in so many ways, as our record on prevention isn't so great.
Humans can adapt in part, for example by building climate-controlled cities like the one proposed in Dubai, but I'm sure that geo-engineering will be implemented if summer temperatures start to exceed 40°C on a regular basis in large mid-latitude cities. The risks of geo-engineering may be high but still much lower than doing nothing.
Art Vandelay,
While humans could build structures and survive, what would help all the agricultural animals left in the open? Not to mention that crops fail at high temperatures. While people can live in buildings, they still have to grow food.
Many scientists think the dangers of geoengineering exceed the benefits. Sulfate aerosols, the most common "solution" proposed, cause severe drought. Does that really solve the problem or just trade one problem for another? It is much more cost-effective and safer to pollute less and reduce AGW than to pollute more and try to use untested technology to get out of a problem.
You complain elsewhere that people concerned about AGW are "advocates that overstate the effectiveness and avoid mentioning potential down-sides." Perhaps you need to look in the mirror and see how much that applies to you.
Please provide a citation to support your claim that "The risks of geo-engineering may be high but still much lower than doing nothing." Most of the analysis I have seen is the opposite.
A question from a non-geophysicist: To what extent has the additional heat from the strengthening sun, over the past 4.5 B years, been counterbalanced by the cooling of the earth's core?
amhartley @4, 0.6 billion years ago, at the start of the phanerozoic, the Sun was approximately 5% less luminous than it currently is. That is, where it currently provides 240 W/m^2 of energy to the Earth, it then provided 228 W/m^2 assuming the same albedo. 4.5 billion years ago, it was about 75% of current values, or 180 W/m^2. The difference is 12 W/m^2. The geothermal supply of energy to the Earth's surface currently amounts to approximately 44 TW for the whole of the Earth's surface (or approx 0.1 W/m^2). The decay of radioactive elements contributes about half of that (20 TW), and contributed about 100 TW in the very distant past.
If we assume that was the primary factor governing the Earth's heat flux, then we would assume that in the distant past, the geothermal flux was about 0.5 W/m^2, or a decrease of about 0.7% of the increase in insolation.
The story is not so simple, however, as the total surface flux depends on the rate at which heat travels to the surface. There is substantial evidence that this has changed over time, such that heat flux increased over time until about 2.5 billion years ago, and has been decreasing thereafter. The upshot is that over the full 4.5 billion years of the Earth's history, surface heat flux has been relatively constant; and that the decrease over the last 2.5 billion years has been several orders of magnitude less than the increase in insolation.
Tom@5,
How did you arrive at the 12 W/m^2 difference in insolation between 4.5Gy ago and now?
The actual difference between 240 W/m^2 and 180 W/m^2 is 60 W/m^2, and that number is for average sphere insolation (1/4 of TCI) attenuated by ~30% for Earth's albedo (current TCI is 1370 W/m^2). So, if I did not miss anything major, your number appears to be an underestimate by a factor of five, which would only strengthen your point that the geothermal heat flux has had virtually no impact on Earth's energy budget over its history.
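The arithmetic in this correction can be checked in a few lines, using the TSI and albedo figures quoted in the comment above (1370 W/m^2 and 30%; both are the commenters' values, not independently verified here):

```python
TSI_NOW = 1370.0   # total solar irradiance quoted in the comment, W/m^2
ALBEDO = 0.3       # ~30% of incoming sunlight reflected

def absorbed(tsi):
    """Global-mean absorbed flux: spread TSI over the sphere (divide
    by 4), then remove the reflected fraction."""
    return tsi / 4.0 * (1.0 - ALBEDO)

now = absorbed(TSI_NOW)           # present-day sun: ~240 W/m^2
early = absorbed(0.75 * TSI_NOW)  # sun ~75% as bright 4.5 Gyr ago: ~180 W/m^2

print(f"now:   {now:.0f} W/m^2")
print(f"early: {early:.0f} W/m^2")
print(f"diff:  {now - early:.0f} W/m^2")  # ~60 W/m^2, not 12
```

The ~60 W/m^2 difference dwarfs even the ~0.5 W/m^2 early geothermal flux discussed above, which is the point both commenters end up agreeing on.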
Sorry for my typo above - I meant TSI (total solar irradiance), "TCI" is meaningless here.
chriskoz @6, sorry, poor editing on my part. I initially calculated the difference for 600 million years ago, which is 12 W/m^2. I then later thought it appropriate to calculate back to 4.5 billion years ago, but put it before the sentence about 12 W/m^2 rather than after it where it belonged. Sorry for the confusion.