
2010 - 2011: Earth's most extreme weather since 1816?

Posted on 27 June 2011 by Jeff Masters

Every year extraordinary weather events rock the Earth. Records that have stood for centuries are broken. Great floods, droughts, and storms affect millions of people, and truly exceptional weather events unprecedented in human history may occur. But the wild roller-coaster ride of incredible weather events during 2010, in my mind, makes that year the planet's most extraordinary year for extreme weather since reliable global upper-air data began in the late 1940s. Never in my 30 years as a meteorologist have I witnessed a year like 2010--the astonishing number of weather disasters and unprecedented wild swings in Earth's atmospheric circulation were like nothing I've seen. The pace of incredible extreme weather events in the U.S. over the past few months has kept me so busy that I've been unable to write up a retrospective look at the weather events of 2010. But I've finally managed to finish, so fasten your seat belts for a tour through the top twenty most remarkable weather events of 2010. At the end, I'll reflect on what the wild weather events of 2010 and 2011 imply for our future.

Earth's hottest year on record
Unprecedented heat scorched the Earth's surface in 2010, tying 2005 for the warmest year since accurate records began in the late 1800s. Temperatures in Earth's lower atmosphere also tied for warmest year on record, according to independent satellite measurements. Earth's 2010 record warmth was unusual because it occurred during the deepest solar energy minimum since satellite measurements of the sun began in the 1970s. Unofficially, nineteen nations (plus the U.K.'s Ascension Island) set all-time extreme heat records in 2010. This includes Asia's hottest reliably measured temperature of all time, the remarkable 128.3°F (53.5°C) in Pakistan in May 2010. This is also the hottest reliably recorded temperature anywhere on the planet except in Death Valley, California. The countries that experienced all-time extreme highs in 2010 constituted over 20% of Earth's land surface area.


Figure 1. Climate Central and Weather Underground put together this graphic showing the nineteen nations (plus one UK territory, Ascension Island) that set new extreme heat records in 2010.

Most extreme winter Arctic atmospheric circulation on record; "Snowmageddon" results
The atmospheric circulation in the Arctic took on its most extreme configuration in 145 years of record keeping during the winter of 2009 - 2010. The Arctic is normally dominated by low pressure in winter, and a "Polar Vortex" of counter-clockwise circulating winds develops surrounding the North Pole. However, during the winter of 2009 - 2010, high pressure replaced low pressure over the Arctic, and the Polar Vortex weakened and even reversed at times, with a clockwise flow of air replacing the usual counter-clockwise flow. This unusual flow pattern allowed cold air to spill southwards and be replaced by warm air moving poleward. Like leaving the refrigerator door ajar, the Arctic "refrigerator" warmed, and cold Arctic air spilled out into the "living room" where people live. A natural climate pattern called the North Atlantic Oscillation (NAO), and its close cousin, the Arctic Oscillation (AO), were responsible. Both of these patterns experienced their strongest-on-record negative phase, when measured as the pressure difference between the Icelandic Low and Azores High.

The extreme Arctic circulation caused a bizarre upside-down winter over North America--Canada had its warmest and driest winter on record, forcing snow to be trucked in for the Winter Olympics in Vancouver, but the U.S. had its coldest winter in 25 years. A series of remarkable snow storms pounded the Eastern U.S., with the "Snowmageddon" blizzard dumping more than two feet of snow on Baltimore and Philadelphia. Western Europe also experienced unusually cold and snowy conditions, with the UK recording its 8th coldest January. A highly extreme negative phase of the NAO and AO returned during November 2010, and lasted into January 2011. Exceptionally cold and snowy conditions hit much of Western Europe and the Eastern U.S. again in the winter of 2010 - 2011. During these two extreme winters, New York City recorded three of its top-ten snowstorms since 1869, and Philadelphia recorded four of its top-ten snowstorms since 1884. During December 2010, the extreme Arctic circulation over Greenland created the strongest ridge of high pressure ever recorded at middle levels of the atmosphere, anywhere on the globe (since accurate records began in 1948). New research suggests that major losses of Arctic sea ice could cause the Arctic circulation to behave so strangely, but this work is still speculative.


Figure 2. Digging out in Maryland after "Snowmageddon". Image credit: wunderphotographer chills.

Arctic sea ice: lowest volume on record, 3rd lowest extent
Sea ice in the Arctic reached its third lowest areal extent on record in September 2010. Compared to sea ice levels 30 years ago, 1/3 of the polar ice cap was missing--an area the size of the Mediterranean Sea. The Arctic has seen a steady loss of meters-thick, multi-year-old ice in recent years that has left thin, 1 - 2 year-old ice as the predominant ice type. As a result, sea ice volume in 2010 was the lowest on record. More than half of the polar icecap by volume--60%--was missing in September 2010, compared to the average from 1979 - 2010. All this melting allowed the Northwest Passage through the normally ice-choked waters of Canada to open up in 2010. The Northeast Passage along the coast of northern Russia also opened up, and this was the third consecutive year--and third time in recorded history--that both passages melted open. Two sailing expeditions--one Russian and one Norwegian--successfully navigated both the Northeast Passage and the Northwest Passage in 2010, the first time this feat had been accomplished. Mariners have been attempting to sail the Northwest Passage since 1497, and until the 2000s no one had accomplished the feat without an icebreaker. In December 2010, Arctic sea ice fell to its lowest winter extent on record, the beginning of a 3-month streak of record lows. Canada's Hudson Bay did not freeze over until mid-January of 2011, the latest freeze-over date in recorded history.


Figure 3. The Arctic's minimum sea ice extent for 2010 was reached on September 21, and was the third lowest on record. Image credit: National Snow and Ice Data Center.

Record melting in Greenland, and a massive calving event
Greenland's climate in 2010 was marked by record-setting high air temperatures, the greatest ice loss by melting since accurate records began in 1958, the greatest mass loss of ocean-terminating glaciers on record, and the calving of a 100 square-mile ice island--the largest calving event in the Arctic since 1962. Many of these events were due to record warm water temperatures along the west coast of Greenland, which averaged 2.9°C (5.2°F) above average during October 2010, a remarkable 1.4°C above the previous record high water temperatures in 2003.


Figure 4. The 100 square-mile ice island that broke off the Petermann Glacier heads out of the Petermann Fjord in this 7-frame satellite animation. The animation begins on August 5, 2010, and ends on September 21, with images spaced about 8 days apart. The images were taken by NASA's Aqua and Terra satellites.

Second most extreme shift from El Niño to La Niña
The year 2010 opened with a strong El Niño event and exceptionally warm ocean waters in the Eastern Pacific. However, El Niño rapidly waned in the spring, and a moderate to strong La Niña developed by the end of the year, strongly cooling these ocean waters. Since accurate records began in 1950, only 1973 has seen a more extreme swing from El Niño to La Niña. The strong El Niño and La Niña events contributed to many of the record flood events seen globally in 2010, and during the first half of 2011.


Figure 5. The departure of sea surface temperatures from average at the beginning of 2010 (top) and the end of 2010 (bottom) shows the remarkable transition from strong El Niño to strong La Niña conditions that occurred during the year. Image credit: NOAA/NESDIS.

Second worst coral bleaching year
Coral reefs took their 2nd-worst beating on record in 2010, thanks to record or near-record warm summer water temperatures over much of Earth's tropical oceans. The warm waters caused the most coral bleaching since 1998, when 16 percent of the world's reefs were killed off. "Clearly, we are on track for this to be the second worst (bleaching) on record," said NOAA coral expert Mark Eakin in a 2010 interview. "All we're waiting on now is the body count." The summer 2010 coral bleaching episodes were worst in the Philippines and Southeast Asia, where El Niño warming of the tropical ocean waters during the first half of the year was significant. In Indonesia's Aceh province, 80% of the bleached corals died, and Malaysia closed several popular dive sites after nearly all the coral were damaged by bleaching. In some portions of the Caribbean, such as Venezuela and Panama, coral bleaching was the worst on record.


Figure 6. An example of coral bleaching that occurred during the record-strength 1997-1998 El Niño event. Image credit: Craig Quirolo, Reef Relief/Marine Photobank, in Climate, Carbon and Coral Reefs

Wettest year over land
The year 2010 also set a new record for wettest year in Earth's recorded history over land areas. The difference in precipitation from average in 2010 was about 13% higher than that of the previous record wettest year, 1956. However, this record is not that significant, since it was due in large part to random variability of the jet stream weather patterns during 2010. The record wetness over land was counterbalanced by relatively dry conditions over the oceans.


Figure 7. Global departure of precipitation over land areas from average for 1900 - 2010. The year 2010 set a new record for wettest year over land areas in Earth's recorded history. The difference in precipitation from average in 2010 was about 13% higher than that of the previous record wettest year, 1956. Image credit: NOAA's National Climatic Data Center.

Amazon rainforest experiences its 2nd 100-year drought in 5 years
South America's Amazon rainforest experienced its second 100-year drought in five years during 2010, with the largest northern tributary of the Amazon River--the Rio Negro--dropping to thirteen feet (four meters) below its usual dry season level. This was its lowest level since record keeping began in 1902. The low water mark is all the more remarkable since the Rio Negro caused devastating flooding in 2009, when it hit an all-time record high, 53 ft (16 m) higher than the 2010 record low. The 2010 drought was similar in intensity and scope to the region's previous 100-year drought in 2005. Drought makes a regular appearance in the Amazon, with significant droughts occurring an average of once every twelve years. In the 20th century, these droughts typically occurred during El Niño years, when the unusually warm waters present along the Pacific coast of South America altered rainfall patterns. But the 2005 and 2010 droughts did not occur during El Niño conditions, and it is theorized that they were instead caused by record warm sea surface temperatures in the Atlantic.

We often hear about how important Arctic sea ice is for keeping Earth's climate cool, but a healthy Amazon is just as vital. Photosynthesis in the world's largest rainforest takes about 2 billion tons of carbon dioxide out of the air each year. However, in 2005, the drought reversed this process. The Amazon emitted 3 billion tons of CO2 to the atmosphere, causing a net 5 billion ton increase in atmospheric CO2--roughly equivalent to 16 - 22% of the total CO2 emissions from burning fossil fuels that year. The Amazon stores CO2 in its soils and biomass equivalent to about fifteen years of human-caused emissions, so a massive die-back of the forest could greatly accelerate global warming.
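
The arithmetic behind these figures is worth making explicit. Here is a back-of-the-envelope sketch (the implied fossil-fuel totals are inferred from the 16 - 22% range quoted above, not taken from an independent inventory):

```python
# Back-of-the-envelope check of the Amazon carbon swing described above.
# Figures are billions of tons (Gt) of CO2 per year, from the post; the
# implied fossil-fuel totals are inferences, not independent data.
normal_uptake = 2.0      # CO2 the Amazon normally removes from the air
drought_emission = 3.0   # CO2 the Amazon emitted during the 2005 drought

net_swing = normal_uptake + drought_emission  # lost sink plus new source
print(f"Net swing: {net_swing:.0f} Gt CO2")   # 5 Gt, as stated above

# If 5 Gt is 16 - 22% of that year's fossil-fuel CO2 emissions, the
# implied fossil-fuel total is roughly:
low, high = net_swing / 0.22, net_swing / 0.16
print(f"Implied fossil-fuel emissions: {low:.0f} - {high:.0f} Gt CO2")
```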


Figure 8. Hundreds of fires (red squares) generate thick smoke over a 1000 mile-wide region of the southern Amazon rain forest in this image taken by NASA's Aqua satellite on August 16, 2010. The Bolivian government declared a state of emergency in mid-August due to the out-of-control fires burning over much of the country. Image credit: NASA.

Global tropical cyclone activity lowest on record
The year 2010 was one of the strangest on record for tropical cyclones. Each year, the globe has about 92 tropical cyclones--called hurricanes in the Atlantic and Eastern Pacific, typhoons in the Western Pacific, and tropical cyclones in the Southern Hemisphere. But in 2010, we had just 68 of these storms--the fewest since the dawn of the satellite era in 1970. The previous record slowest year was 1977, when 69 tropical cyclones occurred world-wide. Both the Western Pacific and Eastern Pacific had their quietest seasons on record in 2010, but the Atlantic was hyperactive, recording its 3rd busiest season since record keeping began in 1851. The Southern Hemisphere had a slightly below average season. The Atlantic ordinarily accounts for just 13% of global cyclone activity, but accounted for 28% in 2010--the greatest proportion since accurate tropical cyclone records began in the 1970s.

A common theme of many recent publications on the future of tropical cyclones in a warming climate is that the total number of these storms will decrease, but the strongest storms will get stronger. For example, a 2010 review paper published in Nature Geoscience concluded that the strongest storms would increase in intensity by 2 - 11% by 2100, but the total number of storms would fall by 6 - 34%. It is interesting that 2010 saw the lowest number of global tropical cyclones on record, but an average number of very strong Category 4 and 5 storms (the 25-year average is 13 Category 4 and 5 storms, and 2010 had 14). Fully 21% of 2010's tropical cyclones reached Category 4 or 5 strength, versus just 14% during the period 1983 - 2007. Most notably, in 2010 we had Super Typhoon Megi. Megi's sustained winds cranked up to a ferocious 190 mph and its central pressure bottomed out at 885 mb on October 16, making it the 8th most intense tropical cyclone in world history. Other notable storms in 2010 included the second strongest tropical cyclone on record in the Arabian Sea (Category 4 Cyclone Phet in June), and the strongest tropical cyclone ever to hit Myanmar/Burma (October's Tropical Cyclone Giri, an upper-end Category 4 storm with 155 mph winds).


Figure 9. Visible satellite image of Tropical Cyclone Phet on Thursday, June 3, 2010. Record heat over southern Asia in May helped heat up the Arabian Sea to 2°C above normal, and the exceptionally warm SSTs helped fuel Tropical Cyclone Phet into the second strongest tropical cyclone ever recorded in the Arabian Sea. Phet peaked at Category 4 strength with 145 mph winds, and killed 44 people and did $700 million in damage to Oman. Only Category 5 Cyclone Gonu of 2007 was a stronger Arabian Sea cyclone.

A hyperactive Atlantic hurricane season: 3rd busiest on record
Sea surface temperatures that were the hottest on record over the main development region for Atlantic hurricanes helped fuel an exceptionally active 2010 Atlantic hurricane season. The nineteen named storms were the third most since 1851; the twelve hurricanes of 2010 ranked second most. Three major hurricanes occurred in rare or unprecedented locations. Julia was the easternmost major hurricane on record, Karl was the southernmost major hurricane on record in the Gulf of Mexico, and Earl was the 4th strongest hurricane so far north. The formation of Tomas so far south and east so late in the season (October 29) was unprecedented in the historical record; no named storm had ever been present east of the Lesser Antilles (61.5°W) and south of 12°N latitude so late in the year. Tomas made 2010 the 4th consecutive year with a November hurricane in the Atlantic--an occurrence unprecedented since records began in 1851.


Figure 10. Hurricane Earl as seen from the International Space Station on Thursday, September 2, 2010. Image credit: NASA astronaut Douglas Wheelock.

A rare tropical storm in the South Atlantic
A rare tropical storm formed in the South Atlantic off the coast of Brazil on March 10 - 11, and was named Tropical Storm Anita. Brazil has had only one landfalling tropical cyclone in its history, Cyclone Catarina of March 2004, one of only seven known tropical or subtropical cyclones to form in the South Atlantic, and the only one to reach hurricane strength. Anita of 2010 is probably the fourth strongest tropical/subtropical storm in the South Atlantic, behind Hurricane Catarina, an unnamed February 2006 storm that may have attained wind speeds of 65 mph, and a subtropical storm that brought heavy flooding to the coast of Uruguay in January 2009. Tropical cyclones rarely form in the South Atlantic Ocean, due to strong upper-level wind shear, cool water temperatures, and the lack of an initial disturbance to get things spinning (no African waves or Intertropical Convergence Zone.)


Figure 11. Visible satellite image of the Brazilian Tropical Storm Anita.

Strongest storm in Southwestern U.S. history
The most powerful low pressure system in 140 years of record keeping swept through the Southwest U.S. on January 20 - 21, 2010, bringing deadly flooding, tornadoes, hail, hurricane force winds, and blizzard conditions. The storm set all-time low pressure records over roughly 10 - 15% of the U.S.--southern Oregon, California, Nevada, Arizona, and Utah. Old records were broken by a wide margin in many locations, most notably in Los Angeles, where the old record of 29.25" set January 17, 1988, was shattered by 0.18" (6 mb). The record-setting low spawned an extremely intense cold front that swept through the Southwest. Winds ahead of the cold front hit sustained speeds of hurricane force--74 mph--at Apache Junction, 40 miles east of Phoenix, and wind gusts as high as 94 mph were recorded in Ajo, Arizona. High winds plunged visibility to zero in blowing dust on I-10 connecting Phoenix and Tucson, closing the Interstate.


Figure 12. Ominous clouds hover over Arizona's Superstition Mountains during Arizona's most powerful storm on record, on January 21, 2010. Image credit: wunderphotographer ChandlerMike.

Strongest non-coastal storm in U.S. history
A massive low pressure system intensified to record strength over northern Minnesota on October 26, 2010, resulting in the lowest barometric pressure readings ever recorded in the continental United States, except for hurricanes and nor'easters affecting the Atlantic seaboard. The 955 mb sea level pressure reported from Bigfork, Minnesota beat the previous low pressure record of 958 mb set during the Great Ohio Blizzard of January 26, 1978. Both Minnesota and Wisconsin set all-time low pressure records during the October 26 storm, and International Falls beat its previous low pressure record by nearly one-half inch of mercury--a truly amazing anomaly. The massive storm spawned 67 tornadoes over a four-day period, and brought sustained winds of 68 mph to Lake Superior.


Figure 13. Visible satellite image of the October 26, 2010 superstorm taken at 5:32pm EDT. At the time, Bigfork, Minnesota was reporting the lowest pressure ever recorded in a U.S. non-coastal storm, 955 mb. Image credit: NASA/GSFC.

Weakest and latest-ending East Asian monsoon on record
The summer monsoon over China's South China Sea was the weakest and latest-ending on record since detailed record keeping began in 1951, according to the Beijing Climate Center. The monsoon did not end until late October, nearly a month later than usual. The abnormal monsoon helped lead to precipitation 30% - 80% below normal in Northern China and Mongolia, and 30% - 100% above average across a wide swath of Central China. Western China saw summer precipitation more than 200% above average, and torrential monsoon rains triggered catastrophic landslides that killed 2137 people and did $759 million in damage. Monsoon floods in China killed an additional 1911 people, affected 134 million, and did $18 billion in damage in 2010, according to the WHO Collaborating Centre for Research on the Epidemiology of Disasters (CRED). This was the 2nd most expensive flooding disaster in Chinese history, behind the $30 billion price tag of the 1998 floods that killed 3656 people. China had floods in 1915, 1931, and 1959 that killed 3 million, 3.7 million, and 2 million people, respectively, but no damage estimates are available for these floods.


Figure 14. Paramilitary policemen help evacuate residents from Wanjia village of Fuzhou City, East China's Jiangxi province, June 22, 2010. Days of heavy rain burst the Changkai Dike of Fu River on June 21, threatening the lives of 145,000 local people. Image credit: Xinhua.

No monsoon depressions in India's Southwest Monsoon for 2nd time in 134 years
The Southwest Monsoon that affects India was fairly normal in 2010, bringing India rains within 2% of average. Much of the rain that falls in India from the monsoon typically comes from large regions of low pressure that form in the Bay of Bengal and move westwards over India. Typically, seven of these lows grow strong and well-organized enough to be labeled monsoon depressions, which are similar to but larger than tropical depressions. In 2010, no monsoon depressions formed--since 1877, the only other year without a single observed monsoon depression was 2002.

The Pakistani flood: most expensive natural disaster in Pakistan's history
A large monsoon low developed over the Bay of Bengal in late July and moved west towards Pakistan, creating a strong flow of moisture that helped trigger the deadly Pakistan floods of 2010. The floods were worsened by a persistent and unusually far southward dip in the jet stream, which brought cold air and rain-bearing low pressure systems over Pakistan. This unusual bend in the jet stream also helped bring Russia its record heat wave and drought. The Pakistani floods were the most expensive natural disaster in Pakistani history, killing 1985 people, affecting 20 million, and doing $9.5 billion in damage.


Figure 15. Local residents attempt to cross a washed-out road during the Pakistani flood catastrophe of 2010. Image credit: Pakistan Meteorology Department.

The Russian heat wave and drought: deadliest heat wave in human history
A scorching heat wave struck Moscow in late June 2010, and steadily increased in intensity through July as the jet stream remained "stuck" in an unusual loop that kept cool air and rain-bearing low pressure systems far north of the country. By July 14, the mercury hit 31°C (87°F) in Moscow, the first day of an incredible 33-day stretch with maximum temperatures of 30°C (86°F) or higher. Moscow's old extreme heat record, 37°C (99°F) set in 1920, was equaled or exceeded five times in a two-week period from July 26 - August 6, 2010, including an incredible 38.2°C (101°F) on July 29. Over a thousand Russians seeking to escape the heat drowned in swimming accidents, and thousands more died from the heat and from inhaling smoke and toxic fumes from massive wildfires. The associated drought cut Russia's wheat crop by 40%, cost the nation $15 billion, and led to a ban on grain exports. The grain export ban, in combination with bad weather elsewhere around the globe during 2010 - 2011, caused a sharp spike in world food prices that helped trigger civil unrest across much of northern Africa and the Middle East in 2011. At least 55,000 people died due to the heat wave, making it the deadliest heat wave in human history. A 2011 NOAA study concluded that "while a contribution to the heat wave from climate change could not be entirely ruled out, if it was present, it played a much smaller role than naturally occurring meteorological processes in explaining this heat wave's intensity." However, they noted that the climate models used for the study showed a rapidly increasing risk of such heat waves in western Russia, from less than 1% per year in 2010 to 10% or more per year by 2100.


Figure 16. Smoke from wildfires burning to the southeast of Moscow on August 12, 2010. Northerly winds were keeping the smoke from blowing over the city. Image credit: NASA.

Record rains trigger Australia's most expensive natural disaster in history
Australia's most expensive natural disaster in history is now the Queensland flood of 2010 - 2011, with a price tag as high as $30 billion. At least 35 were killed. The Australian Bureau of Meteorology's annual summary reported, "Sea surface temperatures in the Australian region during 2010 were the warmest value on record for the Australian region. Individual high monthly sea surface temperature records were also set during 2010 in March, April, June, September, October, November and December. Along with favourable hemispheric circulation associated with the 2010 La Niña, very warm sea surface temperatures contributed to the record rainfall and very high humidity across eastern Australia during winter and spring." In 2010, Australia had its wettest spring (September - November) since records began 111 years ago, with some sections of coastal Queensland receiving over 4 feet (1200 mm) of rain. Rainfall in Queensland and all of eastern Australia in December was the greatest on record, and the year 2010 was the rainiest year on record for Queensland. Queensland has an area the size of Germany and France combined, and 3/4 of the region was declared a disaster zone.


Figure 17. The airport, the Bruce Highway, and large swaths of Rockhampton, Australia, went under water due to flooding from the Fitzroy River on January 9, 2011. The town of 75,000 was completely cut off by road and rail, and food, water and medicine had to be brought in by boat and helicopter. Image credit: NASA.

Heaviest rains on record trigger Colombia's worst flooding disaster in history
The 2010 rainy-season rains in Colombia were the heaviest in the 42 years since Colombia's weather service was created and began taking data. Floods and landslides killed 528, did $1 billion in damage, and left 2.2 million homeless, making it Colombia's most expensive, most widespread, and 2nd deadliest flooding disaster in history. Colombia's president Juan Manuel Santos said, "the tragedy the country is going through has no precedents in our history."


Figure 18. A daring rescue of two girls stranded in a taxi by flash flood waters in Barranquilla, northern Colombia, on August 14, 2010.

Tennessee's 1-in-1000 year flood kills 30, does $2.4 billion in damage
Tennessee's greatest disaster since the Civil War hit on May 1 - 2, 2010, when an epic deluge of rain brought by an "atmospheric river" of moisture dumped up to 17.73" of rain on the state. Nashville had its heaviest 1-day and 2-day rainfall amounts in its history, with a remarkable 7.25" on May 2 breaking the record for most rain in a single day. Only two days into the month, the May 1 - 2 rains made it the rainiest May in Nashville's history. The record rains sent the Cumberland River in downtown Nashville surging to 51.86', 12' over flood height, and the highest level the river has reached since a flood control project was completed in the early 1960s. At least four rivers in Tennessee reached their greatest flood heights on record. Most remarkable was the Duck River at Centerville, which crested at 47', a full 25 feet above flood stage and ten feet higher than the previous record crest, set in 1948.


Figure 19. A portable classroom building from a nearby high school floats past submerged cars on I-24 near Nashville, TN on May 1, 2010. One person died in the flooding in this region of I-24. Roughly 200 - 250 vehicles got submerged on this section of I-24, according to wunderphotographer laughingjester, who was a tow truck operator called in to clear out the stranded vehicles.

When was the last time global weather was so extreme?
It is difficult to say whether the weather events of a particular year are more or less extreme globally than other years, since we have no objective global index that measures extremes. However, we do have one for the U.S.--NOAA's Climate Extremes Index (CEI), which looks at the percentage area of the contiguous U.S. experiencing top 10% or bottom 10% monthly maximum and minimum temperatures, monthly drought, and daily precipitation. The Climate Extremes Index rated 1998 as the most extreme year of the past century in the U.S. That year was also the warmest year since accurate records began in 1895, so it makes sense that the warmest year in Earth's recorded history--2010--was also probably one of the most extreme for both temperature and precipitation. Hot years tend to generate more wet and dry extremes than cold years, since there is more energy available to fuel the evaporation that drives heavy rains and snows, and to make droughts hotter and drier in the places that storms avoid. Looking back through the 1800s, which was a very cool period, I can't find any years that had more exceptional global extremes in weather than 2010, until I reach 1816. That was the year of the devastating "Year Without a Summer"--caused by the massive climate-altering 1815 eruption of Indonesia's Mt. Tambora, the largest volcanic eruption since at least 536 A.D. It is quite possible that 2010 was the most extreme weather year globally since 1816.
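
To make the CEI idea concrete, here is a minimal sketch of one CEI-style component--the fraction of stations whose monthly temperature falls in the top or bottom decile of their own record. This is a simplified illustration with synthetic placeholder data, not NOAA's actual CEI code:

```python
import numpy as np

def cei_like_fraction(monthly_temps: np.ndarray) -> np.ndarray:
    """Fraction of stations per year in the top or bottom 10% of their record.

    monthly_temps: shape (n_years, n_stations), e.g. July-mean temperatures.
    A simplified stand-in for one component of NOAA's Climate Extremes
    Index, which combines temperature, drought, and precipitation extremes.
    """
    lo = np.percentile(monthly_temps, 10, axis=0)  # per-station 10th percentile
    hi = np.percentile(monthly_temps, 90, axis=0)  # per-station 90th percentile
    extreme = (monthly_temps <= lo) | (monthly_temps >= hi)
    return extreme.mean(axis=1)  # fraction of stations that are extreme, per year

# Toy usage: 100 years x 500 stations of synthetic data with a warming trend.
rng = np.random.default_rng(0)
temps = rng.normal(20, 1, (100, 500)) + np.linspace(0, 1.5, 100)[:, None]
print(cei_like_fraction(temps)[-5:])  # recent years show elevated fractions
```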

Where will Earth's climate go from here?
The pace of extreme weather events has remained remarkably high during 2011, giving rise to the question--is the "Global Weirding" of 2010 and 2011 the new normal? Has human-caused climate change destabilized the climate, bringing these extreme, unprecedented weather events? Any one of the extreme weather events of 2010 or 2011 could have occurred naturally sometime during the past 1,000 years. But it is highly improbable that the remarkable extreme weather events of 2010 and 2011 could have all happened in such a short period of time without some powerful climate-altering force at work. The best science we have right now maintains that human-caused emissions of heat-trapping gases like CO2 are the most likely cause of such a climate-altering force.

Human-caused climate change has fundamentally altered the atmosphere by adding more heat and moisture. Observations confirm that global atmospheric water vapor has increased by about 4% since 1970, which is what theory says should have happened given the observed 0.5°C (0.9°F) warming of the planet's oceans during the same period. Shifts of this magnitude are capable of significantly affecting the path and strength of the jet stream, the behavior of the planet's monsoons, and the paths of rain- and snow-bearing weather systems. For example, the average position of the jet stream retreated poleward 270 miles (435 km) during a 22-year period ending in 2001, in line with predictions from climate models. A naturally extreme year, when embedded in such a changed atmosphere, is capable of causing dramatic, unprecedented extremes like we observed during 2010 and 2011.

That's the best theory I have to explain the extreme weather events of 2010 and 2011: natural extremes of El Niño, La Niña and other natural weather patterns combined with significant shifts in atmospheric circulation and the extra heat and atmospheric moisture due to human-caused climate change to create an extraordinary period of extreme weather. However, I don't believe that years like 2010 and 2011 will become the "new normal" in the coming decade. Many of the flood disasters in 2010 - 2011 were undoubtedly heavily influenced by the strong El Niño and La Niña events that occurred, and we're due for a few quiet years without a strong El Niño or La Niña. There's also the possibility that a major volcanic eruption in the tropics or a significant quiet period on the sun could help cool the climate for a few years, cutting down on heat and flooding extremes (though major eruptions tend to increase drought).

But the ever-increasing amounts of heat-trapping gases humans are emitting into the air put tremendous pressure on the climate system to shift to a new, radically different, warmer state, and the extreme weather of 2010 - 2011 suggests that the transition is already well underway. A warmer planet has more energy to power stronger storms, hotter heat waves, more intense droughts, heavier flooding rains, and record glacier melt that will drive accelerating sea level rise. I expect that 20 - 30 years from now, extreme weather years like we witnessed in 2010 will become the new normal.
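
The ~4% figure is consistent with the Clausius-Clapeyron relation, under which the atmosphere's water-holding capacity grows by roughly 7% per 1°C of warming. A quick check using the standard Magnus approximation for saturation vapor pressure (the 15°C baseline is an illustrative assumption; the 0.5°C warming is the figure quoted above):

```python
import math

def saturation_vapor_pressure(t_celsius: float) -> float:
    """Magnus approximation for saturation vapor pressure over water (hPa)."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

t0 = 15.0  # assumed baseline temperature, deg C (illustrative)
dt = 0.5   # ocean surface warming quoted above, deg C
increase = saturation_vapor_pressure(t0 + dt) / saturation_vapor_pressure(t0) - 1
print(f"Water-holding capacity increase: {increase:.1%}")  # about 3 - 4%
```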

Finally, I'll leave you with a quote from Dr. Ricky Rood's climate change blog, from his recent post, Changing the Conversation: Extreme Weather and Climate: "Given that greenhouse gases are well known to hold energy close to the Earth, those who deny a human-caused impact on weather need to pose a viable mechanism of how the Earth can hold in more energy and the weather not be changed. Think about it."

Reposted from Weather Underground by Dr Jeff Masters, Director of Meteorology.



Comments


Comments 201 to 214 out of 214:

  1. Albatross @197: You are busy this weekend, but I can demonstrate the validity of my claim that I found my sources independently of the NIPCC report you linked to on the Tamino website. I don't clear my google searches, so I checked back on what words I typed to find this report on the 1000-year drought. First I was seeking information on Amazon droughts. I tried droughts in the last 100 years without luck, and then I tried 1000 years but could not find any articles to show data. My next search was "1000 year study of droughts in North America". If you google this you will see the order of files I posted. The first one that came up was the British Columbia fires. On the same page was the drought article that was written by one of your friends. On my post they are in the same order as they are on the google page I brought up. I read through the articles, looked at the graphs, read through some sections again to make sure I was understanding the content, and then decided I could use these to demonstrate that current events are not getting more extreme when tested against a longer time frame. And most show some form of cycle that may be longer than the 30-year climate frame.
  2. Tom Curtis @169: While I was doing a google search I found this presentation. It is a precipitation study that does go back several years, and it does show cycles. Maybe there are cycles longer than 30 years that climatologists have not entered into their models yet. Don't know if this is one, but it is an elaborate presentation and does indicate that longer-term cycles are real (variations of PDO and ADO). Drought cycles in Western US long term showing cycles.
  3. Norman @195, the increase in severe weather events expected from global warming at the moment is slight, and the occurrence of severe weather events sporadic. Consequently the noise is very large compared to the signal. So, if you choose just one, or a few, locations, it is unlikely that there will be a statistically significant trend, and whatever trend there is may be in either direction. That is the nature of noisy data when you do not have a lot of data to work with. In contrast, if you look at a lot of data, as done for example by Munich Re, a signal can be detected. Or you could look at the trend in the length of warm spells Australia-wide. You will then see that the trends you find in the Perth heatwave data are in fact artifacts of the noisy data. You can also check parallel information, such as the general trend in Perth temperatures, which would have shown you the same thing. Frankly, I should not have to chase down this auxiliary data for you. If you were what you purport to be, you would be doing it yourself instead of seizing on any little piece of data you think could undermine the position you oppose and rushing in a post which shows no significant thought on the topic.
  4. P. M. Williams, "Modelling Seasonality and Trends in Daily Rainfall Data" (available here) will be of interest to those interested in extreme precipitation (it is more concerned with the statistical methodology than the data, but an empirical study based on this method would be really interesting). Essentially it allows you to model the probability of rainfall, and the parameters of a Gamma distribution describing the plausible amount of rainfall, as functions of time. This allows you to see how the distribution of rainfall has changed; see e.g. fig 2, which shows that the probability of rainfall, the mean rainfall, and also the rainfall variability at Pomarico have all declined since the 1950s, especially the variability (as measured by the standard deviation). Integrating the upper or lower tails of this distribution gives an indication of the kinds of extreme events we might expect to see in the future. In the case of Pomarico, it seems likely that droughts are becoming a little more likely, but that heavy rainfall is becoming less likely (the Gamma distribution is skewed, so if the standard deviation is reduced, the upper tail will come in more than the lower tail). IMHO this paper ought to be much better cited than it is.
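
For anyone who wants to try the Gamma-distribution approach described in the comment above, here is a minimal sketch using synthetic wet-day rainfall in place of a real station record (a full study would, as Williams does, let the parameters vary with time):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic wet-day rainfall amounts (mm); a stand-in for real station data.
rainfall = rng.gamma(shape=0.8, scale=10.0, size=2000)

# Fit a Gamma distribution to the wet-day amounts (location pinned at zero).
shape, loc, scale = stats.gamma.fit(rainfall, floc=0)

# Integrate the upper tail to estimate the chance that a wet day exceeds
# 50 mm--the kind of tail probability the comment suggests tracking in time.
p_heavy = stats.gamma.sf(50.0, shape, loc=loc, scale=scale)
print(f"P(wet-day rainfall > 50 mm) = {p_heavy:.4f}")
```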
  5. Tom Curtis @203: My post at 195 was more for DB. He was asking that I start doing some of my own statistical analysis. I was not doing an in-depth analysis to make a point with this. I was just looking for raw data. After about 7 or 8 pages of searches the Perth, Australia page came up with raw data. So I plugged the data into an Excel spreadsheet and made trend lines. It was more an exercise for DB, to determine if I was on the right track with a small data set before I would work on a larger one. Entering lots of data on an Excel spreadsheet is time consuming and one has to be very careful not to add a wrong data point (typing error). I enter it and check it over. If I was at least on the right track I could then tackle larger data sets. I have already been steered away from selecting one data point with my Omaha snowfall connection to temperature post. Now I am seeking regional areas and the longest data trends I can find. I think I would pale matched against your searching skills (a master at selecting the correct key words). I type in what I think will get me data and scroll through 10 pages of the same material that has nothing to do with my search. You have internet search skills I do not.
    Response:

    [DB] While I can appreciate wanting me to vet your methodology, I'm hardly a statistician (Dikran would be the person to ask on that).  But I do know enough about stats, having followed Tamino's Open Mind for the past 4 years and climate science (and science in general) for about 30, to know that a focus on a few data points out of a larger set is improper, as I noted previously.

    The level and scope of analysis you're trying to do will also determine the types of methodologies you need to follow to come to a proper conclusion.  As you're dealing with climate science data, the link I gave you earlier to D Kelly O'Day's site should be of great profit to you, as you can easily see how someone well-versed in analysis in Excel and R does it (replete with actual workbooks).

    If you're going to do it, and I applaud the effort, do it right.  Perhaps Dikran can suggest a primer on time series analysis to help.  Dikran?

  6. DB @205: I did visit the D Kelly O'Day site, found his raw data, and viewed his various graphs. I will have to do some internet study of statistics to learn the many easily made errors when looking at trends. I am a Chemistry major with only a surface level of statistics (I have an older brother, an actuary, who is quite skilled in this math). On Tamino's web site "Open Mind", which Albatross linked to, Tamino did demonstrate how easy it is to manipulate statistical trends to create false and misleading trends. The demonstration he used was Arctic ice, and how a short 8-year trend made it look like the Arctic ice was recovering. But pulling back and extending the length of time showed that the 8-year increase was only a very small uptick on a much larger overall decline.

My view is that weather is, at this time, not getting worse, and 2010 is not so exceptional. I am not making the claim that continued warming will not lead to an increase in severe weather patterns at some point. I am strongly in favor of seeking new energy forms to raise the standard of living for people. As one can see in the IPCC disaster report I linked to earlier, poverty conditions lead to a much higher death rate when a hazardous weather or geophysical event takes place--less property damage in a poor country, but a much larger loss of life. My favorite choice is the Bussard fusion reactor, which may be able to fuse abundant boron with a simple proton (hydrogen) without releasing the hot neutron produced by each deuterium-tritium reaction.

This web site is dedicated to science. When I took science courses in college we still performed experiments in lab that had been performed thousands of times, if not millions, before I did them. I see posts on this page saying "Leave it for the experts". That is not the way I was taught science. I was taught never to accept any authority's word but to investigate, research, and validate the information on my own the best I can. I don't happen to believe weather patterns are shifting to a new extreme and that 2010 is the first sign of it. Jeff Masters does think it is a strong possibility. My goal is to find out whether my current view is correct or wrong. I am looking for information that indicates that 2010 is not much different from many years before it (since that is what I believe is correct). As I am looking, I am reading material from both sides. On the climate issue I regularly read material from this web site, WUWT, Roy Spencer, Real Climate, Science of Doom, and my own searches for various materials after reading a particular article. I do try to keep an open mind on the issue. I would not be so stupid as to stay the course and destroy the planet's ecosystem I live on. I may be a "denier" but I am not a stupid or blind one.
    Response:

    [DB] Again, I applaud the spirit and the effort in wanting to "get it right".  In all walks of life we have the choice to trust the judgement of experts or to learn it from the ground up ourselves.  I do not personally feel the need to recreate the car from first principles in order to drive one. :-)

    But in the selfsame spirit of "getting it right" do not rely on blogs at all (even this one).  Get some textbooks on climate science and statistics.  Read the seminal papers in the field.  Do your own analysis.  In time you will come to discover that those dissembler sites (WUWT, CA, Spencer et al) have "done you wrong".

    And I have never thought you "stupid".

  7. Norman @201, thanks for the clarification. I have some time here to make a few short posts. Believe it or not, your myriad posts on the extreme threads have given me pause for thought--not so much on the strength or validity of your arguments, but more in trying to figure out a way of succinctly and clearly explaining the errors in your logic to you. We seem to be talking past each other, and my (and others') posts trying to reason with you repeatedly fail. Some posters have tried analogies. I'll try one more.

Let us consider the global SAT record, and let us go back to 2008. Some could argue that 2008 was not consistent with a warming world because it was one of the coolest years on record recently. But that would not be true--the long term trend is up. Now let us consider 2010, tied with 2005 for the warmest year on record; yet a perusal of the global SAT anomaly map would allow one to identify areas that were below average--that too could be used to (erroneously) claim that the warming is not significant or unusual. OK, well how about the fact that 1934 in the USA was for all practical purposes as warm as 1998? Surely the timing of those two extremes means that the planet is not warming or that current temperatures are not unusual? I mean, 1934 was a long time ago and anthro GHGs at that time were practically insignificant. But again, one would be wrong to deduce that--the USA covers <2% of the planet, and the planet has warmed quite a bit since 1934. And on and on one can go--for example, surely the previous interglacial was warmer than the current one? Indeed it was. Does that mean that the current warming is not unusual? The short answer is no.

My point is that one can always seek out locations or times when the data appear to go against the long-term upward trend in temperatures. But to use those data to conclude that the warming is either not happening or not significant is both wrong and misses the point altogether. And seeking out such data is not viewing the body of evidence, but is rather an elaborate form of cherry-picking and argumentum ad absurdum, something that both you and the NIPCC have done, perhaps even independently.
  8. Albatross @207: Maybe we are not connecting because, from this post, it seems you and I are not discussing the same issue. "My point is that one can always seek out locations or times when the data appear to go against the long-term upward trend in temperatures. But to use those data to conclude that the warming is either not happening or not significant is both wrong and misses the point altogether. And seeking out such data is not viewing the body of evidence, but is rather an elaborate form of cherry-picking and argumentum ad absurdum, something that both you and the NIPCC have done, perhaps even independently."

I am not debating whether the planet is warming, or whether the cause is mankind's abundant release of CO2 via burning carbon-based fuels. Debating that would be "denier" as defined (closed mind), since the facts are obvious: man is burning, collectively, large amounts of carbon-based fuel, and it is proven empirically that atmospheric CO2 will absorb and re-radiate IR radiation omnidirectionally, so that some returns to earth at the CO2 molecular resonance modes. I am only questioning the collection of severe weather events of 2010 and using this as proof that the warming globe is causing an increase in extreme weather that has caused more destruction and will continue to get worse as the planet continues to warm. We are on two different topics.

I do not know if global warming is or is not increasing the intensity, duration or frequency of severe weather events capable of causing disasters that kill people and destroy property. There are vast amounts of articles claiming it is a fact. When I do searches for evidence to determine if this is true, many state it is true but very few provide any proof. A general claim that a warmer world with more moisture in the atmosphere would automatically lead to more storms is not good science, from my understanding of it. You need to prove that a warmer world will lead to more storms or floods or droughts by at least providing a mechanism for what a warmer world will affect. How will it alter the jet stream? It may shift it poleward, but will that create more intense, dangerous storms? It certainly may, but I would like to see an explanation of how it would produce more storms--what will it do to cause the increase? That is what I am spending time on this site and others questioning. Just saying the events of 2010 are the result of a warmer/wetter world, without any challenge or any need to provide more detailed explanations of how each event listed was caused by the warmer/wetter world, is not an acceptable state for science. For instance, the Pakistan flood. It was caused by a blocking pattern. That is the known science that can be determined empirically. Saying "well, the earth is warmer and wetter and has more energy, so it caused this flood" should not satisfy any scientist. What forces of the warmer/wetter earth made this blocking pattern last longer? Did the warm air push a front into this area that would not have been there had the Earth been 0.8°C cooler?

Long post, but I hope that explains my position on this thread. It is not to deny AGW; if I felt strongly there was not sufficient evidence for this theory I would attempt it on another thread (as has been pointed out: keep the focus on topic). The topic here is that global warming is causing more extreme weather. Only by looking at long term trends in weather and climate patterns will this question be answered to satisfaction. Who knows, maybe I will find enough evidence to see that it is indeed the case. Most valuable to me are posts that include mechanisms, like Rob Painting's Amazon drought article on this web site. He linked warming North Atlantic waters to droughts in the Amazon. A warmer earth would have more chance of doing this than a cooler earth, so he is making a link to how a warming world can create drought conditions in a region.
  9. Norman @208, "I am not debating whether the planet is warming" Good, but you miraculously manage to still miss the point entirely... it was an analogy. "You need to prove that a warmer world will lead to more storms or floods or droughts by at least providing a mechanism for what a warmer world will affect." Strictly speaking, one cannot prove anything in science. That is just how it is. Now, I am going to spend some time with my family.
  10. My 2 +/- 2 cents on trends in extremeness: true extremes have small numbers of events, which makes it difficult to perform trending. The extremes need to be more tightly defined than is the case above (some of the head post examples are extreme, some are not, and some are ill-defined or not commonly measured). Tom's proposed definition up in post 110, two standard deviations from the mean for rainfall, needs to be tightened up IMO. The extremeness will depend on the coverage area, the time interval for the mean, and the distribution of that particular random variable. A normally distributed variable will have a relatively straightforward definition of extreme: 3 SDs in many cases and 4 SDs in most cases, as long as the time period and areal extents are large enough. Here's an example of extreme rainfall events: http://journals.ametsoc.org/doi/pdf/10.1175/1520-0493(1999)127%3C1954%3AEDREAT%3E2.0.CO%3B2 --e.g., 10 SDs above the daily average in an Indian monsoon, due to the skewness of the distribution making outliers more common than in a normal distribution. The Tennessee floods mentioned above had a large area of 1 in 1000 probability, so they qualify as extreme. In contrast, the 2010 Atlantic season described above as "hyperactive" is not a strong statistical outlier (12 versus 7 +/- 2 hurricanes, with some right skew).
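
To put numbers on the standard-deviation thresholds discussed above, here is a small sketch converting z-scores to return periods under a normal assumption (one-sided upper tail; as the comment notes, real rainfall distributions are skewed, so these are only a normal-distribution baseline):

```python
from scipy import stats

# One-sided upper-tail exceedance probability for z standard deviations,
# and the corresponding return period, assuming a normal distribution.
for z in (2, 3, 4):
    p = stats.norm.sf(z)  # P(X > mean + z*SD)
    print(f"{z} SD: exceedance probability {p:.2e}, "
          f"return period ~{1 / p:,.0f} events")
# Prints roughly 1-in-44, 1-in-741, and 1-in-31,600 events; a two-sided
# definition of "extreme" would halve these return periods.
```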
  11. Norman @192, several comments:

1) The graph is not directly comparable, as the CEI graph uses a 1 in 10 year definition of extreme, while Kunkel shows 1 year, 5 year and 20 year return intervals.

2) The graph you indicate only shows the period to 2000. From the CEI data, the period 2001-2010 shows a higher frequency of one day, 1 in 10 year return period precipitation events than does 1991-2000, which probably means it is equivalent to 1895-1900 for 1 year and 5 year return events, but less than 1895-1900 for 20 year return events. It is, however, hard to be sure, as no direct comparison is possible (see above).

3) The first interval on the graph, and the only interval to exceed 1991-2000 and presumably 2001-2010 in any category (except for the 1981-1990 five day duration / 20 year return interval), consists of only six years. It is probable that had the full decade been included the values would have been reduced, possibly significantly so. It is, of course, also possible that they would have been increased. If the four excluded years had followed the mean of the entire record, then the 1 day / 1 year return value would have been approx 12.6%, the 5 year return value would have been 11.4%, and the 20 year return value would have been around 28.8%--values lower than or equivalent to those for 1991-2000, and hence also lower than the 2001-2010 values.

These three considerations suggest we should treat the 1895-1900 data with considerable caution. That the period is also the most likely to be adversely affected by poor station quality and/or record keeping only reinforces that caution. For example, one explanation for the different patterns between 1 day, 5 day and 30 day records (see below) may simply be that records were not taken at consistent times in the early period, thus inflating single day values without altering multi-day values. A six hour delay in checking rain gauges could easily inflate daily values by 50% or more.

Having urged caution, however, I will take the data at face value (because it appears to counter my case rather than support it). That is, I will treat the data conservatively with respect to my hypothesis. Doing so, I proceeded to check the whole paper. What struck me is that though Figure 3 (extreme one day precipitation events) shows very high values for 1895-1900, that was not the case for 5 day and 30 day events (figures 4 and 5). For 5 day events, the 1895-1900 figures were less than the 1991-2000 (and presumably also 2001-2010) figures in all categories. The same is true for 30 day events, but in that case the number of 1 year and 5 year return events was much less than for the 1990's.

Assuming the data is genuinely representative of the decade 1891-1900, that indicates that the high number of extreme rainfall events in that decade has a distinct cause from those of the 1990's and presumably the 2000's. If they had the same cause, they would show a similar rainfall pattern to the most recent decades. The difference in causation may be something common to the two periods combined with some factor present in the recent decades but not in the 1890's. More likely it is entirely distinct, in that the only thing truly common between the 1990's high rate of extreme precipitation and the 2000's even higher rate is high global temperatures. In any event, the data presented ambivalently confirms the notion that the most recent period is unusual. Confirms, because in two out of three durations the 1990's and presumably the 2000's represent the highest frequency of heavy rainfall events, while in the third (1 day events) the 1890's are similar but neither conclusively greater nor less than the 2000's. The data certainly disconfirms the notion that some periodic oscillation is responsible for both the rainfall extremes of the 1890's and those of the last two decades, because of the distinct causal antecedents.

Now, having given you what I consider to be a fair analysis of the data, I am going to berate you. Firstly, drawing attention to the 1 day return interval without also mentioning the different pattern for the other durations is without question cherry picking (yet again). It is cherry picking even though you were responding to a comment about 1 day duration events, because the other durations were relevant in general, and definitely relevant to your hypothesis, which they undermine. (The original comment was not cherry picked in that the CEI data only includes the 1 day interval.) Further, your analysis was shoddy in that it made no mention of the confounding factors. That's par for the course in blog comments, but it is shoddy analysis. It is particularly so from somebody who is so scrupulous in finding confounding factors in data that does not suit you. In fact, it shows you are employing an evidential double standard (yet again). All of this is exactly what I would expect from a denier in sheep's clothing. But if by some extraordinary chance you are what you claim to be, you can't afford that sort of shoddiness. You need to look out for the confounding factors yourself, and always treat the data conservatively, i.e., assume it has the worst characteristics for your preferred hypothesis. Otherwise you will just keep on proving the maxim that the easiest person to fool is yourself.
12. Eric, extreme events already have a good statistical definition, namely the return period: an event might, for instance, have a return period of 100 years; in layman's terms, a "once in a hundred years" event. This definition has the advantage of automatically taking into account the skewness of the distribution. The small number of events does not pose a serious problem in estimating changes (trends) in return times. There is a branch of statistics called "extreme value theory" that has been developed to address exactly these kinds of problems. If you have a statistical background, there is a very good book on this by Stuart Coles (google that name to find tutorial articles).
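A minimal sketch of the return-period machinery Dikran describes, in Python with numpy and scipy; the "annual maxima" here are synthetic stand-ins, not real station data:

```python
# Fit a Generalised Extreme Value (GEV) distribution to annual maxima and
# read off return levels / return periods. Illustrative data only.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
annual_maxima = rng.gumbel(loc=60.0, scale=15.0, size=100)  # mm/day, synthetic

shape, loc, scale = genextreme.fit(annual_maxima)

# The T-year return level is exceeded with probability 1/T in any year
for T in (10, 100, 1000):
    level = genextreme.isf(1.0 / T, shape, loc, scale)
    print(f"1-in-{T}-year daily rainfall: {level:.1f} mm")

# Conversely, the estimated return period of an observed event
event = 120.0  # mm, hypothetical
print(f"Return period of {event} mm: "
      f"{1.0 / genextreme.sf(event, shape, loc, scale):.0f} years")
```

The skewness point is handled automatically: the fitted GEV shape parameter, not a normality assumption, controls how heavy the tail is.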
13. Norman wrote: "I am only questioning the collection of severe weather events of 2010 and using this as proof that the warming globe is causing an increase in extreme weather that have caused more destruction and will continue to get worse as the planet continues to warm." It isn't proof, it is corroborative evidence. In science you can't prove by observation, only disprove, and constant calls for proof of AGW (which is known to be impossible, even if the theory is correct) are a form of denial. A warmer world means that the atmosphere can hold more water vapour, and thus there will be a strengthening of the hydrological cycle. A consequence of that is likely to be an increase in extreme weather in some places. Thus an increase in weather extremes is what you would expect to see if AGW is occurring, but it isn't proof. If you think that is not good science, then it is your understanding that is at fault. That warm air holds more water vapour is observable to anyone with a thermometer and a hygrometer: compare humidity in summer months and in winter months. That water vapour has to go somewhere, as water is constantly being evaporated from the oceans by the sun's heat. More is evaporated in warm conditions than in cold. Thus more evaporation means more rain. It really isn't rocket science.
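The size of that effect is easy to estimate. A minimal sketch using the standard Bolton (1980) approximation to the Clausius-Clapeyron relation (Python, illustrative only):

```python
# Saturation vapour pressure vs temperature: how much more water vapour
# can air hold per degree of warming? Bolton (1980) Magnus-type formula.
import math

def saturation_vapour_pressure(t_celsius):
    """Saturation vapour pressure in hPa."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

for t in (0, 10, 20, 30):
    e0 = saturation_vapour_pressure(t)
    e1 = saturation_vapour_pressure(t + 1)
    print(f"{t:2d} C: {e0:6.2f} hPa; +1 C raises capacity by "
          f"{100 * (e1 / e0 - 1):.1f}%")
```

This prints roughly 6-7% more water-holding capacity per degree Celsius, which is the quantitative basis of the strengthened hydrological cycle argument.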
14. Eric (skeptic) @210:

1) The point about the size of the event is valid but irrelevant. Specifically, the events under discussion are those catalogued by Munich Re, where a tornado outbreak over three days spawning 300 tornadoes counts as a single "event".

2) Your definition of extreme sets too high a bar. Take, for example, the Moscow heat wave of 2010, which lay just over 4 standard deviations above the mean for July temperatures, giving it an annual return interval, on the assumption of a normal distribution, of 1 in 10,000 years. According to your proposed definition, that only just qualifies as extreme. As another example, the rains causing the Brisbane flood had an Annual Exceedance Probability of 1 in 400 for the whole event, and of 1 in 2000 for the peak rainfall. That works out as lying between 3 and 3.5 standard deviations from the mean. This is a flood that sent twice the amount of water down the river as when Brisbane got hit by a cyclone in 1974, and as much as Brisbane's record-breaking flood of 1893 (also cyclone related), but this was just from a line of thunderstorms. Yet according to you it is borderline as to whether it should count as extreme. Or consider the Toowoomba flood. The rainfall for that flood had an AEP of just over 1 in 100, equivalent to a standard deviation of just 2.6. Consequently Toowoomba's "instant inland tsunami" does not count as an extreme event for you.

The clear purpose of your definition is to make "extreme events", as defined by you, so rare that statistical analysis of trends becomes impossible. You as much as say so. In your opinion, "true extremes have small numbers of events which make it difficult to perform trending". But that is as good as saying your definition of true extremes is useless for analysis. In contrast, an event 2 SD above the mean (roughly a 1-in-20-year event) is certainly extreme enough for those caught in the middle. Further, those events are predicted to increase in number with global warming. So why insist on a definition of extreme which prevents analysing trends, when the theory we are examining makes predictions regarding events for which we are able to examine trends? Why, indeed, is global warming "skepticism" only ever plausible once you shift the spotlights so that the evidence all lies undisturbed in the darkness?
15. #214 Tom Curtis at 17:20 PM on 3 July, 2011:

"As another example, the rains causing the Brisbane flood had an Annual Exceedance Probability of 1 in 400 for the whole event, and of 1 in 2000 for the peak rainfall. That works out as lying between 3 and 3.5 standard deviations from the mean."

Right, a 1 in 2000 event is 3.29053 standard deviations (which is indeed between 3 and 3.5), but only if the phenomenon examined follows a normal distribution. However, it is well known that the probability distributions of weather/climate parameters are very far from normal; in this domain so-called fat-tailed distributions prevail. This fact leads to distorted risk estimates. Therefore your analysis does not make sense in this context.
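The normal-distribution arithmetic in this exchange can be checked in a few lines (a sketch assuming scipy):

```python
# Convert an annual exceedance probability to a one-tailed z-score
# under the normality assumption being debated here.
from scipy.stats import norm

for one_in_n in (400, 2000):
    z = norm.isf(1.0 / one_in_n)
    print(f"1 in {one_in_n} event = {z:.5f} standard deviations")
# 1 in 2000 gives 3.29053, the figure quoted above. Under a fat-tailed
# distribution, an event 3.29 standard deviations out is considerably
# more probable than 1 in 2000, which is Berényi Péter's point.
```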
16. Berényi Péter, while you are correct that some weather parameters are non-normal (for frontal precipitation a gamma distribution is appropriate; for convective rainfall an exponential distribution is the usual model), you have not established that the distributions are sufficiently heavy-tailed to make Tom's argument qualitatively incorrect. However, the point is moot, as the statistical definition of an extreme event is based on the return period, not the standard deviation.
17. Berényi Péter @215, my figures for the Brisbane flood are based on the SEQWater report on the flood, and in particular on the rainfall intensity graphs for Lowood and Helidon. The graphs plot the peak rainfall recorded for each duration at the two sites. The dashed lines show the Annual Exceedance Probability, with the highest shown being the white dashed line at an AEP of 1 in 2000. AEPs of greater than 1 in 2000 are not plotted, because there is insufficient data after a century of recording to determine the values with any accuracy. You will notice that the recorded rainfall intensity reaches the 1 in 2000 line for durations between about 6 and 24 hours (Lowood) and between 1.5 and 24 hours (Helidon). The claim in relation to the Toowoomba storm is based on the Insurance Council of Australia's report on the event, which states:
    "Short-term rainfall data are available at nine raingauges in and around Gowrie Creek catchment. At six of these nine gauges, rainfall severities were greater than 100-Years ARI for rainfall durations of 30 minutes to 3 hours. At another gauge, rainfall severities were greater than 50- Years ARI. Readings at the remaining two gauges were far less severe (less than 20-Years ARI)."
So, while I did just use the probabilities from a normal distribution in determining the standard deviation, that just means, if anything, that I have overstated the SD, which would strengthen my point, not weaken it.

Your criticism may be valid with regard to the Moscow heatwave, where I have relied on Tamino's analysis. The 1 in 10,000 figure assumes a normal distribution. Tamino calculates the return period using extreme value theory as 1 in 260 years. However, the Moscow heat wave was a significant outlier on the tail of the QQ plot. Assuming that extreme value theory or normal distributions can be used to calculate the probability of extreme outliers is a fallacy equivalent to assuming that you have a chance of rolling 13 on two loaded six-sided dice. Consequently the most that can be said about the Moscow heat wave is that its AEP was not less than 1 in 260 assuming global warming; and not significantly less than 1 in 1,000 assuming no global warming, but possibly not physically realisable in the absence of global warming. Using the best value, an AEP of 1 in 260, would have, again, strengthened my case against Eric's definition. If you have a problem with my using values that are conservative relative to my argument, let me know. Then I too can argue like a denier.

While on the topic, I disagree with Dikran about defining extreme events by return intervals. It is in fact the best practice. However, the prediction of more extreme events from global warming is not a prediction restricted to events of a certain return interval. Further, not all data sets are easily classified on that basis; in particular, Munich Re's data set is not. The definition of extreme is therefore relative to a particular data set, with the only (but very important) proviso that the definition of extreme used is one for which the prediction of AGW holds.
18. Hi Dikran, the Wikipedia article on return period does not offer any criterion for what is extreme. In fact it assumes that an "extreme" or other threshold event has already been defined, and then simply offers an estimation of its likelihood in any year or years. One cannot assume the event is extreme just because the likelihood is low using that formula. There are a lot of factors that need to be considered, some of which I mentioned above, such as the event duration and areal extent. For example, flash floods are much more common than river basin floods because the areal extent is smaller, along with the event duration. My own location in Virginia is hilly enough to have flash floods, mainly from training thunderstorms. The probability for any single location is low, but much higher considering the large number of potential flash flood locations. Truly a dime a dozen. Meanwhile, the major river that I live on shows no trend in extreme floods (1942 is the flood of record; see http://www.erh.noaa.gov/er/lwx/Historic_Events/va-floods.html).

Tom, on your point 1, the fact that Munich Re defines tornado events in some arbitrary way does not invalidate my point about areal extent and event duration. On your point 2, the Moscow heat wave qualifies as extreme by my criteria, and not just barely. However, the blocking pattern that enabled the heat wave is a natural phenomenon and makes the probability of the extreme event higher. The means and standard deviations of heat waves are higher in blocking patterns, which makes extreme heat waves more likely. Independent of that, AGW makes extreme heat waves more likely. The Toowoomba flash flood was indeed not extreme. The Brisbane flood had 1 in 2000 rainfall over a very small areal extent, which is not uncommon and not extreme. The more widespread rain was borderline extreme, as you point out, and fell on saturated ground. There were complications in the measurement of river levels with damming in place. That damming may mask an extreme event by lowering levels downstream. Urban hydrology may exacerbate an event, causing it to have river level extremes which would not otherwise have existed. Damages and economic losses were exacerbated by poor zoning. IMO the Brisbane flood was a somewhat borderline extreme weather event creating an extreme flood due to a variety of factors.
  19. Norman @200:
    "On both links you posted, the 1 and 2 levels on the charts are not considered disasters. Loew & Wirtz call the first two "small and moderate loss events" the next four levels are various degrees of disaters. Munich Re lists the higher levels as catastrophes and not disasters but the meaning would be the same."
I wondered about that from the slides to which you linked, so I looked a bit further for clarification and found Löw and Wirtz, who say:
    "The Munich Re global loss database NatCatSERVICE divides natural disasters into six damage categories based on their financial and human impact, from a natural event with very little economic impact to a great natural disaster (see Figures 3). The combination of the number of fatalities and/or the overall loss impact ensures that each event can be categorised with one or two of these criteria. The limits of overall losses are adjusted to take account of inflation, so it is possible to go back into the past if new sources are available."
(My emphasis.) That wording strongly suggests that category 1 and 2 events are included in the database as natural disasters. That the events are described in a continuum of increasing severity reinforces that interpretation, with 1 (small-scale loss event) and 2 (moderate loss event) being followed by 3 (severe disaster). If level 1 and 2 events are not included, we have to wonder why no minor or moderate disasters ever occur in the world. You also have to wonder why an event with 9 fatalities (level 2) is not worth calling a disaster. I know insurance agents are widely considered to be heartless, but that is going too far. If you disagree with that assessment, may I suggest you contact either Munich Re Geo or the NatCatSERVICE for clarification.
The map you looked at shows only the United States as having more than 650 natural disasters in the period 1980-2004. Chile is shown as having from 51 to 150, the second lowest category. The reasons different countries have different numbers of natural disasters vary. The US has a large number in part because of Tornado Alley. Australia, in contrast, has relatively few (351 to 650), partly because it is so sparsely populated. It is almost certain that Africa shows very few catastrophes (outside of South Africa) because of poor reporting: UN agencies, aid agencies and other NGOs only become involved in and report large natural disasters, many African governments are dysfunctional, and western media ignore all but the largest catastrophes in Africa, and often ignore those as well. Whether they like it or not, this inevitably introduces a bias into Munich Re's figures for Africa and (probably) South America and former Soviet Bloc nations. Because of that bias, although I quote the headline trends for simplicity, I only rely on trends that are also reflected in North American and/or European figures, where these selection biases do not apply. Indeed, because reporting from Africa and South America is so low, any distortion they introduce to the overall figures is slight. Consequently this bias is unlikely to distort trend data very much, but it does make international comparisons of limited value, and means the total figures significantly understate the true number of natural disasters. Of more concern are China, SE Asia and the former Soviet Bloc nations, where there may be a significant trend of increased reporting over time. Again, the figures are not large relative to the total trend, and none of these regions shows an unusual trend when compared to Europe or North America, but it is something to be aware of. The important point, however, is that global trends are reflected in European and North American trends, where these reporting issues are not a significant factor.
    "I looked back at your post 116 on the thread "Linking Extreme Weather and Global Warming" where you posted the Munich Re Graph. I guess the graph shows about a 55%increase in 25 years in number of disasters (about 400 in 1980 to 900 in 2005, and annual rate of 2.2%) Here is a link on the yearly rise in home prices. National average was 2.3% but some areas were at 3% and more. A yearly rise of 2.3 in a home's value (not to mention the items in the house) will get you a 57.5% increase in value in 25 years. The value of property rose at a faster rate than the disaster rate. But the biggest thing is there incremental choices they picked for determining a disaster. From 1980 it was $25 million. In 1990 it was $40 million. In 2005 it was $50 million. The overall rate of determining a disater rose 50% in 25 year or at a rate of 2% a year. From 1990 to 2005 the rate of increase would only be 1.3% a year. The criteria to become a disaster was rising at a much slower rate than property values."
First, it is important to note that the categories are inflation adjusted. That is, the threshold levels are stated in inflation-adjusted figures. The changing threshold is therefore only introduced to adjust for increased wealth in society. Second, the adjustment is carried out by two methods: by fixed stepwise adjustment, as reported by Löw and Wirtz, or by linear interpolation, as reported by Neumayer and Barthel. In the stepwise approach, the levels for a Major Catastrophe are $85 million from 1980-1989, $160 million from 1990-1999, and $200 million from 2000 to 2009. For linear interpolation, the figures are interpolated linearly between $85 million in 1980 and $200 million in 2009. In the graphs contrasting all disasters with major disasters, it is the Neumayer and Barthel method that is used. The Munich Re graphs show either all disasters (categories 1 to 6), Devastating and Great Natural Catastrophes (categories 5 and 6), or Great Natural Catastrophes (category 6). Both of the latter have too few events for effective statistical analysis of any trend due to global warming.

Using property prices as an index of overall wealth is probably as good as you'll get for a simple comparison. So, to see how the Neumayer and Barthel adjustment fared, I compared their interpolated threshold to an index based on exponential growth of 2.3% (average value) and 3% (high value). Of immediate interest is that 85*1.03^29 is 200, give or take, so the endpoints preserve their values to within 0.15%. This means the middle values would overstate the relevant threshold by up to 9.5% using the 3% inflator, thus understating the number of natural catastrophes. But the average price rise was 2.3%, not 3%. Using the average price rise as the inflator, the 2009 threshold should be 165 million dollars, and the 2010 threshold should be 168 million. That is, using the average increase in US property prices, the Munich Re thresholds overstate the threshold in later years by as much as 21.5%. The effect of this is that Munich Re would be understating the number of disasters in 2009 and 2010 relative to 1985. The rarity of these events, and the large variability of the damage done, mean the understatement of the number of major disasters will be nowhere near 20%, but it is an understatement, not an overstatement as you conjectured.
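A minimal sketch of that comparison (Python/numpy), using the Neumayer and Barthel interpolation described above and the two property-price inflators from the comment:

```python
# Compare Munich Re's linearly interpolated "major catastrophe" threshold
# (USD 85m in 1980 to USD 200m in 2009) with an exponential wealth index.
import numpy as np

years = np.arange(1980, 2010)
threshold = np.linspace(85.0, 200.0, len(years))  # linear interpolation

for rate in (0.023, 0.030):  # average and high US property price rises
    index = 85.0 * (1 + rate) ** (years - 1980)
    gap = 100 * (threshold / index - 1)
    print(f"{rate:.1%} growth: 2009 index = {index[-1]:.0f}m, "
          f"threshold overstates it by up to {gap.max():.1f}%")
# At 3.0% the endpoints nearly coincide (85 * 1.03**29 ~ 200) and the
# overshoot peaks mid-period near 9.5%; at 2.3% the overshoot grows to
# roughly 21-22% by 2009, close to the figures in the comment above.
```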
20. Eric, would you say that an event that is sufficiently dramatic that it only happens once in a thousand years is an extreme event? The return period is a measure of the "extremeness" of an event; the threshold return period that demarcates extremes from non-extremes depends on the nature of the application. The same is true of statistical significance tests. The 95% threshold is often used to indicate an event sufficiently extreme that it is difficult to explain by random chance; however, what people often forget is that Fisher himself said that the threshold depends on the nature and intent of the analysis. Expecting a single threshold that is right for every application is unreasonable in both cases.

As to the additional effects such as area, event duration, etc.: these are conditioning variables on which the return period depends, and they are already taken into account in the analysis. For instance: what is the return period for 12" of rain falling in some particular catchment in Cumbria in August? If the events you are discussing are a "dime a dozen", then by definition they have a short return period, and hence would not be considered extreme according to their return period. That is the point: the return period is self-calibrating to the distribution of events at the location in question.
  21. Eric (skeptic) @218, you seem determined to exclude yourself from the discussion as completely irrelevant. To see what I mean, consider this definition of a severe thunderstorm:
    "In Australia, for a thunderstorm to be classifi ed as severe by the Bureau of Meteorology, it needs to produce any of the following: • Hailstones with a diameter of 2 cm or more at the ground • Wind gusts of 90 km/h or greater at 10 m above the ground • Flash flooding • A tornado.
That comes from the joint CSIRO/BoM 2007 report on climate change, which then goes on to make predictions about cool-season tornadoes and the number of hail days expected per annum. I have seen other predictions in Australia for 1-in-20 AEP rainfall events over various durations. From memory, it was an expected 30% increase for the top 5% of one-hour events, and a 5% increase for the top 5% of one-day rainfall events, by 2030. The twenty-year return peak rainfall for one hour at one Toowoomba station (Middle Ridge) is about 60 mm, with the one-hundred-year value about 80 mm, ie, about a third higher. For another (USQ) they are about 55 mm and 75 mm respectively, ie, 36% higher. So in terms of rainfall intensity, and the fact that it involved a flash flood, Toowoomba was exactly the sort of event about which predictions were made.

Now, the question is: was Toowoomba's flash flood a 1-in-100 event, or has it now become a 1-in-20-year event (as is predicted for around 20 years from today)? The only way you can find out is by counting the number of relevant events. You, however, have carefully contrived your definition of "extreme event" so that it does not count exactly the sort of events we need to track to see how the AGW prediction is bearing out. In doing so, of course, you have contrived your definition to include events about which there is no explicit AGW prediction in the near term (or in many cases the long term), and which occur far too rarely for any sort of statistical analysis. What that amounts to is an attempt to remove your beliefs from empirical challenge. It also means your discussion on this thread is irrelevant, because you have carefully chosen to avoid discussing the type of events about which AGW has made its predictions.
22. Tom Curtis, counting events is not the only way in which you could determine whether the return period has changed. For instance, you could model the distribution of observed rainfall at a particular location as a function of time, using perhaps the methods set out in the paper by Williams I discussed in an earlier post. You can then see the trend in the return period by plotting, say, the 99th centile of the daily distributions. This would be a reasonable form of analysis provided the extreme events are generated by the same basic physical mechanisms as the day-to-day rainfall (just with "outlier" magnitudes), rather than being caused by a fundamentally different process. Alternatively, extreme value regression methods might be used.
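A toy illustration of that approach (Python/numpy): synthetic daily rainfall with a built-in 0.5%-per-year drift in intensity, tracked through its annual 99th centile rather than through counts of rare events:

```python
# Trend in a high quantile of the daily rainfall distribution.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1950, 2011)
q99 = []
for i in range(len(years)):
    # gamma-distributed wet-day rainfall; scale drifts up 0.5% per year
    daily = rng.gamma(shape=0.8, scale=8.0 * 1.005**i, size=365)
    q99.append(np.percentile(daily, 99))

slope = np.polyfit(years, np.array(q99), 1)[0]
print(f"trend in annual 99th centile: {slope:+.3f} mm per year")
```

Because the quantile is estimated from every day's data rather than from a handful of rare exceedances, its trend is far less noisy than a count of, say, 1-in-20-year events would be.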
23. Dikran, the events are not a dime a dozen, but the locations for potential extreme events are. So we have a situation of numerous independent tests of an event with a very low probability (by the recurrence measurement). By multiplying the probabilities of non-occurrence at each location we can determine the probability that the event will not occur at any location, then invert that to obtain the probability of occurrence. For example, an event that has a 0.1% probability in one location will have a 1 - (1 - 0.001)^100, or 9.5%, probability of occurring in at least one of 100 locations. The caveat is that the weather event occurrences are not necessarily independent tests. The other problem with the binomial approach is that the extreme event has to be defined a priori. As an extreme example, I could define "extreme" as 1.00 inches of rain, no more and no less. Or 0.99 inches. The more of these events that I invent, the greater the probability that some event will occur. Those are analogous to the head post stating: "The formation of Tomas so far south and east so late in the season (October 29) was unprecedented in the historical record; no named storm had ever been present east of the Lesser Antilles (61.5°W) and south of 12°N latitude so late in the year". I could easily pick some year in the 1950's or another early year and pick out a hurricane that formed unusually early and far west, or north, or unusually late and farther north, or any of dozens of potential combinations. Then I could point to the lack of a trend since the 1950's, since I invented all those extremes for that decade.

Tom, I think we are differing on the definition of an extreme event. I find it critical to define that up front, then look for trends, then attribute to AGW and natural causes. The question of whether Toowoomba's flash flood has gone from 1 in 100 to 1 in 20 may be a suitable topic for studying attribution, but not attribution of extreme events. The Toowoomba event was one location, but the probability of an occurrence of such an event must be calculated across all potential flash flood locations. In that context the Toowoomba event is not extreme, because if it didn't happen there, it could happen somewhere else.

Dikran (222), I don't think you can equate the physical processes of the extreme and day-to-day events, nor use the statistics of the day-to-day events to say anything about the probability of extreme events.
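The corrected multi-site arithmetic, as a quick check (Python):

```python
# Chance that an event with annual probability p at one site occurs at
# at least one of n sites, assuming (unrealistically) independent sites.
p, n = 0.001, 100
p_any = 1 - (1 - p) ** n
print(f"{p_any:.4f}")  # 0.0952, i.e. about 9.5%
# Positive correlation between nearby sites shrinks the effective number
# of independent trials, so 9.5% is an upper bound: the caveat Eric
# himself raises.
```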
24. Eric (223), I have pointed out to you more than once that return periods are conditioned on factors such as location. If you want the return period for a particular event at a particular station, that is one calculation. The return period for an extreme event occurring at one or more stations within a region is a different calculation. However, rather than rolling your own statistics, it would be better to first acquaint yourself with the standard solution developed by leading statisticians, namely extreme value theory.

Extreme events generally occur due to a coincidence of factors combining constructively. When those same factors partially cancel out (which is what happens most of the time), they generate the central parts of the distribution. The factors are there for both the extremes and the day-to-day; the difference is in how they combine. Now, if you can give an example where the physical mechanism of an extreme is different from the day-to-day, then give an example. Just asserting that they are different does not establish that they are. As it happens, modelling rainfall extremes in this way does actually work quite well (see e.g. this paper).
25. Dikran Marsupial @ 213:

"A warmer world means that the atmosphere can hold more water vapour, and thus there will be a strengthening of the hydrological cycle. A consequence of that is likely to be an increase in extreme weather in some places. Thus an increase in weather extremes is what you would expect to see if AGW is occurring, but it isn't proof. If you think that is not good science, then it is your understanding that is at fault."

What you describe is the first stage of science. It is a hypothesis (educated guess). Increasing the water vapour in the atmosphere should increase normal rainfall amounts, and you should see an upward rise in these measurements. I do not know how this would lead to a necessary increase in extreme weather in some places. I guess it comes down to how "extreme weather" is defined. There is a nice ongoing debate between you, Tom Curtis and Eric (skeptic) about what an extreme weather or climate event is. As of my reading so far, the issue does not yet seem resolved.

Here is an analogy to consider, from the Wikipedia article on standard deviation:

"A slightly more complicated real life example, the average height for adult men in the United States is about 70", with a standard deviation of around 3". This means that most men (about 68%, assuming a normal distribution) have a height within 3" of the mean (67"–73") — one standard deviation — and almost all men (about 95%) have a height within 6" of the mean (64"–76") — two standard deviations. If the standard deviation were zero, then all men would be exactly 70" tall. If the standard deviation were 20", then men would have much more variable heights, with a typical range of about 50"–90". Three standard deviations account for 99.7% of the sample population being studied, assuming the distribution is normal (bell-shaped)."

The first point is: what is a good definition of extreme height for an American male? Two standard deviations above the mean is 6 feet 4 inches (sorry for the English units; about 1.93 meters). This is not an extreme height in my opinion. Tall, yes, but not extreme. So extreme is in the "eye of the beholder". I would think 6' 8" (2.03 meters) is an extreme height (like the Moscow heatwave, which "lay just above 4 standard deviations above the mean for July temperatures", from Tom Curtis at 214). That is the groundwork of this analogy.

Now suppose the population wants to get taller, so as a group they all begin to take a growth hormone which will raise the average height by 4% (the amount water vapor has increased in our atmosphere due to warming). 4% of 70 inches is 2.8 inches. Our entire population is 4% taller; some are more or less affected, but the average increase is 4%. Will this 2.8" increase in average height mean we now have a noticeable increase in 7-foot-tall men from the growth hormone? It may well do. I would need some proof of it. No doubt if you keep increasing the moisture content you will reach a point where extreme events will be more likely (extreme based upon previous levels). I am not convinced, at this time, that 4% is enough to push us to the new normal. The statistical experts who post on this thread (Tom Curtis or Berényi Péter) may be able to demonstrate whether a 4% increase would make it noticeably more likely to affect the far ends of the normal curve. I am not sure. There would be calculations for this.
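The calculation Norman asks for is straightforward under his own normality assumption. A minimal sketch (Python with scipy); note that real rainfall distributions are fatter-tailed than the normal, so if anything this understates the effect:

```python
# Shift the mean of a normal distribution up by 4% (Norman's height
# analogy: mean 70", SD 3", shift 2.8") and compare tail probabilities.
from scipy.stats import norm

mean, sd = 70.0, 3.0
shift = 0.04 * mean  # 2.8 inches

for threshold in (76.0, 80.0, 84.0):  # 2, ~3.3 and ~4.7 SD above old mean
    p_old = norm.sf(threshold, loc=mean, scale=sd)
    p_new = norm.sf(threshold, loc=mean + shift, scale=sd)
    print(f"P(height > {threshold}\"): {p_old:.2e} -> {p_new:.2e} "
          f"({p_new / p_old:.0f}x)")
```

The multiplier grows with the threshold (roughly 6x at 76", 19x at 80", over 60x at 84"): a small shift in the mean produces its largest relative changes in the far tail.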
26. Norman, that a warmer atmosphere can contain more water vapour is not a hypothesis, or an educated guess; it is an observable fact. That there is more evaporation in warmer conditions is a similarly observable fact. Having more water in the atmosphere gives greater scope for heavy rainfall. You accept that it will increase normal rainfall; can you suggest a physical reason why it would increase normal rainfall but not extreme rainfall? Later on you write "No doubt if you keep increasing the moisture content you will reach a point where extreme events will be more likely (extreme based upon previous levels)", which suggests that you also accept that warmer conditions will lead to an increase in extreme events! I don't understand why you think there is some sort of threshold; it seems to me much more plausible that the probability of an extreme event (or equivalently its return period) is a continuous function of humidity (and a bunch of other variables).
27. Dikran Marsupial @ 213:

"It isn't proof, it is corroborative evidence. In science you can't prove by observation, only disprove, and constant calls for proof of AGW (which is known to be impossible, even if the theory is correct) are a form of denial."

Maybe you can't prove by observation, but you can certainly develop linking mechanisms to explain what causes observed behavior, and that gives you the ultimate scientific goal: predictability. When chemists came up with the atomic theory, it became a very valuable tool for predicting future interactions among elements and molecules. Just stating that the earth is warmer and there is more water in the air does not justify the conclusion that extreme weather-related events become noticeably more probable. Extreme weather events are well outside the normal and are caused by certain events that take place in the atmosphere. Some are known and some are not. This is from an abstract I quoted in post 192:

"This suggests that natural variability of the climate system could be the cause of the recent increase, although anthropogenic forcing due to increasing greenhouse gas concentrations cannot be discounted as another cause. It is likely that anthropogenic forcing will eventually cause global increases in extreme precipitation, primarily because of probable increases in atmospheric water vapor content and destabilization of the atmosphere. However, the location, timing, and magnitude of local and regional changes remain unknown because of uncertainties about future changes in the frequency/intensity of meteorological systems that cause extreme precipitation."

That is what is needed to become a science. You need mechanisms that explain past weather extremes (floods, hurricanes, droughts, tornadoes, heat waves, cold waves, snow). For instance, what were the mechanisms that caused floods in Australia's past? Certain atmospheric and ocean conditions are set up that will lead to flooding. Once you have a good mechanism developed, it will lead to future predictability, and you will easily be able to see whether global warming is changing the return time of a given weather extreme by how it affects the overall mechanism you found that produces flooding in Australia. Without offering a mechanism for how floods occur, you definitely don't have a science; you have an opinion. Science is always going to the next level.
28. Dikran Marsupial @ 226, sorry for the misunderstanding. You state that "a warmer atmosphere can contain more water vapour is not a hypothesis, or an educated guess, it is an observable fact." Yes it is, and that is not what I am calling a hypothesis. The hypothesis is that the current amount of moisture increase and temperature increase will lead to more extreme weather events (at least to a noticeable degree). That is why I introduced the tall people analogy. I am not very good at statistics, but I can understand the basics. I may be very wrong, but the way to demonstrate your idea would be to make a precipitation graph with 4% more precipitation than another graph (normal curve or fat tail, whatever you want). You go out to the level of standard deviations you consider extreme and calculate how many more events occur with the 4% increase, to see if it is noticeable enough at this time to attribute the recent floods to global warming.
29. Tom Curtis @ 219: Wow! You are definitely a very intelligent person. I read through your post but will have to reread it a few times and think on it before composing a comment on what you have developed. I will try to make it satisfactory for your intellect. Thanks for taking the time to give such detailed and thoughtful responses. I know it took me a while to find the data to compose my post on Munich Re.
30. Norman @227, yes, that is the point: it was you who was mistakenly talking about proof. As for linking mechanisms, these are well known to hydrologists. Knowing the mechanisms, however, does not mean that you can accurately predict something. You can write down the physics of a double pendulum on a side of A4, but you can't predict its exact course. Does that mean we can only have opinions about double pendulums, but not science? Of course not. You appear to have a rather unusual definition of science.
31. Norman, take a GCM, which embodies our knowledge of climate physics. Apply CO2 radiative forcing, and observe an increase in rainfall frequency and intensity. Does it then become science? The obvious Google search gave this example. I suspect there are more.
32. Norman @225, please follow the logical steps:

1) Increased temperature implies increased specific humidity;
2) Increased specific humidity implies more water condenses as a result of cooling due to updrafts or frontal systems;
3) More water condensing implies more latent heat released;
4) More latent heat released implies a stronger updraft is generated, which:
4a) Results in greater cooling, with more water released; and
4b) Results in more air being drawn into the updraft, carrying more water with it.

Please note that there is no contradiction between 4 and 4a. Although more heat is released, the updraft cools the air rapidly as it rises. The consequence is that where updrafts form, the moister the air, the stronger the resulting thunderstorms: stronger winds, as more air is drawn into the updraft; more intense rainfall, as more water is precipitated out in a single column; more hail, as precipitated water is carried to a greater height by the strong updraft before falling; and larger drops (and hence hail), as condensation is more rapid, so fewer seeds are used and more drops coalesce by collision. That is an observable pattern in weather systems, quite independently of any considerations of global warming. It also makes the prediction of more, and more intense, thunderstorms a straightforward consequence of global warming. There are complications relating to changes in weather patterns making some areas drier. There are also complications relating to the spawning of the most damaging convective storms, ie, tornadoes and hurricanes/typhoons/cyclones. The increased strength of convection is why the increase in rainfall for the most intense 5% or 1% of rainfall events is expected to be much larger than 4% for a 4% increase in specific humidity.

Turning to your height analogy, which I like, but which should be used cautiously: if the average height increases but variability remains unchanged, then the occurrence of extremely tall people, by the old definition of extremely tall, will increase. Indeed, this effect will occur for any fixed definition of "extremely tall". Consequently it does not matter whether you use a definition of 2 SD above the original mean height, or 4 SD: both are expected to increase. The only significant difference is that the 2 SD definition gives you a larger sample of "extreme tallness" with which to test for any increase.

That is a crucial point. The effect of insisting on the more extreme definition is only to reduce your sample size, and hence make the data noisier. As it is, weather bureaus have definitions of "extreme weather" based on a simple metric: is the weather such that people should be warned about a risk to life or property? If it is, it's extreme. If it isn't, it isn't. That is a low bar in terms of return periods, but it is very practical for people receiving forecasts. Somebody receiving an "extreme weather warning" does not care how many standard deviations from the mean it is, or what the return interval is. They care about whether they should get their car under cover, or whether they are at risk from flash flooding. Well, global warming predicts that "at risk" episodes due to weather will increase. That is the hypothesis. For some types of events (precipitation, thunderstorms) it is statistically convenient to look at changes in return intervals to test that theory. For others (tornadoes and hurricanes), you just count the tornadoes and hurricanes and their relative intensity.

Given that global warming predicts an increase in "at risk" episodes, defining "extreme" to mean "at risk of major catastrophe" episodes only reduces your statistical sample. It is a ploy to hide from the evidence. Nothing more.

And on a side note, I am not an expert in statistics, and I suspect neither is Berényi Péter (although he is certainly much more competent than I am in that area). In contrast, Dikran is a genuine expert on statistics. To the extent that I am expert in anything (which I do not claim for anything), it would be logic and the use of words to convey meaning (philosophy of language).
33. Tom Curtis @ 232, I was reading up on the formation of precipitation events and what determines their severity. The thunderstorm you describe is known as an air-mass thunderstorm. Generally these are the least severe, and the rain will cool the updraft and destroy the cycle. Even these need the one common factor in thunderstorms: a layer of unstable air (one where, if you move a parcel of ground-level air to a higher level, it will end up warmer than the surrounding air and so continue to rise).

I looked at some weather links to determine what causes the most severe thunderstorms. Moisture in the air is the definite fuel for storms, but there are many factors that go into determining the intensity of a thunderstorm. I do not know if the 4% increase in the moisture content of the air will make much difference (I will continue to research this). Here are some links: Atmospheric stability and storm formation. Factors that determine the intensity of a thunderstorm.

Stability of the atmosphere is one of the major factors in determining the severity of a thunderstorm: the less stable, the more likely an intense storm will develop. Variation in the upper winds is also really important, or the storm will choke itself off; the upper wind moves the region of downdraft away from the updraft, allowing the storm to continue and intensify. Another factor is how fast the overall storm is moving. A strong storm that moves rapidly will not be as likely to produce flooding as a similar strong storm that moves much more slowly over an area.

Will global warming create more regions of unstable air? Will these regions of unstable air be more unstable because of global warming? Will wind shear become stronger as the earth warms? If the answer to these questions is yes, then I would agree with the hypothesis that global warming will increase the number of severe storms, with more rain, hail, wind and tornadoes, all with the potential to increase damage to target areas. In the US the term used is severe weather. An extreme weather event would be one significantly worse than normal severe storm events.
34. Tom Curtis @ 219, I did as you suggested. I sent a query to Munich Re about what criteria they use to determine a disaster in their charts. I will have to wait and see if they answer it.
35. Dikran Marsupial @ 231, here is a quote from a very long report on climate models. The report is very long and detailed, so I have not had time to read through all of it. It seems climate models are fairly good at reproducing some climate effects based upon actual events, and they continue to improve and evolve as more knowledge is gained and added.

"Modeling of extreme events poses special challenges since they are, by definition, rare. Although the intensity and frequency of extreme events are modulated by ocean and land surface state and by trends in the mean climate state, internal atmospheric variability plays a very large role, and the most extreme events arise from chance confluence of unlikely conditions. The very rarity of extreme events makes statistical evaluation of model performance less robust than for mean climate. For example, in evaluating a model's ability to simulate heat waves as intense as that in 1995, only a few episodes in the entire 20th Century approach or exceed that intensity (Kunkel et al. 1996). For such rare events, estimates of the real risk are highly uncertain, varying from once every 30 years to once every 100 years or more. Thus, a model that simulates these occurrences at a frequency of once every 30 years may be performing adequately, but its performance cannot be distinguished from that of the model that simulates a frequency of once every 100 years."

Quote from this report: Climate models report.
36. Norman @235, in post 231 I asked you a specific question, one that you ought to be capable of answering directly. Sadly, all too often discussions of climate science with skeptics end up with the skeptic unwilling to state his position clearly and unambiguously. I have learned that that is usually the indicator that no further progress will be made and that it is a waste of time to continue. That modelling extreme events involves uncertainties does not mean it isn't science.
37. Dikran Marsupial @ 236, I will attempt to answer: "Norman, take a GCM, which embodies our knowledge of climate physics. Apply CO2 radiative forcing, and observe an increase in rainfall frequency and intensity. Does it then become science?"

Here is the Wikipedia definition of science, which is similar to other definitions: "Science (from Latin: scientia, meaning "knowledge") is a systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions about the world."

The testable explanations would be the mechanisms I have been asking for. The mechanisms that drive extreme thunderstorms are well established and predictable. Meteorologists can look at initial conditions and predict with a high degree of confidence the severity of thunderstorms that will develop. Now for your question. It would be science if the models have known mechanisms explaining why the precipitation will increase (such as increased evaporation caused by an earth warming due to CO2 forcing) and if they then demonstrate predictability. If the model forecasts a given increase in the frequency and intensity of rainfall over an area during a specified period of time, and the actual rainfall comes close to the model's prediction over a given number of areas (one correct prediction would not be enough to qualify the model as science), then I would say the model is now science. It has achieved the goal of predictability that is necessary for it to be considered a science.

In the example you provided, it goes back to what one considers extreme. In your example, 40 mm a day is considered extreme: "Extreme rainfall is chosen as precipitation greater than 40 mm/day chosen on the basis of prior research indicating that rainfall at this intensity has the potential of causing soil erosion and flooding." I don't think any farmer in my area would consider a 1.57" rainfall in a day an extreme amount. I would doubt such a rain would lead to flooding unless the soil was highly saturated and it all fell in a really short time.
38. Norman: "It would be a science if the models have known mechanisms for explaining why the precipitation will increase ... and then if they had predictability."

Come on. This is like saying that you can't rely on health experts telling you that when cigarette consumption increases in an area you can expect to see more cases of lung cancer, as well as other cancers, because the mechanism by which smoke (of all kinds) initiates cancer is poorly understood. Nor can health experts predict with any confidence which particular smokers will contract cancer as against heart disease or stroke or emphysema. Epidemiologists can tell you the expected rate for these diseases given a certain rate of smoking. Doctors can identify which people have developed these diseases and they can treat them, but they can't tell in advance which person will, or will not, develop any smoking-related illness. Does that mean it's not science?
39. adelady @ 238, you have pointed out one of the requirements for a study to be considered scientific: predictability. They do know some chemicals in cigarette smoke are capable of damaging DNA (which could lead to cancer), so there is a link, even if a tenuous one that can't be specifically determined. If a climate model predicts events well sometimes and not so well at other times, then it fails in the predictability department, and I would not consider it science based upon the accepted definition of the concept.
40. Norman, all predictions are couched in terms of their inherent uncertainty. You could predict the location of, say, Saturn's moon Hyperion in 10 years' time, but you don't have a chance in hell of predicting its attitude. Does that make gravity unscientific?
41. And by the same token, we can predict Earth's or Mars's location and attitude 10 years out to a high degree of precision. Does that make gravity an inconsistent theory?
42. scaddenp @ 240 and 241, I am not sure I understand your line of reasoning. The definition of science is given above. Gravity has a linking mechanism: all matter attracts, and it does so according to the working equation F = G(m1*m2)/r^2. Certain systems of gravity cannot be predicted. They are outside the realm of science. Even accumulating more information on the system will not make them more predictable. Example of chaotic gravity. After 10 years with Hyperion you may not be able to build a model to predict its motion, but gravity is still scientific. If you have measurements of its mass, Saturn's mass, and its distance from Saturn, you can come up with a precise measure of the gravitational force acting on this moon. If a climate model is to be considered scientific, then it must pass the test of predictability. If the model is incapable of making valid, tested predictions, why would you consider it scientific?
43. Like gravity, the physics of climate is well understood, but as with Hyperion, all that understanding will not make some things predictable. "If the model is incapable of making valid tested predictions why would you consider it scientific?" Well, first and foremost, climate models make numerous testable predictions. Some predictions are more robust than others (depending on the level to which the physics is captured by the computer model, as well as the influence of chaos), and some come with greater certainty than others. The strawman is to demand a prediction from the model that it cannot deliver and ignore the predictions that it can make. Note how good Broecker's prediction, made in 1975, of the temperature in 2010 turned out to be. I will also grant that the actual accuracy was considerably better than the model was really capable of delivering. I would reject current climate theory if the robust predictions of the models did not match observations within the uncertainties of the models, but I see no evidence for that.
44. scaddenp, my series of posts is an attempt to determine the validity of this question by Jeff Masters: "Has human-caused climate change destabilized the climate, bringing these extreme, unprecedented weather events?" I am investigating whether these are extreme, unprecedented weather events by looking at long time series of weather-related phenomena in various regions of the globe. Would a destabilized climate look much different from past climates? When I look at long-term climate patterns I see cycles (longer than 30 years). Maybe my vision is poor, but I still have not seen variations that seem to be "outside the envelope". They may become a reality; that is a different question with its own set of complications. The question here is whether the 2010-2011 weather extremes are an example of climate disruption caused by global warming.
45. I would say that you are making the hypothesis that there are unforced cycles in the weather pattern and that these alone are enough to explain the weather. An alternative hypothesis, which doesn't require unexplained cycles, is to use existing physics and postulate that these events are a result of global warming. Now, I would also agree that while weather patterns are consistent with the global warming hypothesis, the predictions about extreme weather are not sufficiently robust (model cell size is too large), nor is the observation period long enough, to make strong statements about variations compared to natural variability. However, as a guide for insurance companies and governments with a lot at stake, I would act in a precautionary way rather than depend on the hope that this is a cycle which as yet has no physical basis.
  46. Norman @233: Trapp et al, 2009:
    "We investigate the transient response of severe thunderstorm forcing to the time-varying greenhouse gas concentrations associated with the A1B emissions scenario. Using a five-member ensemble of global climate model experiments, we find a positive trend in such forcing within the United States, over the period 1950 – 2099. The rate of increase varies by geographic region, depending on (i) lowlevel water vapor availability and transport, and (ii) the frequency of synoptic-scale cyclones during the warm season. Our results indicate that deceleration of the greenhouse gas emissions trajectory would likely result in slower increases in severe thunderstorm forcing. Citation: Trapp, R. J., N. S. Diffenbaugh, and A. Gluhovsky (2009), Transient response of severe thunderstorm forcing to elevated greenhouse gas concentrations,"
    From Trapp et al, 2007:
    "Fig. 1. Difference (A2 − RF) in mean CAPE, vertical wind shear over the surface to 6 km layer (S06), mean surface specific humidity (qs ), and severe thunderstorm environment days (NDSEV) for March–April–May (MAM) (a–d) and June–July–August (JJA) (e–h), respectively. The RF integration period is 1962–1989, and the A2 integration period is 2072–2099.">/blockquote>
47. Norman wrote: "If a climate model is to be considered scientific then it must pass the test of predictability. If the model is incapable of making valid tested predictions why would you consider it scientific?" As I pointed out earlier, you can write down the physics of a double pendulum on a side of A4 paper. You can use that description to make a computational model of a double pendulum (often set as a student project; it isn't difficult). Can that software model make useful predictions of the trajectory of a double pendulum? No. Does that mean a mathematical model of a double pendulum is non-scientific? I look forward to a direct answer to this question.
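A minimal sketch of that student project (Python with numpy/scipy, using a standard textbook form of the equations of motion); it makes the point concrete: a millionth of a radian difference in the starting angle destroys any long-range trajectory prediction, even though the physics is exact:

```python
# Double pendulum: integrate two trajectories whose initial upper-arm
# angles differ by 1e-6 rad and watch them diverge. Unit masses/lengths.
import numpy as np
from scipy.integrate import solve_ivp

g, m1, m2, L1, L2 = 9.81, 1.0, 1.0, 1.0, 1.0

def deriv(t, y):
    th1, w1, th2, w2 = y
    d = th1 - th2
    den = 2 * m1 + m2 - m2 * np.cos(2 * d)
    a1 = (-g * (2 * m1 + m2) * np.sin(th1)
          - m2 * g * np.sin(th1 - 2 * th2)
          - 2 * np.sin(d) * m2 * (w2**2 * L2 + w1**2 * L1 * np.cos(d))
          ) / (L1 * den)
    a2 = (2 * np.sin(d) * (w1**2 * L1 * (m1 + m2)
          + g * (m1 + m2) * np.cos(th1)
          + w2**2 * L2 * m2 * np.cos(d))) / (L2 * den)
    return [w1, a1, w2, a2]

t = np.linspace(0.0, 20.0, 2001)

def run(y0):
    sol = solve_ivp(deriv, (0.0, 20.0), y0, t_eval=t, rtol=1e-10, atol=1e-10)
    return sol.y[0]  # upper-arm angle over time

a = run([np.pi / 2, 0.0, np.pi / 2, 0.0])
b = run([np.pi / 2 + 1e-6, 0.0, np.pi / 2, 0.0])

for i, label in ((500, "t=5s"), (2000, "t=20s")):
    print(f"{label}: angle difference = {abs(a[i] - b[i]):.2e} rad")
```

Yet the statistics of the motion (its energy, the region of phase space it visits) remain perfectly predictable, which is the weather-versus-climate distinction being drawn here.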
48. Dikran Marsupial, you ask some really good questions that are thought provoking. It would seem the mathematical model of the double pendulum is scientific, since the equations can offer testable explanations about the nature of the pendulum and offer predictability about its overall behavior. That is, if you used the equations to run a long-term simulation, the simulation would trace out the same area as an actual pendulum would if it were videotaped. A software model of the pendulum would be worthless as a predictive science, as it would not give any useful prediction of the pendulum's exact trajectory. The science of meteorology does not try to make predictions about the weather beyond a few days, because meteorologists know such activity is nonscientific and useless; the prediction means nothing. It is a very good question, and I don't know if I answered it to your liking. However, in the world of climate models, climate is an aggregate of weather patterns for a region. One could not predict any one actual thunderstorm in the region months in advance, but a good climate model should be able to predict whether a region will have more or less moisture over a given period of time. If it can't do this and make a valid prediction, the model is not good for much, and a new one should be developed.
49. Norman @248, as I understand it, you are currently arguing that the predictions based on climate science that extreme weather will increase are not science, because they are not predictable, despite the fact that we are now almost 250 posts into a thread showing very clearly that predictions of increased extreme weather have been made, and that those predictions are being confirmed by measurable results. Given that context, the claim you are making is simply not plausible. What is more, the epistemology of science on which it is based is naive and demonstrably false. If you find a more appropriate thread, I can discuss that with you. In the meantime, perhaps we could reserve this thread for talking about the weather (ie, stay on topic).
  50. Norman at 18:33 PM on 5 July, 2011
    "One could not predict anyone actual thunderstorm in the region months in advance, but a good climate model should be able to predict is a region will have more our less moisture over a given period of time."
Fair enough. And by your criterion climate models have done exactly that, predicting pretty well that global warming will result in an enhancement of precipitation extremes, and defining the locations of those regions of the Earth that have had lesser and greater rainfall, respectively, as the world has warmed. See for example:

[*] X. Zhang et al. (2007) Detection of human influence on twentieth-century precipitation trends, Nature 448, 461-465 (abstract)

[**] R. P. Allan and B. J. Soden (2008) Atmospheric warming and the amplification of precipitation extremes, Science 321, 1481-1484 (abstract)
