Canada's ClimateData Web Portal: Normal Science, Not Fake

In August, the Canadian government launched a new website called ClimateData.ca which "provides engineers, public health professionals, urban planners, mayors, and anyone else doing long-term planning with user-friendly climate change information, data, resources and tools" (press release). The website uses past weather data from 1950-2011 along with computer modelling to project the possible future impacts of climate change in Canada. Françoys Labonté, the Executive Director of the Computer Research Institute of Montréal (CRIM), one of the developers of the new web portal, said the website's purpose is "to provide Canadian leaders and managers with easy access to useful, actual and future climate data and information to assist them in anticipating and managing climate change-related issues over the medium and long term."

Figure 1. The home page of Canada's new website: ClimateData.ca.

It didn't take long for climate science denialists to notice the lack of weather data before 1950 and assume that Canadian government scientists must be up to something nefarious. Blacklock's Reporter was the first to claim, "Canada omitted 100 years’ worth of weather data from a federal website intended to illustrate climate change." Their story is behind a rather expensive paywall (even for us AGW alarmists who are "rolling in grant money"). But the story was picked up by Toronto Sun journalist Lorrie Goldstein, who claimed that Environment and Climate Change Canada (ECCC) "omitted a century’s worth of observed weather data in developing its computer models on the impacts of climate change." James Delingpole blared from Breitbart that the ECCC "has erased a century’s worth of observed temperature data, claiming its modelled computer projections are more accurate." Delingpole goes further in his conclusion:

Others less committed to green activism might find it somewhat sinister that the international agencies charged with maintaining the world’s temperature records are destroying them because the factual evidence doesn’t support the global warming scare narrative.

These wild claims about omitted, scrapped, erased, or destroyed historical weather data are completely false, and they are a total misrepresentation of the types of computer modelling used for ClimateData.

Explaining ANUSPLIN

One reason weather data before 1950 was not used is that the data is too sparse—there were fewer weather stations pre-1950, especially in northern Canada. The purpose of ClimateData is to provide detailed climate information for all of Canada, including small towns and remote forested or mountainous areas far from any weather stations. The only way to get such information is to interpolate between known weather records and use computer models to "fill in the gaps."

To do this they used a software package called ANUSPLIN (Australian National University Spline). If Delingpole, Goldstein, et al. had read the "About" page of the ClimateData website, they would have learned about ANUSPLIN and that "Quality-controlled, but unadjusted, station data from the National Climate Data Archive of Environment and Climate Change Canada data (Hutchinson et al., 2009) were interpolated onto the high-resolution grid using thin plate splines."
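To get a feel for what thin plate spline interpolation does, here is a minimal sketch in Python using SciPy's general-purpose RBFInterpolator. The station coordinates and temperatures below are invented for illustration, and a real ANUSPLIN run also accounts for covariates such as elevation, so treat this as the basic idea rather than the actual ECCC workflow.

```python
# Toy illustration of thin plate spline interpolation, the technique named
# on the ClimateData "About" page. All station data here are made up.
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical stations: (longitude, latitude) and an observed temperature (°C)
stations = np.array([
    [-123.1, 49.3],
    [-114.1, 51.0],
    [-106.7, 52.1],
    [ -97.1, 49.9],
    [ -75.7, 45.4],
])
temps = np.array([10.5, 4.1, 2.8, 3.0, 6.3])

# Fit a thin plate spline surface through the station values
tps = RBFInterpolator(stations, temps, kernel="thin_plate_spline")

# Estimate temperatures on a regular grid covering the same region,
# including cells far from any station ("filling in the gaps")
lon, lat = np.meshgrid(np.linspace(-125, -70, 56), np.linspace(45, 55, 11))
grid_points = np.column_stack([lon.ravel(), lat.ravel()])
gridded = tps(grid_points).reshape(lon.shape)

print(gridded.shape)  # an (11, 56) grid of interpolated temperatures
```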

I contacted a media relations spokesperson at ECCC, Gabrielle Lamontagne, who said, "Environment and Climate Change Canada has not removed or deleted any weather data in developing models on the impacts of climate change.  The complete record of weather station observations is publicly available on the Meteorological Service of Canada’s Historical Data page." You can still access pre-1950 weather records.

So, no, data was not erased and replaced with fake data that shows more warming.

The use of ANUSPLIN is another reason pre-1950 data was not used. Lamontagne explains: "the data was calibrated using observational dataset (ANUSPLIN), which was only available from 1950. At that time, the weather station density across Canada, particularly in the northern regions, became dense enough for reliable gridded datasets to be created."

A paper about ANUSPLIN (McKenney et al. (2011) Customized Spatial Climate Models for North America) gives more information about this type of modelling:

In forestry and many other sectors, there is often a need for estimates well away from meteorological stations, which tend to be clustered near agricultural and urban areas. This need is met by “spatial” climate models, which can provide estimates of climate at both specific locations of interest and in the form of regular grids. Projected climate change is another motivating factor in the development of these products. Spatial models of projected future climate allow these changes to be mapped, regional impacts to be assessed, and adaptation measures to be developed.

Those last few sentences perfectly describe the role of ClimateData and why modelling using ANUSPLIN was fit for that role.

McKenney et al. further state: "All spatial climate modeling begins with, and ultimately depends on, data from meteorological stations."

So, again, real-world weather records were used to generate more accurate models of Canada's climate for the second half of the 20th century.

Details from Downscaling

ANUSPLIN is a form of "statistical downscaling" commonly used in regional and local computer modelling. Most computer models used in climate studies (General Circulation Models, GCMs) are global in scale, and their resolution (grid sizes of 100-150 km) is too coarse to give the finer details needed to show impacts at country, province, or city levels (grids of ~10 km). To show these finer details, statistical downscaling is used. Demystifying Climate Models defines statistical downscaling as a:

method by which a statistical relationship is established from observations between large-scale variables, like atmospheric surface pressure, and a local variable, like the wind speed at a particular site. The relationship is then subsequently applied to GCM output to obtain the local variables from the GCM output.
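As a concrete, deliberately simplified illustration of that two-step idea, here is a sketch in Python. It assumes three made-up yearly series: coarse model output for a grid cell over the historical period, matching local observations, and coarse model output for a future scenario. The methods actually used for ClimateData are far more sophisticated, but the basic structure of fitting a relationship from observations and then applying it to model output is the same.

```python
# Minimal sketch of regression-based statistical downscaling.
# All three series below are invented placeholders, one value per year.
import numpy as np

rng = np.random.default_rng(0)
gcm_hist = np.linspace(8.0, 9.0, 50) + rng.normal(0, 0.3, 50)       # coarse GCM cell, historical
obs_local = 1.2 * gcm_hist - 2.0 + rng.normal(0, 0.2, 50)           # local station, same years
gcm_future = np.linspace(9.0, 12.0, 100) + rng.normal(0, 0.3, 100)  # coarse GCM cell, future scenario

# Step 1: establish a statistical relationship between the large-scale
# variable and the local variable over the observed period.
slope, intercept = np.polyfit(gcm_hist, obs_local, 1)

# Step 2: apply that relationship to GCM output to estimate the
# local variable under the future scenario.
local_future = slope * gcm_future + intercept

print(f"fitted relation: local ≈ {slope:.2f} × large-scale + {intercept:.2f}")
print(f"projected local mean, last decade of scenario: {local_future[-10:].mean():.1f} °C")
```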

Downscaling climate models using Canada's weather observations provided ClimateData with models of Canada's climate over the second half of the 20th century. Models were then run into the future (to 2100) to illustrate what Canada's climate may be like under three different emissions scenarios. There is nothing sinister or deceptive about this. The only way to get information about what climate change may do to Canada during this century is to run computer models. And the best way to do that is to base those models on actual weather observations.

Cherry Picking Old Records and Confusing Weather for Climate

Blacklock's Reporter, as well as the others who parroted their article, also complained that some Canadian weather records prior to 1950 were warmer than current temperatures. Here's an example from Goldstein's article: "Vancouver had a higher record temperature in 1910 (30.6C) than in 2017 (29.5C)." What is so special about those two years? Why not mention 1908 (33C), which had an even higher temperature than 1910? Or 2009 which had the highest recorded temperature for Vancouver at 34C? (Source.)

Does global warming mean that Vancouver must post new record high temperatures every single year? Of course not. These critics are confused about the difference between weather and climate. Comparing a single temperature record from some random year with that from another random year doesn't tell you much about climate or climate change; it merely compares one weather statistic with another weather statistic.
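A quick synthetic example makes the point. The sketch below builds a made-up series of annual hottest-day temperatures from a small warming trend plus random year-to-year noise; an individual early year can easily come out warmer than some later year even though the long-term trend is upward.

```python
# Synthetic illustration: weather noise vs. climate trend.
# The series is invented: a +0.2 °C/decade trend plus random variability.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1950, 2020)
annual_max = 30.0 + 0.02 * (years - 1950) + rng.normal(0, 1.5, years.size)

# Cherry-picking two years: an early year can beat a later one just by chance.
print(annual_max[years == 1958][0], annual_max[years == 2017][0])

# The climate signal is the trend over the whole record, not any single pair of years.
slope, _ = np.polyfit(years, annual_max, 1)
print(f"trend: {slope * 10:.2f} °C per decade")
```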

Figure 2. Vancouver's hottest days: second half of the 20th century, now, and in three possible futures with global warming. From ClimateData.ca.

Gabrielle Lamontagne, the spokesperson for ECCC, summed up these distinctions in the context of ECCC's role in communicating to the public about weather and climate:

Regarding the use of observation data mentioned [in] the articles, according to Canada’s Changing Climate Report, between 1948 and 2016, Canada's climate has warmed by 1.7 °C for Canada as a whole, and 2.3 °C for northern Canada, with the strongest warming in northwestern Canada. This warming trend does not mean that the average temperature is higher in each successive year. As climate varies naturally, there have been and will be years that are warmer or cooler than previous years over the country, as well as different magnitudes in warming among regions.  As a result, extremely high or low temperatures have occurred throughout the period of observational records, even though there is an overall warming trend across all regions.  When Environment Climate Change Canada reports on past changes in hottest days or storms, ECCC’s experts do not use model data, only observation data. When Environment and Climate Change Canada wants to know if changes in hottest day temperatures have anything to do with global warming, ECCC’s experts compare the observed data with model simulations of the expected climate response to global warming.

Two Options

Are Blacklock's Reporter et al. honestly confused about the difference between weather and climate, and the complexities of climate computer modelling, or are they just trying to use that complexity to confuse their readers? Whatever the case, whether it's misinformation or disinformation, their readers come away from their articles with a skewed understanding of climate science. Instead of an awareness of how scientists use weather data to inform their climate models, these readers come away with a false view that scientists are throwing out old data and replacing it with "fake" data.

Climate science can be a confusing topic. It draws from just about every major field of science, so no one can ever grasp every detail about everything involved. Of course this is true for the general public, but it's also true for scientists. Full disclosure: before writing this post I had never heard of ANUSPLIN, nor did I know about computer model "downscaling" and how modellers can interpolate weather data into climate models. Fascinating!

When faced with something new we have two options: 1) we can trust that what we are reading is an honest description of the novelty, or 2) we can dig deeper and try to learn more about the new topic. The readers of Blacklock's Reporter et al. are given a specific view of Canada's new web portal which falsely implicates climate scientists in a fraudulent scheme. Readers may accept this view as reality; or, if they are truly curious about climate science and honestly want to learn about the effects of global warming in Canada (both now and in the future), they would do well to take a closer look at the ClimateData website and see where that leads them.

Posted by David Kirtley on Thursday, 17 October, 2019

