Can we be inoculated against climate misinformation? Yes – if we prebunk rather than debunk
Posted on 16 February 2024 by Guest Author
This article is republished from The Conversation under a Creative Commons license. Read the original article written by Christian Turney, University of Technology Sydney and Sander van der Linden, University of Cambridge and first published on February 14, 2024.
Last year, the world experienced the hottest day ever recorded, as we endured the first year where temperatures were 1.5°C warmer than the pre-industrial era. The link between extreme events and climate change is clearer than ever. But that doesn’t mean climate misinformation has stopped. Far from it.
Misleading or incorrect information on climate still spreads like wildfire, even during the angry northern summer of 2023. Politicians falsely claimed the heatwaves were “normal” for summer. Conspiracy theorists claimed the devastating fires in Hawaii were ignited by government lasers.
People producing misinformation have shifted tactics, too, often moving from the old denial (claiming climate change isn’t happening) to the new denial (questioning climate solutions). Spreading doubt and scepticism has hamstrung our response to the enormous threat of climate change. And with sophisticated generative AI making it easy to generate plausible lies, it could become an even bigger issue.
The problem is that debunking misinformation is often not sufficient, and repeating a false claim in order to debunk it risks lending it credibility. Indeed, a catchy lie can stay in people’s heads long after sober facts are forgotten.
But there’s a new option: the prebunking method. Rather than waiting for misinformation to spread, you lay out clear, accurate information in advance, along with a description of common manipulation techniques. Prebunking often has a better chance of success, according to recent research from co-author Sander van der Linden.
How does prebunking work?
Misinformation spreads much like a virus, and the way to protect ourselves and everyone else is similar: vaccination. Psychological inoculation via prebunking acts like a vaccine, reducing the probability of infection. (We focus here on misinformation, which is shared unintentionally, rather than disinformation, which people spread knowing it to be false.)
If you’re forewarned about dodgy claims and questionable techniques, you’re more likely to be sceptical when you come across a YouTube video claiming electric cars are dirtier than those with internal combustion engines, or a Facebook page suggesting offshore wind turbines will kill whales.
Inoculation is not just a metaphor. By exposing us to a weakened form of the types of misinformation we might see in the future and giving us ways to identify it, we reduce the chance false information takes root in our psyches.
Scientists have tested these methods with some success. In one study exploring ways of countering anti-vaccination misinformation, researchers created simple videos warning people that manipulators might try to influence their thinking about vaccination with anecdotes or scary images rather than evidence.
They also gave people relevant facts about how low the actual injury rate from vaccines is (around two injuries per million). The result: compared to a control group, people with the psychological inoculation were more likely to recognise misleading rhetoric, less likely to share this type of content with others, and more likely to want to get vaccinated.
Similar studies have been conducted on climate misinformation. In one, a group was forewarned that politically motivated actors would try to make it seem as if there were substantial disagreement about the causes of climate change by appealing to fake experts and bogus petitions, when in fact 97% or more of climate scientists have concluded humans are causing climate change. This inoculation proved effective.
The success of these early studies has spurred social media companies such as Meta to adopt the technique. You can now find prebunking efforts on Meta sites such as Facebook and Instagram intended to protect people against common misinformation techniques, such as cherry-picking isolated data.
Prebunking in practice
A hotter world will experience increasing climate extremes and more fire. Even though many of the fires we have seen in recent years in Australia, Hawaii, Canada and now Chile are the worst on record, climate misinformation actors routinely try to minimise their severity.
As an example, let’s prebunk claims likely to circulate after the next big fire.
1. The claim: “Climate change is a hoax – wildfires have always been a part of nature.”
How to prebunk it: ahead of fire seasons, scientists can demonstrate claims like this rely on the “false equivalence” logical fallacy. Misinformation falsely equates the recent rise in extreme weather events with natural events of the past. A devastating fire 100 years ago does not disprove the trend towards more fires and larger fires.
2. Claim: “Bushfires are caused by arsonists.”
How to prebunk it: media professionals have an important responsibility here to fact-check information before publishing or broadcasting. Media can explain the most common causes of bushfires, from lightning (about 50%) to accidental ignition to arson. Media claims that arsonists were the main cause of the unprecedented 2019-2020 Black Summer fires in Australia were seized on by climate deniers worldwide, even though arson was far from the main cause.
3. Claim: “The government is using bushfires as an excuse to bring in climate regulations.”
How to prebunk it: explain this recycled conspiracy theory is likely to circulate. Point out how it was used to claim COVID-19 lockdowns were a government ploy to soften people up for climate lockdowns (which never happened). Show how government agencies can and do communicate openly about why climate regulations are necessary and how they are intended to stave off the worst damage.
False information on bushfires can spread like a bushfire. Toa55/Shutterstock
Misinformation isn’t going away
Social media and the open internet have made it possible to broadcast information to millions of people, regardless of whether it’s true. It’s no wonder this is a golden age for misinformation. Misinformation actors have found effective ways to cast doubt on established science and then sell a false alternative.
We have to respond. Doing nothing means the lies win. And getting on the front foot with prebunking is one of the best tools we have.
As the world gets hotter, prebunking offers a way to anticipate new variants of lies and misinformation and counter them – before they take root.
It is hard to argue against pre-bunking being more effective than de-bunking. It is like establishing the relevant facts of a debatable matter before a debate, so that every viewer of the debate is fully informed of those facts in advance.
But de-bunking is still required, because pre-bunking is not a cure-all. Establishing the facts does not mean that everyone will accept them. Post-truth misleading and believing is nothing new; it is just being more shamelessly abused by unjustifiably influential people - people who do not deserve their developed perceptions of status.
Now when I encounter a person publicly sharing misleading information I try to get them to establish the relevant facts for their claim. I then try to focus on improving awareness and understanding of the relevant facts (with SkS being an excellent resource). That puts the person sharing the misunderstanding on the spot to justify their claim. And it can potentially eliminate the need to make counter-claims to the misleading claim.
Of course, like pre-bunking, focusing on the weak or absent justification for a misunderstanding does not significantly change the minds of people who are powerfully invested in misunderstanding climate science. The studies of pre-bunking's effectiveness confirm this: they find measurable reductions in susceptibility to misinformation rather than its substantial elimination.
OPOF @1
I think that even more important than providing specific facts ahead of potential misinformation is to clue people in on the FLICC techniques they are most likely to see employed. The examples in the article do just that by pointing out "false equivalence", "conspiracy theories" and "cherry picking". The more people are aware of these techniques, the fewer will, hopefully, fall for and share the misinformation.
BaerbelW @2,
I appreciate that the article's prebunking is an inoculation effort focused on anticipated misleading marketing, centred on FLICC, and I understand the importance of that action. I can see inoculation prebunking as part of government public education, especially leadership statements that increase awareness and improve understanding of what is harmful and helpful (clearly compromised when those who benefit from being more harmful and less helpful to others can significantly influence leadership actions by being popular enough to matter).
I do not anticipate encountering a personal situation where I will be able to directly justify engaging a person or group with prebunking (I am not a public/government leadership competitor, an educator, or a magazine story writer). I anticipate encountering people expressing what I consider to be a misunderstanding.
I have experienced many situations where a misunderstanding I try to correct is stubbornly maintained, based on what appears to be a wilful lack of awareness or a lack of interest in learning (I encountered this as an engineer). Asking the person making the claim to provide the facts justifying it can help. It makes it clearer, earlier, when a person is resisting improving their understanding of the matter.
Understanding what motivates people to be susceptible to misunderstanding would be an important part of inoculation prebunking efforts. (The business motivation to profit from quicker and cheaper actions, which are more likely to be harmful, was a very common engineering challenge, because it was essential that safety not be compromised by those other interests.)
My way of potentially getting to the point of prebunking is to start by asking someone presenting what I believe is a misunderstanding to justify the claim they make, especially with independently verifiable facts and related understandings (I may be the one who lacks awareness or has misunderstood the matter). If the person's claim is not well justified, then, in addition to introducing them to a more comprehensive set of facts and a better-justified understanding, I may have the opportunity to explore why they were tempted to misunderstand the matter, getting into FLICC but also going beyond it, into the motivation for their willingness to misunderstand.