


The Debunking Handbook Part 5: Filling the gap with an alternative explanation

Posted on 25 November 2011 by John Cook, Stephan Lewandowsky

The Debunking Handbook is an upcoming guide to debunking myths, by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, there is unfortunately no summary of the literature that offers practical guidelines on the most effective ways of reducing its influence. This Handbook boils the research down into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation. The Handbook will be available as a free, downloadable PDF at the end of this 6-part blog series.

This post has been cross-posted at Shaping Tomorrow's World

Assuming you successfully negotiate the various backfire effects, what is the most effective way to debunk a myth? The challenge is that once misinformation gets into a person’s mind, it’s very difficult to remove. This is the case even when people remember and accept a correction.

This was demonstrated in an experiment in which people read a fictitious account of a warehouse fire.1,2,3 The account mentioned paint and gas cans, along with explosions. Later in the story, it was clarified that no paint or cans had been present at the fire. Even when people remembered and accepted this correction, they still cited the paint or cans when asked questions about the fire. When asked, “Why do you think there was so much smoke?”, people routinely invoked the oil paint despite having just acknowledged that it was not present.

When people hear misinformation, they build a mental model, with the myth providing an explanation. When the myth is debunked, a gap is left in their mental model. Faced with this gap, people prefer an incorrect model over an incomplete one. In the absence of a better explanation, they opt for the wrong explanation.4

In the warehouse fire experiment, when an alternative explanation involving lighter fluid and accelerant was provided, people were less likely to cite the paint and gas cans when queried about the fire. The most effective way to reduce the effect of misinformation is to provide an alternative explanation for the events covered by the misinformation. 

This strategy is illustrated particularly clearly in fictional murder trials. Accusing an alternative suspect greatly reduced the number of guilty verdicts from participants who acted as jurors, compared to defences that merely explained why the defendant wasn’t guilty.5

For the alternative to be accepted, it must be plausible and explain all observed features of the event.6,1 When you debunk a myth, you create a gap in the person’s mind. To be effective, your debunking must fill that gap.

One gap that may require filling is explaining why the myth is wrong. This can be achieved by exposing the rhetorical techniques used to misinform. A handy reference of techniques common to many movements that deny a scientific consensus is found in Denialism: what is it and how should scientists respond?7 The techniques include cherry picking, conspiracy theories and fake experts.

Another alternative narrative might be to explain why the misinformer promoted the myth. Arousing suspicion of the source of misinformation has been shown to further reduce the influence of misinformation.8,9

Another key element to effective rebuttal is using an explicit warning (“watch out, you might be misled”) before mentioning the myth. Experimentation with different rebuttal structures found the most effective combination included an alternative explanation and an explicit warning.4 

Graphics are also an important part of the debunker’s toolbox and are significantly more effective than text in reducing misconceptions. When people read a refutation that conflicts with their beliefs, they seize on ambiguities to construct an alternative interpretation. Graphics provide more clarity and less opportunity for misinterpretation. When self-identified Republicans were surveyed about their global warming beliefs, a significantly greater number accepted global warming when shown a graph of temperature trends compared to those who were given a written description.10 

Another survey found that when shown data points representing surface temperature, people correctly judged a warming trend irrespective of their views towards global warming.11 If your content can be expressed visually, always opt for a graphic in your debunking.

References

  1. Seifert, C. M. (2002). The continued influence of misinformation in memory: What makes a correction effective? The Psychology of Learning and Motivation, 41, 265-292.
  2. Wilkes, A. L.; Leatherbarrow, M. (1988). Editing episodic memory following the identification of error, The Quarterly Journal of Experimental Psychology A: Human Experimental Psychology, 40A, 361-387. 
  3. Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued influence effect: When discredited information in memory affects later inferences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 20 (6), 1420-1436.
  4. Ecker, U. K., Lewandowsky, S., & Tang, D. T. (2011). Explicit warnings reduce but do not eliminate the continued influence of misinformation. Memory & Cognition, 38, 1087-1100.
  5. Tenney, E. R., Cleary, H. M., & Spellman, B. A. (2009). Unpacking the doubt in “Beyond a reasonable doubt:” Plausible alternative stories increase not guilty verdicts. Basic and Applied Social Psychology, 31, 1-8.
  6. Rapp, D. N., & Kendeou, P. (2007). Revising what readers know: Updating text representations during narrative comprehension. Memory & Cognition, 35, 2019-2032.
  7. Diethelm, P., & McKee, M. (2009). Denialism: what is it and how should scientists respond? European Journal of Public Health, 19, 2-4.
  8. Lewandowsky, S., Stritzke, W. G., Oberauer, K., & Morales, M. (2005). Memory for fact, fiction and misinformation: The Iraq War 2003. Psychological Science, 16, 190-195.
  9. Lewandowsky, S., Stritzke, W. G. K., Oberauer, K., & Morales, M. (2009). Misinformation and the ‘War on Terror’: When memory turns fiction into fact. In W. G. K. Stritzke, S. Lewandowsky, D. Denemark, J. Clare, & F. Morgan (Eds.), Terrorism and torture: An interdisciplinary perspective (pp. 179-203). Cambridge, UK: Cambridge University Press.
  10. Nyhan, B., & Reifler, J. (2011). Opening the Political Mind? The effects of self-affirmation and graphical information on factual misperceptions. In press.
  11. Lewandowsky, S. (2011). Popular consensus: Climate change set to continue. Psychological Science, 22, 460-463.



Comments

Comments 1 to 8:

  1. I'm finding this handbook very useful indeed. Nice work!
  2. I, too, find this article very interesting. The old adage "a knife cuts two ways" comes to mind.
  3. pirate, This series of articles requires facts to debunk myths. It doesn't work the other way round; myths don't debunk facts. Example: when shown data points representing surface temperature, people correctly judged a warming trend irrespective of their views towards global warming. You'd have a hard time establishing 'it's cooling' with those facts.
  4. Apirate - on first glance your comment seems to indicate you think this article could be used to debunk the truth with myths. That raises two questions - the first, already asked by muoncounter: how can you use myths to debunk facts? The second: why would anyone, let alone a science teacher, WANT to debunk the truth about science? Please explain your post more clearly.
  5. This series is very good. It makes explicit things I have sensed in discussions on various global warming threads. This site is very useful for the caulk that fills in those gaps. Often I can read a paragraph containing a "mini" gish gallop, refute it, and fill in the gaps with a quick confirmation of my response back here. Many thanks to all involved...
  6. Here is a view I hold of the brain. Think of the brain as a vast web (yes, neurons). To properly replace something, the thinker has to (a) access the precise part of the vast web and (b) integrate the replacement. The speaker is not the thinker. This is why it's best to let the thinker talk out his (her) problem. It must come from within. Hopefully the speaker can provide useful stimulation to help the thinker find the problem area(s), but ultimately it is the thinker that must discover and fix. There may be many interrelated problem areas (with problems/myths of various sorts). So a fix in the wrong spot will tend to have little value. Also, a fix in a few spots may not solve the full problem and further thinking may end up upending the new fixes if they were few in number and not deeply intertwined with what the thinker holds strongly and fundamentally (eg, related to things easy to see for yourself). Lesson to be learned: Let the thinker guide the exploration. Try to find conflicts in their analysis (ie, weaken the foothold the myth may have). Provide many versions of the truth (eg, at different levels and under different contexts), as this adds interdependence and increases the odds it will hit the right spot for that thinker.
  7. ...specifically in response to this article: "Debunking" may not debunk all the things that need debunking. "Debunking" might be weak. Using simple sentences and the like makes it easy to add deep interdependence to the new facts. If the thinker doesn't understand, the "fact" will be whisked away and not grow to dominate areas where myths are entrenched. If you don't understand, you can't use that fact. You can't recall that fact easily. That fact will disappear and never have real power in the first place. A person with an elaborate theory will need many of those parts undone. The myths have a strong hold and facts will not be strong enough to uproot every related and reinforcing myth. You never just fight one myth but a set of related myths. Part of the battle (besides creating depth and a solid foundation for facts) is to weaken the rival mythology.
  8. I added a bit more here http://www.skepticalscience.com/Debunking-Handbook-Part-2-Familiarity-Backfire-Effect.html#68823 . Eg, "When they think of the myth, you want them to then think of the failure of the myth or the right answer... Working off contradictions is great because many theories can be internally consistent... Contradictions are a great way to eliminate the wrong paths efficiently.. what remains will be the path taken."




© Copyright 2024 John Cook