The Debunking Handbook Part 4: The Worldview Backfire Effect
Posted on 23 November 2011 by John Cook, Stephan Lewandowsky
Update Oct. 14, 2020:
The Debunking Handbook is now available in an extensively updated version written by Stephan Lewandowsky, John Cook, Ulrich Ecker and 19 co-authors. Read about this new edition in this blog post: The Debunking Handbook 2020: Downloads and Translations
Excerpt 3 from this new edition of The Debunking Handbook explains the latest research about the elusive backfire effects.
The Debunking Handbook is an upcoming, freely available guide to debunking myths by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, there is unfortunately no summary of the literature that offers practical guidelines on the most effective ways of reducing its influence. This Handbook boils down the research into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation. The Handbook will be available as a free, downloadable PDF at the end of this 6-part blog series.
This post has been cross-posted at Shaping Tomorrow's World.
The third and arguably most potent backfire effect occurs with topics that tie in with people’s worldviews and sense of cultural identity. Several cognitive processes can cause people to unconsciously process information in a biased way. For those who are strongly fixed in their views, being confronted with counter-arguments can cause their views to be strengthened.
One cognitive process that contributes to this effect is Confirmation Bias, where people selectively seek out information that bolsters their view. In one experiment, people were offered information on hot-button issues like gun control or affirmative action. Each parcel of information was labelled by its source, clearly indicating whether the information would be pro or con (e.g., the National Rifle Association vs. Citizens Against Handguns). Although instructed to be even-handed, people opted for sources that matched their pre-existing views. The study found that even when people are presented with a balanced set of facts, they reinforce their pre-existing views by gravitating towards information they already agree with. The polarisation was greatest among those with strongly held views.1
What happens when you remove that element of choice and present someone with arguments that run counter to their worldview? In this case, the cognitive process that comes to the fore is Disconfirmation Bias, the flipside of Confirmation Bias. This is where people spend significantly more time and thought actively arguing against opposing arguments.2
This was demonstrated when Republicans who believed Saddam Hussein was linked to the 9/11 terrorist attacks were provided with evidence that there was no link between the two, including a direct quote from President George Bush.3 Only 2% of participants changed their mind (although interestingly, 14% denied that they had believed the link in the first place). The vast majority clung to the link between Iraq and 9/11, employing a range of arguments to brush aside the evidence. The most common response was attitude bolstering: bringing supporting facts to mind while ignoring any contrary facts. The process of bringing supporting facts to the fore ended up strengthening people’s erroneous belief.
If facts cannot dissuade a person from their pre-existing beliefs, and can sometimes make things worse, how can we possibly reduce the effect of misinformation? There are two sources of hope.
First, the Worldview Backfire Effect is strongest among those already fixed in their views. You therefore stand a greater chance of correcting misinformation among those who are not firmly decided about hot-button issues. This suggests that outreach should be directed towards the undecided majority rather than the unswayable minority.
Second, messages can be presented in ways that reduce the usual psychological resistance. For example, when worldview-threatening messages are coupled with so-called self-affirmation, people become more balanced in considering pro and con information.4,5
Self-affirmation can be achieved by asking people to write a few sentences about a time when they felt good about themselves because they acted on a value that was important to them. People then become more receptive to messages that might otherwise threaten their worldviews, compared to people who receive no self-affirmation. Interestingly, the “self-affirmation effect” is strongest among those whose ideology is central to their sense of self-worth.
Another way in which information can be made more acceptable is by “framing” it in a way that is less threatening to a person’s worldview. For example, Republicans are far more likely to accept an otherwise identical charge as a “carbon offset” than as a “tax”, whereas the wording has little effect on Democrats or Independents—because their values are not challenged by the word “tax”.6
Self-affirmation and framing aren’t about manipulating people. They give the facts a fighting chance.
References
- Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50, 755-769.
- Nyhan, B., & Reifler, J. (2010). When Corrections Fail: The Persistence of Political Misperceptions. Political Behavior, 32, 303-330.
- Prasad, M., Perrin, A. J., Bezila, K., Hoffman, S. G., Kindleberger, K., Manturuk, K., et al. (2009). “There Must Be a Reason”: Osama, Saddam, and Inferred Justification. Sociological Inquiry, 79, 142-162.
- Cohen, G. L., Sherman, D. K., Bastardi, A., Hsu, L., & McGoey, M. (2007). Bridging the Partisan Divide: Self-Affirmation Reduces Ideological Closed-Mindedness and Inflexibility in Negotiation. Journal of Personality and Social Psychology, 93, 415-430.
- Nyhan, B., & Reifler, J. (2011). Opening the Political Mind? The effects of self-affirmation and graphical information on factual misperceptions. In press.
- Hardisty, D. J., Johnson, E. J., & Weber, E. U. (2010). A Dirty Word or a Dirty World? Attribute Framing, Political Affiliation, and Query Theory. Psychological Science, 21, 86-92.
We are dealing with worldview bias rather than debating within a scientific framework; for the most part we are talking to global warming deniers, who themselves deny they are deniers. Frankly, I would be surprised if any single global warming denier ever changed their mind.
Since they have been denying that they are deniers, what do you think of the idea of using that against them? When you ask "is global warming real?", they have sometimes shifted from answering "no" to answering "yes, but".
Am I right in thinking that when someone agrees with you in an argument, you should rub it in? I know that when they say "yes, but", an explanation is sure to follow: "yes, but it isn't caused by man, it might be beneficial, it's the same thing as the medieval period", and so on. But for all that talk, as frustrating as it is, when they say "yes, but", haven't they essentially said "you are right, but ..."? If so, that is something I could rub in.
For example, a politician from a party which, like the rest of the world, accepts that global warming is real could ask a politician from another party which bitterly opposes any recognition of that reality, over and over: "I understand that you have admitted global warming is real, and I really admire that." The response would be "yes, but", followed by a long explanation.
The tactic of saying "I admit global warming is real" and then launching into a long explanation that tries to have everything both ways, merely sowing doubt and stalling, seems clever and is infuriating. But maybe it can become another liability for them.