Shankar Vedantam of the Washington Post wrote a fascinating article last week about psychological research showing that attempts to correct myths can actually end up strengthening them:
The conventional response to myths and urban legends is to counter bad information with accurate information. But the new psychological studies show that denials and clarifications, for all their intuitive appeal, can paradoxically contribute to the resiliency of popular myths.
…The research also highlights the disturbing reality that once an idea has been implanted in people’s minds, it can be difficult to dislodge. Denials inherently require repeating the bad information, which may be one reason they can paradoxically reinforce it.
Indeed, repetition seems to be a key culprit. Things that are repeated often become more accessible in memory, and one of the brain’s subconscious rules of thumb is that easily recalled things are true.
So what are the implications for fact-checking? My take is that efforts should be targeted at elites to try to stop the spread of myths and misinformation before they become pervasive (as we did at Spinsanity). Once a myth spreads widely, this research suggests it will be extremely difficult to correct.
However, the studies Vedantam discusses typically did not consider myths that people have strong ideological reasons to believe. My co-author Jason Reifler sent him our draft paper (PDF) on correcting political misperceptions, and he was kind enough to bring it up during an interview on NPR's On the Media (MP3):
BOB GARFIELD (host): The studies you’re talking about suggest that these effects take place irrespective of the bias of the listener, but there’s another study that suggests that if you are in fact predisposed to have a certain worldview, that misinformation sticks still more. Can you describe it?
VEDANTAM: There’s a new study that’s just been completed by Jason Reifler at Georgia State University where he actually looks at questions such as why it is that large numbers of people continue to believe that weapons of mass destruction were present in Iraq before the invasion or even found in Iraq after the invasion. And what Jason and his colleagues did was try and give people the correct information. And what he found, ironically, is that partisans who wanted to believe that weapons of mass destruction had been found in Iraq, when told about the correct information, ended up believing ever more fervently that they were right and that the correct information was wrong.
GARFIELD: And this would explain, for example, why throughout the Arab and Muslim worlds more than half the population seems to believe the attacks on the World Trade Center and the Pentagon were the work of [the] US government or Israel.
VEDANTAM: I think that’s right. What’s especially disturbing is that the number of people who believe that is actually growing over time. In the study I mentioned, 59% of Turks and Egyptians, 65% of Indonesians, 53% of Jordanians, even 56% of British Muslims do not believe that Arabs were behind the 9/11 attacks. Presenting them with the correct information, which by the way is our government’s strategy of combating myths and disinformation, does not seem to be a very effective approach.
For a classic example of how people can reject information they don’t want to believe, consider Robert Draper’s Dead Certain, which reports that, as late as 2006, President Bush still believed Iraq had possessed WMD before the war (via TPM):
Though it was not the sort of thing one could say publicly anymore, the president still believed that Saddam had possessed weapons of mass destruction. He repeated this conviction to Andy Card all the way up until Card’s departure in April 2006, almost exactly three years after the Coalition had begun its fruitless search for WMDs.