A new review of the research explains why "mud" sticks — especially when it's flung in the midst of a heated political contest
At the height of campaign season in any presidential election year, voters will be inundated with all kinds of information of dubious accuracy, from misleading claims about candidates’ personal lives to exaggerations about their policy differences. Unfortunately, it’s precisely this type of misinformation — the kind that hews to people’s preexisting political, religious or social ideology — that sticks.
As a new review of past research concludes, “mud” sticks — and, worse, attempts to correct erroneous beliefs can backfire, reinforcing the very misrepresentations they aim to erase. The main problem, the research reveals, is that rejecting information takes more mental effort than accepting it, and given our social environment, most people’s default position is to believe what others tell them. “For better or worse,” the authors write, “the acceptance of information as true is favored by tacit norms of everyday conversational conduct.” Indeed, some research even suggests that in order for the brain to process incoming information at all, it must initially assume that the information is correct.
And yet, we all know that lying and deceit are commonplace. Research shows, for example, that 10% of communication between spouses includes at least some falsehood and that more than one-third of conversations between college students include lies. That means that people aren’t entirely unaware that they’re being spun — but they hold on to their false beliefs anyway.
In multiple studies included in the new review, researchers presented people with a fictitious news report about a warehouse fire that was initially thought to have been caused by negligent storage of gas cylinders and oil paints. The participants were then given an explicit retraction of the claim about the fire's cause, but even after reading the correction, only about half of them recognized that the initial account was wrong, suggesting that the original false belief can persist roughly half the time despite a retraction.
Why? One reason may be that the correction itself repeats the inaccurate information — and repetition is known to strengthen memory. Indeed, a separate study involving a patient handout containing myths and facts about flu vaccination found that immediately after reading the brochure, patients were able to correctly distinguish the myths from the facts. Thirty minutes later, however, people who read the brochure were more likely to identify the myths as facts than were participants who never saw the pamphlet. The repetition of the myths seemed to have reinforced them, rather than replacing them with more accurate data.
The problem becomes even more extreme when political beliefs are involved. People tend to believe information that supports their existing worldview and to reject data that threatens it. For example, one study found that Republicans who believed before the Iraq war that Iraq had weapons of mass destruction became even more committed to that belief after being presented with evidence that no such weapons were found. The source of the information mattered, however: if the conflicting evidence came from a fellow Republican, the effect disappeared.
Interestingly, however, when it comes to misinformation from scientific research, people don’t seem to care about the source: one paper found that people were equally likely to believe a report denying that human activity causes climate change whether they were told it was funded by Exxon or “donations from people like you.”
So, what can be done to fight false beliefs, if corrections don’t work and if providing data on funding sources doesn’t even provoke skepticism?
Presenting the new information as part of a coherent story seems to help, by filling the gap in explanation that arises from simply negating a statement. In the warehouse fire example, for instance, telling people that a truckers’ strike prevented the delivery of the paints and gas cylinders — thereby rendering it impossible for those substances to have started the fire — helps readers revise their original, incorrect view. The added information about the strike not only provides an explanation for why the initial story was wrong, but also offers a memorable event to prompt recall of what really happened.
In cases involving political misinformation, providing new data that is congruent with someone’s preexisting beliefs also helps: one study found that Republicans who were disinclined to believe the scientific consensus on climate change were more likely to accept the idea if it was presented in the context of a growth opportunity in the nuclear energy industry.
Similarly, giving correct information while making people feel good about themselves through self-affirmation helps them cope with new information that would otherwise threaten their identity: in one study, such ego-stroking — having people recall a time they felt good about themselves for acting in accordance with their values — reduced the influence of political ideology on acceptance of a report that was critical of U.S. foreign policy.
It also helps to simply spend more time debunking myths in detail: unlike brief debunking, this approach doesn't backfire. A study found that a psychology course aimed at correcting misconceptions about the field was more successful when it directly refuted myths in depth than when it simply presented accurate information without addressing common untruths.
Finally, presenting correct information coherently and in the simplest accurate form, strengthening the message through repetition and, if possible, warning people that misinformation is about to be presented can all help keep it from sticking, though it may be hard to maintain constant vigilance over one's data diet.
The authors conclude: “Widespread awareness of the fact that people may ‘throw mud’ because they know it will ‘stick’ is an important aspect of developing a healthy sense of public skepticism that will contribute to a well-informed populace.”
The review was published in Psychological Science in the Public Interest and led by Stephan Lewandowsky of the University of Western Australia.