The Backfire Effect:

Wired, The New York Times, Backyard Poultry Magazine—they all do it. Sometimes they screw up and get the facts wrong. Via ink or in photons, a reputable news source takes the time to say “my bad.”

If you are in the news business and want to maintain your reputation for accuracy, you publish corrections. For most topics this works just fine, but what most news organizations don’t realize is a correction can further push readers away from the facts if the issue at hand is close to the heart. In fact, those pithy blurbs hidden on a deep page in every newspaper point to one of the most powerful forces shaping the way you think, feel, and decide—a behavior keeping you from accepting the truth.

In 2006, researchers Brendan Nyhan and Jason Reifler created fake newspaper articles about polarizing political issues. The articles were written in a way that would confirm a widespread misconception about certain ideas in American politics. As soon as a person read a fake article, experimenters then handed over a true article that corrected the first. For instance, one article suggested that the United States had found weapons of mass destruction in Iraq. The next article corrected the first and said that the United States had never found them, which was the truth. Those opposed to the war or who had strong liberal leanings tended to disagree with the original article and accept the second. Those who supported the war and leaned more toward the conservative camp tended to agree with the first article and strongly disagree with the second. These reactions shouldn’t surprise you. What should give you pause, though, is how conservatives felt about the correction. After reading that there were no WMDs, they reported being even more certain than before that there actually were WMDs and that their original beliefs were correct.

The researchers repeated the experiment with other wedge issues, such as stem cell research and tax reform, and once again they found that corrections tended to increase the strength of the participants’ misconceptions if those corrections contradicted their ideologies. People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired. …

Once something is added to your collection of beliefs, you protect it from harm. … Just as confirmation bias shields you when you actively seek information, the backfire effect shields you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. … Over time, the backfire effect makes you less skeptical of those things that allow you to continue seeing your beliefs as true and proper.

—David McRaney, You Are Now Less Dumb (Avery / Penguin Random House, 2013)

Some thoughts about this book:

  • Kind of depressing, isn’t it? This means I will never again be able to have a conversation with that relative of mine who is in thrall to the fake-news websites. Or, as the author says, “What should be evident from the studies on the backfire effect is you can never win an argument online. When you start to pull out facts and figures, hyperlinks and quotes, you are actually making the opponent feel even surer of his position than before you started the debate.” (Emphasis mine.)
  • People who have a personal philosophy of learning, inquiry, and skepticism are less susceptible to this phenomenon, I think. I’m not saying I don’t have confirmation bias, but I do actually seek out and read a wide range of information when I am researching. I have a hair-trigger skepticism meter.
  • Sadly, the author goes on to say the backfire effect has always been a thing, “but the Internet unchained its potential, elevated its expression … As social media and advertising progress, confirmation bias and the backfire effect will be more and more difficult to overcome.” Oh boy.
  • So what can we do about it? That answer is longer than I can get into here, and in fact the folks who study this aren’t completely sure themselves. This article (“How to Convince Someone When Facts Fail”) from Scientific American offers ideas you already know instinctively: be nice, don’t get emotional, listen carefully … and “try to show how changing facts does not necessarily mean changing worldviews.” (Good luck with that.) David McRaney has some ideas, too, which you can hear in this podcast. Go ahead and check out all his stuff while you’re there.

#MyReadingYear #WhatImReadingNow

Tweet: The Backfire Effect: Convincing folks with facts doesn’t always work.
Tweet: #WhatImReadingNow? You Are Now Less Dumb, by David McRaney.

Disclosure of Material Connection: I have not received any compensation for writing this post. I have no material connection to the brands, products, or services that I have mentioned. I am disclosing this in accordance with the Federal Trade Commission’s 16 CFR, Part 255: “Guides Concerning the Use of Endorsements and Testimonials in Advertising.”