Effective Debunking
How we do that, though, is trickier. Simply pointing out that something is false or misleading is not only ineffective, it can backfire and make people believe the misinformation more strongly.[1] What is more effective is to provide information that creates a new, more convincing "story." Mention the misinformation you're correcting as briefly as possible, and only after your counter-narrative[2]: if you have to refute the misinformation directly, focus on providing reasons why it can't be true[3] – and it's better to give a smaller number of points that tell a coherent story than a longer "laundry list" of facts.[4] It's also useful to acknowledge and encourage skepticism about the subject, including pointing out reasons why people might want to misinform you about it. Even when all of those steps are taken, though, researchers warn that "the ultimate persistence of the misinformation depends on how it is initially perceived, and detailed debunking may not always function as expected."[5]
This can be particularly true when discussing heavily polarized or politicized topics, where a stated opinion is often less a reasoned position than a declaration of loyalty to a group. If possible, try to approach these issues in a way that either avoids an ideological frame or draws on the other person's values and viewpoint: for example, conservatives have been found to be more accepting of a charge on carbon emissions when it's presented as a "carbon offset," avoiding the politically charged word "tax,"[6] while Americans had very different attitudes towards the Affordable Care Act depending on whether the questioner used that name or the term "Obamacare."[7] If you're citing experts, try to find ones whom the other person will see as authorities: this shows them that you respect their worldview, and gives them an "excuse" to consider your argument.[8] Finally, it's good to frame things in a positive way – rather than pointing out that raw milk is dangerous, for instance, one might more effectively point out that pasteurization made it possible for city-dwelling children to drink milk safely – and to make a point of recognizing that the emotions underlying the other person's views are valid: acknowledging, for example, that most anti-vaccination advocates are motivated by love of their children.[9] Opening with positive statements about the person you're talking to (e.g. "I can tell you're a very caring and committed person") can make them more open to considering the issue objectively.[10]
How do you know if someone is worth arguing with?
Everything above, of course, assumes that the person you're engaging with is arguing in good faith. Unfortunately, social media (especially public-by-default networks such as Twitter) makes it easy for people to join arguments in which they either have no genuine stake (for the fun of it, or what internet trolls call "for the lulz"[11]) or are only aiming to annoy and use up the energy of people on the other side of the issue. Here are a few tips for deciding whether someone is worth arguing with:
Are they a bot? Many social network accounts are actually “bots,” automated programs designed to do repetitive tasks. Some of these do valuable work, like sharing traffic information, but others are created to amplify misinformation or encourage conflict. In general, a high number of posts per day (especially identical posts), posting almost exclusively other people’s content, and a low overlap between the accounts it follows and those that follow it are all signs that an account might be a bot.[12] (You can use the Botometer tool to see if a Twitter account is likely to be a bot.)
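To make those signals concrete, here is a minimal sketch of how they might be combined into a single score. This is purely illustrative: the thresholds and weights below are hypothetical, and this is not how Botometer or any real detector actually works (real systems use many more features and machine learning).

```python
# Illustrative only: a naive bot-likelihood heuristic built from the three
# signals named above. All thresholds and weights are hypothetical.

def bot_likelihood(posts_per_day, reposted_fraction, followers, following):
    """Return a rough 0-1 score; higher means more bot-like.

    posts_per_day     -- average number of posts per day
    reposted_fraction -- fraction (0-1) of posts that are other people's content
    followers         -- set of account IDs that follow this account
    following         -- set of account IDs this account follows
    """
    score = 0.0

    # Signal 1: unusually high posting volume.
    if posts_per_day > 50:
        score += 0.4

    # Signal 2: posting almost exclusively other people's content.
    if reposted_fraction > 0.9:
        score += 0.3

    # Signal 3: low overlap between followers and accounts followed.
    union = followers | following
    if union and len(followers & following) / len(union) < 0.05:
        score += 0.3

    return score

# Example: a high-volume account that mostly reposts and has no mutual
# follows scores 1.0 on this (again, purely illustrative) scale.
print(bot_likelihood(120, 0.95, {"a", "b", "c"}, {"x", "y", "z"}))
```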
Do they have a bad track record? This is the modern-day version of "the boy who cried wolf": whether it's someone you've tangled with before or simply a well-known troll, some people have lost the right to be taken seriously.
Do they seem mostly focused on wasting your time? This tactic, sometimes called "sealioning," is often used by trolls to derail conversations or simply exhaust people they're arguing with.[13] If someone is repeatedly asking you to define or explain basic ideas, it may be a sign that they're just trying to keep you arguing.
How likely are they to listen to you? You have a greater responsibility to correct misinformation when it comes from friends and relatives, because within the networks you share, staying silent can look like an endorsement of their views. Even weaker ties, though, can affect whether a message is listened to: an experiment that used bots (automated accounts) to respond to racist comments on Twitter found that appeals to empathy (e.g. "Hey man, just remember that there are real people who are hurt when you harass them with that kind of language") were more effective when the bot account had a White profile picture.[14]
Are they only being negative? A good sign that someone is unwilling to consider the other side of an issue is if they only argue against it, without putting forward any positive arguments of their own. As noted above, this tactic was pioneered by the tobacco industry, which aimed to raise a smokescreen of doubt around the health effects of its product. Unfortunately, media coverage of the issue embraced the narrative of a "controversy": "the industry correctly perceived that no journalist would ever get fired for giving the two sides equal weight, even when that balance wasn't warranted by the facts."[15] A more recent example is anti-vaccine activists, who have increasingly moved from claiming that individual vaccines are unsafe to questioning the effectiveness of vaccines overall – without offering an alternative explanation for the disappearance of diseases such as polio.[16]
Is there an undecided audience? Sometimes it's worth arguing not to convince the other person but to persuade others who are watching – or simply to show that what they're saying won't go unchallenged. In media this is sometimes called the "zone of legitimate controversy": the often narrow space between ideas that are universally accepted and ones that are seen as unworthy of discussion. (A related concept is the "Overton window," the range of positions a politician can endorse and still be electable.) This zone isn't stable, though: it can move significantly and sometimes quickly, and both activists and interest groups devote considerable money and effort to shifting it on topics ranging from health care to gun control.
This idea is even more important now that mass media have less influence and social networks let us associate mostly with like-minded people, which allows different groups and spaces to perceive very different zones of legitimate controversy.[17] It's important to remember that users of social networks are largely responsible for setting the tone and values of those spaces – and research suggests that a committed minority of as little as ten percent can do so.[18]
Remember, too, that people who are strongly committed on an issue rarely change their minds, so if your debunking is going to be effective it has to focus on those who haven't yet taken a position.[19] And though it may be hard to change the minds of individuals, remember that society as a whole has changed its views, through patient (and sometimes impatient) effort, on issues as fundamental as whether LGBTQ people can marry, whether women should have full rights, and whether slavery is wrong – often by framing the issue so that the new view seemed more in tune with people's basic values.[20]
Finally, remember that it's rare for people to openly change their minds – especially on highly charged issues – when they're under pressure or in front of an audience: moral reflection most often happens some time after a person has been forced to see an issue in a new light.[21]
[1] Nyhan, B., & Reifler, J. (2010). When Corrections Fail: The Persistence of Political Misperceptions. Political Behavior, 32(2), 303-330. doi:10.1007/s11109-010-9112-2
[2] Chan, M. S., Jones, C. R., Jamieson, K. H., & Albarracín, D. (2017). Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation. Psychological Science, 28(11), 1531-1546. doi:10.1177/0956797617714579
[3] Chokshi, N. (2017, September 18). How to Fight ‘Fake News’ (Warning: It Isn’t Easy). The New York Times. Retrieved April 9, 2018, from https://www.nytimes.com/2017/09/18/business/media/fight-fake-news.html
[4] Cook, J., & Lewandowsky, S. (2011, November 5). The Debunking Handbook. St. Lucia, Australia: University of Queensland. ISBN 978-0-646-56812-6. http://sks.to/debunk
[5] Chan, M. S., Jones, C. R., Jamieson, K. H., & Albarracín, D. (2017). Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation. Psychological Science, 28(11), 1531-1546. doi:10.1177/0956797617714579
[6] Cook, J., & Lewandowsky, S. (2011, November 5). The Debunking Handbook. St. Lucia, Australia: University of Queensland. ISBN 978-0-646-56812-6. http://sks.to/debunk
[7] Dropp, K., & Nyhan, B. (2017, February 7). One-Third Don't Know Obamacare and Affordable Care Act Are the Same. The New York Times. Retrieved April 9, 2018, from https://www.nytimes.com/2017/02/07/upshot/one-third-dont-know-obamacare-and-affordable-care-act-are-the-same.html
[8] Mooney, C. (2011, May/June). The Science of Why We Don't Believe Science. Mother Jones.
[9] Don't give up on the fact-resistant: Tips to break the grip of misinformation. (2017, September 7). American Press Institute. Retrieved April 5, 2018, from https://www.americanpressinstitute.org/fact-checking-project/should-we-give-up-on-facts/
[10] Cook, J., & Lewandowsky, S. (2011, November 5). The Debunking Handbook. St. Lucia, Australia: University of Queensland. ISBN 978-0-646-56812-6. http://sks.to/debunk
[11] Norton, Q. (2011, November). Anonymous 101: Introduction to the Lulz. Wired. Retrieved April 9, 2018, from https://www.wired.com/2011/11/anonymous-101/
[12] Varol, O., Ferrara, E., Davis, C. A., Menczer, F., & Flammini, A. (2017). Online Human-Bot Interactions: Detection, Estimation, and Characterization. Proceedings of the Eleventh International AAAI Conference on Web and Social Media.
[13] Johnson, A. (2017). The Multiple Harms of Sea Lions. Harmful Speech Online, 13.
[14] Vince, G. (2018, April 2). Why Good People Turn Bad Online. Mosaic. Retrieved April 5, 2018, from https://mosaicscience.com/story/why-good-people-turn-bad-online-science-trolls-abuse/
[15] Hiltzik, M. (2014, March 9). Cultural production of ignorance provides rich field for study. Los Angeles Times. Retrieved April 5, 2018, from http://articles.latimes.com/2014/mar/09/business/la-fi-hiltzik-20140307
[16] Moran, M. B., Lucas, M., Everhart, K., Morgan, A., & Prickett, E. (2016). What makes anti-vaccine websites persuasive? A content analysis of techniques used by anti-vaccine websites to engender anti-vaccine sentiment. Journal of Communication in Healthcare, 9(3), 151-163. doi:10.1080/17538068.2016.1235531
[17] Zuckerman, E. (2017). Mistrust, Efficacy and the New Civics: Understanding the Deep Roots of the Crisis of Faith in Journalism (Rep.). The Knight Foundation.
[18] Xie, J., Sreenivasan, S., Korniss, G., Zhang, W., Lim, C., & Szymanski, B. K. (2011). Social consensus through the influence of committed minorities. Physical Review E, 84(1). doi:10.1103/PhysRevE.84.011130
[19] Don't give up on the fact-resistant: Tips to break the grip of misinformation. (2017, September 7). American Press Institute. Retrieved April 5, 2018, from https://www.americanpressinstitute.org/fact-checking-project/should-we-give-up-on-facts/
[20] Haidt, J. (2008). The Emotional Dog and Its Rational Tail: A Social Intuitionist Approach to Moral Judgment. In Reasoning: Studies of Human Inference and Its Foundations (pp. 1024-1052). Cambridge University Press. doi:10.1017/cbo9780511814273.055
[21] Haidt, J. (2008). The Emotional Dog and Its Rational Tail: A Social Intuitionist Approach to Moral Judgment. In Reasoning: Studies of Human Inference and Its Foundations (pp. 1024-1052). Cambridge University Press. doi:10.1017/cbo9780511814273.055