Political Disinformation
This can lead to a vicious cycle: “Political polarization can increase vulnerability to misinformation, but misinformation also has the potential to polarize citizens.”[2] Not only that, but while skepticism that’s motivated by a desire for accuracy makes people less susceptible to misinformation, skepticism that’s rooted in identity makes them more vulnerable.[3]
While there has been widespread misinformation during recent Canadian elections, it appears to have had relatively little impact. However, a number of formerly disconnected communities focused on topics such as vaccines, climate change and immigration have coalesced into a more cohesive, politically oriented group.[4]
The ability of social networks and search engines to deliver targeted advertising makes it much more likely that disinformation will reach the people most likely to be swayed by it: “when disinformation operators leverage this system for precision propaganda, the harm to the public interest, the political culture, and the integrity of democracy is substantial and distinct from any other type of advertiser.”[5] This may be even more true of video-based networks such as YouTube and TikTok, where authenticity and relatability are seen as more important markers of credibility than accuracy and objectivity.[6]
There are five main ways in which disinformation can affect politics and elections:
- Reaching an audience that’s vulnerable to the message: Because of the effect our beliefs have on our thinking, targeted advertising works even better for political messages than for traditional commercial ones. Both political parties and disinformation agents can draw on the data profiles provided by social networks, search engines and ad brokers to deliver their messages to those who are known to hold particular beliefs, as well as to “lookalike audiences” that resemble them closely enough that they are highly likely to believe the same things: “the more disinformation operators know about their target audiences, the easier it is to find, manipulate, and deceive them.”[7] Profiles based on factors ranging from attitudes towards vaccines and guns to belief in UFOs are used to determine which ads are shown to which users.[8]
Though these paid ads may reach relatively few people, their strongly ideological content can lead those who receive them to share them widely, which in turn prompts social networks’ algorithms to boost the stories in those users’ friends’ feeds. Even the most outlandish stories benefit from this process: because those who believe in one conspiracy theory are likely to believe in others, finding voters who already subscribe to at least one such theory is crucial to spreading a new one.[9]
- Energizing or suppressing voters: While political disinformation may not convert anyone to an opposing point of view, it may have a powerful impact on whether or not someone actually votes. Although ads with extreme messages were once seen as risky because they might alienate more moderate voters, targeted ads make it possible to send extreme messages just to hard-core supporters – or, perhaps more importantly, to send messages to their opponent’s likely supporters to discourage them from voting. For example, while committed Trump supporters were shown ads that called Hillary Clinton a traitor, likely Clinton voters received ads highlighting comments she’d made referring to some African-American men as “super predators” in order to make them conflicted about voting for her.[10] These “dark ads” are visible only to the ad buyer, the recipients and the social network, which makes it almost impossible for an opponent – or government bodies tasked with overseeing an election – to track or respond to them.[11]
Disinformation about the election process can have this effect, too. Researchers have identified a number of recurring tropes connected to voting, which may serve to prevent a particular group from voting, to suggest that the voting process is corrupted or even to justify electoral cheating on the grounds that the “other side” already does it.[12] These often focus on the mechanics of voting, such as suggestions that voting machines can change votes, that mail-in voting is not secure[13] or even that the pencil marks on ballots can be erased.[14]
- Setting the agenda during breaking news: News has been described as “the first rough draft of history,”[15] but increasingly it is the final draft as well. Once a false or inaccurate story has been widely spread, people are likely to believe it over a new one – even if the new story is a retraction from the same source as the original.[16] Disinformation agents have a variety of techniques for getting their version of a story spread first, many of which involve “gaming” the algorithms of search engines and social networks. Securing a top spot in search engine results, or better yet a “featured snippet,” is a particularly important goal, since many users wrongly assume that search engines are neutral arbiters of reliability[17] and (perhaps because of this) more than a quarter of clicks go to the top result of a Google search.[18] For example, search results following the 2017 mass shooting in Las Vegas were dominated by conspiracy theories.[19] Stories like these are usually quickly debunked (and sometimes retracted), but by that time they may already have spread widely through social networks.[20] As Dartmouth professor Brendan Nyhan puts it, “misinformation spreads widely before it can be downgraded in the algorithms.”[21] Following the top search results, especially when starting with search terms that are associated with extreme political views, can lead a searcher down a “rabbit hole” into a universe of conspiracy theories, as apparently happened to Dylann Roof, who murdered nine people at a church in South Carolina after searching for the term “black on white crime.”[22]
- “Infecting” legitimate news: As noted previously, legitimate news sources can be led by bias – their own or their perception of their audience’s – to give too much credence to a plausible but unconfirmed story. For example, during the 2016 U.S. election, a YouTube video suggesting that Hillary Clinton had unrevealed health problems led to #HillarysHealth becoming a trending topic on Twitter (now called X), which in turn led journalists to see it as more newsworthy (though no new information had appeared).[23] This phenomenon provides a vehicle for a small number of highly committed, reasonably savvy operators – freelance or state-sponsored – to have a hugely disproportionate effect on the public conversation, especially during times such as elections when news outlets are constantly looking for fresh and engaging content. Partisan news sites have become common on both sides of the American political spectrum (though researchers say there are substantially more of them on the right).[24]
There can be harmful effects even when the disinformation being spread is not entirely false: as Gilad Lotan writes of his study of social networks during the Israel-Gaza conflict, “neither side… was false per se. Rather, each carefully crafted story on either side omitted important detail and context. When this happens constantly, on a daily basis, it systematically and deeply affects people’s perception of what is real.”[25]
- Polarization: While many countries (including Canada) have become somewhat more socially and politically polarized,[26] we’re nowhere near as polarized as we think we are.[27] However, how polarized we think we are – what’s called false polarization – is actually more important, because that’s what we act on when we interact with other people.[28] For example, people on both ends of the political spectrum typically believe that twice as many people on the other side hold that side’s more extreme views as actually do.[29]
False polarization can be a result of misinformation or partisan news,[30] but it may also be a side effect of the ways that networked media encourage cherry-picking (choosing only those pieces of evidence that support your point of view) and nut-picking (engaging with only the most extreme contrary views).[31] That means that whenever we try to engage with someone online, we start by assuming they hold more extreme views than they probably do – and they assume the same about us. As a result, we also tend to think people on the other side are more hostile towards us than they actually are.[32]
False polarization can contribute to conspiracy thinking, as well. People who feel themselves to be marginalized or who feel their status is under threat are more inclined to embrace misinformation in general, and conspiracy theories in particular.[33] This is a particular concern when both misinformation and curated information feeds contribute to a sense among both sides (of any issue or conflict) that they are losing.[34]
[1] Novicoff, M. (2024). How raw milk went from a Whole Foods staple to a conservative signal. Politico.
[2] Lavigne, M. (2023). Resilient? Perceptions, Spread, and Impacts of Misinformation in the New Political Information Environment. McGill University (Canada).
[3] Li, J. (2023). Not all skepticism is “healthy” skepticism: Theorizing accuracy- and identity-motivated skepticism toward social media misinformation. New Media & Society, 14614448231179941.
[4] Bridgman, A., Lavigne, M., Baker, M., Bergeron, T., Bohonos, D., Burton, A., McCoy, K., Hart, M., Lavault, M., Liddar, R., Peng, P., Ross, C., Victor, J., Owen, T., & Loewen, P. (2022). Mis- and Disinformation During the 2021 Canadian Federal Election. Media Ecosystem Observatory.
[5] Ghosh, D., & Scott, B. (2018). Digital deceit: The technologies behind precision propaganda on the internet.
[6] John, J. N. (2021). When Generation Z falls for online misinformation. MIT Technology Review.
[7] Ghosh, D., & Scott, B. (2018). Digital deceit: The technologies behind precision propaganda on the internet.
[8] Singer, N. (2022). Why Am I Seeing That Political Ad? Check Your ‘Trump Resistance’ Score. The New York Times.
[9] Williams, M. N., Marques, M. D., Kerr, J. R., Hill, S. R., Ling, M., & Clarke, E. J. (2024). Increased Belief in One Conspiracy Theory Leads to Increased Belief in Others Over Time.
[10] Green, J., & Issenberg, S. (2017). Inside the Trump Bunker, With Days to Go. Bloomberg.
[11] Madrigal, A. (2017). What Facebook Did to American Democracy. The Atlantic.
[12] DiResta, R. (2022). Arizona’s ‘Tricky Voting Machines’ Sounds Suspiciously Familiar. The Atlantic.
[13] Karalunas, L. M., & Weiser, W. (2023). The Election Denier’s Playbook for 2024. Brennan Center.
[14] Carpenter, P. (2022). Researchers explore misinformation during Quebec election campaign. Global News.
[15] Fitch, G. (1905). The Educational Value of ‘News.’ The State.
[16] Lewandowsky, S., Stritzke, W. G., Oberauer, K., & Morales, M. (2005). Memory for fact, fiction, and misinformation: The Iraq War 2003. Psychological Science, 16(3), 190-195.
[17] Bink, M., Zimmerman, S., & Elsweiler, D. (2022, March). Featured snippets and their influence on users’ credibility judgements. In Proceedings of the 2022 Conference on Human Information Interaction and Retrieval (pp. 113-122).
[18] Dean, B. (2023). Here’s What We Learned About Organic Click Through Rate. Backlinko.
[19] BBC News. (2017). Tech giants sorry for false news about Las Vegas gunman.
[20] BBC News. (2017). Tech giants sorry for false news about Las Vegas gunman.
[21] Carey, B. (2017). How Fiction Becomes Fact on Social Media. The New York Times.
[22] Pasquale, F. (2017). From Holocaust Denial to Hitler Admiration: Google’s Algorithm Is Dangerous. The Huffington Post.
[23] Lotan, G. (2016). Fake News Is Not the Only Problem. Data & Society.
[24] Folkenflik, D. (2022). Right-wing ‘zombie’ papers attack Illinois Democrats ahead of elections. NPR.
[25] Lotan, G. (2016). Fake News Is Not the Only Problem. Data & Society.
[26] Merkley, E. (2022). Polarization eh? Ideological divergence and partisan sorting in the Canadian mass public. Public Opinion Quarterly, 86(4), 932-943; Garzia, D., Ferreira da Silva, F., & Maye, S. (2023). Affective polarization in comparative and longitudinal perspective. Public Opinion Quarterly, 87(1), 219-231.
[27] Parker, V. (2018). The Great and Widening Divide: Political False Polarization and its Consequences.
[28] Ling, J. (2023). Far and Widening: The Rise of Polarization in Canada. Public Policy Forum.
[29] Mernyk, J. S., Pink, S. L., Druckman, J. N., & Willer, R. (2022). Correcting inaccurate metaperceptions reduces Americans’ support for partisan violence. Proceedings of the National Academy of Sciences, 119(16), e2116851119.
[30] Tokita, C. K., Guess, A. M., & Tarnita, C. E. (2021). Polarized information ecosystems can reorganize social networks via information cascades. Proceedings of the National Academy of Sciences, 118(50), e2102147118.
[31] Stone, D. (2023). Undue Hate: A Behavioral Economic Analysis of Hostile Polarization in US Politics and Beyond.
[32] Parker, V. (2018). The Great and Widening Divide: Political False Polarization and its Consequences.
[33] Koller, W. N., Thompson, H., & Cannon, T. D. (2023). Conspiracy mentality, subclinical paranoia, and political conservatism are associated with perceived status threat. PLOS ONE, 18(11), e0293930.
[34] Van Green, T. (2024). Most Americans continue to say their side in politics is losing more often than it is winning. Pew Research Center.