Become Aware of Your Own Biases

One of the hardest things about being a responsible sharer is being aware of your own biases: the reasons why you might be more likely to believe something without evidence. These are aspects of the way we think that can lead us to accept false statements, reject true ones, or simply not ask enough questions.

Here are some common cognitive biases that we need to consider before we decide to share something[1]:

  • Privileging facts that reinforce what we already believe: This is what most of us think of when we hear the word "bias": we’re more likely to accept or ignore things depending on whether or not they support what we already believe. A warm day in February – or a cold one in July – doesn’t really provide evidence for or against climate change, but our opinion on that issue will make us more likely to think that one or the other does. This kind of bias doesn’t have to be political or ideological, though: for instance, we’re more likely to remember good things about someone we already like and ignore bad ones.
    While this bias has always been with us, the internet gives us the ability to preferentially search for facts that confirm what we already believe: "if our initial search fails to turn up the results we want, we don’t give it a second thought, rather we just try out a different query and search again. Armed with this power to search, it usually doesn’t take long to find someone or something that confirms our bias."[2] It's also why much misinformation is aimed not at persuading you about something, but simply casting enough doubt about the opposing argument that those who are inclined to disbelieve it feel justified in doing so.[3]
    This bias is also why the classic “triangulation method” of finding three sources to confirm information is often not enough when using online sources: it’s easy to find three sources that all confirm your opinion if you look for them specifically. (As well, what looks like three distinct sources may actually all be drawing on the same original source: a recent study of blogs that claimed global warming was either a hoax or not caused by human activity found that 80% of them cited the same website.[4])
  • Filling in the puzzle: People are natural storytellers, and we like it when things fit into a pattern. Once we think that we recognize a pattern, we’re more likely to accept things that fit into the pattern and more likely to ignore things that don’t. Some patterns are real, of course, but you should still judge each fact for itself and not for how well it fits the pattern.
  • Not wanting to make waves: We can also be influenced by what people around us believe. If we know that all of our family or friends believe something – especially if it’s a strongly emotional belief – we’re more likely to pay attention to things that confirm it and ignore things that argue against it. Because social networks allow us to associate only with people who share our opinion, this bias can be particularly powerful online. As Judith Donath, founder of the Sociable Media Group at the MIT Media Lab, puts it, "News is shared not just to inform or even to persuade. It is used as a marker of identity, a way to proclaim your affinity with a particular community."[5]
  • Paying more attention to unusual things: There’s a natural tendency to be more worried about rare, dramatic events, like plane crashes, than more common ones like car crashes. Media can also make us believe that things that are considered newsworthy, like plane crashes or violent crime, are more common than they actually are.
  • Not wanting to admit we don’t know: A lot of the time we jump to conclusions even if we don’t have enough information to make an informed decision. This can be due to time pressure (when we have to make decisions quickly) or social pressure (when we feel like we have to share something with our peers). This, along with "filling in the puzzle" above, is why misinformation is most effective when it's partly true: when we're pressed for time, we value the facts that we know over the questions that we don't have answers for.[6] An extreme form of this is cynically denying that there is such a thing as truth, which, ironically, makes us more prone to privileging facts that support what we already believe: if all sources are liars, then showing loyalty to our group matters more than being right[7] – an attitude that is also associated with belief in conspiracy theories.[8]

What are the best ways to overcome our biases?

  • First, as noted above, giving ourselves time to cool down before we share content.
  • Learning how news and other media are made, which you can do with our Verifying Online News section.
  • Making a conscious effort to value accuracy. As broadcasters of information, we have the same responsibility as journalists to check our facts before sharing: as one reporter interviewed about their fact-checking habits put it, “You don’t take any piece of information without verifying it.”[9]
  • Only sharing content when we’re confident of the identity and authority of both the original source and the source that shared it with us. An increasing amount of evidence shows that “troll factories,” including some working on behalf of countries such as Russia and Iran, work to spread content like memes across the political spectrum, with the aim of making communities more polarized.[10] Even if the content itself is true, spreading content from these accounts increases their credibility, which can make a difference when they do spread misinformation. Many of the Twitter bots run by the Internet Research Agency (IRA), a Russian propaganda organization, pretended to be nonexistent local newspapers, such as the “Milwaukee Voice,” and spread mostly accurate news stories reported by other outlets, so that “instead of just showing up online and flooding it with news sites, they have these accounts with two years of credible history.”[11]
  • Being willing to admit when you’re wrong. As broadcasters, we have the same responsibility that news outlets have to clearly and openly correct errors or misinformation that we’ve shared. In fact, admitting and correcting your errors can actually make you more credible.[12]
  • Finally, allowing ourselves to be curious: research has found that curiosity makes people more likely to overcome their political biases on scientific issues.[13]

[1] Benson, B. (2016, September 1). Cognitive bias cheat sheet. Better Humans. Retrieved April 5, 2018.
[2] Owings, J. (2009, February 9). Confirmation Bias and the Internet. Retrieved April 5, 2018.
[3] Kahan, D. M., Landrum, A., Carpenter, K., Helft, L., & Jamieson, K. H. (2017). Science Curiosity and Political Information Processing. Political Psychology, 38, 179-199. doi:10.1111/pops.12396
[4] Harvey, J. A., Berg, D. V., Ellers, J., Kampen, R., Crowther, T. W., Roessingh, P., & Mann, M. E. (2018). Internet Blogs, Polar Bears, and Climate-Change Denial by Proxy. BioScience, 68(4), 237-237. doi:10.1093/biosci/biy033
[5] Zuckerman, E. (2017). Mistrust, Efficacy and the New Civics: Understanding the Deep Roots of the Crisis of Faith in Journalism. (Rep.) The Knight Foundation.
[6] Standish, R. (2017, March 1). Why Is Finland Able to Fend Off Putin’s Information War? Foreign Policy.
[7] Balmas, M. (2012). When Fake News Becomes Real. Communication Research, 41(3), 430-454. doi:10.1177/0093650212453600
[8] Koerth-Baker, M. (2013, May 21). Why Rational People Buy Into Conspiracy Theories. The New York Times. Retrieved April 5, 2018.
[9] Shapiro, I., Brin, C., Bédard-Brûlé, I., & Mychajlowycz, K. (2013). Verification as a Strategic Ritual. Journalism Practice, 7(6), 657-673. doi:10.1080/17512786.2013.765638
[10] Facebook uncovers new global misinformation operations. The Associated Press. Retrieved August 21, 2018.
[11] Mak, T., & Berry, L. (2018, July 12). Russian Influence Campaign Sought To Exploit Americans' Trust In Local News.
[12] Riordan, K. (2018). Accuracy, Independence and Impartiality: How Legacy Media and Digital Natives Approach Standards in the Digital Age. (Rep.) Reuters Institute for the Study of Journalism.
[13] Kahan, D. M., Landrum, A., Carpenter, K., Helft, L., & Jamieson, K. H. (2017). Science Curiosity and Political Information Processing. Political Psychology, 38, 179-199. doi:10.1111/pops.12396