Become Aware of Your Own Biases

One of the hardest things about being a responsible sharer is becoming aware of your own biases – the reasons why you might be more likely to believe something without evidence. These are aspects of the way we think that can lead us to accept false statements, reject true ones, or simply not ask enough questions.

A new belief often begins with an episode of disfluency or cognitive dissonance, which may be provoked by a question, recognition of a gap in knowledge, a worrying new event, et cetera. This leads to a period of doubt that eventually results in an “aha” moment that restores fluency.[1] Except in unusual cases, this doesn’t happen right away. Instead, we assimilate new information into our existing views until eventually we reach “the moment after which people can no longer justify ignoring an onslaught of disconfirmatory evidence.”[2]

One frequent cause of disfluency is when expert advice or consensus changes – such as when public health advice during the early days of the COVID-19 pandemic changed from recommending that most people not wear masks, to recommending cloth or surgical masks, to recommending N95 masks. In these cases, people are likely to experience disfluency – and so become susceptible to mis- and disinformation as well as their own biases – unless the reasons for the change in consensus are well communicated.[3]

Conspiracy theorists and disinformation spreaders also often appropriate the language of media literacy, calling themselves critical thinkers and exhorting other people to “do your own research.” In fact, though, people who have been taught media literacy or critical thinking in schools are 26 percent less likely to believe in a conspiracy theory – while those without that kind of formal training, but who identify themselves as “critical thinkers,” are 63 percent more likely to do so.[4]

Biases

What happens between the initial moment of disfluency and the “aha” moment depends very much on the person’s ability and willingness to engage in critical thinking: while critical thinking may lead to a more accurate view, information that provokes discomfort or anxiety – in other words, disfluency – also frequently leads people to misinformation.[5] This is especially the case when new information conflicts with one of our fundamental beliefs about the world. These fundamental beliefs or biases can be summarized as:[6]

  1. My experience is a reasonable reference. We can’t help but see the world from our own point of view. Unless we take that into account, we’re likely to think that other people know more about us and pay more attention to us than they actually do. We may also assume that other people are more like us than they are. We also tend to believe that things that are frequent or pleasurable for us are so for others, and that most people think and believe the same things we do.
  2. I make correct assessments of the world. We’re also likely to see ourselves as being more objective and logical than other people. As a result, we tend to see information as less significant or reliable if it challenges what we already believe than if it supports it. In the same way, we may seek out information that will support a decision after we’ve already made it.[7]
  3. I (and people like me) am good. We prefer to see ourselves as being good, competent and intelligent, and this extends to people similar to us and people we identify as being in our group. A corollary of this is that we are likely to see people unlike us, and people outside of our group, more negatively. Evidence that this may not be true can be painful, so it’s especially likely to cause disfluency.[8] Because it applies to the group as well as the self, it’s also a major cause of polarization, which is our tendency to judge things as true or false, good or bad, or dangerous or safe based on our group identity. This also explains things like the hostile media effect, where both sides of an issue tend to see news coverage of the issue as being biased against them.[9]

When any of these beliefs is challenged, we are prone to confirmation bias: seeking out or accepting only information that avoids disfluency by reinforcing one of those beliefs.

While we can never entirely rid ourselves of these biases, we can limit their impact by fostering:

  • A focus on accuracy over other goals such as identity protection;
  • Open-mindedness to a diversity of ideas;
  • Accountability to provide evidence and reasons for our arguments;[10] and
  • A tolerance for ambiguity and failure.

We need open-mindedness and accountability to be able to meaningfully compare arguments and avoid the “just asking questions” trap. As well, we have to explicitly define open-mindedness as being open to the possibility that we might be wrong, or that we might have unconsciously ignored or minimized facts or perspectives that challenge what we believe. Being able to live with ambiguity lets us take longer to “lock in” to a belief or a decision – after which we are much less open to new information.[11] In the same way, being able to treat failure as iterative – a possibly necessary step towards a final answer – is central to the scientific method and to critical thinking in general.[12]

This attitude – intellectual humility – is associated not just with better thinking but also with reduced polarization[13] and with a greater willingness to investigate political misinformation.[14]

Key aspects of intellectual humility are being:

  • open-minded, willing to revise your beliefs in the face of new evidence;
  • curious, actively seeking out new information (even if it might challenge what you think or believe);
  • realistic about the flaws and limitations of your own thinking;
  • teachable, willing to accept that other people are more knowledgeable or more expert than you,[15] in order to avoid the trap of “doing your own research” (which is associated with lower trust in scientific institutions and more belief in misinformation).[16]

Intellectual humility can be measured through whether people agree with statements like “It’s important to be able to change my mind about a question when high-quality scientific evidence suggests I may be wrong” and “It’s important to be open-minded about new claims about the nature of the world, even ones I disagree with.”

Reflecting on statements like these is especially effective at fostering intellectual humility when it’s focused on dispelling the illusion of explanatory depth: our tendency to believe we understand things better than we actually do (a subtype of the second cognitive bias, “I make correct assessments of the world”).
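
To make the idea of measuring intellectual humility concrete, here is a minimal sketch of how a short self-report scale like this might be scored. The two statements come from the text above, but the five-point agreement scale and the simple averaging are illustrative assumptions, not a validated psychological instrument.

```python
# Illustrative sketch only: scores a short intellectual-humility questionnaire
# by averaging agreement ratings. The 1-5 scale and the averaging scheme are
# assumptions for illustration, not a validated instrument.

STATEMENTS = [
    "It's important to be able to change my mind about a question when "
    "high-quality scientific evidence suggests I may be wrong.",
    "It's important to be open-minded about new claims about the nature of "
    "the world, even ones I disagree with.",
]

def humility_score(ratings):
    """Average agreement ratings (1 = strongly disagree, 5 = strongly agree)."""
    if len(ratings) != len(STATEMENTS):
        raise ValueError("Provide one rating per statement.")
    if any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("Ratings must be on a 1-5 scale.")
    return sum(ratings) / len(ratings)

# Example: strong agreement with the first statement, neutral on the second.
print(humility_score([5, 3]))  # 4.0
```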

Thinking shortcuts

While biases are always at work, they’re not the only thing that can lead us to believe in misinformation. Heuristics are techniques that reduce cognitive load – how much thinking we have to do – by being fast and frugal with our energy and attention. Some of these “thinking shortcuts,” like proverbs, are taught and learned explicitly, but we develop many more over time and are often not even aware that we’re using them.

These are thought to have developed early in humanity’s evolution, when it was safer to wrongly believe a vine was a snake than to mistake a snake for a vine,[17] but we are equally likely to use them in our “high-choice media environment.”[18]

We can see the value of these thinking shortcuts by remembering when we learned to do something complex and risky, like driving a car or riding a bicycle. At first the cognitive load was very high, as we had to think consciously about every aspect of what we were doing. The result was a high level of stress and, likely, a fair number of mistakes. With practice, however, certain things became automatic, reducing our cognitive load and as a result our stress. We also learned or developed heuristics: for instance, rather than looking at the speedometer, which is neither fast (we need to shift our gaze) nor frugal (we need to read a numeric display), most people gauge their speed by tracking how quickly things at the side of the road are moving through their peripheral vision.[19]

Rather than trying to discourage people from using these shortcuts, we can promote better ones – or ways of using existing ones better – that are equally fast and frugal. Here are some examples of common thinking shortcuts that are likely to backfire:

Aesthetic: Judging a source based on how it looks, or on other trappings of authority (such as having a dot-org web address). While in the past this meant a preference for professional-looking sites, young people now are more likely to look for sources with no obvious advertising, since a site with prominent ads is seen as a sign “that they just care about making money and not the actual news.”[20] Similarly, students in a 2024 study nearly all still used a website’s domain (e.g. whether or not its web address ended in “.org”) as a top indicator of trustworthiness.[21]

Availability: Drawing on things that are easiest to remember or to think of. This can be misleading because what is most available is often what happened most recently or what stuck in our memory by being highly dramatic. It can also lead us to see patterns where there aren’t any: if we draw three cards from a deck and they’re all hearts, we may decide there’s a pattern and stop there, rather than seeing the streak as part of a larger, random sequence (see the short simulation after this list). Because we overvalue our own experience, this shortcut can also lead us to assume that things that are common in our experience are common in everybody’s.

Convenience: Choosing a source based on how easy it is to access or understand. This is often expressed in “satisficing” or good-enough thinking: for instance, getting just enough information to support an argument, doing a search but not investigating any of the results, or turning to video over written content.[22]

Crowdsourcing credibility: Turning to “the crowd” to decide whether or not to accept or believe something – for instance, by checking the comments to a post to see if anyone else has objected to it.[23]

Naïve skepticism: Treating every source as equally unreliable in order to avoid being fooled. Though doubt is an essential part of critical thinking, if we don’t recognize that some sources are more trustworthy than others, we will simply fall back on what we already believe. As well, purveyors of disinformation may take advantage of this by claiming there’s less consensus about something than there actually is. This is sometimes called the “tobacco industry playbook,” as it was used for many years to make the public think there was still doubt about whether or not smoking caused cancer.[24]

Salience: Focusing the most on what seems the most relevant or important at a particular time. Pay attention to when something provokes strong feelings like anger or fear, or reassures you that you’re right.[25]

Surrogate thinking: Rather than drawing on multiple sources, it can be easier to find one or a small number to whom we can turn to “filter information and clarify points of view.”[26]
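
The availability example above leans on a bit of probability that’s easy to check. The following minimal sketch (Python, purely for illustration – the card-drawing scenario comes from the text, but modelling it as a full-deck simulation is an added assumption) shuffles a standard 52-card deck many times and estimates how often at least one run of three same-suit cards appears somewhere in the deal.

```python
# Illustrative simulation: how often does a shuffled deck contain at least one
# run of three consecutive cards of the same suit? Only suits matter here, so
# the deck is modelled as 13 cards of each of the four suits.
import random

DECK = ["hearts", "diamonds", "clubs", "spades"] * 13  # 52 cards, suits only

def has_three_in_a_row(cards):
    """Return True if any three consecutive cards share a suit."""
    return any(cards[i] == cards[i + 1] == cards[i + 2]
               for i in range(len(cards) - 2))

def estimate_streak_rate(trials=10_000):
    """Estimate the fraction of random shuffles containing such a streak."""
    hits = 0
    for _ in range(trials):
        deck = DECK[:]
        random.shuffle(deck)
        if has_three_in_a_row(deck):
            hits += 1
    return hits / trials

print(f"Shuffles with a three-card same-suit streak: {estimate_streak_rate():.0%}")
```

The exact percentage isn’t the point: short runs like this are a normal feature of random sequences, so treating one as evidence of a pattern is exactly the error the availability shortcut invites.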

Even when we do engage in more deliberate reasoning, there’s no guarantee that it will lead us to a good result. Critical thinking, then, isn’t just a matter of promoting reason over intuition, but a matter of reasoning correctly. As Susan Gelman put it, “the cognitive tendencies that make us so smart and sophisticated can also be sources of distortion and bias.”[27]

Three critical thinking questions

One way of making sure that you’re actually thinking critically is to ask yourself three questions:[28]

  1. What do I already think or believe about this?

We pay attention to things that fit with what we already think is true. That’s not always a bad thing: science, medicine and other subjects all have a consensus – what experts believe is most likely true, based on all the evidence that’s been found so far.

It’s important to know what the consensus is on a topic when you’re judging a claim or a new piece of information. Be careful not to accept something just because it supports your opinion or to reject something because it doesn’t match what you already think or believe.[29]

  2. Why do I want to believe or disprove this?

It’s good to be skeptical. Sometimes, though, we’re less skeptical of things we want to think are true – and more skeptical of things we don’t want to believe[30] – even to the point of ignoring strong signals like fact-checking labels that something is false or unreliable.[31] Partisanship (identifying strongly with one of a number of opposing groups or one side of an issue) also makes people less interested in sources of information that might challenge their views.[32]

What’s important is not being trustful or distrustful by default, but pausing long enough to let you focus on accuracy.[33] Before you decide to believe something, take a few moments to ask yourself if you’d make the same decision if you didn’t want it to be true. Make sure you work just as hard to double-check things that you want to believe!

  3. What would make me change my mind?

“To investigate properly, you have to allow yourself to be wrong.” - Jordan Wildon

Imagine you’re doing a jigsaw puzzle. Conspiracy theorists already think they know what the puzzle will look like. They only pick the pieces that support their idea and throw away any that suggest it might be wrong.

That’s why the most important part of critical thinking is being willing to change your mind based on new information.

Before you verify something, decide what would convince you that it is or isn’t true:

  • What if a reliable fact-checker confirmed or disproved it?
  • What if the source turned out to be a reliable news outlet, or if you learned it was an unreliable source of disinformation? (Remember that you need to find the original source of a fact or claim before you verify it. See www.breakthefake.ca for info on how to do that.)


[1] Van de Cruys, S., Bervoets, J., Gadsby, S., Gijbels, D., & Poels, K. (2023). Insight in the conspiracist’s mind.

[2] McRaney, D. (2022) How Minds Change. Penguin Random House.

[3] Warzel, C. (2022). “The Bad Ideas Our Brains Can’t Shake.” The Atlantic.

[4] Reboot Foundation. (2022) Science Fictions: Low Science Knowledge and Poor Critical Thinking are Linked to Conspiracy Beliefs.

[5] Booth, E., Lee, J., Rizoiu, M. A., & Farid, H. (2024). Conspiracy, misinformation, radicalisation: understanding the online pathway to indoctrination and opportunities for intervention. Journal of Sociology, 14407833241231756.

[6] Oeberst, A., & Imhoff, R. (2023). Toward parsimony in bias research: A proposed common framework of belief-consistent information processing for a set of biases. Perspectives on Psychological Science, 18(6), 1464-1487.

[7] Stone, C., Mattingley, J. B., & Rangelov, D. (2022). On second thoughts: changes of mind in decision-making. Trends in cognitive sciences, 26(5), 419-431.

[8] Oeberst, A., & Imhoff, R. (2023). Toward parsimony in bias research: A proposed common framework of belief-consistent information processing for a set of biases. Perspectives on Psychological Science, 18(6), 1464-1487.

[9] Perloff, R. M. (2018). A three-decade retrospective on the hostile media effect. In Advances in Foundational Mass Communication Theories (pp. 196-224). Routledge.

[10] Duke, A. (2018). Thinking in bets: Making smarter decisions when you don't have all the facts. Penguin.

[11] De Neys, W. (2023). Advancing theorizing about fast-and-slow thinking. Behavioral and Brain Sciences, 46, e111.

[12] Reboot Foundation. (n.d.) How to Teach Critical Thinking.

[13] Knöchelmann, L., & Cohrs, J. C. (2024). Effects of intellectual humility in the context of affective polarization: Approaching and avoiding others in controversial political discussions. Journal of Personality and Social Psychology.

[14] Koetke, J., Schumann, K., Porter, T., & Smilo-Morgan, I. (2023). Fallibility salience increases intellectual humility: Implications for people’s willingness to investigate political misinformation. Personality and Social Psychology Bulletin, 49(5), 806-820.

[15] Van Tongeren, D. (2024). The Curious Joy of Being Wrong. The Conversation.

[16] Chinn, S., & Hasell, A. (2023). Support for “doing your own research” is associated with COVID-19 misperceptions and scientific mistrust. Harvard Kennedy School Misinformation Review.

[17] Wilke A. and Mata R. (2012) Cognitive Bias. In: V.S. Ramachandran (ed.) The Encyclopedia of Human Behavior, vol. 1, pp. 531-535. Academic Press.

[18] Young, D. G. (2023). Wrong: How media, politics, and identity drive our appetite for misinformation. JHU Press.

[19] Denton, G. G. (1980). The influence of visual pattern on perceived speed. Perception, 9(4), 393-402.

[20] Hassoun, A., Beacock, I., Consolvo, S., Goldberg, B., Kelley, P. G., & Russell, D. M. (2023, April). Practicing Information Sensibility: How Gen Z Engages with Online Information. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1-17).

[21] Besharat-Mann, R. (2024). Can I trust this information? Using adolescent narratives to uncover online information seeking processes. Journal of Media Literacy Education, 16(1), 1-18.

[22] Hassoun, A., Beacock, I., Consolvo, S., Goldberg, B., Kelley, P. G., & Russell, D. M. (2023, April). Practicing Information Sensibility: How Gen Z Engages with Online Information. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1-17).

[23] Hassoun, A., Beacock, I., Consolvo, S., Goldberg, B., Kelley, P. G., & Russell, D. M. (2023, April). Practicing Information Sensibility: How Gen Z Engages with Online Information. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1-17).

[24] Oreskes, N., & Conway, E. M. (2010). Defeating the merchants of doubt. Nature, 465(7299), 686-687.

[25] Szewach, P., Reifler, J., & Oscarsson, H. (2022). Is resistance futile? Citizen knowledge, motivated reasoning, and fact-checking. Knowledge Resistance in High-Choice Information Environments, 166.

[26] Hassoun, A., Beacock, I., Consolvo, S., Goldberg, B., Kelley, P. G., & Russell, D. M. (2023, April). Practicing Information Sensibility: How Gen Z Engages with Online Information. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1-17).

[27] Gelman, S. A. (2023). Looking beyond the obvious. American Psychologist, 78(5), 667.

[28] Sadek, D., & Carvin A. (2024) “How to consume information in times of conflict,” Digital Forensic Research Lab (DFRLab), March 18, 2024, https://dfrlab.org/2024/03/18/how-to-consume-information-in-times-of-conflict

[29] Warzel, C. (2024) Evidence Maximalism Is How the Internet Argues Now. The Atlantic.

[30] Li, J. (2023). Not all skepticism is “healthy” skepticism: Theorizing accuracy- and identity-motivated skepticism toward social media misinformation. New Media & Society, 14614448231179941.

[31] Li, J. (2023). Not all skepticism is “healthy” skepticism: Theorizing accuracy- and identity-motivated skepticism toward social media misinformation. New Media & Society, 14614448231179941.

[32] Chopra, F., Haaland, I., & Roth, C. (2024). The demand for news: Accuracy concerns versus belief confirmation motives. The Economic Journal, ueae019.

[33] Mayo, R. (2023). Trust or distrust? Neither! The right mindset for confronting disinformation. Current Opinion in Psychology, 101779.