Finding and Evaluating Science and Health Information - Introduction

Two of the most important kinds of information we look for online are about health and science: almost half of Americans, for example, report seeing science content in the previous few weeks, and a third follow a science-focused page or account online.[1] However, nine in 10 Canadians have seen misinformation on at least one of those topics.[2]

This can have a big effect on decisions we make about our own lives and our opinions on controversial issues: one study found that misinformation may have contributed to 47 million additional infections in the United States during the COVID-19 pandemic.[3]

“The death consequences of the vaccine misinformation and disinformation were severe. People died unnecessarily because of all the nonsense out there that caused many people to not want to get vaccinated.”
– Dr. Reed Tuckson, co-founder of the Black Coalition Against COVID[4]

For someone who isn’t a scientist or doctor – and, given the numerous specialties in each field, for many who are – it can be hard to judge claims about science or medicine. Many people take advantage of that fact for commercial reasons, using everything from biology to quantum physics to sell supplements and self-help books. Others exploit our fear that we don’t understand science well enough to make informed judgments, using it to cast doubt on scientific consensus that doesn’t fit their worldview.

Why we trust science but distrust scientists

There’s good reason to be doubtful when we hear about the latest scientific discovery. In fact, doubting what we know is what science is all about. That’s been true at least as far back as the ancient Greeks, who would propose competing theories about everything from whether the world is made of elements or atoms to whether humans were created by gods or evolved from fish. While some of their theories turned out to be very nearly true, in most cases there was no way to know which were false and which were true because they had not yet developed the method of modern science: testing a hypothesis by gathering data (usually through experiments or observations) that would prove or disprove it. A hypothesis must be falsifiable (that is, there must be a way that it could be proven wrong) and, similarly, must allow us to predict things we haven’t yet encountered.

As valuable as our modern scientific method is, there are some things about it that can lead to distrust of science. One is that few of us ever see the evidence supporting (or disproving) a theory or hypothesis, and in a lot of cases we wouldn’t have the expertise to evaluate it even if we did. As a result, most of us have to take what scientists tell us has been proven at least partially on faith. The two major checks on this are the peer-review process (where other scientists in the field review work before it’s published) and replicability (where hypotheses are not considered proven unless other scientists have gone through the same process and found the same results). But the economics of science funding often mean that it’s easier to find time and money to do new research than to replicate (or falsify) existing work; as well, new work is much more likely to get news coverage than later research that provides evidence against it.

Even when it works as it should, this process can have the ironic effect of making people doubt science rather than trust it. The strength of science as a way of understanding the universe is its ability to self-correct, but when people learn of past theories that have been disproven, they often view them mistakenly as “bad science” rather than examples of science done well.[5]

Perhaps not surprisingly, then, for many people the scientific consensus takes a back seat to their personal or political beliefs. Not only is the question of whether climate change is caused by human activity strongly polarized politically (an effect that is actually stronger in Canada than in the US),[6] but American research has found that among Republicans, education doesn’t make a significant difference in attitudes toward it.[7] Similarly, Democrats are substantially more likely than Republicans to say they trust experts generally (82 percent vs. 66 percent) and journalists (63 percent vs. 23 percent) on scientific topics,[8] and trust in science has only become more polarized in the last ten years.[9]

There is good news, however: an experiment in “inoculating” readers of scientific information found that making them aware of the strategies used in scientific misinformation neutralized the effect of political polarization when they later encountered misinformation using the same tactics.[10] Other research has found that in fields where there is a strong consensus, such as climate change or vaccination, making people aware of that consensus has a significant effect on their views[11] – and that this effect is even stronger for people who start out doubtful or dismissive of the consensus.[12]

The following sections cover how we get news and information about health and science topics, the types of misinformation that are particularly common in those subjects, and the steps we can take to determine how reliable a source or claim is.


[1] Saks, E., & Tyson, A. (2022). Americans report more engagement with science news than in 2017. Pew Research Center.

[2] Kischuk, O. (2024). 9 in 10 Canadians have seen misinformation on health and health care, and most think that is here to stay. Abacus Data.

[3] DeVerna, M. R., Pierri, F., Ahn, Y. Y., Fortunato, S., Flammini, A., & Menczer, F. (2024). Modeling the amplification of epidemic spread by misinformed populations. arXiv preprint arXiv:2402.11351.

[4] Durham, A. (2023). COVID-19 changed how we view health misinformation. Word in Black.

[5] Hansson, S. O. (2020). How not to defend science. A Decalogue for science defenders. Disputatio, 9(13), 197-225.

[6] Boulianne, S., & Belland, S. (2022). Climate denial in Canada and the United States. Canadian Review of Sociology/Revue canadienne de sociologie, 59(3), 369-394.

[7] Mooney, C. (2011, May/June). The science of why we don't believe science. Mother Jones. https://www.motherjones.com/politics/2011/04/denial-science-chris-mooney/

[8] Saks, E., & Tyson, A. (2022). Americans report more engagement with science news than in 2017. Pew Research Center.

[9] Ahuja, A. (2024). Trump’s curious effect on trust in science. Financial Times.

[10] Appel, R. E., Roozenbeek, J., Rayburn-Reeves, R., Basol, M., Corbin, J., Compton, J., & van der Linden, S. (2024). Psychological inoculation improves resilience to and reduces willingness to share vaccine misinformation. PsyArXiv Preprints.

[11] van Stekelenburg, A., Schaap, G., Veling, H., van’t Riet, J., & Buijzen, M. (2022). Scientific-consensus communication about contested science: A preregistered meta-analysis. Psychological Science, 33(12), 1989-2008.

[12] Goldberg, M. H., Gustafson, A., van der Linden, S., Rosenthal, S. A., & Leiserowitz, A. (2022). Communicating the scientific consensus on climate change: diverse audiences and effects over time. Environment and Behavior, 54(7-8), 1133-1165.