Finding and Evaluating Science and Health Information - Introduction

Two of the most important kinds of information we look for online are about health and science. Both can have a major effect on the decisions we make about our own lives and on our opinions about controversial issues.

For someone who isn’t a scientist or doctor – and, given the numerous specialties in each field, for many who are – it can be hard to judge claims about science or medicine. Many people exploit that difficulty for commercial reasons, using everything from biology to quantum physics to sell supplements and self-help books. Others play on our fear that we don’t understand enough about science to make informed judgments, using it to cast doubt on scientific consensus that doesn’t fit their worldview.

Why we trust science but distrust scientists

There’s good reason to be doubtful when we hear about the latest scientific discovery: in fact, doubting what we know is what science is all about. That’s been true at least as far back as the ancient Greeks, who proposed competing theories about everything from whether the world is made of elements or atoms to whether humans were created by gods or evolved from fish. Some of their theories turned out to be very nearly true (though, unfortunately, these were not generally the theories that won widespread acceptance over the following millennia). In most cases, however, there was no way to know which were true and which were false, because the Greeks had not yet developed the method of modern science: testing a hypothesis by gathering data (usually through experiments or observations) that would prove or disprove it. A hypothesis must be falsifiable (that is, there must be a way it could be proven wrong) and, similarly, must allow us to predict things we haven’t yet encountered.

As valuable as the modern scientific method is, some things about it can lead to distrust of science. One is that few of us ever see the evidence supporting (or disproving) a theory or hypothesis, and in many cases we wouldn’t have the expertise to evaluate it even if we did. As a result, most of us have to take what scientists tell us has been proven at least partially on faith. The two major checks on this are the peer-review process (where other scientists in the field review work before it’s published) and replicability (where hypotheses are not considered proven until other scientists have gone through the same process and found the same results). But the economics of science funding often mean that it’s easier to find the time and money to do new research than to replicate (or falsify) existing work; as well, new work is much more likely to get news coverage than later research that provides evidence against it.

Even when it works as it should, though, this process can have the ironic effect of making people doubt science rather than trust it. The strength of science as a way of understanding the universe is its ability to self-correct, but when people learn of past theories that have been disproven, like the ether or phlogiston; of theories that were once considered fringe but are now accepted, such as the movement of tectonic plates; of fundamental ideas like Newton’s model of gravity being shown to be incomplete; or of outright frauds that were, if only briefly, accepted by the scientific community, they often mistakenly view these as “bad science” rather than as examples of science done well.

Perhaps not surprisingly, then, for many people the scientific consensus takes a back seat to their personal or political beliefs. One American study, for example, found not only that the question of whether climate change is caused by human activity was strongly polarized politically, but that better-educated Republicans were actually twice as likely to deny the scientific consensus as those who had not been to university. (The reverse held true for Democrats, though the effect wasn’t as strong.[1]) While some of these people may be consciously rejecting science that they know is probably correct, it’s more likely that the majority are receiving more, and stronger, misinformation messages about this issue than accurate ones, and that their education in science – which, in the K-12 system, often focuses more on content and following prescribed steps than on the practice of asking questions and testing hypotheses[2] – has not prepared them to judge different messages about science accurately.

There is good news, however. An experiment in “inoculating” readers of scientific information found that making them aware of the strategies used in scientific misinformation – such as the tobacco industry’s efforts to cast doubt on the evidence that smoking is bad for you – neutralized the effect of political polarization when readers encountered the same tactic again.[3]

The following sections cover how we get news and information about health and science topics, the types of misinformation that are particularly common in those subjects, and the steps we can take to determine how reliable a source or claim is.

[1] Mooney, C. (2011, May/June). The Science of Why We Don't Believe Science. Mother Jones. Retrieved March 29, 2018, from https://www.motherjones.com/politics/2011/04/denial-science-chris-mooney/
[2] Park, H. (2012). Understanding the nature of science: A comparative study of Canadian and Korean students (Doctoral dissertation, University of Toronto). Retrieved from https://tspace.library.utoronto.ca/bitstream/1807/34832/1/Park_Hyeran_201209_PhD_thesis.pdf
[3] Cook, J., Lewandowsky, S., & Ecker, U. K. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLOS ONE, 12(5). doi:10.1371/journal.pone.0175799