Where do we get information about health and science?

While many of us strongly prefer online sources when seeking out health and science information,[1] a majority first encounter health or science stories through traditional news outlets.[2]

In the News

My job is to be right, but it is even more to be read.[3]

While there are many magazines and newspapers with a long tradition of reliable science reporting, some fundamental differences between how science and journalism are done can result in misleading stories. The most fundamental may be that for a science story to make it into the news, it must be newsworthy. Along with the factors that generally influence newsworthiness, Boyce Rensberger in A Field Guide for Science Writers identifies four factors specific to science stories:

  • Fascination value: “People love to be fascinated, to learn something, and think, ‘That’s amazing, I didn’t know that.’” This is particularly true of stories that might remind audiences of a childhood enthusiasm, or that parents might share with their children: dinosaurs, for example, are newsworthy by default.
  • Size of the natural audience: The number of people to whom the story will be directly relevant. For example, a discovery relating to a rare disease is less newsworthy than one about a more common disease.
  • Importance: The likely impact of the story on the audience’s lives.
  • Reliability of the results: Science journalists generally rely on peer review to determine whether or not a finding is reliable, and responsible ones do not platform fringe views.[4]

These factors, of course, may sometimes conflict with one another. Findings relating to nutrition, one of the fields with the lowest standards of reliability, are often reported on because they’re seen as relevant and important to many people (especially if they’re counter-intuitive).

As a result, new findings or those that challenge the scientific consensus are more likely to be covered than those that reinforce the consensus or produce null results (for instance, results that don’t show a health impact of a particular substance) – even though these are just as important in science.[5] This can also produce an impression that there is less of a consensus on some issues than there actually is, because findings that challenge the consensus on a politically charged issue, such as climate change, are automatically more newsworthy than those that support it. As well, the “inverted pyramid” structure of most news articles, which requires the most interesting details to be presented first, can de-emphasize important caveats or limitations of a study. Finally, the findings have to be presented in a way that readers will understand and find relevant. As one science journalist says:

Remember, newspapers are about trying to make a story; they are not presenting results from a science study. They are writing a story about a science study and trying to get you to read it and trying to get you interested in it.[6]

Perhaps because of these factors, research has found that newspapers are actually more likely to cover medical research with fewer citations,[7] as well as studies with male authors[8] and ones for which a press release had been issued.[9] Because so many science news stories are based on press releases rather than the studies themselves, articles also often sensationalize findings, misrepresent the data (for instance, by not making clear that a study was done only in mice and not humans) and make recommendations (such as supplementing or avoiding a particular nutrient) that are either premature or not supported by the study’s findings.[10]

Many of the studies that receive news coverage are also funded by corporations, such as food companies. While this research isn’t necessarily dishonest, it’s often flawed because of biased design (for example, providing grants “towards research on any relevant health issue in which grape consumption may have a beneficial impact”) or poor methodology (looking for a correlation between a product and better health, rather than taking on the more challenging task of determining whether the product actually causes the health effect).[11]

Social networks and video platforms

As with other news, social networks have become an important way in which people first encounter health and science information: 26 percent of U.S. social media users actively follow science-related accounts.[12] For young people, especially, TikTok has become an important source of health information. But while this shows how useful social networks can be for delivering health information, it also shows how easily they can spread misinformation. While research has found that content posted on the platform by health professionals, such as dermatologists, is generally accurate,[13] most users publishing health-related content are not professionals – and the overall accuracy of that content is much lower.[14]

Two aspects of the architecture of social networks – the ability of users to create personal networks and the use of algorithms to push content that users are likely to respond to – can also make users more likely to be exposed to misinformation and, in particular, to have misinformed views reinforced. So far as the algorithms are concerned, true and false information are no different. One study found that while just one in four videos with the #Autism hashtag on TikTok were accurate, people engaged equally with the accurate and inaccurate videos.[15] Once a user is identified as having any interest in “natural” or “alternative” health claims or cures, they are likely to be shown more: “Bogus cures and dangerous health trends relentlessly pursue the people who are most vulnerable to their influence by finding ways to enter the algorithmic recommendations of the sick and searching, even if they are less visible to the platform’s user base at large.”[16]
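To see why accuracy plays no role in what gets promoted, consider a deliberately simplified, hypothetical sketch of engagement-based ranking. This is illustrative Python, not any platform’s actual code, and the Post fields and rank_feed function are invented for the example: a ranker that sorts purely on predicted engagement never reads any measure of accuracy, so an engaging false claim will outrank a less engaging accurate one.

    # Minimal illustrative sketch: ranking by predicted engagement only.
    # Hypothetical code; real recommender systems are far more complex,
    # but the key point holds: accuracy is not an input to the ranking.
    from dataclasses import dataclass

    @dataclass
    class Post:
        title: str
        predicted_engagement: float  # how likely the user is to like, share or watch
        accurate: bool               # known to fact-checkers, never read by the ranker

    def rank_feed(posts: list[Post]) -> list[Post]:
        # Sort solely by predicted engagement; the `accurate` field is ignored.
        return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

    feed = rank_feed([
        Post("Peer-reviewed autism study, explained", 0.41, accurate=True),
        Post("Miracle 'natural cure' testimonial", 0.78, accurate=False),
    ])
    print([p.title for p in feed])  # the false but more engaging post comes first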

This can have a powerful impact on behaviour. Parents’ decisions about having their children vaccinated are strongly influenced by what they've seen in both digital and traditional media.[17] Organized groups take advantage of this effect – and amplify it – by finding people who are seeking information and “recruiting” them into Facebook groups that reinforce misinformation and block out competing views, in much the same way that hate groups radicalize people online.[18]

This thing about alternative facts and fake news, we’ve been living it in the autism community for 20 years. People gradually disappear into little bubbles – private email groups, Facebook groups. And then they seek confirmation.

– Mike Stanton, parent of an autistic child[19]

This effect – dubbed the “majority illusion” by researchers – can lead participants in those closed networks to see only information that reinforces their views and to perceive the scientific consensus as a marginal position.[20] A similar effect, pluralistic ignorance, can lead people to underestimate how widely held accurate beliefs are. For example, while at least two-thirds of Americans support policies aimed at mitigating climate change, they believe that fewer than half share their opinion.[21]

The practice of using “influencers” to endorse products on social media – whose posts are frequently not identified as ads[22] – has spread to health and medicine, with hospitals, pharmaceutical companies and market researchers paying patients to make testimonials on their behalf. Like other influencer campaigns, these dangerously blur the distinction between advertising and honest accounts of patients’ experiences, and can mislead people who have recently been diagnosed with a condition or suspect that they’re suffering from one.[23] Different forms of scientific disinformation are also becoming increasingly tied together on social media, as “wellness” influencers now often spread false claims about climate change[24] and even Russia’s invasion of Ukraine.[25]

Search engines

For many people seeking information about science and health, search engines – or their voice-operated kin such as Apple’s Siri – are the first and sometimes last stop. One quarter of Canadian youth have used the internet to look for information on physical health, while 16 percent have searched for information on mental health and 14 percent have looked for information on sexual health.[26]

As with social networks, though, the fact that search engines don’t generally discriminate between reliable and unreliable sources is a concern. While research has found that searching for one’s symptoms can lead to a more accurate diagnosis if people seek professional medical care afterwards,[27] one quarter of young Canadians (wrongly) believe that Google is endorsing a source as reliable by including it in search results.[28] “Alternative” cancer clinics have also been found to use Google listings and reviews to mislead potential patients about the effectiveness of their treatments.[29]

Consciously or unconsciously, people also often bias their results by looking for information that justifies what they already believe, rather than answering a genuine question:[30] a participant in one study describes using the query “milk bad,” rather than the more neutral “milk good for health,” because he’s already made up his mind on the topic and is looking for ways to bring his family around to his point of view.[31] 

The web

Whether we get to it through search engines, social media or by bookmarking sources we know are reliable, most of the health information we get online ultimately comes from websites. Though there are excellent sources of health and science information available online, the ease of publishing on the web means that more than ten percent of the information found on health websites is unreliable – and unfortunately, less reliable sites generally receive more traffic.[32]

Video sites, such as YouTube, have many of the same problems as search engines and social networks, with both personal and algorithmic filters pushing viewers to watch increasingly extreme content. While YouTube now allows doctors and nurses to apply for “verified provider labels,”[33] efforts to reduce the spread of misinformation on topics like climate change have had a limited impact.[34] Even when they take steps to improve the quality of information, the reach of sites like TikTok and YouTube means that they remain useful to people who spread fringe theories and misinformation: Alex Olshansky, who studies the role of YouTube in spreading “flat Earth” theories, says he has met only one person who was not influenced by what they’d seen on the platform, while Kelly Weill, author of the book Off the Edge, has met only three.[35]

In short, there is good quality information about health and science topics available online, but many of the sites, platforms and forums we use either do not discriminate between good and bad information or actively steer us towards content that is less reliable. As a result, it’s important that we be alert to the different kinds of misinformation on these topics – and the reasons why people promote them – and that we have the skills to seek out good information and recognize it when we find it.


 [1]  Wang, X., & Cohen, R. A. (2023). Health Information Technology Use Among Adults: United States, July–December 2022. US Department of Health and Human Services, Centers for Disease Control and Prevention, National Center for Health Statistics.

[2] Funk, C., Gottfried, J., & Mitchell, A. (2017). Science News and Information Today (Rep.). Pew Research Center.

[3] An unnamed science journalist interviewed in Jarman, R., & Mcclune, B. (2010). Developing students’ ability to engage critically with science in the news: Identifying elements of the ‘media awareness’ dimension. The Curriculum Journal, 21(1), 47-64. doi:10.1080/09585170903558380

[4] Rensberger, B. (1997). Covering science for newspapers. A field guide for science writers, 7-16.

[5] Zhang, Y., Willis, E., Paul, M. J., Elhadad, N., & Wallace, B. C. (2016). Characterizing the (perceived) newsworthiness of health science articles: A data-driven approach. JMIR medical informatics, 4(3), e5353.

[6] Jarman, R., & Mcclune, B. (2010). Developing students’ ability to engage critically with science in the news: Identifying elements of the ‘media awareness’ dimension. The Curriculum Journal, 21(1), 47-64. doi:10.1080/09585170903558380

[7] O'Connor, E. M., Nason, G. J., O'Kelly, F., Manecksha, R. P., & Loeb, S. (2017). Newsworthiness vs scientific impact: are the most highly cited urology papers the most widely disseminated in the media? BJU International, 120(3), 441-454.

[8] Amberg, A., & Saunders, D. N. (2020). Cancer in the news: Bias and quality in media reporting of cancer research. PLoS One, 15(11), e0242133.

[9] Haneef, R., Ravaud, P., Baron, G., Ghosn, L., & Boutron, I. (2017). Factors associated with online media attention to research: a cohort study of articles evaluating cancer treatments. Research Integrity and Peer Review, 2, 1-8.

[10] Dempster, G., Sutherland, G., & Keogh, L. (2022). Scientific research in news media: a case study of misrepresentation, sensationalism and harmful recommendations. Journal of Science Communication, 21(1), A06.

[11] Nestle, M. (2018). Unsavory truth: how food companies skew the science of what we eat. Basic Books.

[12] Funk, C., Gottfried, J., & Mitchell, A. (2017). Science News and Information Today (Rep.). Pew Research Center.

[13] Kassamali, B., Villa-Ruiz, C., Mazori, D. R., Min, M., Cobos, G. A., & LaChance, A. H. (2021). Characterizing top educational TikTok videos by dermatologists in response to “TikTok and dermatology: An opportunity for public health engagement”. Journal of the American Academy of Dermatology, 85(1), e27-e28.

[14] Yeung, A., Ng, E., & Abi-Jaoude, E. (2022). TikTok and attention-deficit/hyperactivity disorder: a cross-sectional study of social media content quality. The Canadian Journal of Psychiatry, 67(12), 899-906.

[15] Aragon-Guevara, D., Castle, G., Sheridan, E., & Vivanti, G. (2023). The reach and accuracy of information on autism on TikTok. Journal of autism and developmental disorders, 1-6.

[16] Ohlheiser, A.W. (2024). How discredited health claims find a second life on TikTok. Vox.

[17] Wigle, J., Hodwitz, K., Juando-Prats, C., Allan, K., Li, X., Howard, L., ... & Parsons, J. A. (2023). Parents’ perspectives on SARS-CoV-2 vaccinations for children: a qualitative analysis. CMAJ, 195(7), E259-E266.

[18] Haenschen, K., Shu, M. X., & Gilliland, J. A. (2023). Curated misinformation: Liking Facebook pages for fake news sites. American Behavioral Scientist, 00027642231175638.

[19] Chivers, T. (2017). How The Parents Of Autistic Children Are Being Targeted By Misinformation Online. Buzzfeed.

[20] Lerman, K., Yan, X., & Wu, X. Z. (2016). The “majority illusion” in social networks. PLOS ONE, 11(2), e0147617. https://doi.org/10.1371/journal.pone.0147617

[21] Sparkman, G., Geiger, N., & Weber, E. U. (2022). Americans experience a false social reality by underestimating popular climate policy support by nearly half. Nature communications, 13(1), 4779.

[22] Chavkin, S., Gilbert C., O’Connor A. (2023) The food industry pays influencer dietitians to shape your eating habits. The Washington Post.

[23] Sy, S., & Nagy, L. (2022) Misinformation abounds in the wellness community: How one antivax influencer broke free. PBS.org.

[24] Simmons, Cecile. (2023) ‘Conspirituality’ and climate: How wellness and new age influencers are serving anti-climate narratives to their audiences. ISD Global.

[25] Butler, K. (2022) How Wellness Influencers Became Cheerleaders for Putin’s War. Mother Jones.

[26] MediaSmarts. (2023). “Young Canadians in a Wireless World, Phase IV: Digital Media Literacy and Digital Citizenship.” MediaSmarts. Ottawa.

[27] Levine, D. M., & Mehrotra, A. (2021). Assessment of diagnosis and triage in validated case vignettes among nonphysicians before and after internet search. JAMA network open, 4(3), e213287-e213287.

[28] MediaSmarts. (2023). “Young Canadians in a Wireless World, Phase IV: Digital Media Literacy and Digital Citizenship.” MediaSmarts. Ottawa.

[29] Zenone, M., Snyder, J., van Schalkwyk, M. C., Bélisle-Pipon, J. C., Hartwell, G., Caulfield, T., & Maani, N. (2024). Alternative cancer clinics’ use of Google listings and reviews to mislead potential patients. British Journal of Cancer.

[30] Haider, J., & Rödl, M. (2023). Google Search and the creation of ignorance: The case of the climate crisis. Big Data & Society, 10(1), 20539517231158997.

[31] Haider, J., & Rödl, M. (2023). Google Search and the creation of ignorance: The case of the climate crisis. Big Data & Society, 10(1), 20539517231158997.

[32] Gregory, J. (2019) Health websites are notoriously misleading. So we rated their reliability. Statnews.

[33] Serrano, J. (2022) YouTube Is Making It Easier to Tell the Difference Between Real Doctors and Quacks.

[34] Ruvic, D. (2023) YouTube making money off new breed of climate denial, monitoring group says. Reuters.

[35] Weill, K. (2022) Off the Edge: Flat Earthers, Conspiracy Culture, and Why People Will Believe Anything. Algonquin Books.