Health and science misinformation

Though health and science topics are subject to the same kinds of misinformation found everywhere else, two types are particularly common in these fields: denialism and snake oil.

Denialism

Unlike many of the topics discussed online, such as politics or the relative merits of different science-fiction franchises, most scientific fields have a consensus on what the current evidence supports. As a result, much of the misinformation on health and science topics aims either to deny that this consensus exists or to create the impression that the topic is more in dispute than it genuinely is. (The reverse of this, false consensus, was referred to in the previous section; it uses many of the same persuasive tools as denialism.) The goal of denialism is to reduce people’s confidence in the scientific consensus and make them complacent about the risks of things like not vaccinating against preventable diseases or not acting to reduce and mitigate climate change.[1] This has been disturbingly successful: according to the US Centers for Disease Control and Prevention, routine vaccination rates for children were at a ten-year low in 2023.[2]

Four elements are particularly common in denialist misinformation:[3]

Conspiracy theory: One reason denialism can be persuasive is that it has the same appeal as conspiracy theories, suggesting that powerful people are lying to you. There are also some valid historical reasons to be skeptical of the groups painted as villains in denialist narratives: some doctors have been suborned by powerful industries,[4] governments have conducted medical experiments on people without their consent[5] and pharmaceutical companies have pushed their products without regard for anything but their own profits.[6] But denialists turn this history upside down, painting scientists as the villains and corporations – such as those that profit from denying the consensus on climate change – as their victims. As well, in every one of those cases it was journalists, whistleblowers or historians who uncovered the wrongdoing, not conspiracy theorists.[7] Most medical conspiracy theories are also improbably vast, requiring almost every government on Earth – as well as many international organizations – to be “in on it” to be true.

Cherry-picking: Denialists know that few of us are qualified to judge scientific evidence and that, given the opportunity, most of us will choose to believe data that supports our worldview. They take advantage of this by seizing on any contrary or even equivocal evidence and presenting it as a challenge to the scientific consensus. For example, tobacco companies paid for research that uncovered genuine links between lung cancer and a variety of other factors, then promoted those findings as evidence against the much stronger link between lung cancer and smoking. Another method is to represent debate about the details of a phenomenon – such as the ongoing discussion of how, where and how quickly climate change will have the most impact – as debate about the overall consensus.[8]

Double standards: The research funded by the tobacco companies is also a good example of the denialist tactic of subjecting evidence that supports the scientific consensus to a higher standard than evidence that challenges it. This is frequently seen in anti-vaccination denialism, where a small number of studies that imply a link between vaccination and conditions such as autism are treated as having the same weight as the many studies that show no such link.

A related technique is the moving target, where disproven claims are immediately replaced by slightly different ones. Once thimerosal was removed from vaccines, for instance, people who claimed a link between vaccination and autism began blaming aluminum instead.[9]

Tame experts: Another way denialists make the consensus seem less settled is by amplifying the voices of the few scientists who challenge it. If no experts who are actually in the field can be found, anyone with a doctorate will do:[10] for example, Dr. Edward Group, a chiropractor and naturopath, has given his opinion on topics ranging from vaccines to fluoride to schizophrenia.[11]

Much of the denialism encountered today comes from individuals or small groups posting online, as well as from “merchants of doubt” – corporations or other organizations that aim to cast doubt on things like the impact of fossil fuels on climate change and the connection between smoking and cancer[12] – and “astroturf” organizations that are funded by corporations or other powerful interests but pose as grassroots movements.[13] But denialism can also find its way into legitimate news outlets in a variety of ways: when a public figure, such as a politician, endorses it; when journalists uncritically accept denialists’ claims about how much the consensus on a topic is in dispute; and when news outlet owners or publishers deliberately promote denialism for ideological or commercial reasons. While this last is surely the least frequent, it does happen. Jonathan Kay, former comments editor for the National Post, has said that while the paper offered responsible news coverage of climate change during his time there, on the editorial pages, “when it came to global warming, all bets were off… This niche was simply too popular to give up.”[14]

Snake oil

Snake oil is health and science misinformation whose purpose is to get you to buy something right now. Websites, videos and social network accounts selling supplements, books and miracle cures abound, and can be both highly lucrative and highly dangerous. One frequently sold treatment is the so-called “Miracle Mineral Solution,” a bleach solution that parents are instructed to have their children either drink or take as an enema,[15] while many videos on TikTok promote dangerous “cures” for autism.[16] Algorithms can also promote snake oil, identifying people’s health issues and concerns based on their views, searches and purchases and then delivering “related” content that is neither safe nor effective: “once these trends get hold of someone’s feed, it can be difficult to shake them out.”[17]

A frequent tactic of snake-oil websites is giving away products and then making money from consultations. For example, Jillian Mai Thi Epperly, whose thousands of Facebook followers believe a diet of her fermented cabbage recipe will rid them of the “parasites” she claims cause all diseases, charges $70 an hour for guidance over the phone.[18] Similarly, Gwyneth Paltrow’s Goop website features an interview with a naturopath who claims that “you probably have a parasite,” which she recommends treating with a milk bath – alongside an ad for a $70 bottle of “Milk Protein + Gut Food.”[19]

The U.S. Food and Drug Administration lists six commonly found characteristics of snake-oil products or services:[20]

Cure-alls: Most genuine medicines treat a specific range of symptoms or conditions. Anything that claims to cure or treat a wide range of unrelated conditions is likely to be snake oil. The cure may be implied rather than stated outright, since more sophisticated snake-oil peddlers are careful to avoid the specific medical claims that could get them into legal trouble.

Miracle cures: Similarly, claims that a treatment is a “new discovery” or “scientific breakthrough” are warning signs. Not only are such breakthroughs extremely rare, but a genuine miracle cure would be announced in scientific journals, not on websites or in infomercials.

Conspiracy theories: These can range from a fairly narrow accusation that “they” are trying to cover up a new miracle cure to darker narratives about government conspiracies to poison us with toxins that only the snake-oil peddler’s supplements can cure.[21] Such claims can be persuasive because there genuinely have been times when industry or government poisoned people[22] – or allowed them to be poisoned through inaction[23] – and they can be especially effective on people who’ve had bad experiences with the medical system.[24]

Personal testimonials: If a medicine is effective, there should be solid data showing that. A pitch that relies only on patients’ personal stories is almost certainly snake oil.

Quick fixes: Any pitch that emphasizes how quickly the treatment will work should make you cautious.

“All natural”: The word “natural” doesn’t have any scientific meaning. Many substances found in nature are toxic and many synthetic substances are safe when used as directed. As well, many of the synthesized medicines we use today, like aspirin, were originally derived from “natural” sources. Another word to watch out for is “quantum,” which is used irresponsibly to sell everything from self-help books[25] to exercise machines.[26]

What denialism and snake oil have in common is a distrust of scientific and medical authorities.[27] It’s also not unusual for health or science misinformation to fall into both categories. For example, Alex Jones, the radio host whose conspiracy theories include denialism about both vaccines and climate change,[28] makes most of his money selling supplements[29] that purportedly cure the very conditions his broadcasts have frightened his listeners into worrying about: “The conspiracy theories are better seen as a marketing tool for his real products — InstaHard, BodEase, Diet Force and all manner of oils, tinctures and supplements.”[30]

As that example shows, snake oil frequently has an ideological element as well. Like anti-vaccination attitudes, products such as raw milk that were once apolitical or associated with the political left are now often sold to people on the right as a way of opposing government authority.[31] Increasingly, anti-science “wellness” and far-right messaging go hand in hand.[32]

“It’s incredible how networked these conspiracy theories are. Once you’re involved in one of the buckets of misinformation, your likelihood to believe in others is far, far amplified.” – Rachel Moran, Center for an Informed Public[33]

Sometimes the commercial motivation isn’t immediately obvious. For instance, Andrew Wakefield, who first promoted the supposed link between the MMR vaccine and autism, held a patent on a vaccine that could have replaced the MMR vaccine – meaning he stood to make money if the MMR vaccine was found to be dangerous – but this wasn’t widely known when his paper was first published.[34]


[1] Dubé, E., Bettinger, J. A., Fisher, W. A., Naus, M., Mahmud, S. M., & Hilderman, T. (2016). Improving Vaccination Rates: Vaccine acceptance, hesitancy and refusal in Canada: Challenges and potential approaches. Canada communicable disease report, 42(12), 246.

[2] Maxmen, A. (2024). How fringe anti-science views infiltrated mainstream politics, and what it means in 2024. MedicalXpress.

[3] Diethelm, P., & McKee, M. (2008). Denialism: What is it and how should scientists respond? The European Journal of Public Health, 19(1), 2-4. doi:10.1093/eurpub/ckn139

[4] UC Davis Health, Department of Public Affairs and Marketing. (n.d.). Big tobacco and science: Uncovering the truth. Retrieved March 26, 2018, from https://web.archive.org/web/20180716032837/ucdmc.ucdavis.edu/welcome/features/20071114_cardio-tobacco/

[5] Centers for Disease Control and Prevention. (n.d.). The Tuskegee Timeline. Retrieved March 26, 2018, from https://www.cdc.gov/tuskegee/about/timeline.html

[6] Keefe, P. R. (2017). The Family That Built an Empire of Pain. The New Yorker.

[7] Butter, M. (2020). The nature of conspiracy theories. John Wiley & Sons.

[8] Kaiser, D., & Wasserman, L. (2016). The Rockefeller Family Fund vs. Exxon. The New York Review of Books.

[9] Medsafe Pharmacovigilance Team. (2016). Risk Communication. New Zealand Medicines and Medical Devices Safety Authority.

[10] Ellis, E. G. (2017). To Make Your Conspiracy Theory Legit, Just Find an ‘Expert.’ Wired.

[11] Merlan, A. (2017). Alex Jones and His Pill-Pushing Chiropractor Are Here to Save Your Colon From the Globalists. Jezebel.

[12] Bergstrom, C. (2022). To Fight Misinformation, We Need to Teach That Science Is Dynamic. Scientific American.

[13] Yoshida, K. S. (2011). Astroturfing works, and it's a major challenge to climate change. Ars Technica.

[14] Kay, J. (2017). How Climate Change Denial Set the Stage for Fake News. The Walrus.

[15] Chivers, T. (2017). How The Parents Of Autistic Children Are Being Targeted By Misinformation Online. Buzzfeed.

[16] Aragon-Guevara, D., et al. (2023). The Reach and Accuracy of Information on Autism on TikTok. Journal of Autism and Developmental Disorders. doi:10.1007/s10803-023-06084-6

[17] Ohlheiser, A. W. (2024). How discredited health claims find a second life on TikTok. Vox.

[18] Subbaraman, N. (2018). Here’s How A Cabbage Juice “Cult” With 58,000 Followers Set Off A Facebook War. Buzzfeed.

[19] You Probably Have a Parasite – Here's What to Do About It. (2017). Goop.

[20] U.S. Food and Drug Administration. (n.d.). 6 Tip-offs to Rip-offs: Don't Fall for Health Fraud Scams. Retrieved March 26, 2018, from https://www.fda.gov/ForConsumers/ConsumerUpdates/ucm341344.htm

[21] Davis, S. (2016). How Alex Jones Uses Fear of the Government to Sell Diet Supplements. MotherBoard.

[22] Blum, D. (2013). Looney Gas and Lead Poisoning: A Short, Sad History. Wired.

[23] Rosen, M. (2016). Lead's Health Effects Can Last a Lifetime: Tainted Water for Kids in Flint Could Mean Problems in Adulthood. Science News.

[24] Christou-Ergos, M., Leask, J., & Wiley, K. E. (2022). How the experience of medical trauma shapes Australian non-vaccinating parents' vaccine refusal for their children: A qualitative exploration. SSM-Qualitative Research in Health, 2, 100143.

[25] Ehrenreich, B. (2010). Bright-sided: How the relentless promotion of positive thinking has undermined America. Detroit: Gale, Cengage Learning.

[26] Szeto, E. (2018). 'This is snake oil': Scientists don't buy balance-boosting clips featured on Dragons' Den. CBC News.

[27] Pickup, M., Stecuła, D., & Van der Linden, C. (2022). Who shares conspiracy theories and other misinformation about Covid-19 online: Survey evidence from five countries. Journal of Quantitative Description: Digital Media, 2.

[28] Blakeslee, N. (2010). Alex Jones Is About To Explode. Texas Monthly.

[29] Brown, S. (n.d.). Alex Jones's Infowars Media Empire Is Built to Sell Snake-Oil Diet Supplements. New York Magazine.

[30] Manjoo, F. (2022). Alex Jones and the Wellness Conspiracy Industrial Complex. The New York Times.

[31] Novicoff, M. (2024). How Raw Milk Went from a Whole Foods Staple to a Conservative Signal. Politico.

[32] Butler, K. (2022). How Wellness Influencers Became Cheerleaders for Putin’s War. Mother Jones.

[33] Butler, K. (2022). How Wellness Influencers Became Cheerleaders for Putin’s War. Mother Jones.

[34] Vaccine study's author held related patent, medical journal reports. (2011). CNN.