Health and science misinformation

Though health and science topics are subject to the same kinds of misinformation found everywhere, there are two types that are particularly common in these fields: denialism and snake oil.

Denialism

Unlike many of the topics discussed online, such as politics or the relative merits of different science-fiction franchises, most scientific fields have a consensus about what the current evidence supports. As a result, much of the misinformation on health and science topics is aimed at denying that this consensus exists, or at creating the impression that the topic is more contested than it genuinely is. (The reverse of this, false consensus, was discussed in the previous section; it uses many of the same persuasive tools as denialism.)

There are four elements commonly found in denialist misinformation:[1]

Conspiracy theory: One reason why denialism can be persuasive is that it has the same appeal as conspiracy theories, suggesting that powerful people are lying to you. There are also some valid historical reasons to be skeptical of the groups that are painted as villains in denialist narratives: some doctors have been suborned by powerful industries,[2] governments have conducted medical experiments on people without their consent,[3] and pharmaceutical companies have pushed their products without regard for anything but their own profits.[4] But denialists turn this history upside down by painting scientists as the villains and painting corporations – such as those that profit from denying the consensus on climate change – as their victims.

Cherry-picking: Denialists know that few of us are qualified to judge scientific evidence and that, given the opportunity, we will choose to believe data that supports our worldview. They take advantage of this by seizing on any contrary or even equivocal evidence and presenting it as a challenge to the scientific consensus: for example, tobacco companies paid for research that uncovered genuine links between lung cancer and a variety of factors, but promoted those as evidence against the much stronger link between lung cancer and smoking. Another method is to try to represent debate about the details of a phenomenon – such as the ongoing discussion of how and where climate change will have the most impact, and how quickly – as debate about the overall consensus.[5]

Double standards: The research funded by the tobacco companies is also a good example of the denialist tactic of subjecting evidence that supports the scientific consensus to a higher standard than evidence that challenges it. This is frequently seen in anti-vaccination denialism, where a small number of studies that imply a link between vaccination and conditions such as autism are treated as having the same weight as the many studies that show no such link.

Tame experts: Another way that denialists make the consensus seem less settled is to amplify the voices of the few scientists who are challenging it. If no experts who actually work in the field can be found, anyone with a doctorate will do:[6] for example, Dr. Edward Group, a chiropractor and naturopath, has given his opinion on topics that range from vaccines to fluoride to schizophrenia.[7]

Much of the denialism encountered today comes from individuals or small groups posting online, as well as “astroturf” organizations that are funded by corporations or other powerful interests but pretend to be grassroots organizations.[8] But denialism can also find its way into legitimate news outlets in a variety of ways: when a public figure, such as a politician, endorses it; when journalists uncritically accept denialists’ claims about how much the consensus on a topic is under debate; and when news outlet owners or publishers deliberately promote denialism for ideological or commercial reasons. While this last example is surely the least frequent, it does happen: Jonathan Kay, former comments editor for the National Post, has said that while the paper offered responsible news coverage of climate change during his time there, on its editorial pages “when it came to global warming, all bets were off… This niche was simply too popular to give up.”[9]

Snake oil

Snake oil is health and science misinformation whose purpose is to get you to buy something right now. Websites, videos and social network accounts selling supplements, books and miracle cures abound, and can be both highly lucrative and highly dangerous: one frequently sold treatment is the so-called “Miracle Mineral Solution,” which is made of bleach and which parents are instructed to either have their children drink or deliver as enemas.[10]

A frequent tactic of snake-oil websites is giving away products and then making their money from consultations: for example, Jillian Mai Thi Epperly, whose thousands of Facebook followers believe a diet based on her fermented cabbage recipe will rid them of the “parasites” she claims cause all diseases, charges $70 an hour to provide guidance over the phone.[11] Similarly, Gwyneth Paltrow’s Goop website features an interview with a naturopath who claims that “you probably have a parasite,” which she recommends treating with a milk bath – alongside an ad for a $70 bottle of “Milk Protein + Gut Food.”[12]

The U.S. Food and Drug Administration lists six commonly found characteristics of snake-oil products or services:[13]

Cure-alls: Most medicines treat a very specific range of symptoms or conditions. Anything that claims to cure or treat a wide range of unrelated conditions is likely to be snake oil. (The cure may be implied instead of stated outright, since more sophisticated snake-oil peddlers are careful not to make specific medical claims to avoid legal trouble.)

Miracle cures: Similarly, claims that a treatment is a “new discovery” or “scientific breakthrough” are warning signs: not only are such breakthroughs extremely rare, but a genuine miracle cure would be announced in scientific journals, not on websites or in infomercials.

Conspiracy theories: These can range from a fairly narrow accusation that “they” are trying to cover up a new miracle cure, to darker narratives about government conspiracies to poison us with toxins that only the supplements the snake-oil peddler is selling can counteract.[14] This can be effective because there have been times when industry or government have poisoned people[15] – or allowed them to be poisoned through inaction.[16]

Personal testimonials: If a medicine is effective, there should be solid data showing that. A pitch that relies only on patients’ personal stories is almost certainly snake oil.

Quick fixes: Any pitch that emphasizes how quickly the treatment will work should make you cautious.

“All natural”: The word “natural” does not have any scientific meaning. Many substances found in nature are toxic, and many synthetic substances are safe when used as directed. As well, many of the synthesized medicines we use today, like aspirin, were originally derived from “natural” sources. Another word to watch out for is “quantum,” which is used irresponsibly to sell everything from self-help books[17] to exercise machines.[18]

It’s not unusual for health or science misinformation to fall under both denialism and snake oil: for example, Alex Jones, the radio host whose conspiracy theories include denialism about both vaccines and climate change,[19] makes most of his money selling supplements[20] that purportedly cure the very conditions his broadcasts have frightened his listeners into worrying about, such as fluoride poisoning (“curable” with a $14 anti-fluoride toothpaste) and male feminization (for which Jones recommends his Super Male Vitality serum).[21] Sometimes the commercial motivation isn’t immediately obvious: Andrew Wakefield, who put forth the supposed link between the MMR vaccine and autism, held a patent on a vaccine that could have replaced the MMR vaccine, so he stood to profit if the MMR vaccine were found to be dangerous – but this was not widely known when his paper was first published.[22]

[1] Diethelm, P., & McKee, M. (2008). Denialism: What is it and how should scientists respond? The European Journal of Public Health, 19(1), 2–4. doi:10.1093/eurpub/ckn139
[2] UC Davis Health, Department of Public Affairs and Marketing. (n.d.). Big tobacco and science: Uncovering the truth. Retrieved March 26, 2018, from http://www.ucdmc.ucdavis.edu/welcome/features/20071114_cardio-tobacco/
[3] Centers for Disease Control and Prevention. (n.d.). The Tuskegee Timeline. Retrieved March 26, 2018, from https://www.cdc.gov/tuskegee/timeline.htm/
[4] Keefe, P. R. (2017, October 30). The Family That Built an Empire of Pain. The New Yorker. Retrieved March 26, 2018, from https://www.newyorker.com/magazine/2017/10/30/the-family-that-built-an-empire-of-pain
[5] Kaiser, D., & Wasserman, L. (2016, December 8). The Rockefeller Family Fund vs. Exxon. The New York Review of Books. Retrieved March 26, 2018, from http://www.nybooks.com/articles/2016/12/08/the-rockefeller-family-fund-vs-exxon/#fn-23
[6] Ellis, E. G. (2017, May 31). To Make Your Conspiracy Theory Legit, Just Find an ‘Expert.’ Wired. Retrieved March 26, 2018, from https://www.wired.com/2017/05/conspiracy-theory-experts/
[7] Merlan, A. (2017, May 01). Alex Jones and His Pill-Pushing Chiropractor Are Here to Save Your Colon From the Globalists. Retrieved March 26, 2018, from https://jezebel.com/alex-jones-and-his-pill-pushing-chiropractor-are-here-t-1794383993
[8] Yoshida, K. S. (2011, July 18). Astroturfing works, and it's a major challenge to climate change. Retrieved March 26, 2018, from https://arstechnica.com/science/2011/07/astroturfing-a-major-challenge-to-climate-change/
[9] Kay, J. (2017, May 1). How Climate Change Denial Set the Stage for Fake News. The Walrus. Retrieved March 26, 2018, from https://thewalrus.ca/how-climate-change-denial-set-the-stage-for-fake-news/
[10] Chivers, T. (2017, August 28). How The Parents Of Autistic Children Are Being Targeted By Misinformation Online. Retrieved March 26, 2018, from https://www.buzzfeed.com/tomchivers/how-online-filter-bubbles-are-making-parents-of-autistic
[11] Subbaraman, N. (2018, March 17). Here's How A Cabbage Juice "Cult" With 58,000 Followers Set Off A Facebook War. Retrieved March 26, 2018, from https://www.buzzfeed.com/nidhisubbaraman/facebook-jilly-juice-cabbage-epperly?utm_term=.rnr0glnbD#.vsMExP8mR
[12] You Probably Have a Parasite-Here's What to Do About It. (2017, August 12). Retrieved March 26, 2018, from https://goop.com/wellness/detox/you-probably-have-a-parasite-heres-what-to-do-about-it/
[13] 6 Tip-offs to Rip-offs: Don't Fall for Health Fraud Scams. (n.d.). Retrieved March 26, 2018, from https://www.fda.gov/ForConsumers/ConsumerUpdates/ucm341344.htm
[14] Davis, S. (2016, July 22). How Alex Jones Uses Fear of the Government to Sell Diet Supplements. Retrieved March 26, 2018, from https://motherboard.vice.com/en_us/article/vv7zza/infowars-life
[15] Blum, D. (2013, January 05). Looney Gas and Lead Poisoning: A Short, Sad History. Retrieved March 26, 2018, from https://www.wired.com/2013/01/looney-gas-and-lead-poisoning-a-short-sad-history/
[16] Rosen, M. (2016, March 19). Lead's Health Effects Can Last a Lifetime: Tainted Water for Kids in Flint Could Mean Problems in Adulthood. Science News.
[17] Ehrenreich, B. (2010). Bright-sided: How the relentless promotion of positive thinking has undermined America. Detroit: Gale, Cengage Learning.
[18] Szeto, E. (2018, February 02). 'This is snake oil': Scientists don't buy balance-boosting clips featured on Dragons' Den. Retrieved March 26, 2018, from http://www.cbc.ca/news/business/quantum-wellness-clips-marketplace-1.4513382
[19] Blakeslee, N. (2010, March). Alex Jones Is About To Explode. Texas Monthly.
[20] Brown, S. (n.d.). Alex Jones's Infowars Media Empire Is Built to Sell Snake-Oil Diet Supplements. Retrieved March 26, 2018, from http://nymag.com/selectall/2017/05/how-does-alex-jones-make-money.html
[21] Warzel, C. (n.d.). Alex Jones Will Never Stop Being Alex Jones. Retrieved May 9, 2018, from https://www.buzzfeed.com/charliewarzel/alex-jones-will-never-stop-being-alex-jones?utm_term=.tpA64yeNX#.ooqJj2V8N
[22] No author listed. (2011, January 12). Vaccine study's author held related patent, medical journal reports. CNN.