The Signpost

Opinion

Wikipedia's war against scientific disinformation

Mary Mark Ockerbloom was a Wikipedian-in-residence at the Science History Institute from 2013 to 2020.

The playbook for undermining scientific expertise was created in the 1920s. Perhaps surprisingly, its creators were scientists. Wikipedia is one battleground in the hundred years' war against scientific disinformation. The tactics developed then are still used today. We need to recognize those tactics so that we can counter them; we also need more scientifically savvy allies to help us get it right.

The practice of scientific disinformation

Tetraethyllead (TEL) at the point of sale
Charles Kettering, on a Time cover, 1933

On December 3, 1921, Thomas Midgley Jr. and Charles Kettering, scientists at General Motors, discovered that adding tetraethyllead (TEL) to gasoline improved a car's performance.[1] As holders of the patent on the new technology and senior figures in the companies producing and marketing TEL, Kettering and Midgley had everything to gain by promoting TEL's use rather than developing non-patentable alternatives.[1][2]

TEL contains lead: an odorless, colorless, tasteless, poisonous element. Once present in a person, lead does not degrade; it accumulates.[3][4] Kettering and Midgley were warned of the dangers of lead and TEL by other scientists. Erik Krause of the Institute of Technology in Potsdam, Germany, who had studied TEL extensively, described it as "a creeping and malicious poison".[2] Internal confidential reports document that the companies involved knew TEL was "extremely hazardous",[1] while their safety precautions were "grossly inadequate".[1]

"Loony gas", New York Journal, October 31, 1924

By 1923, cases of violent madness and death were being reported among workers at TEL plants. At least 17 persons died, and hundreds more suffered neurological damage.[5][6] Postmortems confirmed the cause as tetraethyllead.[7] Coworkers referred to the fumes they breathed as "loony gas" and to a building they worked in as the "House of Butterflies", because workers had hallucinations of insects.[6]

Tetraethyllead

The newspapers, the public, and the government began to take notice.[5] Midgley himself was treated for lead poisoning. Despite this, he protested that TEL was perfectly safe, even washing his hands in it in front of reporters on October 30, 1924,[1][3] the same day the New York City Board of Health banned the sale of TEL-enhanced gasoline.[1][3]

As concerns about TEL grew, Kettering took protective measures for the company, but not for its workers. He hired Robert A. Kehoe, a 1920 medical school graduate, as an in-house "medical expert" whose job was to prove that leaded gasoline did not harm humans. An early study illustrates the methodological problems in his work: Kehoe reported that workers handling TEL had levels of lead exposure no higher than a control group, yet that control group was composed of workers at the same plant. He suggested their lead levels were "normal", equating normal with harmless.[8]

Alice Hamilton was among those who criticized Kehoe's research. A pioneer in industrial toxicology and occupational health, Hamilton was America's leading authority on lead poisoning, with decades of experience in public health research and policy-making.[9] On May 20, 1925, Hamilton and other public health advocates from Harvard, Yale and Columbia faced off against Kettering at a conference called by the Surgeon General of the Public Health Service to consider the use of lead in gasoline. Would lead in gasoline be released into the air when the fuel was burned, putting the general public at risk? Hamilton warned that lead posed environmental as well as occupational dangers: "You may control conditions within a factory, but how are you going to control the whole country?"[10] Kettering argued that TEL was the only way to improve gasoline, and asserted that no one had proved that leaded gasoline was harmful.[1][10]

The conference ended with the formation of a committee to further investigate the possible effects of leaded gasoline. The committee could have taken responsibility for further independent research, but public health advocates lacked funding, and the government wasn't willing to provide it. In a classic case of setting the fox to watch the henhouse, the committee chose to rely on industry to monitor itself, ignoring the inherent conflict of interest. Not surprisingly, Robert Kehoe reported that the industry-funded research showed "no evidence of immediate danger to the public health."[10][11]

Hamilton reportedly told Kettering to his face that he was "nothing but a murderer".[2][1] Kettering went on to form the Kettering Foundation for public policy-related research. He and Midgley developed Freon, which eventually became another environmentally disastrous product.[2] Kehoe became the gasoline industry's chief spokesperson, largely controlling the scientific and public narrative around leaded gasoline for the next fifty years.[11][8]

Risk, responsibility and the undermining of science

Public health advocates and corporate representatives took different approaches to risk and responsibility in the debate over leaded gasoline. Public health advocates followed a precautionary principle, arguing that unless scientists could demonstrate that something was safe, it should not be used. The Kehoe paradigm placed the burden of proof on the challenger: rather than demonstrating that their product was safe, the companies demanded that critics prove it was harmful. But Kehoe's rule is logically flawed – absence of evidence of risk is not evidence of the absence of risk. Regardless, the Kehoe paradigm became extremely influential in the United States.[11][8][12]

This not only set scientist against scientist; it made the undermining of science tactically useful. Experimental research is a method of testing ideas and assessing the likelihood that they are correct based on observable data. By its very nature, the result of a scientific study is not a 100% yes-or-no answer. Disinformation exploits this lack of absolute certainty by treating ordinary scientific uncertainty as if it were doubt about the conclusion itself. If evidence is presented to challenge a position, it is suggested that the proof is not sufficiently compelling, that doubt still remains, that more research must be done, and that no responsibility need be taken in the meantime. By repeatedly raising the issue of doubt, companies and scientists use "cascading uncertainty" to manipulate public opinion and protect their own interests.[13][14][15][12]

In the 1960s, the geochemist Clair Patterson developed sophisticated monitoring equipment and methods to measure the history of the earth's chemical composition. Patterson was initially uninterested in lead, but his research provided compelling evidence of its extraordinary increase in the planet's recent history, and it broke the industry's narrative around leaded gasoline. The industry spent almost 25 years trying to discredit him and his work through professional, personal, and public attacks. Nonetheless Patterson did what Kettering and Kehoe had challenged public health officials to do in 1925 – demonstrate the impact of TEL from gasoline.[11][8][12]

By then, lead contamination from gasoline (and from paint) was found worldwide, not just in North America. The World Health Organization (WHO) considers lead to be one of the top ten chemicals posing a major public health risk, with immense personal, social and economic costs. One of the "key facts" it lists is that "There is no level of exposure to lead that is known to be without harmful effects."[4]

Kettering's playbook has been replicated repeatedly. Companies and their researchers have argued that smoking, CFCs, opioids, vaping, fossil fuels and climate change (to name only a few) aren't really dangers; that concerns about public safety and environmental damage have not been sufficiently proven and so do not require action; and that raising a shadow of doubt is enough to challenge widespread scientific consensus. I strongly recommend reading Merchants of Doubt (Oreskes & Conway, 2010),[13] as well as David Michaels' Doubt Is Their Product (2008)[14] and The Triumph of Doubt (2020).[15]

At its most extreme, we face an attitude that treats scientific issues as simply matters of opinion and belief, independent of the underlying scientific evidence and informed consensus. The dangers of this are apparent: we are in the midst of a pandemic that some refuse to believe exists. Anti-mask and anti-vaccination propaganda are recent fronts of scientific disinformation, putting us all at risk.[16]

Defeating scientific disinformation on Wikipedia

Companies and individuals who try to whitewash their own Wikipedia pages or promote pseudoscience cures are fairly obvious examples of conflict of interest and promotional editing. I suspect most Wikipedians both recognize and know how to deal with such situations. But there are patterns of disinformation, manipulation, and the undermining of science that are harder to spot and address. We need to watch for patterns of bias that go far beyond the articles about a specific company or product.

Tactics in the disinformation playbook include:

  • Ignore warnings and deny reports of possible safety issues,
  • Accuse critics and the media of sensationalism,
  • Fund "experts" to conduct research that can only support your position,
  • Control the narrative and reframe issues to avoid consideration of risk,
  • Use any level of doubt to claim there is not enough evidence of risk,
  • Ignore scientific consensus,
  • Personally attack those who challenge you,
  • Avoid acknowledgement of responsibility and delay taking actions that could address issues.

Do funders and researchers have a vested interest in a particular outcome? Too often, companies are funding the research that claims their products are safe or effective. Scientific articles will generally indicate who supported the research. Check out the funders, and watch out for industry organizations, think tanks and researchers with a history of anti-regulatory bias. Look for independent evidence-based sources from credible groups like Consumer Reports.

In striving for balance, we need to recognize the importance of counter-narratives around public health. Actively look for scientific work that raises issues of public health and follows a precautionary principle. These are important concerns, and addressing them is part of presenting a balanced picture. Be wary of how the burden of proof is assigned. Are different expectations being applied to the research of proponents and critics of an idea? Keep in mind that industry pours huge amounts of money into its research while public health receives far less.

In a war of supposedly scientific claims and counterclaims, are proponents of a particular position using the technique of raising doubt? Is research critical of a position repeatedly minimized or dismissed on the grounds that it fails to meet some increasingly strict or absolute standard of proof? Are personal attacks and appeals to public opinion being used to displace scientific evidence? These tactics can be particularly insidious in Wikipedia articles, because we are encouraged to present all sides of an issue. As editors we need to remember that writing in a fair and balanced way doesn't mean that all ideas have to be given equal weight. Present them in proportion to their importance.

Beware of cherry picking: it is encouraged and amplified by social media, and Wikipedia articles can be particularly susceptible to the selective presentation of information. Look for the weight of scientific consensus on an issue. If the vast majority of scientists worldwide accept that climate change is real (as they do), you can be definite about it.

Wikipedia needs more scientific expertise

The tactics of disinformation add to the already considerable difficulty of writing about science. Let's be clear – writing about science is hard! Writing about science for Wikipedia's readers – much of the world's population – is an even more challenging task. First you must wrap your own head around an area of expertise that may be highly specialized, and then you must communicate your understanding to readers who may lack a scientific background or do not share your frame of reference. As the Wikipedian-in-residence at the Science History Institute in Philadelphia for seven years, I rarely read a science-related article on Wikipedia that did not need significant improvement. Working on biographies of scientists and other pages with historical content, I often asked myself, "Where is the science?"

When it comes to science, Wikipedia needs all the help it can get. There are scientists who have manipulated and undermined scientific information, as I describe above, but there are many more scientists whose credentials, expertise and research are rock-solid. We need to find ways to engage with those scientists and leverage their knowledge. Doing that was the most challenging part of my job as a Wikipedian-in-residence.

We need more editors with scientific expertise. We need editors with an awareness of how scientific information can be manipulated. We need science communicators who can evaluate scientific materials and write about science in a way that is comprehensible to nonscientists. We need as many people as possible keeping a careful eye on scientific information on Wikipedia to prevent the kinds of manipulation we know occur. I often worry that a collection of volunteer editors hasn't a hope of keeping up. We need allies.

In the ongoing war against disinformation, Wikipedia needs scientific expertise – and science needs Wikipedians to get things right.

References

  1. ^ a b c d e f g h Kovarik, William (2006). "Ethyl-leaded gasoline: how a classic occupational disease became an international public health disaster" (PDF). International Journal of Occupational and Environmental Health. 11 (4): 384–397. doi:10.1179/oeh.2005.11.4.384.
  2. ^ a b c d Dauvergne, Peter (2010). The Shadows of Consumption: Consequences for the Global Environment. Cambridge, Mass.: MIT Press. ISBN 9780262260572.
  3. ^ a b c Kitman, Jamie Lincoln (March 20, 2000). "The Secret History of Lead". The Nation.
  4. ^ a b "Lead poisoning and health". World Health Organization. 23 August 2019.
  5. ^ a b Kovarik, Bill. "The Ethyl conflict & the media: Paper to the AEJMC, April 1994". Prof. Kovarik Utinam patribus nostrius digni simus. Retrieved 28 January 2021.
  6. ^ a b Blum, Deborah (2011). The Poisoner's Handbook: Murder and the Birth of Forensic Medicine in Jazz Age New York. New York: Penguin.
  7. ^ Norris, Charles; Gettler, Alexander O. (12 September 1925). "Poisoning by tetra-ethyl lead: postmortem and chemical findings". Journal of the American Medical Association. 85 (11): 818. doi:10.1001/jama.1925.02670110032009.
  8. ^ a b c d Needleman, Herbert L. (1998). "Clair Patterson and Robert Kehoe: two views of lead toxicity". Environmental Research. 78 (2): 79–85. doi:10.1006/enrs.1997.3807. PMID 9719611.
  9. ^ Sicherman, Barbara (2003). Alice Hamilton: A Life in Letters. Urbana: University of Illinois Press. ISBN 0-252-07152-2.
  10. ^ a b c Rosner, D; Markowitz, G (1985). "A 'gift of God'?: The public health controversy over leaded gasoline during the 1920s". American Journal of Public Health. 75 (4): 344–52. doi:10.2105/ajph.75.4.344. PMC 1646253. PMID 2579591.
  11. ^ a b c d Nriagu, J. O. (August 1998). "Clair Patterson and Robert Kehoe's paradigm of "show me the data" on environmental lead poisoning". Environmental Research. 78 (2): 71–8. doi:10.1006/enrs.1997.3808. PMID 9719610.
  12. ^ a b c Percival, Robert V. (2006). "Who's Afraid of the Precautionary Principle?". Pace Environmental Law Review. 23 (1).
  13. ^ a b Oreskes, Naomi; Conway, Erik M. (2010). Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming (1st U.S. ed.). New York: Bloomsbury Press. ISBN 9781596916104.
  14. ^ a b Michaels, David (2008). Doubt Is Their Product: How Industry's Assault on Science Threatens Your Health. Oxford; New York: Oxford University Press. ISBN 978-0-19-530067-3.
  15. ^ a b Michaels, David (2020). The Triumph of Doubt: Dark Money and the Science of Deception. Oxford; New York: Oxford University Press. ISBN 978-0-19-092266-5.
  16. ^ Roozenbeek, Jon; Schneider, Claudia R.; Dryhurst, Sarah; Kerr, John; Freeman, Alexandra L. J.; Recchia, Gabriel; van der Bles, Anne Marthe; van der Linden, Sander (October 2020). "Susceptibility to misinformation about COVID-19 around the world". Royal Society Open Science. 7 (10): 201199. doi:10.1098/rsos.201199.

Discuss this story

These comments are automatically transcluded from this article's talk page.
  • Thank you, No Swan So Fine. It's amazing how much information -- and how much history -- there can be behind something you take for granted. A while back I rewrote the baking powder article. I had had no idea that there were different types of it, much less the conflicts that were part of its history. Mary Mark Ockerbloom (talk) 04:32, 1 February 2021 (UTC)[reply]
  • As a biotechnology dropout currently studying business, I've recently come into conflict over the ethics of products and corporate actions; one of my recent assignments involved mediating between companies as Olympic sponsors and environmentalist and health groups, citing the London 2012 Olympics as background. I still haven't done this assignment -Gouleg🛋️ (StalkHound) 16:03, 2 February 2021 (UTC)[reply]
  • We need more editors with scientific expertise. Why should they join? Fighting misinformation in science on here can be an utter time sink. Such scientists probably don’t have the time or energy to engage in lengthy circular discussions on poor sourcing attempting to introduce disinformation, unlike people whose sole agenda is disinformation. And we don’t topic ban in the name of ‘free speech’ / ‘legitimate content debate’ etc. COVID misinformation is rife, at talk:Ivermectin for example, and it took too long for topic bans there. It’s just boring. There’s legitimate content debate, and there’s misrepresenting sources and quoting crappy sources to push a pseudoscientific POV. We should start treating editors’ time with respect if we want editors with scientific expertise to contribute. ProcrastinatingReader (talk) 21:13, 31 January 2021 (UTC)[reply]
    • There's likely a couple things to do, e.g. realizing that POV pushing occurs in science articles too and more strictly enforcing our rules.
    • We can invite in scientists (or their grad students or even Wikipedians-in-residence) or bring them into edit-a-thons or other methods more familiar to them.
    • What we can't do, however, is tell them that they'll be immune from the usual back and forth of Wikipedia editing.
    • Perhaps set up a freely licensed "teaching journal" where they can get some academic credit towards tenure in a fairly normal (but speedier) peer review process, but that material can then be imported into Wikipedia more easily.
  • I'm not sure any of the above are *the answer*, but I certainly hope that nobody is giving up! Smallbones(smalltalk) 01:22, 1 February 2021 (UTC)[reply]
    • It is a huge challenge, and I thank you for your hard work. Groups like WikiEdu do good work bringing in students and teaching them to write. I'd love to see more outreach with working scientists and scientific organizations. My personal opinion is that we need to look for ways to collaborate with scientists and get them to share their expertise without requiring that they jump through all the bells and whistles of becoming full-fledged editors. We need to find ways to meet them halfway. Current talk pages are fine if you already have expertise with Wikipedia, but not if you're a scientist reading an article and going "but that's wrong". Mind you, would a scientist who knows it's wrong be reading the article? Maybe someday we'll have an interface that will make it easy for readers to highlight what they think is wrong or confusing on a Wikipedia page and flag it or tell us more about it. Then we could ask a group of scientists to review articles for us, in the same way I've sometimes asked an expert to read a Wikipedia page printout and mark what's dodgy with a highlighter pen. Knowing where the problems are goes a long way to getting them fixed; in a two-stage process experienced editors could vet those reports. Applying more of a "bug-reporting" mentality to content is just one idea; I'd love to hear what other people would suggest. Mary Mark Ockerbloom (talk) 04:32, 1 February 2021 (UTC)[reply]
      • OK Mary Mark Ockerbloom, I'll bite on your bait and list some of the above ideas, and a few more, just to see if something strikes a chord with other people:
        • encourage WikiEdu to get more science students (and teachers) involved with them
        • get a freely licensed journal where scientists can "quickly" publish peer-reviewed Wiki-relevant material.
        • similarly, help set up some freely licensed science-wikis where scientists could limit participants to, say, PhDs, so they could avoid some of the trolls around here.
        • There are actually lots of advanced degree holders around here who have been very successful editors. Jesswade88 comes to mind immediately. Just ask them in a survey what works and what hasn't.
        • There are lots of academics in some of the softer sciences who love to publish about Wikipedia or use Wikipedia data (see any Recent research column in The Signpost). Get some of them to write in Wikipedia about Wikipedia, or to get the "hard scientists" you seem to be talking about involved in some of the things that they like to do. Actually I just ran into a chemist this month who published a paper in Nature about Wikipedia. It's not like they are not interested.
        • And, of course, ask anybody reading this for their ideas. Smallbones(smalltalk) 18:46, 1 February 2021 (UTC)[reply]
  • Good piece. Thanks! --Piotr Konieczny aka Prokonsul Piotrus| reply here 09:27, 1 February 2021 (UTC)[reply]
  • My personal experience in contributing to scientific debate on Wikipedia, which is documented on talk pages but which I cannot easily find now, is from several years ago, when I saw a hot debate between two users discussing what the energy flux is and some issues related to it. I am a full professor in physics at a top-100 university in the world. I took an undergraduate text which was on my table and added a citation from the text, which answered precisely the question being debated. Both sides dismissed the citation, saying that it did not correspond to the current scientific consensus or whatever formulation they chose, I do not remember, and continued fighting. Then I thought "fuck you" and unwatched the page. I have to fight enough for my own research results and funding in the real world, and I do not have the time and energy to debate with ignorant users without academic credentials about issues which are part of a standard undergrad curriculum. I believe one of the users was later dragged to ANI and either blocked or topic-banned; the other one is probably still there.--Ymblanter (talk) 10:27, 1 February 2021 (UTC)[reply]
Yes, kind of.--Ymblanter (talk) 06:35, 2 February 2021 (UTC)[reply]
  • Yes, I agree this is a good article, and points out some of the tension that exists in science, science as servant, science as master. Are we suppressing "fringe theories" or are we ignoring disinterested science? I must admit that the "fringe" stuff is often so bleeding obvious that there is no problem refuting and dismissing it. The history of hand-washing in medicine and continental drift in geology and biogeography are both depressing, however. My 2 bob: archaeologist, academic in Geography, Environmental Science, Biomedical Science and Medicine. Scientists are experts, but usually in the field that they are expert in. In other fields they are not. Take medical science, what they do they do well, but they are driven by funding and by medical culture (a powerful force in medicine). Sociology, history &c., they are quite often rubbish at. Look at the Kombucha article, the history there would fail a first year arts essay, relying on other medical journals quoting puff pieces written by kombucha sellers. But it must be true, it is written in a first-class refereed journal. No, medical scientists often do history as well as historians do medical science. Science is not the only discipline with the problems outlined in the article above. But all too often we seem to equate "scientist" with "rational thinker who can solve anything". No, scientists in the real world do what they do in their field of expertise usually quite well, but asking them to solve problems in the political or social or cultural or other complex spheres is not a "golden bullet". So, Wikipedia, disinterested, scientifically literate outsiders have been a powerful force in communicating science. Stephen Jay Gould, Tim Flannery and others are good scientists, but they also stepped out of their narrow field of expertise to summarize a broad range of disciplines. Many had/have quibbles with their work, hey, that's science. But you do not necessarily need to be a practising scientist to evaluate and communicate science, there are many science journalists who convey reasonable information strongly. Sometimes it helps to have some distance. An encyclopaedia is not a science journal. Does WP scare scientists away? Yes, if they are people who do not accept criticism and/or simplification, or notification of breaching WP protocols. I'm sorry, the previous response above, professor of physics, quotes something about energy flux from an undergraduate textbook, secondary source, a few years old, obviously not his field of expertise otherwise quote the definitive references. He gets upset when he is told that it does not represent present consensus (it may not have), the textbook's view may be under pressure from recent research, &c. He was not sufficiently disinterested in the topic to discuss calmly the options. I'm not saying that there aren't people who are sure that dragons are involved somehow, but it's amazing what a quick back and forth can reveal. Are there fewer scientists involved in editing WP than there are out there in the real world? Is the science in WP poorly done generally? Sorry, as a consequence of science and academia I ask, what does the data tell us? Thank you to Mary Mark Ockerbloom for the article, it is good, thank you to Ymblanter and all others who are endeavouring to make WP a better information communication source. Brunswicknic (talk) 12:22, 1 February 2021 (UTC)[reply]
    For the record, what I was referring to is my field of expertise - well, this particular question was resolved a hundred years ago, and there is no current research going on, but the ideas are still being used in my field of research.--Ymblanter (talk) 12:53, 1 February 2021 (UTC)[reply]
  • ps. Lead is increasingly being used as a dating tool for the recent past: not only do lead radioisotopes lend themselves to recent timescales, but the sudden increase of lead in urban deposits can mark the introduction of lead into petrol, into the atmosphere, and into sediments. Brunswicknic (talk) 12:22, 1 February 2021 (UTC)[reply]
  • I'd say that Wikipedia needs more editors with expertise in countless fields. My college degree is in English literature, & I dread reading any article on one of the standard texts of the Western Canon. They are often inadequate because students are taught to respond to literature, not what are the important commentaries on a given work. (Both approaches are defensible.) I learned about what tools exists to find academic papers on literature -- such as the MLA International Bibliography -- as an afterthought by a few of my professors. I could make similar remarks about other areas I've contributed to, such as Ancient & Classical history.
    To be fair, this is a symptom of our success: back in the Stone Age of Wikipedia, articles were not very good, often written on the fly; over the years, we've gradually raised the level of quality, insisting on citations & that articles better reflect the state of knowledge. With over 6 million articles (half of which are stubs), it is expected that many will not match our expectations. But we are still raising the level of quality. -- llywrch (talk) 07:53, 2 February 2021 (UTC)[reply]
  • Llywrch, I agree with your first statement -- more expertise is needed across the board -- but I also want to emphasize that scientific disinformation has potentially harmful consequences that are serious for everyone. Unlike Jasper Fforde's alternate Nextian England, where ProCath terrorists wreak havoc in support of the young Catherine, no one is likely to die if we mess up a literary detail. This comment is not meant to disparage your field, btw. Mary Mark Ockerbloom (talk) 15:28, 5 February 2021 (UTC)[reply]
  • Unfortunately, your comment is harmful. There is a movement in higher education to cut back offerings in the Liberal Arts in order to expand those in STEM; while this is mostly happening in the US, I've seen signs of this in the UK. The largest target for these cutbacks is in the Classics, but I have seen reports that offerings in such arguably practical fields as French are being slashed back or eliminated. All because of thinking like yours -- "scientific disinformation has potentially harmful consequences ... [while] no one is likely to die if we mess up a literary detail" -- ignoring the fact that many wars have been fought over faulty explications of texts. In short, education is focusing more on how to do things, rather than on what things, or why. -- llywrch (talk) 22:53, 5 February 2021 (UTC)[reply]
















