Upstream of Misinformation: Mark Ungrin on Scientific Errors, Institutional Policy, and Public Trust
- Heather McSharry, PhD

Summary

When people talk about misinformation in public health, the conversation usually focuses on social media or individuals spreading false ideas. But what if some of the most consequential misinformation begins inside the institutions we trust most?
In this episode of Infectious Dose, Heather speaks with biomedical researcher Dr. Mark Ungrin about how scientific ideas move through institutional systems — from research papers to clinical guidance to public messaging — and how structural failures in those systems can allow flawed assumptions to become entrenched policy.
Together they explore why institutional errors can be difficult to correct, how evidence-based medicine frameworks can sometimes reinforce authority rather than evidence, and what changes might help rebuild trust in scientific institutions before the next pandemic arrives.
Listen here or scroll down to read the full transcript.
Full Transcript
👉 Download the full episode transcript:
👉 Dr. Ungrin's Talk: Science, Pseudoscience and Public Policy https://whn.global/science-pseudoscience-and-public-policy/
Links to other resources are after the signature at the end of the post.
Episode Overview
In this episode of Infectious Dose, Heather speaks with biomedical researcher Dr. Mark Ungrin about how flawed ideas can emerge inside scientific and medical institutions themselves — and how those ideas sometimes become embedded in policy before they are corrected.
Dr. Ungrin’s work examines the systems that translate scientific evidence into public health guidance. During the COVID-19 pandemic, he became increasingly focused on how institutional structures, authority hierarchies, and evidence-evaluation frameworks can allow errors to persist even after contradictory evidence emerges.
This conversation explores how misinformation can move through scientific systems, why institutional correction is often slow, and what changes might improve public trust and scientific accountability before the next pandemic arrives.
Episode Outline
Introduction — Where Misinformation Really Begins
Most discussions about misinformation focus on social media or individuals misunderstanding science. But what if some of the most influential misinformation begins upstream, inside scientific publications, policy frameworks, and institutional decision-making systems? Dr. Ungrin argues that understanding misinformation requires examining how knowledge moves through institutions, not just how it spreads among the public.
1. Why “Human Error” Is Often the Wrong Explanation
Public health failures are frequently attributed to human error. Dr. Ungrin argues this framing misses the real problem: system design. When systems predictably fail in response to known human mistakes, the issue isn’t individual behavior — it’s structural failure.
Key takeaway:
Systems that collapse under ordinary human error are poorly designed systems.
2. How Institutional Failures Can Generate Misinformation
One of the most powerful examples comes from the origins of anti-vaccine misinformation. Andrew Wakefield's fraudulent 1998 paper linking the MMR vaccine to autism was published in The Lancet and was not fully retracted until 2010, more than a decade later. The investigation that exposed the fraud came not from within the medical system but from investigative journalism, notably the work of Brian Deer.
Key takeaway:
When institutions fail to correct flawed research quickly and transparently, misinformation can originate from sources that appear authoritative.
3. Why Institutionalized Errors Are So Hard to Correct
Once ideas become embedded in policy or clinical guidance, reversing them becomes difficult. Several structural forces contribute:
Hierarchical medical culture
Professional incentives that discourage admitting error
Liability concerns
Institutional authority structures
Evidence hierarchies that privilege certain types of studies over others
These dynamics can create what Dr. Ungrin calls a “cognitive lobster trap” — a system that makes it difficult for institutions to back away from incorrect positions.
Key takeaway:
Institutional authority structures can unintentionally lock systems into outdated or incorrect conclusions.
4. Case Study: Airborne Transmission During COVID-19
One of Dr. Ungrin’s central examples involves early debates about whether SARS-CoV-2 was transmitted through aerosols. He argues that outdated assumptions about droplet transmission shaped early public health guidance, despite longstanding aerosol physics research contradicting those assumptions. The result: delayed recognition of airborne transmission and delayed adoption of respiratory protection strategies.
Key takeaway:
Scientific consensus can lag behind established physical evidence when institutional frameworks resist updating long-standing assumptions.
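The physics behind this example can be illustrated with a back-of-the-envelope calculation. Below is a minimal sketch using Stokes' law for the terminal settling velocity of a small sphere in still air; the particle sizes, emission height, and air properties are illustrative assumptions, not figures from the episode.

```python
# Back-of-the-envelope settling times for respiratory particles in still air,
# using Stokes' law: v = rho_p * d^2 * g / (18 * mu).
# All inputs below are illustrative assumptions, not values from the episode.

G = 9.81           # gravitational acceleration, m/s^2
MU_AIR = 1.8e-5    # dynamic viscosity of air, Pa*s (approx. 20 C)
RHO_P = 1000.0     # particle density, kg/m^3 (approximated as water)
FALL_HEIGHT = 1.5  # assumed emission height above the floor, m

def settling_velocity(diameter_m: float) -> float:
    """Terminal settling velocity (m/s) from Stokes' law.

    Strictly valid only at low Reynolds number, so it is least accurate
    for the largest droplets, but the orders of magnitude are the point.
    """
    return RHO_P * diameter_m**2 * G / (18 * MU_AIR)

for d_um in (100, 10, 5, 1):
    v = settling_velocity(d_um * 1e-6)
    t = FALL_HEIGHT / v  # seconds to fall to the floor in still air
    print(f"{d_um:>4} um: v = {v:.2e} m/s, falls {FALL_HEIGHT} m in ~{t / 60:.1f} min")
```

On these rough numbers, a 100 µm droplet reaches the floor in a few seconds, while particles of 5 µm and below stay airborne for tens of minutes to hours, and ordinary air currents keep them aloft even longer. This is the kind of long-established physical evidence that, as the episode argues, was slow to be reflected in official guidance.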
5. Evidence-Based Medicine and the Limits of Evidence Hierarchies
Evidence-based medicine (EBM) was originally intended to help clinicians interpret research. But Dr. Ungrin argues that some evidence hierarchies now function more like authority systems than scientific frameworks. Certain forms of evidence may be excluded from consideration if they do not fit predefined methodological categories — even when those sources provide important insights.
Key takeaway:
Decision frameworks can unintentionally exclude critical scientific evidence.
6. The Accountability Problem
When public health systems fail, consequences are measured in lives lost and long-term disability. Yet accountability is often rare. Dr. Ungrin suggests several structural reasons:
Institutional power and influence
Legal liability concerns
Professional culture discouraging admission of error
Escalating consequences that make acknowledgment more difficult over time
Key takeaway:
Institutional incentives often favor defending past decisions rather than correcting them.
7. Rebuilding Trust Through System Design
If misinformation sometimes originates within institutions, simply telling the public to “trust the science” is not enough. Trust must be built through transparent systems that demonstrate trustworthiness. Dr. Ungrin proposes separating key decision processes:
Determining what the science actually shows
Deciding what society should do about it
Determining what actions are practically feasible
Combining all three decisions within a single authority structure can create conflicts that distort scientific interpretation.
Key takeaway:
Transparency and separation of decision processes can reduce institutional bias.
8. Practical Areas for Reform
Despite the scale of institutional challenges, several areas may offer opportunities for progress:
Improving indoor air quality and ventilation standards (a short worked example follows this list)
Expanding workplace protections and disability coverage for long COVID
Increasing transparency in research correction and retraction processes
Encouraging open scientific debate rather than suppressing questioning
Key takeaway:
Small structural improvements can gradually strengthen scientific systems.
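To make the ventilation item concrete: under the standard well-mixed-room approximation, airborne contaminant concentration decays exponentially with the air change rate once the source is removed. The sketch below is a minimal illustration under that assumption; the air change values and the 99% removal target are illustrative, and this is not the calculation method of ASHRAE Standard 241.

```python
import math

# Well-mixed-room approximation: after the source stops, airborne
# contaminant concentration decays as C(t) = C0 * exp(-ACH * t),
# with t in hours and ACH the air changes per hour.
# The ACH values and the 99% target below are illustrative assumptions.

REMOVAL_FRACTION = 0.99  # target: remove 99% of an airborne contaminant

def minutes_to_removal(ach: float, fraction: float = REMOVAL_FRACTION) -> float:
    """Minutes for a well-mixed room at `ach` air changes per hour
    to clear `fraction` of an airborne contaminant."""
    hours = -math.log(1 - fraction) / ach
    return hours * 60

for ach in (1, 3, 6, 12):
    print(f"{ach:>2} ACH: ~{minutes_to_removal(ach):.0f} min to 99% removal")
```

Filtration and air disinfection can be folded into an "equivalent" air change rate, which is roughly how modern clean-air targets are framed; the practical point is that modest, measurable ventilation improvements translate directly into faster clearance of airborne contaminants.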
Conclusion — Trustworthiness Before Trust
Public trust in science cannot be commanded. It must be earned through institutions that demonstrate transparency, accountability, and a willingness to correct errors. Understanding how scientific ideas move through institutions — and how those systems sometimes fail — is essential for preparing for future pandemics and biological threats.

Episode Resources
Ungrin et al. Preprint. Medical masks versus N95 respirators for preventing COVID-19 among health care workers: A secondary analysis of findings inconsistent with prior understanding reflects the expected inferiority of medical masks. https://osf.io/preprints/metaarxiv/ey7bj_v2
The SARS Commission Interim Report
Loudon I. 2013. Ignaz Phillip Semmelweis’ studies of death in childbirth. Journal of the Royal Society of Medicine. https://journals.sagepub.com/doi/10.1177/0141076813507844
ASHRAE Standard 241, Control of Infectious Aerosols https://www.ashrae.org/technical-resources/bookstore/ashrae-standard-241-control-of-infectious-aerosols



