WTF HAPPENED
TO THE ADULTS IN THE ROOM?
This piece explores psychiatry, science, and lived experience. It is not medical advice, and it cannot replace care tailored to an individual person.
I didn’t start writing because I had something profound to say. I started because I was irritated in the specific way that only physicians get irritated, quietly at first, then all at once, like a lab value that drifts for months before suddenly doubling overnight. The initial spark was the rise of Robert F. Kennedy Jr. into a position where his ideas about health might actually shape policy. I told myself I would handle it the way I handle everything else: stay calm, stay evidence-based, and assume that if I explained things clearly enough, the room would come back into focus. That assumption did not survive first contact.
What I was watching was not disagreement. Medicine tolerates disagreement and depends on it. What I was watching felt more like the slow replacement of evidence with intuition, data with suspicion, and “we studied this” with “I saw something online.” Once you see that shift, it becomes very difficult to unsee it.
People do not come in asking questions anymore so much as arriving with answers that are dressed up like questions. And honestly, I understand why. There is too much information, and no one has time to read actual studies, let alone interpret them. Most people do not want to. Studies are long, dense, and written in a tone that feels like it was designed to discourage eye contact. So people simplify. The problem is that simplification, once it loses contact with evidence, turns into something else. It turns into a story.
Stories have villains and motives. They make emotional sense. Data does not. Data sits there and refuses to care how you feel about it. Inside medicine, we used to argue within the same system, even when we disagreed. Now people step outside that system entirely and critique it as if they had just discovered it five minutes ago. Listening to someone like Robert F. Kennedy Jr. talk about health policy can feel like listening to someone explain physics using confidence as a substitute for structure. And confidence, unfortunately, works.
• Scientific papers published each year: about 2 to 3 million⁹
• Number any one person can realistically read: zero
• Length of a viral video explaining complex science: under 2 minutes
• Confidence level in that video: absolute
Confidence is a cognitive shortcut. It reads as competence, especially outside your own field. That works when you are choosing a restaurant. It works less well when you are deciding whether to trust vaccines or dismantle public health systems. As Carl Sagan warned, “We’ve arranged a global civilization in which most crucial elements depend on science and technology, and then arranged things so that almost no one understands them.”¹ That line reads less like a warning now and more like a description.
The shift might still be manageable on its own, but it does not exist in isolation. It sits on top of something else that has been building for years, which is a loss of trust that did not come out of nowhere. The Jeffrey Epstein case did not just expose a crime. It exposed a belief that there are different layers of reality, different rules, and different consequences depending on who you are. Once people see that, even if they only half believe it, it changes how everything else looks. If that system missed something that obvious, what else is it missing?
That shift in perception feeds directly into how people view power. At some point, we turned billionaires into characters. Figures like Elon Musk, Peter Thiel, and Mark Zuckerberg are no longer just individuals running companies. They occupy narrative space. Their actions get interpreted as part of a larger story, whether or not that story actually exists. That framing is not entirely fair, but it is not entirely irrational either, because when wealth reaches a certain scale, it stops functioning as a number and starts functioning as a force.
• Seconds in a year: about 31 million
• A billion seconds: about 31 years
• Number of people who intuitively grasp that difference: very few
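The arithmetic behind those bullets is easy to check. A minimal sketch in Python (approximate figures, ignoring leap years):

```python
# A million seconds versus a billion seconds, converted to human time scales.
SECONDS_PER_DAY = 24 * 60 * 60          # 86,400
SECONDS_PER_YEAR = 365 * SECONDS_PER_DAY  # 31,536,000

million_in_days = 1_000_000 / SECONDS_PER_DAY        # ~11.6 days
billion_in_years = 1_000_000_000 / SECONDS_PER_YEAR  # ~31.7 years

print(f"A million seconds: about {million_in_days:.1f} days")
print(f"A billion seconds: about {billion_in_years:.1f} years")
```

A million seconds is under two weeks; a billion seconds is most of a working lifetime. That thousand-fold jump is exactly the gap intuition tends to flatten.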
So when people ask why someone would need that much, they are not just asking an economic question. They are asking what the limits are, and whether any still exist. As Ricky Gervais joked at the 2020 Golden Globe Awards, “You say you’re woke, but the companies you work for… Apple, Amazon, Disney… If ISIS started a streaming service, you’d call your agent.”³ It lands because it points to something people already sense about power and selective awareness.
Inside the institutions themselves, experience leaves faster than it can be replaced, and the system compensates until it cannot. While that is happening, there is also a pull in the opposite direction, toward certainty. Older ideas about structure and hierarchy start to feel appealing again, not because they are better, but because they feel stable. As Hannah Arendt observed, “The most radical revolutionary will become a conservative the day after the revolution.”⁷ People do not just want change. They want something that feels like it will hold.
Institutions were supposed to provide that stability. Boring, procedural, often frustrating stability. Organizations like NATO exist because the alternatives have already been tested and did not go well. The problem is that institutions only feel valuable when they fail. When they are working, they look unnecessary, so they get chipped away at in small, reasonable steps until the system becomes thinner and less able to handle stress.
At this point, it is worth asking whether I am missing something. It is easy, especially sitting in Boulder, to assume the answer is obvious. It is not. People are responding to real frustrations, and ignoring that does not help. But recognizing frustration is not the same as validating every conclusion that comes from it. Once trust erodes, it spreads, and eventually disagreement is not about facts anymore. It is about reality.
So what is this?
It is not one thing. It is a shift, from evidence to instinct, from systems to individuals, from what works to what feels right. It accumulates slowly until the baseline moves. As George Carlin said, “Never underestimate the power of stupid people in large groups.”⁸ It sounds like a joke, but it is really about scale. Small errors, repeated widely, become structure.
That is the part that is hard to shake. This does not feel like a moment. It feels like a direction. And direction is harder to reverse. So when I say “WTF,” it is not just frustration. It is a real question. Where are the adults in the room?
Public Health Is Not a Thought Experiment
At a certain point, all of this stops being abstract and lands somewhere that does not care about narrative.
Public health is not a philosophy or a branding exercise. It is infrastructure. It is the set of conditions that determines whether problems stay small or become large, and it operates whether people are paying attention to it or not. When it works, nothing happens. That is the design. No outbreak, no crisis, no headline. The absence of events is the outcome, which makes it difficult to value and even harder to defend.
From the outside, it can look procedural and slow. Committees, shifting recommendations, systems that do not always appear decisive. From the inside, it functions more like a network of safeguards. It absorbs variation, detects patterns early, and coordinates responses before they escalate. That work is rarely visible, and because it is rarely visible, it is easy to assume it is unnecessary.
The problem is that these systems do not fail dramatically at first. They degrade. Participation declines slightly, trust erodes gradually, and small disruptions accumulate until the system is operating with less margin than it was designed to have. Each individual change appears manageable. The overall pattern is harder to see.
• Measles declared eliminated in the United States: 2000⁴
• Recent outbreaks: increasing
• Reduction in vaccination needed to lose community protection: smaller than most assume
The same pattern applies to the less visible components. Laboratories that track disease trends, data systems that allow information to move between regions, and the workforce that maintains all of it. When these weaken, the effect is not immediate. Detection slows, response lags, and the system becomes more reactive. By the time the consequences are visible, the underlying capacity has already been reduced.
This is where the conversation around people like Robert F. Kennedy Jr. intersects with something more concrete. It is not just about what is said. It is about how those ideas influence participation in systems that depend on consistency and coordination. You can challenge a system and improve it. That is how medicine evolves. You can also challenge it in a way that reduces trust and engagement, which produces a different outcome.
The difficulty is that public health does not offer a compelling narrative in return. It produces stability, which is easy to overlook. It asks for cooperation without providing immediate feedback, and it relies on trust that is often invisible until it is gone.
As Robert Sapolsky has noted, systems only make sense when you understand the conditions around them. Public health is that condition for medicine. It is the background that allows individual care to function effectively. When that background weakens, the burden shifts to individual interventions, and those interventions begin to fail more often.
The result is not a single point of collapse. It is a gradual loss of resilience. And resilience is not something you notice until you need it.
When “Do Your Own Research” Means “Ignore People Who Did”
“I’ve been reading” usually means “I’ve been filtering,” and the filtering is not random. It follows instinct, tone, and familiarity far more than it follows evidence. That is not because people are careless. It is because the volume of available information has made actual evaluation almost impossible for anyone not working inside the field.
Most people do not have the time or training to read primary literature, and even if they did, much of it is written in a way that is difficult to access without context. Faced with that, people simplify. They move from studies to summaries, from summaries to interpretations, and from interpretations to personalities. Eventually, the source of the information matters less than how it feels to hear it.
That is where the conversation often shifts toward a general distrust of expertise, as if the only alternatives are blind acceptance or total rejection. That framing is misleading. The more relevant distinction is between systems that are designed to correct themselves over time and systems that are designed to reinforce what they already believe. The former are imperfect, slow, and often frustrating. The latter are efficient, consistent, and resistant to change.
Only one of those has a track record of improving outcomes.
As George Carlin observed, “Think of how stupid the average person is, and realize half of them are stupider than that.” The line lands because it exaggerates something uncomfortable about scale. When small misunderstandings are repeated across large populations, they do not remain small. They begin to shape how entire systems are perceived and engaged with.
It is important, though, not to mistake this for simple irrationality. Most people are not trying to reject evidence. They are trying to make sense of a system that feels increasingly opaque and, at times, untrustworthy. When institutions fail visibly, or appear to fail, the instinct to look elsewhere for answers is not unreasonable.
The difficulty is what replaces them.
As Robert Sapolsky has pointed out, the human brain is highly effective at generating explanations, even in the absence of sufficient information.¹⁰ That tendency is useful in many contexts, but it also means that once a narrative takes hold, it can become self-reinforcing. Contradictory information is filtered out, inconsistencies are explained away, and the structure becomes increasingly resistant to correction.
At that point, “doing your own research” is no longer about expanding understanding. It is about maintaining it. And when that pattern operates at scale, it stops being a personal habit. It becomes a systemic problem.
The Epstein Aftertaste
There are moments when a system does not just fail but reveals something about itself that had been easy to ignore until it became impossible to do so. The case of Jeffrey Epstein felt like one of those moments. It was not that people suddenly discovered that wealth and power come with advantages. That has always been understood. What felt different was the scale, the visibility, and the uneasy sense that the usual boundaries did not apply in the same way they were assumed to apply elsewhere.
The reaction was not a single wave of outrage that rose and fell. It was something quieter and more persistent. It lingered as a background question about whether the system operates consistently, or whether there are layers to it that only become visible under extreme circumstances. Once that possibility enters the conversation, it does not stay contained. It begins to shape how people interpret other institutions, even those that have nothing to do with the original event.
People start asking questions that are not irrational, but that are difficult to answer cleanly. If something that visible could go unchecked for that long, what else is being missed? If accountability appears uneven, where exactly is the line? Those questions are rooted in pattern recognition. Humans are very good at noticing when outcomes do not match expectations, and they are equally good at trying to explain why.
The difficulty is that the gap between “something seems inconsistent” and “everything is unreliable” is smaller than it looks. Once trust begins to erode, it does not erode selectively. It spreads across domains, moving from one institution to another until the distinction between them starts to blur.
• Americans reporting low trust in major institutions: over 70%²
• Perception that the system is “rigged” in some meaningful way: increasing
• Clear, widely recognized accountability in high-profile cases: limited
Accountability serves more than a corrective function. It signals that rules exist and that they apply in a consistent way. When that signal weakens, people do not become more careful in how they evaluate information. They become more suspicious. Suspicion does not sharpen analysis. It broadens it, and once it broadens, it becomes harder to contain.
That shift has consequences beyond the original context. When people begin to doubt whether institutions operate in good faith, official explanations lose some of their weight. Alternative explanations, even weaker ones, begin to feel more plausible, not necessarily because they are more accurate, but because they seem less constrained by the system that is now being questioned.
As Ricky Gervais has pointed out in his own way, people are often drawn to simple answers in a confusing world. The appeal is not precision. It is closure. A complicated, imperfect system is harder to tolerate than a clear explanation, even if that explanation oversimplifies reality.
The long-term effect is not a single belief but a shift in baseline. Doubt becomes part of the background. It does not need to be actively expressed to influence behavior. It shapes how people evaluate new information, how they interpret recommendations, and how willing they are to accept guidance from institutions that they no longer fully trust.
That doubt rarely announces itself. It just stays. And once it becomes part of the background, it is very difficult to remove.
The Seduction of Going Backward
What’s harder to explain is not the distrust. It’s what people reach for next.
Because a lot of what is gaining traction right now does not feel new. It feels familiar in a way that should make people uncomfortable, like a pattern that has already been tested but is being reconsidered anyway. Ideas about hierarchy, about who should lead, about how much complexity a system should tolerate. Things that, not that long ago, many people assumed were largely settled.
They are not.
The instinct to move in that direction is not irrational. When systems feel unstable or difficult to understand, the appeal of something that looks structured and decisive increases. Ambiguity is exhausting. Complexity requires patience. A framework that appears clear, even if it is rigid or incomplete, can feel like relief.
As Hannah Arendt observed, periods of upheaval often produce a counter-movement toward stability. That shift does not happen because people abandon reason. It happens because they are trying to reduce uncertainty, and certainty, even when it is overstated, feels easier to hold.
• Percentage of Americans who believe the country is on the “wrong track”: consistently high
• Tolerance for ambiguity in decision-making: generally low
• Preference for clear, decisive answers in complex situations: reliably high
These tendencies do not make people irrational. They make them responsive to the environment they are in. When trust in institutions declines and systems become harder to navigate, the threshold for accepting simplified explanations drops. The trade-off is subtle. Clarity increases, but precision decreases.
Once that shift occurs, the criteria for evaluating ideas begin to change. Instead of asking whether something works, people begin asking whether it feels stable or coherent. Evidence still matters, but it competes with the psychological comfort of certainty. In some cases, certainty wins.
The challenge is that certainty is often a presentation rather than a property. It can be expressed confidently even when the underlying structure is weak. When that presentation becomes the basis for decision-making, systems that depend on nuance and adjustment begin to lose ground to frameworks that appear more solid than they actually are.
That is the dynamic that makes regression possible. Not because the past is more effective. Because it is easier to understand.
Institutions: Boring, Necessary, Fragile
Institutions are not inspiring, which is part of the problem. They are slow, procedural, and often frustrating, and they rarely produce moments that feel decisive or dramatic. Most of the time, they function in the background, coordinating activity, setting constraints, and absorbing variation so that larger systems remain stable. When they are working, they are almost invisible. That invisibility makes them easy to undervalue.
Organizations like NATO were not created because they were elegant or efficient. They were created because the alternative had already been tested repeatedly, and the outcomes were severe. The purpose was not to eliminate conflict entirely but to reduce the likelihood that individual actions would escalate into broader crises. The approach is structured, sometimes rigid, and often slow, but it is designed to prevent large-scale failure rather than respond to it after the fact.
The difficulty is that institutions tend to be evaluated based on their most visible moments, which are often their failures. When they function well, there is little to observe. When they struggle, the problems are highly visible and often amplified. This creates a perception imbalance where their limitations are clear but their benefits are diffuse.
As a result, they are often modified in incremental ways that appear reasonable in isolation. Budgets are adjusted, priorities are shifted, and structures are reorganized. Each change can be justified on its own terms, but over time, the cumulative effect can alter how the system functions. Because this process is gradual, it rarely triggers a clear point of concern.
• Time required to build complex institutions: decades
• Time required to weaken them: significantly shorter
• Public attention to gradual institutional change: limited
As focus shifts toward individual actors, institutions can begin to feel secondary. Trust becomes associated with people rather than processes, and the system itself starts to depend more on who is in charge at a given moment. That shift introduces variability into structures that were designed to reduce it.
This is where fragility increases. Institutions are built to persist across changes in leadership. When their stability becomes tied to specific individuals, their ability to function consistently is reduced. Problems that were once absorbed by the system begin to surface more directly.
The outcome is not immediate collapse. It is a gradual thinning. And thinner systems are less capable of absorbing stress.
Meanwhile, Back in Reality
While all of this plays out at the level of ideas, narratives, and public figures, something quieter is happening underneath it. The people who actually run these systems are getting tired and, in many cases, leaving. Not all at once, and not in a way that immediately draws attention, but steadily enough that the effect accumulates over time.
From the outside, nothing looks dramatically different. Hospitals are open, clinics are functioning, and the system appears intact. That appearance is misleading. What is changing is the margin. There are fewer people to absorb unexpected problems, fewer experienced clinicians to recognize early warning signs, and less flexibility to respond when something does not go as planned.
• Physician burnout rates in some specialties: over 60%⁵
• Public health workforce shortages: widespread⁶
• Time required to train a physician or scientist: roughly a decade
• Time required for them to leave: much shorter
That imbalance is difficult to correct quickly. Training new professionals takes years, and experience cannot be accelerated. When people leave, they take with them not just capacity but judgment, pattern recognition, and the ability to manage uncertainty. Those are not easily replaced.
As a result, the system becomes more reactive. Care continues, but with less room for nuance. Decisions are made more quickly, often with less context, and the ability to manage complexity is reduced. None of this necessarily shows up as immediate failure. It presents as increased strain, longer wait times, shorter interactions, and a general sense that everything is functioning, but just barely.
At the same time, expectations continue to rise. More patients, more complexity, more documentation, and more oversight. The system is asked to do more while operating with fewer resources and less flexibility. That tension does not resolve. It accumulates.
This is where problems begin to hide. Not in obvious breakdowns, but in the gradual erosion of capacity. Small adjustments are made to keep things moving, and those adjustments become the new baseline. Over time, the gap between what the system is designed to do and what it is actually able to do widens. When something finally does break, it appears sudden. It is not. It is cumulative.
If you’re in the Boulder area, you can also find more information or get in touch through my practice.
And if you found this useful, feel free to share it with someone who might also be trying to figure out what exactly is going on.
Notes and Sources
¹ Sagan, Carl. The Demon-Haunted World: Science as a Candle in the Dark. Random House, 1995, p. 26.
² Pew Research Center. “Public Trust in Government: 1958–2024.” Washington, DC, 2024.
³ Gervais, Ricky. Golden Globe Awards Opening Monologue, January 5, 2020.
⁴ Centers for Disease Control and Prevention (CDC). “Measles — United States, 2000–Present.” MMWR Surveillance Reports, 2000–2024.
⁵ American Medical Association (AMA). “Physician Burnout and Well-Being Report,” 2023–2024.
⁶ de Beaumont Foundation. “Public Health Workforce Interests and Needs Survey (PH WINS),” 2021–2024.
⁷ Arendt, Hannah. The Origins of Totalitarianism. Harcourt, Brace & Company, 1951, pp. 305–310.
⁸ Carlin, George. Brain Droppings. Hyperion, 1997, pp. 17–22.
⁹ UNESCO Institute for Statistics. “Global Scientific Publication Output,” 2022–2024 estimates.
¹⁰ Sapolsky, Robert M. Behave: The Biology of Humans at Our Best and Worst. Penguin Press, 2017, pp. 24–28, 563–570.
¹¹ World Health Organization (WHO). “Immunization Coverage and Global Impact of Vaccines, 2000–2020.” Geneva, 2021.
¹² Kahneman, Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2011, pp. 20–35.
¹³ Lifton, Robert Jay. Thought Reform and the Psychology of Totalism. University of North Carolina Press, 1961, pp. 419–437.
¹⁴ Jamison, Kay Redfield. An Unquiet Mind. Vintage, 1995, pp. 86–102.
¹⁵ Bremmer, Ian. Us vs. Them: The Failure of Globalism. Portfolio, 2018, pp. 112–130.
¹⁶ Zeihan, Peter. The End of the World Is Just the Beginning. Harper Business, 2022, pp. 45–67.
¹⁷ Harari, Yuval Noah. 21 Lessons for the 21st Century. Spiegel & Grau, 2018, pp. 55–72.
¹⁸ Pollan, Michael. How to Change Your Mind. Penguin Press, 2018, pp. 301–320.
¹⁹ Harris, Tristan. Center for Humane Technology, public talks and interviews, 2019–2023.
²⁰ Sapolsky, Robert M. Stanford University, Human Behavioral Biology lecture series, 2010–2022.

