
I don't know

Nate Hagens identifies a cultural pathology that is quietly undermining our ability to navigate an increasingly complex future: our collective addiction to certainty. In a world where status is awarded to the loudest voice and the most decisive claim, the simple admission of ignorance has become a professional liability. This piece is vital because it reframes "I don't know" not as a failure of intellect, but as the only rational response to a reality defined by risk and unknown variables.

The Physiology of Certainty

Hagens anchors his argument in the uncomfortable truth that uncertainty is physically painful. He writes, "Uncertainty is reflected as a bodily state within us... when our stressed system releases cortisol and we start to feel things like a tight chest, a fluttery stomach, or an increased heart rate, these inform our gut that something is 'off.'" This physiological framing is crucial; it explains why the refusal to admit ignorance is not merely a choice, but a biological reflex. The brain, wired to conserve energy and predict threats, treats the ambiguity of the modern world as a survival danger.


The author draws a sharp distinction between our evolutionary past and our current reality. "Back in the day on the plains of Tanzania, this response was evolutionarily helpful... it was safer to assume the rustle in the bushes was a lion rather than the wind." However, in the modern context, this instinct backfires. "Our modern status economy runs on conviction and linearity, while nuance and caution only slow us down." By prioritizing speed over accuracy, we create a culture where the "fast hand beats the methodical thinker," a dynamic that Hagens notes he observed even in third grade.

Holding uncertainty is costly! Running parallel mental scenarios, inhibiting quick answers, and flip-flopping between contexts all take extra energy, and our bodies don't like the inefficiency.

This biological imperative is compounded by what Hagens terms "motivated reasoning," where even the most intelligent individuals develop an "ideological immune system" that defends their existing worldview against contradictory evidence. Critics might argue that confidence is sometimes necessary for leadership and that excessive hesitation can paralyze decision-making. While valid, Hagens suggests that the current cultural pendulum has swung so far toward bravado that it has become a "fatal flaw" rather than a virtue.

The Authority Trap and the Wall Street Lesson

The commentary deepens by examining how institutional structures reinforce this behavior. Hagens recalls his early career at Salomon Brothers, where the training program was designed to break the instinct to guess. "What we were supposed to say, as eventual salespeople who would be talking to billionaires and institutional managers, is 'I don't know, but I will find out and get back to you.'" He contrasts this with the modern media landscape, where producers book guests for "crisp and sharp takes, not the careful, qualifying ones."

This dynamic is exacerbated by "authority bias," a cognitive phenomenon where humans defer to those with perceived status, regardless of the evidence. Hagens points to the famous Milgram experiment, where ordinary people administered what they believed were fatal electric shocks simply because a figure in a lab coat told them to. "Signals of authority lower our skepticism, and they raise our compliance," he writes. This historical reference serves as a stark warning: when we outsource our critical thinking to confident figures, we risk repeating catastrophic errors.

The consequences of this deference are visible in high-stakes environments. Hagens notes that "politicians will campaign on certain guarantees and then walk those promises back when they're elected," and that "projects start with rosy baselines and then invariably end up with cost overruns." The pattern is consistent: overconfidence is rewarded at the front end, while the costs are deferred. "The public pays in trust, money, and time," he argues, adding that in the context of climate change, "our planet pays (and ergo we also pay) in the form of ecological stability."

The AI Accelerant

Perhaps the most urgent section of the piece addresses how artificial intelligence is turbocharging this human tendency. Hagens explains that Large Language Models are trained to prioritize answering over truth, leading to "hallucinations" that are presented with absolute confidence. "The main software of ChatGPT-5 hallucinates around 10% of the time when it has internet access. But without internet access, this increases to almost half of all answers." This is not a bug, but a feature of the training logic: "If you think of a quiz show where the only way to get points is to give the correct answer... it is always in your favor to attempt answering."
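The quiz-show incentive Hagens describes can be made concrete with a little expected-value arithmetic. The sketch below is illustrative (the function name, numbers, and scoring scheme are assumptions, not anything from the article): when a wrong answer costs nothing, attempting an answer has positive expected value at any confidence level, so "always guess" dominates "I don't know."

```python
# Illustrative expected-value sketch of the quiz-show incentive:
# if wrong answers cost nothing, guessing always beats abstaining.
def expected_score(p_correct: float, reward: float = 1.0,
                   penalty: float = 0.0) -> float:
    """Expected points from attempting an answer held with
    probability p_correct, under a given reward/penalty scheme."""
    return p_correct * reward - (1.0 - p_correct) * penalty

abstain = 0.0  # saying "I don't know" scores zero

# No penalty for wrong answers (the scheme Hagens describes):
# even a 1%-confidence guess has positive expected value.
print(expected_score(0.01) > abstain)               # True

# Penalize confident wrong answers and the incentive flips:
# guessing only pays once confidence clears the break-even point.
print(expected_score(0.40, penalty=1.0) > abstain)  # False
print(expected_score(0.60, penalty=1.0) > abstain)  # True
```

Under the first scheme the optimal policy is to answer everything, however shaky the belief, which mirrors why a model trained to maximize "correct answers attempted" would confidently hallucinate rather than abstain.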

The convergence of human and machine overconfidence creates a dangerous feedback loop. Hagens cites a Stanford paper describing "Moloch's Bargain," where model performance gains come at the expense of honesty. "For every gain in the model performance came an even bigger loss in honesty," he writes. This leads to a scenario where "more deceptive marketing, more disinformation in political campaigns, and more fake and harmful social media posts" become the norm.

The stakes are existential when considering the integration of these systems into military command. Hagens reveals a chilling insight: "We've actually avoided a dozen, or perhaps even more, potential nuclear wars in the last 50 years. Most of these near-misses were avoided because at the time a single human was unsure and they chose to hold off until they had more information." If AI systems, which lack the human capacity for doubt, control these defense mechanisms, the "rational speed bumps of uncertainty" could vanish entirely.

The cultural stories we tell ourselves, which then guide our actions, wildly outpace our energy, ecosystem, and time constraints. And then we're shocked – shocked – when reality shows up to stare us in the face and remind us that infinite growth is not possible in a finite system.

Bottom Line

Hagens's most compelling argument is that the refusal to say "I don't know" is a systemic risk that threatens our ecological and geopolitical stability. While the piece occasionally risks oversimplifying the necessity of decisive action in crisis management, its core diagnosis of our "confidence game" is undeniable. The strongest takeaway is the warning that as we merge human certainty with machine hallucination, we are building a world where the most persuasive voice is often the least truthful one. The path forward requires a cultural retraining that values the humility of uncertainty over the seduction of false certainty.

Deep Dives

Explore these related deep dives:

  • Milgram experiment

    Directly referenced in the article as a famous demonstration of authority bias - readers would benefit from understanding the full experimental design, ethical controversies, and implications for human obedience

  • Motivated reasoning

    A core concept in the article explaining why intelligent people often double-down on beliefs rather than admitting uncertainty - understanding the psychological mechanisms would deepen comprehension

  • Salomon Brothers

    The author's formative professional experience at this legendary Wall Street firm shaped his understanding of intellectual humility - readers would find the firm's colorful history and culture illuminating

Sources

I don't know

by Nate Hagens

This post is adapted from last week’s Frankly video titled “The Three Most Important Words We’re Taught Not to Say.” In the future, we’ll be adapting more Frankly videos to written versions and continuing to post them on Substack, so stay tuned for more.

As a podcast host, there’s one answer that I love to hear when I ask my guests a question – but I rarely ever do:

I don’t know.

To me, this answer is a signal of maturity, nuance, and honesty. It’s not trying to give an answer to all the world’s problems.

So, why is hearing “I don’t know” so rare?

We are all members of a social species embedded in a modern culture that’s been turbocharged by energy surplus and social technology. But in this modern setting we still seek status and respect as a product of our evolutionary wiring. Because of this, in most public settings, especially in the media, we overvalue confidence, bravado, and certainty. Today, saying “I don’t know” is seen as a sign of weakness, not of wisdom.

But in a world increasingly defined by ideological debates, when you hear these words today, they act as a sort of antidote to our cultural consensus trance. Admitting uncertainty makes room for discourse and the possibility of different answers.

In fact, I’m beginning to think that the reluctance to express “I don’t know” (or its equivalent) out loud is a fatal flaw in our culture as we begin to discuss our vastly complex, risky, and rapidly approaching future – which is chock full of uncertainties.

The Right Answer on Wall Street

So back in the day, over 30 years ago, I started working at Salomon Brothers, which at the time was one of the coolest places on Wall Street. Their highly respected training program kicked our asses, and one of the key things I remember is that they would ask a series of questions, starting with something simple that you learn in business school like, “What’s the duration of a 30 year note?” Next, they would ask a slightly harder question: “What’s the ticker symbol of Yahoo?” Easy!: YHOO.

And then, after you answered the first two questions, they would ask you a really hard question that you weren't supposed to be able to answer. Since we wanted to impress our bosses, we would inevitably make something up or guess, and then they ...