Nate Hagens identifies a cultural pathology that is quietly undermining our ability to navigate an increasingly complex future: our collective addiction to certainty. In a world where status is awarded to the loudest voice and the most decisive claim, the simple admission of ignorance has become a professional liability. This piece is vital because it reframes "I don't know" not as a failure of intellect, but as the only rational response to a reality defined by risk and unknown variables.
The Physiology of Certainty
Hagens anchors his argument in the uncomfortable truth that uncertainty is physically painful. He writes, "Uncertainty is reflected as a bodily state within us... when our stressed system releases cortisol and we start to feel things like a tight chest, a fluttery stomach, or an increased heart rate, these inform our gut that something is 'off.'" This physiological framing is crucial; it explains why the refusal to admit ignorance is not merely a choice, but a biological reflex. The brain, wired to conserve energy and predict threats, treats the ambiguity of the modern world as a survival danger.
The author draws a sharp distinction between our evolutionary past and our current reality. "Back in the day on the plains of Tanzania, this response was evolutionarily helpful... it was safer to assume the rustle in the bushes was a lion rather than the wind." However, in the modern context, this instinct backfires. "Our modern status economy runs on conviction and linearity, while nuance and caution only slow us down." By prioritizing speed over accuracy, we create a culture where the "fast hand beats the methodical thinker," a dynamic that Hagens notes he observed even in third grade.
Holding uncertainty is costly! Running parallel mental scenarios, inhibiting quick answers, and flip-flopping between contexts all burn extra energy, and our bodies register that inefficiency.
This biological imperative is compounded by what Hagens terms "motivated reasoning," where even the most intelligent individuals develop an "ideological immune system" that defends their existing worldview against contradictory evidence. Critics might argue that confidence is sometimes necessary for leadership and that excessive hesitation can paralyze decision-making. While valid, Hagens suggests that the current cultural pendulum has swung so far toward bravado that it has become a "fatal flaw" rather than a virtue.
The Authority Trap and the Wall Street Lesson
The commentary deepens by examining how institutional structures reinforce this behavior. Hagens recalls his early career at Salomon Brothers, where the training program was designed to break the instinct to guess. "What we were supposed to say, as eventual salespeople who would be talking to billionaires and institutional managers, is 'I don't know, but I will find out and get back to you.'" He contrasts this with the modern media landscape, where producers book guests for "crisp and sharp takes, not the careful, qualifying ones."
This dynamic is exacerbated by "authority bias," a cognitive phenomenon where humans defer to those with perceived status, regardless of the evidence. Hagens points to the famous Milgram experiment, where ordinary people administered what they believed were fatal electric shocks simply because a figure in a lab coat told them to. "Signals of authority lower our skepticism, and they raise our compliance," he writes. This historical reference serves as a stark warning: when we outsource our critical thinking to confident figures, we risk repeating catastrophic errors.
The consequences of this deference are visible in high-stakes environments. Hagens notes that "politicians will campaign on certain guarantees and then walk those promises back when they're elected," and that "projects start with rosy baselines and then invariably end up with cost overruns." The pattern is consistent: overconfidence is rewarded at the front end, while the costs are deferred. "The public pays in trust, money, and time," he argues, adding that in the context of climate change, "our planet pays (and ergo we also pay) in the form of ecological stability."
The AI Accelerant
Perhaps the most urgent section of the piece addresses how artificial intelligence is turbocharging this human tendency. Hagens explains that Large Language Models are trained to prioritize answering over truth, leading to "hallucinations" that are presented with absolute confidence. "The main software of ChatGPT-5 hallucinates around 10% of the time when it has internet access. But without internet access, this increases to almost half of all answers." This is not a bug, but a feature of the training logic: "If you think of a quiz show where the only way to get points is to give the correct answer... it is always in your favor to attempt answering."
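The quiz-show analogy is, at bottom, an expected-value calculation. As a minimal sketch (the scoring function, confidence levels, and penalty values here are illustrative assumptions, not taken from the piece), a short Python comparison shows why a scoring rule that never penalizes wrong answers makes guessing always rational, and how adding a penalty for confident errors restores the value of saying "I don't know":

```python
# Illustrative only: a toy scoring model for the quiz-show incentive
# described in the piece. All numbers and names are hypothetical.

def expected_score(p_correct: float, guess: bool, wrong_penalty: float) -> float:
    """Expected points for one question, given confidence p_correct.

    guess=True  -> attempt an answer: +1 if right, -wrong_penalty if wrong.
    guess=False -> abstain ("I don't know"): always 0 points.
    """
    if not guess:
        return 0.0
    return p_correct * 1.0 - (1.0 - p_correct) * wrong_penalty

# Quiz-show rule from the piece: points only for correct answers, no penalty.
# Even at 10% confidence, attempting an answer beats abstaining.
assert expected_score(0.10, guess=True, wrong_penalty=0.0) > \
       expected_score(0.10, guess=False, wrong_penalty=0.0)

# A calibrated rule that penalizes confident wrong answers flips the incentive:
# at 10% confidence, abstaining now has the higher expected score.
assert expected_score(0.10, guess=True, wrong_penalty=1.0) < \
       expected_score(0.10, guess=False, wrong_penalty=1.0)
```

Under the first rule, "always attempt" weakly dominates abstaining at any confidence level, which is exactly the training logic Hagens describes: a system rewarded this way learns to answer confidently rather than truthfully.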
The convergence of human and machine overconfidence creates a dangerous feedback loop. Hagens cites a Stanford paper describing "Moloch's Bargain," where model performance gains come at the expense of honesty. "For every gain in the model performance came an even bigger loss in honesty," he writes. This leads to a scenario where "more deceptive marketing, more disinformation in political campaigns, and more fake and harmful social media posts" become the norm.
The stakes are existential when considering the integration of these systems into military command. Hagens reveals a chilling insight: "We've actually avoided a dozen, or perhaps even more, potential nuclear wars in the last 50 years. Most of these near-misses were avoided because at the time a single human was unsure and they chose to hold off until they had more information." If AI systems, which lack the human capacity for doubt, control these defense mechanisms, the "rational speed bumps of uncertainty" could vanish entirely.
The cultural stories we tell ourselves, which then guide our actions, wildly outpace our energy, ecosystem, and time constraints. And then we're shocked – shocked – when reality shows up to stare us in the face and remind us that infinite growth is not possible in a finite system.
Bottom Line
Hagens's most compelling argument is that the refusal to say "I don't know" is a systemic risk that threatens our ecological and geopolitical stability. While the piece occasionally underplays the need for decisive action in crisis management, its core diagnosis of our "confidence game" is hard to dismiss. The strongest takeaway is the warning that as we merge human certainty with machine hallucination, we are building a world where the most persuasive voice is often the least truthful one. The path forward requires a cultural retraining that values the humility of uncertainty over the seduction of false certainty.