The most dangerous bias isn't greed or fear—it's something far more subtle. A deep dive into trading floors, space shuttles, and the human brain reveals why being confidently wrong is our most pervasive flaw.
On July 17th, 1992, a junior trader at the Singapore International Monetary Exchange made an expensive error. Instead of buying twenty futures contracts for a client, she sold them, costing Barings Bank nearly forty thousand dollars. To save her job, her boss, Nick Leeson, decided to hide the loss in an obscure error account—account 88888—a placeholder used by banks to resolve small discrepancies in trades.
It was a dangerous move that should have been caught immediately. When no one at Barings noticed, Leeson grew convinced he could win back the loss and get his team out of trouble. He bet that the Japanese stock market would go up: the Nikkei 225 had been falling since the asset price bubble peaked two years earlier, dropping from 38,000 to 16,000.
Leeson was confident the market would soon bottom out and start rising. It didn't. Over the next few weeks, it dropped to a low of 14,000. Yet Leeson doubled down repeatedly, betting the size of his losses each time so that the next win would bring him back above water. The strategy failed spectacularly: his losses ballooned from forty thousand dollars to around three million.
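Leeson's doubling pattern is essentially the classic martingale bet, and a few lines of code show why it fails. The sketch below is purely illustrative; the dollar figure, the fifty-fifty win probability, and the round count are assumptions, not Leeson's actual positions. Nearly every run ends with the hole erased, but the rare unbroken losing streak multiplies it a thousandfold.

```python
import random

def double_down(initial_loss: float, win_prob: float, max_rounds: int) -> float:
    """Simulate the doubling-down ("martingale") pattern: after each loss,
    stake the entire deficit so that a single win would clear it."""
    deficit = initial_loss
    for _ in range(max_rounds):
        stake = deficit            # bet enough that one win erases the hole
        if random.random() < win_prob:
            return 0.0             # the win finally arrives; back above water
        deficit += stake           # another loss doubles the hole
    return deficit                 # still underwater after max_rounds

random.seed(42)
# Illustrative numbers only: a $40,000 hole, a coin-flip market, ten rounds.
results = [double_down(40_000, 0.5, 10) for _ in range(100_000)]
busts = [r for r in results if r > 0]
print(f"Runs that recovered: {1 - len(busts) / len(results):.2%}")
if busts:
    print(f"Deficit when the losing streak never breaks: ${busts[0]:,.0f}")
```

The strategy works often enough to feel safe while concentrating all the risk into one ruinous tail: in this toy setup, the rare unbroken streak turns $40,000 into a deficit of roughly $41 million.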
In spring 1993, the market rebounded to 20,000, and by summer the account was back in credit. Leeson went out to celebrate that weekend, drinking and dancing, finally free of the hole he'd dug himself into. But on Monday morning, he made another trading error on the futures market, a damaging loss he didn't want to admit. So he put the loss back into account 88888.
Confident that since he had escaped the first hole he could do it again, he began making risky trades to try to recover the new loss. As losses mounted, they became harder and harder to cover. By the end of 1993, his losses exceeded thirty million dollars. Nick Leeson is an extreme example, but this tendency to overestimate our abilities exists in all of us.
The Confidence Gap
Overconfidence gets us into all sorts of trouble. It leads us to take risks, make commitments, enter contests, and try things that will ultimately fail—sometimes in costly, embarrassing, and dangerous ways. Overconfidence has been implicated in nearly every major disaster: the sinking of the Titanic, the Chernobyl nuclear disaster, and the loss of the space shuttle Challenger.
But overconfidence isn't reserved for reckless individuals. We can all fall victim to it. In one study, ninety-three percent of people rated themselves better drivers than the median, which is mathematically impossible: at most half of any group can sit above its median.
The scale of the problem was identified in research using simple quiz questions. Participants answered true-or-false questions such as whether Australia is wider than the Moon, or whether there are more stars in the Milky Way than trees on Earth. Then researchers asked how confident respondents were that they were right.
Most said they were one hundred percent certain. The results revealed a surprising disparity between how accurate someone thinks they are and how accurate they actually are. When people are ninety percent certain, they are right only seventy-five percent of the time. In fact, when Veritasium ran its own version online, those who were most confident—describing themselves as ninety-one to one hundred percent sure—were correct only fifty-one percent of the time.
This match between what you think you know and what you actually know is called calibration. If you're perfectly calibrated, then whenever you say you're eighty percent confident, you should be right eighty percent of the time. But most of us are not well calibrated.
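Calibration is straightforward to measure: bucket a person's answers by their stated confidence, then check how often each bucket was actually right. Here is a minimal Python sketch of that check; the response data is invented for illustration.

```python
from collections import defaultdict

# Invented quiz records: (stated confidence, answered correctly?)
responses = [
    (0.6, True), (0.6, True), (0.6, False), (0.6, False), (0.6, False),
    (0.8, True), (0.8, True), (0.8, True), (0.8, False), (0.8, False),
    (0.9, True), (0.9, True), (0.9, True), (0.9, False),
    (1.0, True), (1.0, True), (1.0, False), (1.0, False),
]

# Group answers by stated confidence, then compare to the actual hit rate.
buckets = defaultdict(list)
for confidence, correct in responses:
    buckets[confidence].append(correct)

for confidence in sorted(buckets):
    outcomes = buckets[confidence]
    accuracy = sum(outcomes) / len(outcomes)   # True counts as 1, False as 0
    print(f"said {confidence:.0%} sure -> right {accuracy:.0%} of the time "
          f"(gap: {confidence - accuracy:+.0%})")
```

A perfectly calibrated respondent would show a zero gap in every bucket. The fabricated data above is tuned to mirror the findings cited earlier: ninety percent confidence paired with seventy-five percent accuracy, and total certainty paired with a coin flip.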
Even experts fall into this trap. One study analyzed quarterly surveys of professional forecasters, chief economists at corporations and banks who are invited to predict the state of the economy, and found they were on average too sure they knew what would happen. They said they were roughly fifty-three percent sure they had correctly predicted what would happen to inflation, for instance, but they were right only twenty-three percent of the time.
Why We Get It Wrong
The obvious explanation is that we want to feel good about ourselves. We take enormous satisfaction in being able to say "I know," or better yet, "I told you so." We pretend to ourselves and to others that we know when we don't. That is the motivated, ego-driven explanation for excessive faith in our own judgment.
But there's something more uncomfortable at play: how much information our brains are capable of processing.
On January 27th, 1986, the night before a scheduled launch at the Kennedy Space Center in Cape Canaveral, engineers from contractor Morton Thiokol joined an emergency conference call with NASA. They'd just seen the weather forecast predicting temperatures overnight would plummet to twenty-five degrees Fahrenheit, far colder than for any previous shuttle launch. They knew the rubber O-rings that seal joints in the boosters become less flexible in the cold.
They had just a few hours to gather data, create charts, and present their case. Over the next six hours, the engineers presented thirteen charts with data on O-ring temperature, hot gas erosion, joint rotation, and more. But the data was scattered, incomplete, and not synthesized into a clear narrative, and they had no test data below fifty-three degrees Fahrenheit, the temperature of the coldest launch to date.
So NASA managers, overwhelmed by seemingly contradictory data and confident that their rocket boosters were safe, overruled the engineers and approved the launch. Challenger lifted off at eleven thirty-eight the next morning; seventy-three seconds later, it broke apart, killing all seven crew members.
The problem is that calibrating your certainty requires thinking of all the ways you could be wrong—and that's hard for finite, fallible agents like us. It means considering everything we don't know.
Research from 2008 investigated how short-term memory capacity was linked to accuracy and overconfidence. Participants were asked to give ranges for factual questions like the length of a river or the population of a city. People's ranges were consistently too narrow, which effectively meant they were being overconfident. And those who had worse short-term memory were more often wrong and more likely to be overconfident.
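The same kind of check applies to range questions: count how often someone's ninety percent ranges actually contain the true value. A quick sketch with invented numbers shows how too-narrow ranges betray overconfidence.

```python
# Invented 90% confidence ranges: (low, high, true_value).
# A well-calibrated respondent's ranges should contain the truth ~90% of the time.
ranges = [
    (4_000, 5_500, 6_650),           # "river length in km": too narrow, misses
    (800_000, 1_200_000, 950_000),   # "city population": contains the truth
    (200, 350, 384),                 # misses
    (10, 60, 42),                    # contains the truth
    (1_500, 2_000, 2_450),           # misses
]

hits = sum(low <= truth <= high for low, high, truth in ranges)
print(f"{hits}/{len(ranges)} ranges contain the truth "
      f"({hits / len(ranges):.0%} vs. the 90% target)")
```

Hitting only forty percent of the time when the target is ninety is exactly the pattern the 2008 study found.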
Another study in 2023 asked participants to keep sequences of letters in their mind while they judged their own performance. As the memory load increased, confidence estimates became less accurate—even for participants with higher working memory capacities.
Together, these studies suggest that assessing your accuracy is a mentally taxing task. So overconfidence isn't necessarily about arrogance—it's your brain working at the limits of what it can track and hold. And because of this, your brain starts using shortcuts.
Psychologist Daniel Kahneman describes these mental shortcuts as heuristics, which together produce systematic errors called cognitive biases. One shortcut we use a lot is substituting a hard question with an easier, related one.
Researchers tested this by asking students how happy they were in their life and how many dates they'd had in the last month. Unsurprisingly, these two questions did not correlate. However, if you switch it around and ask about their dating life first, suddenly the correlation jumps significantly. Working out how happy you are is a difficult question—you have to consider many things and balance them all out. So when you're primed with information about your dating life, you're very likely to substitute that hard question with an easier one.
Overconfidence from misprocessing information like this can be disastrous.
Why We're Wired This Way
You might expect natural selection to have wiped out the confidently incorrect, but there is evidence that overconfidence can actually be advantageous: it can massively improve your status.
In a scientific version of The Apprentice, researchers in 2012 compared participants' own assessments of their skill with objective measures. They placed them in group tasks to see who was chosen as leader and whose ideas influenced the group, tracking status over multiple sessions and assessing whether the desire for status led participants to exaggerate their abilities.
The results were clear. Overconfident individuals were more likely to lead, assert themselves, and maintain influence even when their actual abilities were mid-range. And the evidence shows people react better to confident individuals.
Using fMRI, researchers at the University of Sussex measured the brain activity of people hearing unconfident versus confident advice. Those who listened to the confident advice showed increased activity in the ventromedial prefrontal cortex, a brain region associated with processing rewards and expected satisfaction. This suggests human brains are biologically tuned to be influenced by confident individuals. We literally feel better when we hear confident people.
Critics might note that while overconfidence can improve status in short-term group tasks, it leads to catastrophic failures in high-stakes environments like nuclear engineering, financial trading, or space exploration—where the costs vastly outweigh any social benefit.
"Human brains are biologically tuned to be influenced by confident individuals. We literally feel better when we hear confident people."
Bottom Line
The core argument is compelling: overconfidence isn't a moral failing but a cognitive limitation. Our brains operate at the limits of what they can track, forcing mental shortcuts that systematically distort our accuracy. The strongest evidence comes from the research showing working memory directly impacts confidence calibration—and from studies revealing this bias serves an evolutionary social function.
But here's the catch: while understanding why we're confident when wrong helps us identify the problem, it doesn't automatically fix it. We are biologically hardwired to prefer confident people, even when they're wrong. The very knowledge that it's a cognitive limitation doesn't change the fact that we feel better around confident individuals, which means overconfidence isn't going anywhere.