Introduction to cognitive bias: Crash course scientific thinking #1

In an era where algorithms amplify fear and social media feeds curate our realities, Hank Green of Crash Course delivers a vital correction: our brains are not broken; they are just optimized for the wrong era. The most striking claim isn't that we are irrational, but that our greatest survival tool—pattern recognition—is the very engine of our modern delusions. Green argues that science is not merely a collection of facts, but a deliberate, communal system designed specifically to outsmart the biological shortcuts that keep us safe from hot stoves but blind to statistical reality.

The Evolutionary Trap

Green begins by dismantling the notion that believing the Earth was the center of the universe was a failure of intelligence. Instead, he frames it as a triumph of pattern matching. "Our brains are very good at finding patterns," Green writes, noting that this skill helped ancestors spot predators and identify poisonous plants. This evolutionary advantage, however, has a dark side in the modern world. The author explains that we rely on "mental shortcuts or heuristics" to avoid expending brain power on every decision, a strategy that works perfectly for avoiding a hot stove but fails catastrophically when assessing complex risks.

The commentary here is sharp because it removes the stigma of error. Green posits that cognitive bias is not a character flaw but a feature of our operating system. "Cognitive biases happen unconsciously," he clarifies, distinguishing them from explicit prejudice. This distinction is crucial; it suggests that even the most well-intentioned people are susceptible to these traps. Critics might note that by normalizing bias so heavily, the piece risks letting individuals off the hook for failing to verify information, but Green quickly pivots to the solution: awareness.

The Availability of Fear

The piece shines when connecting abstract theory to immediate, visceral anxiety. Green uses the example of a 2025 mid-air collision to illustrate availability bias. "In early 2025, after a mid-air collision between a commercial airplane and a military helicopter, people started paying a lot of attention to every near miss," he recounts. The result was a collective feeling that air travel was becoming deadly, despite data showing accident rates remained flat.

"When people make judgments based on the information that's easily available, we call this, wait for it, availability bias."

This is the piece's most effective demonstration of the concept. Green points out the absurdity of the situation: "You're far more likely to be in a car crash than a plane crash," yet the sheer volume of media coverage makes the plane crash feel more probable. The argument lands because it exposes how the modern news cycle, driven by algorithms, weaponizes our evolutionary wiring. We are not reacting to the world as it is, but to the world as it is presented to us. The author effectively argues that our intuition is a liar when the data is invisible.

The Myth of the Visual Learner

Moving from risk perception to education, Green tackles the pervasive myth of learning styles. He highlights a startling statistic: "More than 90% of participants said that people learn better when they're taught using the learning style that best suits them." Yet, as Green notes, "There is no scientific evidence to support the idea of personalized learning styles." This persistence is a textbook case of confirmation bias, where educators and students alike filter out contradictory evidence to maintain a comforting narrative about how they learn.

The author's analogy here is particularly biting. He suggests that someone who believes in learning styles might reject evidence against them "just like I might and sometimes do reject evidence that the butt is not part of the legs." This self-deprecating humor serves a serious purpose: it models the humility required to accept scientific consensus over personal intuition. It is a rare moment in media where the host admits their own fallibility to make a broader point about the necessity of evidence.

The Scientific Antidote

If bias is the disease, Green presents the scientific method not as a dry academic exercise, but as a social technology for truth. He introduces Sage the Bad Naturalist to explain how science is designed to catch these errors. "The process of science is actually designed to overcome biases from methods to reliance on evidence and especially the fact that science is communal," Sage explains. The argument shifts from individual cognition to collective verification.

Green emphasizes that even scientists are not immune. "That's why we say everyone has cognitive bias, even scientists," he states, leading into a discussion of randomized controlled trials. These trials, with their control groups and double-blind protocols, are described as a "multi-step process" specifically engineered to neutralize the very human tendency to see what we want to see. The commentary here is vital: it reframes science from a body of knowledge into a system of checks and balances. Without this communal vetting, Green argues, we are left with "one guy in a bathtub shouting 'Eureka!'" which, while dramatic, is rarely reliable.

"Anybody who says they don't have any biases is just waving a huge red flag."

This line serves as the ultimate litmus test for intellectual honesty. Green suggests that the first step to overcoming bias is simply admitting it exists. He advocates for "cognitive flexibility," the ability to say, "Maybe I was wrong." In a cultural landscape that often rewards stubbornness, this is a radical and necessary prescription. The author argues that interacting with diverse people is a practical application of this, as bias "likes to tell us that our experience is the only reality."

Bottom Line

Crash Course's argument is strongest in its reframing of cognitive bias not as a personal failing but as an evolutionary mismatch that requires a systemic solution. The piece's greatest vulnerability is its reliance on the assumption that people will actually engage in the difficult work of cognitive flexibility once they are aware of the problem. However, the core message remains undeniable: our gut feelings are unreliable narrators, and science is the only tool we have to verify the story. The next time you feel certain about a news story or a personal belief, the author urges you to remember that your brain is likely just copy-pasting an old story onto new information.

Sources

Introduction to cognitive bias: Crash course scientific thinking #1

by Crash Course

2,000 years ago, people looked up at the sky and they saw that everything up there seemed to move. So, naturally, the Earth was staying still and everything else was rotating around us. It was a story that just made sense to millions of people and it stuck around well into the 16th century. If I'd been alive then, I would have believed this story.

Even today, I feel like I must be at the center of something. That idea, of course, was wrong, but we didn't believe it forever. We found ways to step out of our old stories and find something much more interesting. Hi, I'm Hank Green and this is Crash Course Scientific Thinking.

Science. It is a never-ending quest for knowledge, a way of interrogating our universe to figure out how it works, a tool to guide us when our intuition isn't enough. And also, it can be quite fun. Sometimes you get to blow stuff up.

In the years since Copernicus put forth the theory that the Earth revolves around the Sun, we've learned that some questions are just too big, too complex, or too bizarre to trust our gut with. When we rely on intuition alone to answer those big, complicated questions, our brains fall prey to cognitive biases, predictable weaknesses in the way we've evolved to think. Our brains are very good at finding patterns. We've evolved this skill because it's super helpful for survival.

It helped our ancestors spot the telltale signs of predators and recognize when certain plants might be poisonous. We have always paid attention to and learned from our world. Those pattern recognition skills have also been linked to some very special human qualities, like our ability to imagine and invent. Like, it's what made me notice that "Hank" and "anglerfish" sound vaguely similar so that I could invent something called the Hanklerfish.

Bad puns are still good pattern recognition. It's also why we are so good at telling stories, because really, that's all a story is: a recognizable pattern of information. And more importantly, our highly evolved pattern recognition skills allow our brains to apply mental shortcuts, or heuristics, that help us solve simple problems quickly and make life liveable. They are the brain's way of copy-pasting stories we already have onto new information so that we don't expend a bunch of brain power in ...