Simon Pearce's piece, "Strategic blindness," challenges the comforting myth that governments are caught off guard by surprise attacks. Its most provocative claim is that intelligence agencies frequently do their jobs correctly, but leaders deliberately ignore the warnings because they prioritize political optics over material reality. This distinction matters for anyone trying to understand why modern states repeatedly stumble into catastrophic failures.
The Myth of Intelligence Failure
Pearce opens by dismantling the convenient narrative that bad surprises are always the result of faulty data. He points to historical precedents like Pearl Harbor and the fall of Kabul, noting that in each case, credible warnings existed but were dismissed. "When states suffer major strategic shocks... official postmortems tend to blame the debacle on an 'intelligence failure.' This explanation is usually incorrect," Pearce writes. The author argues that politicians often find it more expedient to blame the spies than to admit they made a conscious choice to ignore the evidence. This framing is powerful because it shifts the blame from the analysts in the basement to the decision-makers in the Oval Office.
The article suggests that the root cause is often fear of the political fallout from acting on intelligence. Leaders worry that preemptive action will be seen as provocation, or that acknowledging a threat will look like panic. As Pearce puts it, "Their primary focus tends not to be material ('What will happen?') but narrative ('How will it look?')." This obsession with optics, amplified by the 24-hour news cycle, creates a dangerous disconnect between what leaders know and what they do. Critics might argue that sometimes the intelligence is genuinely ambiguous, but Pearce's historical evidence suggests that the ambiguity is often manufactured by leaders who refuse to grapple with worst-case scenarios.
The missing knowledge is self-knowledge. Even as intelligence agencies get better at collecting and analyzing data, policymakers get worse at understanding what to do with it.
The Tragedy of France
The piece then pivots to a deep dive into the fall of France in 1940, using it as a cautionary tale of institutional decay. Pearce details how France possessed superior tanks and a massive army, yet collapsed in weeks. The failure wasn't a lack of equipment; it was a lack of imagination and a refusal to adapt. "French doctrine had two overarching priorities: avoiding casualties and preventing escalation," Pearce explains. The French military command was so traumatized by the slaughter of World War I that they built a strategy designed to never fight a war again, effectively guaranteeing their defeat when war arrived.
The author highlights the tragic irony that the French ignored the very tactics that would destroy them. They had access to information about German armored warfare but chose to discount it because it didn't fit their static worldview. "Intelligence that contradicted this assumption was discounted," Pearce notes. This cognitive freeze meant that when the German Blitzkrieg arrived, the French high command was paralyzed. "It was not cowardice that caused French generals and politicians to crack and surrender so quickly, but the total failure of their Plan A, and the absence of a Plan B." The human cost of this rigidity was immense, with the civilian population and the army paying the price for a leadership that could not conceive of a different kind of war.
The Cost of Institutional Blindness
The article concludes by connecting these historical failures to the present, warning that the same patterns of denial persist. Pearce argues that the danger today lies not in external threats, but in the internal inability of institutions to process information. "The worst failures occur when political leaders commit to a course of action before integrating new intelligence," he writes. The example of the 2021 evacuation from Kabul illustrates this perfectly, where domestic political calculations led to a chaotic exit that endangered countless lives. The author suggests that until voters and leaders prioritize competence over narrative, these cycles of failure will continue.
We ignore these lessons at our peril.
Bottom Line
Pearce's analysis is a sobering reminder that the greatest threat to national security is often the refusal to see reality clearly. The strongest part of the argument is its relentless focus on the human and institutional costs of ignoring intelligence, moving beyond abstract strategy to the real-world consequences of political cowardice. However, the piece could have explored more deeply how democratic accountability mechanisms might be reformed to prevent such blindness, rather than merely diagnosing the problem. Readers should watch how current leaders respond to emerging threats, and whether they are willing to act on uncomfortable truths or will once again choose the path of least resistance.