Strategic blindness

Simon Pearce's piece, "Strategic blindness," challenges the comforting myth that governments suffer strategic shocks because they are caught off guard. The article's most provocative claim is that intelligence agencies frequently do their jobs correctly, but leaders deliberately ignore the warnings because they prioritize political optics over material reality. This is a crucial distinction for anyone trying to understand why modern states repeatedly stumble into catastrophic failures.

The Myth of Intelligence Failure

Pearce opens by dismantling the convenient narrative that bad surprises are always the result of faulty data. He points to historical precedents like Pearl Harbor and the fall of Kabul, noting that in each case credible warnings existed but were dismissed. "When states suffer major strategic shocks... official postmortems tend to blame the debacle on an 'intelligence failure.' This explanation is usually incorrect," Pearce writes. The author argues that politicians often find it more expedient to blame the spies than to admit they made a conscious choice to ignore the evidence. This framing is powerful because it shifts the blame from the analysts in the basement to the decision-makers in the Oval Office.

Strategic blindness

The article suggests that the root cause is often fear of the political fallout from acting on intelligence. Leaders worry that preemptive action will be seen as provocation, or that acknowledging a threat will look like panic. As Pearce puts it, "Their primary focus tends not to be material ('What will happen?') but narrative ('How will it look?')." This obsession with optics, amplified by the 24-hour news cycle, creates a dangerous disconnect between what leaders know and what they do. Critics might argue that sometimes the intelligence is genuinely ambiguous, but Pearce's historical evidence suggests that the ambiguity is often manufactured by leaders who refuse to grapple with worst-case scenarios.

The missing knowledge is self-knowledge. Even as intelligence agencies get better at collecting and analyzing data, policymakers get worse at understanding what to do with it.

The Tragedy of France

The piece then pivots to a deep dive into the fall of France in 1940, using it as a cautionary tale of institutional decay. Pearce details how France possessed superior tanks and a massive army, yet collapsed in weeks. The failure wasn't a lack of equipment; it was a lack of imagination and a refusal to adapt. "French doctrine had two overarching priorities: avoiding casualties and preventing escalation," Pearce explains. The French military command was so traumatized by the slaughter of World War I that they built a strategy designed to never fight a war again, effectively guaranteeing their defeat when war arrived.

The author highlights the tragic irony that the French ignored the very tactics that would destroy them. They had access to information about German armored warfare but chose to discount it because it didn't fit their static worldview. "Intelligence that contradicted this assumption was discounted," Pearce notes. This cognitive freeze meant that when the German Blitzkrieg arrived, the French high command was paralyzed. "It was not cowardice that caused French generals and politicians to crack and surrender so quickly, but the total failure of their Plan A, and the absence of a Plan B." The human cost of this rigidity was immense, with the civilian population and the army paying the price for a leadership that could not conceive of a different kind of war.

The Cost of Institutional Blindness

The article concludes by connecting these historical failures to the present, warning that the same patterns of denial persist. Pearce argues that the danger today lies not in external threats, but in the internal inability of institutions to process information. "The worst failures occur when political leaders commit to a course of action before integrating new intelligence," he writes. The example of the 2021 evacuation from Kabul illustrates this perfectly, where domestic political calculations led to a chaotic exit that endangered countless lives. The author suggests that until voters and leaders prioritize competence over narrative, these cycles of failure will continue.

We ignore these patterns at our peril.

Bottom Line

Pearce's analysis is a sobering reminder that the greatest threat to national security is often the refusal to see reality clearly. The strongest part of the argument is its relentless focus on the human and institutional costs of ignoring intelligence, moving beyond abstract strategy to the real-world consequences of political cowardice. However, the piece could have explored more deeply how democratic accountability mechanisms might be reformed to prevent such blindness, rather than just diagnosing the problem. Readers should watch for how current leaders respond to emerging threats, and whether they are willing to act on uncomfortable truths or will once again choose the path of least resistance.

Sources

Strategic blindness

By Simon Pearce

ARE ALL BAD SURPRISES INTELLIGENCE FAILURES?

When states suffer major strategic shocks—think of the United States, for example, when Pearl Harbor was attacked; or the USSR when Germany invaded in 1941; or the US again, on 9/11 and when Kabul fell—official postmortems tend to blame the debacle on an “intelligence failure.”

This explanation is usually incorrect.

In each of the cases above, intelligence agencies produced credible warnings of the impending catastrophe. Political and military leaders received those warnings, but chose not to act. As intelligence officials have been known to lament, it’s awfully convenient for politicians to attribute their policy failures to faulty intelligence—but it’s not so convenient for the intelligence officials.

Hamas’s attack on Israel on October 7, 2023, fits this pattern perfectly. Israel’s political and military leadership were in possession of detailed intelligence indicating that Hamas was preparing a major attack. Leaders even discussed plausible worst-case scenarios. But they chose to do nothing, because they couldn’t bring themselves to believe that Hamas would truly act on its plans.

WHY LEADERS FAIL TO USE INTELLIGENCE.

Many political leaders have unrealistic expectations about how much certainty intelligence can provide. Intelligence agencies rarely say, “Event X will definitely happen on Date Y.” Political leaders and their military advisors must synthesize reports about capabilities and probabilities into effective plans of action.

Fear of international condemnation can be an impediment to aggressive preemptive action. Leaders fear they will be blamed for provoking precisely the scenario described in the intelligence warnings.

The worst failures occur when political leaders commit to a course of action before integrating new intelligence. Veteran CIA analyst John Gentry, for example, in an article titled “Intelligence Failure Reframed,” writes that CIA director Richard Helms failed to warn Lyndon Johnson that the war in Vietnam was going badly, and later quashed a CIA report warning against the 1970 invasion of Cambodia. He quashed it not because the CIA lacked relevant data and insight, but because Nixon had already decided to invade. The CIA did its job: it gathered and interpreted information. But by making it clear that this information would be unwelcome, the leadership failed.

In the run-up to the Iraq War, Donald Rumsfeld warned of “unknown unknowns.” These days, the most dangerous unknown unknowns are not external. They’re within our own institutions and systems—and we don’t understand them well enough to address them. People within these institutions may understand them, ...