

# Stop Comparing Safety and Cybersecurity, They Have Very Little in Common

Ross Haleliuk opens with a haunting analogy from Dug Song, the founder of Duo Security. In aviation, a plane crashes the same way only once. Black boxes reveal the failure, designs change, and that crash never happens again. In cybersecurity, organizations get breached the same way repeatedly. The same company, the same reason. Song called it Groundhog Day in the worst possible sense—a hamster wheel where we relive the same incidents without actually improving.


Haleliuk agrees with the pain but rejects the comparison. Safety and security, he argues, are fundamentally different problems. The seatbelt analogy that security leaders love to cite simply doesn't hold.

## Safety Addresses a Stable Problem. Security Doesn't.

Seatbelts solve a fixed equation. Gravity doesn't learn. Physics doesn't adapt. Engineers account for speed, impact angle, passenger position, vehicle height—a finite set of variables. Once tested, the solution works across millions of identical scenarios.

Cybersecurity faces something entirely different. Attackers study your controls, adapt their tactics, and hunt for gaps. Ross Haleliuk writes, "Basically, wearing a seatbelt won't trigger a new kind of crash, but a security control often does trigger a new attack path."

His friend Luigi Lenguito, founder of BforeAI, captures the distinction sharply. Safety trends toward zero accidents as root causes are fixed. Security cannot trend to zero because adversaries are motivated to keep causing harm; the environment is not controlled.

"It doesn't make sense to draw parallels between safety and security, and it's time to put the seatbelts analogy to rest."

## What's at Stake: Life Versus Nuisance

Plane crashes devastate families. Car accidents kill people. Safety measures protect what society agrees is our most precious gift.

Most cybersecurity breaches, Haleliuk notes, are financially motivated. The average person experiences them as inconvenience: spam emails, adware pop-ups, fraudulent charges that banks reverse. Even massive breaches like Uber or Equifax feel distant—breach notification emails and free identity-monitoring offers that most people don't understand.

Ross Haleliuk writes, "Basically, to most individuals cybersecurity breaches are a nuisance, and to most businesses they are, well… the cost of doing business."

He acknowledges exceptions. Ransomware attacks on hospitals have harmed patients. Families have lost decades of savings. Private details leaked can cause irreparable harm. But these remain anomalies in public perception, not the expected consequence of a breach.

Critics might note this framing underestimates how cybersecurity failures cascade into physical harm. Power grids, water systems, medical devices—all increasingly connected. The distinction between digital and physical risk may narrow faster than Haleliuk suggests.

## Standardization Works for Safety. It Fails for Security.

A seatbelt is a seatbelt. No configuration. No context. It works the same way every time. Airbags, guardrails, child safety locks—all passive, predictable, effective without ongoing mental effort.

Security controls are deeply contextual. Every deployment depends on customer environment, configuration, business processes, data flows. Controls interact unpredictably. A single gap causes cascading failures. Ross Haleliuk writes, "Implementing a security tool the wrong way can (and does) itself lead to a breach, making it easier for attackers to get in (imagine if wearing a seatbelt increased the likelihood of a car crash)."

Standardization itself becomes a vulnerability. When thousands of companies standardize on the same firewall or VPN tool, a single exploit breaches them all simultaneously. Haleliuk points to Cisco and other network device vendors—thousands of organizations compromised at once because they chose the same protection.

As Ivano Bongiovanni noted in a LinkedIn thread Haleliuk references, the core distinction is human intent. Safety: no malicious intent. Security: malicious intent. A system can be safe but not secure, and vice versa.

## The Magic Answer Doesn't Exist

Haleliuk closes with sober realism. We cannot repurpose the safety model for cybersecurity. Borrowing paradigms from slow-moving industrial manufacturing won't work for fast-paced technology defense.

Ross Haleliuk writes, "I think it's just about time to accept that the best we can do is to continue maturing our defenses so that we can consistently make it harder and harder for the adversaries to achieve their goals. There is no other magic answer."

Critics might argue this accepts defeat prematurely. Safety achieved dramatic improvements through regulation, standardization, and cultural shift. Why assume security cannot do the same? The Secure by Design pledge Haleliuk mentions—CISA's comparison to automotive and aviation initiatives—suggests some leaders still believe borrowed paradigms can work.

## Bottom Line

Haleliuk's argument cuts through security's favorite analogies with uncomfortable clarity. Seatbelts work because physics doesn't learn. Security fails because attackers do. The comparison isn't just wrong—it's dangerous, luring leaders into solutions that cannot work. The verdict: stop borrowing from safety. Build defenses that assume adaptation, context, and relentless opposition. There is no seatbelt for the internet.


## Sources

Stop comparing safety and cybersecurity, they have very little in common

by Ross Haleliuk · Venture in Security

Nearly a year ago, we hosted Dug Song, the legendary founder of Duo Security, on Inside the Network. During that conversation, Dug shared a powerful analogy that has stuck with me. He explained that in aviation, a plane crashes the same way only once, or maybe twice. Whenever it happens, we get to the bottom of the failure by analyzing black boxes, and then the entire systems and plane designs change to prevent the same failure from ever happening again. In security, it’s a different story. Organizations get breached the same way over and over, and oftentimes the same company gets breached for the same reason many times. Dug described this as a “Groundhog Day in the worst possible sense”, a hamster wheel of pain where we’re not actually getting better, just reliving the same incidents again and again.


I think most of us feel Dug’s pain, and, unfortunately, we have to go through this Groundhog Day what feels like every single week. I most definitely agree with the sentiment Dug expressed, but I don’t agree with the analogy. It took me a while to realize why that is the case, so in this issue, I am talking about reasons why it doesn’t make any sense to draw parallels between safety and security.

The well-loved seatbelts analogy is sadly not relevant.

People in security absolutely love to bring up the story of how seatbelts redefined road safety (if you’re not sure what I am talking about, here’s an example of how CISA compares its Secure by Design Pledge to the initiatives in the automotive and aviation industries).

The story ...