Netflix for Stalkers: When Surveillance Cameras Forget to Lock the Door

Security researcher and YouTuber Benn Jordan has once again pulled back the curtain on Flock Safety, the surveillance company whose camera systems blanket cities across the United States. This time, the vulnerability is breathtaking in its simplicity: dozens of Flock's AI-powered cameras were found streaming live and archived footage directly to the open internet, accessible to anyone with a search engine. No password. No encryption. No barrier whatsoever between the general public and 31 days of continuously recorded surveillance footage.

The comparison to Netflix is Jordan's own, and it lands with uncomfortable precision. The exposed camera feeds were not buried in some dark corner of the internet requiring specialized tools to access. They were indexed by commercial search engines, viewable in a browser, and could even be cast to a television. The footage was not limited to license plates or traffic patterns. Flock's newer Condor cameras use AI-driven pan-tilt-zoom functionality to actively track and zoom in on individual human beings, following them as they move through public spaces.

The Human Cost of Exposed Feeds

What elevates Jordan's reporting beyond a standard security disclosure is his willingness to articulate, in specific and discomforting detail, exactly what this kind of exposure means for ordinary people. He describes watching a family load an infant and merchandise into their car at a Lowe's in North Carolina, a woman jogging alone on a forest trail in Georgia, and a man leaving his house in New York. In each case, the subjects had no idea they were being watched, let alone that their footage was available to anyone on the planet.

Within two minutes of open source intelligence using a commercial facial recognition engine, I found out that one of them just finished medical school and the other is dealing with chronic irritable bowel syndrome. The couple also just had a baby last year and they have a pretty concerning debt to income ratio.

That passage should alarm anyone who has ever walked past a surveillance camera, which is to say, everyone. The ease with which Jordan cross-referenced camera footage with publicly available data to construct a detailed profile of complete strangers illustrates a fundamental truth about modern surveillance: the camera is only the beginning. Combined with facial recognition, license plate databases, public records, and data breach information, a single feed becomes a comprehensive dossier.

Children, Playgrounds, and the Unthinkable

Perhaps the most disturbing disclosure involves a Flock camera near the Bay Area permanently pointed at a playground, broadcasting live and archived footage of unattended children to the open internet. Jordan's visible discomfort in even describing this scenario underscores how catastrophically negligent the security failure was. This is not a theoretical risk assessment or a hypothetical attack vector. It was happening in real time, and some of these feeds remained accessible even after the disclosure.

The implications for child safety are self-evident and require no elaboration. What does require elaboration is the institutional failure that allowed it: a city or municipality purchased a surveillance system marketed as a public safety tool, and that system became, through sheer negligence, a tool for the exact predatory behavior it was ostensibly designed to prevent.

The Hawthorne Effect and the Right to Be Unselfconscious

Jordan's most philosophically interesting contribution to the surveillance debate centers on a man he observed swinging alone in an empty park. The moment clearly affected him, and his analysis of it touches on something that often gets lost in privacy discussions focused on crime, data, and civil liberties.

This is a classic example of the Hawthorne effect, which is a change in an individual's behavior or choices when they know that they're being observed. Surveillance manufacturers love to talk about this effect when it deters crime. But what they don't talk about is how it deters escapism.

The surveillance industry celebrates the chilling effect on criminal behavior as a feature. Jordan argues persuasively that the same chilling effect suppresses the private, unselfconscious moments that are essential to human development and wellbeing. People sing, dance, practice new skills, and process emotions differently when they believe they are unobserved. Mass surveillance does not merely watch people; it reshapes their behavior, flattening the private eccentricities that constitute authentic selfhood.

It imposes on our right to find our own identities without judgment. And as someone whose identity and success is owed to ample amounts of escapism, this is a hill that I am willing to die on.

This is a genuinely novel framing in the surveillance discourse, which tends to orbit around Fourth Amendment jurisprudence, crime statistics, and cost-benefit analyses. Jordan's argument is that surveillance erodes something more fundamental than legal privacy -- it erodes the psychological space necessary for people to become themselves.

Flock Safety's Response and the Credibility Gap

Flock Safety's response to Jordan's earlier research was to claim that the devices in his videos did not reflect the security standards of publicly deployed cameras. The CEO posted on LinkedIn about the company's security policies. Jordan offered to fund independent security research into Flock's deployed systems, asking only for permission to legally test live devices. Flock did not respond.

The discovery of dozens of deployed, publicly accessible camera feeds rather thoroughly undermines the company's reassurances. Jordan reads Flock's own security statement aloud while accessing a live, unprotected feed from a deployed camera:

Flock is committed to continuously improving security. The devices in this YouTube video were not connected to the cloud and to the best of our knowledge, not customer-installed. So, the security is akin to factory setting.

Setting aside the dark comedy of reading that statement while watching a live feed from an exposed deployed camera, the broader pattern is a familiar one in the security industry: a company dismisses vulnerabilities as edge cases or as non-representative, refuses independent verification, and is then publicly contradicted by reality. The credibility gap between Flock's marketing and its actual security posture appears to widen with each disclosure.

The Retaliation Problem

A troubling undercurrent runs through Jordan's reporting. After his initial disclosure video, he was visited by police and observed what he believed to be private investigators photographing his home and questioning his neighbors. His research collaborator, John Gain, lost employment within 48 hours of the video's release. Jordan frames these not as consequences for finding vulnerabilities, but as consequences for disclosing them publicly and ethically.

I'm just apparently the first person stupid enough to walk the expensive legal tightrope of making them public on a large platform.

This is a well-documented pattern in security research. Companies that face public vulnerability disclosures sometimes respond with legal pressure, employment retaliation against researchers, or law enforcement referrals rather than addressing the underlying security failures. The chilling effect on future disclosures is itself a security vulnerability -- if researchers face personal consequences for ethical reporting, the vulnerabilities get found by people with less ethical intentions instead.

A Counterpoint Worth Considering

It is worth noting that no surveillance system, or any networked technology, can guarantee zero vulnerabilities. Flock Safety operates at enormous scale, and the exposed feeds, while inexcusable, may represent a small percentage of total deployments. Municipalities that purchase these systems generally do so because they face real public safety challenges, and license plate reader technology has genuine law enforcement utility in recovering stolen vehicles and locating missing persons.

However, this counterpoint ultimately reinforces rather than undermines Jordan's central argument. If a system is deployed specifically because it is powerful -- because it can track, identify, and surveil at scale -- then the security obligations are correspondingly enormous. The consequence of failure is not a minor data leak; it is the wholesale exposure of an entire community's movements, habits, and private moments to anyone who cares to look. The power of the tool demands a proportional standard of care, and that standard is clearly not being met.

Bottom Line

Benn Jordan's latest Flock Safety disclosure reveals something worse than a security vulnerability. It reveals a structural failure in how American municipalities adopt surveillance technology. Cities are purchasing AI-powered camera systems capable of tracking individual human beings in real time, then relying entirely on the vendor's assurances that the systems are secure. When those assurances prove false, the cameras designed to protect communities become tools that endanger them. Jordan's work, conducted at significant personal cost, forces a question that city councils and county boards across the country need to answer: if the surveillance system you approved is found to have been broadcasting your constituents' lives to the open internet, what exactly did you think you were buying?

Sources

This flock camera leak is like Netflix for stalkers

by Benn Jordan

A few weeks ago, using a commercial search engine, I very easily found the administration interfaces for dozens of Flock Safety cameras. I shared this information with 404 Media, and with John Gain's help, that number quickly grew to nearly 70. None of the data or video footage was encrypted. There was no username or password required.

These were all completely public facing for the world to see, and some of them still are. You don't have to be an expert to find and gain access to this. You don't even have to type anything in to see every single person, vehicle, and activity that took place in these locations in the last 31 days. Whether you wanted to watch this footage live in real time or look at footage from a month ago, you could just point and click your way to it like you were watching Netflix.

You could even open up the live streams in VLC or cast them to a television. Making any modification to the cameras is illegal, so I didn't do this. But I had the ability to delete any of the video footage or evidence by simply pressing a button. I could see the paths where all of the evidence files were located on the file system, and I could see their hashes and signatures.

Some of the devices we saw were the familiar-looking Falcon cameras that you see all over the country, but the majority of these were Flock's new Condor cameras, which are designed to detect and track people. They're PTZ cameras, meaning pan, tilt, zoom. And they quite literally use AI to zoom in and follow you around whether you're a person of interest or not. In just the time that it took to count and verify these vulnerabilities, I saw a family in North Carolina load their infant and a bunch of merchandise in a Lowe's parking lot.

And I suppose one could cross-reference their license plate with the ParkMobile data breach and find out exactly where the garage is that will store these new fancy tools. I watched a man leave his house in the morning in New York. I watched a woman jogging alone on a forest trail in Georgia. This trail had multiple cameras, and I could watch a man rollerblade and then take a break to watch rollerblading videos on his phone.

How? Because ...