
Breaking the creepy AI in police cameras

The Surveillance Machine Hiding in Plain Sight

Benn Jordan's investigation into automated license plate readers (ALPRs) pulls back the curtain on a surveillance apparatus that most Americans drive past without a second thought. At its center sits Flock Safety, a startup valued at nearly $8 billion, which leases AI-powered cameras to police departments, retailers, and homeowners associations across the country. Jordan's core argument is straightforward: these devices amount to warrantless mass tracking of citizens, and the companies behind them are not security providers but data brokers wearing a public safety costume.

The technical operation is deceptively simple. A camera captures an image of a passing vehicle, an image segmentation model isolates the license plate, and an optical character recognition system reads the characters. If the confidence level is high enough, the plate number, timestamp, and location are logged in a database. Officers can then query any plate and see everywhere that vehicle has been spotted. In cities with dense camera coverage, the effect is functionally identical to attaching a GPS tracker to every car on the road.
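The logging-and-query side of that pipeline can be sketched in a few lines. This is a hypothetical illustration, not Flock Safety's actual implementation: the confidence threshold, schema, and function names are all assumptions, and a real system would ingest reads from the OCR stage rather than hand-entered values.

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical sketch of an ALPR sighting database. Reads below a
# confidence threshold are discarded; the rest are logged with a
# timestamp and camera location, and any plate can later be queried
# for its full movement history -- the "GPS tracker" effect.

CONF_THRESHOLD = 0.85  # assumed cutoff; real deployments tune this

def open_db(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS sightings (
        plate TEXT, ts TEXT, lat REAL, lon REAL, confidence REAL)""")
    return db

def log_read(db, plate, confidence, lat, lon):
    """Store one camera read if the OCR confidence clears the threshold."""
    if confidence < CONF_THRESHOLD:
        return False
    ts = datetime.now(timezone.utc).isoformat()
    db.execute("INSERT INTO sightings VALUES (?, ?, ?, ?, ?)",
               (plate, ts, lat, lon, confidence))
    return True

def track_plate(db, plate):
    """Return every logged location for a plate, oldest first."""
    rows = db.execute(
        "SELECT ts, lat, lon FROM sightings WHERE plate = ? ORDER BY ts",
        (plate,))
    return rows.fetchall()

if __name__ == "__main__":
    db = open_db()
    log_read(db, "ABC1234", 0.97, 39.74, -104.99)
    log_read(db, "ABC1234", 0.60, 39.76, -105.02)  # low confidence: dropped
    print(track_plate(db, "ABC1234"))
```

Note that the query interface is the dangerous part: once reads accumulate across a dense camera network, `track_plate` reconstructs a vehicle's movements with no warrant required.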


When the AI Gets It Wrong

Jordan opens with a string of cases where ALPR errors led to armed confrontations with innocent people. A Colorado family held at gunpoint. A New Mexico woman stopped with her 12-year-old sister. In each case, the system misread a plate and flagged the vehicle as stolen. The pattern is consistent and the consequences are severe.

Your tax dollars pay startups to rent AI superpowers to law enforcement, and then the following year, your tax dollars pay the lawsuit settlements when it doesn't work.

The financial structure makes these errors particularly galling. Police departments do not own the cameras. They pay for installation and then rent them at $2,000 to $3,000 per camera per year. Flock Safety's liability agreements protect the company against virtually all claims when the technology fails. The agreements even include a clause telling police departments to call 911 rather than rely on Flock Safety services in an emergency. Taxpayers fund both the surveillance and the settlements when it goes wrong, while the company bears neither the risk nor the accountability.

The Data Broker Business Model

Jordan's most pointed analysis concerns what Flock Safety actually is. The company markets itself as a public safety technology provider, but license plate readers and security cameras have existed for decades. The real product is the data. Cities, police departments, and businesses pay to borrow cameras, and Flock retains the right to make that data accessible to other paying clients.

Andreessen Horowitz, who is the VC of choice for data brokers and big data startups like Databricks, Fiverr, Scale AI, and Golden, just to name a few. They've invested nearly half a billion dollars into Flock Safety.

The numbers tell the story. ADT, the largest home security brand in the world, went public in 1969 and has a market cap of $6.8 billion. Flock Safety's recent valuation eclipses that at nearly $8 billion, despite being founded just nine years ago. These are not security company numbers. These are data broker numbers. And Flock Safety and Andreessen Horowitz have spent a combined $92.68 million on lobbying, the vast majority of it in the past year alone.

The Hot List and Its Abuses

The "hot list" feature allows law enforcement to receive real-time alerts whenever a flagged plate is detected. Jordan frames this as the equivalent of a permanent police tail without the constitutional requirement of a warrant. He raises two scenarios that sound hypothetical but are not.

What is there to stop Texas law enforcement tracking women out of state who are suspected of getting illegal abortions? Or how would you prevent a jealous, abusive person who also happens to be a law enforcement officer from using the system to constantly track his ex-girlfriend and her new partner? Well, unfortunately, you wouldn't prevent it because these are just two of the many examples of things that have actually already happened.

The 2018 Supreme Court ruling in Carpenter v. United States established that prolonged location tracking via cell phone data requires a warrant. The court recognized that timestamp data provides "an intimate window into a person's life, revealing not only his particular movements, but through them his familial, political, professional, religious, and sexual associations." Whether ALPR-based tracking falls under the same constitutional protection has not yet been ruled on, a legal gap that Flock Safety's $92 million lobbying budget seems designed to keep open.

Where Retail Meets Law Enforcement

The investigation extends beyond police use into the retail sector. Jordan walks through Walmart's corporate privacy policy, which collects an extraordinary range of data: personal identifiers, device MAC addresses, age, gender, race, marital status, household income, credit card numbers, geolocation history, photographs, audio and video recordings, criminal background checks, and behavioral inferences drawn from shopping patterns. All of this can be shared with third parties, including law enforcement.

Some Home Depot and Lowe's locations are Flock Safety customers and have shared plate data with law enforcement, including ICE. Jordan draws a direct line between this data-sharing arrangement and immigration raids at those stores, describing how a day laborer picking up construction supplies could trigger a hot list alert that leads to detention and deportation.

Fighting Back with Adversarial Noise

The second half of the video shifts from diagnosis to experimentation. Jordan builds his own ALPR system using a Raspberry Pi 5, a camera module, and a YOLO computer vision model, all for roughly $250. He then develops adversarial noise patterns, subtle visual perturbations applied to license plates that confuse the AI models while remaining largely invisible to human eyes.

The approach borrows from his earlier work on audio adversarial noise, which successfully confused AI music classifiers. For license plates, he generates thousands of test images, classifies the results by whether the plate was detected and read correctly, and uses that data to train a model that produces increasingly effective jamming patterns. The real-world tests show promising results: several noise patterns caused commercial ALPR systems to either misread the plate or fail to detect it entirely.
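The generate-test-score loop Jordan describes can be caricatured with a toy black-box search. Everything here is a stand-in: `toy_reader` is not a real ALPR model, the "plate" is a short list of intensities, and a plain random search substitutes for the model Jordan trains on his scored results. The sketch only illustrates the shape of the loop, which is to propose small perturbations, score each by whether the reader still succeeds, and keep the ones that break it.

```python
import random

# Toy black-box adversarial search. A real attack would render noise
# patterns onto plate images and score them against a commercial ALPR
# model; this stand-in keeps the same propose/score/keep structure.

PLATE = [5, 5, 5, 5, 5, 5, 5]  # crude stand-in for plate pixel values

def toy_reader(pixels):
    """Stand-in 'OCR': succeeds only if every value stays near the original."""
    return all(abs(p - 5) <= 2 for p in pixels)

def apply_noise(pixels, noise):
    return [p + n for p, n in zip(pixels, noise)]

def search_patterns(trials=500, budget=3, seed=0):
    """Random-search for small perturbations that defeat the reader.

    `budget` caps per-element amplitude, mimicking the requirement that
    the noise stay largely invisible to human eyes."""
    rng = random.Random(seed)
    winners = []
    for _ in range(trials):
        noise = [rng.randint(-budget, budget) for _ in PLATE]
        if not toy_reader(apply_noise(PLATE, noise)):
            winners.append(noise)
    return winners

if __name__ == "__main__":
    found = search_patterns()
    print(f"{len(found)} jamming patterns found out of 500 candidates")
```

Jordan's actual approach replaces the random proposal step with a trained generator, which is what makes the patterns "increasingly effective" rather than merely lucky.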

Counterpoints Worth Considering

Jordan acknowledges that ALPR technology has legitimate uses. Missing persons with dementia have been located. Homicide suspects have been apprehended. He explicitly states he has no problem with police using license plate readers mounted on their own vehicles or with officers being able to identify plates in the course of their duties. His objection is to the mass, warrantless, continuous tracking of every vehicle by private companies that sell the data onward.

There is also a practical counterargument he does not fully address: if adversarial noise becomes widespread, it could undermine legitimate law enforcement investigations alongside the surveillance overreach. The same technology that protects a privacy-conscious commuter also protects a hit-and-run driver. Jordan seems aware of this tension but sidesteps it, focusing instead on the principle that the legal framework should be fixed rather than leaving citizens to engineer their own defenses.

It is also worth noting that the European Union's General Data Protection Regulation provides a working model for how data privacy can coexist with public safety. As Jordan points out, in countries with strong data protection laws, there is no reasonable need to disrupt license plate readers because the broader data brokerage ecosystem that makes them dangerous simply does not exist in the same form.

Bottom Line

Jordan's investigation is at its strongest when it follows the money. A startup valued at $8 billion, backed by nearly half a billion in venture capital, spending $92 million on lobbying, operating in a legal gray zone left open by an untested constitutional question: this is not a public safety story. It is a story about a data brokerage industry that has found a way to make taxpayers fund the collection of their own surveillance data. The adversarial noise experiments are technically interesting but secondary to the larger point: the absence of meaningful federal data privacy legislation has created a vacuum that private companies are filling with a surveillance infrastructure that no voter approved and no court has fully examined. Until that changes, the DIY approach Jordan demonstrates may be the only defense available, though he wisely cautions against actually using it on public roads.

Sources

Breaking the creepy AI in police cameras

by Benn Jordan

If you're an American, you've probably been seeing a whole bunch of these things. And in some places, they're so common that you don't even notice them. They just blend into the background, as if they're trees or street lights. And you've probably correctly assumed that they're recording traffic.

They're also recording and logging license plates and using AI image recognition. But what if I told you that they are in fact not owned by your local police department or your local government, but are licensed to them by a third-party startup, and all of your vehicle's whereabouts are being tracked by a third-party data broker? What if I also told you that major retail chains are also using them and they're combining your vehicle's whereabouts with your personal information, your shopping habits, and even your in-store behavior? And some of them are giving that information to law enforcement.

And what if I told you that I just possibly may have come up with a way to break it? This Colorado family was being driven to a shopping center in a stolen vehicle, and they would have gotten away with it without cutting-edge technology alerting police of their crime. >> Nah, that was actually an OCR error. Well, this New Mexico woman and her 12-year-old sister were driving to the park with a stolen license plate when AI-enhanced technology allowed police to >> That one was another computer vision error.

>> Well, yet another car thieving woman of color caught red-handed. >> Nope, just a glitch. This situation keeps happening all over the country. Your tax dollars pay startups to rent AI superpowers to law enforcement, and then the following year, your tax dollars pay the lawsuit settlements when it doesn't work.

Flock Safety is a startup that was founded in 2017 that specializes in developing and leasing security cameras that have AI capabilities such as license plate recognition and vehicle identification. And these security cameras feed into databases that law enforcement, private companies, and even private citizens can access and utilize. And if you own a car in the United States, you have unquestionably been logged within one of these databases. It works like this.

So, you drive past a Flock Safety camera and it records an image or video. An image segmentation model or something similar looks for the license plate itself or a rectangle with ...