Hacking and jamming Flock cameras in Florida will now send you to jail

Image Credit: Next 9NEWS / Benn Jordan


Repair and data-recovery YouTuber Louis Rossmann says Florida has quietly moved the fight over Flock Safety cameras from the internet into the criminal code.

In a recent video, Rossmann walks through a new Florida law that makes tampering with your license plate – specifically in ways that confuse AI surveillance cameras – a jailable offense.

He ties it directly to the rise of Flock Safety automated license plate readers, the same cameras privacy researchers and journalists like Spencer Soicher and Benn Jordan say are both intrusive and surprisingly easy to hack.

Put simply:

If you “jam” or trick the cameras in Florida, the state now treats you a lot like a drunk driver.

Florida’s New Law: No More “Captchas” For Your Plate

Rossmann tells his viewers the new law bans “any substance, reflective matter, illuminated device, spray coating, covering, or other material” that interferes with the legibility or detectability of a license plate – or with the ability of a device to record it.

That wording is important. It’s not just about whether a human can read your tag.

Image Credit: Louis Rossmann

According to reporting on the statute, intentional plate obstruction in Florida is now a second-degree misdemeanor, with penalties of up to 60 days in jail and a $500 fine, and harsher penalties if the obscured plate is tied to another crime. 

Manufacturing, selling, or even just possessing certain plate-obscuring devices can bump you into higher misdemeanor territory.

Rossmann argues that this isn’t written like a simple “don’t cover your tag with mud” rule.

It’s written like a “make sure the AI can see you at all times” rule.

Rossmann: This Is About AI Cameras, Not Cops’ Eyes

In his video, Rossmann points straight at earlier work by Benn Jordan, who showed how simple patterns and a thin film over a plate could confuse the AI in Flock cameras while leaving the numbers perfectly clear to human beings.

Jordan’s demo, as Rossmann explains, used basic shapes on a plastic-like layer.

To a cop standing behind the car, the plate was totally readable.

To the AI system feeding a massive license-plate database, it suddenly became “no plate detected.”

Rossmann’s point is that Florida’s new law doesn’t care whether a person can read your plate.

It cares whether the machine can.
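Jordan's actual demo isn't public code, but the brittleness Rossmann describes (a machine reader failing on a tiny change that a human barely notices) can be sketched with a toy exact-match reader. The glyphs and matching rule below are hypothetical stand-ins for illustration only, not Flock's real pipeline:

```python
# Toy illustration (NOT Flock's actual system): a brittle exact-match
# "reader" rejects an image after a tiny overlay, while nearly every
# pixel a human would use to read it is unchanged.

# 5x3 binary glyphs standing in for plate-character templates.
GLYPHS = {
    "A": ["010", "101", "111", "101", "101"],
    "B": ["110", "101", "110", "101", "110"],
}

def render(text):
    """Render a string of glyphs into rows of a binary 'image'."""
    return ["".join(GLYPHS[c][r] for c in text) for r in range(5)]

def read_plate(image_rows, width=3):
    """Exact template matching: any single changed pixel breaks it."""
    chars = []
    for start in range(0, len(image_rows[0]), width):
        patch = [row[start:start + width] for row in image_rows]
        for ch, glyph in GLYPHS.items():
            if patch == glyph:
                chars.append(ch)
                break
        else:
            return None  # "no plate detected"
    return "".join(chars)

clean = render("AB")
assert read_plate(clean) == "AB"

# Flip a single pixel (1 of 30); a human still plainly reads "AB".
tampered = list(clean)
tampered[0] = "1" + tampered[0][1:]
assert read_plate(tampered) is None  # the machine sees nothing
```

Real ALPR models are far more robust than an exact matcher, but the asymmetry is the same: a pattern tuned to the model's blind spots can zero out detection while leaving human legibility intact.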

He even compares the punishment range to a DUI, arguing that someone who can be identified by any normal camera, witness, or dashcam is now being criminalized just because they interfered with one specific type of surveillance.

From his perspective, that’s a big shift: the state isn’t just enforcing ID rules on the road anymore – it’s enforcing the needs of a specific technology.

When Mud Looks Like a Crime

Rossmann also raises a practical problem: intent.

The law bans applying a “substance” that interferes with detectability. But what happens when the pattern that confuses an AI reader shows up by accident?

Image Credit: Next 9NEWS

He notes that if you go off-road, or even just park under a messy tree, your plate can easily end up more obscured than Jordan’s carefully drawn pattern.

If AI “sees” that as non-readable and an officer doesn’t like you, Rossmann worries the law can be used as a pretext.

He’s especially bothered by the idea that someone trying to avoid constant AI tracking could be punished like a drunk driver.

In his view, hiding from a dragnet is not the same thing as hiding from every human driver and police officer around you.

Whether you agree or not, Rossmann is clearly framing this as a Fourth Amendment problem.

He even cites Supreme Court language (like Carpenter v. United States) to argue that turning roads into 24/7 automated location logs is exactly the kind of “near-perfect surveillance” the Court has warned about.

Meanwhile, Researchers Say Flock Is “Extraordinarily Easy” To Hack

While Florida is threatening jail over tricks that confuse Flock cameras, Spencer Soicher at Next 9NEWS has been covering a very different angle: how fragile the system itself appears to be.

In his TV report, Soicher talks to cybersecurity researcher John Gaines, who says he bought Flock hardware on eBay and started probing it for weaknesses.

Image Credit: Next 9NEWS

Gaines tells Soicher he was able to take control of a Flock device in under 30 seconds.

He describes it as “extraordinarily easy — like 12-year-olds could do it.”

According to Soicher’s report, Gaines could:

  • Connect to the device and plant data on it.
  • Pull footage off the camera, including images from the factory where it was built.
  • In a worst-case scenario, insert fake imagery that would generate a “hot list” hit and send officers to the wrong person’s door.

Soicher explains that Gaines found dozens of vulnerabilities and documented them in a white paper.

Those same flaws later became the backbone of a viral YouTube breakdown by Benn Jordan.

Jordan, whom Soicher also cites, points to what he calls the most “egregious” problem: Flock didn’t require multi-factor authentication for all police users.

The same security step you use to log into your email wasn’t universally mandatory for a system tracking millions of cars.
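That "security step" is usually a time-based one-time password (TOTP, RFC 6238), which is simple enough to fit in a few lines, which is part of what makes skipping it so striking. This is the generic standard algorithm, not anything Flock-specific:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, step=30, digits=6):
    """RFC 6238 time-based one-time password (the common second factor).

    secret_b32: base32-encoded shared secret (what the QR code enrolls).
    """
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # RFC 4226 dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890",
# T = 59 seconds -> 8-digit code 94287082, so 6 digits is "287082".
assert totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59) == "287082"
```

A login protected this way needs both the password and the current code, so a credential dump alone (like the dark-web logins discussed below) is no longer enough.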

Dark Web Logins, Fake Footage, Real Consequences

In Soicher’s segment, Jordan says he found police Flock logins for sale on the dark web, though he did not buy them for legal reasons.

He tells viewers this is a “national security risk,” because once a bad actor is inside the system, they could plant or manipulate evidence that officers treat as gospel.

Soicher underscores that concern by pointing to a recent Colorado case where police insisted they had video “dead to rights” on a suspect – and the accused person had to fight hard to prove the footage was wrong.

Now add in Gaines’ demonstration, where he shows he can wirelessly connect to Flock’s compute box and inject video or images.

Soicher spells out the nightmare scenario: a system that cities trust to place you at a specific time and location can be quietly altered, with minimal skill and cheap hardware.

To be fair, Soicher also reports Flock’s response.

The company says 97% of its customers now use multi-factor authentication and that new users have been required to turn it on since late 2024. 

But that still leaves some agencies choosing weaker security on a platform that tracks “the location of everybody,” as Jordan bluntly puts it.

Jam The Camera, Go To Jail – Leave It Weak, No Problem

This is where the Florida law and the Flock research collide in a way that feels upside-down.

Image Credit: Next 9NEWS

On one side, as Rossmann describes it, Florida is threatening ordinary drivers with jail time if they use patterns or coatings that interfere with AI recognition – even when humans can still read the plate just fine. 

On the other side, Soicher’s reporting shows that the same surveillance infrastructure can be:

  • Hacked in seconds,
  • Accessed through stolen logins, and
  • Potentially fed fake video that officers may treat as unquestionable evidence.

If you, as a citizen, try to reduce how much of your life lands in that system, Florida now treats you as the problem.

If the system itself is shipped with outdated software, lax defaults, or weak authentication, the consequences primarily fall on… no one in particular.

That imbalance is hard to ignore.

The Bigger Fight: Privacy, Surveillance, And Who Gets Punished

Image Credit: Next 9NEWS

Rossmann makes it clear he’s not defending people who rip plates off their cars to hit-and-run with total anonymity.

He says when someone crashes into him and flees, he wants a readable plate and accountability.

What he’s pushing back on is the idea that every car must be perfectly machine-readable all the time, for the benefit of a growing network of private surveillance vendors and government databases.

Soicher’s reporting adds a second layer to that debate.

If police, prosecutors, and judges are going to treat Flock data as solid evidence, the public has every reason to demand that the hardware be hardened, the logins locked down, and the footage protected from tampering.

Instead, what we’re seeing right now is a legal environment where:

  • The citizen who tries to dodge or confuse the system can be jailed.
  • The system that can be hacked, spoofed, or misused gets to catch up at its own pace.

From a civil-liberties standpoint, that’s backwards.

If the state is going to insist you be constantly scannable, it should at least guarantee that the scanner is secure, transparent, and tightly audited.

Until that happens, stories like Rossmann’s Florida breakdown and Soicher’s Flock investigation will keep raising the same question:

Why is it always easier to criminalize the people being watched than to properly regulate the machines doing the watching?
