What started as a simple tech experiment by one man in Spain turned into something far more unsettling: access to roughly 7,000 robot vacuums spread across 24 countries, along with live video feeds, home layouts, microphone access, and remote controls that could have turned those machines into rolling surveillance devices inside private homes.
That is how journalist Roman Balmakov laid out the story in a recent episode of Facts Matter, where he said the case is about more than one robotic vacuum brand. In his view, it exposes a much bigger weakness in the modern smart-home world, where cameras, microphones, and cloud-connected devices are now sitting in millions of houses under the assumption that no one else can see through them.
Balmakov’s warning was blunt. People keep filling their homes with devices that listen, map, monitor, and watch, while trusting that the system on the other side is secure.
This case suggests that trust may be doing a lot of heavy lifting.
A Personal Tinkering Project Turned Into A Global Privacy Breach
Roman Balmakov said the story first came to him through an article in Popular Science about a Spanish man named Sammy Asdafal.
According to Balmakov, Asdafal had bought a DJI Romo robot vacuum and was frustrated with the official app. Rather than live with the clunky controls, he wanted to do something more creative: use a PS5 controller to drive the vacuum around like an RC car.

That sounds like the sort of harmless experiment a tech-savvy hobbyist might try on a weekend.
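The "RC car" idea is technically simple at its core: read two stick axes from a gamepad and mix them into left and right wheel speeds for a differential-drive robot. A minimal sketch of that mixing step, with hypothetical names and no vendor API assumed (this is an illustration of the general technique, not Asdafal's actual code):

```python
def axes_to_wheels(forward: float, turn: float) -> tuple[float, float]:
    """Mix two gamepad axes (each in -1.0..1.0) into wheel speeds.

    'forward' is the stick's vertical axis, 'turn' the horizontal one.
    This is the standard "arcade drive" mix; the resulting left/right
    values would be sent to whatever movement command the robot accepts.
    """
    left = forward + turn
    right = forward - turn
    # Rescale so neither wheel exceeds full speed when both axes are pushed.
    scale = max(1.0, abs(left), abs(right))
    return left / scale, right / scale
```

Full stick forward gives both wheels full speed; pure sideways stick spins the wheels in opposite directions, turning the robot in place. The hard part of Asdafal's project was not this math but figuring out how to deliver those commands to the vacuum at all, which is what led him to the cloud API.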
But Balmakov said Asdafal was not just a random guy with no background. He leads AI strategy at a vacation home rental company, and Roman noted that he also had an AI chatbot assistant helping him with the coding side of the project.
That detail matters because it shows how much easier this kind of probing has become. You no longer need to be some elite hacker in a basement writing every line from scratch. In Balmakov’s telling, this man used a public AI coding assistant to help reverse engineer how the vacuum talked to DJI’s cloud servers.
And then things got weird.
The Credentials Worked On Thousands Of Other Robots
Balmakov said the key discovery came when Asdafal's app requested a security token from DJI's remote servers, a credential meant to prove he owned his own vacuum.
But instead of validating him for just his device, the system reportedly treated him as if he had legitimate access to thousands of others.
Roman quoted the report describing how the same credentials that let Asdafal see and control his own unit also provided access to live camera feeds, microphone audio, maps, and status data from nearly 7,000 different vacuums in 24 countries.
That is the kind of failure that makes your stomach drop a little.
This was not merely a bug that caused a vacuum to stop cleaning a corner properly. According to Balmakov’s account, it meant one person could essentially become a fly on the wall inside homes all over the world.
He said Asdafal could see video feeds, activate microphones, compile 2D floor plans of homes, view approximate device locations through IP addresses, and even remotely control the robots themselves.
At that point, the issue stops being about convenience and becomes about private living spaces being exposed from the inside out.
These Were Not Cheap Toys Sitting In A Closet
Balmakov spent some time explaining why these particular robots matter.
He said the vacuum involved was a DJI Romo, made by the Chinese company better known for civilian drones. He described the device as a high-end smart vacuum retailing for around $2,000, not some throwaway gadget from a bargain bin.

Like many modern robot vacuums, it maps a house using sensors and visual data, figures out what kind of room it is in, and stores at least some of that data remotely on company servers rather than only on the machine itself.
That cloud link is where the convenience comes from, but it is also where the risk lives.
Balmakov quoted reporting that explained how the robot needs to constantly collect visual information from the home in order to do its job. It also has to understand spatial details, distinguish a kitchen from a bedroom, and build a virtual layout to navigate.
In other words, this is not just a vacuum. It is a moving sensor platform.
And once that is true, a security failure does not just expose cleaning patterns. It exposes the inside of someone’s home.
Roman Says The Bigger Story Is Not The Vacuum Itself
One of the strongest points Balmakov made was that the robot vacuum is almost beside the point.
Yes, the DJI flaw is alarming on its own. But as he framed it, the more important takeaway is what this says about all the other smart devices people now accept as normal household items.
Balmakov rattled off the list: home camera systems, voice assistants listening for wake words, smart refrigerators, smart watches, smart glasses, smart appliances, and anything else with a camera or microphone attached to the internet.
His argument was that people often behave as though these devices are passive tools. But they are really always-on collection systems that rely on remote infrastructure the average user does not understand and cannot audit.
That may sound a little grim, but honestly it is hard to say he is overreacting when a regular user trying to customize his own vacuum can accidentally gain access to thousands of homes in dozens of countries.
If that can happen by accident, it raises a much darker question about what can happen on purpose.
DJI Says It Fixed The Problem, But Not All Of It
Balmakov said that fortunately, Asdafal was not a malicious actor and reported the problem.
According to Roman, Asdafal shared the findings with a reporter at The Verge, who then contacted DJI. The company responded by saying it had identified the vulnerability during internal review, moved to fix it quickly, and pushed out two automatic software updates so owners did not need to do anything on their own.
That sounds reassuring at first.

But Balmakov quickly added that, according to The Verge’s reporting, the story did not end there. He said Asdafal claimed DJI still had not fixed every vulnerability he found.
Roman highlighted one unresolved issue involving the ability to view a user’s own DJI Romo video stream without needing its security PIN. He also said there was another vulnerability so serious that Asdafal would not describe it publicly until DJI had more time to patch it.
That is where the whole thing gets more unnerving.
Because once a company says it fixed the issue, most people mentally move on. But if major holes remain, the device may still be safer in theory than in reality.
The Smart-Home Dream Starts Looking A Lot Less Comfortable
Balmakov widened the lens even further by bringing up another recent report, this one involving Meta smart glasses.
He cited claims from workers in Kenya who said they could see deeply personal videos collected through those devices, including bank details, sexual content, and naked people who seemed unaware they were being recorded.
Roman used that example to argue that the vacuum case is not some isolated one-off. Instead, he said it is part of a broader pattern where the public keeps adopting connected products while underestimating how much human access may sit behind the system.
That point lands because it cuts against the cozy marketing pitch behind smart technology.
People are sold convenience, automation, and sleek design. They are not usually sold the mental picture of strangers, contractors, hackers, or internal reviewers potentially peering through their devices.
But that picture is becoming harder to dismiss.
Balmakov even brought up the old viral image of Mark Zuckerberg with tape covering his laptop camera, arguing that if someone at that level feels the need to do that, regular users should probably pay attention.
That may be a little theatrical, but it is also a point many people quietly understand. The people closest to the technology often seem the least casual about it.
The Real Nightmare Question Is Who Else Could Do This
Toward the end of his report, Balmakov asked what may be the most disturbing question in the whole story.
If one man using a publicly available AI chatbot could stumble into control of 7,000 devices across 24 countries, what about well-funded bad actors, organized cybercriminals, or state intelligence services with teams of highly trained specialists?
That is where the story stops feeling quirky and starts feeling strategic.

Roman mentioned countries such as China, Iran, Russia, Israel, Japan, and the United Kingdom, asking how many intelligence services around the world might be trying to access remote video feeds or similar streams from household devices.
His point was not that all of them are definitely doing this in exactly the same way. His point was that the attack surface exists, and ordinary people are helping build it one camera-equipped gadget at a time.
That is the uncomfortable truth underneath this story. The vacuum was just the visible example.
The deeper issue is that millions of people now live with networked cameras and microphones in their homes, and most of them are trusting companies, software systems, and security layers they will never really see.
If It Has A Camera, It Can Become A Window
By the time Roman Balmakov finished the story, his conclusion was pretty clear.
If a device has a camera, there is at least the possibility that someone else could access what it sees. If it has a microphone, there is at least the possibility someone else could hear what it hears. And if it maps a room, then it may be building a digital blueprint of the inside of your house for systems you do not control.
That is not paranoia. It is just the logical consequence of how these products work.
Balmakov did not present the vacuum incident as a bizarre fluke to laugh off. He treated it as a warning that the smart-home future people bought into has a serious dark side, especially when cloud infrastructure, AI tools, and weak security practices all collide.
The Spanish user in this story appears to have been a white-hat type who reported what he found. That is the lucky part.
The unlucky part is what his accidental discovery revealed: inside some of the most private spaces people have, the wall between “my device” and “someone else’s access” may be a lot thinner than most users ever imagined.

Gary’s love for adventure and preparedness stems from his background as a former Army medic. Having served in remote locations around the world, he knows the importance of being ready for any situation, whether in the wilderness or urban environments. Gary’s practical medical expertise blends with his passion for outdoor survival, making him an expert in both emergency medical care and rugged, off-the-grid living. He writes to equip readers with the skills needed to stay safe and resilient in any scenario.