
Roblox under fire for removing “vigilantes” who were catching predators on the platform

Image Credit: Roblox / Shawn Ryan Clips


On the Shawn Ryan Show, host Shawn Ryan asks a blunt question most parents probably never even think to ask:

Where are online predators actually hunting kids?

His guest, ethical hacker and Sentinel Foundation CTO Ryan Montgomery, doesn’t hesitate.

He lists Roblox, Minecraft, Instagram, TikTok, Snapchat – and he keeps coming back to Roblox as the biggest problem.

Montgomery tells Ryan that Roblox is now the largest children’s game in the world, with tens of millions of kids playing every day.

To him, that kind of scale makes the platform a magnet for predators who want easy access to children.

What really sets him off, though, isn’t just that predators operate there.

It’s Roblox’s decision to ban “vigilantes” who were helping expose them.

The Vigilante Catcher Known As “Schlepp”

Montgomery explains that one young creator, a YouTuber who goes by “Schlepp,” had been groomed himself as a kid on Roblox.

Later, he turned that experience into a mission to catch predators on the platform.

According to Montgomery, Schlepp posed as a kid inside Roblox, confronted adults who appeared to be chasing minors, and then worked with media and law enforcement.

Image Credit: Shawn Ryan Clips

Montgomery says he personally saw six mugshots linked to arrests that he believes came from Schlepp’s work.

That’s when, he claims, the hammer came down.

He tells Ryan that Roblox sent Schlepp a cease-and-desist letter and then published a press release about removing “vigilantes” from the platform.

Shawn Ryan is openly stunned. He calls it “unbelievable” that, in his view, the biggest kids’ game in the world would punish people trying to remove predators instead of rewarding them.

Montgomery goes even further. He accuses Roblox of wanting those people on the platform because, in his view, that activity still drives engagement and profit.

To be clear, that’s his opinion, not a proven fact.

But it’s exactly the kind of explosive claim that has made this story blow up outside tech and gaming circles.

Inside Roblox’s Official Justification

On the other side is Roblox’s own Chief Safety Officer, Matt Kaufman, who lays out the company’s position in a detailed blog post titled “More on our Removal of Vigilantes From Roblox.” 

Image Credit: Roblox

Kaufman admits right up front that these vigilantes may look “well-intentioned.”

But he says their methods “created an unsafe environment for users.”

According to Kaufman, Roblox observed vigilantes doing many of the same things predators do:

  • impersonating minors,
  • approaching other users, and
  • steering them off-platform to have sexually explicit conversations. 

That kind of behavior, even if done “for a sting,” is a direct violation of Roblox’s Terms of Use.

Kaufman argues that normalizing that kind of chat — even as a setup — makes the platform more dangerous, not safer.

He also points out that Roblox processes over 6 billion chat messages per day and more than a billion user reports per year, across more than 100 million daily active users. 

In that environment, he says, Roblox relies on structured reporting tools, not YouTube-style predator exposés.

The company says it uses those tools to gather hidden metadata – object IDs, avatar IDs, context – that screenshots or vigilante videos simply don’t capture.

That information is what Roblox claims allows them to quickly remove accounts and share usable material with law enforcement when needed. 

Kaufman also notes that Roblox reported 24,522 incidents to the National Center for Missing and Exploited Children in 2024. 

Those numbers are meant to show that the company is not ignoring abuse, even if the public can’t see every action it takes.

Two Very Different Stories About “Safety”

Listening to Ryan and Montgomery, you get a very stark picture.

In their telling, Roblox is cashing in on danger while silencing people who shine light on it.

Montgomery describes disturbing examples:

  • servers where players openly reference child sexual abuse,
  • groups tied to the “764” grooming cult,
  • and even user-made “assassinate Charlie Kirk” simulators that children can join with a click.

He says kids can buy in-game shirts or skins showing political figures shot and bleeding, and that Roblox still collects a 30% cut on those purchases through its Robux system.

Image Credit: Shawn Ryan Clips

To him, that’s proof the platform profits from some of the worst content imaginable.

Again, those are Montgomery’s allegations based on what he says he has personally documented as an investigator and ethical hacker.

He spends his days infiltrating dark corners of the internet, so he tends to see its ugliest 1% for a living.

Roblox, through Kaufman, tells a very different story.

The company says it bans sexual content, extremist imagery, and off-platform grooming attempts, and claims its moderation systems and human teams work constantly to remove this stuff when they find it. 

From Roblox’s perspective, the vigilantes crossed a line from whistleblowers into policy violators themselves.

They allegedly delayed reporting, staged their own social-media rollouts, and sometimes encouraged conversations and meet-ups just to catch people later – leaving kids exposed in the meantime.

Kaufman warns that acting on unverified screenshots or edited videos could also let bad actors forge “evidence” and weaponize false accusations.

That’s why, he says, Roblox won’t ban users solely on outside material it can’t independently confirm.

So you essentially end up with two competing safety frameworks:

  • Montgomery’s world, where hard-charging citizen stings are the only thing keeping predators in check.
  • Roblox’s world, where centralized reporting and slow, careful evidence rules are the only way to protect users and avoid abuse of the system.

The Parents Caught In The Middle

The part that really lands like a punch is what this means for parents.

Montgomery tells Shawn Ryan that he doesn’t even game himself.

But his investigations keep dragging him into Roblox servers, Discord chats, and open groups where kids and predators mix in real time.

He says most parents see “little cartoon characters running around the screen” and assume it’s harmless. They hand over a tablet, buy Robux gift cards at Target, and don’t realize that the same in-game chat can be used to funnel their child toward Discord servers or explicit content.

Image Credit: Shawn Ryan Clips

He’s brutally honest about how he’d handle it in his own home.

If he ever has kids, he says, they won't be allowed on Roblox or similar open-chat games while they're young.

Roblox, for its part, is basically telling those same parents: We’re handling this. Help us by reporting.

Kaufman encourages users to rely on the in-game “Report Abuse” tools, profile reporting, and a special Trusted Flagger program for vetted organizations. 

The company’s message is: don’t run your own stings.

Don’t play pretend-minor in sexually explicit chats.

Don’t build a YouTube show around vigilante operations.

Instead, hit the report button and let corporate moderation and law enforcement handle it behind the scenes.

If you’re a parent, that pitch either feels reassuring… or incredibly hollow.

And that’s where most of the public anger is coming from.

Vigilantes, Liability, And Who Gets Blamed If Something Goes Wrong

There’s also a very real legal angle here.

From a corporate-law standpoint, companies hate uncontrolled “vigilante” operations on their platforms.

If a self-styled predator hunter impersonates a minor wrongly, pushes a conversation into more graphic territory, or mistakenly exposes an innocent person, there’s a huge liability mess waiting to happen.

That’s the risk Roblox is pointing to.

In Kaufman’s article, he says some vigilante groups held back reports while they organized real-life meet-ups and social-media content – meaning alleged predators stayed on the platform longer than necessary. 

From Roblox’s view, that’s reckless.

From Montgomery’s view, it’s the only way anything ever changes, because public pressure is what finally forces big platforms to act.

There’s probably some truth on both sides.

Citizen stings can go wrong, and bad ones absolutely exist.

At the same time, if platforms appear slow, opaque, or defensive, it’s no surprise that frustrated users take matters into their own hands.

What Really Needs To Happen Next

Image Credit: Roblox

Stepping back from the emotions, the Roblox “vigilante” fight points to a bigger question:

Who actually owns child safety online – parents, platforms, or independent watchdogs?

Shawn Ryan and Ryan Montgomery are basically telling parents:

don’t trust platforms, don’t assume moderation is working, and seriously reconsider letting your kids roam in open chat worlds like Roblox.

Matt Kaufman is telling the same parents:

we know we’re not perfect, but we’re investing heavily in safety, we’re working with NCMEC and law enforcement, and we need you to report through official channels, not run your own sting operations.

Honestly, both things can be true at once.

Platforms do have huge responsibilities, and Roblox’s scale means it must treat this as a top-tier priority.

At the same time, parents can’t outsource all of their vigilance to a corporate safety team and hope for the best.

The healthiest path is probably a mix of:

  • Parents learning what Roblox really is, not just what it looks like in commercials.
  • Platforms being radically more transparent about what they remove, how fast, and why.
  • Watchdogs working with companies as vetted partners instead of trying to out-moderate them on social media.

Right now, though, Roblox is under fire because it feels like the only people who actually faced consequences in public were the ones exposing predators, not the predators themselves.

Until parents see that imbalance change, this controversy isn’t going away.
