According to Engadget, Texas Attorney General Ken Paxton has filed a lawsuit against Roblox alleging the platform violated state and federal safety laws by exposing children to “sexually explicit content, exploitation and grooming.” Paxton specifically accused Roblox of prioritizing “pixel pedophiles and corporate profit” over child safety in his announcement on X. The lawsuit follows similar actions by Kentucky and Louisiana, with Louisiana pointing to user-created experiences like “Escape to Epstein Island” and “Diddy Party” on a platform where most users are under 16. Roblox responded that it’s “disappointed” Paxton chose litigation over collaboration and called his claims “misrepresentations and sensationalized.” The company has implemented several safety measures, including blocking users under 13 from unrated experiences and restricting DMs for younger users, with an age estimation feature requiring video selfies rolling out to everyone by year’s end.
The platform safety dilemma
Here’s the thing about Roblox’s situation – they’re caught in what seems like an impossible position. They’ve implemented age restrictions, limited messaging for younger users, and are rolling out video selfie verification. But is any of that actually enough when you’re dealing with determined predators? The Louisiana case revealed that until November 2024, users could initiate voice chats with strangers, and one predator used voice-altering software to pose as a young girl. Basically, every time platforms add safety measures, bad actors find workarounds.
The vigilante controversy
This lawsuit gets even more interesting when you consider the Schlep connection. Schlep was the Roblox user running predator sting operations similar to “To Catch a Predator” – and he got banned in August for violating Roblox’s new anti-vigilante rules. Now he’s cheering Paxton’s lawsuit in the comments. So Roblox banned someone who was actually catching predators on its platform, which looks pretty bad when you’re being sued for not doing enough about predators. It’s a no-win situation for the company.
This isn’t isolated
Look, Texas going after Roblox isn’t some random outlier. Paxton has previously sued TikTok over parental controls, and Meta and Character.AI over the misuse of minors’ data. Multiple states are clearly coordinating on this child safety crackdown. And when you’ve got Kentucky, Louisiana, and now Texas all filing similar lawsuits, that’s a pattern that suggests Roblox has some serious explaining to do. The platform’s defense about “industry-wide challenges” sounds reasonable until you consider that other gaming platforms aren’t facing this level of coordinated legal action.
What comes next?
Roblox is walking a tightrope here. They need to maintain their reputation as a safe space for kids while fighting lawsuits that claim they’re anything but. Their age verification system using video selfies sounds thorough, but will it be enough to satisfy regulators? And more importantly, can any technical solution actually prevent determined predators from finding ways around it? This feels like the beginning of a much longer battle over what responsibility platforms actually bear for user behavior. One thing’s for sure – this lawsuit isn’t going away quietly, and other states are probably watching closely to see how it plays out.
