Fake Journalism
Roblox, Discord sued after 15-year-old boy was allegedly groomed online before he died by suicide
The mother of a 15-year-old California boy who took his own life is now suing Roblox and Discord over his death, alleging her son was groomed and coerced to send explicit images on the apps.
Rebecca Dallas filed the lawsuit Friday in San Francisco County Superior Court accusing the companies of “recklessly and deceptively operating their business in a way that led to the sexual exploitation and suicide” of Ethan Dallas.
Ethan was a “bright, imaginative boy who loved gaming, streaming and interacting with friends online,” the lawsuit states. He started playing on the online gaming platform Roblox around the age of 9, with his parents’ approval and with parental controls in place. When he was 12, he was targeted by “an adult sex predator” who posed as a child on Roblox and befriended Ethan, attorneys for Rebecca Dallas said in a statement.
What started out as innocent conversation “gradually escalated to sexual topics and explicit exchanges,” the complaint says.
After a while, the man encouraged Ethan to turn off parental controls and move their conversations to Discord, the lawyers said.
On Discord, the man “increasingly demanded explicit photographs and videos” and threatened to post or share the images. Ethan complied out of fear, the complaint says.
“Tragically, Ethan was permanently harmed and haunted by these experiences, and he died by suicide at the age of 15,” the complaint said. He died in April 2024, according to an online obituary.
The lawsuit accuses Roblox and Discord of wrongful death, fraudulent concealment and misrepresentations, negligent misrepresentation, and strict liability.
It argues that had Roblox and Discord taken steps to screen users before allowing them on the apps, or implemented age and identity verification and other safety measures, “Ethan would have never interacted with this predator, never suffered the harm that he did, and never died by suicide.”
Apps not safe for kids, suit says
Dallas, of San Diego County, believed both platforms were safe for her son to use for chatting with friends while gaming, given how the apps marketed themselves and the parental controls she had set, the suit contended.
Roblox is used daily by 111 million people, according to its website, and offers a variety of games, obstacle courses and the ability to chat with other users. Creating an account is free, and there is no minimum age or required age verification.
Discord, launched in 2015, is a communications platform commonly used by gamers who want to chat or video chat while playing video games. The suit said that the app doesn’t verify age or identity.
The suit claims Roblox allowed Ethan to turn off the parental controls and Discord allowed him to create an account and communicate with adults without any parental oversight. It said that while Roblox states children must have parental permission to sign up, “nothing prevents them from creating their own accounts and playing on Roblox.”
The suit alleges the two apps misrepresented safety on their platforms, saying the design of the apps “makes children easy prey for pedophiles” due to a lack of safeguards and predator screening.
After Ethan’s tragic death, his family learned from law enforcement that the man who groomed him had been arrested in Florida “for sexually exploiting other children through Defendants’ apps,” the complaint said.
Today, Roblox’s default settings do not allow adults to directly message children under the age of 13, but children can still create accounts with fake birth dates, giving them full access to direct-messaging options, the complaint said.