Roblox, Discord sued after 15-year-old boy was allegedly groomed online before he died by suicide

Posted by Marlene Lenthang

The mother of a 15-year-old California boy who took his own life is now suing Roblox and Discord over his death, alleging her son was groomed and coerced into sending explicit images on the apps.

Rebecca Dallas filed the lawsuit Friday in San Francisco County Superior Court accusing the companies of “recklessly and deceptively operating their business in a way that led to the sexual exploitation and suicide” of Ethan Dallas.

Ethan was a “bright, imaginative boy who loved gaming, streaming and interacting with friends online,” the lawsuit states.

He started playing on the online gaming platform Roblox around the age of 9, with his parents’ approval and with parental controls in place. When he was 12, he was targeted by “an adult sex predator” who posed as a child on Roblox and befriended Ethan, attorneys for Rebecca Dallas said in a statement.

What started out as innocent conversation “gradually escalated to sexual topics and explicit exchanges,” the complaint says.

After a while, the man encouraged Ethan to turn off parental controls and move their conversations to Discord, the lawyers said.

On Discord, the man “increasingly demanded explicit photographs and videos” and threatened to post or share the images Ethan had sent. Ethan complied out of fear, the complaint says.

“Tragically, Ethan was permanently harmed and haunted by these experiences, and he died by suicide at the age of 15,” the complaint said.

The lawsuit accuses Roblox and Discord of wrongful death, fraudulent concealment and misrepresentations, negligent misrepresentation, and strict liability.

It argues that had Roblox and Discord taken steps to screen users before allowing them on the apps, or implemented age and identity verification and other safety measures, “Ethan would have never interacted with this predator, never suffered the harm that he did, and never died by suicide.”

Apps not safe for kids, suit says

Dallas, of San Diego County, thought both platforms were safe for her son to use to communicate with friends while gaming, given how the apps marketed themselves and the parental controls she set, the suit contended.

Roblox is used daily by 111 million people, according to its website, and offers a variety of games, obstacle courses, and the ability to chat with other users. Creating an account is free, and there is no minimum age or required age verification.

Discord, launched in 2015, is a communications platform commonly used by gamers who want to chat or video chat while playing video games. The suit said that the app doesn’t verify age or identity.

The suit claims Roblox allowed Ethan to turn off the parental controls and Discord allowed him to create an account and communicate with adults without any parental oversight. It said that while Roblox states children must have parental permission to sign up, “nothing prevents them from creating their own accounts and playing on Roblox.”

The suit alleges the two apps misrepresented safety on their platforms, saying the design of the apps “makes children easy prey for pedophiles” due to a lack of safeguards and predator screening.

After Ethan’s tragic death, his family learned from law enforcement that the man who groomed him had been arrested in Florida “for sexually exploiting other children through Defendants’ apps,” the complaint said.

Today, Roblox’s default settings do not allow adults to directly message children under the age of 13, but children can still create accounts with fake birth dates, giving them full access to direct-messaging options, the complaint said.

“We are deeply saddened by this tragic loss. While we cannot comment on claims raised in litigation, we always strive to hold ourselves to the highest safety standard,” a spokesperson for Roblox told NBC News.

Roblox said it is designed with “rigorous built in safety features” and is “continually innovating new safety features — over 100 this year alone — that protect our users and empower parents and caregivers with greater control and visibility.”

Safety efforts include processes to detect and act on problematic behaviors and 24/7 human moderation. Roblox added that the company partners with law enforcement and leading child safety and mental health organizations worldwide to combat the sexual exploitation of children.

While Discord has settings intended to keep minors safe, such as automatically scanning messages for explicit images and videos, the suit said the platform is “overflowing with sexually explicit images and videos involving children, including anime and child sex abuse material.”

Discord said it doesn’t comment on legal matters but said the platform is “deeply committed to safety.”

“We require all users to be at least 13 to use our platform. We use a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies,” a spokesperson said. “We maintain strong systems to prevent the spread of sexual exploitation and grooming on our platform and also work with other technology companies and safety organizations to improve online safety across the internet.”

Other allegations against Roblox, Discord

Anapol Weiss, the firm that filed Dallas’ suit, noted this is the ninth lawsuit it has filed in connection with allegations that children were groomed, exploited or assaulted after contact on Roblox or related platforms.

The National Center on Sexual Exploitation in 2024 compiled a “Dirty Dozen” list of mainstream entities it says facilitate, enable and profit from sexual abuse and exploitation. It included Discord, saying “this platform is popular with predators seeking to groom kids and with creeps looking to create, trade or find sexually abusive content of children and unsuspecting adults,” and Roblox, saying children there are exposed to sex-themed games and to predators.

An NBC News investigation in 2023 found 35 cases over the six years prior in which adults were prosecuted on charges of kidnapping, grooming or sexual assault that allegedly involved communications on Discord.

In August, Louisiana’s top prosecutor sued Roblox, alleging that its failure to implement strong safety protocols for children has made it “the perfect place for pedophiles.”

“This case lays bare the devastating consequences when billion-dollar platforms knowingly design environments that enable predators to prey on vulnerable children,” said Alexandra Walsh, a partner at Anapol Weiss. “These companies are raking in billions. Children are paying the price.”

Dallas seeks a jury trial and compensatory damages.

If you or someone you know is in crisis, call or text 988, or go to 988lifeline.org, to reach the Suicide & Crisis Lifeline. You can also call the network, previously known as the National Suicide Prevention Lifeline, at 800-273-8255, or visit SpeakingOfSuicide.com/resources.



NBC News
