Family Sues Roblox, Discord After Child Raped By Predatory User

Most parents are aware of concerns about social media and children, but they often perceive games marketed to children, such as Roblox, as safe. A lawsuit filed in California on July 17, 2025, alleges that Roblox misrepresents itself as safe while operating a platform that makes children easy prey for pedophiles.
The lawsuit was filed against Roblox and Discord on behalf of an 11-year-old girl, identified as Jane Doe R. M., of Miami-Dade County, Florida. According to the lawsuit, her mother allowed her to use the apps Roblox and Discord because she believed there were proper safeguards in place. While using the apps, the young girl was targeted by a child predator, who groomed her over time and eventually convinced her to meet in person. In April 2022, the predator drove to the child’s grandfather’s house in Florida, lured her into his vehicle, then took her to a nearby neighborhood where he violently raped her. The predator was later convicted of his crimes.
Roblox, Discord And Explicit Content
This isn’t just one tragic story. In May, the San Antonio Express-News reported on a Texas girl who was allegedly raped by a man who groomed her on Roblox and then traveled to her home. In 2022, Fox5 Atlanta reported on a similar case involving a 13-year-old Kansas girl.
Matthew Dolman, lead attorney on the lawsuit, comments: “In allowing apps of these kinds to operate with virtually no restrictions, monitoring or age verification processes in place, we open ‘worlds’ designed for children to depraved individuals with unimaginable intentions and all the resources at their fingertips.” The lawsuit alleges that the apps actively misrepresent themselves to parents as safe.
The court filing outlines a broader failure within Roblox’s digital playground, alleging widespread graphic sexually explicit material as well as reports of avatars engaging in sexual activities in “condo games,” virtual bathrooms and strip clubs. Additionally, the law group states its investigation uncovered hundreds of games on the Roblox platform referencing child sex trafficking themes, with names such as “Diddy Party,” “Survive Diddy,” “JeffEpsteinSupporter” and “Escape to Epstein Island.”
Meanwhile, the lawsuit also names Discord as playing a role in giving predators access to children. Discord is a communication app that links to many popular video games and offers chat, voice, video, photo and text messaging. For Roblox communities, it is a common place for users to move their interactions.
Roblox And Child Safety
Roblox is an extremely popular gaming platform for children and teens, reporting 97.8 million daily active users. Common Sense Media, a non-profit organization that provides information and resources to help families make informed decisions about media and technology, reports that Roblox has more than 32 million daily players under age 13. But Roblox does not have strict age restrictions, so kids under 13 can play alongside teens and adults.
According to a page on Roblox’s website that explains parental controls to kids, “When you create a Roblox account, the age you select helps us know what content to show you on Roblox.” The page goes on to explain to kids that users under 13 need parent permission to access certain chat features, while those under 9 need it to access experiences with “moderate” content maturity. At age 13, users may add their phone number, add friends and join group chats without parental permission. Further, parents are no longer able to manage privacy settings, screen time limits or how much money their child spends in the game.
Regarding the lawsuit, a Roblox spokesperson declined to comment on pending litigation but emphasized the company’s commitment to child safety, highlighting Roblox’s investment in advanced safety technology and 24/7 human moderation to detect and address inappropriate content, including attempts to share personal information or move conversations off-platform. The spokesperson also noted resources for parents available on the Roblox website.
Discord and Child Safety
Discord requires users to be at least 13 years old, but its age verification relies on users self-reporting their age by entering a birthdate. Only if a child is reported as being under 13 will Discord lock down the account. Common Sense Media’s Kids Review page is full of reviews by teens warning parents about the racism and sexually explicit material they’ve encountered on Discord. One 13-year-old, Riley, writes, “please please please do not let your kid get discord. The amount of sexual assault, grooming and awful pictures/ videos of dead and dismembered body parts is disturbing.”
A Discord spokesperson declined to comment on ongoing litigation but pointed to a deep commitment to user safety through removing content, banning users, shutting down servers, and involving law enforcement when needed. The spokesperson also pointed to safety tools for teens and guardians available on Discord’s website.
Parental Oversight Essential to Keep Kids Safe on Roblox
Parents cannot rely on an app’s safety claims alone. Roblox markets itself as a child-friendly platform, but this lawsuit highlights how much content may fly under the radar. Parents should not assume the moderation apps deploy is enough. Instead:
- Keep access to your children’s devices, including your young teenagers’. If your children argue, remind them that you pay for the phone, tablet, gaming system and wifi, and you have a right to monitor them.
- Set up a parent account and use it to explore the platform your child is using.
- Visit your child’s “experiences” and chat history frequently.
- Learn how the parental controls work—and don’t stop there. Assume they’re imperfect.
- Keep all the communication in one place. Predators often start grooming on one platform and shift to another to avoid detection. That’s exactly what happened here: from Roblox to Discord.
- Encourage your child to keep all conversations inside the platform you can monitor—or better yet, limit or turn off chat functions altogether.
- If your child uses Discord or similar apps, consider blocking direct messages from non-friends and requiring your approval for new contacts.
Grooming can be subtle. Kids may not recognize it as manipulation because it feels like friendship. Parents can access tools to keep their kids safe at the National Sexual Violence Resource Center’s Keeping Kids Safe Online toolkit.