Lawsuit Alleges Child Predator Exploited Roblox and Discord
A lawsuit filed in California claims that Roblox and Discord enabled a predator to groom and sexually assault an 11-year-old girl by failing to effectively moderate content and user interactions. The 105-page complaint states that both platforms hosted, or allowed children to access, graphic adult-themed content referencing Sean “Diddy” Combs and Jeffrey Epstein, and it seeks damages based on allegations of negligence and unsafe platform environments.
According to the complaint, users were able to access games titled “DIDDY SURVIVAL,” “Midnight at Diddy’s Party,” and “Escape to Epstein Island.” These games often included stylized depictions of the two men. Screenshots presented in the lawsuit and cited in a prior investigative report show avatars fleeing from a demonic figure resembling Epstein or interacting in virtual spaces inspired by real-world criminal allegations. Roblox has since removed the content, and searches for the names “Diddy” or “Epstein” on the platform now produce no results.
The lawsuit also alleges that the predator initially contacted the child on Roblox and then continued grooming her on Discord. The lack of effective safeguards, according to the filing, allowed the communication to proceed without intervention. The legal team describes this as part of a pattern of serious misconduct and raises broader concerns about child protection on digital platforms.
Research Links Games to Real-World Criminal Allegations
The lawsuit references a 2024 report from Hindenburg Research, which investigated the presence of explicit and harmful content on Roblox. As part of the study, researchers created child accounts and searched for the names of individuals linked to criminal cases. Using an account registered to a nine-year-old, they found more than 600 games referencing Sean Combs and over 900 accounts using variations of Jeffrey Epstein’s name. One account was reportedly named “JeffEpsteinSupporter.”
Examples included in both the lawsuit and the research report highlight games such as “Survive THE DIDDY in Area51” and “Freaky Diddy Simulator.” These games appeared to trivialize serious crimes and present them as forms of entertainment. One Epstein-themed game depicted a child avatar attempting to escape from a fire-covered monster with Epstein’s face. Although most of these games received little mainstream media coverage at the time, researchers preserved screenshots that now appear as evidence in the legal filing.
In response, Roblox stated that the content had already been removed prior to the lawsuit’s filing. A company spokesperson told 404 Media that Roblox remains committed to community safety and uses a combination of machine learning, automated systems, and thousands of human moderators to review activity around the clock. The company also emphasized that its Community Standards prohibit content that portrays real-world criminal events and encouraged users to report violations using the platform’s built-in Report Abuse feature.
Broader Safety Issues Cited in Legal Filing
The lawsuit presents a wider argument about systemic moderation failures. It states that Roblox has faced ongoing problems with inappropriate and predatory content, particularly in areas referred to as “condo games.” These are user-created virtual spaces that simulate adult environments such as strip clubs or private bathrooms, where avatars can engage in sexually suggestive behavior. According to the complaint, children as young as nine years old were able to enter and interact in these spaces.
In the case of the plaintiff, identified in court documents as Jane Doe, the predator reportedly used Roblox to make initial contact and then shifted the conversation to Discord. The lawsuit argues that Discord failed to intervene despite prior criticism regarding safety issues on its platform. Both companies are accused of contributing to the abuse through poor moderation, insufficient oversight, and inadequate enforcement policies.
Beyond Roblox’s general statement, neither company has commented on the specific claims. The lawsuit nevertheless raises serious questions about how platforms built on user-generated content can protect minors. Roblox reports more than 70 million daily active users, many of them children. The outcome of this case may influence future industry standards for moderation practices, corporate liability, and child safety in digital environments.