
LOS ANGELES (KABC) -- For nearly 20 years, Roblox has kept kids entertained, with more than 85 million people using the online gaming platform each day. But two new lawsuits allege the platform exposed children to sexual predators.
"It is being marketed for children and it is not safe," said one woman who did not want to be identified because her 12-year-old daughter was sexually abused by people she met online via Roblox and the messaging app Discord.
She said her daughter was chatting on Roblox with a predator who told her to move their conversation to Discord. That's where, the mother said, the girl was convinced to post inappropriate pictures in chat rooms with grown men pretending to be teenagers.
"She would send her sexually explicit photographs and tell my daughter to do it and she would do it," the woman told Eyewitness News.
That woman filed a lawsuit last month in Los Angeles County Superior Court, accusing both Roblox and Discord of creating an unsafe environment for children that leads to grooming and sexual exploitation.
Attorney Steven Vanderporten is behind that lawsuit as well as another against Roblox in Riverside County, where a different 12-year-old girl was sexually abused after chatting with a man who claimed to be a teenager.
"This is far too common on Roblox," Vanderporten said. "They've known about it for years while actively promoting the platform as safe, when in fact they've done the minimum to keep children safe."
In response, Roblox issued a written statement:
"We are deeply troubled by any incident that endangers any user. Roblox aims to build a platform that sets the bar for safety online, and we prioritize the safety of our community. This is why our policies are purposely stricter than those found on many other platforms. We limit chat for younger users, don't allow user-to-user image sharing, and have filters designed to block the sharing of personal information. We also understand that no system is perfect and that is why we are constantly working to further improve our safety tools and platform restrictions to ensure parents can trust us to help keep their children safe online, launching 145 new initiatives this year alone.
"We also understand this is an industry-wide issue and we are working to develop industry-wide standards and solutions. For instance, Roblox is implementing an industry-leading policy to help prevent older users from communicating with children by requiring a sophisticated facial age estimation process for all Roblox users who access our communications features. We partner with law enforcement and leading child safety and mental health organizations worldwide to combat the sexual exploitation of children and are a founding member of the Tech Coalition's Lantern Project and the nonprofit Robust Open Online Safety Tools (ROOST)."
Discord responded to ABC7's request for comment with a written statement as well:
"Discord is deeply committed to safety and we require all users to be at least 13 to use our platform. We use a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies. We maintain strong systems to prevent the spread of sexual exploitation and grooming on our platform and also work with other technology companies and safety organizations to improve online safety across the internet."
But Vanderporten said these companies are not doing enough and are hiding the dangers from parents.
"Roblox is not a safe platform for your children to be using," he said. "Your child could be communicating with a predator who is attempting to lure them into exploitative situations and possible in-person meetings."