This story discusses suicide. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK (8255).
Efforts to safeguard children from harmful content and online predators continue to grow, as regulators and lawmakers say companies – from social media networks to gaming platforms – have failed to protect minors from grooming, exploitation and exposure to inappropriate material.
The debate has sparked a wave of lawsuits from parents and state attorneys general who say tech companies have not done enough to keep children safe online. One company at the center of several of those lawsuits: Roblox.
“Some of those statements we do reject categorically. But at the same time, we are all unified in keeping kids safe,” Roblox CEO David Baszucki told FOX Business. “Everyone in the world wants kids to be safe.”
Baszucki didn’t admit wrongdoing. He argued that the issue of online child safety extends far beyond the Roblox platform.
“We think it’s an industry issue,” Baszucki said, adding that many parents “don’t know their younger child has access not just to Roblox, but a lot of apps where you can share images, where you can text freely.”
Still, Roblox in particular has been named in several lawsuits alleging that the platform is an environment where predators can target vulnerable children.
The platform allows users to play and create games, which are often called “experiences,” and is open to children of all ages. In total, the company has 111.8 million daily active users, roughly 39.7 million of whom are under the age of 13. Roblox’s website states that the company has stronger privacy settings for those younger users; for instance, access to certain features is restricted in the interest of protecting their personal information, according to the website, which notes that the company is “especially committed to protecting the privacy of children.”
| Ticker | Security | Last | Change | Change % |
|---|---|---|---|---|
| RBLX | ROBLOX CORP. | 118.51 | -15.23 | -11.39% |
A Kentucky mother filed a lawsuit in the Eastern District of Kentucky last week on behalf of her 13-year-old daughter who she said died by suicide after months of manipulation and extremist grooming on the gaming platform Roblox and the messaging app Discord.
A California mother filed a lawsuit against Roblox and Discord in September, alleging that her 15-year-old autistic son, who died by suicide, had been groomed and abused on the platforms when he was as young as 12 years old. In this suit, the mother alleged that her son was targeted by an adult sex predator who was able to use Roblox to pose as a child and befriend her son. The predator then encouraged the 15-year-old to turn off parental controls and move the conversation to Discord, where he coerced the teen into sharing explicit images.

FOX Business reached out to Discord for comment.
A Roblox spokesperson said that the company is deeply saddened by the incident, but highlighted that the platform does not allow users to share images or videos, and none of the images involved were shared on the platform.
States are also taking action. Earlier this month, Florida Attorney General James Uthmeier launched a criminal probe into Roblox, characterizing the platform as a “breeding ground” for predators.
Roblox argued that Uthmeier’s claims are false and said the suggestion that illicit image sharing is happening on the platform “demonstrates a lack of understanding of our platform’s functionality.”
Kentucky brought a civil lawsuit against Roblox, alleging the massive gaming platform is not safe for children. Attorney General Russell Coleman alleged in the lawsuit, filed in Kentucky Circuit Court, that Roblox has insufficient guardrails for children and therefore exposes them to child predators, violence and sexually explicit material.
Baszucki said the company has implemented text filtering on all communications, blocks users from sharing images such as selfies, and monitors for critical harms. When the company detects bad actors, Baszucki said, it works with law enforcement to remove them from the platform.
“In the last year, we’ve brought to our customers and all the users on Roblox, over a hundred new safety innovations,” Baszucki said. “We are already continuously innovating. We believe age estimation is the next phase of that.”
Roblox announced a plan in September to expand age estimation to all Roblox users who access the on-platform communication features by the end of this year using facial age estimation technology, ID age verification and verified parental consent. This process, according to Roblox, “will provide a more accurate measure of a user’s age than simply relying on what someone types in when they create an account.”
Baszucki believes this will “set the standard” for the industry, allowing the company to better control who communicates with whom on the platform.
“We believe there’s a great opportunity to work with attorneys general, and really lead the industry and in what we think is really an industry issue,” he added.
The company teamed up with the Attorney General Alliance to create a new initiative, dubbed Youth Online Safety, that is dedicated to strengthening online safety for children.