Google, OpenAI, Roblox, and Discord have created a new nonprofit organization to help improve children’s online safety. The Robust Open Online Safety Tools (ROOST) initiative aims to make basic safety technologies more accessible to companies and provide free, open-source artificial intelligence tools to detect, verify, and report child sexual abuse material.
According to ROOST founding partner and former Google CEO Eric Schmidt, the initiative was motivated in part by the changes that generative AI has brought to the online environment, and it aims to address “the critical need to accelerate innovation in child safety online.” Details about the CSAM detection tools are sparse beyond the fact that they will use large language AI models and “unify” existing options for dealing with the content.
“Starting with a platform focused on child protection, ROOST’s collaborative, open-source approach will foster innovation and make the underlying infrastructure more transparent, accessible, and inclusive, with the goal of creating a safer internet for all,” Schmidt said.
ROOST’s announcement comes amid a huge regulatory battle over child safety on social media and online platforms, with companies trying to appease lawmakers through self-regulatory methods.
The National Center for Missing and Exploited Children (NCMEC) reports that suspected child exploitation cases increased by 12 percent between 2022 and 2023. As of 2020, more than half of American children were Roblox users, and the company has been repeatedly criticized for failing to address child sexual exploitation and the spread of inappropriate content on its platform. Roblox and Discord were also named in a 2022 lawsuit that claimed the platforms failed to prevent adults from messaging unsupervised children.
ROOST’s founding members are providing funding as well as contributing their tools and expertise to the project. ROOST says it is collaborating with leading developers of foundation AI models to build a “community of practice” for content safeguards, which will include providing vetted datasets for AI training and identifying gaps in safety coverage.
The initiative says it will make “tools that already exist” more accessible by combining the various detection and reporting technologies of its member organizations into a unified solution that is easier for other companies to implement. Naren Koneru, Roblox’s vice president of engineering for trust and safety, told Fast Company that ROOST may host AI moderation systems that companies can integrate through API calls. Still, there is some ambiguity about what exactly ROOST’s AI moderation tools will include.
For example, Discord says its contribution will build on the Lantern cross-platform information-sharing project, which it joined in 2023 alongside Meta and Google. The tools may also include an updated version of Roblox’s AI model for detecting profanity, racism, bullying, sexting, and other inappropriate content in audio clips, which the company plans to release as open source this year. It is not yet clear how these tools will interact with existing first-line CSAM detection systems such as Microsoft’s PhotoDNA image-analysis tool.
Alongside its participation in ROOST, Discord has released a new “Ignore” feature that lets users hide messages and notifications from someone without that person being notified. “At Discord, we believe that safety is a common good,” said Discord chief legal officer Clint Smith in the ROOST announcement. “We are committed to making the entire internet, not just Discord, a better and safer place, especially for young people.”
ROOST has raised more than $27 million to fund its first four years of operation, with support from philanthropic backers including the McGovern Foundation, the Future of Online Trust and Safety Fund, the Knight Foundation, and the AI Collaborative. According to the press release, ROOST will also draw on experts in child safety, artificial intelligence, open source technology, and “countering violent extremism.”