New Jersey Attorney General Matthew Platkin has filed a lawsuit against Discord over the chat app’s child safety features. The suit alleges that Discord “misled parents about the effectiveness of its safety controls and concealed the risks that children face when using the app.”
The Attorney General’s Office and the state’s Division of Consumer Affairs concluded, after a multi-year investigation into the company’s practices, that Discord violated the New Jersey Consumer Fraud Act. The full details of the lawsuit have not yet been disclosed, but Platkin’s statement outlines several ways in which he intends to show that Discord’s approach put children at risk. According to him, the app’s default settings “allow users to receive friend requests from anyone in the app,” and the platform makes it easy for children under 13 to create an account: Discord “only requires a user to enter their date of birth to establish their age when creating an account.”
Over the years, Discord has introduced numerous features to protect young users. Following a report detailing 35 Discord-related cases in which adults were prosecuted for charges such as “kidnapping, grooming, or sexual assault,” the company launched the Family Center tool, which lets parents monitor what their children are doing on the app. Teen Safety Assist, also introduced in 2023, added automatic content filters and a new warning system for users who violate the app’s rules. And in 2025, Discord helped launch ROOST, a non-profit coalition that develops open-source child safety tools.
Discord, like other social platforms, has faced scrutiny before, and the pressure looks likely to keep increasing. Back in 2024, California lawmakers proposed blocking children’s access to algorithmic social feeds, and this year Utah passed an age-verification law for app stores – an admittedly blunt attempt to keep kids safe.