In the year to March 2024, more than 7,000 crimes of sexual communication with a child were recorded in the UK, the highest number since the offense was created.
Snapchat accounted for almost half of the 1,824 cases where police recorded a specific platform being used for grooming.
The NSPCC said the figures show that society is “still waiting for tech companies to make their platforms safe for children”.
Snapchat said it has “zero tolerance” for the sexual exploitation of young people and is introducing additional safety measures for teens and their parents.
Becky Riggs, chair of the National Police Child Protection Board, called the findings “shocking.”
“It is imperative that the responsibility for protecting children online is placed on the companies that create the space for them, and that the regulator strengthens the rules that social media platforms must follow,” she added.
Groomed at the age of eight
Police do not always record the gender of grooming victims, but in cases where it was known, four out of five were girls.
Nicky – not her real name – was eight years old when she received a message on a gaming app from a groomer, who suggested she move to Snapchat to chat.
“I don’t need to go into the details, but that conversation had everything you can imagine – videos, photos, requests for certain material from Nicky, and so on,” her mother explained.
She then created a fake Snapchat profile posing as her daughter, and when the man messaged her, she contacted the police.
Now she checks her daughter’s devices and messages every week, despite her daughter’s objections.
“It’s my duty as a mother to keep her safe,” she said.
She said that parents “can’t rely” on apps and games to do the job for them.
Snapchat design issues
Snapchat is one of the smallest social networks in the UK, but it is very popular with children and teenagers.
It is “something that adults are likely to use when looking for children,” says Rani Govender, online child safety manager at the NSPCC.
But Ms. Govender says there are also “issues with Snapchat’s design that put children at risk.”
Messages and images on Snapchat disappear after 24 hours, making incriminating behavior difficult to trace, and senders are notified if a recipient takes a screenshot of a message.
Ms. Govender says the NSPCC hears directly from children who highlight Snapchat as a concern.
“When they report [Snapchat], they are not being listened to and they can also see extreme and violent content on the app,” she told the BBC.
A Snapchat spokesperson said that sexual exploitation of young people is “appalling.”
“If we detect or are made aware of such activity, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report it to authorities,” they added.
Recorded offenses
Recorded grooming crimes have been rising since the offense of Sexual Communication with a Child came into force in 2017, and this year hit a record high of 7,062 cases.
Of the 1,824 cases last year in which the platform used was recorded, 48% took place on Snapchat.
The number of recorded grooming crimes on WhatsApp rose slightly over the past year, while on Instagram and Facebook, recorded cases have fallen in recent years. All three platforms are owned by Meta.
WhatsApp said it has “robust security measures” in place to protect people using its app.
Jess Phillips, the Minister for Safeguarding and Violence Against Women and Girls, said social media companies “have a responsibility to stop this abhorrent abuse on their platforms”.
In her statement, she added: “Under the Online Safety Act, they will have to stop this kind of illegal content being shared on their sites, including on private and encrypted messaging services, or face significant fines.”
The Online Safety Act places a legal duty on technology platforms to keep children safe.
Starting in December, major tech companies will be required to publish their risk assessments of illegal harms on their platforms.
Ofcom, the media regulator that will oversee compliance with the new rules, said: “Our draft codes of practice include robust measures to help prevent grooming by making it harder for abusers to come into contact with children.
“We are prepared to use the full extent of our enforcement powers against any companies that fail to comply when the time comes.”