
Snapchat app most used for online grooming, NSPCC says

Thumbs above an illuminated phone screen (Getty Images)

Messaging app Snapchat is the most used platform for online grooming, according to police data provided to children’s charity the NSPCC.

More than 7,000 offenses of sexual communication with a child were recorded in the UK in the year to March 2024 – the highest number since the offense was created.

Snapchat accounted for almost half of the cases where the platform used for grooming was recorded by the police.

The NSPCC said it shows society is “still waiting for tech companies to make their platforms safe for children”.

Snapchat told the BBC it has “zero tolerance” for the sexual exploitation of young people and has taken extra safety measures for teenagers and their parents.

Becky Riggs, the National Police Chiefs' Council lead for child protection, described the figures as "shocking".

“It is imperative that the responsibility to protect children online falls on the companies that create spaces for them, and the regulator strengthens the rules that social media platforms must follow,” she added.

Groomed at the age of eight

The gender of victims of grooming offenses was not always recorded by the police, but of the cases where it was known, four out of five victims were girls.

Nicki – whose real name is not being used by the BBC – was eight when she was messaged on a gaming app by a groomer, who encouraged her to move to Snapchat to chat.

“I don’t have to go into detail, but whatever you can imagine going on in those conversations – videos, pictures. Requests for certain material from Nicki etc,” explained her mother, whom the BBC calls Sarah.

Sarah then created a fake Snapchat profile pretending to be her daughter, and when the man sent a message, she contacted the police.

She now checks her daughter’s devices and messages weekly, despite her daughter’s objections.

“It’s my responsibility as a mother to make sure she’s safe,” she told the BBC.

She said parents “cannot rely” on apps and games to do this job for them.

“Snapchat design issues”

Snapchat is one of the smaller social media platforms in the UK – but it is very popular among children and teenagers.

This is “something that adults looking to groom children will exploit”, says Rani Govender, online policy manager for child safety at the NSPCC.

But Ms Govender says there are also “issues with Snapchat’s design that also put children at risk”.

Snapchat messages and images disappear after 24 hours – making incriminating behavior harder to trace – and senders are notified if the recipient screenshots a message.

Ms Govender says the NSPCC is hearing directly from children who find Snapchat a concern.

“When I make a report (on Snapchat), it’s not listened to, and I can see extreme and violent content on the app as well,” one child told the charity.

A spokesperson for Snapchat told the BBC that the sexual exploitation of young people was “horrific”.

“If we identify such activity or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts and report them to the authorities,” they added.

Record numbers of offenses

Recorded grooming offenses have risen every year since the offense of sexual communication with a child came into force in 2017, reaching a new high of 7,062 this year.

Of the 1,824 cases last year where the platform was known, 48% took place on Snapchat.

The number of grooming offenses reported on Snapchat has increased every year since 2018/19.

Grooming offenses reported on WhatsApp have increased slightly over the past year. On Instagram and Facebook, known cases have declined in recent years, according to the figures. All three platforms are owned by Meta.

WhatsApp told the BBC it has “robust security measures” in place to protect people on its app.

Jess Phillips, Minister for Safeguarding and Violence Against Women and Girls, said social media companies “have a responsibility to stop this vile abuse from happening on their platforms”.

In a statement, she added: “Under the Online Safety Act, they will have to stop this type of illegal content being shared on their sites, including on private and encrypted messaging services, or face significant fines.”

The Online Safety Act includes a legal requirement for technology platforms to keep children safe.

From December, big tech firms will have to publish their risk assessments of illegal harms on their platforms.

Media regulator Ofcom, which will enforce the rules, said: “Our draft codes of practice include robust measures to help prevent grooming by making it more difficult for perpetrators to contact children.

“We stand ready to use the full extent of our enforcement powers against any companies that fall short when the time comes.”