Data shows how criminals are using private messaging platforms to manipulate and groom children

We're urging tech giants to act as online grooming crimes reach record high


  • Police recorded 7,263 Sexual Communication with a Child offences in the last year, almost double the number recorded in 2017/18, the first year the offence was in force.
  • Where the platform could be identified, 40% of offences took place on Snapchat, 9% on WhatsApp and 9% on Facebook and Instagram.
  • Data shows how private messaging platforms are being used by criminals to manipulate and groom children.
  • Our new research highlights a range of tools tech companies, Ofcom, and Government can employ to protect children from perpetrators.

Worried about a child?

You can contact the NSPCC Helpline by calling 0808 800 5000 or emailing help@NSPCC.org.uk


The data in detail

Online grooming offences have hit record levels across the UK, with Snapchat, WhatsApp, and Facebook emerging as the top platforms where these crimes take place.

The figures, provided by 44 UK police forces, show that 7,263 Sexual Communication with a Child offences were recorded last year. Where police forces could be directly compared, the number of crimes had almost doubled since 2017/18, the first year the offence was in force.1 2

Of the 2,111 offences where police could identify the platform used, 40% took place on Snapchat, 9% on WhatsApp and 9% on Facebook and Instagram.

Where gender was known, 80% of children targeted were girls. Meanwhile, the youngest victim of online grooming recorded was a 4-year-old boy.

While these figures reflect the offences recorded by police, the real number of crimes is likely to be much higher, because this abuse happens in private spaces where harm is harder to detect.

To tackle the issue, we’re publishing new research with solutions that can be used to prevent, detect and disrupt grooming in private messaging spaces.

Online child sexual abuse crimes can have a long-term impact on a child, leaving them with feelings of guilt, shame, depression, confusion, anxiety and fear.

One 14-year-old who contacted Childline said:3

“I feel so insecure all the time, so, when this guy I’ve met online, who’s a few years older, started flirting with me, that made me feel so special. He seemed to care, but now he’s insisting I send him nudes, and I don’t know if he just gave me attention, so I’d send him nudes. I feel like I’ve been tricked but I’m afraid what he might do if I just block him. I can’t control how anxious this makes me feel.”

Our new research

Our new research identifies cycles of behaviours that perpetrators use, such as creating multiple different profiles and manipulating young users to engage with them across different platforms.

We’re urging Ofcom and tech companies to take swift action on the recommendations set out in the report, so that they can better identify and prevent online grooming.

Our recommendations include:

  • Implementing tools on a child’s phone that can scan for nude images and identify child sexual abuse material before it’s shared.
  • Using metadata analysis, which uses background information, like when, where, and how someone is using a platform, to spot suspicious patterns. It does not read private messages, but it can flag behaviours that suggest grooming, such as adults repeatedly contacting large numbers of children or creating fake profiles.
  • Creating barriers for adult profiles engaging children on social media platforms, such as restrictions on who they can search for and how many people they can contact.
  • Committing, at tech platform leadership level, to delivering services that effectively support and balance user safety and privacy.

The research shows that these safety measures must be introduced together to be effective, working in tandem to prevent harm at every stage of the grooming cycle.

We’re urging tech companies, Ofcom, and Government to take leadership on addressing this devastating crime and commit to using every tool available to them to stop perpetrators in their tracks.


Chris Sherwood, NSPCC Chief Executive, said:

"It’s deeply alarming that online grooming crimes have reached a record high across the UK, taking place on the very platforms children use every day.

“At Childline, we hear first-hand how grooming can devastate young lives. The trauma doesn’t end when the messages stop, it can leave children battling anxiety, depression, and shame for years.

“Tech companies must act now to prevent further escalation. The tools the NSPCC sets out to protect children are ready to use and urgently needed. Importantly, they mean that services can keep children safe while protecting all users’ privacy. Children’s safety must be built into platform design from the start, not treated as an afterthought.”

Kerry Smith, Chief Executive of the Internet Watch Foundation (IWF) said:

“The internet has opened a door into millions of homes, giving predators access to children. The harms are very real and, as this data shows, are affecting children everywhere.

“Safety should be something which is built into all services and platforms from the bottom up, not tacked on as an afterthought. There should be absolutely nowhere for predators to hide online.

“Tech companies must do everything they can, including in end-to-end encrypted spaces, to keep children safe. It is clear now that this can be done effectively without compromising users’ privacy. There really is no excuse — and the alternative is allowing children to continue to suffer.”


References

  1. Data was provided by 44 police forces in England, Wales, Scotland and Northern Ireland. Lincolnshire did not provide the data.

  2. The NSPCC sent Freedom of Information requests and asked for data on recorded instances of Sexual Communication with a Child offences for 2024/2025. In Northern Ireland, the offence was introduced in 2015 and was recorded by police from 2015/16. The offence of communicating indecently was introduced in Scotland by the Sexual Offences (Scotland) Act 2009, which came into force on 1 December 2010.

  3. Snapshots are based on real Childline service users but are not direct quotes. All names and potentially identifying details have been changed to protect the identity of the child or young person involved.