Online Grooming Offences Surge, Charity Warns of Underreported Scale
Online grooming offences recorded by UK police have doubled since 2018, with victims as young as four years old, according to new data analysed by the NSPCC. The charity reports 7,836 offences were recorded in the year to March 2023, a notable increase from the 3,960 reported in 2018. However, the actual number of victims is likely higher, as a single police-recorded offence can involve multiple victims and various interaction methods.
The NSPCC also cautioned that the true extent of grooming is “much higher” due to abuse occurring in private online spaces, making detection more challenging.
Snapchat is identified as a platform of particular concern. Almost three-quarters of British children use the platform, making it a popular avenue for offenders, explained Matthew Sowemimo, the NSPCC’s associate head of child safety online. He highlighted the platform’s ‘quick add’ feature, which allows adults to easily connect with and directly message “a very large number of child users.”
Perpetrators are adapting their tactics, the NSPCC’s research reveals, creating multiple profiles and manipulating young people to engage with them across different platforms.
The charity is urging tech companies to leverage the metadata they possess to identify suspicious patterns of behavior. This would involve flagging instances of adults repeatedly contacting large numbers of children or creating fake profiles – indicators of grooming – without accessing private message content.
Snapchat responded by stating they “work closely with the police, safety experts, and NGOs…to prevent, identify, and remove this activity” and report offenders. They also outlined existing safety measures, including blocking teens from appearing in search results unless they have mutual connections and requiring mutual friendship or existing phone contact for direct communication. Snapchat also deploys in-app warnings to prevent unwanted contact.
Meta, the parent company of Facebook and Instagram, stated it uses technology to “proactively identify child exploitation content.” Between January and March 2023, over six million pieces of such content were removed from its platforms, with over 97% detected proactively before being reported. Meta claims it already implements the protections recommended in the NSPCC’s report.
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.