Child grooming: NSPCC says social media behind 25% rise in grooming cases

Grooming crimes recorded by police have soared by a quarter in the last year, new data obtained by the NSPCC has revealed.

And the use of social media is being blamed for the massive increase.

In the North West, there were 710 offences of sexual communication with a child recorded in the year to April 2019, compared with 565 in the previous year.

In England and Wales there were 4,373 offences of sexual communication with a child recorded in the year to April 2019 compared with 3,217 in the previous year. The offence came into force on April 3, 2017, following an NSPCC campaign.

The data obtained from 43 police forces in England and Wales under Freedom of Information laws also revealed that, where age was provided, one in five victims were aged just 11 or younger.

In England and Wales, the number of recorded instances involving Instagram, which is owned by Facebook, more than doubled in 2018/19 compared with the previous year.

Overall in the last two years, Facebook-owned apps (Facebook, Messenger, Instagram, WhatsApp) and Snapchat were used in 83% of the instances where police in the region recorded and provided the communication method. Instagram was used in 26% of them.

The Government has indicated it will publish a draft Online Harms Bill early next year, following the NSPCC’s Wild West Web campaign. The proposals would introduce independent regulation of social networks, with tough sanctions if they fail to keep children safe on their platforms.

The NSPCC believes it is now crucial that Boris Johnson’s Government makes a public commitment to draw up these Online Harms laws and implement robust regulation for tech firms to force them to protect children as a matter of urgency.

Peter Wanless, NSPCC Chief Executive, said: “It’s now clearer than ever that Government has no time to lose in getting tough on these tech firms.

“Despite the huge amount of pressure that social networks have come under to put basic protections in place, children are being groomed and abused on their platforms every single day.

“These figures are yet more evidence that social networks simply won’t act unless they are forced to by law. The Government needs to stand firm and bring in regulation without delay.”

Freya was 12 when a stranger bombarded her Instagram account with sexual messages and videos while she was staying at a friend’s house.

Her mum Pippa told the NSPCC: “She was quiet and seemed on edge when she came home the next day.

“I noticed her shaking and knew there was something wrong so encouraged her to tell me what the problem was.

“When she showed me the messages, I just felt sick. It was such a violation and he was so persistent.

“He knew she was 12, but he kept bombarding her with texts and explicit videos and images. Freya didn’t even understand what she was looking at. There were pages and pages of messages; he just didn’t give up.

“Our children should be safe in their bedrooms, but they’re not. They should be safe from messages from strangers if their accounts are on private, but they’re not.”

The NSPCC’s Wild West Web campaign is calling for social media regulation to require platforms to:

- Take proactive action to identify and prevent grooming on their sites by:
  - Using Artificial Intelligence to detect suspicious behaviour
  - Sharing data with other platforms to better understand the methods offenders use and flag suspicious accounts
  - Turning off friend suggestion algorithms for children and young people, as they make it easier for groomers to identify and target children
- Design young people’s accounts with the highest privacy settings, such as geo-locators off by default, contact details private and unsearchable, and livestreaming limited to contacts only.

The charity wants to see tough sanctions for tech firms that fail to protect their young users – including steep fines for companies, boardroom bans for directors, and a new criminal offence for platforms that commit gross breaches of the duty of care.

If you need help, call the NSPCC helpline on 0808 800 5000.