NEW DATA released by the NSPCC shows a shocking increase in the number of child sexual abuse image crimes recorded by one police force in South Wales.

Child abuse image crimes recorded by Gwent Police increased by 18 per cent in a year and 78 per cent in five years, with more than 2,000 offences recorded since 2017/18.

A total of 475 offences in which child abuse images were collected and distributed were logged in Gwent in 2022/23, out of 33,000 offences across the UK, according to Freedom of Information data.

Over the same five-year period, there was a 79 per cent increase across the UK as a whole.

The NSPCC said the figures show the need for swift and ambitious action by tech companies to address what is currently happening on their platforms and for Ofcom to significantly strengthen its approach to tackling child sexual abuse through effective enforcement of the Online Safety Act.

The new data shows the widespread use of social media and messaging apps in child sexual abuse image crimes, which the NSPCC says results largely from a failure to design child safety into products.

Social media and messaging apps used in abuse crimes

Where police forces disclosed the site involved, Snapchat was flagged in almost half (44 per cent) of instances – more than 4,000 times. Meta-owned products (Facebook, Instagram and WhatsApp) were flagged more than 2,500 times, making up a quarter (26 per cent) of known instances.

It comes as insight from Childline shows young people being targeted by adults to share child abuse images via social media, and the calculated use of end-to-end encrypted private messaging apps by adults to find and share child abuse images.

A 14-year-old girl told the NSPCC-run service: “One night I got chatting with this guy online who I’d never met and he made me feel so good about myself.

"He told me he was 15, even though deep down I didn’t believe him. I sent him a couple of semi-nudes on Snap(chat), but then instantly regretted it.

"I asked him to delete the pics, but he just kept on making me do stuff for him not to post them – like getting me to strip live on camera. I just want to block him, but if I block him he will just post the pictures.”

A 15-year-old boy told Childline: “A while ago I saw a video on YouTube about how a guy was busting paedophiles and creeps on the internet by pretending to be a kid, and I kind of wanted to do a similar thing.

"I looked around Instagram for the creepiest accounts about kids my age and younger. In the end, I came across this link on one of their stories. It’s a link to a WhatsApp group chat in which [child sexual abuse material] is sent daily!

"There are literally hundreds of members in this group chat and they’re always calling the kids ‘hot’ and just being disgusting.”

Online Safety Act implementation

The NSPCC said that disrupting online child sexual abuse taking place at increasing levels will require regulated tech platforms to introduce systemic changes to their products to stop them being used to organise, commit, and share child abuse.

A consultation into Ofcom’s first codes for companies to adopt to disrupt child sexual abuse on their platforms closed last week.

The NSPCC want these measures introduced without delay but urged Ofcom to begin work on a second version of the codes that will require companies to go much further.

The charity said companies should be required to use technology that can help identify and tackle grooming, sextortion and new child abuse images.

They also want tougher measures for private messaging services to make child protection a priority, including in end-to-end encrypted environments.

Facebook and Instagram were used in more than a fifth of abuse image instances where a platform was recorded by police forces.

The NSPCC warned that Meta’s roll-out of end-to-end encryption on these sites will prevent authorities from identifying offenders and safeguarding victims.

The charity wants the plans paused until Meta can prove child safety will not be compromised, and has urged all parties to find a balance between the safety and privacy of all users, including children.

The NSPCC said further rollout should be delayed until Ofcom can study Meta’s risk assessment as part of the new regulatory regime.

NSPCC Chief Executive Sir Peter Wanless said: “It’s alarming to see online child abuse continue to rise, especially when tech companies should be acting to make their sites safe by design ahead of incoming regulation.

“Behind these crimes are children who have been targeted by adults who are able to organise and share sexual abuse with other offenders seamlessly across social media and messaging apps.

“The Online Safety Act sets out robust measures to make children fundamentally safer on the sites and apps they use so they can enjoy the benefits of a healthy online experience.

“Ofcom has been quick off the blocks but must act with greater ambition to ensure companies prioritise child safety in the comprehensive way that is so desperately needed.”

Susie Hargreaves OBE, Chief Executive of the Internet Watch Foundation, the UK’s front line against child sexual abuse imagery online, said: “This is a truly disturbing picture, and a reflection of the growing scale of the availability, and demand, for images and videos of children suffering sexual abuse.

“The people viewing and sharing and distributing this material need to know it is not a victimless crime. They are real children, suffering real abuse and sexual torture, the effects of which can linger a lifetime.

“That more and more people are trying to share and spread this material shows we should all be doing everything we can to stop this, building more, and innovative solutions to keep children safe.

"The IWF is ready to support technology companies and Ofcom in implementing the Online Safety Act to help make the UK the safest place in the world to be online.”

Detective Chief Superintendent Nicky Brain of Gwent Police said: “A rise in any criminal offence committed against a child is a concern and we’re seeing, like many other UK police services, an increase in offences of this nature.

“These offences can have a huge emotional impact not only on the victims, but on whole families and the wider community.

“Many of our recorded cases are linked to self-generated images, for example where someone aged under 18 has shared a sexually explicit or suggestive image consensually with one of their peers.

“We ensure that these cases are dealt with proportionally as safeguarding is always the priority.”

According to Gwent Police, the majority of cases are identified via referrals from the National Centre for Missing and Exploited Children (NCMEC), which are reports linked to online activity, and a significant proportion of these referrals will relate to young people who are navigating adolescence and exploring their sexuality.

The force says that, out of embarrassment or not feeling comfortable enough to talk to peers, these young people may have turned to the internet to search for information, and on most occasions the material accessed will be “non-aggravated child sexual abuse material” – whereby the imagery has not been coerced or forced and contains individuals of a similar age to themselves.

The force has said it does receive reports from schools and concerned parents, and encourages anyone who has concerns about a child to call 101, contact the Child Exploitation and Online Protection (CEOP) command via its website, or call 999 if there is an immediate risk of harm to a child.

Gwent Police has said its data unit has rerun the figures through the system and updated them.

A spokesperson for Gwent Police said: "Although not identical, they are broadly in line so we’ve got no issues with those quoted in the NSPCC release."

EDITOR'S NOTE: This article has been updated due to some further information received from Gwent Police.