
Paedophiles are starting to use virtual reality headsets to view child abuse images, data shows


Paedophiles are starting to use virtual reality headsets to view child abuse images, according to police data.

Use of the technology was recorded in eight cases in 2021/22 – the first time it has been specifically mentioned in crime reports.

During that period, police recorded 30,925 offences involving obscene images of children – the highest number logged by forces in England and Wales.

Of these, a social media or gaming site was recorded in 9,888 cases – Snapchat in 4,293, Instagram in 1,363, Facebook in 1,361 and WhatsApp in 547.

The NSPCC, which collated the data, is calling for a number of amendments to the Online Safety Bill to prevent more children from being exposed to abuse.

Sir Peter Wanless, chief executive of the NSPCC, said: “These new figures are incredibly alarming but reflect just the tip of the iceberg of what children are experiencing online.

“We hear from young people who feel powerless and let down as online sexual abuse risks becoming normalised for a generation of children.


“By creating a child safety advocate that stands up for children and families the government can ensure the Online Safety Bill systemically prevents abuse.”


The NSPCC also wants a change to the law that would mean senior managers of social media sites are held criminally liable if children are exposed to abuse.

Sir Peter said: “It would be inexcusable if in five years’ time we are still playing catch-up to pervasive abuse that has been allowed to proliferate on social media.”

A government spokesperson said: “Protecting children is at the heart of the Online Safety Bill and we have included tough, world-leading measures to achieve that aim while ensuring the interests of children and families are represented through the children’s commissioner.

“Virtual reality platforms are in scope and will be forced to keep children safe from exploitation and remove vile child abuse content.

“If companies fail to tackle this material effectively, they will face huge fines and could face criminal sanctions against their senior managers.”

A spokesman for Meta – which owns Facebook, Instagram and WhatsApp – said: “This horrific content is banned on our apps, and we report instances of child sexual exploitation to NCMEC (the National Center for Missing & Exploited Children).

“We lead the industry in the development and use of technology to prevent and remove this content, and we work with the police, child safety experts and industry partners to tackle this societal issue.

“Our work in this area is never done, and we’ll continue to do everything we can to keep this content off our apps.”

A Snapchat spokesperson said: “Any sexual abuse of children is abhorrent and illegal. Snap has dedicated teams around the world working closely with the police, experts and industry partners to combat it.

“If we proactively detect or are made aware of any sexual content exploiting minors, we immediately remove it, delete the account, and report the offender to authorities. Snapchat has extra protections in place that make it difficult for younger users to be discovered and contacted by strangers.”

Image: Roxy Longworth

‘I had no control’

Roxy Longworth was 13 when a 17-year-old boy she didn’t know contacted her on Facebook, before coercing her into sending images via Snapchat.

She said it left her feeling isolated and full of guilt, and soon a friend of his started using the images to push for more explicit pictures.

“My whole life was about doing what he told me, and hiding it from everybody,” Roxy said. “And then obviously the more photos he had, the more he had to blackmail me with until eventually he asked me to send a video. Him and his friend, they just completely owned me at that point, I had no control.”

It had a devastating effect on her mental health.

“The shame of it buried me,” she said. “I ended up becoming very ill. I self-harmed a lot, I stopped sleeping and eventually I was hospitalised with a psychotic episode. I was on suicide watch for about a year.”

She’s written a book called When You Lose It as a means of coming to terms with what happened, but says it is still haunting to know the photos exist.

Roxy added: “It’s just like a creeping feeling that you try and forget about, and then you realise those photos are still out there.

“They were on group chats with hundreds of people on them, they were everywhere.

“And the thing is – those photos are of a 13-year-old girl. That is so messed up. That’s disgusting.”

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK.
