Campaigners are warning that the use of artificial intelligence (AI) to create realistic but fake nude images of real women is becoming “normalised”.
It’s also an increasing concern in schools. A recent survey by Internet Matters found 13% of teenagers have had an experience with nude deepfakes, while the NSPCC told Sky News “a new harm is developing”.
Ofcom will later this month introduce codes of practice for internet companies to clamp down on the illegal distribution of fake nudes, but Sky News has met two victims of this relatively new trend, who say the law needs to go further.
Earlier this year, social media influencer and former Love Island contestant, Cally Jane Beech, 33, was horrified when she discovered someone had used AI to turn an underwear brand photograph of her into a nude and it was being shared online.
The original image had been uploaded to a site that uses software to digitally transform a clothed picture into a naked picture.
She told Sky News: “It looked so realistic, like nobody but me would know. It was like looking at me, but also not me.”
She added: “There shouldn’t be such a thing. It’s not a colouring book. It’s not a bit of fun. It’s people’s identity and stripping their clothes off.”
When Cally reported what had happened to the police, she struggled to get them to treat it as a crime.
“They didn’t really know what they could do about it, and because the site that hosted the image was global, they said that it’s out of their jurisdiction,” she said.
In November, Assistant Chief Constable Samantha Miller, of the National Police Chiefs’ Council, addressed a committee of MPs on the issue and concluded “the system is failing”, with a lack of capacity and inconsistency of practice across forces.
ACC Miller told the women and equalities committee she’d recently spoken to a campaigner who was in contact with 450 victims and “only two of them had a positive experience of policing”.
The government says new legislation outlawing the generation of AI nudes is coming next year, although it is already illegal to make fake nudes of minors.
Meanwhile, the problem is growing, with multiple apps available for the purpose of unclothing people in photographs. Anyone can become a victim, although the targets are nearly always women.
Professor Clare McGlynn, an expert in online harms, said: “We’ve seen an exponential rise in the use of sexually explicit deepfakes. For example, one of the largest, most notorious websites dedicated to this abuse receives about 14 million hits a month.
“These nudify apps are easy to get from the app store, and they’re advertised on TikTok. So, of course, young people are downloading them and using them. We’ve normalised the use of these nudify apps.”
‘Betrayed by my best friend’
Sky News spoke to “Jodie” (not her real name) from Cambridge who was tipped off by an anonymous email that she appeared to be in sex videos on a pornographic website.
“The images that I posted on Instagram and Facebook, which were fully clothed, were manipulated and turned into sexually explicit material,” she said.
Jodie began to suspect someone she knew was posting pictures and encouraging people online to manipulate them.
Then she found a particular photograph, taken outside King’s College in Cambridge, that only one person had.
It was her best friend, Alex Woolf. She had airdropped the picture to him alone.
Woolf, who once won BBC Young Composer of the Year, was later convicted of offences against 15 women, largely thanks to Jodie’s perseverance and detective work.
Even then, his conviction related only to the offensive comments attached to the images, because while it is illegal to share such images, it is not a crime to ask others to create them.
He was given a suspended sentence and ordered to pay £100 to each of his victims.
Jodie believes it’s imperative new laws are introduced to outlaw making and soliciting these types of images.
“My abuse is not your fun,” she said.
“Online abuse has the same effect psychologically that physical abuse does. I became suicidal, I wasn’t able to trust those closest to me because I had been betrayed by my best friend. And the effect of that on a person is monumental.”
‘A scary, lonely place’
A survey in October by Teacher Tapp found 7% of teachers answered yes to the question: “In the last 12 months, have you had an incident of a student using technology to create a fake sexually graphic image of a classmate?”
Through their campaigning, both Cally and Jodie have come across examples of schoolgirls being deepfaked.
Cally said: “It is used as a form of bullying because they think it’s funny. But it can have such a mental toll, and it must be a very scary and lonely place for a young girl to be dealing with that.”
The NSPCC said its helpline has received calls about nude deepfakes.
The charity’s policy manager for child safety online, Rani Govender, said the pictures can be used as “part of a grooming process” or as a form of blackmail, as well as being passed around by classmates “as a form of bullying and harassment”.
“Children become scared, isolated and they worry they won’t be believed that the images are created by someone else,” Ms Govender said.
She added: “This is a new harm, and it is developing, and it will require new measures from the government with child protection as a priority.”
Alex Davies-Jones, under-secretary of state for victims, told MPs in November: “We’ve committed to making an offence of creating a deepfake illegal and we will be legislating for that this session.”
For campaigners like Jodie and Cally, the new laws can’t come soon enough. However, they worry the legislation won’t include strong enough clauses banning the solicitation of content and ensuring images are removed once they have been discovered.
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.
A man has been arrested after a woman in her 80s was killed in a Christmas Day motorway crash.
A white Ford Fiesta and a black Volkswagen Tiguan collided on the A1(M) near Darlington just after 8.30pm, North Yorkshire Police said.
The passenger of the Ford Fiesta, a woman in her 80s from the Durham area, suffered serious injuries and died at the scene.
The car’s driver, a man in his 80s from the Durham area, was taken to hospital in a serious but stable condition.
The driver of the Volkswagen, a man in his 20s from the Durham area, was arrested on suspicion of causing death by dangerous driving.
He has now been released under investigation.
The motorway was closed until around 8am on Boxing Day for collision investigators and National Highways to assess the road surface.
It is now open in both directions but with a lane closure still in place as of 9.30am.
Police have appealed for witnesses and dashcam footage of the crash, which happened on the northbound carriageway between Junction 57 (A66(M) junction) and Junction 58 (Merrybent).
The force also thanked members of the public who assisted at the scene.
Two women have died following reports of a stabbing in Milton Keynes on Christmas Day, police have said.
A dog injured in the incident in Bletchley also died after being taken to the vets.
A man and a teenage boy suffered serious injuries.
A 49-year-old man from Milton Keynes has been arrested on suspicion of murder and attempted murder and remains in custody.
Officers were called to a block of apartments in Santa Cruz Avenue just after 6.30pm on Christmas Day following reports of a stabbing.
The two women, aged 38 and 24, died at the scene, Thames Valley Police said. Their next of kin have been informed.
The injured man and teenage boy were taken to hospital and are both in a stable condition.
Police said the parties are known to each other.
Senior investigating officer Detective Chief Inspector Stuart Brangwin said: “Firstly I would like to extend my deepest condolences to the families of the women who have tragically died in this shocking incident.
“We have launched a double murder investigation, which may be concerning to the wider public; however, we have made an arrest and are not looking for anyone else in connection with this incident and the parties are known to each other.”
A man has been charged with murdering a woman whose body was found nine days after she went missing.
Police said extensive searches and appeals were launched to find Mariann Borocz after she vanished on 14 December.
Her body was discovered at a property in Bolton, Greater Manchester, on Christmas Eve.
Christopher Barlow, 61, from Bolton, has been charged with her murder and has been remanded in custody ahead of an appearance before magistrates on Thursday.
Greater Manchester Police said Ms Borocz’s family are being supported by specialist officers.
Detective Chief Inspector Tony Platten thanked those who spoke to officers and shared the missing person appeals.
“On behalf of the entire investigation team, our condolences remain with Mariann’s family as they try to come to terms with her death,” he said.
“Our investigation is moving at pace, and we are continuing to work hard to build a full timeline of events leading up to Mariann’s death.
“As part of our investigation, we are once again appealing for additional information from the local community.”