Campaigners are warning the use of artificial intelligence (AI) to create realistic but fake nude images of real women is becoming “normalised”.
It’s also an increasing concern in schools. A recent survey by Internet Matters found 13% of teenagers have had an experience with nude deepfakes, while the NSPCC told Sky News “a new harm is developing”.
Later this month, Ofcom will introduce codes of practice requiring internet companies to clamp down on the illegal distribution of fake nudes. But Sky News has met two victims of this relatively new form of abuse, who say the law needs to go further.
Earlier this year, social media influencer and former Love Island contestant Cally Jane Beech, 33, was horrified to discover that someone had used AI to turn an underwear brand photograph of her into a nude, and that it was being shared online.
The original image had been uploaded to a site that uses software to digitally transform a clothed picture into a naked picture.
Image: Cally Jane Beech says she was horrified after a photo of her was turned into a nude and shared online
She told Sky News: “It looked so realistic, like nobody but me would know. It was like looking at me, but also not me.”
She added: “There shouldn’t be such a thing. It’s not a colouring book. It’s not a bit of fun. It’s people’s identity and stripping their clothes off.”
When Cally reported what had happened to the police, she struggled to get them to treat it as a crime.
“They didn’t really know what they could do about it, and because the site that hosted the image was global, they said that it’s out of their jurisdiction,” she said.
In November, Assistant Chief Constable Samantha Miller, of the National Police Chiefs’ Council, addressed a committee of MPs on the issue and concluded “the system is failing”, with a lack of capacity and inconsistency of practice across forces.
ACC Miller told the women and equalities committee she’d recently spoken to a campaigner who was in contact with 450 victims and “only two of them had a positive experience of policing”.
The government says new legislation outlawing the generation of AI nudes is coming next year, although it is already illegal to make fake nudes of minors.
Meanwhile, the problem is growing, with multiple apps available for the purpose of unclothing people in photographs. Anyone can become a victim, although it is nearly always women.
Professor Clare McGlynn, an expert in online harms, said: “We’ve seen an exponential rise in the use of sexually explicit deepfakes. For example, one of the largest, most notorious websites dedicated to this abuse receives about 14 million hits a month.
“These nudify apps are easy to get from the app store and they’re advertised on TikTok. So, of course, young people are downloading them and using them. We’ve normalised the use of these nudify apps.”
‘Betrayed by my best friend’
Sky News spoke to “Jodie” (not her real name) from Cambridge who was tipped off by an anonymous email that she appeared to be in sex videos on a pornographic website.
“The images that I posted on Instagram and Facebook, which were fully clothed, were manipulated and turned into sexually explicit material,” she said.
Image: Alex Woolf avoided jail and was told to pay £100 to each of his 15 victims
Jodie began to suspect someone she knew was posting pictures and encouraging people online to manipulate them.
Then she found a particular photograph, taken outside King’s College in Cambridge, that only one person had.
It was her best friend, Alex Woolf. She had airdropped the picture to him alone.
Woolf, who once won BBC Young Composer of the Year, was later convicted of offences against 15 women, largely thanks to Jodie’s perseverance and detective work.
Even then, his conviction related only to the offensive comments attached to the images because, while it is illegal to share such images, it is not a crime to ask others to create them.
Image: Jodie identified Woolf as he was the only one she’d sent this photo to
He was given a suspended sentence and ordered to pay £100 to each of his victims.
Jodie believes it’s imperative new laws are introduced to outlaw making and soliciting these types of images.
“My abuse is not your fun,” she said.
“Online abuse has the same effect psychologically that physical abuse does. I became suicidal, I wasn’t able to trust those closest to me because I had been betrayed by my best friend. And the effect of that on a person is monumental.”
‘A scary, lonely place’
A survey in October by Teacher Tapp found 7% of teachers answered yes to the question: “In the last 12 months, have you had an incident of a student using technology to create a fake sexually graphic image of a classmate?”
In their campaigning, both Cally and Jodie have come across examples of schoolgirls being deepfaked.
Cally said: “It is used as a form of bullying because they think it’s funny. But it can have such a mental toll, and it must be a very scary and lonely place for a young girl to be dealing with that.”
The NSPCC said its helpline has received calls about nude deepfakes.
The charity’s policy manager for child safety online, Rani Govender, said the pictures can be used as “part of a grooming process” or as a form of blackmail, as well as being passed around by classmates “as a form of bullying and harassment”.
“Children become scared, isolated and they worry they won’t be believed that the images are created by someone else,” Ms Govender said.
She added: “This is a new harm, and it is developing, and it will require new measures from the government with child protection as a priority.”
Alex Davies-Jones, under-secretary of state for victims, told MPs in November: “We’ve committed to making an offence of creating a deepfake illegal and we will be legislating for that this session.”
For campaigners like Jodie and Cally, the new laws cannot come soon enough. However, they worry the legislation won’t contain strong enough clauses on banning the soliciting of such content and on ensuring images are removed once they have been discovered.
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.
Heathrow Airport bosses had been warned of potential substation failures less than a week before a major power outage closed the airport for a day, a committee of MPs has heard.
Nigel Wicking, chief executive of the Heathrow Airline Operators’ Committee, told MPs on the Transport Committee that he raised issues about resilience on 15 March, after thefts of cable and wiring took out lights on a runway.
A fire at an electricity substation in west London meant the power supply was disrupted to Europe’s largest airport for a day – causing travel chaos for around 200,000 passengers.
“I’d actually warned Heathrow of concerns that we had with regard to the substations and my concern was resilience”, Mr Wicking said.
“So the first occasion was to Team Heathrow director on the 15th of the month of March. And then I also spoke to the chief operating officer and chief customer officer two days before regarding this concern.
“And it was following a number of, a couple of incidents of, unfortunately, theft, of wire and cable around some of the power supply that on one of those occasions, took out the lights on the runway for a period of time. That obviously made me concerned.”
Mr Wicking also said he believed Heathrow’s Terminal 5 could have been ready to receive repatriation flights by “late morning” on the day of the closure, and that “there was opportunity also to get flights out”.
However, Heathrow chief executive Thomas Woldbye said keeping the airport open during last month’s power outage would have been “disastrous”.
There was a risk of having “literally tens of thousands of people stranded in the airport, where we have nowhere to put them”, Mr Woldbye said.
Another 23 women have come forward to report they may have been raped by Zhenhao Zou, the Chinese PhD student detectives believe may be one of the country’s most prolific sex offenders.
The Metropolitan Police launched an international appeal after Zou, 28, was convicted of drugging and raping 10 women following a trial at the Inner London Crown Court last month.
Detectives have not confirmed whether the 23 people who have come forward are in addition to the more than 50 other women worldwide they estimate may have been targeted by the University College London student.
Metropolitan Police commander Kevin Southworth said: “We have victims reaching out to us from different parts of the globe.
“At the moment, the primary places where we believe offending may have occurred at this time appears to be both in England, here in London, and over in China.”
Image: Metropolitan Police commander Kevin Southworth
Zou lived in a student flat in Woburn Place, near Russell Square in central London, and later in a flat in the Uncle building in Churchyard Row in Elephant and Castle, south London.
He had also been a student at Queen’s University Belfast, where he studied mechanical engineering from 2017 until 2019. Police say they have not had any reports from Belfast but added they were “open-minded about that”.
“Given how active and prolific Zou appears to have been with his awful offending, there is every prospect that he could have offended anywhere in the world,” Mr Southworth said.
“We wouldn’t want anyone to write off the fact they may have been a victim of his behaviour simply by virtue of the fact that you are from a certain place.
“The bottom line is, if you think you may have been affected by Zhenhao Zou or someone you know may have been, please don’t hold back. Please make contact with us.”
Zou used hidden or handheld cameras to record his attacks, and kept the footage and often the women’s belongings as souvenirs.
He targeted young, Chinese women, inviting them to his flat for drinks or to study, before drugging and assaulting them.
Zou was convicted of 11 counts of rape, with two of the offences relating to one victim, as well as three counts of voyeurism, 10 counts of possession of an extreme pornographic image, one count of false imprisonment and three counts of possession of a controlled drug with intent to commit a sexual offence, namely butanediol.
Video: Moment police arrest rapist student
Mr Southworth said: “Of those 10 victims, several were not identified so as we could be sure exactly where in the world they were, but their cases, nevertheless, were sufficient to see convictions at court.
“There were also, at the time, 50 videos that were identified of further potential female victims of Zhenhao Zou’s awful crimes.
“We are still working to identify all of those women in those videos.
“We have now, thankfully, had 23 victim survivors come forward through the appeal that we’ve conducted, some of whom may be identical with some of the females that we saw in those videos, some of whom may even turn out to be from the original indicted cases.”
Mr Southworth added: “Ultimately, now it’s the investigation team’s job to professionally pick our way through those individual pieces of evidence, those individual victims’ stories, to see if we can identify who may have been a victim, when and where, so then we can bring Zou to justice for the full extent of his crimes.”
Mr Southworth said more resources will be put into the investigation, and that detectives are looking to understand “what may have happened without wishing to revisit the trauma, but in a way that enables [the potential victims] to give evidence in the best possible way.”
The Metropolitan Police is appealing to anyone who thinks they may have been targeted by Zou to contact the force either by emailing survivors@met.police.uk, or via the major incident public portal on the force’s website.
An 11-year-old girl who went missing after entering the River Thames has been named as Kaliyah Coa.
An “extensive search” has been carried out after the incident in east London at around 1.30pm on Monday.
Police said the child had been playing during a school inset day and entered the water near Barge House Causeway, North Woolwich.
A recovery mission is now said to be under way to find Kaliyah along the Thames, with the Metropolitan Police carrying out an extensive examination of the area.
Image: Barge House Causeway is a concrete slope in North Woolwich leading into the Thames
Chief Superintendent Dan Card thanked members of the public and emergency teams who responded to “carry out a large-scale search during a highly pressurised and distressing time”.
He also confirmed drone technology and boats were being used to “conduct a thorough search over a wide area”.
He added: “Our specialist officers are supporting Kaliyah’s family through this deeply upsetting time and our thoughts go out to all those impacted by what has happened.”
“Equally we appreciate this has affected the wider community who have been extremely supportive. You will see extra officers in the area during the coming days.”
On Monday, Kerry Benadjaoud, a 62-year-old resident from the area, said she heard of the incident from her next-door neighbour, who “was outside doing her garden and there was two little kids running, and they said ‘my friend’s in the water'”.
When she arrived at the scene with a life ring, a man told her he had called the police, “but he said at the time he could see her hands going down”.
Barge House Causeway is a concrete slope that goes directly into the River Thames and is used to transport boats.
Residents pointed out that it appeared to be covered in moss and was slippery.