As AI deepfakes cause havoc during other elections, experts warn the UK’s politicians should be prepared.
“Just tell me what you had for breakfast”, says Mike Narouei, of ControlAI, recording on his laptop. I speak for around 15 seconds, about my toast, coffee and journey to their offices.
Within seconds, I hear my own voice, saying something entirely different.
In this case, words I have written: “Deepfakes can be extremely realistic and have the ability to disrupt our politics and damage our trust in the democratic process.”
We used free software, it required no advanced technical skills, and the whole thing took next to no time at all.
This is an audio deepfake (video fakes take more effort to produce). As well as being deployed by scammers of all kinds, such fakes are causing deep concern about their impact on elections in a year when some two billion people go to the polls, in the US, India and dozens of other countries including the UK.
London mayor Sadiq Khan was also targeted this year, with fake audio of him making inflammatory remarks about Remembrance weekend and calling for pro-Palestine marches going viral at a tense time for communities. He said new laws were needed to stop such deepfakes.
Ciaran Martin, the former director of the UK’s National Cyber Security Centre, told Sky News that expensively made video fakes can be less effective and easier to debunk than audio.
“I’m particularly worried right now about audio, because audio deepfakes are spectacularly easy to make, disturbingly easy”, he said. “And if they’re cleverly deployed, they can have an impact.”
The most damaging example so far, in his view, is an audio deepfake of President Biden, sent to voters during the New Hampshire primary in January this year.
A “robocall” with the president’s voice told voters to stay at home and “save” their votes for the presidential election in November. A political consultant later claimed responsibility and has been indicted and fined $6m (£4.7m).
Mr Martin, now a professor at the Blavatnik School of Government at Oxford University, said: “It was a very credible imitation of his voice and anecdotal evidence suggests some people were tricked by that.
“Not least because it wasn’t an email they could forward to someone else to have a look at, or on TV where lots of people were watching. It was a call to their home which they more or less had to judge alone.
“Targeted audio, in particular, is probably the biggest threat right now, and there’s no blanket solution, there’s no button there that you can just press and make this problem go away if you are prepared to pay for it or pass the right laws.
“What you need, and the US did this very well in 2020, is a series of responsible and well-informed eyes and ears throughout different parts of the electoral system to limit and mitigate the damage.”
He says there is a risk in hyping up the threat of deepfakes, which have not yet caused mass electoral damage.
A Russian-made fake broadcast of Ukrainian TV, he said, featuring a Ukrainian official taking responsibility for a terrorist attack in Moscow, was simply “not believed”, despite being expensively produced.
The UK government has passed a National Security Act with new offences of foreign interference in the country’s democratic processes.
The Online Safety Act requires tech companies to take such content down, and meetings are being regularly held with social media companies during the pre-election period.
Democracy campaigners are concerned that deepfakes could be used not just by hostile foreign actors, or lone individuals who want to disrupt the process – but political parties themselves.
Polly Curtis is chief executive of the thinktank Demos, which has called on the parties to agree to a set of guidelines for the use of AI.
She said: “The risk is that you’ll have foreign actors, you’ll have political parties, you’ll have ordinary people on the street creating content and just stirring the pot of what’s true and what’s not true.
“We want them to come together and agree together how they’re going to use these tools at the election. We want them to agree not to create generative AI or amplify it, and label it when it is used.
“This technology is so new, and there are so many elections going on, there could be a big misinformation event in an election campaign that starts to affect people’s trust in the information they’ve got.”
Deepfakes have already been targeted at major elections.
Last year, in the hours before polls opened in the Slovakian parliamentary election, an audio fake of one of the candidates claiming to have rigged the election went viral. He was heavily defeated and his pro-Russian opponent won.
The UK government established a Joint Election Security Preparations Unit earlier this year – with Whitehall officials working with police and security agencies – to respond to threats as they emerge.
A UK government spokesperson said: “Security is paramount and we are well-prepared to ensure the integrity of the election with robust systems in place to protect against any potential interference.
“The National Security Act contains tools to tackle deepfake election threats and social media platforms should also proactively take action against state-sponsored content aimed at interfering with the election.”
A Labour spokesperson said: “Our democracy is strong, and we cannot and will not allow any attempts to undermine the integrity of our elections.
“However, the rapid pace of AI technology means that government must now always be one step ahead of malign actors intent on using deepfakes and disinformation to undermine trust in our democratic system.
“Labour will be relentless in countering these threats.”
Conservative Party leader Kemi Badenoch has called on Sir Keir Starmer to sack Treasury minister Tulip Siddiq over allegations she lived in properties linked to allies of her aunt, Sheikh Hasina, the deposed prime minister of Bangladesh.
It comes after the current Bangladeshi leader, Muhammad Yunus, said London properties used by Ms Siddiq should be investigated.
He told the Sunday Times the properties should be handed back to his government if they were acquired through "plain robbery".
Tory leader Ms Badenoch said: “It’s time for Keir Starmer to sack Tulip Siddiq.
“He appointed his personal friend as anti-corruption minister and she is accused herself of corruption.
“Now the government of Bangladesh is raising serious concerns about her links to the regime of Sheikh Hasina.”
Ms Siddiq insists she has “done nothing wrong”.
Her aunt was ousted from office in August following an uprising against her 20-year leadership and fled to India.
On the same day, the prime minister said: “Tulip Siddiq has acted entirely properly by referring herself to the independent adviser, as she’s now done, and that’s why we brought into being the new code.
“It’s to allow ministers to ask the adviser to establish the facts, and yes, I’ve got confidence in her, and that’s the process that will now be happening.”
Police in Aberdeen have widened the search area for two sisters who disappeared four days ago in the city.
Eliza and Henrietta Huszti, both 32, were last seen on CCTV on Market Street after leaving their home on Tuesday at around 2.12am.
The sisters – who are part of a set of triplets and originally from Hungary – crossed the Victoria Bridge to the Torry area and turned right on to a footpath next to the River Dee.
They headed in the direction of Aberdeen Boat Club but officers said there is no evidence to suggest the missing women left the immediate area.
Specialist search teams, police dogs and a marine unit have been trying to trace the pair.
Further searches are being carried out towards the Port of Aberdeen’s South Harbour and Duthie Park.
Police Scotland said it is liaising with authorities in Hungary to support the relatives of the two sisters.
Chief Inspector Darren Bruce said: “Eliza and Henrietta’s family are understandably extremely worried about them and we are working tirelessly to find them.
“We are seriously concerned about them and have significant resources dedicated to the inquiry.”
The sisters, from Aberdeen city centre, are described as slim with long brown hair.
Officers have asked businesses in and around the South Esplanade and Menzies Road area to review their CCTV footage from the early morning of Tuesday 7 January.
Police added they are keen to hear from anyone with dashcam footage from that time.
TV presenter Katie Piper has revealed her decision to get an artificial eye, 16 years after an acid attack that left her with life-changing injuries and partial blindness.
The Loose Women panellist, 41, is an advocate for those with burns and disfigurement injuries.
She shared a video of her being fitted with the prosthetic on Instagram.
Piper said: “After many years battling with my eye health, I’ve reached the end of the road somewhat, and the decision has been made to try a prosthetic eye shell.
“This marks the start of a journey to have an artificial eye, with an incredible medical team behind me.
“As always I’m incredibly grateful to all those in the NHS and private health care system for their talent and kindness.
“I will share my journey, I’m hopeful and nervous about being able to tolerate it and would love to hear from any of you in the comments if you’ve been on this journey or have any advice.”
Commenting on the post, presenter Lisa Snowdon said Piper was a “warrior” and a “true inspiration”.
Piper has undergone hundreds of operations after suffering an acid attack arranged by her ex-boyfriend in March 2008.
She gave up her right to anonymity and made a documentary in 2009 called Katie: My Beautiful Face.
Piper also founded the Katie Piper Foundation which supports survivors of life-changing burns and scars, and has received an honorary doctorate from the Royal College of Surgeons to mark her ground-breaking work.
She was made an OBE in 2021 for her services to charity and burn victims.