As AI deepfakes cause havoc during other elections, experts warn the UK’s politicians should be prepared.
“Just tell me what you had for breakfast”, says Mike Narouei, of ControlAI, recording on his laptop. I speak for around 15 seconds, about my toast, coffee and journey to their offices.
Within seconds, I hear my own voice, saying something entirely different.
In this case, words I have written: “Deepfakes can be extremely realistic and have the ability to disrupt our politics and damage our trust in the democratic process.”
Image: Tamara Cohen’s voice being turned into a deepfake
We have used free software, it hasn’t taken any advanced technical skills, and the whole thing has taken next to no time at all.
This is an audio deepfake – video fakes take more effort to produce. As well as being deployed by scammers of all kinds, such fakes are causing deep concern about their impact on elections, in a year when some two billion people go to the polls in the US, India and dozens of other countries including the UK.
London mayor Sadiq Khan was targeted this year, with fake audio of him making inflammatory remarks about Remembrance weekend and calling for pro-Palestine marches going viral at a tense time for communities. He said new laws were needed to stop such fakes.
Ciaran Martin, the former director of the UK’s National Cyber Security Centre, told Sky News that expensively made video fakes can be less effective and easier to debunk than audio.
“I’m particularly worried right now about audio, because audio deepfakes are spectacularly easy to make, disturbingly easy”, he said. “And if they’re cleverly deployed, they can have an impact.”
The most damaging so far, in his view, was an audio deepfake of President Biden sent to voters during the New Hampshire primary in January this year.
A “robocall” with the president’s voice told voters to stay at home and “save” their votes for the presidential election in November. A political consultant later claimed responsibility and has been indicted and fined $6m (£4.7m).
Mr Martin, now a professor at the Blavatnik School of Government at Oxford University, said: “It was a very credible imitation of his voice and anecdotal evidence suggests some people were tricked by that.
“Not least because it wasn’t an email they could forward to someone else to have a look at, or on TV where lots of people were watching. It was a call to their home which they more or less had to judge alone.
“Targeted audio, in particular, is probably the biggest threat right now, and there’s no blanket solution, there’s no button there that you can just press and make this problem go away if you are prepared to pay for it or pass the right laws.
“What you need, and the US did this very well in 2020, is a series of responsible and well-informed eyes and ears throughout different parts of the electoral system to limit and mitigate the damage.”
He says there is a risk in hyping up the threat of deepfakes when they have not yet caused mass electoral damage.
A Russian-made fake broadcast of Ukrainian TV, he said, featuring a Ukrainian official taking responsibility for a terrorist attack in Moscow, was simply “not believed”, despite being expensively produced.
The UK government has passed a National Security Act with new offences of foreign interference in the country’s democratic processes.
The Online Safety Act requires tech companies to take such content down, and meetings are being regularly held with social media companies during the pre-election period.
Democracy campaigners are concerned that deepfakes could be used not just by hostile foreign actors, or lone individuals who want to disrupt the process – but political parties themselves.
Polly Curtis is chief executive of the thinktank Demos, which has called on the parties to agree to a set of guidelines for the use of AI.
Image: Polly Curtis, the chief executive of Demos
She said: “The risk is that you’ll have foreign actors, you’ll have political parties, you’ll have ordinary people on the street creating content and just stirring the pot of what’s true and what’s not true.
“We want them to come together and agree together how they’re going to use these tools at the election. We want them to agree not to create generative AI [content] or amplify it, and label it when it is used.
“This technology is so new, and there are so many elections going on, there could be a big misinformation event in an election campaign that starts to affect people’s trust in the information they’ve got.”
Deepfakes have already been targeted at major elections.
Last year, just hours before polls closed in Slovakia’s parliamentary election, an audio fake of one of the candidates claiming to have rigged the election went viral. He was heavily defeated and his pro-Russian opponent won.
The UK government established a Joint Election Security Preparations Unit earlier this year – with Whitehall officials working with police and security agencies – to respond to threats as they emerge.
A UK government spokesperson said: “Security is paramount and we are well-prepared to ensure the integrity of the election with robust systems in place to protect against any potential interference.
“The National Security Act contains tools to tackle deepfake election threats and social media platforms should also proactively take action against state-sponsored content aimed at interfering with the election.”
Shadow security minister Dan Jarvis said: “Our democracy is strong, and we cannot and will not allow any attempts to undermine the integrity of our elections.
“However, the rapid pace of AI technology means that government must now always be one step ahead of malign actors intent on using deepfakes and disinformation to undermine trust in our democratic system.
“Labour will be relentless in countering these threats.”
Whitehall officials tried to convince Michael Gove to go to court to cover up the grooming scandal in 2011, Sky News can reveal.
Dominic Cummings, who was working for Lord Gove at the time, has told Sky News that officials in the Department for Education (DfE) wanted to help efforts by Rotherham Council to stop a national newspaper from exposing the scandal.
In an interview with Sky News, Mr Cummings said that officials wanted a “total cover-up”.
The revelation shines a light on the institutional reluctance of some key officials in central government to publicly highlight the grooming gang scandal.
In 2011, Rotherham Council approached Lord Gove’s Department for Education asking for help following inquiries by The Times. The paper’s then chief reporter, the late Andrew Norfolk, was asking about the sexual abuse and trafficking of children in Rotherham. Officials considered the council’s request and then recommended to Lord Gove’s office that the minister back a judicial review which might, if successful, stop The Times publishing the story.
Lord Gove rejected the request on the advice of Mr Cummings. Sources have independently confirmed Mr Cummings’ account.
Image: Education Secretary Michael Gove in 2011. Pic: PA
Mr Cummings told Sky News: “Officials came to me in the Department of Education and said: ‘There’s this Times journalist who wants to write the story about these gangs. The local authority wants to judicially review it and stop The Times publishing the story’.
“So I went to Michael Gove and said: ‘This council is trying to actually stop this and they’re going to use judicial review. You should tell the council that far from siding with the council to stop The Times you will write to the judge and hand over a whole bunch of documents and actually blow up the council’s JR (judicial review).’
“Some officials wanted a total cover-up and were on the side of the council…
“They wanted to help the local council do the cover-up and stop The Times’ reporting, but other officials, including in the DfE private office, said this is completely outrageous and we should blow it up. Gove did, the judicial review got blown up, Norfolk stories ran.”
Video: Grooming gangs victim speaks out
The judicial review wanted by officials would have asked a judge to rule on the lawfulness of The Times’ publication plans and on the consequences that would flow from this information entering the public domain.
A second source told Sky News that the advice from officials was to side with Rotherham Council and its attempts to stop publication of details it did not want in the public domain.
One of the motivations cited for stopping publication was to prevent the identities of abused children entering the public domain.
There was also a fear that publication could set back the existing attempts to halt the scandal, although incidents of abuse continued for many years after these cases.
Sources suggested that there is also a natural risk aversion amongst officials to publicity of this sort.
Mr Cummings, who ran the Vote Leave Brexit campaign and was Boris Johnson’s right-hand man in Downing Street, has long pushed for a national inquiry into grooming gangs to expose failures at the heart of government.
He said the inquiry, announced today, “will be a total s**tshow for Whitehall because it will reveal how much Whitehall worked to try and cover up the whole thing.”
He also described Mr Johnson, with whom he has a long-standing animus, as a “moron” for saying that money spent on inquiries into historic child sexual abuse had been “spaffed up the wall”.
Asked by Sky News political correspondent Liz Bates why he had not pushed for a public inquiry himself when he worked in Number 10 in 2019-20, Mr Cummings said Brexit and then COVID had taken precedence.
“There are a million things that I wanted to do but in 2019 we were dealing with the constitutional crisis,” he said.
The Department for Education and Rotherham Council have been approached for comment.