As AI deepfakes cause havoc during other elections, experts warn the UK’s politicians should be prepared.

“Just tell me what you had for breakfast”, says Mike Narouei, of ControlAI, recording on his laptop. I speak for around 15 seconds, about my toast, coffee and journey to their offices.

Within seconds, I hear my own voice, saying something entirely different.

In this case, words I have written: “Deepfakes can be extremely realistic and have the ability to disrupt our politics and damage our trust in the democratic process.”

Image: Tamara Cohen’s voice being turned into a deepfake

We used free software, it required no advanced technical skills, and the whole thing took next to no time at all.

This is an audio deepfake – video fakes take more effort to produce. As well as being deployed by scammers of all kinds, such fakes are causing deep concern about their impact on elections, in a year when some two billion people are going to the polls in the US, India and dozens of other countries including the UK.

Sir Keir Starmer fell victim to one at last year’s Labour Party conference – audio purportedly of him swearing at staff. It was quickly outed as a fake, but the identity of whoever made it has never been uncovered.

London mayor Sadiq Khan was also targeted this year, when fake audio of him making inflammatory remarks about Remembrance weekend and calling for pro-Palestine marches went viral at a tense time for communities. He said new laws were needed to stop such fakes.

Ciaran Martin, the former director of the UK’s National Cyber Security Centre, told Sky News that expensively made video fakes can be less effective and easier to debunk than audio.

“I’m particularly worried right now about audio, because audio deepfakes are spectacularly easy to make, disturbingly easy”, he said. “And if they’re cleverly deployed, they can have an impact.”

The most damaging, in his view, was an audio deepfake of President Biden, sent to voters during the New Hampshire primary in January this year.

A “robocall” with the president’s voice told voters to stay at home and “save” their votes for the presidential election in November. A political consultant later claimed responsibility and has been indicted and fined $6m (£4.7m).

Image: Ciaran Martin, the former NCSC director

Mr Martin, now a professor at the Blavatnik School of Government at Oxford University, said: “It was a very credible imitation of his voice and anecdotal evidence suggests some people were tricked by that.

“Not least because it wasn’t an email they could forward to someone else to have a look at, or on TV where lots of people were watching. It was a call to their home which they more or less had to judge alone.

“Targeted audio, in particular, is probably the biggest threat right now, and there’s no blanket solution, there’s no button there that you can just press and make this problem go away if you are prepared to pay for it or pass the right laws.

“What you need, and the US did this very well in 2020, is a series of responsible and well-informed eyes and ears throughout different parts of the electoral system to limit and mitigate the damage.”

He says there is a risk in hyping up the threat of deepfakes, when they have not yet caused mass electoral damage.

A Russian-made fake broadcast of Ukrainian TV, he said, featuring a Ukrainian official taking responsibility for a terrorist attack in Moscow, was simply “not believed”, despite being expensively produced.

The UK government has passed a National Security Act with new offences of foreign interference in the country’s democratic processes.

The Online Safety Act requires tech companies to take such content down, and meetings are being regularly held with social media companies during the pre-election period.

Democracy campaigners are concerned that deepfakes could be used not just by hostile foreign actors, or lone individuals who want to disrupt the process – but political parties themselves.

Polly Curtis is chief executive of the thinktank Demos, which has called on the parties to agree to a set of guidelines for the use of AI.

Image: Polly Curtis, the chief executive of Demos

She said: “The risk is that you’ll have foreign actors, you’ll have political parties, you’ll have ordinary people on the street creating content and just stirring the pot of what’s true and what’s not true.

“We want them to come together and agree together how they’re going to use these tools at the election. We want them to agree not to create generative AI or amplify it, and label it when it is used.

“This technology is so new, and there are so many elections going on, there could be a big misinformation event in an election campaign that starts to affect people’s trust in the information they’ve got.”

Deepfakes have already been targeted at major elections.

Last year, hours before polls closed in Slovakia’s parliamentary election, an audio fake of one of the leading candidates claiming to have rigged the election went viral. He was heavily defeated and his pro-Russian opponent won.

The UK government established a Joint Election Security Preparations Unit earlier this year – with Whitehall officials working with police and security agencies – to respond to threats as they emerge.

A UK government spokesperson said: “Security is paramount and we are well-prepared to ensure the integrity of the election with robust systems in place to protect against any potential interference.

“The National Security Act contains tools to tackle deepfake election threats and social media platforms should also proactively take action against state-sponsored content aimed at interfering with the election.”

A Labour spokesperson said: “Our democracy is strong, and we cannot and will not allow any attempts to undermine the integrity of our elections.

“However, the rapid pace of AI technology means that government must now always be one step ahead of malign actors intent on using deepfakes and disinformation to undermine trust in our democratic system.

“Labour will be relentless in countering these threats.”

Rachel Reeves acknowledges damage of 'too many' budget leaks

The chancellor, Rachel Reeves, has acknowledged there were “too many leaks” in the run-up to last month’s budget.

The flow of budget content to news organisations was “very damaging”, Ms Reeves told MPs on the Treasury select committee on Wednesday.

“Leaks are unacceptable. The budget had too much speculation. There were too many leaks, and much of those leaks and speculation were inaccurate, very damaging”, she said.

The cost of UK government borrowing briefly spiked after news reports that income taxes would not rise as had been expected, meaning Labour would not break its manifesto pledge.

An inquiry into the leaks from the Treasury to members of the media is to take place. But James Bowler, the Treasury’s top official, who was also giving evidence to MPs, would not say whether its findings would be published.

Committee chair Dame Meg Hillier asked whether the group of MPs could see the inquiry’s findings in full.

“I’d have to engage with the people in the inquiry about the views on that”, replied Mr Bowler, permanent secretary to the Treasury.

OBR leak ‘a mistake of such gravity’

The entire contents of the budget ended up being released 40 minutes early by the independent forecaster, the Office for Budget Responsibility (OBR).

A report into this error found the OBR had uploaded documents containing its calculations of budget numbers to a link on the watchdog’s website that it had mistakenly believed was inaccessible to the public.

Tax rises ruled out

The chancellor ruled out future revenue-raising measures, including applying capital gains tax to primary residences and changing the state pension triple lock.

Committee member and former chair Dame Harriet Baldwin noted that the chancellor’s previous statement to MPs, in which she said she would not overhaul council tax or look at road pricing, had turned out to be inaccurate.

During the budget, a per-mile charge for electric vehicles was introduced, as was an additional council tax charge for properties worth £2m or more.

Strategy responds to MSCI letter, makes case for index inclusion

Strategy, the largest Bitcoin treasury company, submitted feedback to index company MSCI on Wednesday about the proposed policy change that would exclude from stock market indexes digital asset treasury companies holding 50% or more of their balance sheets in crypto.

Digital asset treasury companies are operating companies that can actively adjust their businesses, according to the letter, which cited Strategy’s Bitcoin-backed credit instruments as an example.

The proposed policy change would bias the MSCI against crypto as an asset class, instead of the index company acting as a neutral arbiter, the letter said.

The first page of Strategy’s letter to the MSCI pushes back against the proposed eligibility criteria change. Source: Strategy

The MSCI does not exclude other types of businesses that invest in a single asset class, including real estate investment trusts (REITs), oil companies and media portfolios, according to Strategy. The letter said:

“Many financial institutions primarily hold certain types of assets and then package and sell derivatives backed by those assets, like residential mortgage-backed securities.”

The letter also said implementing the change “undermines” US President Donald Trump’s goal of making the United States the global leader in crypto. However, critics argue that including crypto treasury companies in global indexes poses several risks.