The rise of artificial intelligence (AI) has fueled growing concerns over the reliability of identity verification tools on cryptocurrency exchanges.

As AI technology rapidly evolves, creating deepfake proofs of identity is becoming easier than ever. Concerns about AI-enabled risks in crypto have prompted some prominent industry executives to speak out on the matter.

Changpeng Zhao, CEO and founder of major global crypto exchange Binance, took to X (formerly Twitter) on Aug. 9 to raise the alarm about the use of AI by bad actors in crypto.

“This is pretty scary from a video verification perspective. Don’t send people coins even if they send you a video,” Zhao wrote.

Like many other crypto exchanges, Binance’s internal Know Your Customer (KYC) processes require crypto investors to submit video evidence for processing certain transactions. 

Binance requires video evidence from the user for certain withdrawals of funds. Source: Binance

The Binance CEO was referring to an AI-generated video featuring HeyGen co-founder and CEO Joshua Xu. The video specifically included Xu’s AI-generated avatar, which looks just like the real HeyGen CEO and reproduces his facial expressions, voice and speech patterns.

“Both of these video clips were 100% AI-generated, featuring my own avatar and voice clone,” Xu noted. He added that HeyGen has made major enhancements to its lifestyle avatar video quality and voice technology, which can mimic his unique accent and speech patterns.

“This will be soon deployed to production and everyone can try it out,” Xu added.

Once available to the public, the AI tool will allow anyone to create a lifelike digital avatar in just “two minutes,” the HeyGen CEO said.

Public access to AI generation tools like HeyGen’s could cause serious identity verification problems for cryptocurrency exchanges such as Binance. Like many other exchanges, Binance applies KYC measures that require users to submit a video of themselves, along with certain documents, to access services or even to withdraw funds from the platform.

Related: AI mentions skyrocket in major tech companies’ Q2 calls

Binance’s statement video policy specifically requires users to submit the video along with a picture of their identity document, such as an ID card, driver’s license or passport. The policy also requires users to state the date and certain requests in the video recording.

“Please do not put watermarks on your videos and do not edit your videos,” the policy reads.

Binance chief security officer Jimmy Su has also previously warned about risks associated with AI deepfakes. In late May, Su argued that AI technology is becoming so advanced that deepfakes may soon be undetectable by a human verifier.

Binance and HeyGen did not immediately respond to Cointelegraph’s request for comment. This article will be updated pending new information.

Magazine: AI Eye: AI’s trained on AI content go MAD, is Threads a loss leader for AI data?

Politics

CFTC chair’s final message includes a call for crypto guardrails

In what he said would be his last remarks as CFTC chair, Rostin Behnam said he intended to advocate for the commission to address regulatory challenges over digital assets.

Politics

MPs vote against new national inquiry into grooming gangs

A Tory bid to launch a new national inquiry into the grooming gangs scandal has been voted down by MPs amid criticism of “political game playing”.

MPs rejected the amendment to the Children’s Wellbeing Bill by 364 to 111, a majority of 253.

However, even if the Commons had supported the measure, it wouldn’t have actually forced the government to open the desired inquiry, due to parliamentary procedure.

Instead, it would have killed the government’s legislation, the aim of which is to reform things like the children’s care system and raise educational standards in schools.

Tonight’s vote was largely symbolic – aimed at putting pressure on Labour following days of headlines after comments by Elon Musk brought grooming gangs back into the spotlight.

The world’s richest man has hit out at Sir Keir Starmer and safeguarding minister Jess Phillips, after she rejected a new national inquiry into child sexual exploitation in Oldham, saying this should be done at a local level instead.

The Tories also previously said an Oldham inquiry should be done locally and in 2015 commissioned a seven-year national inquiry into child sex abuse, led by Professor Alexis Jay, which looked at grooming gangs.

However, they didn’t implement any of its recommendations while in office – and Sir Keir has vowed to do so instead of launching a fresh investigation into the subject.

The division list showed no Labour MPs voted in favour of the Conservative amendment.

Those who backed the proposal include all of Reform’s five MPs and 101 Tory MPs – though some senior figures, including former prime minister Rishi Sunak and former home secretaries James Cleverly and Suella Braverman, were recorded as not voting.

The Liberal Democrats abstained.

Speaking to Sophy Ridge on the Politics Hub before the vote, education minister Stephen Morgan condemned “political game playing”.

“What we’re seeing from the Conservatives is a wrecking amendment which would basically allow this bill not to go any further,” he said.

“That’s political game playing and not what I think victims want. Victims want to see meaningful change.”

As well as the Jay review, a number of local inquiries have been carried out, including in Telford and Rotherham.

Video: Grooming gangs: What happened?

Speaking earlier in the day at PMQs, Sir Keir Starmer accused Conservative leader Kemi Badenoch of “jumping on the bandwagon” after Mr Musk’s intervention and spreading “lies and misinformation”.

Referring to her time in government as children’s and equalities minister, the prime minister said: “I can’t recall her once raising this issue in the House, once calling for a national inquiry.”

He also said that, having spoken to victims of grooming gangs this morning, “they were clear they want action now, not the delay of a further inquiry”.

Ms Badenoch has argued that the public will start to “worry about a cover-up” if the prime minister resists calls for a national inquiry, and said no one has yet “joined up the dots” on grooming.

Girls as young as 11 were groomed and raped across a number of towns in England – including Oldham, Rochdale, Rotherham and Telford – over a decade ago in a national scandal that was exposed in 2013.

Politics

We should hone ‘responsible AI’ before Copilot goes autopilot

There is a critical need for a comprehensive, responsible AI approach to address privacy, security, bias and accountability challenges in the emerging agentic economy.
