Three years ago in Detroit, Robert Williams arrived home from work to find the police waiting at his front door, ready to arrest him for a crime he hadn’t committed.
Facial recognition technology used by officers had mistaken Williams for a suspect who had stolen thousands of dollars' worth of watches.
The system linked a blurry CCTV image of the suspect with Williams in what is considered to be the first known case of wrongful arrest owing to the use of the AI-based technology.
The experience was “infuriating”, Mr Williams said.
“Imagine knowing you didn’t do anything wrong… And they show up to your home and arrest you in your driveway before you can really even get out the car and hug and kiss your wife or see your kids.”
Mr Williams, 45, was released after 30 hours in custody and has filed an ongoing lawsuit against Detroit's police department, asking for compensation and a ban on the use of facial recognition software to identify suspects.
Image: Robert Williams with his family
There are six known instances of wrongful arrest linked to facial recognition in the US, and the victims in all cases were black people.
Artificial intelligence reflects racial bias in society, because it is trained on real-world data.
A US government study published in 2019 found that facial recognition technology was between 10 and 100 times more likely to misidentify black people than white people.
This is because the technology is trained on predominantly white datasets, so it has less information on what people of other races look like and is more likely to make mistakes.
There are growing calls for that bias to be addressed if companies and policymakers want to rely on the technology for future decision-making.
One approach to solving the problem is to use synthetic data, which is generated by a computer to be more diverse than real-world datasets.
Chris Longstaff, vice president for product management at Mindtech, a Sheffield-based start-up, said that real-world datasets are inherently biased because of where the data is drawn from.
“Today, most of the AI solutions out there are using data scraped from the internet, whether that is from YouTube, TikTok, Facebook, one of the typical social media sites,” he said.
As a solution, Mr Longstaff’s team have created “digital humans” based on computer graphics.
These can vary in ethnicity, skin tone, physical attributes and age. The lab then combines some of this data with real-world data to create a more representative dataset to train AI models.
One of Mindtech’s clients is a construction company that wants to improve the safety of its equipment.
The lab uses the diverse data it has generated to train the company's autonomous vehicles to recognise different types of people on a construction site, so a vehicle can stop moving if someone is in its way.
Image: Some CCTV cameras are now fitted with facial recognition technology. File pic
Toju Duke, a responsible AI advisor and former programme manager at Google, said that using computer-generated, or “synthetic,” data to train AI models has its downsides.
“For someone like me, I haven’t travelled across the whole world, I haven’t met anyone from every single culture and ethnicity and country,” she said.
“So there’s no way I can develop something that would represent everyone in the world and that could lead to further offences.
“So we could actually have synthetic people or avatars that could have a mannerism that could be offensive to someone else from a different culture.”
The problem of racial bias is not unique to facial recognition technology; it has been recorded across different types of AI models.
According to a Bloomberg experiment earlier this year using Stability AI’s image generator, the vast majority of AI-generated images of “fast food workers” showed people with darker skin tones, even though US labour market figures show that the majority of fast food workers in the country are white.
The company said it is working to diversify its training data.
A spokesperson for the Detroit police department said it has strict rules for using facial recognition technology and considers any match only as an “investigative lead” and not proof that a suspect has committed a crime.
“There are a number of checks and balances in place to ensure ethical use of facial recognition, including: use on live or recorded video is prohibited; supervisor oversight; and weekly and annual reporting to the Board of Police Commissioners on the use of the software,” they said.
The US has announced it is sending an aircraft carrier to the waters off South America as it ramps up an operation to target alleged drug smuggling boats.
The Pentagon said in a statement that the USS Gerald R Ford would be deployed to the region to “bolster US capacity to detect, monitor, and disrupt illicit actors and activities that compromise the safety and prosperity of the United States homeland and our security in the Western Hemisphere”.
The vessel is the US Navy’s largest aircraft carrier. It is currently deployed in the Mediterranean alongside three destroyers, and the group are expected to take around one week to make the journey.
There are already eight US Navy ships in the Central and South American region, along with a nuclear-powered submarine, adding up to about 6,000 sailors and marines, according to officials.
It came as the US secretary of war claimed that six “narco-terrorists” had been killed in a strike on an alleged drug smuggling boat in the Caribbean Sea overnight.
Image: A still from footage purporting to show the boat seconds before the airstrike, posted by US Secretary of War Pete Hegseth on X
Pete Hegseth said his military had bombed a vessel which he claimed was operated by Tren de Aragua – a Venezuelan gang designated a terror group by Washington in February.
Writing on X, he claimed that the boat was involved in “illicit narcotics smuggling” and was transiting along a “known narco-trafficking route” when it was struck during the night.
All six men on board the boat, which was in international waters, were killed and no US forces were harmed, he said.
Ten vessels have now been bombed in recent weeks, killing more than 40 people.
Mr Hegseth added: “If you are a narco-terrorist smuggling drugs in our hemisphere, we will treat you like we treat al Qaeda. Day or NIGHT, we will map your networks, track your people, hunt you down, and kill you.”
While he did not provide any evidence that the vessel was carrying drugs, he did share a 20-second video that appeared to show a boat being hit by a projectile before exploding.
Footage of a previous US strike on a suspected drugs boat earlier this week
Speaking during a White House press conference last week, Donald Trump argued that the campaign would help tackle the US’s opioid crisis.
“Every boat that we knock out, we save 25,000 American lives. So every time you see a boat, and you feel badly you say, ‘Wow, that’s rough’. It is rough, but if you lose three people and save 25,000 people,” he said.
On Thursday, appearing at a press conference with Mr Hegseth, Mr Trump said that it was necessary to kill the alleged smugglers, because if they were arrested they would only return to transport drugs “again and again and again”.
“They don’t fear that, they have no fear,” he told reporters.
The attacks at sea would soon be followed by operations on land against drug smuggling cartels, Mr Trump claimed.
“We’re going to kill them,” he added. “They’re going to be, like, dead.”
Some Democratic politicians have expressed concerns that the strikes risk dragging the US into a war with Venezuela because of their proximity to the South American country’s coast.
Others have condemned the attacks as extrajudicial killings that would not stand up in a court of law.
Jim Himes, a member of the House of Representatives, told CBS News earlier this month: “They are illegal killings because the notion that the United States – and this is what the administration says is their justification – is involved in an armed conflict with any drug dealers, any Venezuelan drug dealers, is ludicrous.”
He claimed that Congress had been told “nothing” about who was on the boats and how they were identified as a threat.
A convicted child killer executed in Tennessee showed signs of “sustained cardiac activity” two minutes after he was pronounced dead, his lawyer has claimed.
Byron Black, who shot dead his girlfriend Angela Clay and her two daughters, aged six and nine, in a jealous rage in 1988, was executed in August by a lethal injection.
Alleged issues about his case were raised on Friday as part of a lawsuit challenging the US state's lethal injection policies, amid claims they violate both federal and state constitutional bans on cruel and unusual punishment.
The latest proceedings in Nashville were held to consider whether attorneys representing death row inmates in the lawsuit will be allowed to depose key people involved in carrying out executions in Tennessee.
Black had an implanted defibrillator, and there were fears the device would shock his heart when the lethal chemicals took effect.
The Death Penalty Information Center, which provides data on such matters, said it was unaware of any similar cases.
Seven media witnesses said Black appeared to be in discomfort during the execution. He looked around the room as the execution began, and could be heard sighing and breathing heavily, the AP news agency reported at the time.
An electrocardiogram monitoring his heart recorded cardiac activity after he was pronounced dead, his lawyer Kelley Henry told a judge on Friday.
Ms Henry, who is leading a group of federal public defenders representing death row inmates in the US state, said only the people who were there would be able to answer the question of what went wrong during Black’s execution.
“At one point, the blanket was pulled down to expose the IV,” she told the court.
“Why? Did the IV come out? Is that the reason that Mr Black exclaimed ‘it’s hurting so bad’? Is the EKG (electrocardiogram) correct?”
A full trial in the case is scheduled to be heard in April.
Donald Trump begins bulldozing much of the White House as his plans to build a mega ballroom begin – without planning permission, or any real clarity as to how it's all being funded.
There are aesthetic questions, historical questions and ethical questions. We dig into what they are.
And – who is the young Democratic socialist about to become New York City’s first Muslim mayor? We tell you everything you need to know about Zohran Mamdani.
You can also watch all episodes on our YouTube channel – and watch David Blevins’ digital video on the White House ballroom here.
Email us on trump100@sky.uk with your comments and questions.