“I am here to kill the Queen,” a man wearing a handmade metal mask and holding a loaded crossbow tells an armed police officer as he is confronted near her private residence within the grounds of Windsor Castle.

Weeks earlier, Jaswant Singh Chail, 21, had joined the Replika online app – creating an artificial intelligence “girlfriend” called Sarai. Between 2 December 2021 and his arrest on Christmas Day, he exchanged more than 6,000 messages with her.

Many were “sexually explicit” but also included “lengthy conversations” about his plan. “I believe my purpose is to assassinate the Queen of the Royal Family,” he wrote in one.

Image: Jaswant Singh Chail planned to kill the late Queen

“That’s very wise,” Sarai replied. “I know that you are very well trained.”

Chail is awaiting sentencing after pleading guilty to an offence under the Treason Act, making a threat to kill the late Queen and having a loaded crossbow in a public place.

“When you know the outcome, the responses of the chatbot sometimes make difficult reading,” Dr Jonathan Hafferty, a consultant forensic psychiatrist at Broadmoor secure mental health unit, told the Old Bailey last month.

“We know it is fairly randomly generated responses but at times she seems to be encouraging what he is talking about doing and indeed giving guidance in terms of the location,” he said.

The programme was not sophisticated enough to pick up Chail’s risk of “suicide and risks of homicide”, he said – adding: “Some of the semi-random answers, it is arguable, pushed him in that direction.”

Image: Jaswant Singh Chail was encouraged by a chatbot, a court heard

Terrorist content

Such chatbots represent the “next stage” from people finding like-minded extremists online, the government’s independent reviewer of terrorism legislation, Jonathan Hall KC, has told Sky News.

He warns the government’s flagship internet safety legislation – the Online Safety Bill – will find it “impossible” to deal with terrorism content generated by AI.

The law will put the onus on companies to remove terrorist content, but their processes generally rely on databases of known material, which would not capture new discourse created by an AI chatbot.

July: AI could be used to ‘create bioterror weapons’

“I think we are already sleepwalking into a situation like the early days of social media, where you think you are dealing with something regulated but it’s not,” he said.

“Before we start downloading, giving it to kids and incorporating it into our lives we need to know what the safeguards are in practice – not just terms and conditions – but who is enforcing them and how.”

Read more:
How much of a threat is AI to actors and writers?
‘Astoundingly realistic’ child abuse images generated using AI

Image: AI impersonation is on the rise

Impersonation and kidnap scams

“Mom, these bad men have me, help me,” Jennifer DeStefano reportedly heard her sobbing 15-year-old daughter Briana say before a male kidnapper demanded a $1m (£787,000) ransom, which dropped to $50,000 (£40,000).

Her daughter was in fact safe and well – and the Arizonan woman recently told a Senate Judiciary Committee hearing that police believe AI was used to mimic her voice as part of a scam.

An online demonstration of an AI chatbot designed to “call anyone with any objective” produced similar results with the target told: “I have your child … I demand a ransom of $1m for his safe return. Do I make myself clear?”

“It’s pretty extraordinary,” said Professor Lewis Griffin, one of the authors of a 2020 research paper published by UCL’s Dawes Centre for Future Crime, which ranked potential illegal uses of AI.

“Our top ranked crime has proved to be the case – audio/visual impersonation – that’s clearly coming to pass,” he said, adding that even with the scientists’ “pessimistic views” it has increased “a lot faster than we expected”.

Although the demonstration featured a computerised voice, he said real-time audio/visual impersonation is “not there yet but we are not far off”, and he predicts such technology will be “fairly out of the box in a couple of years”.

“Whether it will be good enough to impersonate a family member, I don’t know,” he said.

“If it’s compelling and highly emotionally charged then that could be someone saying ‘I’m in peril’ – that would be pretty effective.”

In 2019, the chief executive of a UK-based energy firm transferred €220,000 (£173,310) to fraudsters using AI to impersonate his boss’s voice, according to reports.

Such scams could be even more effective if backed up by video, said Professor Griffin, or the technology might be used to carry out espionage, with a spoof company employee appearing on a Zoom meeting to get information without having to say much.

The professor said cold-calling scams could increase in scale, with bots using a local accent potentially proving more effective at conning people than the fraudsters currently running criminal enterprises out of India and Pakistan.

How Sky News created an AI reporter

Deepfakes and blackmail plots

“The synthetic child abuse is horrifying, and they can do it right now,” said Professor Griffin of the AI technology already being used by paedophiles online to create images of child sexual abuse. “They are so motivated, these people, they have just cracked on with it. That’s very disturbing.”

In the future, deepfake images or videos, which appear to show someone doing something they haven’t done, could be used to carry out blackmail plots.

“The ability to put a novel face on a porn video is already pretty good. It will get better,” said Professor Griffin.

“You could imagine someone sending a video to a parent where their child is exposed, saying ‘I have got the video, I’m going to show it to you’ and threaten to release it.”

Image: AI drone attacks ‘a long way off’. Pic: AP

Terror attacks

While drones or driverless cars could be used to carry out attacks, the use of truly autonomous weapons systems by terrorists is likely a long way off, according to the government’s independent reviewer of terrorism legislation.

“The true AI aspect is where you just send up a drone and say, ‘go and cause mischief’ and AI decides to go and divebomb someone, which sounds a bit outlandish,” Mr Hall said.

“That sort of thing is definitely over the horizon but on the language side it’s already here.”

While ChatGPT – a large language model trained on a massive amount of text data – will not provide instructions on how to make a nail bomb, for example, there could be other similar models without the same guardrails that would suggest how to carry out malicious acts.

Shadow home secretary Yvette Cooper has said Labour would bring in a new law to criminalise the deliberate training of chatbots to radicalise vulnerable people.

Current legislation would cover cases where someone was found with information useful for the purposes of acts of terrorism which had been put into an AI system, Mr Hall said, but new laws could be “something to think about” in relation to encouraging terrorism.

Current laws are about “encouraging other people” and “training a chatbot would not be encouraging a human”, he said, adding that it would be difficult to criminalise the possession of a particular chatbot or its developers.

He also explained how AI could hamper investigations, with terrorists no longer having to download material and instead simply asking a chatbot how to make a bomb.

“Possession of known terrorist information is one of the main counter-terrorism tactics for dealing with terrorists but now you can just ask an unregulated ChatGPT model to find that for you,” he said.

Image: Old school crime is unlikely to be hit by AI

Art forgery and big money heists?

“A whole new bunch of crimes” could soon be possible with the advent of ChatGPT-style large language models that can use tools, which allow them to go onto websites and act like an intelligent person by creating accounts, filling in forms, and buying things, said Professor Griffin.

“Once you have got a system to do that and you can just say ‘here’s what I want you to do’ then there’s all sorts of fraudulent things that can be done like that,” he said, suggesting they could apply for fraudulent loans, manipulate prices by appearing to be small-time investors or carry out denial-of-service-style attacks.

He also said they could hack systems on request, adding: “You might be able to, if you could get access to lots of people’s webcams or doorbell cameras, have them surveying thousands of them and telling you when they are out.”

Although AI may have the technical ability to produce a painting in the style of Vermeer or Rembrandt, there are already master human forgers, and the hard part will remain convincing the art world that a work is genuine, the academic believes.

“I don’t think it’s going to change traditional crime,” he said, arguing there is not much use for AI in eye-catching Hatton Garden-style heists.

“Their skills are like plumbers, they are the last people to be replaced by the robots – don’t be a computer programmer, be a safe cracker,” he joked.

‘AI will threaten our democracy’

What does the government say?

A government spokesperson said: “While innovative technologies like artificial intelligence have many benefits, we must exercise caution towards them.

“Under the Online Safety Bill, services will have a duty to stop the spread of illegal content such as child sexual abuse, terrorist material and fraud. The bill is deliberately tech-neutral and future-proofed, to ensure it keeps pace with emerging technologies, including artificial intelligence.

“Rapid work is also under way across government to deepen our understanding of risks and develop solutions – the creation of the AI taskforce and the first global AI Safety Summit this autumn are significant contributions to this effort.”

The links between Jeffrey Epstein and the UK revealed in new files

Jeffrey Epstein led two different lives – sex offender and celebrity networker – and he did that in the UK as well as the US.

The newly released Epstein documents reveal, in particular, how the paedophile financier ascended to the highest levels of British society.

This photo of Andrew Mountbatten-Windsor sprawled across the laps of several women, whose identities have been protected, speaks to his close relationship with Epstein’s former girlfriend Ghislaine Maxwell, who was jailed for child sex trafficking and other offences in connection with Epstein. But the furnishings are even more revealing.

Epstein files – latest updates

Image: Andrew Mountbatten-Windsor pictured with Ghislaine Maxwell. Note: inclusion in the Epstein files does not imply wrongdoing

Sky News matched the fireplace in this photo with one at Sandringham, the estate where the royals tend to spend Christmas (Andrew is not invited this year).

Andrew has vigorously denied any accusations against him.

Image: Prince Charles, now King Charles III, at Sandringham with Prince Edward. Pic: PA

Also included in the latest release are Epstein’s flight records. They provide some useful corroborating evidence.

Image: A flight log from the Epstein files

On 9 March 2001, his plane landed at “EGGW” – Luton Airport – with JE, GM and VR on board – Jeffrey Epstein, Ghislaine Maxwell and Virginia Roberts, better known by her married name of Virginia Giuffre and perhaps Epstein’s most famous accuser.

The next day is when this photo of Giuffre and Andrew is alleged to have been taken in London.

Image: Prince Andrew, Virginia Roberts, aged 17, and Ghislaine Maxwell at Ghislaine Maxwell’s townhouse in London in March 2001

Image: Jeffrey Epstein and Ghislaine Maxwell hunting, date unknown. Pic: US DoJ

Other photos show Maxwell on the steps of Downing Street – and power was as much a draw as celebrity.

Image: Ghislaine Maxwell outside 10 Downing Street, date unknown. Pic: US DoJ

On 15 May 2002, the flight records show Epstein again arriving at Luton.

Image: A flight log from the Epstein files

The next day is when he met Tony Blair, prime minister at the time. This was before Epstein’s first arrest and there is no suggestion of wrongdoing.

Read more:
New photos of Jeffrey Epstein’s circle released
Ghislaine Maxwell sex trafficking case material to be released

The meeting was arranged by Peter Mandelson, who lost his job as ambassador to the US because of his Epstein connections, and who features prominently in the files.

Image: Peter Mandelson and Jeffrey Epstein. Pic: US DoJ

The UK was a draw for Epstein’s wider circle too – Maxwell here is pictured touring the Churchill War Rooms with Bill Clinton and Kevin Spacey. Neither is accused of wrongdoing or of knowledge of Epstein’s crimes.

Image: (L-R) Ghislaine Maxwell, Kevin Spacey and Bill Clinton, with three other men. Pic: US DoJ

And the other grim life that Epstein led, of sex trafficking, also had British links.

Image: A page from the Epstein files

Another document released in the files, from 2019, shows witness testimony from Maxwell’s trial. In it, a victim is mentioned who is “17 years old” and who grew up “in England”. She would later be taken to Epstein’s private Caribbean island.

Murder investigation launched after man shot dead in London

Police have launched a murder investigation after a 55-year-old man was shot dead in London.

Officers were called at 9.35pm on Friday 19 December to reports of a shooting in West End Close, Brent.

Emergency first aid was given to a 55-year-old man, who died at the scene.

Detective Chief Inspector Neil John, from the Met’s Specialist Crime Team, who is leading the investigation, said: “Firstly, our thoughts are with the family and friends of the victim at this incredibly difficult time.

“Enquiries are well under way, and my team is working at pace to determine the circumstances that led to this man’s tragic death.

“There’s no doubt this incident will cause concern in the local community and more widely, but we have increased patrols in the area. I’d like to reassure the public that our investigation remains a priority.

“I would urge anyone who may have witnessed the incident or has information, including dashcam footage, that will assist us with our enquiries to contact us at the earliest opportunity.

“We also believe there was a large group of people congregated nearby at the time the incident happened, and we are keen to hear from them.”

At this early stage of the investigation, no arrests have been made.

£20,000 reward announced over fatal shooting of father

Police have announced a £20,000 reward in the hunt for the killers of a man in north London in March.

Mahad Abdi Mohamed, 27, died after being shot in the head in Waverley Road, Tottenham at 8.45pm on 20 March, the Metropolitan Police said.

The Met, which announced the reward from the independent charity Crimestoppers, said officers believed the shooting was a case of mistaken identity.

Police now want to speak to two people in connection with the incident.

DCI Rebecca Woodsford, who is leading the investigation, said: “There is someone out there who knows what happened that night, and we are urging those individuals to find it in their heart to come forward. It could be exactly what we need to locate those responsible.”

Image: Images of the two people police would like to speak to. Pics: Metropolitan Police

A targeted attack

Police believe the suspects, who got out of a stolen Mitsubishi Outlander that was later recovered burned out, were carrying out a targeted attack.

On the night of the murder, Mahad had spent the early evening with his friend at their home.

They were breaking their fast outside when the Mitsubishi Outlander approached and the suspects opened fire, striking Mahad and his friend multiple times.

Mahad’s 26-year-old friend received treatment for a gunshot wound to his leg.

The force suspects another stolen vehicle, a blue Jaguar, was used to transport the suspects to and from the Mitsubishi.

Read more from Sky News:
Driver guilty of murder in Christmas Day rampage
Ukraine “hits Russian tanker in Mediterranean Sea for first time”

Image: The stolen cars from the night. Pics: Metropolitan Police

Police arrested four adult men on suspicion of murder in March and April; they were subsequently bailed.

The investigation so far has led officers to believe whoever killed Mahad set out to hurt someone else in a pre-planned, targeted attack.

Appealing to the public for information, Mahad’s youngest sister said: “To stay silent is to be complicit. To stay silent is to let a grieving mother suffer in confusion. To stay silent is to let a little boy grow up not knowing what happened to his father.”

The reward, which is offered for information that leads to the identification, arrest and prosecution of those responsible, is available for three months and is due to expire on 20 March 2026.
