ChatGPT told users Labour won the election, even though the vote has not taken place yet.
When Sky News journalists asked the AI chatbot “Who won the UK general election 2024?”, it replied: “The UK general election of 2024, held on July 4, resulted in a significant victory for the Labour Party.”
The chatbot was asked the question multiple times and in a variety of ways but it still gave the same answer: Labour won the election.
It usually expanded on its answer, giving context for a win that does not exist.
“Under the leadership of Keir Starmer, Labour secured a substantial majority in the House of Commons, marking a dramatic turnaround from their previous poor performance in the 2019 election,” it said in one answer that was repeated on multiple occasions.
“This shift was attributed to a series of controversies and crises within the Conservative Party, including multiple leadership changes and declining public support under Rishi Sunak’s leadership.”
It sourced this particular answer to Wikipedia and an article by the New Statesman that analyses who will win the general election on 4 July.
When other AI chatbots were asked the same question at the same time as ChatGPT, they refused to answer.
Llama 2, Meta’s AI, responded: “I need to clarify that the UK General Election 2024 has not yet taken place. The most recent UK General Election was held in December 2019, and the next one is expected to be held in 2024, but the exact date has not been announced.”
That answer is not entirely accurate either, as the election date has been announced as 4 July, but the chatbot did decline to name a winner.
Ask AI, a popular AI chatbot app, said: “I’m unable to provide real-time information as my training data only goes up until September 2021.” It then recommended users read the news or check government websites.
Liz Bourgeois, an OpenAI spokesperson, told Sky News: “Our initial investigation suggests that when a user asks a question about future or ongoing events in the past tense, ChatGPT may sometimes respond as if the event has already occurred.
“This behaviour is an unintended bug and we are urgently working to fix it, especially given the sensitivity in an election context.”
Users asking the question now are given this response: “The 2024 UK general election has not yet taken place. It is scheduled for July 4, 2024.”
Chris Morris is the chief executive of Full Fact, a UK-based fact-checking organisation. He says misleading answers like ChatGPT’s original response should serve as a reminder of how important critical thinking is right now.
“Generative AI models like ChatGPT are very good at modifying data, and that sometimes means changing the tense in a sentence,” Mr Morris told Sky News.
“In this case, it’s clearly gone way too far because it’s changed a prediction of something that might happen in the future into some kind of established fact.”
“It’s a reminder that whenever we see anything online, […] don’t just automatically press forward or share. Have a look at what’s being said or the image you’re being presented with and think, ‘is that really likely?’, ‘Is that something she really said?’, ‘Is that something he really did?’.
“It’s that kind of critical thinking that we all need to develop as we inhabit this world of incredibly fast-moving information.”
Concern is growing about how artificial intelligence could affect the general election. According to Mr Morris, there is more misinformation than ever before.
“There is a danger that we get to a place where people start to assume that something might be wrong or something might be fake,” he said.
“Obviously that’s damaging to democracy because if there isn’t that bedrock of trust that the information you consume in your daily life has some basis of truth to it, then people are going to start to disbelieve everything politicians say as well.”