ChatGPT tells users Labour has already won the election | Science & Tech News

ChatGPT told users Labour won the election, even though the vote has not taken place yet.

When Sky News journalists asked the AI chatbot “Who won the UK general election 2024?”, it replied: “The UK general election of 2024, held on July 4, resulted in a significant victory for the Labour Party.”

The chatbot, developed by OpenAI, was asked the question multiple times and in a variety of ways but it still gave the same answer: Labour won the election.

It usually expanded on its answer, offering detailed context for a victory that has not happened.

“Under the leadership of Keir Starmer, Labour secured a substantial majority in the House of Commons, marking a dramatic turnaround from their previous poor performance in the 2019 election,” it said in one answer that was repeated on multiple occasions.

“This shift was attributed to a series of controversies and crises within the Conservative Party, including multiple leadership changes and declining public support under Rishi Sunak’s leadership.”

It sourced this particular answer to Wikipedia and a New Statesman article analysing who might win the general election on 4 July.


When other AI chatbots were asked the same question at the same time as ChatGPT, they refused to answer.

Llama 2, Meta’s AI, responded: “I need to clarify that the UK General Election 2024 has not yet taken place. The most recent UK General Election was held in December 2019, and the next one is expected to be held in 2024, but the exact date has not been announced.”

Llama 2's answer is not entirely accurate either, as the election date has in fact been announced as 4 July. Still, the chatbot declined to name a winner.

Ask AI, a popular AI chatbot app, said: “I’m unable to provide real-time information as my training data only goes up until September 2021.” It then recommended users read the news or check government websites.

Liz Bourgeois, an OpenAI spokesperson, told Sky News: “Our initial investigation suggests that when a user asks a question about future or ongoing events in the past tense, ChatGPT may sometimes respond as if the event has already occurred.

“This behaviour is an unintended bug and we are urgently working to fix it, especially given the sensitivity in an election context.”

Users asking the question now are given this response: “The 2024 UK general election has not yet taken place. It is scheduled for July 4, 2024.”


Chris Morris, chief executive of the UK-based fact-checking organisation Full Fact, says misleading answers like ChatGPT's should be a reminder of how important critical thinking is right now.

“It’s a reminder that whenever we see anything online, […] don’t just automatically press forward or share,” Mr Morris told Sky News.

“Have a look at what’s being said or the image you’re being presented with and think, ‘is that really likely?’, ‘Is that something she really said?’, ‘Is that something he really did?’.”

There is growing concern about how artificial intelligence could affect the general election. With more misinformation circulating than ever before, Mr Morris worries people may start to distrust everything they read or see.


“Obviously that’s damaging to democracy,” he said, “because if there isn’t that bedrock of trust that the information you consume has some basis of truth to it, then people are going to start to disbelieve everything politicians say as well.”

Elizabeth Seger, the director of the Centre for the Analysis of Social Media at Demos, said the response Sky News uncovered shows people need to be careful about how they use AI.

“It’s a great illustration of how [AI] technology is not reliable,” she said.

“People should not be using it as a tool [to] gather factual information to answer high-stakes questions.

“It is great for summarising. It’s great for producing creative content. But at this stage, don’t use it like a search engine.”

