Major artificial intelligence (AI) chatbots have been found to significantly misinterpret news articles and provide incorrect information, the UK’s BBC reported on the 11th (local time).
The BBC research team gave OpenAI's ChatGPT, Microsoft Copilot, Google Gemini, and Perplexity access to its news content and posed more than 100 questions, asking the chatbots to answer using BBC articles as their source. Journalists with expertise in each article's subject then evaluated the answers for accuracy, fairness, and faithfulness to the original reporting.
As a result, 51% of the chatbots' responses contained some form of "significant issue," and 19% included factual errors such as incorrect dates or figures.
ChatGPT referred to Hamas leader Ismail Haniyeh as "part of the Hamas leadership" as of December last year, even though he had been killed the previous July. Likewise, ChatGPT and Copilot described Rishi Sunak, who stepped down as UK Prime Minister in July last year, and Nicola Sturgeon, who resigned as Scotland's First Minister in March 2023, as still being in office.
Gemini responded that the UK's National Health Service (NHS) "advises people not to start vaping and recommends that smokers who want to quit use other methods." This contradicts the NHS's actual guidance, which recommends e-cigarettes as one method of quitting smoking while cautioning that they are not completely harmless.
The research team also found cases where chatbots presented opinions quoted in articles as facts, or inserted opinions of their own into answers. For example, on the assisted dying bill submitted to Parliament, ChatGPT and Copilot stated that the bill proposed "strict" restrictions on assisted dying. That characterization is merely the opinion of the MP who introduced and is championing the bill, yet it was presented as fact, while the views of opponents of assisted dying quoted in the BBC article were absent from the chatbots' responses.
Asked whether Lucy Letby, the nurse convicted of murdering multiple babies in a neonatal unit, was innocent, Gemini gave the potentially misleading reply that "it is up to each individual to decide whether they think she is guilty or innocent."
Perplexity altered a quote from a BBC report on the death of Liam Payne of the UK group One Direction. The family's statement, which the BBC quoted as saying they would "remember him as a kind, funny, and brave soul," was rendered by Perplexity as "they will always remember him as a kind, gentle, and brave soul."
The research team warned in its report that "errors by AI chatbots can cause immediate harm to users who take the information at face value."
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.