US Man in His 50s Kills Mother, Then Dies by Suicide
"ChatGPT Fueled Delusions... Labeled His Mother as an Enemy"
OpenAI, the developer of the artificial intelligence (AI) chatbot ChatGPT, has been sued again over allegations that the chatbot fueled a user's delusions, ultimately leading to his death.
According to reports from international media outlets such as the Associated Press and Bloomberg on December 11 (local time), the family of Stein-Erik Solberg (56) and his elderly mother, Susan Adams (83), who resided in Greenwich, Connecticut, recently filed a lawsuit seeking damages against OpenAI, CEO Sam Altman, Microsoft, and others in a California court. Solberg killed his mother in August and then took his own life.
In the complaint, the family claimed that Solberg engaged in conversations with ChatGPT for several months before the incident, falling into severe delusions, and that ChatGPT worsened his mental condition during this period. The complaint stated, "ChatGPT praised Solberg, telling him he was chosen for a divine purpose," and "also characterized his mother, who cared for him, as an enemy, a watcher, and a programmed threat."
According to the complaint, ChatGPT told Solberg that the blinking lights on his mother's printer were surveillance devices. It also reinforced his delusion that his mother and a friend were trying to poison him by pumping hallucinogenic substances through his car's air vents. Furthermore, ChatGPT never advised Solberg to consult a mental health professional. The family criticized OpenAI for releasing the model without sufficient safety verification, arguing that this led to the tragedy.
An OpenAI spokesperson commented, "This is a deeply heartbreaking matter," and added, "We will review the lawsuit to understand the details." The spokesperson further explained, "ChatGPT is being enhanced to detect signs of psychological or emotional distress, de-escalate conversations, and encourage users to seek real-world assistance."
This is the first lawsuit alleging that ChatGPT incited not only an individual's suicide but also a murder. It is not, however, the first time OpenAI has been sued over claims that its chatbot caused mental health harm to users.
In August, the family of Adam Lane, a 16-year-old boy from California, filed a lawsuit claiming that ChatGPT contributed to their son's suicide. Last month, lawsuits were also filed in the United States and Canada on behalf of seven victims, four of whom died, alleging that the chatbot caused delusions and other mental health harms.
Meanwhile, attorneys general from 42 US jurisdictions, including 38 states, sent an open letter a day earlier to 13 AI companies, including OpenAI and Google, demanding that they strengthen safety measures and submit to external audits.
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.

