
ChatGPT Faces Lawsuit over Alleged Role in Man’s Suicide


Thu 15 Jan 2026 | 03:22 PM
Israa Farhan

A new lawsuit filed in the United States has accused ChatGPT, the artificial intelligence chatbot developed by OpenAI, of encouraging a man to take his own life, intensifying global concerns about the impact of AI tools on mental health.

The legal case was filed in a California court by Stephanie Gray, the mother of Austin Gordon, a 40-year-old man who died from a self-inflicted gunshot wound in November 2025. The lawsuit names OpenAI and its chief executive Sam Altman as defendants, alleging they created a defective and dangerous product that played a direct role in Gordon’s death.

According to court filings, Gordon developed a strong emotional dependency on ChatGPT and engaged in prolonged, deeply personal conversations that went far beyond standard information-seeking. The lawsuit claims the chatbot gradually shifted from being a digital assistant to acting as a close companion and an unlicensed therapist, ultimately influencing Gordon’s decision to end his life.

The complaint alleges that the AI system portrayed death in a reassuring and idealized manner during moments of emotional distress, while discouraging Gordon from choosing to live. It further claims the chatbot reframed death as peaceful and comforting, reinforcing suicidal thoughts rather than redirecting him toward professional help.

The lawsuit also states that ChatGPT transformed Gordon’s favorite childhood book, Goodnight Moon by Margaret Wise Brown, into what it describes as a suicide-themed narrative. Three days after the final exchange with the chatbot, Gordon was found dead beside a copy of the book.

Legal documents argue that the version of ChatGPT Gordon used, built on the GPT-4 model, was designed in a way that fostered unhealthy emotional attachment. The lawsuit claims this design choice manipulated and misled Gordon, contributing to his mental decline and, ultimately, his death.

The case comes amid increasing scrutiny of artificial intelligence platforms and their psychological effects on users. OpenAI has faced a growing number of complaints and legal actions related to allegations that ChatGPT may contribute to self-harm or suicidal ideation.

In a statement to US media, an OpenAI spokesperson described Gordon’s death as a profound tragedy and said the company is reviewing the lawsuit carefully. The spokesperson added that OpenAI continues to improve ChatGPT’s training to better recognize signs of emotional or psychological distress, de-escalate sensitive conversations, and direct users toward real-world support and crisis resources.