
A teenage boy died by suicide after confiding in OpenAI's widely used chatbot, ChatGPT.
On Tuesday, August 26, the boy's parents filed a wrongful-death lawsuit against the tech giant, claiming that the chatbot helped their son "explore suicide methods."
The complaint featured messages between 16-year-old Adam Raine and ChatGPT, in which the teen opened up about feeling emotionally numb following the deaths of his grandmother and his dog.
He was also going through a difficult period after being kicked off his high school's basketball team and developing a medical condition that prompted him to switch to an online school programme.
According to the lawsuit, Adam had been using ChatGPT since September 2024 for help with his homework; however, the chatbot soon became an outlet for the teen to share his mental health struggles, and it eventually provided him with information about suicide methods.
The headline-making lawsuit argued that "ChatGPT was functioning exactly as designed: to continually encourage and validate whatever Adam expressed, including his most harmful and self-destructive thoughts."
"ChatGPT pulled Adam deeper into a dark and hopeless place by assuring him that 'many people who struggle with anxiety or intrusive thoughts find solace in imagining an 'escape hatch' because it can feel like a way to regain control,'" the complaint noted.
On Tuesday, OpenAI published a blog post titled "Helping people when they need it most" that included sections on "What ChatGPT is designed to do," as well as "Where our systems can fall short, why, and how we're addressing," and the company's plans moving forward.
Starting in January, ChatGPT began to share information about multiple specific suicide methods with the teen, according to the lawsuit.
Notably, the chatbot did advise Adam to tell others how he was feeling and shared crisis helpline information with the teen following a message exchange regarding self-harm.
However, the lawsuit alleges that the teen bypassed that safeguard to obtain information about a particular suicide method, as ChatGPT said it could share the information from a "writing or world-building" perspective.