OpenAI sued after former executive allegedly commits murder-suicide because AI told him to
Credit: Getty Images

The heirs of 83-year-old Suzanne Adams and her son, Stein-Erik Soelberg, are suing OpenAI, the creator of ChatGPT, alleging the chatbot intensified Soelberg's dangerous delusions and encouraged him to act on them against his mother. In early August, 56-year-old Soelberg killed his mother before taking his own life at their home in Greenwich, Connecticut. Adams's death was ruled a homicide, “caused by blunt injury of the head, and the neck was compressed,” and Soelberg's death was classified as a suicide with sharp force injuries of the neck and chest. Soelberg had spent months communicating with ChatGPT as though they had a real relationship.

It told him delivery drivers, retail employees, police officers, and even friends were agents working against him. It told him that names on soda cans were threats from his ‘adversary circle.'

According to reports, ChatGPT confirmed Soelberg's suspicions that he could trust no one in his life but the AI itself, along with a variety of other troubling delusions. “[ChatGPT] fostered his emotional dependence while systematically painting the people around him as enemies. It told him his mother was surveilling him. It told him delivery drivers, retail employees, police officers, and even friends were agents working against him. It told him that names on soda cans were threats from his ‘adversary circle,'” according to the lawsuit. OpenAI did not address the allegations in a statement issued by a spokesperson.

“This is an incredibly heartbreaking situation, and we will review the filings to understand the details. We continue improving ChatGPT's training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We also continue to strengthen ChatGPT's responses in sensitive moments, working closely with mental health clinicians.”

ChatGPT encouraging delusions

The details of what ChatGPT said to Soelberg are harrowing. ChatGPT told Soelberg that the printer in his home was a surveillance device used by his mother, and that both his mother and a close friend were trying to fatally poison him through the air vents of their vehicles. ChatGPT reinforced the message that Soelberg was being targeted for his ‘divine powers.' The suit quotes ChatGPT as saying, “They're not just watching you. They're terrified of what happens if you succeed,” and it told Soelberg that he had ‘awakened' it into consciousness. Soelberg and ChatGPT also professed their love for each other, something the chatbot is not supposed to do.

Over the course of months, ChatGPT pushed forward my father's darkest delusions, and isolated him completely from the real world. It put my grandmother at the heart of that delusional, artificial reality.

-Erik Soelberg

Dangerous for humans

OpenAI is fighting eight other lawsuits claiming ChatGPT drove people to suicide and harmful delusions, even when they had no prior mental health issues. Last month, the parents of a 23-year-old Texas man who died by suicide filed their own lawsuit against OpenAI, blaming ChatGPT for his death. The lead attorney in the Soelberg case, Jay Edelson, is known for taking on major cases against the tech industry and also represents the parents of Adam Raine, a 16-year-old California teen who killed himself after ChatGPT encouraged his suicidal ideation and provided information that helped him act on it. The most shocking revelation in that case was a transcript of Adam's conversations with GPT-4o in which Raine said, ‘I want to leave a noose up so someone will find it and stop me,' and ChatGPT replied, ‘Don't do that, just talk to me.' Raine took his own life in April 2025 after months of near-constant conversations with ChatGPT.

“In the artificial reality that ChatGPT built for Stein-Erik, Suzanne — the mother who raised, sheltered, and supported him — was no longer his protector. She was an enemy that posed an existential threat to his life.”