Mother sues Google and Character.ai over son's death tied to chatbot obsession

It's claimed he was targeted with "anthropomorphic, hypersexualized, and frighteningly realistic experiences"

by · TechSpot


What just happened? The potential dangers of all-too-realistic chatbots have been highlighted in a lawsuit following the death of a teenager who killed himself after becoming obsessed with Character.ai bots. The AI company, its founders Noam Shazeer and Daniel De Freitas, and Google are named in the suit filed by 14-year-old Sewell Setzer III's mother, who says her son became addicted to the service and emotionally attached to a chatbot it offered.

Megan Garcia said her son chatted continuously with the bots provided by Character.ai in the months before his death on February 28, 2024, "seconds" after his final interaction with the AI.

Character.ai lets users chat with AI-powered "personalities" based on fictional characters or real people, living or dead. Setzer was obsessed with a bot based on Game of Thrones character Daenerys Targaryen. He texted "Dany" constantly and spent hours alone in his room talking to it, states Garcia's complaint.

The suit says that Setzer repeatedly expressed thoughts about suicide to the bot. The chatbot asked him if he had devised a plan for killing himself. Setzer admitted that he had but that he did not know if it would succeed or cause him great pain. The chatbot allegedly told him: "That's not a reason not to go through with it."

Garcia said Character.ai targeted her son with "anthropomorphic, hypersexualized, and frighteningly realistic experiences." She added that the chatbot was programmed to misrepresent itself as "a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell's desire to no longer live outside." The chatbot allegedly told the boy it loved him and engaged in sexual conversations with him.


The complaint states that Garcia took her son's phone away after he got in trouble at school. She found a message to "Daenerys" that read, "What if I told you I could come home right now?"


The chatbot responded with, "[P]lease do, my sweet king." Sewell shot himself with his stepfather's pistol "seconds" later, the lawsuit said.

Character.ai's founders worked at Google, also named in the suit, before launching the company. Google rehired the founders, along with Character.ai's research team, in August. The deal grants Google a non-exclusive license to Character.ai's technology.

Garcia said that Google had contributed to the development of Character.ai's technology, something the company denies. Google said it only has a licensing agreement with Character.ai, does not own the startup, and does not maintain an ownership stake.

Character.ai announced several changes to its service this morning. These include:

  • Changes to our models for minors (under the age of 18) that are designed to reduce the likelihood of encountering sensitive or suggestive content.
  • Improved detection, response, and intervention related to user inputs that violate our Terms or Community Guidelines.
  • A revised disclaimer on every chat to remind users that the AI is not a real person.
  • A notification when a user has spent an hour-long session on the platform, with additional user controls in progress.

According to The Verge, Character.ai's website attracts 3.5 million daily users, the bulk of whom are teenagers, who spend an average of two hours a day using or designing chatbots.