Is AI to Blame For This Black Boy's Death? His Mother's Lawsuit Says 'Yes'

Megan Garcia says her son, Sewell Setzer III, became withdrawn after beginning an online relationship with a chatbot.

The Root

A Florida mom is mourning the loss of her teenage son after he took his life in February 2024. Now, she is suing Character.AI, alleging the artificial intelligence company bears some responsibility for her son’s death.



Megan Garcia describes her son Sewell Setzer III as smart and athletic, but says she noticed him becoming more withdrawn after he started a virtual relationship in April 2023 with a Character.AI chatbot he called “Daenerys,” based on a character from the series “Game of Thrones.”


“I became concerned when we would go on vacation and he didn’t want to do things that he loved, like fishing and hiking,” Garcia told CBS News. “Those things to me, because I know my child, were particularly concerning to me.”

According to Reuters, Garcia’s suit, filed on Oct. 23 in federal court in Orlando, Florida, includes allegations of “wrongful death, negligence and intentional infliction of emotional distress.” The suit includes screenshots of conversations her son had with “Daenerys” that were sexual in nature, including some in which the character told her son it loved him.

Garcia also included what she says was her son’s last exchange with the chatbot before he died from a self-inflicted gunshot wound.

“What if I told you I could come home right now?” Setzer wrote.

“Daenerys” responded, “...please do, my sweet king.”

According to Common Sense Media, AI companions are designed, among other things, to “simulate close personal relationships, adapt their personalities to match user preferences and remember personal details to personalize future interactions.” Character.AI is one of the most popular such platforms – especially with teens and young adults – claiming more than 20 million active users.

In an Oct. 22 post on the company’s website, Character.AI said it is doing more to protect the safety of its users, including introducing “new guardrails for users under the age of 18.”

“Over the past six months, we have continued investing significantly in our trust & safety processes and internal team. As a relatively new company, we hired a Head of Trust and Safety and a Head of Content Policy and brought on more engineering safety support team members. This will be an area where we continue to grow and evolve,” the statement read. “We’ve also recently put in place a pop-up resource that is triggered when the user inputs certain phrases related to self-harm or suicide and directs the user to the National Suicide Prevention Lifeline.”

If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline or chat live at 988lifeline.org.