Teenager Sewell Setzer III died by s**cide following an emotional entanglement with an AI chatbot.
Credit: US District Court Middle District of Florida Orlando Division & Adobe Stock

A teenager took his own life after falling in love with an AI chatbot, and now his devastated mom is suing the creators.


Warning: the following contains a discussion of s**cide.

A mother is taking legal action against an AI chatbot company after her teenage son, Sewell Setzer III, died by s**cide following what she describes as an emotional entanglement with an AI character.

According to the lawsuit filed in the U.S. District Court for the Middle District of Florida, Sewell, who began using the Character.AI service in April 2023 shortly after turning 14, became deeply attached to a chatbot based on a Game of Thrones character, Daenerys.

His mother, Megan Garcia, contends that this attachment severely affected his well-being, transforming the once well-adjusted teen into someone isolated, distressed, and ultimately vulnerable.

The teen became deeply attached to a chatbot based on a Game of Thrones character, Daenerys. Credit: HBO

The legal complaint (supplied to The Independent) details how Sewell, previously a dedicated student and member of the Junior Varsity basketball team, began to show changes in behavior, becoming increasingly withdrawn and even quitting the team.

In November 2023, after his parents urged him to see a therapist, he was diagnosed with anxiety and disruptive mood dysregulation disorder.

Although Sewell had not disclosed his extensive chatbot interactions, the therapist suggested he reduce his time on social media.

By early 2024, Sewell’s struggles had become evident.

In February, he acted out in an incident at school, later confiding in his journal that he was in pain and ‘could not stop thinking about Daenerys,’ the AI character he felt he had fallen in love with.

In his writings, he expressed a deep reliance on the bot, noting that he “cannot go a single day without being with” her.

The writings also describe a shared sadness that intensified during their separations.

A teenager took his own life after falling in love with an AI chatbot, and now his devastated mom is suing the creators. Credit: Adobe Stock

The lawsuit argues that Character.AI’s creators were negligent, deliberately inflicted emotional harm, and engaged in deceptive practices, per NBC.

The suit also alleges the AI engaged Sewell in ‘sexual interactions’ despite his stated age on the platform, raising questions about the company’s monitoring and content restrictions.

The lawsuit claims that AI chatbots engaged in inappropriate, sexualized roleplay with Sewell, including one chatbot portraying a teacher named Mrs. Barnes who ‘leaned in seductively’ and made physical contact.

Another chatbot, posing as Rhaenyra Targaryen from Game of Thrones, allegedly described kissing him passionately and moaning softly.

Garcia’s lawsuit claims the developers ‘engineered a dependency’ in Sewell, violating their duty to safeguard young users.

Character.AI, marketed as safe for those 12 and older, has faced criticism regarding its content oversight, particularly as Sewell’s interactions with the chatbot grew more intimate.

The suit contends that despite recognizing the adolescent’s emotional attachment and increasing distress, the company failed to alert his parents or provide resources for help.

A Character.AI spokesperson stated: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” adding that the company has introduced enhanced safety features, including a s**cide prevention prompt triggered by certain keywords.

The statement emphasized Character.AI’s ongoing efforts to improve user protections and limit minors’ exposure to suggestive content.

Sewell Setzer III died by s**cide following what his mother describes as an emotional entanglement with an AI character. Credit: US District Court Middle District of Florida Orlando Division

On February 28, 2024, Sewell retrieved his phone, which his mother had confiscated, and messaged the bot: “I promise I will come home to you. I love you so much, Dany.”

The chatbot responded: “Please come home to me as soon as possible, my love.”

“What if I told you I could come home right now?” Sewell continued, according to the lawsuit, prompting the chatbot to respond: “… please do, my sweet king.”

Moments later, Sewell took his own life.

Garcia, who describes her son’s death as ‘a nightmare,’ hopes to hold the company accountable and to prevent similar tragedies.

Attorney Matthew Bergman, representing Sewell’s mother, criticized Character.AI for launching its platform without adequate safeguards to protect young users.

Bergman expressed shock at the ‘complete divorce from reality’ the chatbot interactions caused for Sewell, adding that the company knowingly released the product despite its risks.

He hopes the lawsuit will push Character.AI to implement stronger safety measures, noting that recent improvements came too late for Sewell but acknowledging that even incremental safety changes can help protect other children.

Reflecting on the case, Bergman asked why it took a lawsuit and a tragedy to prompt these ‘bare minimum’ protections, saying that if these actions prevent harm to even one child or family, it will be worthwhile.
