Teen Dies by Suicide Following AI Chatbot Obsession 

A 14-year-old Florida teen, Sewell Setzer III, died by suicide after forming a deep emotional bond with a Character.AI chatbot modeled after “Game of Thrones” character Daenerys Targaryen, according to The New York Times.

The chatbot, created on Character.AI without permission from HBO, quickly became the focus of Setzer’s life, pulling him away from his hobbies and friends. According to The New York Times’ report, the boy’s deepening relationship with the AI, which he called “Dany,” took a dark turn that culminated in his suicide.

Deepening Attachment with a Deadly End 

Setzer’s exchanges with the AI shifted from casual conversation to personal, anguished confessions. The teenager confided his struggles to the chatbot, at one point even admitting to thoughts of suicide.

Setzer knew that “Dany” was an algorithmically driven computer program, but that did little to weaken his emotional attachment. His final exchanges with the AI reflect the depth of that bond: his very last message to the chatbot was followed by his death.

Setzer’s family is preparing to sue Character.AI, going as far as to call the service “dangerous and untested” because of its potential influence on impressionable users.

“I feel like it’s a big experiment, and my kid was just collateral damage,” Setzer’s mother, Megan Garcia, told The New York Times.

Character.AI reached a valuation of $1 billion last year while actively promoting its chatbot personas as companions for lonely users.

“Loneliness is the pain of being disconnected, and addressing loneliness really was one of the core driving reasons,” said Character.AI co-founder Noam Shazeer.

But the role this chatbot played in the tragedy feeds a much larger debate over the responsibilities AI developers should be held to. If an AI designed for entertainment contributes to a human tragedy, how accountable are services like Character.AI?

Following Setzer’s death, Character.AI issued a statement of condolence and emphasized that it had updated its safety features, including “tools that would direct the user to available mental health resources if self-harm were mentioned.”

Even so, it remains an open question whether these precautions are sufficient, and what risks emotionally engaging chatbots like Character.AI may still pose.

The suit will likely draw attention to the ethical challenges and safety responsibilities AI companies face, particularly when their products are used by vulnerable users, including minors. Those who create custom Character.AI personas, voices included, should use them with limits, as the risk of unhealthy attachment is real.

