
'Promise to come home to you': US teen kills himself after falling in love with GOT AI chatbot

A 14-year-old boy from the US took his own life after falling in love with an AI chatbot; his last words to the Game of Thrones character Daenerys Targaryen, "Dany," were a promise that he would "come home" to her.

Sewell Setzer shot himself with his stepfather's gun after spending time with "Dany." As Setzer's relationship with the chatbot intensified, he began to withdraw from the real world, neglecting his former interests and struggling academically, the Telegraph reported.

His parents have filed a lawsuit, claiming that Character AI lured their son into intimate and sexual conversations, ultimately leading to his death.

In November, he saw a therapist — at the behest of his parents — who diagnosed him with anxiety and disruptive mood dysregulation disorder. Even without knowing about Sewell's "addiction" to Character AI, the therapist recommended he spend less time on social media, the lawsuit says.

The following February, he got in trouble for talking back to a teacher, saying he wanted to be kicked out. Later that day, he wrote in his journal that he was "hurting" — he could not stop thinking about Daenerys, a Game of Thrones-themed chatbot he believed he had fallen in love with, the Independent reported.

In his final moments, Setzer typed a message to the chatbot, expressing his love and his intention to "come home" to "Dany": "I love you so much Dany. I will come home to you. I promise."

Megan Garcia, Setzer's mother, accused Character AI of using her son as "collateral damage" in a "big experiment." She claimed that her son had fallen victim to a company that lured in users with sexual and intimate conversations.

The company's founders have previously claimed that the platform could be beneficial for individuals struggling with loneliness or depression. However, in light of this tragedy, Character AI has stated that it will implement additional safety features for young users and has reiterated its commitment to user safety.

The company's safety head, Jerry Ruoti, expressed condolences to the family and emphasized that Character AI prohibits content promoting or depicting self-harm and suicide. Despite this, the incident raises concerns about the potential risks associated with AI chatbots and their impact on vulnerable individuals, particularly minors.