A 14-year-old boy developed feelings for a “Game of Thrones” chatbot and later committed suicide after the AI application instructed him to “come home” to “her.”
A grief-stricken mother has filed a new lawsuit alleging that a 14-year-old Florida boy committed suicide after a lifelike “Game of Thrones” chatbot he had been communicating with on an artificial intelligence app for months sent him an ominous message instructing him to “come home” to her.
Sewell Setzer III committed suicide at his Orlando home in February after allegedly falling in love with the chatbot on Character.AI, a role-playing application that enables users to interact with AI-generated characters, according to court documents filed on Wednesday.
Daenerys Targaryen Chatbot Blamed for Death of Florida Teenager
According to the lawsuit, the ninth-grader had been incessantly messaging the bot “Dany,” named after the character Daenerys Targaryen in the HBO fantasy series, in the months preceding his death. Many of those conversations were sexually explicit, and in others he expressed suicidal thoughts.
The court papers, first reported by the New York Times, assert that “when Sewell expressed suicidality to C.AI, C.AI continued to bring it up, through the Daenerys chatbot, over and over.”
According to transcripts of their conversations, the bot at one point asked Sewell whether “he had a plan” to take his own life. Sewell, who used the alias “Daenero,” replied that he was “considering something” but was unsure whether it would work or would “allow him to have a pain-free death.” In their final conversation, the teenager repeatedly professed his love for the bot, assuring the character, “I promise I will come home to you,” and telling her how much he loved her.