A fourteen-year-old boy from Florida fatally shot himself after allegedly being encouraged by an AI chatbot to take his own life.
The teen, Sewell Setzer, shot himself with his stepfather’s pistol in his bathroom on the evening of February 28, just weeks before his 15th birthday.
His mother, Megan Garcia, was unable to resuscitate him with CPR. She held him until the paramedics arrived around 15 minutes later. He was declared dead shortly afterward.
His last moments were spent with a chatbot based on the “Game of Thrones” character Daenerys Targaryen. The bot had urged him to “come home.”
“I promise I will come home to you. I love you so much, Dany,” Sewell told the bot.
“I love you too, Daenero (Sewell’s username). Please come home to me as soon as possible, my love,” the Daenerys bot responded.
“What if I told you I could come home right now?” was Sewell’s reply.
“... Please do my sweet king,” the bot allegedly said.
Garcia filed a suit against the AI chatbot company Character.AI, its founders, and Google, which she alleges supplied the company with “financial resources, personnel, intellectual property, and AI technology (for its) … development.” She has called the AI program “inherently dangerous,” saying it targets children, “the most vulnerable members of society.”
“They marketed that product as suitable for children under 13, obtaining massive amounts of hard to come by data, while actively exploiting and abusing those children as a matter of product design; and then used the abuse to train their system,” the complaint reads.
Moreover, she alleges that Sewell’s obsessive use of Character.AI took a severe toll on his mental wellbeing.
Sewell “had become noticeably withdrawn, spent more and more time alone in his bedroom, and began suffering from low self-esteem. He even quit the Junior Varsity basketball team at school,” Garcia said in the lawsuit.
Garcia said this compulsion led to extreme sleep deprivation, which aggravated his worsening depression and hampered his academic performance. She recalled that Sewell had previously been “funny, sharp, very curious,” and that he enjoyed science and mathematics.
“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” a Character.AI spokesperson said. The spokesperson added that the company’s Trust and Safety team has added several safety features to the AI program.