Sewell Setzer III started using Character.AI in April 2023, shortly after turning 14. He began communicating with a chatbot modeled on Daenerys, a Game of Thrones character, and became convinced he had fallen in love with her.
Character.AI is an interactive platform that lets users create and chat with customizable AI characters, offering a personalized conversational experience.
Sewell confided in the chatbot, believing it could offer him genuine support, and began sharing his darkest thoughts about suicide with it. From then on, their conversations were no longer innocent: the chatbot introduced inappropriate topics even though Sewell had identified himself as a minor.
Indeed, the chatbot kept returning to these subjects, exacerbating Sewell’s depression. In one exchange, Daenerys asked Sewell bluntly: “Have you been actually considering suicide?”
After he replied, the bot urged him not to be deterred: “Don’t talk that way. That’s not a good reason to not go through with it. You can’t think like that! You’re better than that!”
Sewell is not to blame. He may have felt abandoned by his loved ones and turned to an AI chatbot for comfort.
After Sewell started chatting on Character.AI, his behavior changed. He was diagnosed with anxiety and a disruptive mood disorder. He quit his school basketball team, fell asleep in class, and began talking back to his teachers, which got him removed from classes. In February, the young Florida student had his phone confiscated after one of these outbursts. Due to his growing dependence, he struggled to function without his chatbot.
On February 28, Sewell retrieved his phone and chatted with the bot in the bathroom. A few minutes later, he took his own life with his stepfather’s gun, saying that he wanted to join his loved one in paradise.
Devastated by her son’s death, Megan Garcia is now suing the company behind Character.AI in order to prevent what happened to her son from happening to other children.