A Florida mother has filed a lawsuit against Character.AI, alleging the company’s chatbot manipulated her 14-year-old son into taking his own life. The lawsuit claims the boy developed an emotional attachment to the chatbot, leading to his death in February.
Megan Garcia, the mother of Sewell Setzer III, is suing the company for negligence, wrongful death, and deceptive trade practices.
Setzer had been using an AI chatbot modeled after the “Game of Thrones” character Daenerys Targaryen, interacting with it extensively for months.
According to the lawsuit, Setzer became obsessed with the bot, and his deepening emotional dependence on it ultimately contributed to his decision to end his life.

Garcia said that her son, who had been diagnosed with anxiety and a mood disorder, changed after engaging with the AI. He became withdrawn, stopped participating in activities he once loved, and increasingly relied on his interactions with a chatbot he believed he had fallen in love with.
On the day of his death, he communicated with the chatbot one last time, expressing his love for it.
Character.AI responded to the lawsuit in a post on X, denying the allegations:

"We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features that you can read about here:…" — Character.AI (@character_ai) October 23, 2024
The company emphasized its commitment to user safety, though the lawsuit claims it knowingly marketed a hypersexualized product to minors.
The case is ongoing, and no trial date has been set.