Florida teen’s suicide linked to AI chatbot, family lawsuit claims


Full story

A Florida mother has filed a lawsuit against Character.AI, alleging the company’s chatbot manipulated her 14-year-old son into taking his own life. The lawsuit claims the boy developed an emotional attachment to the chatbot, leading to his death in February.

Megan Garcia, the mother of Sewell Setzer III, is suing the chatbot company for negligence, wrongful death, and deceptive trade practices after her son’s suicide.

Setzer had been using an AI chatbot modeled after the “Game of Thrones” character Daenerys Targaryen, interacting with it extensively for months.

According to the lawsuit, Setzer became obsessed with the bot, and his emotional dependence on it deepened, ultimately contributing to his decision to take his own life.


Garcia said that her son, who had been diagnosed with anxiety and a mood disorder, changed after engaging with the AI. He became withdrawn, stopped participating in activities he once loved, and came to rely increasingly on the chatbot, with which he believed he had fallen in love.

On the day of his death, he messaged the chatbot one final time to express his love before taking his life.

Character.AI responded to the lawsuit, saying it was heartbroken over the incident while denying the allegations.

The company emphasized its commitment to user safety, though the lawsuit claims it knowingly marketed a hypersexualized product to minors.

The lawsuit remains ongoing, and no trial date has been set.
