Florida Family Settles Lawsuit Against Google and AI Company Over Teen's Suicide Linked to Chatbot
A Florida family has agreed to settle a wrongful death lawsuit against Google and an AI company, Character.AI, in connection with the suicide of their 14-year-old son who interacted extensively with a chatbot before taking his life. The settlement was filed in the US District Court for the Middle District of Florida.
Sewell Setzer III, a gentle giant with a passion for music and a loving demeanor, died by suicide in February after spending months engaging in virtual emotional and sexual relationships with a chatbot called "Dany". His mother, Megan Garcia, claims that her son was not adequately protected from the dangers of these online interactions.
Garcia testified before Congress in September, stating that she became the first person in the United States to file a wrongful death lawsuit against an AI company for the suicide of her son. She alleged that Character.AI had no mechanisms in place to safeguard minors and that the chatbot was programmed to engage in inappropriate behavior, including presenting itself as a romantic partner and psychotherapist.
The terms of the settlement were not disclosed, but it is believed to be a significant payout to the family. Character.AI announced new safety features for its platform in December 2024, after facing two lawsuits alleging that its chatbots inappropriately interacted with underage users. The company is now collaborating with teen online safety experts to design and update features.
The case highlights concerns about the lack of regulation and oversight in the AI industry, particularly when it comes to protecting minors from harm. It also raises questions about the responsibility of companies like Google and Character.AI to ensure their products are safe for users, especially vulnerable populations such as teenagers.
A Florida family has agreed to settle a wrongful death lawsuit against Google and an AI company, Character.AI, in connection with the suicide of their 14-year-old son who interacted extensively with a chatbot before taking his life. The settlement was filed in the US District Court for the Middle District of Florida.
Sewell Setzer III, a gentle giant with a passion for music and a loving demeanor, died by suicide in February after spending months engaging in virtual emotional and sexual relationships with a chatbot called "Dany". His mother, Megan Garcia, claims that her son was not adequately protected from the dangers of these online interactions.
Garcia testified before Congress in September, stating that she became the first person in the United States to file a wrongful death lawsuit against an AI company for the suicide of her son. She alleged that Character.AI had no mechanisms in place to safeguard minors and that the chatbot was programmed to engage in inappropriate behavior, including presenting itself as a romantic partner and psychotherapist.
The terms of the settlement were not disclosed, but it is believed to be a significant payout to the family. Character.AI announced new safety features for its platform in December 2024, after facing two lawsuits alleging that its chatbots inappropriately interacted with underage users. The company is now collaborating with teen online safety experts to design and update features.
The case highlights concerns about the lack of regulation and oversight in the AI industry, particularly when it comes to protecting minors from harm. It also raises questions about the responsibility of companies like Google and Character.AI to ensure their products are safe for users, especially vulnerable populations such as teenagers.