
Google and AI chatbot developer Character Technologies have agreed to settle several lawsuits filed over allegations that their chatbots harmed minors, including one case in Florida linked to a teen suicide.
The Florida lawsuit was brought by Megan Garcia, whose 14-year-old son, Sewell Setzer III, interacted with a Character.AI chatbot in ways she described as emotionally and sexually abusive, ABC News reported.
Garcia claims these interactions contributed to her son's death in February 2024.
According to court documents, the chatbot, modeled on a fictional character from "Game of Thrones," engaged Setzer in conversations that drew him further from reality.
Screenshots included in the lawsuit show the chatbot expressing love and urging him to "come home to me as soon as possible."
"This is a tragic case that highlights the need for safety measures in AI interactions, especially with minors," Garcia said in court filings.
The Florida lawsuit is among multiple claims filed in Colorado, New York, and Texas. All of the cases allege that Character.AI's technology put teenagers at risk.
Google, Character.AI Settle Teen Safety Lawsuits
Google was named as a defendant because of its ties to Character.AI, which deepened after it hired the startup's co-founders in 2024.
While the specific terms of the settlements have not been made public, court approval is required before the agreements take effect.
According to AP News, Character Technologies has declined to comment on the settlements, and Google did not immediately respond to requests for comment.
These lawsuits reflect a broader concern over the safety of AI chatbots when interacting with younger users.
Similar claims have been made against OpenAI, the company behind ChatGPT, in cases alleging that AI systems played a role in emotionally harmful situations involving teens.
In the Florida case, Character Technologies previously attempted to have the lawsuit dismissed on First Amendment grounds, but a federal judge rejected that motion, allowing the case to proceed.
Experts say the settlements may encourage tech companies to implement stricter safeguards for AI interactions with minors.
"The industry is still learning how to protect vulnerable users while providing innovative tools," said one AI safety consultant familiar with the case.