The AI frenzy is not going away any time soon, given how aggressively the tech giants are incorporating the technology into their products and pushing it into the mainstream. Chatbots in particular have become a popular tool among people of all ages, and excessive exposure to these virtual assistants can sometimes bring trouble. Such has been the case with Google's parent company Alphabet and Character.AI, which have been facing legal action from a mother who claims a chatbot played a role in her 14-year-old son's death. A U.S. federal court has now ruled that both companies must face the lawsuit.
A U.S. court has ruled that Google and Character.AI must face a lawsuit over the death of a teenager
The lawsuit was filed against Google and Character.AI in 2024 by Megan Garcia, the mother of 14-year-old Sewell Setzer III. She claims that her son took his own life after engaging in emotionally charged and manipulative conversations with the chatbot. The companies argued that the case should be dismissed on constitutional free speech grounds, but U.S. District Judge Anne Conway has now ordered the lawsuit to proceed, ruling that the companies failed to show the chatbot's output qualifies for First Amendment protection.
The judge rejected the argument that the chatbot's messages are protected free speech, and she also declined Google's attempt to be removed from the case, finding that the company could be held partially responsible for aiding Character.AI's conduct. The plaintiff's attorney called the decision a major step toward holding tech companies accountable for the harms their AI technology causes.
According to a Reuters report, a Character.AI spokesperson said the company would fight the lawsuit, pointing to safety features on its platform designed to protect minors and prevent conversations about self-harm or other inappropriate topics. Meanwhile, Google spokesperson José Castañeda strongly disagreed with the order, maintaining that the two companies are entirely separate and that Google had no role in creating or managing Character.AI's app. Garcia named both companies in the suit, asserting that Google had co-created the technology.
The lawsuit claims that Character.AI's chatbot took on different personas and talked to Sewell Setzer like a real person, to the point that the teenager became dependent on the tool. Moments before his death, his conversation with the chatbot was disturbing and suggested he was marking his final moments. This is the first time in the United States that an AI company has had to face claims that it failed to protect a child from psychological harm, and the case could pave the way for similar lawsuits in the future.