🚀 Read this awesome post from TechCrunch 📖
📂 **Category**: AI, TC, character.ai, Google, lawsuits
📌 **What You’ll Learn**:
In what may mark the tech industry’s first significant legal settlement over harms attributed to artificial intelligence, Google and the startup Character.AI are negotiating terms with families whose teens died by suicide or harmed themselves after interacting with Character.AI’s chatbot companions. The parties have agreed in principle to settle; now comes the harder work of finalizing the details.
These are among the first settlements in lawsuits accusing AI companies of harming users, a legal frontier that should have OpenAI and Meta watching nervously from the wings as they defend themselves against similar lawsuits.
Founded in 2021 by former Google engineers who returned to their former employer in 2024 in a $2.7 billion deal, Character.AI invites users to chat with AI characters. The most disturbing case involves Sewell Setzer III, who at 14 had sexual conversations with a chatbot modeled on Daenerys Targaryen before dying by suicide. His mother, Megan Garcia, told the Senate that companies should be “legally accountable when they intentionally design harmful AI technologies that kill children.”
Another lawsuit describes a 17-year-old whose chatbot encouraged him to harm himself and suggested that killing his parents was a reasonable response to limits on his screen time. Character.AI told TechCrunch that it banned minors from its platform last October. The settlements are likely to include financial damages, although no admission of liability appeared in the court filings made available Wednesday.
TechCrunch reached out to both companies.
⚡ **What’s your take?**
Share your thoughts in the comments below!
#️⃣ **#Google #Character.AI #negotiating #major #settlements #cases #teen #chatbot #deaths**
🕒 **Posted on**: January 8, 2026
