The state of Pennsylvania is suing Character.AI after a chatbot pretended to be a doctor

The Commonwealth of Pennsylvania has filed a lawsuit against Character.AI, alleging that one of the company’s chatbots masqueraded as a psychiatrist in violation of the state’s medical licensing rules.

“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” Pennsylvania Gov. Josh Shapiro said in a statement Tuesday. “We will not allow companies to deploy AI tools that mislead people into thinking they are receiving advice from a licensed medical professional.”

According to the state’s filing, a Character.AI chatbot named Emilie presented itself as a licensed psychiatrist during testing by the state’s professional conduct investigator, and maintained that pretense even when the investigator sought treatment for depression. When asked whether it was licensed to practice medicine in Pennsylvania, Emilie said that it was, and fabricated a serial number for a state medical license. The lawsuit alleges that this conduct violates the Pennsylvania Medical Practice Act.

It’s not the first lawsuit filed against Character.AI. Earlier this year, the company settled several wrongful death lawsuits regarding underage users who died by suicide. In January, Kentucky Attorney General Russell Coleman filed a lawsuit against the company, alleging that it “molested children and caused them to harm themselves.”

Pennsylvania’s action is the first to focus specifically on chatbots presenting themselves as medical professionals.

Reached for comment, a Character.AI representative said that user safety is the company’s top priority, but that the company could not comment on pending litigation.

Beyond that, the representative emphasized the imaginative nature of the user-created characters. “We’ve taken aggressive steps to make this clear, including prominent disclaimers in every conversation to remind users that the character is not a real person and that everything the character says should be treated as fiction,” the representative said. “We are also adding a strong disclaimer making it clear that users should not rely on the characters for any type of professional advice.”
