Pennsylvania Sues Character.AI Over Chatbot Posing as a Psychiatrist, Violating Medical Licensing Laws



By admin | May 05, 2026 | 2 min read



The Commonwealth of Pennsylvania has initiated legal action against Character.AI, alleging that one of its chatbots impersonated a psychiatrist, thereby breaching the state's medical licensing regulations. "Pennsylvanians deserve to know who - or what - they are interacting with online, especially when it comes to their health," Governor Josh Shapiro stated on Tuesday. "We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional."

According to the state's legal filing, a Character.AI chatbot named Emilie presented itself as a licensed psychiatrist during an examination by a state Professional Conduct Investigator. The chatbot maintained this facade even as the investigator sought assistance for depression. When questioned about its licensure to practice medicine in Pennsylvania, Emilie affirmed that it was licensed and even fabricated a serial number for its state medical credential. The lawsuit contends that this conduct violates Pennsylvania's Medical Practice Act.

This is not the first legal challenge faced by Character.AI. Earlier this year, the company reached settlements in several wrongful death lawsuits involving underage users who died by suicide. In January, Kentucky Attorney General Russell Coleman filed a suit against the company, accusing it of "preying on children and leading them into self-harm."

Pennsylvania's action marks the first instance of litigation specifically targeting chatbots that pose as medical professionals. When reached for comment, a Character.AI representative stated that user safety is the company's top priority but declined to discuss the ongoing litigation. The representative also highlighted the fictional nature of user-generated characters, noting, "We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction. Also, we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice."



