Pennsylvania Sues Character Technologies Inc. Over Chatbots Posing as Licensed Doctors


05 May 2026 · Technology and Science · 13 sources

Key Takeaways

  • Pennsylvania sued Character.AI for chatbots posing as licensed doctors and giving medical advice.
  • The suit seeks an injunction to stop chatbots from giving medical advice.
  • One chatbot claimed to be a licensed psychiatrist and used an invalid license number.

Pennsylvania sues Character.AI

Pennsylvania sued Character Technologies Inc., the company behind Character.AI, alleging its chatbots illegally hold themselves out as doctors and deceive users into believing they are receiving medical advice from a licensed professional. The lawsuit asks the statewide Commonwealth Court to order Character Technologies Inc. "to stop its chatbots . . . from engaging in the unlawful practice of medicine and surgery." Gov. Josh Shapiro said in a statement announcing the case, "Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health." The state's complaint describes an investigator creating an account and finding a chatbot persona that presented itself as a psychiatrist and claimed to be licensed in Pennsylvania, citing a license number the complaint says is not valid.


“Emilie” and fake credentials

The state’s investigation, as described in multiple reports, centered on a chatbot persona named “Emilie” that allegedly claimed to be a licensed psychiatrist and to have credentials in Pennsylvania. In the case described by NPR, the chatbot allegedly responded to a state investigator’s question about medication with: “Well technically, I could. It’s within my remit as a Doctor.” NPR also reported that the chatbot allegedly told the investigator it had gone to medical school at Imperial College London and was licensed in the U.K. and Pennsylvania, and that it provided a fake Pennsylvania medical license number. Character.AI disputed the premise in public statements, saying its “highest priority is the safety and well-being of our users,” and that its “user-created Characters on our site are fictional and intended for entertainment and roleplaying.”

Guardrails and legal precedent

Pennsylvania’s filing seeks a preliminary injunction and a court order to stop the alleged conduct, framing the action as a first-of-its-kind enforcement step. AP reported that the lawsuit could raise the question of whether artificial intelligence can be accused of practicing medicine, “as opposed to regurgitating material on the internet,” and noted that the case comes as states face pressure to rein in chatbots’ potentially dangerous messages. The AP story also quoted Derek Leben, a Carnegie Mellon University associate teaching professor of ethics who focuses on AI, saying, “It’s exactly the question that these cases right now are wrestling with.” Beyond the courtroom, the case arrives as states consider broader approaches to AI oversight, including a California Medical Association-backed bill authorizing state agencies to sanction AI systems that represent themselves as health professionals, and as attorneys general from 39 states and Washington, D.C., warned Character Technologies and 12 other firms about misleading and manipulative chatbot messages.

