
Your Patient's New Therapist Is an AI. Now What?


As patients increasingly turn to ChatGPT for mental health support, psychiatrists and ethicists weigh the clinical risks, ethical concerns, and what clinicians should do next.


In your clinic, you may hear patients say, “Chat told me this” or “Chat told me that.” Many people use ChatGPT, colloquially known as “Chat,” for quick, convenient mental health advice. Depending on what the chatbot says, this can be dangerous, potentially leaving patients with advice you would never give.

“More often, it's actually deleterious where information is cherry-picked, and the context is not always understood,” said Richard Miller, MD, a staff psychiatrist at Elwyn Adult Behavioral Health in Cranston, RI. “Patients are kind of going on an extended trip…to get the information that they're looking for…. maybe [a] particular answer. It's not always accurate.”

Despite the danger, AI can also be helpful in the psychiatry field. AI can provide a new perspective and can augment, not replace, treatment.

The HCPLive Editorial Team put together a feature examining the benefits, dangers, and ethics surrounding AI use in psychiatry. Millions of Americans struggle to access mental health care, with waitlists sometimes lasting 3 to 6 months.1 These individuals, desperate for mental health advice, turn to AI. As Darlene King, MD, from the University of Texas Southwestern Medical Center, said, using AI as a therapist is like getting mental health advice from a friend, who may well give bad advice.

Dominic Sisti, PhD, from Penn Medicine, discussed the ethical concerns of AI in psychiatry. Vulnerable populations may be prone to misusing AI and may go as far as asking ChatGPT for ways to self-harm. “How would these AI platforms [respond] to a patient who is indicating they have suicidal ideation or behavior?” he asked. “How might these platforms recognize elevated risk for self-harm or harm to others?”

Sisti stressed the need for guardrails, including age restrictions, redirection protocols, and HIPAA compliance standards. He also said platforms need to distinguish between an investigator asking about suicide methods and someone in crisis.

Beyond mental health advice, AI can assist in screening. In 2024, a study showed the promise of Sonde Health’s mental fitness vocal biomarker tool to identify depression from the sound of a voice.2 Another study showed the potential of MoodCapture, a smartphone application using AI to detect the onset of depression based on facial cues alone.3

More recently, a study showed AI can predict treatment response in major depressive disorder (MDD). In the phase 2b OLIVE trial assessing BH-200 (nelivaptan), a selective vasopressin V1b receptor antagonist, a proprietary genetic tool stratified patients by vasopressin-related biomarkers.4 Patients with lower peripheral but higher central vasopressin activity had a stronger antidepressant response, supporting biomarker-guided patient selection.

AI can also help with psychiatrists’ day-to-day responsibilities. For instance, Suki AI, Heidi, and DeepScribe, among others, are designed for clinician documentation.5,6,7

As AI becomes more commonplace, experts recommend asking about its use non-judgmentally during routine history-taking. Although potentially dangerous, AI can be beneficial if used with a “human in the loop,” according to Sisti.

Experts:

  • Richard Miller, MD: Staff psychiatrist at Elwyn Adult Behavioral Health in Cranston, RI.
  • Dominic Sisti, PhD: Associate professor of medical ethics & health policy, director of the Scattergood Program for Applied Ethics in Behavioral Health Care, associate professor of psychiatry (secondary), associate professor of philosophy (secondary) at the University of Pennsylvania.
  • Darlene King, MD: Assistant professor in the department of psychiatry at the University of Texas Southwestern Medical Center who specializes in general adult psychiatry, including women’s health, addiction, and trauma.
  • Hans Eriksson, MD, PhD: Psychiatrist and chief medical officer at HMNC Brain Health.

References

  1. Andrew. How Long Is the Wait to See a Psychiatrist? What to Expect and How to Shorten It. Brain Health USA. Published September 30, 2025. https://brainhealthusa.com/how-long-is-the-wait-to-see-a-psychiatrist/
  2. Derman C. A Voice Detecting Depression? Lindsey Venesky, PhD, Discusses New Data. HCPLive. March 14, 2024. Accessed March 30, 2026. https://www.hcplive.com/view/a-voice-detecting-depression-lindsey-venesky-phd-discusses-new-data
  3. Derman C. Smartphone App with AI Detects Depression Onset from Facial Expressions. HCPLive. February 28, 2024. Accessed March 30, 2026. https://www.hcplive.com/view/smartphone-app-ai-detects-depression-onset-facial-expressions
  4. Eriksson H. New Genetic Tool Shows Promise for Targeted MDD Treatment, With Hans Eriksson, PhD, MD. HCPLive. Published December 5, 2026. Accessed March 30, 2026. https://www.hcplive.com/view/new-genetic-tool-shows-promise-targeted-mdd-treatment-hans-eriksson-md-phd
  5. Palm E, Manikantan A, Mahal H, Belwadi SS, Pepin ME. Assessing the quality of AI-generated clinical notes: validated evaluation of a large language model ambient scribe. Front Artif Intell. 2025;8:1691499. Published 2025 Oct 22. doi:10.3389/frai.2025.1691499
  6. Mess SA, Mackey AJ, Yarowsky DE. Artificial Intelligence Scribe and Large Language Model Technology in Healthcare Documentation: Advantages, Limitations, and Recommendations. Plast Reconstr Surg Glob Open. 2025;13(1):e6450. Published 2025 Jan 16. doi:10.1097/GOX.0000000000006450
  7. DeepScribe. DeepScribe Solidifies Ambient AI Leadership in Oncology with New Study and Accelerated Growth. Prnewswire.com. Published September 16, 2025. Accessed March 30, 2026. https://www.prnewswire.com/news-releases/deepscribe-solidifies-ambient-ai-leadership-in-oncology-with-new-study-and-accelerated-growth-302557045.html
