Clinicians Encounter Patient Use of AI for Mental Health, With Darlene King, MD

King discusses real-world patient use of AI tools for mental health advice and implications for clinical conversations.

As patients increasingly turn to artificial intelligence (AI) tools for mental health guidance, clinicians are encountering new dynamics in clinical encounters. Darlene King, MD, assistant professor in the department of psychiatry at UT Southwestern Medical Center, recalled a time when a patient casually pulled out ChatGPT during a psychiatric visit.

“They just asked ChatGPT in front of me what it thought about the medication I was offering,” King told HCPLive.

At other times, patients have sent King chart messages saying they asked ChatGPT about their medication plan and sharing its response.

“They'll give me a whole printout of their conversation with ChatGPT, and I’ll read through it and talk about it with them, usually,” she said.

King noted that AI-generated responses occasionally align with clinical plans, which can reinforce treatment decisions and support patient understanding. However, she also described scenarios in which AI-generated content conflicts with established diagnoses.

“I’ve had them tell me ChatGPT has known me for 2 years, and it doesn’t think I have schizophrenia,” King said. “It thinks I have X, Y or Z instead.”

In such cases, she said, a careful discussion with the patient is needed to review symptoms and prior clinical assessments.

King emphasized that clinicians should proactively ask about AI use during routine history taking. Understanding how frequently patients rely on these tools and for what purposes can help identify potential concerns, including excessive use or reliance in place of social support. She also highlighted privacy considerations, noting that patients may not recognize that AI platforms are not confidential.

Although she has not observed AI directly worsening symptoms, King said it can reinforce preexisting beliefs that diverge from clinical recommendations. “It’s kind of like you go into ChatGPT with…something you want to hear… and it can amplify that,” she said.

King views AI’s role in psychiatry as uncertain, with potential to both augment care and disrupt traditional models, particularly as direct-to-consumer mental health tools expand. She underscored the need for stronger evidence and regulatory frameworks to guide safe implementation.

“Ideally it would help augment care,” King said, “[but] we really need to study and get some good evidence to say whether it can be safe in new ways.”

Watch the full feature on using AI as a therapist, featuring King, here.

King has no relevant disclosures.
