Shoshani discusses RCT data showing a conversational AI intervention reduced anxiety and improved well-being, with anxiety reductions exceeding those achieved with group therapy and depression outcomes comparable to it.
A conversational AI intervention reduced anxiety more than group therapy and a waiting list control, with similar depression outcomes, in a randomized trial of 995 university students with psychological distress.
At 12 weeks, the AI intervention reduced anxiety more than group therapy (MD, −2.17; 95% CI, −2.67 to −1.67) and control (MD, −2.15; 95% CI, −2.65 to −1.65) and improved well-being versus group therapy (MD, 5.72; 95% CI, 2.71 to 8.73). Among those with baseline clinical anxiety, 57.9% in the AI group remitted to nonclinical levels versus 14.4% with group therapy and 9.8% with control (P <.001).
Participants used the intervention a mean of 3 days per week; 61% remained engaged at 12 weeks. A perceived digital therapeutic alliance was associated with greater engagement and symptom improvement (P <.001).
Depression improved versus control (MD, −1.99; 95% CI, −2.63 to −1.35) but did not differ from group therapy after adjustment. PTSD symptoms did not differ across groups, highlighting potential limits of AI-based care for trauma-related conditions.
In an interview with HCPLive, Anat Shoshani, PhD, professor at Reichman University and chief psychologist at Kai.ai, discussed the clinical implications of these findings, including the integration of conversational AI across levels of care and the alignment of intervention intensity with patient needs.
HCPLive: What do you see as the most clinically meaningful takeaway from this trial comparing the conversational AI intervention with group therapy and a waiting list control?
Shoshani: The biggest takeaway is that mental health support no longer has to exist only inside clinic walls. Traditional psychotherapy is incredibly valuable, but it… happens in scheduled weekly sessions. Human suffering is not episodic. Panic attacks happen at midnight. Students spiral before exams. People relapse after treatment ends. Many struggle while sitting on waiting lists for months.
What our study suggests is that when AI is designed responsibly and grounded in real psychological frameworks, it can help fill those vulnerable “in-between moments” where traditional care is often absent. The AI intervention led to reductions in anxiety and depression and improvements in well-being and life satisfaction compared with the waiting list group.
The most important implication is not that AI replaces therapy. It’s that mental health care may be shifting from episodic support to more continuous support.
That means helping people before therapy begins, between sessions, and after treatment ends.
That is where the real innovation lies.
HCPLive: The AI platform showed greater reductions in anxiety than face-to-face group therapy. How should clinicians interpret that result in terms of real-world effectiveness?
Shoshani: I would interpret this finding cautiously… Anxiety is often highly situational. It spikes late at night, before social situations, during academic stress, or in moments when people feel completely alone with racing thoughts. A weekly group therapy model can be very valuable, but it may not always reach people when anxiety is actually happening.
Kai offered continuous access, personalized responses, memory across conversations, and repeated opportunities to practice coping skills in real time. That may be particularly helpful for anxiety because regulation often depends on what happens in the moment.
There’s also an important contextual factor. This trial took place during a prolonged period of stress and uncertainty in Israel. One possible interpretation is that during prolonged real-world stress, a weekly group format may not have been sufficient for some participants, who needed more individualized and ongoing support. I would not frame this as AI being “better than therapy.” I would frame it as continuity and immediacy potentially mattering enormously for anxiety care.
HCPLive: Depression outcomes were more similar between AI and group therapy after adjustment, while PTSD symptoms did not improve. What does this pattern suggest about where conversational AI is most and least useful?
Shoshani: Conversational AI appears particularly promising for mild-to-moderate anxiety, depressive symptoms, stress management, emotional regulation, and day-to-day well-being support.
PTSD is different. Trauma symptoms are often more complex and may require trauma-focused interventions, deeper clinical judgment, careful pacing, and sometimes exposure-based work. The platform was not originally designed as a trauma-specific treatment model.
I actually see the PTSD finding as a strength of the study because it reminds the field that responsible AI requires humility. We need to understand not only where these tools work but where human care remains essential.
HCPLive: Your study highlights “digital therapeutic alliance” as a key driver of engagement and symptom change. How do you define alliance in an AI context, and how should clinicians conceptualize it?
Shoshani: This was one of the most fascinating findings in the study. In psychotherapy, alliance typically refers to trust, emotional connection, and the belief that someone understands you and can help you. We found that people can experience something functionally similar with an AI.
Users rated Kai similarly to human group therapists on warmth and competence. Alliance predicted engagement, and engagement predicted symptom improvement.
I think of Kai as a transitional bridge. It’s not a therapist. It’s not human attachment. But it can help people feel less alone in vulnerable moments, contain distress temporarily, and create enough emotional stability to reconnect with real human relationships and clinicians.
That’s very different from replacement. It’s augmentation.
HCPLive: Engagement was strongly associated with symptom improvement. What design features appeared most important for sustaining engagement over 12 weeks?
Shoshani: This was actually one of the findings I found most meaningful because engagement is the Achilles’ heel of digital mental health. Many people download mental health apps with good intentions and abandon them very quickly because the experience feels clinical, burdensome, or disconnected from real life.
That’s not what we saw here.
Participants engaged around 3 days per week, and 61% were still active at week 12, which is unusually high for this field.
Kai was designed to fit into the natural rhythm of emotional life rather than asking users to step outside of it. Distress often happens in small, ordinary moments. Before an exam. After an argument. During a lonely evening. On the way to work. People are rarely willing to open a separate platform and complete a 40-minute mental health module in those moments.
Kai lives inside familiar messaging platforms, so support feels immediate and accessible. It can proactively check in during difficult periods rather than waiting passively for someone to return. When people are overwhelmed, it offers interventions that feel manageable, like a 2-minute breathing exercise, a quick reframing prompt, or a short grounding practice.
The platform was also designed to go beyond symptom reduction. Mental health is not only about reducing anxiety. It’s also about building resilience, meaning, connection, and positive emotions. Users may move between discussing distress and engaging in exercises related to gratitude, strengths, relationships, or purpose.
One of the most underestimated factors is memory. Kai remembers prior conversations, recurring triggers, emotional patterns, and important life events. That continuity creates something powerful. People don't feel like they are restarting from zero every time they open the conversation. It begins to feel less like using an app and more like returning to a space that already knows where you've been.
HCPLive: What are the most important next steps for testing AI interventions in real-world psychiatric or primary care settings?
Shoshani: The field is entering a much more interesting phase. The question is no longer simply whether conversational AI can reduce symptoms in controlled trials. We now have growing evidence that it can help certain populations. The bigger question is whether it can help solve structural problems in mental health care.
One major opportunity is using AI to support those vulnerable in-between periods: while someone is waiting for treatment, between therapy sessions when real-life triggers happen, and after treatment ends when relapse risk increases.
There’s also a much broader opportunity. Many people never enter therapy at all. Some live in areas with severe clinician shortages. Others cannot afford treatment. Some avoid care because of stigma, cultural barriers, or fear of being judged. Scalable AI could provide immediate, low-barrier emotional support and preventive mental health tools before distress escalates into crisis.
HCPLive: Where do you see conversational AI fitting into stepped care models for mental health going forward?
Shoshani: Conversational AI could fundamentally reshape how we think about stepped care. Traditionally, mental health care has often been binary. You’re either receiving formal therapy, or you’re largely on your own.
Human distress is far more fluid than that. People move in and out of difficult periods. Sometimes they need intensive clinical care. Sometimes they need light support, accountability, coping tools, or simply immediate emotional containment during a hard moment.
For some people, it may function as an early intervention tool before symptoms become severe enough to require formal treatment. For others, it can extend psychotherapy into daily life by helping people practice skills when real triggers actually happen. After treatment ends, it can help people maintain progress and identify early warning signs before a full relapse occurs. For higher-risk cases, it should also help direct people toward human clinicians faster rather than delaying care.
I don’t think the future is AI replacing therapists. I think the future is a much more flexible continuum of care where support becomes responsive to what people actually need in a given moment.