
AI is no longer a hypothetical in mental health care. It's here, it's evolving rapidly, and it has already been integrated into many care settings. From AI chatbots offering emotional support to tools that summarize therapy sessions or help triage patients, we are entering an era where AI is part of the care experience. The question is not if these tools have a place in mental health service delivery, but how we ensure they're deployed responsibly.
Recent research provides compelling insights into what AI can do well, what it can't, and what it might mean for the future of behavioral health.
What the Research Shows
A May 2025 consumer study by The Hemingway Report found that 41% of respondents have used AI for mental health support. Most use general-purpose tools like ChatGPT, not mental health-specific products. Their top reasons? Availability, anonymity, affordability, and curiosity. And 79% of users said they were satisfied with the AI agents they used, with more than half (54%) reporting improved mental well-being.
Yet, even as usage climbs, most respondents said AI serves a different purpose than therapy. Only 7% said AI was better than traditional therapy, while nearly 59% saw it as a complementary tool. Many users appreciated the always-on availability and judgment-free nature of AI, particularly for quick emotional check-ins or when discussing taboo or difficult topics. Still, when it comes to deeper or more complex mental health issues, users themselves acknowledged the need for human support.
Several other studies have demonstrated the efficacy of AI in mental health treatment, including:
- A randomized trial published in NEJM AI in April 2024 tested a generative AI mental health chatbot against a control. It found AI could reduce symptoms of anxiety and depression, with improvements comparable to human-led digital interventions, but emphasized the need for oversight and ethical safeguards. NEJM AI, 2024
- A PLOS Mental Health study ran a Turing test of sorts, asking people to choose between responses from therapists and ChatGPT. Surprisingly, participants rated ChatGPT's responses as more empathetic than the human therapists' 54% of the time. Fortune, 2025
- And a recent study in Communications Psychology found that third-party evaluators perceived AI-generated mental health responses as more compassionate than those from expert clinicians. This doesn't mean AI is more caring, but it shows just how far the technology has come in simulating supportive communication. Nature, 2024
The Opportunity and the Responsibility
This emerging body of evidence confirms what many in the field have suspected: AI for mental health care is already influencing how people approach treatment. But the presence of AI in mental health isn't neutral. It introduces new variables in safety, quality, and equity. Tools that aren't trained with diverse data can perpetuate bias. Tools that promise too much may delay people from seeking higher-acuity care. And tools without safeguards may miss warning signs in someone at risk.
At Thriveworks, we believe in taking a human-to-human, clinically grounded approach to innovation. Today's AI is only as strong as the human data it learns from. And where does that data come from? Real conversations between real humans. The problem is that when AI models are trained by scraping broad swaths of the internet, noise is inevitably introduced. There's a real risk of these tools pulling from non-professional or even harmful conversations. Large language models tend to be overly positive or eager to please, and this can come at the expense of accuracy or necessary confrontation.

That's why we see the future of AI in mental health not as a replacement for therapy, but as a partnership that enhances access, continuity, and outcomes.
Opportunities for Integration with Health Systems and Payers
AI provides the opportunity to fast-track collaborative care models designed to address whole-person health. It can facilitate or strengthen physician-to-therapist collaboration, especially between sessions or during care transitions. Used thoughtfully, it can help appropriately triage patients, improve adherence, and surface insights that inform treatment plans.
For payers and employers, it offers a scalable way to address lower-acuity needs at a lower cost. For those who aren't in crisis or requiring immediate, intensive treatment, AI can provide tools for reflection, stress management, and problem-solving while preserving therapist bandwidth for those with more complex needs.
But this must be done carefully. The "digital front door" to mental health care should be a guided entry point, not a revolving one. Patients and providers need to understand exactly what these tools are, what they aren't, and when it's time to escalate.
Moving Forward: Innovation With Guardrails
The future of behavioral health will not be AI-only, nor will it be AI-free. As demand continues to outpace supply, the pressure to adopt new tools will only grow. But speed cannot come at the cost of safety.
We need:
- Clear standards and regulatory guidance on when and how AI tools can be used in mental health.
- Diverse training data that ensures AI is safe and effective across populations.
- Transparency in how tools are built, what they promise, and how outcomes are measured.
- Clinical integration that puts qualified humans in the driver's seat.
At Thriveworks, we're optimistic about what this technology can do when it's used right. Because we firmly believe AI doesn't have to be a zero-sum game. If we build it responsibly, it can be a force multiplier for access, empathy, and impact.