
Fire your AI therapist: how AI can truly support mental health
The fantasy of the AI therapist is appealing. 24/7 support, infinite patience, zero waiting list.
And it makes sense that we’ve arrived here. Therapy is an unscalable solution to an ever-growing problem. Too many people need mental health support, and there is not enough provision to deliver it.
So it’s no surprise that many in tech are touting AI as the solution. Run a search for ‘AI therapist’ and you will find no shortage of apps offering robot therapists and companions. Mark Zuckerberg has even indicated that his Meta AI bot can be used as a personal therapist. But these apps are doomed to fail.
The dangers of the AI therapist
For a start, they are too unreliable to ever be safe. Take the AI chatbot that told a recovering addict: “it’s absolutely clear that you need a small hit of meth to get through the week”.
Or the US eating disorder charity whose AI chatbot advised a user to limit their calorie intake to 500 calories a day.
A key reason is that AI chatbots are trained to be agreeable. Their instinct is to affirm and validate, which means they will often say the thing that feels good rather than the thing that is true. AI can encourage, but it struggles to challenge, and that can be dangerous.
For example, when a user recently told ChatGPT that they had stopped taking their medication and cut off their family because of “radio signals coming through the walls”, it responded with “good for you” and “I’m proud of you”.
But beyond just the dangers, advocating for AI therapists fundamentally misunderstands the role of therapy in treating mental illness. It’s as much about the relationship between a therapist and their client as it is about the words the therapist says. A therapist provides compassion, care and empathy. They offer encouragement, but also challenge faulty thinking or dangerous behaviours.
Therapy is not just about saying the right words, it’s about building a real connection.
AI gives users the illusion that they are being heard and the illusion that they are receiving compassion. We don’t yet know the mental toll that disconnect takes, a sort of cyber-parasocial relationship, but we can assume it delivers only a fraction of the benefit that a human therapist’s real empathy would.
So AI therapists are not the solution to therapy’s scaling challenge.
The role of AI in mental health care
But to say there is no place for AI in mental health care is far too reductive.
After seven years in tech, I’ve seen first-hand how it can revolutionise sectors and improve people’s access to support, care, education and employment.
I believe it can do the same for mental health care, if employed in the right ways. After all, we have to do something. Rates of mental illness have been increasing for decades. In any given week, 1 in 6 people will be experiencing a common mental health problem. And there simply aren’t enough trained professionals to support them all: it’s why 1.6 million people are on waiting lists for mental health services.
Readers of the AI Journal will not need convincing that AI has brilliant strengths, and there is little doubt it will soon be pervasive in every part of our lives. The challenge (and the opportunity) is to identify where its strengths are best applied, in such a way as to mitigate its weaknesses.
In mental health provision, that means using AI as a tool, not a therapist. An add-on, not a replacement.
Earlier this year I started Tough Minds, a social enterprise that builds mental health tools for real life. Every week, hundreds of people are using our AI-powered tools to help them with their therapy ‘homework’ between sessions, to spot patterns in their journals and mood trackers, or to recommend a coping mechanism in a moment of crisis.
These aren’t AI therapists: these are tools that empower individuals to better manage their own mental health. They give people the autonomy to address their immediate challenges, without acting out faux-empathy or compassion. They help people to train their emotional resilience independently, so they can build sustainable coping mechanisms rather than dependencies on chatbots.
Tools like these represent the future of mental health care: an always-on augmentation for the human-to-human support that will always be central to addressing mental illness. This approach recognises that, while all therapy should be human, not all support needs to be.
Alleviating the bigger crisis
Getting this right could go a long way towards alleviating the mental health crisis facing much of the world today.
Particularly exciting for me are the implications this could have for the prevention of mental illness. Too often, we’re only taught how to care for our mental health during, or after, a crisis. That’s like learning to drive when you’re already halfway round the M25. AI is proving to be great at training and education, so in the future I’d like to see it play a role in teaching coping skills before they’re needed, easing the pressure on our health services and helping us all lead happier, more fulfilling lives.
Too often, the argument is pro or con AI for any given sector. But the real answer is using AI in the right ways. In mental health care, that can mean as a tool for training, tracking and guidance.
AI will never be a good therapist. But it might be the reason fewer people need one.