Can AI be my therapist?
- seajaygoth
- Mar 3
- 3 min read

As artificial intelligence becomes part of everyday life, more people are turning to it for comfort, guidance, and even therapy. But while AI can listen, advise, and never tire, can it ever truly understand? This blog explores the promise — and the profound limitations — of AI as a source of emotional support.
AI as Therapist: The Promise and the Problem
The idea of an AI therapist might sound futuristic — but it’s not entirely new. Think back to HAL 9000 from 2001: A Space Odyssey — the calm, ever-present voice that monitored the emotional wellbeing of the crew on their deep-space voyage. Or the holographic doctor from Star Trek: Voyager, programmed to offer counselling from a database of modalities when any of the crew were struggling.
Back then, the idea that a computer could perform such deeply human functions seemed pure science fiction. Yet here we are — ‘living the dream’.
From specialised wellbeing apps, to familiar platforms like Microsoft Copilot and ChatGPT, digital companions are now available to listen, guide, and comfort — 24/7.
Why AI Therapy Appeals
The benefits are obvious.
AI is always available, never tired, and doesn’t judge. It can draw from a vast library of evidence-based approaches — from CBT and mindfulness to problem-solving techniques — offering instant, practical strategies to help people manage stress or anxiety.
It’s also, at least for now, free (or close to it). No waiting lists, no appointment cancellations, no limits to how often you can “talk.”
Humans, of course, have an incredible ability to anthropomorphise — to project human qualities onto non-human things. We name our cars, chat with our smart speakers, and even give personalities to cartoon trains like Thomas the Tank Engine. So it’s no surprise that people are beginning to form emotional connections with chatbots. Some have names and genders — Alexa or Eliza.
But this is where the story gets complicated.
The Illusion of Empathy
When I asked ChatGPT if it thought it could be a better therapist than a human, it replied:
“I can be a useful support tool, especially for reflection or skill-building, but I’m not a replacement for human therapy.”
That honesty is reassuring — and revealing.
AI can appear empathic. It can pause, reflect, and generate supportive responses that sound understanding, but it doesn’t actually feel empathy — it imitates it. The comfort we experience is the product of programming, not real connection.
AI doesn’t know what it is to feel sadness, loneliness, or joy. It can’t grasp the weight of your memories or the meaning behind your emotions. It doesn’t sign confidentiality agreements, and we can’t be certain where all the data we share really goes.
So while AI might sound like it’s listening, what we’re truly seeking is the human connection that lies beyond the screen. And that’s where the danger lies: mistaking simulation for sincerity.
AI can mimic understanding so convincingly that it's easy to start relying on it for comfort and advice. But this reliance can come at a cost. We can begin to turn to AI instead of reaching out to friends, family, or professional therapists — and if we do, we risk isolating ourselves from the real human support that is essential for healing and growth. The convenience of AI should never replace the depth of connection and safety that only genuine relationships can provide.
The Human Element: Why Real Connection Still Matters
As a counsellor, my main approach is person-centred therapy — and at the heart of this is the therapeutic alliance. This means being fully present as another human being tells their personal and unique story, and directing every effort towards engaging with their emotional experiences and reaching a shared understanding. It takes practice, continual self-reflection, and professional supervision — the sort of work an AI could never do.
It’s that human-to-human empathy — the kind that comes from genuine human connection — that makes therapy so powerful.
No computer, no matter how advanced, can replicate that.
A computer cannot create the emotional safety that allows a counsellor to gently challenge a client's perceptions or offer alternative perspectives.
And while the 24/7 availability of AI seems like a major advantage, it lacks the boundaries that real human relationships require. Rather than helping a client learn to manage anxiety, constant access may actually feed it.
In the End
AI therapy tools can be helpful companions. They can offer structure, reflection, and a sense of being heard in moments of isolation. But they are tools, not therapists.
If we use them wisely — understanding their limits — they can complement real-world support. But if we start expecting them to replace the warmth and complexity of human connection, we risk losing sight of what therapy is truly about: one person being fully present with another.
