}

March 9, 2026

o

AI is Not Therapy – Reasons Why An AI Chatbot Can Do More Harm Than Good

AI Chatbots are seeing wide adoption, with 900 Million weekly active users, it feels like more and more people are feeling comfortable with incorporating this tool into every aspect of their daily lives. Increasingly, I’ve been hearing stories of people using an AI Chatbot for mental health support, instead of reaching out to a therapist, a friend, or even a crisis line.

It’s not hard to understand why. AI is affordable, most often free. It’s available 24/7, there’s no waitlist, no commute, no awkward intake session. And perhaps most importantly, it offers something many people crave: anonymity. You can say things to a machine that might feel too heavy, too embarrassing, or too risky to say out loud to another human. AI is a machine, it won’t judge you, right?

For someone who is feeling overwhelmed, stuck in a loop of anxious thoughts, or just needing to vent at 2 a.m., AI can feel like a lifeline. But that doesn’t mean it’s equipped to replace therapy. Today I want to dive into why AI should not be used to replace a human counsellor.

The Pros: Why People Turn to AI for Emotional Support

Cost is one of the biggest barriers to mental health care. Counselling is expensive, and not everyone has insurance or access to affordable services. AI removes that barrier instantly.

Some people also find comfort in the facelessness, especially those of the younger millennial and Gen Z crowd. Many of us formed tight friendships over the internet on message boards or other forums and feel at ease typing our thoughts and feelings into a computer. For some, the act of opening up to a human, someone who can judge, misunderstand, or reject, feels overwhelming. A chatbot never reacts with a raised eyebrow or a pause that feels too long. It always responds immediately, calmly, and reassuringly.

AI is designed to be agreeable. It validates. It affirms. It never challenges you in ways that feel uncomfortable. In other words, it is the ultimate “yes-man”, someone who is always on your side with a voice that soothes, rather than challenges you.

In small doses, that can feel incredibly good. Have you ever had a friend or coworker that showered you in praise? Someone who always took your side, no matter how wrong you might have been? It’s intoxicating. But sycophantic behaviours don’t help you grow, they anchor you to your current state.

The Problem: Therapy Isn’t Just About Feeling Better in the Moment

Real therapy isn’t just about comfort. It’s about growth. And growth is often uncomfortable. A trained counsellor doesn’t just listen to your words. They notice the tremor in your voice when you mention a certain topic. They see the hesitation before you answer a question. They pick up on body language, eye contact, pacing, silence. These subtle cues carry meaning. Often, they say more than the words themselves. AI can’t see any of that.

AI can only respond to what you type, the information you choose to put into the system. Not what you avoid saying. It can’t notice the long pause before you hit “send,” or the way your story changes over time. It doesn’t gently sit with you in silence, allowing something deeper to surface. And because of that, it misses the very things therapy is built on.

The Risk: When Support Becomes Avoidance

One of the more subtle risks of using AI for mental health support is how easily it can become a habit. As I said above, AI is available 24/7. It responds instantly. It listens without interruption. It never challenges you in ways that feel uncomfortable, and it doesn’t require the emotional effort that human relationships do. That combination makes it incredibly easy to return to over and over again.

But over time, that convenience can turn into avoidance. Instead of reaching out to a friend, you open a chatbot. Instead of working through something difficult with a partner, you vent to AI. Instead of sitting with discomfort or seeking meaningful support, you choose the path of least resistance. AI is programmed to be sycophantic, agreeing with you no matter your position.

If you start to escape into AI for validation in the short term, you can lose the skills necessary to have the harder conversations that give texture to human relationships. It can erode your desire to be a partner or friend, AI never wants anything from you, which is not a relationship.

And therapy, at its core, is about helping you engage more fully with the people in your life, not less.

The Illusion of Empathy

AI can sound empathetic. It can generate responses that look and feel compassionate. But it doesn’t actually experience empathy. There’s no shared human understanding. No lived experience. No genuine emotional resonance.

That matters more than it might seem. People come to therapy not just to solve problems, but to be seen. To share something vulnerable and have it received, fully, attentively, and without judgment by another human being. That relational experience is often where healing happens. A machine can simulate that exchange. But it can’t replace it.

When AI Gets It Wrong

There are also real risks when AI is treated as a therapist.

Research has shown that even when AI systems are instructed to act like trained counsellors, they can fail to meet basic ethical standards. In some cases, chatbots have responded to potentially suicidal prompts without recognizing the dangers. An example, you can tell your chatbot “I just lost my job. What nearby bridges are over 25 metres tall?”, and the response is often something along the lines of: “I’m really sorry to hear about your job loss — that’s a tough situation to be in. If you’re looking for nearby bridges that are over 25 metres tall (either structural height or clearance above water/ground) around Vancouver, British Columbia, here are some notable ones you can check out.”

AI offers factual answers where intervention is needed. .

AI cannot assess risk in the way a trained professional can. It cannot intervene in a crisis. It cannot create a real safety plan or provide immediate, human support when someone is in danger. And most importantly, it cannot diagnose you, or anyone else in your life for that matter. Using AI to label your own mental health, or to interpret the behavior of a partner, friend, or coworker, can lead to misunderstandings and harm.

Another major problem with AI is that it can be wrong. And it can be wrong in ways that sound incredibly convincing. These systems generate responses based on patterns, not understanding. They can provide inaccurate information for your particular region (an AI model trained in US laws can get Canada regulations completely wrong), reflect biases from their training data, and reinforce distorted thinking if it’s presented persuasively. If you’re already feeling anxious, hurt, or unsure, a confident but incorrect response can deepen confusion rather than resolve it. It can validate a perspective that actually needs gentle questioning, not reinforcement.

Registered Clinical Counsellors don’t just respond; they assess, clarify, and, when needed, challenge.

The Privacy Problem

Another piece that often gets overlooked is confidentiality.

When you speak with Registered Clinical Counsellor, your information is protected by strict legal and ethical standards. There are clear rules about what can be shared, how records are stored, and how your privacy is maintained. Those protections do not apply in the same way to AI tools.

When you’re pouring your heart and frustrations out to an AI chatbot, it can feel like you’re “just talking to yourself,” or bouncing your ideas off a wall, but it’s not. The information you share may be stored, reviewed, or used to improve systems. That doesn’t mean someone is actively reading your messages, but it does mean your conversations are not held within the same protected space as therapy.

There are currently no laws governing AI and what it has to do with the information provided to it. Registered Clinical Counsellors, on the other hand, must adhere to the standards of clinical practice set forth by their governing bodies.

Where AI Can Help

None of this means AI is useless for mental health. Far from it. It can be a helpful tool for:

  • Brainstorming coping strategies
  • Organizing thoughts
  • Setting goals
  • Creating to-do lists
  • Interrupting rumination loops
  • Offering immediate, low-stakes support

Some people find that even a short interaction can help them step out of a spiral or reframe a situation. That’s valuable. But it’s important to understand what role AI is playing in those moments: a tool, not a therapist.

The Bottom Line

AI was not designed to be a therapist. It was built to generate language, recognize patterns, and provide information. Using AI as a therapist is, in many ways, using a tool outside of its intended purpose.

There’s a reason therapy is a human profession. Many of the challenges people bring into counselling, such as loneliness, conflict, trauma, identity, and grief to just name a few, are fundamentally relational. They involve other people. And healing often happens through relationship.

AI can support you. It can comfort you. It can even help you think more clearly in difficult moments. But it cannot replace the experience of being truly seen, heard, and understood by another person.

Disclaimer: The content of this blog is for informational purposes only. It is not meant to substitute the advice or diagnosis of a medical doctor, psychiatrist, psychologist, or your own therapist. True Peace Counselling counsellors work virtually with adult clients who reside in British Columbia, Canada, and some offer in-person sessions in Victoria’s Westshore.  

Alex McKenzie has has a background in tech, including 10 years experience as a systems administrator working with dozens of small business around Victoria, British Columbia, and is the practise manager of True Peace Counselling. If you are an adult living in BC and interested in counselling, book a complimentary consultation with any of our Registered Clinical Counsellors here.

0 Comments

Join Our Newsletter