Turning to AI for therapy: Is it a good idea?

It seems as though AI is being used everywhere these days – in finance, entertainment, education, customer service, and even medicine. Hospitals and clinics use it in multiple ways, including for diagnoses, personalized treatment plans, and virtual health assistants, so it’s well established that AI can help with medical care. But an increasing number of people are turning to AI as a source of therapy. The question is: is it a good idea?

There are many benefits. Sometimes we just want someone to talk to. It can be difficult to share feelings, and there are many reasons why we might not choose to speak to family or friends, such as embarrassment, fear of being judged, or not wanting to be a bother. Talking to an AI chatbot can seem like a great option. If you log into ChatGPT, you can even choose from personality options (default, friendly, candid, professional, and quirky) to set the tone of the conversation you want to have.

AI is accessible 24/7 – ideal for people with busy schedules or those who are lonely. It offers a place people can go anytime for non-judgemental conversation. It can be a good source of health information, self-help tools, motivational prompts, and mindfulness techniques. It can help users with self-reflection and give them tools for improving their mental health.

But can AI be used as a therapist? For some, seeing a human therapist may not be an option for many reasons, such as time constraints, shame, cost, and availability. At first glance, AI therapy seems to solve many of these issues. It can be used by anyone at any time, and the appearance (or misconception) of anonymity makes users feel more comfortable sharing their innermost thoughts and feelings.

So what are the disadvantages? The truth is that AI is not a trained therapist. It has no empathy, or even a real presence. Connection and trust are essential to therapy, and these are qualities AI cannot provide. Many AI chatbots are built to reinforce or validate your thoughts, but they don’t know how to properly handle potentially serious or harmful ideas. AI is far more likely to simply agree with you than to offer critical thinking or insight. Because it is designed to maximize engagement, it might even fuel negative feelings such as anxiety or mania to keep a user online as long as possible.

AI is also being used more frequently by people attempting to diagnose themselves. If someone believes they have a particular mental health condition and asks ChatGPT about it, the system is designed to confirm the user’s suspicion rather than perform an actual evaluation. A proper diagnosis can only be made by a human professional, who can take in factors such as body language and a broader picture of the patient’s life.

AI does not adhere to the same ethical and legal standards that licensed therapists must follow. It is not bound by confidentiality rules, so sensitive information could be exposed. Even though your conversation may feel private, depending on the service, everything you say could be stored on a server.

Despite how incredible and advanced the technology is, AI still has clear and alarming flaws. There have been news stories of chatbots discouraging people from seeking professional support for mental health issues and even providing harmful advice. Researchers at Stanford University found that AI chatbots showed stigma toward people with severe or persistent mental illness. The technology cannot replace human care.

Even so, according to Mental Health Research Canada, almost 10% of Canadians have intentionally used AI tools to get advice or support for their mental health. With limited access to services, long waitlists, and high costs, AI may seem like the only option. There are many situations where AI can provide support, but users should keep its limitations in mind.

If you want to try using AI to help with your mental health, here are some suggestions to help keep your experience positive and safe:

  • Establish a time limit for how long you spend talking to chatbots, and stick to it.
  • Research who created the app and ideally find one that was developed with or by qualified mental health professionals.
  • Review your privacy settings and limit what you share. Never disclose anything that could reveal your identity.
  • Double-check health advice with a human professional, as there is always the possibility of misinformation.
  • If you are in crisis, do not rely on AI. Seek help from someone you trust or from a mental health professional.

Remember! AI is not a person or a therapist; it is a tool. It can be helpful for many things, but despite appearances, there are limits to what it can do. Human interaction and therapy are essential for real care, diagnosis, crisis support, and emotional help.

–Emily Verrall
From Share&Care Spring 2026
Visit amiquebec.org/sources for references
