Why AI Can't Replace Therapy
- Andrew Arnold
- Jun 18
- 4 min read
How to Use ChatGPT or Other LLMs Safely and Effectively for Mental Health Support

As a therapist, I'm tempted to discourage the use of a service like ChatGPT for mental health entirely, because it can easily cause more harm than good. However, I'm not ignorant, and I know how prevalent the use of these Large Language Models (LLMs) has become for everything, including mental health support.
With the growing accessibility of models like ChatGPT, Gemini, and others, more people are turning to them for mental health guidance: self-diagnosis, emotional advice, and relationship dynamics. While these tools can offer helpful insights, it's crucial to remember that AI can't replace therapy with a real therapist. In matters of diagnosis, treatment, or relational depth, relying solely on AI is insufficient and may even prove harmful.
Why AI Can't Replace Therapy With a Real Person
Human Psychology Is Nuanced, And So Is How We Present Information
Psychological well-being is deeply individual, shaped by personal history, body language, emotional context, and subtle behavioral cues. Therapists are trained to attend to, and incorporate, all of this when working with you. LLMs, while powerful with language, cannot perceive non-verbal cues.
Moreover, how we describe our experiences to AI is filtered through our biases and emotional states. This often results in confirmation bias, where the model reflects or reinforces what we already believe rather than challenging us. As a therapist, I can recall several occasions in which the information being verbally presented to me in a session was incongruent with the individual's body language, tone of voice, or the gravity of the experience being described. Therapists are trained to identify blind spots, question distortions when we perceive them, and challenge our clients in ways LLMs cannot.
LLMs Can Hallucinate (Make Things Up)
“Hallucinations” occur when an LLM confidently outputs false or fabricated information. Reliable benchmarks show typical hallucination rates close to 20%, even in top models (https://arxiv.org/abs/2305.11747). Misguided or false advice about a mental health issue can be dangerous, making AI hallucinations a serious liability to your well-being.
Relational Healing Requires a Real Relationship
The therapeutic alliance, which is the trust, collaboration, and emotional bond between therapist and client, is foundational to healing. "Meta-analytic findings reveal that the magnitude of the alliance-outcome relationship is modest, accounting for 5-8% of the variance in outcome" (https://pmc.ncbi.nlm.nih.gov/articles/PMC4219056/). LLMs cannot co-regulate, empathize, or form the kind of genuine relationship that helps create lasting growth.
Information ≠ Implementation
LLMs excel at delivering information. I've been using one to help me write this post: to support editing, to find links to academic research that validate my points, and to help overcome the writer's block that so frequently prevents me from putting my thoughts to page. Something like ChatGPT can be useful for learning definitions, reading research summaries, or getting basic coping tips, but real change depends on applying that information in your life. Therapists help personalize and guide implementation based on your unique experience.
LLMs Are Not Private
One of the most important components of therapy is confidentiality, and you will not receive this benefit from publicly accessible LLMs. When you use a mainstream LLM, you are agreeing to its terms and conditions, including its privacy policy. A therapist, however, will work to protect your privacy as required by HIPAA.
Five Ways to Use LLMs Responsibly as a Supplement to Mental Health Care
Request Evidence-Based, Cited Answers
Ask LLMs to provide links to peer-reviewed studies or reputable journals. For example, add something like "include academic references or links to studies" to your prompt so you can personally read and verify the source material.
Ask for Practical Coping Tools
Use LLMs to generate ideas for things like mindfulness exercises, breathing techniques, and journaling prompts for specific issues.
Use Voice Features for Guided Mindfulness
ChatGPT's "Advanced Voice" mode can walk you through customized meditations. While it doesn't replace therapy, it can offer structured mindfulness tailored to your situation. My main hobby is competitive disc golf, and I've successfully used this feature to lead me through disc golf and competition-specific mindfulness exercises. It's a pretty cool feature.
Keep Searches Concrete and Verifiable
Use LLMs for objective requests such as definitions, symptom lists, or treatment frameworks (e.g. a list of cognitive distortions, common physical symptoms of anxiety, or a list of emotion words to use instead of "sad"). Avoid using AI for subjective self-analysis or relational advice; requests that aren't grounded in verifiable fact are the most likely to produce misleading, false, or harmful responses.
Bring AI-Generated Insights to Your Therapist
If an LLM offers a coping strategy or framing that resonates, bring it into therapy. Your therapist can assess its relevance, accuracy, and usefulness, correct information as needed, and ensure you stay on the right path toward healing.
Conclusion
I will always recommend therapy over using AI, but if you plan to use ChatGPT or another LLM for personal support, please do so thoughtfully so you can keep yourself moving safely along the path toward healing.
If you don't yet have a therapist you trust, especially if you've been relying on AI for your mental health care, schedule a consultation with Harmony Heights today. Click HERE to get started!