The Risks of Using AI for Therapy: Why Human Care Matters

Written by Care New England | October 8, 2025

AI tools like ChatGPT are amazing for lots of things - drafting emails, summarizing research, brainstorming dinner ideas. Lately, though, some people have started turning to these chatbots for something far more delicate: mental health support. But there are serious risks when AI is used as a substitute for professional, supervised therapy.

What AI Can’t Do - But Therapy Can

Therapy is more than a sympathetic reply. A skilled therapist brings clinical judgment, training, and a duty of care - the ability to notice subtle warning signs, assess safety, apply evidence-based treatments, and make judgment calls when a person is in crisis.

Current chatbots:

  • Lack clinical judgment – They can reflect words back or provide coping tips, but they don’t understand context the way a trained clinician does, especially when someone is at serious risk.
  • Can give misleading or unsafe responses – Some models will over-validate or repeat what a user says rather than challenge harmful thinking or intervene when needed.
  • Don’t carry legal or ethical responsibility – If a human therapist makes a harmful error, there is a framework for accountability and professional oversight. It’s unclear who would be responsible if a chatbot’s suggestion led to a bad outcome.
  • May not protect privacy the same way clinical care does – Conversations with some AI services aren’t governed by medical privacy laws such as HIPAA; information you share could be used to train models or otherwise be exposed.

How to Use AI Safely

If you decide to use an AI chatbot for emotional support, treat it as an auxiliary tool, not a replacement for therapy or medical advice.

  • Use it for journaling prompts, mood tracking, or to practice phrasing things you want to bring up with a clinician.
  • Don’t share identifying personal details (full name, address, Social Security number, medical records).
  • Verify any medical or treatment advice with a licensed clinician before acting on it.
  • Ask about privacy. Read a service’s terms and privacy policy to understand how your data may be used.

Choosing a (Human) Clinician

If you’re struggling with symptoms that interfere with daily life, dealing with trauma, or experiencing suicidal thoughts, look for a licensed mental health professional.

Human therapists and psychiatrists can:

  • Assess safety and risk, and intervene if you’re in crisis.
  • Create a treatment plan tailored to your history and needs.
  • Coordinate care with other providers and advocate on your behalf.
  • Take professional responsibility for clinical decisions.

At Butler Hospital, our outpatient teams and crisis services are here to help you navigate mental health needs safely. If you’re unsure where to start, you can find resources and contact information for outpatient and emergency services on Butler.org.

The Bottom Line

AI is powerful, and it can be a useful supplement for some tasks. But when it comes to therapy and medical advice, the best care comes from trained, licensed human clinicians who can take responsibility for diagnosis, safety, and ethical decision-making. If you or someone you care about is in crisis or needs clinical care, reach out to a licensed provider or an established crisis line, such as the 988 Suicide & Crisis Lifeline, rather than relying on a chatbot.