Written By: Care New England on October 8, 2025
AI tools like ChatGPT are useful for many things - drafting emails, summarizing research, brainstorming dinner ideas. Lately, though, some people have started turning to these chatbots for something far more delicate: mental health support. But there are real risks when AI is used as a substitute for professional, supervised therapy.
Therapy is more than a sympathetic reply. A skilled therapist brings clinical judgment, training, and a duty of care - the ability to notice subtle warning signs, assess safety, apply evidence-based treatments, and make judgment calls when a person is in crisis.
Current chatbots cannot offer that clinical judgment or duty of care.
If you decide to use an AI chatbot for emotional support, treat it as an auxiliary tool, not a replacement for therapy or medical advice.
If you’re struggling with symptoms that interfere with daily life, dealing with trauma, or experiencing suicidal thoughts, seek care from a licensed mental health professional.
Human therapists and psychiatrists can assess safety, provide evidence-based treatment, and take responsibility for your care.
At Butler Hospital, our outpatient teams and crisis services are here to help you navigate mental health needs safely. If you’re unsure where to start, you can find resources and contact information for outpatient and emergency services on Butler.org.
AI is powerful, and it can be a useful supplement for some tasks. But when it comes to therapy and medical advice, the best care comes from trained, licensed human clinicians who can take responsibility for diagnosis, safety, and ethical decision-making. If you or someone you care about is in crisis or needs clinical care, reach out to a licensed provider or an established crisis line rather than relying on a chatbot.
Disclaimer: The content in this blog is for informational and educational purposes only and should not serve as medical advice, consultation, or diagnosis. If you have a medical concern, please consult your healthcare provider or seek immediate medical treatment.
Copyright © 2023 Care New England Health System