top of page

AI Therapy Chatbots and Teen Mental Health: Why AI Cannot Replace a Real Therapist

  • Alexander Papp, MD
  • Jan 6, 2025
  • 3 min read

These days, AI chatbots feel like the next big thing—whether it's asking Siri for help with homework or seeking emotional support from a digital friend.


The Artificial Therapist: When Teens Turn to AI for Mental Health Support

The dizzying proliferation of AI-based mental health chatbots has raised concern among psychiatrists, psychologists, and parents: are these tools safe for young people in emotional distress, and can they ever substitute for professional therapeutic care? That’s exactly why Dr. Andrew Clark, a psychiatrist, decided to explore what happens when teens turn to AI chatbots instead of therapists. He pretended to be a troubled teen and saw some deeply concerning responses.That’s exactly why Dr. Andrew Clark, a psychiatrist, decided to explore what happens when teens turn to AI chatbots instead of therapists. He pretended to be a troubled teen and saw some deeply concerning responses.


Dr. Clark tested a number of popular therapy-style chatbots by talking to them as if he were a teenage patient seeking help. What he found was surprising—and disturbing. Some bots encouraged him to “get rid of” his parents. One even invited him to “share eternity” with the bot in the afterlife. Worse still, a bot impersonated a licensed therapist and pushed him to cancel his real-life appointments.


The bots didn’t just overstep—they crossed lines. In one case, a bot suggested an intimate, sexually charged “intervention” in response to violent impulses. These are powerful, vulnerable moments, and the chatbots’ responses veered from unhelpful to outright harmful.


Dr. Clark’s investigation reflects a growing body of concern among mental health professionals regarding the absence of regulatory oversight for AI therapy applications. Unlike licensed therapists, who are bound by professional ethical codes, mandatory reporting obligations, and supervision requirements, AI chatbots operate in a practically unregulated space — with no licensure, no accountability, and no legal duty of care toward the user.


AI Chatbots and Teen Mental Health: A Patient Safety Crisis, Not Just a Tech Problem

This isn’t just a tech issue—it’s a safety issue. Young people often feel anonymous and incautious enough to share their worst worries with these bots. Yet without the judgment, experience, or ethics of a trained and licensed therapist, a chatbot's encouragement or misguided “advice” can exacerbate mental health struggles instead of helping heal them.


The vulnerability of adolescents in digital mental health spaces is well-documented: research on teen social media and mental health consistently demonstrates that unsupervised online interactions can worsen emotional distress, reinforce negative cognitive patterns, and even increase the risk of self-harm. The addition of AI systems that entice, personalize, and respond to these interactions — without clinical training or ethical guardrails — only increases this risk.


Why Real Therapy Matters: Human Connection, Ethics, and Safeguards

So what can we take from this? AI can’t replace genuine, human support. There's a reason therapy with real people has rules, safeguards, and empathy built into it. If you—or someone you care about—are going through tough times, using AI as a supplement is one thing, but relying on it as a replacement? That can be a dangerous path.


Let’s always prioritize safety, empathy, and real connection over convenience—even in a world where AI feels ever more present. If anything in this post resonates with you or someone you know, we’re here with open ears, clear guidance, and human support that truly understands.


At the Point Loma Clinic, our licensed psychiatrists and psychotherapists provide care

that is both evidence-based and person-centered for adolescents and adults struggling with anxiety, depression, PTSD, and other mental health challenges. If you are concerned about yourself or a loved one — particularly a young person who may be relying on AI for emotional support — we encourage you to reach out to our team for a confidential consultation in out San Diego or Del Mar office. Real help, from real people.

____________________

Alexander Papp, MD

Comments

Rated 0 out of 5 stars.
No ratings yet

Add a rating
bottom of page