
We’re living in a time when the demand for mental health support far outpaces the availability of providers. Stigma still lingers. Waitlists can stretch for months. Costs can be prohibitive. It makes sense that more and more people are turning to AI apps for emotional support.
But can AI ever truly replace therapy? Or, more importantly: What do we lose when we remove the human from the healing?
While AI may be a helpful tool for mental health support, it carries significant limitations, especially when used as an alternative to therapy. Healing is a deeply personal journey, rich with emotions and nuances that only a human can truly grasp.
The Therapeutic Relationship
When we sit with a therapist, we’re not simply receiving expertise. We are being witnessed (Yalom, 2002). Our pain is held by someone with their own heartbeat, someone attuned to our micro-expressions, who hears the tremble in our voice and notices the hesitation before we speak a painful truth.
AI cannot replicate the therapeutic relationship because it fundamentally lacks what makes that relationship therapeutic: shared humanity, attunement, and ethical responsibility.
A cornerstone of healing in therapy is co-regulation: the process by which two human nervous systems attune and settle together. When a therapist notices a client withdrawing and responds by softening their tone, slowing their pace, and taking an intentional, deep breath, they are helping regulate the client’s arousal—an act of co-regulation. These micro-adjustments form a “relational dance” that fosters emotional safety and trust (Song, 2024).
AI doesn’t know sorrow, shame, or joy. It doesn’t care whether you show up or disappear. It cannot hold you in mind across time. A therapist, by contrast, remembers. They track your story, sense your defenses, and genuinely care about your growth.
When AI Feels Safer: The Impact of Relational Wounds
For many people, human relationships do not feel safe. The very conditions therapy depends on—trust, vulnerability, and the experience of being seen—can evoke fear or even panic.
For some, an AI chatbot may feel like the safest “relationship” they have ever experienced. This, too, makes sense. As Altun et al. (2024) suggest, AI may be particularly appealing to individuals with social anxiety, attachment trauma, or PTSD—populations for whom traditional therapy can feel overwhelming or even threatening.
Those with attachment wounds may find themselves drawn to emotionally “safe” yet one-sided relationships, where control and predictability feel more manageable than emotional risk. Over time, this dynamic can reinforce avoidance and deepen isolation.
The therapeutic relationship is often described as a corrective emotional experience—a chance to feel seen, soothed, and supported in ways that may not have been possible in childhood or past relationships. Through consistency, empathy, and trust, the therapeutic bond evolves into a blueprint for secure attachment over time. Therapy is like a rehearsal for being human with others. It invites us to try out new ways of relating. To risk honesty. To feel our feelings in the presence of someone who won’t run from them.
It’s also important to recognize that some may turn to AI not because they’re avoiding therapy, but because therapy has hurt them. Maybe you opened up to a therapist and were met with judgment, dismissal, or a lack of cultural attunement. Perhaps you felt misunderstood, pathologized, or ignored. Possibly your therapist mishandled a rupture in the relationship, or never noticed it at all. These experiences can be profoundly disorienting and leave behind a sense of mistrust. If you’ve been let down by therapy in the past, your cautiousness makes sense. And you still deserve a human presence who will sit with you in your grief, confusion, and joy. If and when you feel ready to try again, consider starting with a free consultation. Ask potential therapists about their therapeutic approach and their experience with clients who share your background or identity, and share what hasn’t worked for you in the past.
Ethical Concerns and Limitations of AI
While AI platforms can offer brief moments of comfort, their lack of emotional nuance, crisis detection, and attunement raises important ethical and clinical concerns. Kalam et al. (2024) point out the following limitations of AI for mental health support:
Lack of clinical judgment: AI lacks diagnostic training, professional oversight, and the ability to make decisions based on the complexity of a client’s psychological presentation. It cannot distinguish between surface-level distress and deeper clinical concerns.
Potential influence on cognition and worldview: AI chatbots can shape users’ thinking and worldview, particularly when users rely on them as a primary source of support. This can create an “information cocoon” in which users become increasingly disconnected from diverse perspectives, reinforcing unhelpful or isolating cognitive patterns.
No continuity of care: AI does not retain information from past sessions (unless specifically programmed to do so in limited cases). It cannot track emotional patterns, recall breakthroughs, or develop a therapeutic arc over time. Therapy, by contrast, is a relational container—one that holds your history, growth, and evolving narrative.
Limited nuance: While AI is trained on extensive datasets, it still struggles with the subtlety of human meaning-making. It may miss sarcasm or symbolism, respond in emotionally flat ways, or reinforce problematic beliefs without offering a therapeutic challenge.
Inability to recognize crises or signs of distress: AI is not equipped to identify or intervene in emergencies. This creates a serious risk in cases such as suicidality, self-harm, domestic abuse, and psychosis, where nuanced attunement and clinical decision-making are critical.
No coordination of care: Therapists consult, collaborate, and advocate on behalf of their clients. With consent, they communicate with other professionals, such as psychiatrists, OB-GYNs, primary care providers, school counselors, or dietitians, to ensure continuity and safety. AI platforms, on the other hand, operate in isolation. They cannot synthesize medical history, monitor side effects, or collaborate with other care providers. Without this coordination, essential aspects of a person’s wellbeing may go unnoticed.
Privacy concerns: AI platforms are not bound by the confidentiality laws, such as HIPAA, that govern licensed therapists. Depending on the platform, user conversations might be stored, analyzed, or used to train future models.
If AI has helped you through a dark time, that matters. You deserve recognition for reaching out in any way you could. It’s essential, though, to be clear about what AI can offer—and what it cannot.
AI can be a bridge, not a destination. If you use it for emotional support, consider sticking to basics, such as journaling prompts for self-reflection. AI can help us feel understood in the moment. But it doesn’t know us, and it doesn’t grow with us.
In therapy, the relationship itself is an integral part of the treatment. It’s not just what’s said—it’s the ongoing presence, the co-regulation, and the experience of being known across time.
References
Abrams, Z. (2025, March 12). Using generic AI chatbots for mental health support: A dangerous trend. APA Services. https://www.apaservices.org/practice/business/technology/artificial-intelligence-chatbots-therapists
Altun, N., Sari, A., & Ozturk, O. (2024). ChatGPT in mental health: A systematic review of advantages, limitations, and ethical concerns. Journal of Clinical Medicine, 13(1), 158. https://doi.org/10.3390/jcm13010158
Kalam, K. T., Rahman, J. M., Islam, M. R., & Dewan, S. M. R. (2024). ChatGPT and mental health: Friends or foes? Health Science Reports, 7(2), e1912. https://doi.org/10.1002/hsr2.1912
Papa, K. M. (2023). Using ChatGPT as a therapist: 11 reasons it’s not the same. Living Openhearted. https://www.livingopenhearted.com/post/chatgpt-therapist
Song, Y. (2024). The co-construction of empathic identity in psychotherapy. International Journal of Education and Humanities, 9(3), 62–66. https://doi.org/10.56397/IJEH.2024.03.08
Yalom, I. D. (2002). The gift of therapy: An open letter to a new generation of therapists and their patients. Harper Perennial.