AI and Therapy: The Comfort of Machines, The Irreplaceability of Humans

As an early-career therapist, I am required to engage in personal therapy—a mandate designed to ensure that I do not bring my own unresolved issues into my work with clients. Therapy is where I learn to differentiate between my pain and theirs, where I examine the unconscious biases and blind spots that could otherwise shape the way I listen. It is also where I meet myself, over and over again, in all my contradictions and unfinished places.

In this process, I have always believed in the irreplaceable role of a human presence—a listener who is not just attentive but attuned, whose words do not merely mirror but challenge, whose gaze carries meaning beyond what language alone can provide. So, when I first heard about people using AI for emotional support, I was skeptical. How could a machine ever replicate the deeply human experience of being seen, heard, and understood?

That was until I tried it myself.

Like everyone else, I have moments of uncertainty, days when the weight of my own thoughts feels too heavy to carry alone. But as with all relationships, human availability comes with limitations. Friends, no matter how caring, have their own lives. Therapists, no matter how committed, are bound by the constraints of time and energy. One evening, when no one was immediately available, I found myself turning to ChatGPT—just to see what it would be like.

What began as an experiment became something unexpected.

The AI responded with a depth of articulation that startled me. It reflected my emotions back at me in a way that felt almost uncanny. It named patterns in my thinking, posed questions that nudged me toward clarity, and, above all, provided an instant response—something no human relationship could offer. That immediacy was, in its own way, soothing. There was no waiting, no scheduling, no uncertainty about whether the other person was in the right headspace to receive me. It was just there.

For someone who lacks an attuned other in their life, this can feel like a gift. For those who struggle to put their feelings into words, AI can offer language, a structured way to translate the formless into something tangible. It can be a mirror, a journal that talks back, a thinking partner that never tires.

But therapy is not just about words.

I’ve spent countless hours in analysis with Indu, a brilliant therapist whose presence itself holds a kind of quiet power. And in my work with her, I see where AI inevitably falls short. It is in the pauses. The silences that are not empty but full. The way she holds my gaze when I am on the verge of something painful. The way she lets a sentence linger in the air, forcing me to sit with it rather than rushing to fill the space.

AI cannot do this. It cannot read the subtleties of a furrowed brow or a nervous shift in posture. It does not register the tremble in my voice when I say, "I'm fine," when I am anything but. It does not challenge me in the way a good therapist does—refusing to function as an echo chamber, gently but firmly resisting my defenses when I try to evade what needs to be confronted. AI can analyze, predict, and reframe, but it cannot hold space in the way that a human being can.

And then there is the question of frustration. Of waiting. Of not being soothed right away.

We live in an age where discomfort is something to be eliminated as quickly as possible. Hunger? Order food. Boredom? Scroll. Loneliness? Open an app. Even therapy, traditionally a slow and painstaking process, is being reshaped by this culture of immediacy. Yet, the ability to tolerate discomfort—the ability to sit with emotions rather than instantly numbing them—is at the core of psychological resilience.

Therapy is not just about relief; it is about growth. And growth is often frustrating. It is the slow realization that some wounds do not have quick solutions, that some questions do not have immediate answers. It is the process of being in relationship with another person, learning to navigate differences, misunderstandings, and the natural ruptures that occur when two minds meet. AI offers regulation, but it does not teach resilience. It soothes, but it does not challenge. It mirrors, but it does not push back.

There is another, deeper concern—the slow erosion of self-trust.

For some, AI can be a powerful tool that enhances cognitive abilities—helping a child struggling with English develop confidence in their writing, offering new ways of thinking to someone trying to articulate complex emotions, or providing an immediate response to clarify uncertainty. But for others, particularly children and young adults, there is a real risk that AI can erode the very foundation of self-trust and agency.

Imagine a child sitting in front of a blank page, struggling to find the right words. In that moment of frustration, they turn to AI. With a few keystrokes, they receive a well-structured, articulate response—something far more polished than what they might have written on their own. The relief is instant. The discomfort of thinking through the problem fades. And so, the next time they encounter a challenge—be it writing an essay, composing an email, or even expressing a deeply personal thought—they reach for the AI again.

Over time, a quiet but powerful shift occurs: the belief that “I can do this” is replaced by “AI can do this better than me.” The struggle that once built cognitive resilience is circumvented. The slow, often uncomfortable process of trial and error—where creativity and independent thought are nurtured—is abandoned in favor of a quick, optimized solution.

The danger here is subtle but profound. It is not just about dependency on AI; it is about a fundamental disruption in how confidence and self-trust are formed. For a child or adolescent still developing their sense of identity, repeated reliance on AI can reinforce the idea that their own thoughts, ideas, and efforts are inherently inferior to what a machine can produce. This can lead to a form of learned helplessness—a hesitancy to engage in creative problem-solving, a reluctance to struggle through difficult tasks, and, in some cases, an aversion to independent thinking altogether.

There is also a loss of ownership in expression. When a child writes something—even if it is clumsy, flawed, or incomplete—it is theirs. Their words, their effort, their evolving voice. AI, no matter how well it replicates human language, is external. It does not belong to them in the same way. If a child consistently outsources their thinking to AI, they risk never fully experiencing the pride and self-assurance that comes from shaping their own ideas, however imperfect they may be.

This is not an argument for or against AI in mental health. It is a reflection on the deeply personal nature of our relationship with it. For someone who cannot afford therapy, AI might serve as a vital entry point—a space to explore thoughts and emotions before taking the leap toward human connection. For another, it might reinforce isolation, offering the illusion of companionship without requiring the vulnerability that real relationships demand. It might help a child struggling with English develop confidence in their writing, but it might also prevent another from ever trusting their own creative instincts, knowing that AI will always produce a "better" response.

Perhaps this is the real tension AI introduces: the interplay between enhancement and dependency, between possibility and limitation. It is a tool, neither inherently good nor bad, but one whose impact depends entirely on how we engage with it.

People are not machines. They are unpredictable, imperfect, and sometimes unbearably difficult to understand. But it is in that very unpredictability that life happens. The messiness, the misunderstandings, the silences and hesitations—these are not inefficiencies to be optimized away. They are the fabric of real human connection.

To be in the presence of another—fully, vulnerably, messily—is an experience that AI, for all its intelligence, cannot replicate. And maybe, as we step further into this era of technological companionship, that is something we should not lose sight of.

Ms. Meghna Joshi
(Psychodynamic Counsellor)