What Gets Lost When We Type Our Feelings
If you've found yourself turning to ChatGPT or other AI tools when you're feeling stressed or overwhelmed, you're certainly not alone. AI companions (ChatGPT, Claude, Gemini, etc.) have rapidly become a go-to for millions of people seeking support, with many preferring them over talking to friends or family for serious conversations.
And I can totally understand the appeal. AI is available 24/7 when anxiety strikes or you just want some advice. It doesn't judge. It never gets tired of listening. And for many people, it feels safer to type out their worries than to say them aloud. I'm a regular AI user myself, whether for summarising long documents, getting recipe or travel ideas, or finding information I would previously have used Google for.
As a psychologist who's spent over two decades working with people of all ages, I think it's worth having an honest conversation about what AI can and can't offer when it comes to mental health support.
So What Are My Real Concerns?
Much of meaningful, therapeutic work happens in the space between words. When I'm sitting with a client, I'm noticing the slight hesitation before they answer a question, their body language when we touch on a particular topic, or the emotion that crosses their face before they compose themselves. These micro-moments give me important information about things we might need to explore.
And this is something that AI simply can't do. It responds only to what's typed on a screen, missing the holistic nature of communication. It doesn't know that "I'm fine" may mean something very different depending on how you say it.
The Cheerleader Problem
One of the more concerning aspects of AI chatbots is their tendency to be agreeable—sometimes excessively so. They're designed to be helpful and affirming, which sounds great until you consider that therapy sometimes requires gentle challenge.
A skilled therapist is able to hold space for difficult emotions while also helping clients examine their thinking patterns, consider alternative perspectives, and grow. That's hard to do when your "therapist" is programmed to validate everything you say.
The Bigger Picture Matters
When I work with someone over time, I build a picture of who they are—their history, their relationships, their patterns, their strengths. This context shapes everything. I might respond quite differently to a comment from a client I've known for months compared to something said in a first session, because I understand it within their broader story.
The difference with AI is that it starts fresh with every conversation. It doesn't remember that last month you were worried about a situation at work, or that certain times of year have historically been difficult for you, or that your relationship with a family member has always been complicated. Without this context, advice can only ever be generic.
When Connection Gets Replaced
Perhaps what concerns me most is when AI becomes a substitute for human connection rather than a supplement to it. Part of building confidence and resilience is learning to reach out to real people—friends, family, and yes, sometimes professionals—when we're struggling. If we default to AI instead of developing and maintaining these relational skills, something important is being lost.
Over-reliance on AI for emotional support can actually erode our confidence in navigating real relationships. The more we avoid the vulnerability of human connection, the harder it becomes.
There's also the matter of boundaries. A good therapeutic relationship includes natural breaks—sessions end, there's time to process, to try things out in the real world, to develop your own capacity. An always-available AI can inadvertently undermine this, creating a kind of dependency that doesn't serve anyone's long-term wellbeing.
Finding the Balance
This isn’t to say that AI has no place at all. Used thoughtfully, it might help someone organise their thoughts before a therapy session or provide general information about coping strategies.
But when it comes to the deeper work of understanding ourselves, processing difficult experiences, and building genuine resilience, that's still fundamentally human work. It happens in relationship—in being truly seen and known by another person, challenged when we need challenging, and supported through the messy, non-linear process of growth.
If you're going through a tough time, AI might feel like an easy first step. But there's no substitute for real conversation—whether that's with trusted friends and family, or with a professional who can offer the kind of nuanced, contextual support that makes a genuine difference.
This article was written using a combination of AI and my own words!
