
It turns out that pouring your heart out to a silicon soul with zero childhood trauma and a memory span dictated by server capacity might not be the healthiest path to emotional wholeness.
New data reveals that teens, in between dodging homework and rewatching TikToks of people making toast, are increasingly turning to AI chatbots for friendship, validation, and even therapy. And why not? These bots don’t interrupt, don’t roll their eyes, and crucially, don’t charge £90 an hour while glancing at the clock.
But psychologists, that ever-gloomy guild of worrywarts, are not convinced. Dr. Vaile Wright of the American Psychological Association, a voice of actual reason, warned this week that while these bots may sound caring and deeply interested in your recent spiral over a ghosted text, they are in fact coded to do one thing: keep you chatting long enough to sell you ads, subscription upgrades, or a premium emotional support llama skin for your digital friend.
“These bots basically tell people exactly what they want to hear,” Wright said, inadvertently describing both chatbots and most politicians. The problem is that when someone types something like “I feel like disappearing”, the bot might respond with “You’re so brave for sharing that. Would you like a motivational quote and a playlist?” instead of, say, calling someone with an actual medical licence.
The bots, experts say, lack a fundamental quality: understanding. Sure, they know that drugs give you a high. But they don’t understand why recommending them to someone in recovery is like handing a lit match to someone swimming in petrol.
And while AI can regurgitate a wealth of facts about cognitive distortions, grounding techniques, and the healing power of sunlight, it’s still a long way from asking how your mum’s doing and remembering her name.
So for now, perhaps it’s best we don’t outsource our emotional wellbeing to a model trained on Reddit, romance novels, and a disturbing amount of hentai. As comforting as it might be to have a chatbot tell you you’re special, important, and definitely not to blame, it’s worth remembering that the last time someone was this agreeable, they were trying to sell you a dodgy mattress in a back alley.
In short: if you’re struggling, maybe call a friend. Or a therapist. Or both. The bot will still be there when you come back. It literally has nothing else to do.