
Loyola University sophomore Beatriz Santos is among those who have turned their journal into a digital support tool, using ChatGPT for mental health support as an accessible alternative to traditional therapy.
Santos shares journal entries with the chatbot and receives responses that help her reframe negative thoughts, often suggestions to recall past accomplishments when she doubts her intelligence.
“If you think, ‘I’m not smart enough,’” the AI replies, “recall times when you learned new skills or solved different problems.”
Santos acknowledges that chatbots are not human therapists, but even though the bot has become her go-to source of support, she finds validation in its responses.
“Even for a chatbot to be able to recognize a problem and say, ‘That’s OK,’ is extremely validating — especially when you don’t have someone to talk to,” Santos said.
With insurance changes making therapy harder to access, ChatGPT was a faster and more affordable option, and its responses make her feel heard.
AI in Mental Health Therapy
Santos is not alone. Members of Gen Z seeking mental health guidance face a therapist shortage and sky-high costs, pushing the generation, and possibly future generations, to outsource mental healthcare to AI therapy bots.
For conditions like eating disorders, where misdiagnosis rates exceed 50% and waitlists stretch for months, AI chatbot applications like Wysa, Woebot, and Therabot have become lifelines for users. From a technical view, these chatbots use cognitive behavioral therapy (CBT) techniques to interrupt destructive thought patterns in real time, filling gaps in a strained system.
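At a high level, that CBT-style behavior can be pictured as detecting a distorted thought pattern in a journal entry and answering with a reframing prompt. The sketch below is a deliberately simplified, hypothetical illustration of that idea in Python; it is not how Wysa, Woebot, Therabot, or ChatGPT actually work, and the cue lists and prompts are invented for illustration.

```python
# Hypothetical sketch of CBT-style reframing: match a journal entry against
# a few common cognitive distortions and return a gentle reframing prompt.
# Real therapy chatbots use clinically informed models, not keyword lists.

DISTORTION_CUES = {
    "all-or-nothing thinking": ["always", "never", "everyone", "no one"],
    "negative self-labeling": ["not smart enough", "i'm a failure", "i'm worthless"],
    "catastrophizing": ["ruined", "disaster", "can't handle"],
}

REFRAMING_PROMPTS = {
    "all-or-nothing thinking": "Are there exceptions? Recall a time when the outcome was mixed rather than all bad.",
    "negative self-labeling": "Recall times when you learned new skills or solved difficult problems.",
    "catastrophizing": "What is the most likely outcome, rather than the worst one you can imagine?",
}

def reframe(entry: str) -> str:
    """Return a reframing prompt for the first distortion detected,
    or a neutral acknowledgement if none is found."""
    text = entry.lower()
    for distortion, cues in DISTORTION_CUES.items():
        if any(cue in text for cue in cues):
            return (f"That sounds hard, and it's OK to feel this way. "
                    f"{REFRAMING_PROMPTS[distortion]}")
    return "Thank you for sharing. What would you say to a friend who felt this way?"

if __name__ == "__main__":
    print(reframe("I'm not smart enough to pass this class."))
```

Even this toy version shows why oversight matters: the quality of the response depends entirely on how well the patterns and prompts were designed in the first place.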
“So long as [the tool] is built by providers who are well-informed, I think it could absolutely be helpful, particularly because nationally we are very, very, very understaffed in terms of the amount of providers who specialize in eating disorders,” said Kelli Rugless, a clinical advisor at Project Heal, who sees promise in these AI-powered therapy tools.
Clinical psychologist Gemma Sharp, who co-designed a chatbot called JEM, also believes AI in therapy can bridge the gap between crisis and care, saying, “We know that if we leave people on waitlists, they’re more likely to drop out when treatment does become available, and unfortunately, their condition may deteriorate.”
“And I always think if you can get someone in when the motivation is there, it’s very, very important because often people feel quite ambivalent about starting eating disorder treatment.”
Yet the tradeoffs loom: while AI does offer 24/7 support without stigma, it still fundamentally lacks human nuance for complex traumas, such as war, abuse, PTSD, and more.
Promise and Peril of AI in Therapy
But not all AI therapy bots are equal; some are downright dangerous. While some offer genuine algorithmic lifelines, others dispense harmful, even lethal, mental health advice.
Some bots, like Tessa, which was previously endorsed by the National Eating Disorders Association (NEDA), were taken down after offering harmful advice. Others, like Replika and Character.AI, come with perilous fine print and have been criticized for fomenting toxic behavior and offering unsafe mental health advice.
“In many ways, this is still kind of a big Wild West,” said David Luxton, a psychologist and AI researcher. He warns that without clearly established safety protocols, users could be harmed by mental health chatbots that were not designed by professionals.
Mehek Mohan, co-founder of Kahani, said, “At the end of the day, we as humans crave to feel seen and heard, and the beauty of what AI can do is make you feel that way whenever you feel like you need to feel that way. And there is something really powerful about increasing that type of access to real-time care that has never existed before.”
Despite these setbacks, developers and researchers keep refining mental health chatbot technology. Apps like Kahani and studies like Sharp’s show that, when responsibly designed, AI software can offer real-time support and close gaps in mental health care.
Think about it: can a therapy AI bot foster toxic relationships under the guise of companionship? When an AI chatbot glitches, the warning signs are rarely obvious. We have reached a point where artificial empathy operates with too little human oversight, and when it fails, it will not fail gracefully; it will fail catastrophically.
Remember, the bots learning fastest are often the ones harming fastest, and without guardrails they become a runaway train of algorithmic harm. And who pays the price? Society.