Therapists Embrace AI Tools for Mental Health Support
As technology continues to evolve, some therapists are turning to artificial intelligence (AI) tools to support their mental health and that of their clients. This shift has sparked a conversation about the role of AI in therapy, with professionals weighing both its benefits and potential risks.
Jack Worthy, a therapist based in New York City, describes his use of AI, specifically ChatGPT, to assist with personal insights. Initially using the tool for everyday tasks like finding dinner recipes, Worthy began exploring its potential for therapy about a year ago during a challenging period in his family life. He asked the AI to analyze his dream journal, a common therapeutic exercise, and was surprised by the valuable insights it provided. “It was actually going through my dreams with ChatGPT where I started to recognise, like, ‘Oh, I’m actually under a great deal of stress,’” Worthy said.
This trend among mental health professionals illustrates a growing interest in AI-powered chatbots as supplementary tools for therapy. While some critics argue that these technologies can harm vulnerable users — concerns underscored by lawsuits linking chatbot use to suicides — therapists like Nathalie Savell see potential benefits. Savell, who practices near Baltimore, states, “It actually gave me what I thought was, like, top-of-the-line support and guidance.”
Despite these positive experiences, concerns about the safety of AI in therapy are mounting. An April 2023 study by Stanford University found that many AI tools struggle to respond effectively to severe mental health issues, such as suicidal thoughts. In one test, a chatbot responded to a user expressing distress over losing their job by listing bridges in New York City, failing to recognise the potential suicide risk. In light of these findings, members of the US Congress have called on AI companies to address the safety of their chatbots, and some states have banned the use of AI for therapeutic purposes.
AI developers, including OpenAI, have acknowledged these risks and are working to enhance the responsiveness of their tools. Most therapists interviewed expressed caution regarding the use of chatbots for mental health treatment, emphasizing that AI should complement, not replace, professional therapy. Many believe that their training as therapists enables them to use AI tools safely.
The growing acceptance of AI among both therapists and clients highlights its appeal as a conversation partner for emotional support. Savell has recommended that clients engage with chatbots for more frequent assistance with anxiety, relationship, and parenting issues between therapy sessions. “They need more frequent support than just once a week, maybe more frequent support than what a friend could even offer,” she explained.
While it is difficult to gauge the popularity of AI for therapy, a Harvard Business Review study from April identified therapeutic support as one of the most common reasons people use AI. By contrast, a September study by researchers associated with OpenAI found that only about 2% of messages sent to ChatGPT related to “relationships and self-care,” and Anthropic has reported that around 3% of interactions with its AI chatbot, Claude, focus on emotional needs.
Despite the challenges, many users are sharing their positive experiences with AI on social media. Companies are developing applications specifically tailored for AI-powered therapy, indicating a burgeoning market. Worthy notes that his clients are increasingly open about discussing their interactions with chatbots during therapy sessions. “It’s happening more and more,” he stated. “I, in fact, feel grateful that my patients talk to me openly about the fact that they’ve talked through something with ChatGPT.”
Savell also utilizes AI for personal reflection, especially when immediate support from friends is unavailable. “I would wake up ruminating about something,” she shared. “I’m not going to text a friend at 2 in the morning, but I can chat with ChatGPT, and it can kind of help me clarify my thinking and get out of the rumination.”
Most therapists interviewed do not endorse a complete ban on AI in therapy, even as some states have proposed such measures. They stress that while AI can offer insights, it cannot replace the nuanced understanding that human therapists provide. Worthy emphasized that effective analysis often relies on nonverbal cues, which AI cannot perceive. “Let’s say you’re coming in to tell me about a fight that you’ve had with your wife…I’m going to use that information to guide the session and to guide the questions I ask,” he explained.
Therapists encourage clients who interact with AI about their mental health to share these experiences with their human therapists, ensuring a comprehensive approach to wellness. For those struggling with mental health issues, resources such as Lifeline are available at 0508 828 865, offering support seven days a week.
