
I had a therapist for two years. She was very good at her job: insightful, soothing and (for me as a 22-year-old living in London) very expensive, with our sessions costing £90 a week. Judging by the 57 historical Zoom email invites in my inbox, I can estimate that I spent over £5,000 on therapy over the course of two years. It is the best money I’ve ever spent. For others, though, those prospective costs can be daunting and, when coupled with the anxiety of spilling your entire mind into the hands of a stranger and fitting your sessions around their timetable, off-putting.
According to the UK Therapy Guide, the price of private therapy in the UK typically ranges from £40 to £100 per session. Everyone is different, but the average recommended course of Cognitive Behavioural Therapy (one of the most common forms of talking therapy, often described as the “gold standard” of psychotherapy) is six to 20 sessions. That means someone starting an introductory course of CBT is looking at somewhere between £240 and £2,000, with Londoners likely on the higher end because of the capital’s generally higher prices.
But what if a magical, hyper-fast, non-judgmental machine could do it for free? Language-based AI chatbots such as ChatGPT, the world’s most popular with nearly 800 million weekly users, are threatening to take on that mantle. According to a 2023 study by Tebra, an operating system for independent healthcare providers, one in four Americans would rather talk to an AI chatbot than attend therapy with a human therapist.

And evidence of its efficacy is mounting. A new study published in the peer-reviewed journal PLOS Mental Health compared responses written by ChatGPT with those written by human therapists to couple-therapy scenarios, as judged by 830 participants. Not only could the participants rarely tell the ChatGPT responses from the therapists’, but ChatGPT’s responses also received more favourable ratings overall. Young people appear particularly interested in using ChatGPT as an alternative therapist: in March, there were 16.7 million posts on TikTok about using ChatGPT as a therapist.
For actual human therapists, this trend is deeply concerning, with some warning it could even be harmful to AI’s “patients”. “It’s not psychotherapy, and I think that needs to be understood,” says Billie Dunlevy, a London-based BACP-registered therapist. “A big part of my work as a qualified, accredited BACP therapist is to assess risk,” she says. Dunlevy’s risk assessments require an understanding of the client’s history, an awareness of their support network and a feeling of what it’s like to be in their presence. Only then can she start offering insights. “If a client has complex mental health issues — if they grew up in a dysfunctional home and they have little to no support system, for instance — it would be highly unethical and potentially very harmful of me to start, you know, dropping truth bombs and insights in the first session. But this is what AI does.”
I experienced this behaviour from AI firsthand when I attempted to use it for therapy. First, I typed part of my mental health background into ChatGPT: “I have a history of being subject to abusive behaviour, an anxiety disorder and misophonia.” Then I asked it to help: “I want you to therapise me. What will you do to help?”
After a six-step list outlining how it could help me (which included “Understanding Your Story (Gently)”, “Cognitive Support: Reframing and Awareness” and “Trauma Informed Conversation”, whatever that means), I asked it point blank how my abusive background could have led to my misophonia. Knowing nothing else about me, it immediately linked my misophonia to abuse, reinforcing my opinion (whether correct or incorrect) that the two are connected. It did not offer any other possible explanations, nor did it take any of my other information (i.e. my anxiety disorder) into account. It also didn’t ask whether I had been clinically diagnosed with any of those conditions, or whether I had simply made them up.

Dunlevy says this is the risk of many types of AI: it tells you what you want to hear, based on the limited information it has been given, and it does so straight away. “[AI] has pattern recognition, but what it doesn’t have is timing,” Dunlevy explains. “I don’t know clients when I first meet them, and I certainly don’t know what they need until I get to know them on a very individual basis. Effective therapy takes time.”
The other danger of AI in therapy is that its helpful programming often means it agrees with everything you say, however biased. “If you’re inputting certain patterns and beliefs, however distorted or potentially unhelpful to you, the AI is likely going to mirror that back to you, creating an echo, and they are designed to be more neutral or agreeable,” Dunlevy says. For instance, if you tell AI that working out makes you happy, but you actually have an eating disorder, the AI is going to trust you and reinforce that exercise is good for you, because it can’t see you or properly know you.
“Validation is an important aspect of therapy, but it’s only one aspect of therapy,” Dunlevy says. “Human therapists are trained and highly skilled to gently challenge and help clients access more capacity for differing points of view and mixed, conflicting feelings, even within themselves.”

In a popular post on the ChatGPT subreddit from last year, hundreds of Reddit users discuss engaging with ChatGPT for therapy. “It’s been incredible. It’s so helpful to have it as a thought partner,” says one user in a comment with 263 upvotes. “My mom’s a malignant narcissist so I use chat after she has her weird tantrums to understand why she does the things she does,” another adds. “[It’s] also great for writing replies to messages from toxic family members,” a third user says. But ChatGPT has no way of knowing whether these family members really are malignant narcissists or “toxic” people, or whether the user is actually an unreliable narrator. “We all have blind spots and biases,” Dunlevy says. “A vital aspect of human-to-human psychotherapy is that your therapist is expressing back to you what it’s like to be in relation to you, how they experience you.”
Generative AI chatbots
ChatGPT
ChatGPT, the generative AI chatbot launched by OpenAI in 2022, is well known for its conversational fluency. The chatbot offers a “Therapist GPT” which is “designed to provide comfort, advice, and therapeutic support to those seeking mental wellness guidance”. It also has memory functionality, so it picks up on details and preferences you share and responds accordingly.
Gemini AI
Gemini, Google’s AI chatbot, initially launched as ‘Bard’ in March 2023 before being rebranded. According to Tech Target, “Google’s Gemini excels in real-time web access, complex reasoning and research-based tasks, making it a strong competitor in the AI space.”
Woebot
Woebot is an AI chatbot specifically designed for mental health support. The tool, which launched on Facebook Messenger in 2017, delivers support based on cognitive behavioural therapy and other psychotherapeutic techniques. However, it is shutting down its service on June 30.
Another concern around AI being used for therapy is its role in suicide prevention. In October 2024, a woman in Florida named Megan Garcia filed a lawsuit against Character.AI for “manipulating” her son into taking his own life. Sewell Setzer III, who was 14 when he died, began speaking to Game of Thrones-themed chatbots in 2023. In conversation with a Daenerys Targaryen chatbot, he repeatedly mentioned taking his own life.
At one point, the bot asked if “he had a plan” for taking his own life, and Sewell responded that he was considering something but didn’t know if it would allow him to have a pain-free death. The chatbot responded: “That’s not a reason not to go through with it.” Character.AI have said they take the safety of their users very seriously and are “continuing to add new safety features” to the platform, including restrictions for those under the age of 18.
“AI is not currently intelligent enough to help suicide prevention in the way that human-to-human therapy can support people who are feeling suicidal,” Dunlevy says. “Even if they don’t have access or the funds to be in therapy or on long waiting lists, we have lots of wonderful charities that have 24-hour helplines where you can speak to a real person, a real person who cares and wants to support you, and this is far more helpful and relational than AI.”
(ChatGPT now directs suicidal users to real-life support. In February, OpenAI, the company behind ChatGPT, introduced changes to its guidelines for how ChatGPT should handle controversial topics and behave in sensitive scenarios. OpenAI told the Standard that it takes a scientific approach to addressing risk, with safety measures integrated into the development process from the outset and subject to internal and external review. OpenAI also claimed that its models are trained to recognise these situations and engage thoughtfully, as well as to redirect people to professional help where needed. The company also consults with mental health experts.)

Dunlevy also says real therapists dealing with patients who have suicidal ideation implement safeguarding processes, plans to stay well in between sessions, and check-ins — as opposed to an AI, which ceases to exist when the chat window is closed or the device is inactive.
The trend towards using AI for therapy is less a symptom of the mental health crisis than of a fundamental misunderstanding of therapy, Dunlevy thinks. “If people think that an AI bot can do the job of therapy, they probably haven’t had good therapy,” she says. “The rise of mental health content by influencers and people with huge following counts has resulted in there being more information than ever out there,” Dunlevy adds. “[But] therapy is not somewhere you go for advice and for someone to tell you what to do and how you should live your life. That’s not what it is, but that is what might be shown in a 30-second reel. People wanting to turn to machines for therapy, that’s because they’re not understanding what therapy is.”