Reason
Emma Camp

Maybe AI Therapists Will Suck. That Doesn't Mean We Should Ban Them.

Artificial intelligence is evolving fast. Large language models such as ChatGPT and Gemini can write papers and code, make quirky art, and attempt deep research and complex problem-solving. Now, AI is venturing into a more personal role: therapist.

With loneliness on the rise and many Americans struggling with mental health, entrepreneurs such as Neil Parikh, co-founder of the AI therapy program Ash, argue that AI can help when traditional therapy is inaccessible.

"Hey, Ash, our conversation earlier was super helpful. The thought's still there, but it's not bothering me nearly as much as it was before," Parikh says into his phone in a video demonstrating the tool's therapeutic know-how.

"I'm glad to hear," a calming female voice replies. "That's the power of cognitive diffusion. The thought didn't disappear, but you got a little distance from it. You're not your thoughts, you're the observer of them."

A new startup called Friend recently shipped its $129 wearable AI companions, which currently work only with iPhones. The device resembles an AirTag on a necklace and monitors users' thoughts and feelings by listening to conversations and surroundings via its microphone, then texts responses like a real-world friend.

Outside the tech world, many people are alarmed at the number of users who go to AI models for life advice. "To the AI, the patient typing into the box is always reasonable, always doing their best, and always deserving of a gold star," author Derek Thompson wrote in an August newsletter on AI therapists. "By contrast, a good therapist knows that their patients are sometimes unreasonable, occasionally not doing anything close to their best, and even deserving of a smack upside the head."

The suggestion that AI models could serve a therapeutic role for some users has caused a legal backlash. In August, Illinois Democratic Gov. J.B. Pritzker signed a law banning AI models from being used in "therapeutic decision-making." While the law was ostensibly designed to protect patients from subpar treatment at the hands of inhuman AI models, Pritzker signaled that protecting therapists from competition was also a factor: "This will protect patients from unregulated and unqualified AI products, while also protecting the jobs of Illinois' thousands of qualified behavioral health providers," reads his press release announcing the law.

It's not clear how the new law will actually protect vulnerable people struggling with untreated mental health problems. While there's reason to be skeptical that AI mental health interventions are likely to work, no clear evidence indicates that they're dangerous enough to merit state bans.

"While multiple officials in Illinois have exclaimed how the state's new restriction against AI therapy tools will protect public safety, none mentioned how many people there currently go untreated due to scarcity
and cost," says Greg Beato, co-author of Superagency: What Could Possibly Go Right with Our AI Future. "This regulation clearly protects the mental health establishment. But whether it serves people stuck on long waiting lists or those who never even bother to seek treatment because of access and affordability issues is another question entirely."

It is possible to be skeptical both of the supposed effectiveness of AI therapy and of sweeping state regulations. Americans are spending less and less time in physical contact with one another; it seems ill-considered to suggest that the solution is more time spent in the digital world. Yet even traditional therapy is not always an unalloyed good. One Australian study that divided teenagers into two groups, enrolling them either in a therapy program or a typical health class, found that the participants who received therapy ultimately reported worse mental health than those who didn't.

Even if an AI program is designed to give good advice, to push users away from cognitive distortions and narcissistic thinking, it still lacks something a real therapist has—a human touch. When we're feeling lonely and upset, we want good advice, sure, but we also want someone who can really listen to us—something an AI isn't able to do. Yet.

