The Guardian - US
Technology
Lauren Aratani

US eating disorder helpline takes down AI chatbot over harmful advice

Tessa, which Neda claims was never meant to replace the helpline workers, almost immediately ran into issues. Photograph: redsnapper/Alamy

The National Eating Disorder Association (Neda) has taken down an artificial intelligence chatbot, “Tessa”, after reports that the chatbot was providing harmful advice.

Neda has faced criticism in recent months after it fired four employees in March who worked for its helpline and had formed a union. The helpline allowed people to call, text or message volunteers who offered support and resources to those concerned about an eating disorder.

Members of the union, Helpline Associates United, say they were fired days after their union election was certified. The union has filed unfair labor practice charges with the National Labor Relations Board.

Tessa, which Neda claims was never meant to replace the helpline workers, almost immediately ran into problems.

On Monday, activist Sharon Maxwell posted on Instagram that Tessa offered her “healthy eating tips” and advice on how to lose weight. The chatbot recommended a calorie deficit of 500 to 1,000 calories a day and weekly weighing and measuring to keep track of weight.

“If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help for my ED. If I had not gotten help, I would not still be alive today,” Maxwell wrote. “It is beyond time for Neda to step aside.”

Neda itself has reported that people who diet moderately are five times more likely to develop an eating disorder, while those who engage in extreme restriction are 18 times more likely to develop one.

“It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positivity program, may have given information that was harmful and unrelated to the program,” Neda said in a public statement on Tuesday. “We are investigating this immediately and have taken down that program until further notice for a complete investigation.”

In a 4 May blogpost, former helpline employee Abbie Harper said the helpline had seen a 107% increase in calls and messages since the start of the pandemic. Reports of suicidal thoughts, self-harm and child abuse and neglect nearly tripled. The union, Harper wrote, “asked for adequate staffing and ongoing training to keep up with the needs of the hotline”.

“We didn’t even ask for more money,” Harper wrote. “Some of us have personally recovered from eating disorders and bring that invaluable experience to our work. All of us came to this job because of our passion for eating disorders and mental health advocacy and our desire to make a difference.”

Lauren Smolar, a vice-president at Neda, told NPR in May that the influx of calls reporting serious mental health crises had presented a legal liability to the organization.

“Our volunteers are volunteers. They’re not professionals. They don’t have crisis training. And we really can’t accept that kind of responsibility. We really need them to go to those services who are appropriate,” she said.

Neda worked with psychology researchers and Cass AI, a company that develops AI chatbots focused on mental health, to create Tessa. In a post on Neda’s website that has since been taken down, Ellen Fitzsimmons-Craft, a psychologist at Washington University in St Louis who helped develop the chatbot, said Tessa was conceived as a way to make eating disorder prevention more widely available.

“Programs that require human time and resources to implement them are difficult to scale, particularly in our current environment in the US where there is limited investment in prevention,” Fitzsimmons-Craft wrote, adding that the support of a human coach has been shown to make prevention more effective. “Even though the chatbot was a robot, we thought she could provide some of that motivation, feedback and support … and maybe even deliver our effective program content in a way that would make people really want to engage.”

In a statement to the Guardian, Neda’s CEO, Liz Thompson, said that the chatbot was not meant to replace the helpline but was created as a separate program. Thompson clarified that the chatbot does not run on ChatGPT and is “not a highly functional AI system”.

“We had business reasons for closing the helpline and had been in the process of that evaluation for three years,” Thompson said. “A chatbot, even a highly intuitive program, cannot replace human interaction.

“With regard to the weight loss and calorie limiting feedback issues in a chat Monday, we are concerned and are working with the technology team and the research team to investigate this further; that language is against our policies and core beliefs as an eating disorder organization,” she said, adding that 2,500 people have engaged with the chatbot and “we hadn’t seen that kind of commentary or interaction”.
