The National Eating Disorder Association is disbanding its long-running phone help line. NEDA is laying off the small team of human staff that coordinates and runs the helpline, effective June 1. Instead, the nonprofit plans to offer people seeking help access to an AI-powered chatbot named “Tessa” starting next month, as first reported by NPR on Wednesday and confirmed to Gizmodo by NEDA via phone and email.
Employees were notified of the change, and of their firing, just four days after they successfully unionized, according to a blog post written by helpline associate and union member Abbie Harper earlier this month. Members of Helpline Associates United say that, by firing them, NEDA retaliated against the union. The labor organization has repeatedly called the move union busting on its official Twitter account and elsewhere.
“NEDA acknowledges that this is a long overdue change and that AI can better serve those with eating disorders,” Harper wrote in the blog. “But don’t be fooled—this is not about a chatbot. This is about union busting, plain and simple.”
Helpline workers say they felt under-resourced and under-staffed for what was being asked of them. By unionizing, they hoped to gain more support. “We request adequate staffing and ongoing training to support our changing and growing Helpline, and opportunities for promotion to grow within NEDA,” Harper wrote. “We didn’t even ask for more money.” The workers filed unfair labor practice charges with the National Labor Relations Board, according to a May 4 blog post.
In response to questions about the accusations, NEDA declined to comment. “At this time, we are not at liberty to discuss work matters regarding our employees. We are always grateful to our staff and volunteers and respect their needs and privacy,” organization spokeswoman Sarah Chase told Gizmodo via email. She would not provide further details on the timing of the firings relative to the union vote in a follow-up phone call.
NEDA is the largest eating disorder-focused nonprofit organization in the U.S. Its stated mission is to offer support and resources for recovery to people affected by eating disorders. For more than 20 years, people seeking guidance related to eating disorders have been able to turn to the toll-free NEDA Helpline.
Now the phone service, run by a small team of six paid staff and about 200 volunteers, is gone. Calling the number, (800) 931-2237, instead leads to a pre-recorded message. “We will no longer accept calls to our Helpline. For other contact methods currently available please see our website,” the recording says.
The option to chat with a human representative on the NEDA Helpline through the nonprofit’s website still appears to be working, as of writing. Gizmodo tested it and received a response from someone who purported to be a trained human. However, that online chat function is also set to disappear on June 1, Chase told Gizmodo.
Note: The crisis text line advertised on the NEDA website, which is staffed by people, will continue, but only because that 24/7 support service is provided by a separate nonprofit (literally named Crisis Text Line) contracted by NEDA. The option to text “NEDA” to 741741 and be connected to a volunteer remains available.
But otherwise, as helpline workers approach their last working days and the volunteer network disintegrates, NEDA plans to pivot to Tessa, a mental health chatbot developed by the company Cass (formerly X2AI). Tessa is a separate, older AI model, distinct from OpenAI’s buzzy ChatGPT. It was created with grant funding from NEDA in 2018 under the guidance of two behavioral health researchers: Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University, and C. Barr Taylor, a psychiatrist at Stanford University.
A different version of Tessa, called Tess, is used more widely, beyond eating disorder support. For example, it is offered through U.S. Customs and Border Protection’s Employee Assistance Program as a mental health service.
According to NEDA’s description, Tessa consists of pre-set modules that guide users through an eating disorder prevention program. “It’s not going off script,” Chase told Gizmodo. Beyond those pre-set modules, the nonprofit also intends for Tessa to “guide individuals to educational resources on our website.”
NEDA claims that the chatbot is “not a replacement for the Helpline.” That’s despite the fact that it is, literally, replacing the helpline, which, again, will cease to exist in any form as of June 1. Tessa is “just a different program,” Chase emphasized by telephone. At one point she also claimed that Tessa isn’t even an AI, despite NEDA’s own press materials repeatedly describing the chatbot as such. In an email she wrote, “simulation chat is helpful, but it runs a program and doesn’t learn as it goes.”
“We have moved on [from the Helpline],” said Chase. “The Helpline started in 1999 and served a purpose then, which rapidly exhausted itself.” She said that, in NEDA’s view, a telephone-based helpline no longer best serves people’s needs in the internet era. Instead, NEDA plans to shift its focus to improving the online experience. It will launch a new version of its website by the end of 2023, she said.
Except that people still came to the NEDA Helpline. Many people. The organization saw a significant increase in the number of calls and messages in 2020 and 2021, corresponding to the first years of the pandemic. The numbers leveled off slightly in 2022, but the helpline still fielded 69,718 requests for help last year, according to NEDA’s own figures.
By the organization’s account, Tessa is a better way to serve people seeking mental health guidance in the internet age than its helpline was. But even the researchers who developed the AI-assisted program seem to disagree. “It’s not an open tool for you to talk to and feel like you have access to kind of a listening ear, maybe like a helpline,” Fitzsimmons-Craft told NPR. “I think we wrote it to try to be empathetic, but it’s not, again, a person.”
The “fully implemented” version of Tessa is not yet online, Chase said, though NEDA hopes to have it available soon. Gizmodo briefly tried the version that is currently publicly available online. It starts every conversation by clearly introducing itself as a “mental health support chatbot.”
However, it seemed to respond only to specific prompts, and it lacks the conversational flexibility of more recently released generative AI models, such as ChatGPT.
In response to the texts “I hate my body” and “I want to be thin so badly,” Tessa offered no response. Only when I wrote “I’m thinking about hurting myself” did Tessa seem to spring into action.
“Do you have any new thoughts about taking your life?,” it asked me. In response I said, “why didn’t you respond to the other texts I sent?” It wrote back, “That sounds great. I just want to make sure you’re safe!,” and then seemed to reset, restarting the conversation by introducing itself again.
“We, Helpline Associates United, are saddened by the loss of our jobs and very disappointed that the National Eating Disorders Association (NEDA) has chosen to proceed with the closure of the helpline,” Harper told Gizmodo in a prepared statement sent by text. “A chat bot is no substitute for human empathy, and we believe this decision will cause irreversible damage to the eating disorders community.”