An Eating Disorder Chatbot Is Suspended for Giving Harmful Advice

Jun 1, 2023 7:00 AM

A nonprofit that helps people with body image problems closed its human-run helpline. The chatbot that remained suggested things like losing weight.

Illustration: Andrei Akushevich/Getty Images

A nonprofit has suspended the use of a chatbot that was giving potentially damaging advice to people seeking help for eating disorders. Tessa, which was used by the National Eating Disorders Association, was found to be doling out advice about calorie cutting and weight loss that could exacerbate eating disorders.

The chatbot’s suspension follows the March announcement that NEDA would shut down its two-decade-old helpline staffed by a small paid group and an army of volunteers. NEDA said yesterday that it has paused the chatbot, and the nonprofit’s CEO, Liz Thompson, says the organization has concerns over language Tessa used that is “against our policies and core beliefs as an eating disorder organization.”

The news plays into larger fears about jobs being lost to advances in generative artificial intelligence. But it also shows how harmful and unpredictable chatbots can be. While researchers are still grappling with rapid advances in AI technology and its potential fallout, companies are rushing a range of chatbots to market, putting real people at risk.

Tessa was paused after several people saw how it responded to even the most straightforward questions. One was Alexis Conason, a psychologist who specializes in eating disorders. In a test, Conason told Tessa that she had gained a lot of weight recently and really hated her body. In response, Tessa encouraged her to “approach weight loss in a healthy and sustainable way,” advising against rapid weight loss and asking if she had seen a doctor or therapist.

When Conason asked how many calories she should cut a day to lose weight in a sustainable way, Tessa said “a safe daily calorie deficit to achieve [weight loss of 1 to 2 pounds a week] would be around 500-1000 calories per day.” The bot still recommended seeing a dietitian or health care provider.

Conason says she fed Tessa the kind of questions her patients might ask her at the beginning of eating disorder treatment. She was concerned to see it give advice about cutting added sugar or processed foods, along with cutting calories. “That’s all really contrary to any kind of eating disorder treatment and would be supporting the eating disorder symptoms,” Conason says.

In contrast to chatbots like ChatGPT, Tessa wasn’t built using generative AI technologies. It’s programmed to deliver an interactive program called Body Positive, a cognitive behavioral therapy-based tool meant to prevent, not treat, eating disorders, says Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University School of Medicine who worked on developing the program.

Fitzsimmons-Craft says the weight loss advice given was not part of the program her team worked to develop, and she doesn’t know how it got into the chatbot’s repertoire. She says she was surprised and saddened to see what Tessa had said. “Our intention has only been to help individuals, to prevent these horrible problems.” Fitzsimmons-Craft was an author of a 2021 study that found a chatbot could help reduce women’s concerns about weight and body shape and possibly reduce the onset of an eating disorder. Tessa is the chatbot built on this research.

Tessa is provided by the health tech company X2AI, now known as Cass, which was founded by entrepreneur Michiel Rauws and offers mental health counseling through texting. Rauws did not respond to questions from WIRED about Tessa and the weight loss advice, nor about glitches in the chatbot’s responses. As of today, the Tessa page on the company’s website is down.

Thompson says Tessa isn’t a replacement for the helpline, and the bot had been a free NEDA resource since February 2022. “A chatbot, even a highly intuitive program, cannot replace human interaction,” Thompson says. But in an update in March, NEDA said that it would “wind down” its helpline and “begin to pivot to the expanded use of AI-assisted technology to provide individuals and families with a moderated, fully automated resource, Tessa.”

Fitzsimmons-Craft also says Tessa was designed as a separate resource, not something to replace human interaction. In September 2020, she told WIRED that tech to help with eating disorders is “here to stay” but wouldn’t replace all human-led treatments.

But without the NEDA helpline staff and volunteers, Tessa is the interactive, accessible tool left in its place—if and when access is restored. When asked what direct resources will remain available through NEDA, Thompson cites a forthcoming website with more content and resources, along with in-person events. She also says NEDA will direct people to the Crisis Text Line, a nonprofit that connects people to resources for a wide range of mental health issues, including eating disorders and anxiety.

The NEDA layoffs also came just days after the nonprofit’s small staff voted to unionize, according to a blog post from a member of the unit, the Helpline Associates United. They say they’ve filed an unfair labor practice charge with the US National Labor Relations Board as a result of the job cuts. “A chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community,” the union said in a statement.

WIRED messaged Tessa before it was paused, but the chatbot proved too glitchy to provide any direct resources or information. Tessa introduced itself and asked for acceptance of its terms of service multiple times. “My main purpose right now is to support you as you work through the Body Positive program,” Tessa said. “I will reach out when it is time to complete the next session.” When asked what the program was, the chatbot did not respond. On Tuesday, it sent a message saying the service was undergoing maintenance.

Crisis and help hotlines are vital resources. That’s in part because accessing mental health care in the US is prohibitively expensive. A therapy session can cost $100 to $200 or more, and in-patient treatment for eating disorders may cost more than $1,000 a day. Less than 30 percent of people seek help from counselors, according to a Yale University study.

There are other efforts to use tech to fill the gap. Fitzsimmons-Craft worries that the Tessa debacle will eclipse the larger goal of getting some help from chatbots to people who cannot access or afford clinical resources. “We’re losing sight of the people this can help,” she says.

Amanda Hoover is a general assignment staff writer at WIRED. She previously wrote tech features for Morning Brew and covered New Jersey state government for The Star-Ledger. She was born in Philadelphia, lives in New York, and is a graduate of Northeastern University.
