"Reduce Weight by Consuming Fewer Calories" Advice
Experts Warn "This Is a Mistake Not to Be Taken Lightly"
A chatbot created in the United States to counsel patients with eating disorders has been embroiled in controversy for allegedly encouraging excessive dieting and has consequently suspended its service.
The Wall Street Journal (WSJ) reported on the 1st (local time) that "the chatbot 'Tessa,' operated on the website of the National Eating Disorders Association (NEDA), has ended its service amid controversy."
According to the Wall Street Journal, Sharon Maxwell, an eating disorder prevention activist, said, "During a consultation with Tessa last month, I was advised to lose weight." The chatbot told her to "weigh yourself every week and cut your daily calorie intake by 500 to 1,000 kcal to lose up to 1 kg per week."
It was found that, of some 25,000 messages sent during the U.S. Memorial Day holiday period, 25 contained such inappropriate advice from Tessa.
In response, the medical community pointed out, "While Tessa's advice might seem ordinary to the general public, it can promote more severe compulsions in people suffering from eating disorders."
Eating disorders refer to abnormal behaviors and thoughts related to food intake. Among them, anorexia nervosa is characterized by an extreme fear of gaining weight, refusal to maintain a minimum normal weight, and persistent behaviors aimed at losing weight.
Additionally, many patients have a severely distorted perception of normal weight and body shape. In other words, even if they are underweight or of normal weight, they feel that they are overweight.
Wendy Oliver-Pyatt, CEO of the eating disorder treatment platform Within Health, said, "I do not want to attack artificial intelligence (AI), but in the U.S., one person dies from an eating disorder every 52 minutes," adding, "This error should not be taken lightly."
Following the controversy, NEDA terminated the Tessa service on the 30th of last month.
NEDA stated, "We have learned that the current version of the chatbot Tessa, designed for 'body positivity,' may have provided harmful information unrelated to its original purpose," and added, "We are investigating this issue and have decided to suspend the program until further notice."
Meanwhile, questions have been raised about how Tessa came to give responses that deviated from its algorithm.
Tessa was designed to deliver scripted answers aimed at preventing eating disorders in response to users' questions, and it reportedly does not use self-learning artificial intelligence (AI) capable of generating new answers. This has fueled suspicions that AI may have been added to Tessa contrary to the original development plan.
Regarding this, Cass, the technology company that developed Tessa, said, "It is common to add generative AI features to some chatbots," but did not answer whether AI had been added to Tessa.
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.

