Eating Disorder Helpline Pulls AI Chatbot After It Gives Users ‘Harmful’ Advice

Chatbots are most often used for low-level customer service and sales task automation, but researchers have been trying to make them perform more sophisticated tasks such as therapy. Tero Vesalainen/Shutterstock
Katabella Roberts

An eating disorder association took down its artificial intelligence (AI) chatbot less than a week before it was set to replace its human-run helpline after discovering that it was giving “harmful” advice to users.

The National Eating Disorder Association (NEDA), a nonprofit that supports individuals and families affected by eating disorders, said in a May 31 Instagram post that it had pulled its chatbot, named Tessa, after discovering that it “may have given information that was harmful and unrelated to the program.”

Katabella Roberts is a news writer for The Epoch Times, focusing primarily on the United States, world, and business news.