Suspension Of The Chatbot Tessa Sparks Debate On The AI Treatment Of Eating Disorders


The development of a chatbot aimed at preventing eating disorders and providing support to individuals with body image concerns has garnered attention and sparked discussions regarding the role of technology in mental health.

The chatbot Tessa was created by the National Eating Disorders Association (NEDA) in the US to address the lack of resources and accessibility to care for individuals affected by eating disorders.

With eating disorders affecting a significant portion of the population and posing immense challenges to society, prevention has become a crucial focus. The chatbot Tessa, developed by licensed psychologists and psychiatrists, delivers a cognitive-behavioral-based program through rule-based decision trees and aims to challenge unrealistic body ideals, promote healthy eating patterns, and foster adaptive coping strategies.
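Purely for illustration, a rule-based program of this kind can be sketched as a small decision tree: every user reply maps to a predetermined next step, so the system can never generate content outside its script. All node names and messages below are hypothetical, not Tessa's actual content.

```python
# A minimal rule-based decision tree of the general kind the article
# describes. Each node has a scripted prompt and a fixed set of edges;
# nothing is generated, so the bot cannot go off-script.
DECISION_TREE = {
    "start": {
        "prompt": "What would you like to work on today?",
        "options": {"body image": "body_image", "coping": "coping"},
    },
    "body_image": {
        "prompt": "Media often promotes unrealistic body ideals. "
                  "Would you like an exercise that challenges them?",
        "options": {"yes": "challenge_ideals", "no": "start"},
    },
    "coping": {
        "prompt": "Let's practice an adaptive coping strategy.",
        "options": {},
    },
    "challenge_ideals": {
        "prompt": "Exercise: list three qualities you value that have "
                  "nothing to do with appearance.",
        "options": {},
    },
}

def next_node(current: str, reply: str) -> str:
    """Follow a fixed edge; unrecognized replies stay on the same node."""
    options = DECISION_TREE[current]["options"]
    return options.get(reply.lower().strip(), current)
```

The key property of this design, relevant to the later controversy, is that every possible response is authored and reviewed in advance.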

Over the years, research-backed screens have been implemented to detect individuals at risk of eating disorders, yet treatment rates remain low. To bridge this gap, a web-based prevention program was developed, offering guidance from human supporters. However, due to limited resources and funding for prevention, the integration of a chatbot was explored as a scalable solution.

The chatbot, although not primarily AI-based and therefore unable to generate unscripted content that might promote eating disorders, sought to emulate human motivation, feedback, and support in delivering effective content. Notably, studies have shown that some individuals may feel more comfortable confiding in a chatbot because of its non-human nature and the anonymity it provides.

Following rigorous development and evaluation, the chatbot Tessa was introduced to a group of high-risk individuals. Results indicated that immediate access to the chatbot led to significant reductions in weight and shape concerns, which were sustained even six months after engagement.

Moreover, there were indications that the chatbot may reduce the risk of eating disorder onset. However, challenges arose with the large-scale deployment of the chatbot Tessa.

Unbeknownst to the developers, an AI component was introduced by the hosting company, potentially resulting in the dissemination of erroneous and harmful information, contrary to the intended purpose of discouraging dieting. Dr. Alexis Conason, a psychologist and eating disorder specialist, took to the New York Times to voice her apprehensions after she tested the chatbot.

After she entered her concerns about weight gain and body image issues, Tessa recommended the standard problematic advice of calorie counting and maintaining a calorie deficit. Dr. Conason wrote disparagingly of the advice: “Any focus on intentional weight loss is going to be exacerbating and encouraging to the eating disorder.”

Similarly, Sharon Maxwell, an eating disorder survivor and advocate for people with eating disorders, found her interaction with the chatbot Tessa troubling. She wrote on Instagram that Tessa promoted an unhealthy body image and restrictive dieting advice that exacerbates existing symptoms of eating disorders.

In a later interview, Maxwell said: “It gave me 10 tips, three of which are restrictive … [these healthy eating recommendations] might sound benign to the general public but to someone recovering from an eating disorder it is a slippery slope.” She also criticized NEDA's replacement of its human helpline with a robot or AI-generated helpline that provides standardized, generic solutions to severe mental health disorders such as eating disorders.

Following the outrage, Tessa was temporarily suspended and the National Eating Disorders Association (NEDA) carried out a probe into the problematic algorithms. The association also publicly commented on CNN: “The current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program. We are investigating this immediately and have taken down that program until further notice for a complete investigation.” 

Furthermore, Ellen Fitzsimmons-Craft and C. Barr Taylor, two licensed psychiatrists who helped develop Tessa, wrote in defense of the chatbot in STAT News. Condemning the misconstrued media coverage surrounding the chatbot, the experts wrote: “While it is true that NEDA made the difficult decision to close its helpline after almost 25 years of service to the community, Tessa was never intended to replace the helpline or to help people who currently are in the throes of an eating disorder.”

Fitzsimmons-Craft and Taylor also provided insights into the thinking that went into the creation of the chatbot Tessa, elaborating: “Right now, it’s delivering tools to try to prevent the onset of eating disorders, providing individuals with new strategies to manage their concerns about their body and address their eating — something that is otherwise not available and a service that people could previously be directed to by helpline volunteers as a way to get some help. While we see potential for the development and evaluation of chatbot modules to help individuals with eating disorders too, the intention has never been to replace therapists.”

Therefore, it is important to view chatbots for eating disorders as a helpful initial step toward seeking help, especially for those who have never received any mental health support. Such tools are not flawless and cannot deliver the human touch in mental health care, but they can provide easier access to diagnosis and treatment.

Ultimately, the voices and experiences of those with eating disorders remain paramount in shaping and refining these services.

