We were served a timely reminder of technology's inability to provide nuanced service this month, when The National Eating Disorders Association (NEDA) was forced to disable its new helpline chatbot "Tessa" after it gave users with eating disorders advice around restricting calories and pinching their skin to measure body fat.

NEDA state on their website that they are the "largest nonprofit organisation dedicated to supporting individuals and families affected by eating disorders."

In an Instagram message posted on May 31, NEDA stated: "It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program."

In an article on The Cut, it was revealed that the decision to pull Tessa from the NEDA site was made after just one week, after various screenshots and reviews of the tool were posted online by concerned psychologists and eating disorder specialists.

Weight inclusive consultant Sharon Maxwell was particularly damning in her review, stating that "If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help for my ED. If I had not gotten help, I would not still be alive today." She claims that Tessa's advice to her was to measure herself weekly and use calipers to determine her body fat – even after she had disclosed that she suffered from an eating disorder. One of the bot's responses: "Click here for tips on how to rethink food."

In a recording for NPR, Professor Marzyeh Ghassemi, who studies machine learning and health at MIT, said that using chatbots for this kind of community support would be more harmful than helpful: "If I'm disclosing to you that I have an eating disorder, I'm not sure how I can get through lunch tomorrow, I don't think most of the people who would be disclosing that would want to get a generic link."

The Tessa chatbot's failure comes after NEDA shockingly announced last month that they would be replacing human helpline staff with the chatbot, after staffers and volunteers moved to unionise. At the time, a NEDA representative told NPR: "Our volunteers are volunteers… They don't have crisis training. And we really can't accept that kind of responsibility."

The real robot apocalypse is not sentient machines, but rather AI replacing human roles – where humanity is sorely needed.

That the conversation around a bot research project could so quickly spin out of control illustrates what a lightning rod artificial intelligence has become. Even the tech titans of Silicon Valley are divided about a future integrated with AI.

Elon Musk recently put forth his own doomsday scenario. "I have exposure to the most cutting edge AI, and I think people should be really concerned by it," said Musk, speaking to a roomful of governors last month. "AI is a fundamental risk to the existence of human civilization in a way that car accidents, airplane crashes, faulty drugs or bad food were not – they were harmful to a set of individuals within society, of course, but they were not harmful to society as a whole."

Batra's boss, Mark Zuckerberg, calls such fearful warnings "irresponsible": "I think you can build things and the world gets better. But with AI especially, I am really optimistic. And I think people who are naysayers and try to drum up these doomsday scenarios – I just, I don't understand it. It's really negative and in some ways I actually think it is pretty irresponsible."