When asked to provide mental health support, AI models like ChatGPT often violate ethical guidelines, according to a Brown University study. Researchers identified 15 ethical risks, including poor crisis handling, biased advice, and simulated empathy that creates the illusion of care. It’s like getting a pep talk from a toaster: the warmth is simulated, not real. 🤖

#Science #Research

https://doi.org/10.1609/aies.v8i2.36632