“Amandeep Jutla, an associate research scientist at Columbia University who studies the impact of AI chatbots, believes one of the main reasons users spiral is the ‘anthropomorphic nature of the interface’. Unlike human conversations, which feature pushback and different perspectives tugging at each other, conversations with chatbots offer no pushback at all: ‘The design of the product is pushing you away from reality. It’s pushing you away from other people,’ he said. ‘The friction with other people is what keeps us grounded.’”

https://www.theguardian.com/technology/ng-interactive/2026/feb/28/chatgpt-ai-chatbot-mental-health

Her husband wanted to use ChatGPT to create sustainable housing. Then it took over his life.

Kate Fox says Joe Ceccanti was the ‘most hopeful person’ before he started spending 12 hours a day with a chatbot

The Guardian
@amcvittie I still hold that our job as UX practitioners is that of the sweeper in curling: adding friction is just as important as removing it. The wisdom comes from knowing when to apply which.