“Over four months, LLM users consistently underperformed at neural, linguistic, and behavioral levels.”

https://www.media.mit.edu/publications/your-brain-on-chatgpt/

Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task – MIT Media Lab

This study explores the neural and behavioral consequences of LLM-assisted essay writing. Participants were divided into three groups: LLM, Search Engine, and …

MIT Media Lab

Microsoft researchers say an overdependency on AI tools like Copilot negatively impacts people's critical thinking capabilities.

https://www.windowscentral.com/software-apps/copilot-and-chatgpt-makes-you-dumb-new-microsoft-study

Will an overreliance on Copilot and ChatGPT make you dumb? A new Microsoft study says AI 'atrophies' critical thinking: "I already feel like I have lost some brain cells."

Windows Central
I’m refusing to onboard to Copilot at work. I’m the only person. When asked why, I sent them the MIT and Microsoft research papers as openers.
@GossiTheDog AI is OK for me as long as it stays in a browser window.
@masek @GossiTheDog lemme correct that for you because you misspelled "AI is okay for me because it's actively killing the planet and using up valuable fresh water and I'm fine with the dehumanization of life".

@jadedtwin The AI is running on a Mac Mini M2 Pro at home, so I can benchmark exactly what it needs. A text query is usually about 300 Ws of electricity, and there is no water involved.

About 85% of my current electricity usage is self-produced by my photovoltaics.

It doesn't help you make a good argument to proclaim the other side an idiot who doesn't understand what he is doing.


@masek @jadedtwin @GossiTheDog Those numbers would be interesting, indeed. Are you sure you mean an energy consumption of 300 Ws (watt-seconds = joules)? Because 300 J ≈ 0.083 Wh, which would be extremely low: only about 3 % of the estimated 2.5 Wh per prompt to ChatGPT. Maybe you miscalculated?

@camelCaseNick @jadedtwin @GossiTheDog Nope: the wattage increases by 5–10 W during the query, which takes about 30 s. The numbers reported in the press are very likely inaccurate.

I use ollama on a Mac Mini M2 Pro.
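The arithmetic behind those figures can be checked in a few lines. This is only a sketch using the numbers quoted in the thread (the 10 W draw at the upper end of the stated 5–10 W range, the ~30 s query time, and the widely cited ~2.5 Wh per-prompt ChatGPT estimate), not a measurement:

```python
# Energy estimate for a local LLM query, using the figures from the thread.
# Assumptions: ~10 W extra power draw for a ~30 s query, compared against
# the ~2.5 Wh per-prompt estimate reported in the press for ChatGPT.

power_delta_w = 10       # extra wattage while the query runs (W)
duration_s = 30          # query duration (s)

energy_j = power_delta_w * duration_s   # 300 J, i.e. 300 Ws
energy_wh = energy_j / 3600             # joules -> watt-hours

chatgpt_estimate_wh = 2.5               # press estimate per prompt
ratio = energy_wh / chatgpt_estimate_wh

print(f"{energy_j} J = {energy_wh:.3f} Wh "
      f"(~{ratio:.0%} of the {chatgpt_estimate_wh} Wh estimate)")
```

This reproduces the ~0.083 Wh and ~3 % figures from the thread; note it covers inference power only, not the amortized cost of training the model.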

@masek I think those media reports probably factor in training costs. Also, I wouldn't be surprised if ChatGPT consumed more power than local models on Apple hardware, due to the model size and GPU architecture.