"Cognitive surrender" leads AI users to abandon logical thinking, research finds
"Cognitive surrender" leads AI users to abandon logical thinking, research finds
Many (most?) people have always relied on somebody or something else to do their thinking for them.
AI is simply the latest addition to the long, long list of sources that such people uncritically accept.
Yes, it’s different this time:
In the past, people have often used tools from calculators to GPS systems for a kind of task-specific “cognitive offloading,” strategically delegating some jobs to reliable automated algorithms while using their own internal reasoning to oversee and evaluate the results. But the researchers argue that AI systems have given rise to a categorically different form of “cognitive surrender” in which users provide “minimal internal engagement” and accept an AI’s reasoning wholesale without oversight or verification. This “uncritical abdication of reasoning itself” is particularly common when an LLM’s output is “delivered fluently, confidently, or with minimal friction,” they point out.