Oh god this is so believable. AI bros and Business Leaders have no interest in softer sciences, so they've Dunning-Krugered themselves into believing their googly-eyed autocomplete is a real boy

From https://softwarecrisis.dev/letters/llmentalist/

The LLMentalist Effect: how chat-based Large Language Models rep…

The new era of tech seems to be built on superstitious behaviour

Out of the Software Crisis

@anandamide This definitely explains a lot of what I feel when I see people in the AI field claiming an LLM has any level of consciousness. They've bought into the delusion.

Reminds me of the quote by Upton Sinclair: "It is difficult to get a man to understand something when his salary depends on his not understanding it."

@jamie hah! I thought of the saying 'you cannot reason someone out of a position they have Dunning-Krugered themselves into' 🙃

@anandamide Also true! And from a certain perspective, both of those are about incentives. Money is an obvious incentive, but since confidence is often mistaken for expertise, people who've fallen into the Dunning-Kruger trap become incentivized to pretend they haven't.

Too many people want the benefits of being right without doing the work required to be correct.