The Turing Test poisoned the minds of generations of AI enthusiasts, because its criterion is producing text that persuades observers it was written by a human.

The result? Generative AI text products designed to "appear real" rather than produce accurate or ethical outputs.

It *should* be obvious why it's problematic to create a piece of software that excels at persuasion without concern for accuracy, honesty, or ethics. But apparently it's not.

@intelwire To be fair though, many educational systems focus more on getting students to produce convincing arguments than on validating their inputs. Debating clubs were the text generators of their day. AI now gives us an endless supply of polished but junk text.