This essay from @jenniferplusplus is very good, and very important.

It’s good enough and important enough that I’m just going to QFT the heck out of it here on Mastodon until I annoy you into reading the whole thing.

https://jenniferplusplus.com/losing-the-imitation-game/

This essay isn’t the last word on AI in software — but what it says is the ground level for having any sort of coherent discussion about the topic that isn’t all hype and panic.

1/

Losing the imitation game

AI cannot develop software for you, but that's not going to stop people from trying to make it happen anyway. And that is going to turn all of the easy software development problems into hard problems.

Jennifer++

@inthehands @jenniferplusplus I'm gonna pick up on one thing (there's a lot to be said about this excellent set of observations, but I'm between lectures so this will have to do):

"But language itself is probably less than you think it is. Language is not comprehension, for example."

No, indeed. Language is a compromise for encoding abstractions, sitting between optimal encoding (like Huffman or Shannon coding) and a complete yet noisy description (like taking a 720-degree tomography).
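To make the "optimal encoding" end of that spectrum concrete, here is a minimal Huffman coding sketch — purely illustrative, not from the essay or the thread. It assigns shorter bit codes to more frequent symbols, which is exactly the compression-over-completeness trade-off being gestured at:

```python
# Illustrative Huffman coding sketch: frequent symbols get shorter codes.
import heapq
from collections import Counter

def huffman_codes(text):
    freq = Counter(text)
    # Heap entries are (frequency, tiebreak, {symbol: code-so-far});
    # the tiebreak integer keeps heapq from ever comparing the dicts.
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol input
        return {sym: "0" for sym in heap[0][2]}
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Merge the two least frequent subtrees, prefixing a bit to each side.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

codes = huffman_codes("she sells seashells")
# 's' (most frequent) gets a code no longer than 'a' (least frequent)
```

The point of the analogy: an optimal code like this is maximally compact but meaningless without the shared codebook — much like language presupposes shared context rather than containing comprehension itself.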

@inthehands @jenniferplusplus We can argue about all kinds of subtleties here, what part is evolutionary and what part is culturally informed, but looking at ONLY the two-and-a-half-year-old at home, I can assert with 100% certainty that despite the lack of coherence in his speech (although he's about 12 months ahead of his peers in semantics and semiotics), he's intelligent and LLMs are not: he can communicate and pick up on intent non-verbally, and use tools he's virtually never seen before.
@inthehands @jenniferplusplus The story is not about the child, obviously, because that would be both presumptuous and childish (ha!), but rather about the notion of intelligence and the lack of depth in both LLMs and the grifters' perspective on them.