This essay from @jenniferplusplus is very good, and very important.

It’s good enough and important enough that I’m just going to QFT the heck out of it here on Mastodon until I annoy you into reading the whole thing.

https://jenniferplusplus.com/losing-the-imitation-game/

This essay isn’t the last word on AI in software — but what it says is the ground level for having any sort of coherent discussion about the topic that isn’t all hype and panic.

1/

Losing the imitation game

AI cannot develop software for you, but that's not going to stop people from trying to make it happen anyway. And that is going to turn all of the easy software development problems into hard problems.

Jennifer++

“Artificial Intelligence is an unhelpful term. It serves as a vehicle for people's invalid assumptions. It hand-waves an enormous amount of complexity regarding what ‘intelligence’ even is or means.”

“Our understanding of intelligence is a moving target. We only have one meaningful fixed point to work from. We assert that humans are intelligent. Whether anything else is, is not certain. What intelligence itself is, is not certain.”

2/

@inthehands (I think that whatever intelligence *is*, most of us have met humans who weren't it. So "intelligence is what humans have" seems broken all on its own.)