This powerful piece discusses how the U.S. military ended up killing almost 180 schoolchildren.

We’ve been discussing this as an AI problem. It’s not.

https://www.theguardian.com/news/2026/mar/26/ai-got-the-blame-for-the-iran-school-bombing-the-truth-is-far-more-worrying

“Someone decided that deliberation was latency. Someone decided to build a system that produces 1,000 targeting decisions an hour and call them high-quality. Someone decided to start this war.”

AI got the blame for the Iran school bombing. The truth is far more worrying

LLMs-gone-rogue dominated coverage, but had nothing to do with the targeting. Instead, it was choices made by human beings, over many years, that gave us this atrocity

The Guardian
@slothrop So-called AI can't be blamed for anything. It's a tool. The tool user is the one to blame. Guns don't kill people, people do.

@pa27 It’s a little more complicated than that - that’s the entire point of the article.

Tools provide affordances and structures. They make some things more likely to happen, and others less so.