I think that you should read this article:

https://www.972mag.com/lavender-ai-israeli-army-gaza/

Read it carefully.

Note: This article is based substantially on anonymous sources verified by 972mag. Anyone who has done a lot of reading on Israel-Palestine has probably accumulated a list of sources they've decided not to trust; if 972 is on your mistrust list, consider this article from the Guardian, which independently reviewed the article's accounts and effectively co-signs the sources' validity.

https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.

+972 Magazine

"'Lavender’: The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties"

A source cited by both 972 and the Guardian asserts that once this system had selected a target, a human would spend only about 20 seconds reviewing it. (972 states the only criteria checked were whether the target was male and whether the expected number of collateral casualties at the strike location would exceed the IDF's limit.)

What I find myself asking is how much work the phrase "Artificial Intelligence" (and the widespread cultural belief that "AI" is a thing that exists) does to soften the impact of these revelations. Imagine the central allegation of the above articles, that the "AI" program was actually selecting which persons to target for airstrikes, described without reference to "AI": killing people because their "characteristics" fit a "statistical model" of a militant. Does calling this "AI" inform or obscure?
@mcc I think I read some foreshadowing of this when the Targeting Directorate was first publicized, after its establishment in 2019.
The tech bros (Unit 8200) went over the heads of the traditional, pixel-pinching folk who were in charge of targets.
Because the traditional way is people-heavy and 'inefficient'. It takes time to train someone to be able to look at three pixels and say "yep, that's the guy" with life-or-death levels of human certainty. A machine does not feel guilt.