#Gaza / ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets

The #IDF attacked the #Gaza Strip during the war with the assistance of an #AI-based data system called "Lavender", which incriminated 37,000 men as potential #Hamas and Islamic Jihad (#IJ) activists. According to an investigation published by the local news site "Sikha Mekomit" (Local Call) and also published in The Guardian, intelligence sources stated that, in addition to using this system, the army had set quotas in advance for the number of uninvolved civilians who could be killed in certain assassinations.

According to two sources, in the first weeks of the war, the army approved strikes on junior operatives that could kill 15-20 civilians in addition to the targets. They said that attacks on such targets were usually carried out with unguided munitions known as "dumb bombs", which destroyed entire homes and killed their inhabitants.

[...] “This is unparalleled, in my memory,” said one intelligence officer who used #Lavender, adding that they had more faith in a “statistical mechanism” than a grieving soldier. “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

[...] “Because we usually carried out the attacks with dumb bombs, and that meant literally dropping the whole house on its occupants. But even if an attack is averted, you don’t care – you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”

McKernan, Bethan, and Harry Davies. “‘The Machine Did It Coldly’: Israel Used AI to Identify 37,000 Hamas Targets.” The Guardian, April 3, 2024, sec. World news. https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes.

@israel
@palestine
#WarCrimes

@oatmeal @israel @palestine Civil defense workers, police, and ministry of security staff were treated as 'Hamas enough' for training data. Journalists and aid workers probably have a similar algorithmic profile, given the low thresholds for "computer says die" we're talking about here.

This is every bit as horrific as Pol Pot's killing fields, the gas ovens, the spread of HIV-infected blood to the south, the starving of Bengal.

A clearer metaphor for abdication of humanity is impossible.