B., the senior officer, claimed that in the current war, “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time.”

According to B., a common error occurred “if the [Hamas] target gave [his phone] to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender,” B. said.

https://www.972mag.com/lavender-ai-israeli-army-gaza/ @israel @data 🧶

#duty #protocol #conformance #AIRisks #riskApproach #risks #dehumanisation #AIWar #technoCriticism #ethics #efficiency #innovation #Targeting #industrialization #intelligence #AirForce #casualties #DataScience #DataScientist #SIGINT #Unit8200 #patriotism #JewishSupremacy #CrowdSourcing #OSINT #CROSINT #MilInt #military #army #ES2 #IDI #IDF #Lotem #Habsora #Lavender #dataDon #dataGovernance #Gaza #Hamas #warCrimes #israel

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.

+972 Magazine