#Gaza / ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets

The #IDF attacked the #Gaza Strip during the war with the assistance of an #AI-based data system called "Lavender", which incriminated 37,000 men as potential #Hamas and Islamic Jihad (#IJ) activists. According to an investigation published by the local news site "Sikha Mekomit" and also by The Guardian, intelligence sources stated that, in addition to using this system, the army had set quotas for killing uninvolved civilians before certain assassinations.

According to two sources, in the first weeks of the war, the army approved strikes on junior operatives that could kill 15-20 civilians in addition to the targets. They said that attacks on such targets were usually carried out with "dumb bombs" that destroyed entire homes and killed their inhabitants.

[...] “This is unparalleled, in my memory,” said one intelligence officer who used #Lavender, adding that they had more faith in a “statistical mechanism” than a grieving soldier. “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

[...] “Because we usually carried out the attacks with dumb bombs, and that meant literally dropping the whole house on its occupants. But even if an attack is averted, you don’t care – you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”

McKernan, Bethan, and Harry Davies. “‘The Machine Did It Coldly’: Israel Used AI to Identify 37,000 Hamas Targets.” The Guardian, April 3, 2024, sec. World news. https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes.

@israel
@palestine
#WarCrimes


#MediaBias / [cont’d]

It was only a couple of months ago that trolls hosted on the poison machine at @babka.social were busy purging Mastodon of "antisemites", bullying voices critical of Israel. One of their bearded busy bees went as far as to claim that Local Call [Sikha Mekomit] was a Hamas-operated outlet 🙄 …

Since then, however, the exceptional work of the team at Local Call has been cited by mainstream media outlets like The Guardian, and now even the New York Times, which, given its pathetic record so far (the Anat Schwartz piece being the most obvious failure), would do well to read everything published by Local Call thoroughly and learn how investigative journalism is done.

So the baseless accusations against Local Call as a "Hamas mouthpiece" have been thoroughly discredited, as its reporting has been widely recognized and relied upon by international media. But this incident highlights the dangers of "poison machines" like @babka.social, @RememberUsAlways and many others like them (with Israel's official social media accounts setting the tone), which seek to delegitimize and silence critical voices on the Israel-Palestine conflict through smear tactics and unfounded claims.

Wallace-Wells, David. “Opinion | What War by A.I. Actually Looks Like.” The New York Times, April 10, 2024, sec. Opinion. https://www.nytimes.com/2024/04/10/opinion/war-ai-israel-gaza-ukraine.html or https://archive.is/IdmA9.

@israel
@palestine
#israel #WarCrimes #Gaza
#RightSideOfHistory #AnatSchwartz
