I have a lot more to say, but I'll hold it for now and simply wonder aloud...

Which BigTech clouds are the "Lavender" & "Where's Daddy?" AI systems running on? What APIs are they using? Which libraries are they calling?

What work did my former colleagues, did I, did *you* contribute to that may now be enabling this automated slaughter?

(Also, content warning. This is some of the sickest shit I've ever read.)

https://www.972mag.com/lavender-ai-israeli-army-gaza/

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.

+972 Magazine

"Which BigTech clouds are the "Lavender" & "Where's Daddy?" AI systems running on? What APIs are they using? Which libraries are they calling? "

🚨 THIIIIIIIIIIIIIIS! 🚨

@Mer__edith
How can you call a system designed to destroy "the Gospel"? Sad and terrible.
@Mer__edith I thought I made it halfway through the article… then realized I stopped at the 10% point. It’s… it’s so horrifying.
@Mer__edith for some reason this George Woodcock quote comes to mind: "It is a frequent circumstance of history that a culture or civilization develops the device that will later be used for its destruction."

@Mer__edith

Personally, I don't believe it. I don't believe the IDF is using an AI to choose targets, since everyone is targeted indiscriminately.

The whole thing is to give themselves plausible deniability if one day any of them eventually get dragged before The Hague.

They will then blame all atrocities on computer glitches.

They *already* do.

@Lily_and_frog @Mer__edith Yep, that's certainly plausible.

@Lily_and_frog @Mer__edith the article said there would be an officer approving hits, but that they would basically just see if the name sounded masculine, to get through approving as many hits as possible.

So, that kind of throws out any plausible deniability right there

@Mer__edith How much would you bet against IBM being involved here? https://www.theguardian.com/world/2002/mar/29/humanities.highereducation
IBM 'dealt directly with Holocaust organisers'

Newly discovered documents from Hitler's Germany prove that IBM directly supplied the Nazis with technology which was used to help transport millions of people to their deaths in the concentration camps.

The Guardian
@Mer__edith How sad this is...😥
@Mer__edith
...governments are really trying to bring fucking Skynet online.

@Mer__edith every time I think Israel has hit the bottom of their depths of depravity, some new horror emerges.

Utterly sickening.

@Mer__edith
Un-fucking-believable!!!
@Mer__edith My personal suspicion is: there is no AI. There are just the usual human decisions in war covered with a shallow software layer (called AI) to suppress the natural human inhibitions against killing. It creates an abstraction layer to appease the minds of people responsible.
@Mer__edith absolutely dystopian.
@Mer__edith totally normal thing for a totally normal country.

This is legitimately some of the sickest shit I have read
@Mer__edith what if I told you that THIS is yet another reason why no one should use #GSM and especially not collect #PhoneNumbers at all?
@Mer__edith truly frightening and heartbreaking...
@Mer__edith Have other journalists verified the facts here yet?

@Mer__edith
IDF Now: It's just a database
IDF Later: OK, it was AI, oopsy. But [recites by rote]Israel reserves the right to defend itself.[end recitation]

"'five to 10 acceptable civilian deaths' for every single Palestinian fighter who was an intended target" is not 'just a database'.

Nvidia's Israel-1 supercomputer starts operations

Two months ahead of schedule

@Mer__edith

We are Google and Amazon workers. We condemn Project Nimbus -
Anonymous Google and Amazon workers
(two years old!)

https://www.theguardian.com/commentisfree/2021/oct/12/google-amazon-workers-condemn-project-nimbus-israeli-military-contract

We cannot support our employer’s decision to supply the Israeli military and government technology that is used to harm Palestinians

The Guardian

@Mer__edith

A few things stand out to me here:

Firstly:

for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any “collateral damage”

This is monstrous; there can be no excuse for it. Intentionally allowing for collateral damage like this is an outright war crime. AI compounds the issue, as we'll see later, but at its core this is the result of not seeing a population as human, AI targeting or not.

Second:
One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing

This right here is my biggest fear with AI making decisions: that AI will be trusted, and humans will rubber-stamp the result in the end. It is an abdication of responsibility, which is bad when a system is rejecting a résumé, but here it is outright evil.

It is a really stark example of the biggest failure mode of AI: humans just trusting the output and punting responsibility, whether it is AI-driven cars running into pedestrians, hiring systems rejecting résumés, or, somehow, deciding that someone has links to a certain organization and sending the bombs of a government that has fallen into fascism and decided that its enemies aren't even human.

@Mer__edith This is supremely dystopian shit. The state apparatus of Israel has become absolutely devoid of humanity:

"During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male."

@Mer__edith Fucking hell...

Project Insight in the MCU was meant to be a warning, not a bloody blueprint