A few toots hit me hard this morning.

First we have this from @zip:

there's something real fucking dark about how AI is too inaccurate for tracking people around a grocery store so they have humans to make sure every last item is paid for, whereas if your AI is generating targets for a genocide then a serious error rate is perfectly within acceptable parameters.

For context:

Yesterday's news was that Amazon had about 1,000 humans reviewing the output of its so-called AI-powered "Just Walk Out" technology, and is now shutting it down.

Amazon is giving up on the cashier-less "Just Walk Out" technology at its Amazon Fresh grocery stores. The Information reports that new stores will be built without computer-vision-powered surveillance technology, and "the majority" of existing stores will have the tech removed. In the early days, Amazon's ambitions included selling Just Walk Out to other brick-and-mortar stores. The problem was that the technology never really worked.

And today's news is that Israel is relying heavily on AI to identify and locate people it considers Hamas terrorists, with an immense amount of collateral damage.

In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any “collateral damage” during assassinations of low-ranking militants. The sources added that, in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.

It made me think of the 20-year-old game September 12th: A Toy World. In short, the game is about killing terrorists with missiles. But the terrorists walk around a crowded place among civilians, and if you kill any civilians while trying to get rid of the terrorists (which is inevitable), the surviving civilians turn into terrorists as a result of your actions.

The game was a comment on the War on Terror back then. Israel seems to have mistaken the message of the game and distorted it into something completely different: if they make sure to kill enough civilians as collateral damage, those civilians will all – eventually – turn into terrorists. And thus everyone becomes a valid target.

@Mer__edith tooted while sharing the article about the Israeli AI systems:

I have a lot more to say, but I'll hold it for now and simply wonder aloud...

Which BigTech clouds are the "Lavender" & "Where's Daddy?" AI systems running on? What APIs are they using? Which libraries are they calling?

What work did my former colleagues, did I, did you contribute to that may now be enabling this automated slaughter?

Also related to that story, @cozymel tooted:

I don't think enough people appreciate that what's happening in Palestine is the future of all police states. Israeli fascists are trying out the technology for other fascists to use elsewhere. If we keep letting them get away with it, sooner or later it will be used on you and me.

This is so true. The USA keeps funding Israel and its war machine because the conflict is a valuable testbed for modern and next-generation warfare.