Lavender System – AI Deciding Who To Kill For Israel


The Lavender system determined who to kill, and was obeyed with military discipline.

By Etienne de la Boetie2, April 5, 2024

In 2021, the commander of Israel’s elite intelligence Unit 8200 published a book on designing a special machine that would resolve what he described as a human bottleneck in locating and approving targets in war. A recent investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” which does exactly that.

According to six Israeli intelligence officers with first-hand experience, the Lavender AI machine determined who to kill, and was obeyed with military discipline.

During the first weeks of the war, the Lavender system designated about 37,000 Palestinians as targets and directed air strikes on their homes. Although the army knew the system erred in roughly ten percent of cases, officers were not required to check the machine’s selections.


The Israeli army systematically attacked the targeted individuals at night, in their homes, while their whole families were present. An automated system known as “Where’s Daddy?” tracked the targets and triggered bombings once they entered their families’ residences. The obvious result was that thousands of women and children were wiped out by Israeli airstrikes. According to the intelligence officers, the IDF bombed targets in their homes as a first option, and on several occasions entire families were murdered when the actual target was not inside. In one instance, four buildings were destroyed along with everyone inside because a single target was in one of them.

When it came to targets the Lavender AI system marked as low-level, cheaper bombs were used, which destroyed entire buildings and killed mostly civilians and entire families. This was done because the IDF did not want to waste expensive bombs on people it deemed unimportant.

It was decided that for every low-level Hamas operative Lavender marked, it was permissible to kill up to 15 or 20 civilians; if the target was a senior Hamas official, killing more than a hundred civilians was deemed acceptable.

Continue reading …

Source: The Reese Report.
