IDF denies it uses AI software to target individuals in Gaza bombing campaigns
The Israel Defense Forces today denied it has been using an artificial intelligence system to create kill lists in its various bombing campaigns in Gaza.
In a frankly dystopian-sounding investigative report, Israel’s +972 Magazine claimed that attacks on targets in Gaza have at times been directed by an AI system called “Lavender.” The magazine said it had spoken to six Israeli intelligence officers on the condition of anonymity.
The officers said they relied on the data produced by the software to carry out bombings of Palestinians, with one of the anonymous sources saying they treated the system’s output as if “it were a human decision.” The report is all the more shocking coming after news that the IDF recently killed seven aid workers, adding to the toll of innocent people killed in the bombing campaigns over Gaza.
“You put hundreds [of targets] into the system and wait to see who you can kill,” one of the sources told +972. “It’s called broad hunting: you copy-paste from the lists that the target system produces.” The officers were reportedly not asked to “examine the raw intelligence data on which [the selections] were based.”
The system is reported to have marked some 37,000 Palestinians as suspected militants, placing them on kill lists for IDF bombing attacks. Another officer said the IDF “almost completely relied” on Lavender despite knowing that the system was not always reliable. He explained that a human had to “rubber stamp” each result before the intelligence was acted on, but said the decision process usually lasted about “20 seconds.” He added, “I had no additional value as a human. It was quite time-saving.”
The system reportedly works by collecting data on the 2.3 million Palestinians who live in the Gaza Strip and giving each person a score from 1 to 100 representing the likelihood that they belong to Hamas. If a person was ranked as a serious threat, such as a possible Hamas commander, the officer said, an attack could be authorized even if it was expected to kill as many as 100 civilians.
“We took out thousands of people,” explained another officer. “We didn’t go through them one by one – we put everything into automated systems, and as soon as one of [the ranked individuals] was at home, he immediately became a target. We bombed him and his house.”
The Guardian, which also spoke to the sources, corroborated the reporting; one source said he or she was authorized to kill 15 to 20 civilians in strikes on targets that Lavender had given a low rank. “You don’t want to waste expensive bombs on unimportant people – it’s very expensive for the country, and there’s a shortage,” the source said.
The IDF responded to the reports, issuing a statement to The Guardian explaining that it does indeed use Lavender, but “to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organizations.”
The statement added, “The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist…. In accordance with the rules of international law, the assessment of the proportionality of a strike is conducted by the commanders on the basis of all the information available to them before the strike.”
Photo: Emad El Byed/Unsplash