The Israeli military is using artificial intelligence to determine bombing targets with cursory oversight from humans, according to reports from The Guardian and +972 Magazine last week.

The reports cite anonymous Israeli intelligence officials, who say an AI program called Lavender is trained to identify Hamas and Palestinian Islamic Jihad militants as potential bombing targets. The government has reportedly given Israel Defense Forces officers approval to take out anyone identified as a target by Lavender. The tool has been used to order strikes on “thousands” of Palestinian targets, even though Lavender is known to have a 10% error rate. According to The Guardian, the program identified 37,000 potential targets.

The IDF did not deny the existence of Lavender in a statement to CNN but said AI was not being used to identify terrorists.

Militaries around the world are building up their AI capacities, and we’ve already seen the technology play a major role on both sides of the war between Russia and Ukraine. Israel’s bloody campaign in Gaza has led to widespread allegations of indifference toward mass civilian casualties, underlined by the bombing of a World Central Kitchen caravan last week. Artificial intelligence threatens to play a greater role in determining who lives and who dies at war — will it also make conflicts more inhumane?
