Israel used secretive AI program called 'Lavender' to identify thousands of bombing targets: report

An Israeli military AI program reportedly identified bombing targets in Gaza with a 10% error rate, leading to civilian casualties and a White House investigation.

  • Bilawal Riaz
  • 1 min read

An Israeli military artificial intelligence program named “Lavender” was reportedly used to identify bombing targets in Gaza despite a 10% error rate. The system allegedly flagged 37,000 people as suspected militants, including low-level Hamas members not usually targeted. Sources said civilian casualties were accepted in advance, with up to 20 civilians permitted to be killed for each junior operative struck. The Israeli military denied accusations of intentionally targeting civilians, stating that “Lavender” is a database used for cross-referencing intelligence sources. The White House is investigating the claims. Palestinian casualties in the conflict have been high, with many families experiencing multiple losses.
