Israel used secretive AI program called 'Lavender' to identify thousands of bombing targets: report
Israeli military AI program reportedly identifies bombing targets in Gaza with 10% error rate, leading to civilian casualties and White House investigation.
- Bilawal Riaz

An Israeli military artificial intelligence program named “Lavender” was reportedly used to identify bombing targets in Gaza despite an estimated 10% error rate. The system allegedly flagged some 37,000 Palestinians as potential militants, including low-level Hamas members who would not usually be targeted. Sources said civilian casualties were reportedly deemed acceptable, with up to 20 civilians permitted to be killed for each junior operative targeted. The Israeli military denied accusations of intentionally targeting civilians, stating that “Lavender” is a database used for cross-referencing intelligence sources, not for selecting targets. The White House said it is looking into the claims, while Palestinian casualties in the conflict have been high, with many families suffering multiple losses.