How Israel Used AI to Form Kill Lists & Bomb Palestinians in Their Homes

Israeli military's AI system "Lavender" was used to create a kill list of Palestinians in Gaza, leading to extensive casualties and international law concerns.

  • Bilawal Riaz
  • 1 min read

The Israeli military has been revealed to have used an AI system called “Lavender” to generate a “kill list” of Palestinians in Gaza. The system identified targets with little human oversight, marking thousands of individuals. A second AI system, “Where’s Daddy?”, tracked Palestinian men on the list and often triggered bombings of their homes. The military’s use of these systems resulted in massive casualties, including civilians, and raised concerns over violations of international law. The Israeli military denied using AI to identify targets, but testimony from sources contradicts this claim. The systems feed targeting data to various weapons, including drones, for carrying out airstrikes. The investigative report highlights the permissive nature of Israeli military targeting policies and the need for accountability.

Original link



Recommended for you

'AI-assisted genocide': Israel reportedly used database for Gaza kill lists


Israeli military uses AI system to identify bombing targets in Gaza, sparking concerns over potential war crimes.

'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets


Israeli military used AI database in Gaza bombing campaign, causing civilian casualties.