'AI-assisted genocide': Israel reportedly used database for Gaza kill lists

Israeli military uses AI system to identify bombing targets in Gaza, sparking concerns over potential war crimes.

  • Bilawal Riaz
  • 1 min read

Israeli media outlets have reported that the Israeli military used an AI-assisted system called Lavender to identify bombing targets in Gaza. The system, which reportedly has an error rate of about 10%, marked thousands of Palestinians as potential bombing targets. Human rights and technology experts have warned that, given the high number of civilian casualties in Gaza, this use of AI could amount to war crimes and may violate international humanitarian law. Israeli military officials defended the system, saying that analysts verify targets in accordance with international law.

Original link


Posted by: Bilawal Riaz

Dad by day, coder by night 🥷
