Center on Capital & Social Equity

Exploring economic inequality – Advocating for the bottom 50%

Amazon Suspends Palestinian Engineer Over Protest Against $1.2B Israel Contract – WebProNews

“Amazon suspended Palestinian engineer Ahmed Shahrour for protesting the $1.2 billion Project Nimbus cloud contract with Israel, citing workplace conduct violations amid his claims of enabling genocide in Gaza. This reignites debates on tech ethics, employee speech, and corporate ties to geopolitics, potentially inspiring more activism.”

So, what might motivate well-paid US tech workers to sacrifice their jobs and careers to protest their employers’ contracts with Israel’s military? Do they know details we don’t? The articles below provide windows into how US firms (and taxpayers) supply AI and other tools that help Israel target Palestinians with little or no consideration of collateral damage, including the killing of women and children.

Palantir allegedly enables Israel’s AI targeting in Gaza, raising concerns over war crimes – Business & Human Rights Resource Centre

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza – +972

“The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.”

“The following investigation is organized according to the six chronological stages of the Israeli army’s highly automated target production in the early weeks of the Gaza war. First, we explain the Lavender machine itself, which marked tens of thousands of Palestinians using AI. Second, we reveal the ‘Where’s Daddy?’ system, which tracked these targets and signaled to the army when they entered their family homes. Third, we describe how ‘dumb’ bombs were chosen to strike these homes. 

“Fourth, we explain how the army loosened the permitted number of civilians who could be killed during the bombing of a target. Fifth, we note how automated software inaccurately calculated the amount of non-combatants in each household. And sixth, we show how on several occasions, when a home was struck, usually at night, the individual target was sometimes not inside at all, because military officers did not verify the information in real time…”
