Machine Learning and the Carnage in Gaza

The Israeli army is using an AI-assisted targeting system called Lavender in Gaza. Are we really willing to entrust an algorithm with the lives and deaths of human beings?

Are we witnessing the first AI-assisted genocide? Reports that Israel is using untested AI systems to direct deadly air strikes with a lenient civilian casualty margin may seem beyond imagination, but make no mistake: Machine Learning is the future of war.

The Israeli-Palestinian publication +972 Magazine and Hebrew-language media outlet Local Call reported recently that the Israeli army was isolating and identifying thousands of Palestinians as potential bombing targets using an AI-assisted targeting system called Lavender.

Reports suggest that the Lavender system created a kill list of suspected Hamas and militia operatives and associates, and that these individuals were often targeted in their family homes, with sources saying that 15 to 20 civilian casualties were deemed permissible for a junior-level target, and upwards of 100 for a senior commander.

For lower-level targets, so-called dumb bombs were reportedly used; these unguided munitions typically pose a greater threat to civilians, particularly in regions as densely populated as Gaza. US intelligence reports from December indicated that 40 to 45 per cent of the air-to-ground munitions launched by Israel were dumb bombs.

People inspect the site where World Central Kitchen workers were killed in Deir al-Balah, Gaza Strip, Tuesday, April 2, 2024. Photo: Associated Press / Alamy

The new wave of AI advances aligns with three primary military uses: autonomous weapons systems, cyber warfare, and support systems that underpin military decision-making.

The danger is that, as with many other fields related to AI, there is no established ethical framework or legal structure. Militaries are investing heavily in AI technology, but without a human-centred approach, clear regulations, and adherence to international human rights law, the proliferation and growing autonomy of Machine Learning in war is incredibly dangerous.

Drones have been deployed in combat for years: between 2009 and 2017, the number of American soldiers on the battlefield decreased by 90 per cent while the number of US drone strikes increased tenfold. Today, US, Russian, Israeli, Chinese, Iranian, and Turkish drones are flying attack missions in the Middle East, across the African continent, in Southeast Asia, and in Europe.

The advent of Machine Learning and innovation in the field of AI have paved the way for fully autonomous weapons, the use of which has already been documented in Ukraine. The Saker Scout, an autonomous drone, can identify 64 different types of Russian ‘military objects’ and launch an attack without human oversight.

Autonomous weapons systems that enable advanced AI systems to execute deadly strikes pose an unprecedented threat and demand critical scrutiny around compliance with international human rights and ethical standards. Are we really willing to entrust an algorithm with the lives and deaths of human beings?

The Lavender software system reportedly used by the Israeli army analyses information collected on most of the 2.3 million residents of the Gaza Strip through a system of mass surveillance, then assesses and ranks each person based on the extent of their perceived involvement in the military wing of Hamas or other terrorist militias. According to sources, the machine gives almost every person in Gaza a rating from 1 to 100, expressing how likely it is that they are a militant.

Lavender uses Machine Learning technology to identify characteristics of known Hamas and militia operatives, whose information was fed to the machine as training data, and then to locate these same characteristics among the general population, the sources explained. An individual found to have several different incriminating features automatically becomes a potential target for assassination, along with anyone else deemed expendable who happens to be caught within the accepted casualty radius. The reports have been widely condemned, with UN Secretary-General Antonio Guterres expressing serious concern last week.
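None of Lavender’s code has been published, so any illustration can only gesture at the class of technique the sources describe. The sketch below is a generic supervised-learning pipeline in Python, with every feature and data point invented: a classifier is trained on labelled examples, its output probability is rescaled to a 1-to-100 rating, and a fixed threshold turns those ratings into a flagged list.

```python
# Illustrative sketch only: Lavender's actual code and features have never
# been published. This is the generic supervised-learning pattern the
# reports describe, with entirely invented data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=0)

# Hypothetical feature vectors for individuals whose status is "known"
# (the training data the sources mention), labelled 1 for operative.
X_train = rng.normal(size=(1000, 8))
y_train = rng.integers(0, 2, size=1000)

model = LogisticRegression().fit(X_train, y_train)

# Score the wider population: the model's probability, rescaled to the
# reported 1-to-100 rating of how likely each person is to be a militant.
X_population = rng.normal(size=(100_000, 8))
ratings = 1 + 99 * model.predict_proba(X_population)[:, 1]

# A single threshold converts opaque ratings into a target list; every
# misclassification above this line is a human being.
THRESHOLD = 90
flagged = np.flatnonzero(ratings >= THRESHOLD)
print(f"{flagged.size} of {ratings.size} people flagged")
```

What the sketch makes plain is how little machinery is involved: the system’s entire moral weight rests on the quality of its training labels and on where the threshold is drawn.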

Israel’s war offensive in Gaza has become one of the deadliest conflicts of the 21st century: between 7 October 2023 and 7 April 2024, at least 33,175 Palestinians were killed and 75,886 were injured. According to the Ministry of Health in Gaza, the fatalities include over 14,000 children and 9,220 women. In the first 100 days alone, Israel dropped eight times the total number of bombs the United States dropped on Iraq over a six-year period. 196 aid workers have been killed, along with 105 journalists. Foreign media continues to be denied access to the besieged enclave.

Francesca Albanese, the UN Special Rapporteur on human rights in the occupied Palestinian territories, says she believes Israel has committed “acts of genocide” in Gaza. In January, the International Court of Justice (ICJ) found it plausible that Israel’s acts could amount to genocide, ordering six interim measures that Israel must take to prevent it. Since the ICJ ruling, a further 6,918 Palestinians have been killed, and famine has set in, with Palestinians now accounting for some 80 per cent of those facing famine worldwide.

Could the use of untested AI technology with minimal human oversight be contributing to the scale of death and destruction in Gaza? And if so, who bears responsibility? As with most bleeding-edge technology, AI was always destined for the battlefield, but according to these reports the cold reasoning of Machine Learning has been promoted rapidly to degrees of autonomy historically reserved for the darkest dystopian horrors science fiction writers could imagine.

Rapidly advancing technology weaponised in this way sets a dangerous precedent. It is an established principle within the laws of war that civilians are to be protected. Relying on a computer-based system to identify civilians and combatants without human analysis of the raw data, and with pre-programmed allowances for civilian casualties, could constitute a breach of international humanitarian law.

The US Department of Defense has released a series of policies on military AI, most recently the Data, Analytics, and AI Adoption Strategy published in November 2023. The strategy includes a series of user guidelines, among them ensuring that military AI systems are auditable and that high-consequence applications undergo senior-level review. Whilst this work is essential in starting the necessary conversations around regulation, the guidelines are non-binding and unenforceable. The international community must work at pace to create an enforceable international framework that limits the risks of Machine Learning in warfare.

The Lavender system, on the surface, could be considered a decision support system: one of any number of computerised tools that may use AI-based software to produce analysis informing military decision-making. These systems collate evidence and combine data sources to identify people or objects, make recommendations for military operations, or predict future situations.
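To make that distinction concrete, what defines a support system is that its output is inert until a human has examined the evidence behind it. The sketch below uses an entirely hypothetical interface, invented for illustration, to show the property that matters: a recommendation carries its evidence with it and cannot be acted upon without a named reviewer’s sign-off.

```python
# Hypothetical interface, invented for illustration: the defining property
# of a decision *support* system is that its recommendations do nothing
# until a named human has reviewed the evidence and signed off.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Recommendation:
    subject: str
    rating: float                # the model's output, e.g. a 1-100 score
    evidence: list[str]          # raw data a human reviewer must examine
    approved_by: str | None = None

    def approve(self, reviewer: str) -> None:
        # Accountability is recorded against a person, not a machine.
        self.approved_by = reviewer

    def is_actionable(self) -> bool:
        # Without explicit human sign-off, the recommendation stays inert.
        return self.approved_by is not None
```

The reported failure mode is precisely the collapse of this safeguard: when recommendations are treated as actionable by default, the support system has quietly become the decision-maker.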

Where the use of Lavender has veered into alarming territory is that these recommendations are seemingly being accepted as orders. The damage of empowering machines to make life-and-death decisions may only become apparent in the aftermath of Israel’s assault on Gaza, by which point the threat to humanity may already be too grave to mitigate. 

