
Police Force Faces Backlash Over Facial Recognition Technology Linked to Surveillance in Gaza

Officers have been using technology that Israel has reportedly deployed during its operations in Gaza and the West Bank

A police live face recognition vehicle parked in the square outside the Victoria Shopping Centre at the north end of Southend High Street in September 2024. Photo: D C Rayment / Alamy


Facial recognition technology used by Essex Police has been developed by Corsight AI, a company whose technology has reportedly been used in controversial Israeli surveillance operations in Gaza and the West Bank, Byline Times has learnt.

Critics argue that this raises serious ethical questions about the adoption of tools with links to alleged human rights violations in conflict zones. Israel stands accused of genocide in its ongoing war in Gaza, where more than 50,000 Palestinians have been reported killed since 7 October 2023, when Hamas launched an attack on Israeli civilians and military personnel.

The police surveillance technology, rolled out in partnership with Digital Barriers, has been used by Essex Police to identify individuals in real-time during public events and in retrospective investigations.


British police officials have praised the system’s success, pointing to several recent arrests, including three individuals apprehended at the Clacton Airshow in August 2024 and two more in Southend shortly afterwards, in cases involving sexual assault and common assault.

In an Essex Police press release, Assistant Chief Constable Andy Pritchard called it a “forward-thinking” approach to policing, claiming it helps identify suspects, enforce court orders, and protect vulnerable individuals.

However, an examination of the technology’s origins — a connection first made in the Israeli media outlet Shomrim — reveals troubling associations. Corsight AI, an Israeli company headquartered in Tel Aviv, has faced criticism for supplying facial recognition tools reportedly used by Israeli military intelligence to conduct mass surveillance of Palestinians in Gaza.

The technology has reportedly been deployed to identify individuals at military checkpoints and in other high-security areas, with Israeli forces allegedly using it to catalogue and track civilians. Reports have also indicated instances where the technology misidentified individuals, leading to wrongful detentions.


A History of Controversy in Gaza

Israeli intelligence officers reportedly used the system in Gaza, alongside tools such as Google Photos, to identify Hamas operatives and locate Israeli hostages. 

Human rights groups have highlighted significant flaws in the system, including false positives and cases where civilians were mistakenly flagged as militants. Amnesty International has previously described the use of such technology in Gaza as contributing to “automated apartheid” and warned that its adoption by law enforcement agencies elsewhere risks normalising oppressive surveillance practices.

In Gaza, facial recognition technology was reportedly used to scan individuals at checkpoints, identify people in drone footage, and match faces to databases compiled without the subjects’ knowledge or consent.


Palestinians travelling along major roads or attempting to flee areas of heavy fighting were often subjected to facial scans. Some were detained without clear justification, on the basis of vague intelligence or broad criteria, according to Israeli military officials who spoke anonymously to international media outlets.

One such case involved Palestinian poet Mosab Abu Toha, who was detained after being scanned by a facial recognition camera at a checkpoint. Abu Toha, who has no known links to Hamas, was subjected to lengthy interrogation and released only after an international campaign.

Speaking about his experience to the New York Times, Abu Toha said: “I did not know Israel was capturing or recording my face.” He continued: “They have been watching us for years from the sky with their drones. I feel like I have been watched for so long.”


Concerns Over Ethical Policing

The use of Corsight AI’s facial recognition technology by Essex Police has drawn criticism from human rights campaigners who question the ethics of adopting tools linked to military operations in occupied territories.

Critics argue that the deployment of such systems in the UK could lead to an erosion of public trust in policing and raise concerns about the normalisation of surveillance practices that are widely condemned on the international stage.

Rasha Abdul Rahim, an independent expert on technology, human rights and social justice, and the former Director of Amnesty Tech, told Byline Times that “there is a widespread body of research that shows that invasive facial recognition technology amplifies racist and discriminatory law enforcement against racialised communities, including stop-and-search practices which disproportionately affect Black and brown people.”

If law enforcement agencies are deploying such technologies they need to explain why they’re necessary in the first place, for what legitimate aim and they should demonstrate whether less intrusive technologies could achieve the same aim

Rasha Abdul Rahim, technology expert

Abdul Rahim also said: “It’s utterly shameful for the UK to be doing business with an Israeli company whose state is credibly accused of genocide, and is also using the very tools that have been tested and reportedly used on Palestinians to surveil and control them. What this shows is that societies around the world are not immune from abusive technologies designed to oppress.”


Essex Police Defend Deployment

Essex Police have defended their use of the technology, emphasising its value in keeping communities safe.

“The procurement of Live Facial Recognition technology has been through a rigorous and competitive process which included data protection and security considerations,” the force said.

Tony Porter, Chief Privacy Officer at Corsight AI and a former UK Surveillance Camera Commissioner, has also publicly defended the technology, claiming it adheres to “the highest standards of fair use and transparency”.


The Broader Debate on Facial Recognition

The debate over facial recognition technology is not new, but its application by Essex Police brings fresh urgency to questions about its ethical and legal limits.

In 2022, the UK’s Information Commissioner’s Office raised concerns about the potential for bias and discrimination in facial recognition systems, particularly those developed with limited oversight or transparency.


According to a Big Brother Watch report from December 2024, seven police forces in England and Wales currently use live facial recognition, while others have deployed it for large events or on a trial basis. It was first used in 2015 at the Download Festival. It is not known whether other UK forces use technology developed by Corsight AI.

Following the UK riots — between 30 July and 7 August 2024 — Prime Minister Keir Starmer announced that he planned to establish “a national capability across police forces to tackle violent disorder,” including a “wider deployment of facial recognition technology”.

With Essex Police at the centre of this latest controversy, the public and policymakers are left grappling with a critical question: how do we balance the potential benefits of cutting-edge technology with the imperative to protect individual rights and to uphold ethical standards when procuring systems developed for use in highly controversial conflict zones?

