
‘Keir Starmer Wants Police to Expand Use of Facial Recognition Technology Across UK – He Should Ban it Altogether’

The PM announced plans for ‘wider deployment’ after the riots – but it is a ‘deeply flawed’ technology ‘technically and legally, and impacts entire communities’

The Metropolitan Police using live facial recognition technology in Croydon, south London in February 2024. Photo: PA Images / Alamy


Using the United Kingdom’s racist riots as an excuse to expand law enforcement’s use of face recognition technology is a rash mistake that will only exacerbate harm against the immigrant communities already under threat.

The rioting, sparked by online disinformation, led Prime Minister Keir Starmer to announce on 1 August that he will establish “a national capability across police forces to tackle violent disorder,” including a “wider deployment of facial recognition technology”.

Finding perpetrators of violent acts of racism is a compelling cause. But police face recognition is the wrong answer. It is deeply flawed, technically and legally, and impacts entire communities rather than focusing on criminal suspects.

A police car is set on fire as far-right activists hold an ‘Enough is Enough’ protest on 2 August 2024 in Sunderland. Photo: Drik/Getty Images

Both the means of deployment and the technology at its core have a history of misidentifying people of colour, as well as failing to correctly identify trans and non-binary people. Even if the technology somehow achieved 100% accuracy overnight, it would still be an unacceptable tool of invasive mass surveillance, capable of identifying and tracking people on a gigantic scale.

Police forces across the UK already use live face recognition (LFR). Since the first known deployment in 2015, police have used it to attempt to match people's faces against databases of photographs, including suspect photos, as they walk down the street or attend a football match.

It has an egregious history of being deployed in areas where the population is predominantly composed of people of colour, such as at Notting Hill Carnival, as well as in highly populated areas like Oxford Street, London.


Facial surveillance technology allows police to track people not only after the fact but also in real time, including at lawful political protests. Its normalisation and widespread use by the Government would fundamentally change the society in which we live. It would, for example, deter people from exercising their rights to free speech, peaceful assembly, and expressive association.

And this burden historically falls disproportionately on communities of colour, immigrants, religious minorities, and other marginalised groups—all of whom were the targets of the racist riots across the country earlier this month.

Police in the US recently combined this dystopian technology with another to create an even deeper affront to civil liberties.

A police force in California took a DNA sample from a crime scene, ran it through a service that guesses what the perpetrator’s face looked like, and plugged this rendered image into face recognition software to build a suspect list. 


Scientists have made clear that accurately predicting a person's face, particularly from a DNA sample, is not possible. So not only is the artificial face a guess, but face recognition, a technology known to misidentify real people, will then produce a "most likely match" for that made-up face. It's a slippery slope toward a police free-for-all.

The expanding roll-out of this dangerous technology has evaded Parliament’s scrutiny. Police forces have unilaterally decided to use LFR, without adopting sufficient safeguards.

In 2022, the UK Government rejected a House of Lords report calling for the introduction of regulations and mandatory training to counter the negative impact that the current deployment of surveillance technologies has on human rights and the rule of law. The evidence that the rules around face recognition need to change is there; many are simply unwilling to see it or do anything about it.


Doubling down on face recognition technology in a moment of crisis will only exacerbate tensions and abuses. Government and police use of facial recognition crosses a bright red line, and we should not normalise its use, even during a national tragedy.

Starmer should not only refrain from expanding the use of this technology at an already precarious time; he should ban Government use of face recognition in the UK altogether.


Paige Collings (@CollingsPaige) is Senior Speech and Privacy Activist at the Electronic Frontier Foundation, a nonprofit digital civil liberties organization. Her work focuses on how marginalized communities are stifled by state surveillance and corporate restrictions.

