
The Ugly Face of War: How Technology is Set to Change the Battlefield

Iain Overton explores how facial recognition technology is being applied to military conflict

A military drone. Photo: vadimmmus/Alamy


The horror of a future war is often revealed incrementally.

Last week, it was reported that Clearview AI, an American technology company, had won a contract to provide the US Air Force with glasses equipped with facial recognition systems.

The $50,000 contract promises to help protect airfields by, it might be assumed, allowing guards to get alerts if a ‘hostile face’ appears in their bespectacled field of view.

Clearview, backed by Facebook and Palantir investor Peter Thiel, is on a mission to help America’s military identify potential enemies. No doubt, images of America’s most wanted terrorists – such as Ibrahim Salih Mohammed Al-Yacoub or Mohammed Ali Hamadei – have already been uploaded to its facial recognition servers.

It is a company well-placed to do this, having harvested more than three billion images of people’s faces from social media. It is so good at identifying people that it even has a facial recognition app that lets a person take a picture of someone, upload it and then see all the photos of them currently on the web, with links to where these photos are posted. It seems, horribly, like a stalker’s paradise.

But why should we be concerned about the world’s most powerful military getting into facial recognition defence? After all, police departments in the United States have been using facial recognition tools for almost two decades, cross-checking images such as mug shots against driver’s licences.

The reason is that what Clearview does is far superior.

It operates a neural net that converts the image of a face into what is best described as a mathematical formula. Vectors are derived from a face’s geometry – how long the nose is, how wide the mouth – and images with similar geometry are grouped together. So, when a new photo is uploaded, it can be converted into this geometric code and quickly compared against the existing facial groupings. A person can be identified in an instant.
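For readers curious about the mechanics, the sketch below shows the general technique – embedding-based matching – in Python. It is illustrative only: Clearview’s actual system is proprietary, a real pipeline would use a deep neural network and an approximate-nearest-neighbour index over billions of faces, and every name and threshold here is hypothetical. A deterministic random generator stands in for the neural net.

```python
import zlib
import numpy as np

# Illustrative sketch of embedding-based face matching. Not Clearview's
# code: a real system uses a deep neural network for the embedding and
# a fast index to search billions of faces. All names are hypothetical.

def embed_face(image):
    """Stand-in for the neural net: map a face image to a fixed-length
    vector encoding its geometry (nose length, mouth width, ...)."""
    seed = zlib.crc32(image.tobytes())        # deterministic fake features
    vec = np.random.default_rng(seed).standard_normal(128)
    return vec / np.linalg.norm(vec)          # unit length, cosine-ready

def identify(query_image, gallery, threshold=0.6):
    """Convert a new photo to its geometric code, then compare it with
    stored embeddings; return the best match above the threshold."""
    q = embed_face(query_image)
    best_name, best_score = None, threshold
    for name, vec in gallery.items():
        score = float(q @ vec)                # cosine similarity
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Build a small 'gallery' of known faces, then identify a new photo.
faces = {f"person_{i}": np.full((64, 64), i, dtype=np.uint8) for i in range(3)}
gallery = {name: embed_face(img) for name, img in faces.items()}
print(identify(faces["person_1"], gallery))   # -> 'person_1'
```

The key design point is that faces are never compared pixel by pixel: each is reduced once to a short vector, so matching a new photo against a vast gallery becomes a fast geometric search.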

It sounds very exciting. But imagine a military drone equipped with such a facial recognition system, and something approaching horror emerges.

Imagine, for a moment, swarms of drones – fully-electric, solar-powered, autonomous – ever present in the skies of a conflict zone. Imagine them being used to seek out and assassinate key targets, constantly referring back to an enormous online kill list extracted from social media and other websites.

It is a dystopian vision.


Total War

But the technology needed to realise this vision is already being developed, albeit incrementally.

As Forbes has pointed out, Tel Aviv-based AnyVision filed an American patent application back in August 2019. It described how a drone could attain the best angle for its on-board facial recognition system to work, and then cross-reference the captured image against a store of faces held downstream.

In December 2021, AnyVision announced that it had entered a joint venture called SightX with the Israeli defence company Rafael. Executives reportedly said that facial recognition features were in development.

So far, so bad.

But some may say: so what? Surely it is better to have drone systems that are programmed only to blow up in the face of a pre-approved target? Given that 90% of those killed or injured when explosive weapons are used in populated areas are civilians, targeted killings sound like a better form of war.

But the road to hell is paved with good intentions. Ignore, for a moment, what the use of distanced assassination drones means for due judicial process, and all the possibilities of extra-judicial killings. Ignore, too, the fact that national intelligence agencies are well known for providing such poor intel that it is far from certain that the ‘right’ faces will be uploaded to their kill lists.

But do consider this concerning development: if the US has such a capability, then Russia and China – the countries that US intelligence chiefs have identified as America’s main threats – will also have facial recognition-equipped lethal drones, or at least they soon will.

The outcome of this could be quite profound, not least for what it does to transparency and accountability.


It does not take long to realise that, if such drones were unleashed by an enemy in the first hours of a war, they would be programmed to search for the faces of all senior military personnel. Given this threat, a natural and logical response would be wholesale armed forces anonymity.

Facebook posts would be banned. Twitter profiles deleted. Photo ops shunned. The militaries of the developed world would be forced to march quickly into the shadows.

Worryingly, this would lead to a great shutting down of transparency. In a world in which someone’s face can be weaponised, militaries necessarily have to be faceless. It will become harder and harder to hold individuals to account, as they will be shielded by a security-justified right to privacy.

The decline of accountability is the natural offspring of targeted warfare. 

If you know my face, the logic goes, you will hunt me down and kill me – so, I won’t let you know my face. But the entire basis of due legal process in liberal democracies rests on identifying the accused. And, if the accused cannot be identified, then prosecutions for matters such as war crimes or murder will be more difficult to bring. Generals will become anonymous. Commanders will wear masks.

This might also create problems of morale. The public praise of the battle hero will no longer be possible. Anonymity does not lend itself to propaganda-fuelled medal ceremonies.

This may also lead to ‘softer’ targets being acquired by the enemy. Politicians, healthcare chiefs, trauma medics, fire service personnel, police officers – their faces may be targeted. Total War might be waged because Face-Targeted War cannot be fought against soldiers wearing balaclavas.

Sadly, perhaps this is all simply inevitable. Technology’s blind impetus is already carrying us towards this logical endpoint. But what humans can make, humans can unmake – so, just as these systems are programmed to recognise our faces, perhaps we should quickly recognise that the face of this violent future is far from the defence utopia that some imagine.

