
The Hypocrisy of the AI Summit: The UK is Doing Nothing About Political Misinformation

The Government rejected a cross party Lords recommendation that political advertising should be regulated according to factual accuracy. The former chair of the Select Committee wonders why

David Puttnam. Photo: Miki Barlok


Millions – perhaps billions – of words have been written about AI. From those, I think it’s possible to distil at least a few truths that should matter to us:

  1. Whether AI will help us do good or bad things depends on whose finger is on the trigger.
  2. If that finger belongs to a politician in campaigning mode, the outcome is unlikely to be good, because we will frequently be lied to – not always, but often enough to make us wary of most statements. Whether those are AI lies or just regular, conventional lies, this should really trouble us.
  3. In political advertising at least, there’s a simple solution (driven, as it happens, by AI), but because that solution requires the cooperation of the perpetrators, making it happen is far from simple.

Sky News suggests that the agenda for the AI Safety Summit at Bletchley Park acknowledges that “there are wider societal risks when it comes to AI, such as misinformation, bias and discrimination and the potential for automation. But these problems will not be discussed at the summit in depth, because the government feels they are already being addressed both nationally and internationally and doesn’t want ‘duplicative’ efforts.”


The notion that misinformation is being dealt with nationally and internationally is simply not tenable, and electoral disinformation in particular should be regarded as the most immediate AI threat. OpenAI CEO Sam Altman has urged its regulation, saying “it is one of my areas of greatest concern”, and Professor Michael Wooldridge, a director at the UK’s Alan Turing Institute, has said AI-powered disinformation is his principal concern about the technology: “Right now … it is number one on the list.”

So we arrive at the nub of the issue: the UK government is rightly concerned to address the threats posed by AI. Yet UK politicians and others have resisted any short- or medium-term plan to address political misinformation.

The Online Safety Act, for example, does not deal with misinformation, and political advertising is specifically exempt from the DCMS Online Advertising Programme. This position has either been engineered, or is simply wonderfully convenient, for the omnipresent pedlars of political misinformation. Politically and socially, nothing in the current AI scenario could be more important – yet nothing is being done about it.

This government, and the other political parties that ‘benefit’ from avoiding factual accuracy and a truth-based democracy, are looking to the private sector to build in safeguards.

There is a form of hypocrisy in this which is hard to fathom.


So, what should be happening after Bletchley Park? 

In June 2020, a House of Lords cross-party committee recommended that the factual content of political advertising should be regulated. The government rejected the proposal, largely on the grounds that it would have “a chilling effect on free speech” – this despite the proposed regulation being specifically confined to factual accuracy.

Most political advertising comes from relatively few sources and is distributed on relatively few channels. Upon arrival at, for example, Facebook, an ad is allocated an ID number which identifies its source, AI-generated or otherwise. More importantly, the ID allows the tracking of where the ad is placed, so that if its content transgresses the regulatory guidelines it can be taken down rapidly – or blocked even earlier by (ironically) an AI ‘reading’ of the ad.
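The workflow described above – allocate an ID on submission, screen the content, track placements, and use the ID for rapid takedown – can be sketched in a few lines. This is purely illustrative: the names (`AdRegistry`, `check_accuracy` and so on) are hypothetical and do not correspond to Facebook’s or any platform’s real API, and the accuracy check is a stand-in for whatever automated or human review a regulatory code would require.

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class Ad:
    source: str    # advertiser or party that submitted the ad
    content: str   # the claim(s) the ad makes
    ad_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    placements: list = field(default_factory=list)

def check_accuracy(content: str) -> bool:
    """Stand-in for a factual-accuracy check against a regulatory code.
    Here we simply flag one known false claim for demonstration."""
    known_false = {"crime has halved"}
    return content.lower() not in known_false

class AdRegistry:
    def __init__(self):
        self.ads = {}

    def submit(self, ad):
        # Block at submission time if the ad fails the accuracy check
        if not check_accuracy(ad.content):
            return None
        self.ads[ad.ad_id] = ad
        return ad.ad_id

    def place(self, ad_id, channel):
        # Record every channel where the ad appears, keyed by its ID
        self.ads[ad_id].placements.append(channel)

    def take_down(self, ad_id):
        # The ID lets every placement be located and removed at once
        return self.ads.pop(ad_id).placements
```

The point of the sketch is that the ID, assigned once at the point of entry, is what makes both pre-emptive blocking and rapid post-publication takedown tractable.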

Reform Political Advertising has been proposing this idea for the past four years, and while progress has been made, the simple solution of a code requiring factual accuracy has yet to happen. Hopefully, the noise currently surrounding AI will prompt a reconsideration, or at least a renewed discussion, of the present untenable position. In the absence of any regulation of factual content, the electorate will, not unreasonably, come to distrust everything.

The UK government’s ‘pro-innovation’ approach would be more convincing were it to recognise that AI’s most immediate threat is to democracy itself.

David Puttnam is a former Labour peer in the House of Lords. He was Chair of the chamber’s Select Committee on Democracy and Digital Technologies, which published its Digital Technology & the Resurrection of Trust report in June 2020
