
Fact Checkers Slam Government Inaction on Political Deepfakes Ahead of General Election, Saying Laws ‘Not Fit for Purpose’

Full Fact has called on the Government to clarify how mis- and disinformation will be challenged in this pivotal election year

Labour leader Keir Starmer has been the victim of deepfake audio that implied he had been caught abusing his staff. Photo: PA Images / Alamy


A leading fact-checking organisation has criticised the Government for failing to take action on political deepfakes.

The Government announced new legislation on April 16 strengthening the law on sexual deepfakes. But ministers have rejected proposals to outlaw political deepfakes – where AI-generated material makes it appear that politicians are saying or doing things they haven’t. This is despite several high-profile cases affecting Labour leader Keir Starmer and London Mayor Sadiq Khan in recent months.

Three amendments to the Data Protection and Digital Information Bill were recently tabled on deepfakes, including one from Labour on the “offence of creating or sharing political deepfakes”. The Government did not back it, meaning it failed to pass. 

In October, an X account posted what appeared to be audio of Starmer abusing party staff. Shortly after, fake audio was released of Khan seemingly disrespecting Remembrance commemorations, which he said could have caused “serious disorder”. At the time, Khan said laws governing deepfakes were not “fit for purpose”.


Full Fact CEO Chris Morris told Byline Times: “The Government’s failed promise to tackle information threats has left us facing down transformative digital challenges with an analogue toolkit. An Online Safety Act with just two explicit areas of reference to misinformation cannot be considered fit for purpose in the age of AI.”

The anti-misinformation group told Byline Times that the combination of “significant gaps in the Online Safety Act and the transformative power of widely available generative AI tools” means the next Government will need to fundamentally overhaul the legal framework to “take the fight to bad information,” particularly where it is affecting public health or is generated by artificial intelligence.

Meanwhile, media regulator Ofcom has said its new Advisory Committee on misinformation – meant to be set up following the Online Safety Act’s passing last year – will now not be ready until the end of 2024. That means it may only be launched after what is likely to be a polarising election, leaving voters less protected from AI deepfakes.

“We know of no reason why it could not be established sooner,” the Full Fact spokesperson told Byline Times.

The Minister for Tech and the Digital Economy (DSIT), Saqib Bhatti, told Parliament in November that the department would work “closely with social media platforms to ensure that the right systems are in place to identify and remove harmful material, including deepfakes, where it breaches platform terms of service”. Full Fact has described the Government’s approach as one of “unnecessary secrecy”.


The group has called on ministers to clarify explicitly how misinformation and disinformation will be challenged during and around the election.

Full Fact’s comments come months after Byline Times revealed concerns from AI experts that the next General Election – due to take place between December and January – faces a major threat of being swayed by deepfakes that the election watchdog has no powers to stop.

“Our recent report argued that, ‘By mid-2024, in time for an autumn election, [Ofcom should] set up the Advisory Committee on Disinformation and Misinformation which draws on expertise across the field in order to effectively monitor and prioritise emerging and existing harms’,” the Full Fact spokesperson added.


In our November report, Byline Times explained how the law is unclear about whether deepfake videos and audio of political figures are illegal, though such material could fall under malicious communications measures in the Online Safety Act. However, the Met Police quickly dropped an investigation into the Khan deepfake, saying it did not constitute a criminal offence.

Experts worry that AI-generated fake videos of politicians could go viral before the next election, with the cost of producing convincing content now close to nothing. Many in the field believe hostile state actors will use emerging technology to try to sway voters and sow disruption.


Since November 2023, campaigners have been required to include an “imprint” stating who published certain political campaign materials online. But they are not required to disclose whether content is AI-generated, and the Electoral Commission cannot sanction or take down misinformation. Ofcom has more powers, but is hampered by the law’s lack of clarity on AI-generated political misinformation and deepfakes.

A spokesperson for the Electoral Commission told Byline Times at the time: “We don’t have a remit on deepfakes or the content of campaign material, as we’re responsible for regulating party and campaigner finance as well as compliance with the digital imprint requirement.” 

The Commission called on the Government to bolster the powers of UK regulators, “so they are equipped to deal with future challenges.” It wants to be able to obtain information from social media and technology companies and online payment providers to identify who is behind content.

Additional reporting by Steve Hopkins


