The Meta Narrative: How Zuckerberg Just Won’t Listen
The Facebook boss’ response to damning revelations about the social media platform has been to go full Orwellian, says Kyle Taylor
You would have been forgiven this week for wondering why, just days after damning new reports were jointly released by more than a dozen news organisations around the world, Facebook’s CEO Mark Zuckerberg gave a big presentation announcing, without an ounce of irony, that the social media company had changed its name to Meta.
The tone-deaf nature of this response was truly spectacular. Meta, even.
Just last week, the world was talking about how this had to be the beginning of the end for Facebook. A new series of stories – upgraded from the ‘Facebook Files’ to the ‘Facebook Papers’ – has been framed as the first in what may very well be months of increasingly shocking revelations based entirely on Facebook’s own internal documents.
Their depth and specificity tell us that the company’s actions – and, more importantly, its inactions – have caused more harm in the world than we could have imagined. More pointedly, they confirm that the company knew about this harm and still chose not to act.
This shifts the conversation from negligence to intent.
Some of the revelations have included how:
- In its internal algorithmic ranking system for engagement, Facebook gave the ‘angry’ reaction five times the weight of a ‘like’, prioritising rage over happiness for profit.
- Facebook knew that paid adverts could be exploited by politicians in part because they are exempt from fact-checking. Its response? “Accept the risk”.
- The company conducted its own internal study in which it created a fake account that ‘liked’ Donald Trump and Fox News. Within two days, it was receiving recommendations to join QAnon conspiracy theory groups.
The extent of what has been revealed gets worse:
- We learned that the firm had been aware since at least 2018 that human trafficking was a regular occurrence on Facebook, with the platform being used to enable “all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via real-world networks”. One estimate was that 24 people were trafficked every week on Facebook.
- During the Coronavirus pandemic, Mark Zuckerberg was presented with research showing that the company could reduce disinformation about the crisis by 38%. He declined because it would reduce engagement and, by extension, Facebook’s ad revenue.
What the documents also show is that Zuckerberg’s public statements and congressional testimonies are in direct conflict with internal documents and findings, suggesting that he may have misled or even lied to governments.
The release of this information was followed by testimony to the UK Online Safety Bill Pre-Legislative Scrutiny Committee from Frances Haugen, the whistleblower who brought the documents into the public domain.
Speaking to parliamentarians, she confirmed that the company prioritises profit above all else, and that it is effectively a slave to one metric alone: “meaningful social interactions” – tech speak for engagements such as likes, shares and comments. The more engagements, the more time users spend on the platform, the more money Facebook makes in ad revenue.
The problem is that engagement is content-neutral – the algorithm doesn’t care what moves through the platform, whether it be hate speech, COVID-19 disinformation, or the trafficking of humans. Its aim is to keep people on Facebook for as long as possible, which inevitably favours the most extreme content.
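To make that mechanic concrete, here is a minimal illustrative sketch in Python of an engagement-weighted ranking. Only the five-to-one weighting of ‘angry’ reactions against ‘likes’ comes from the reporting above; the Post structure, the comment and share weights, and the scoring function are all hypothetical, invented purely to show how a content-blind metric rewards whatever provokes the strongest reaction.

```python
# Illustrative sketch only - not Facebook's actual code. The 5x weight for
# 'angry' relative to 'like' is from the reporting above; every other name
# and number here is hypothetical.
from dataclasses import dataclass

REACTION_WEIGHTS = {
    "like": 1.0,      # baseline
    "angry": 5.0,     # five times a 'like', per the reporting above
    "comment": 15.0,  # hypothetical weight, for illustration
    "share": 30.0,    # hypothetical weight, for illustration
}

@dataclass
class Post:
    text: str
    reactions: dict[str, int]  # reaction name -> count

def engagement_score(post: Post) -> float:
    """Content-blind score: only the volume and type of interactions
    count, never what the post actually says."""
    return sum(REACTION_WEIGHTS.get(name, 0.0) * count
               for name, count in post.reactions.items())

# Two posts with identical interaction counts: the one provoking anger
# outranks the benign one five to one.
calm = Post("Local bake sale raises funds", {"like": 100})
rage = Post("Inflammatory conspiracy claim", {"angry": 100})

for post in sorted([calm, rage], key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>7.1f}  {post.text}")
```

The point of the toy example is that the ranking function never reads the words: given the same number of interactions, the post that enrages scores five times higher than the post that pleases.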
Haugen went on to say that she was concerned that ads are, at present, exempt from the scope of the UK’s Online Safety Bill, effectively meaning that companies can pay to cause harm. It would be a “grave danger to society and democracies around the world to omit societal harm” from the bill, she observed.
It is fortunate that Frances Haugen was able to give evidence to Parliament in person, as the UK has perhaps the most digitally literate elected representatives of any democracy. Well versed in their brief, the MPs and peers on the committee probed the details and were clearly committed to finding genuine solutions.
The committee brought similar depth and rigour to Facebook’s own testimony yesterday. Unfortunately, Antigone Davis, Facebook’s head of safety, did not, admitting that she had not even read the Online Safety Bill despite giving evidence to the committee about it. She said she would “have to get back to them” more than half a dozen times and was unable to explain how reporting works within the company.
Meanwhile, Mark Zuckerberg’s reaction has been to go full Orwellian and pretend none of it happened. On Thursday, he released a video announcing that the company would be renamed Meta, declaring that it has a “new North Star”: building the metaverse – a virtual reality world in which Facebook is presumably not destabilising governments, facilitating genocides and undermining global health efforts. A virtual reality in which Facebook isn’t terrible. Too late, Mark – game over.
Or is it? This speaks volumes about how untouchable Zuckerberg believes his company to be, despite endless scandal and controversy.
With the United States in political turmoil despite the election of Joe Biden, a hard Brexit causing chaos in the UK, and global climate emergency action seemingly stalled as the planet burns, can something really be done to rein in Facebook? Or is it simply already too powerful? It seems the world will keep paying the price while we find out.
Kyle Taylor is the project director of the Real Facebook Oversight Board and director of Fair Vote UK, which was central to uncovering the Cambridge Analytica and Vote Leave scandals. His guide, ‘The Little Black Book of Data and Democracy’, is published by Byline Books.