A new report by the Centre for Countering Digital Hate has found women are being targeted with misogynistic abuse on Instagram’s direct message function – something Sian Norris knows all too well

The little red alert above the Message icon tells me that someone is trying to contact me on Instagram. Usually when this happens, it’s my friend adding me to her Instagram story.

But this time, when I click through to the inbox, it’s not a message from someone I follow. It’s a Message Request, meaning it has come from a stranger – someone who has found my account and wants to connect. 

There’s that moment of pause. Do I look? What if it’s important? Or something to do with work… that’s the thought that always gets me to click. I click to open. 

It takes a moment to realise what I am looking at. Row after row after row of the vomiting emoji and the middle-finger emoji. I scroll and scroll. Someone has really put the effort in to let me know how much I make them sick. 

In the scheme of things, it’s not that bad. It’s not an obscene image or pornography – something that, thankfully, I’ve never had to endure. When I get to the text part of the message, it’s not a threat or a fantasy of violence.

But this is the problem with online abuse. We are so used to saying: at least it wasn’t x… at least it wasn’t y… always aware that the worst kind of abuse, the x and the y, the terrifying kind that leaves you shaking and sick, is always a possibility. It’s always a click away. 

We are always aware that the worst kind is out there – it’s happening right now to other women, and next time it could be happening to you. 

My Instagram account is private. It’s mostly pictures of books, and silly selfies, and places I’ve visited. But, despite being private, strangers can still reach me via direct messages, by putting in a message request. This means that, while they can’t see my posts, I am always contactable. There’s no real way of putting up a wall against online abuse. 

It’s this aspect of Instagram that is the subject of a new report by the Centre for Countering Digital Hate (CCDH) – an international not-for-profit NGO that seeks to disrupt the architecture of online hate and misinformation.

It found that Instagram fails to act on nine in 10 reports of misogyny in direct messages – and that’s just the abuse that gets reported. The Centre says this is one of the worst failure rates in dealing with abuse that it has ever encountered.


A Culture of Misogyny

Of the 8,717 direct messages sent via Instagram to five prominent women’s accounts analysed by CCDH, one in 15 broke the social network’s rules on harassment and abuse – and yet the platform only acted on 10% of reports. 

The direct messages sent to actor Amber Heard, TV personality Rachel Riley, activist Jamie Klingler, writer Bryony Gordon, and writer and influencer Sharan Dhaliwal included 125 incidents of image-based sexual abuse. This means unsolicited pornography, men sending images of their genitals, and faked pornography – where the recipient has been photoshopped into a pornographic image. 

Other forms of abuse included telling recipients to kill themselves, rape and death threats, and one-word hatred – for instance, sending one word that could be abusive but could, in other contexts, be benign. The word ‘rape’ can be sent as a one-word threat, but it can’t be banned by a platform where women may well be talking about their own experiences of sexual abuse.

Some of the messages even featured videos of men masturbating over images of the recipient. 

Countdown presenter Rachel Riley, who took part in the research, told CCDH how knowing that Instagram accounts have sent her these images via direct message “turns my stomach”. 

“It really makes me not want to go into my DMs at all because it’s revolting,” Riley said. “It’s astounding to know that strangers are sending porn – it empowers them to know that it’s gone to your inbox.”   

The report also found that cyberflashers – men who send obscene images of themselves – were often repeat offenders, with serial cyberflashers responsible for 31.2% of this form of abuse. 

The Government is set to make cyberflashing a criminal offence in its Online Safety Bill. But the Bill has been criticised by women’s rights campaigners who are concerned it is a missed opportunity to take action on violence against women and girls. 

Research by the domestic abuse charity Refuge published in 2021 found that one in three UK women have experienced online abuse or harassment on social media or another online platform – a figure which rises to 62% among young women. 

However, Refuge has little confidence the Online Safety Bill will offer the necessary protections for women and girls, not least because it fails to introduce a dedicated violence against women and girls code of practice that could offer more protection to victims of gender-based abuse. 

Refuge also raised concerns that the Bill focuses on acts that are already criminal offences, such as ‘revenge porn’, meaning it offers little in the way of new protections. 


A Chilling Effect

One of the concerning aspects of the CCDH research was that women were reluctant to take part or contribute their experiences in case doing so led to further abuse, or to them being punished by Instagram itself.

“Several women who use Instagram as a significant means to promote their personal brand or conduct commercial work expressed fears that the platform might punish them for criticism by deprioritising their posts,” the report said.

This suggests that online abuse has a chilling effect on women’s speech – both by putting women off engaging with social media, and by making them fear repercussions should they speak out about the violence committed against them online.

The highest profile woman interviewed by CCDH was actor Amber Heard, who received a large amount of abuse following her split from the film star Johnny Depp. Anyone who has ever written about the libel case he launched against her will know the depth of hatred voiced online towards Heard and anyone who defends her. 

Heard told CCDH that the failure of Instagram to act on misogynistic abuse could put women off acting in “the interests of their own safety” when it comes to speaking out against male violence.

“If I can’t utilise this tool, if I can’t open Instagram, if I can’t engage at all, then what does it say about a person who doesn’t have the emotional resources that I have, that come with age and experience?” Heard said. 
