Misogyny in all its manifestations continues to blight our world. Online misogyny frequently exhibits clear similarities with other kinds. It takes many forms: rape and death threats, stalking, sustained harassment, unwanted pornographic images, doxing (the publishing of a victim's personal information), revenge porn, false allegations, hate speech and more. It adds to the anxieties and dangers women and girls face in every environment: in the home, at work, in public places, in institutions, and online. Women from all walks of life, even girls still at school, frequently experience online misogyny, and women in the public eye are often targets. Yet the government has still not acted to make misogyny a hate crime, nor to address the harms women experience in the online world. The Online Safety Bill currently going through Parliament, as it stands, contains no specific reference to online VAWG (Violence Against Women and Girls) and makes no provision for the protection of victims.
However, a proposed amendment from a cross-party group in the House of Lords would write into law fines for social media companies that refuse to remove abusive misogynistic content and ban perpetrators, while bosses of these companies could be jailed for persistent breaches.
Writing in The Daily Telegraph, Tory peer Baroness Morgan of Cotes said that social media companies “are failing women and girls”. The amendment she has tabled calls for a VAWG Code of Practice which would allow the media regulator and social media companies to make the internet a safer place. “The reality is online spaces are still a wild west, with illegal activity such as stalking and harassment a daily occurrence for women and girls,” she said.
Refuge, the country’s largest single provider of specialist domestic abuse services, published a report in 2021 examining the scale and scope of online abuse directed at women. Its survey found that more than 1 in 3 UK women (36%) have experienced abuse on social media or other online platforms at some point in their lives, and that its prevalence is growing. That amounts to over 11 million women across the UK. Of these women, 1 in 6 (16%) received abuse from a partner or ex-partner – around 2 million women. Refuge notes in its report that the family and friends of online abuse victims, including children, are also affected, and are often targeted by the perpetrator as part of the abuse. Younger women are more likely to have experienced online abuse – a shocking 62% say they have encountered some form of VAWG abuse on social media.
The impact on mental health, physical safety, and online engagement
Any misogynistic and abusive content is unacceptable. Refuge also found that the tech abuse women experience lasts, on average, at least six months; 95% of those surveyed reported a severe impact on their mental health, and some even felt suicidal. In the same way that they may feel unsafe in the world outside, many women say they also feel physically unsafe because of online abuse, fearing that their abuser may track them down. This has significant negative consequences for women’s use of the internet and their freedom of expression: they feel less confident engaging in public debate and fear negative reactions when speaking on these platforms. In a sense they are silenced by a tide of misogyny. So we may see online abuse as yet another form of coercion and control.
Inaction

Currently, social media companies do not sufficiently acknowledge that their inaction exacerbates and facilitates online abuse. They say that they take online hate against women seriously and that they have rules to protect users from abuse, including suspending, restricting or even shutting down accounts that send hate. However, this doesn’t always happen. Those who report abuse often wait months for a response; sometimes they are ignored, or little or nothing is done. Recent research by the Center for Countering Digital Hate (CCDH) has found that previously banned Twitter accounts have generated millions of dollars in advertising revenue for the platform since it was taken over by Elon Musk.
One particularly egregious example is that of the notorious Andrew Tate, the influencer and former kickboxer known for posting extreme misogynistic videos. He has said that rape victims “bear some responsibility” for being raped, and that if a woman accused him of cheating he would threaten her with a machete. His extreme misogynistic comments on social media and his hate-filled videos have been viewed more than 11bn times, prompting fears that the many boys and young men who follow him are being radicalised. When Labour MP Alex Davies-Jones spoke in Parliament about Tate’s “toxic” influence on schoolboys and criticised Rishi Sunak for being “too slow to recognise the damage this is causing”, she was “bombarded” with rape and death threats.
Much of this is illegal, but the police’s record on responding is mixed. Indeed, a report by His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) in November 2022 highlighted a level of misogyny that cannot be attributed to a few ‘rotten apples’ in the policing barrel. Its investigation into police vetting and counter-corruption arrangements concluded that ‘a culture of misogyny, sexism and predatory behaviour’ is ‘prevalent in many forces’. Some of this is evidenced by messaging and conversations in online forums. Is it any wonder that women have little trust in the police to take decisive action?
What is to be done?
Despite the significant impact of online abuse, and its growing prevalence, social media companies, the government and the police are not responding adequately to protect women online and bring perpetrators to account. Often the advice to those who report online abuse is to remove themselves from online interactions. Silencing women is not the solution. What is urgently needed is a robust system, set up by social media companies, to tackle online misogyny and hate as a priority. Statutory regulation of online platforms should explicitly reflect the harms and impact of abuse and other online violence. Government needs to prioritise the protection and empowerment of the internet’s single largest victim group: women.