Ofcom consultation on draft guidance: A safer life online for women and girls
Introduction
At The Young Women’s Movement, we strive to support young women and girls across Scotland, ensuring they are connected to the policy decisions that impact their lives. As Scotland’s national organisation for young women and girls’ leadership and rights, The Young Women’s Movement welcomes the opportunity to respond to Ofcom’s consultation on their draft guidance for tech companies on reducing the risk of harm to women and girls using their platforms.
Every single day, young women and girls worldwide are subject to gender-based violence online. The scale of online abuse against women and girls is devastating: globally, 38 per cent of women have had personal experiences of online violence, while 85 per cent of women who spend time online have witnessed digital violence against other women. In Scotland, one in six women report having experienced online violence, and young women are particularly at risk: the figure rises to 27 per cent among 16–24-year-olds. Statistics and personal testimonies show that violence against young women and girls online is endemic and has serious real-world consequences, including reduced self-esteem, social isolation, and even physical harm.
In our latest Status of Young Women in Scotland report, which engaged with nearly 600 young women and girls from across Scotland, many young women described feeling unsafe online, with some commenting on the rise of bullying and misogynistic harassment and abuse on messaging apps and social media. Through our research and engagement, young women and girls expressed a desire for more proactive efforts from the government to address the growing levels of violence online, particularly the dangers of online misogyny and largely unregulated use of Artificial Intelligence (AI) to abuse young women and girls in the digital sphere.
“I just wish the Government would do more to hold social media companies to account, particularly to stop accounts and channels that are spreading violent misogynistic messaging and ideas against women’s equality.” [anonymous survey respondent]
“I find the rise in AI Technology to perpetrate violence against women online deeply scary. I feel like it’s not taken as seriously as it should be.” [discussion group respondent]
At The Young Women’s Movement, we are currently conducting a programme with 21 young women from across Scotland on artificial intelligence and the importance of online safety for young women and girls. To inform our consultation response, we explored Ofcom’s draft guidance with the group, where they discussed their thoughts around the various proposals. Some young women expressed concerns that the guidance will not lead to any meaningful change, primarily due to a lack of implementation and enforcement of existing laws that seek to tackle violence against women and girls in the real world.
“I have low faith in the law upholding women’s rights and justice because of a lack of implementation and enforcement even for well-established law, e.g. the sexual offences of rape. How do we balance changing law with enforcing current law and regulations?” [participant on our Young Women Lead AI programme]
Despite repeated warnings and pleas for legal and political intervention from women’s organisations, activists and lawyers for many years, the digital sphere has largely gone unregulated. This lack of oversight from platforms and regulators has ultimately facilitated a rise in the weaponization of technology and online platforms to attack, stalk and violate women and girls on an international scale. Now is the time to act and take appropriate action to protect women and girls online. Tech companies are in a unique position of power to shift a system that allows misogyny, sexism, and gender-based violence to go unchecked in online spaces to one where young women and girls can feel safe and thrive. At The Young Women’s Movement, we look forward to working with Ofcom in the development of statutory guidance to ensure young women and girls across Scotland are kept safe, both online and offline.
Answers to Questions
1: Do you have any comments on our proposed approach to ‘content and activity’ which ‘disproportionately affects women and girls’?
At The Young Women’s Movement, we welcome Ofcom’s proposed approach to content and activity which disproportionately affects women and girls, focusing on four categories of online gender-based harms: online misogyny; pile-ons and online harassment; online domestic abuse; and image-based sexual abuse, including intimate image abuse and cyberflashing. While we prefer the use of ‘tech-facilitated gender-based violence’ to describe such behaviours, we understand that Ofcom is operating within restricted legal parameters, and that ‘online gender-based harms’ reflects the scope of the guidance.
Some of the content and activity identified by Ofcom is explicitly illegal (e.g., cyberflashing, stalking and harassment), while other content might be perceived as harmful but not necessarily illegal (e.g., misogynistic comments and language). It is important that service providers are aware of their legal and ethical obligations to remove content that is harmful to women and girls, and we ask Ofcom to consider how they might realistically assist service providers to act against the broad range of online gender-based harms identified.
While we appreciate the guidance intends to regulate online gender-based harms, we believe Ofcom should be more explicit about how the guidance interacts with ongoing efforts to prevent and tackle gender-based violence in schools, workplaces, the justice system, and so on. There is no artificial barrier between the online and offline world when it comes to gender-based violence. The harassment and violence that women and girls experience in the real world is being played out and replicated in real time in the digital world. Young women and girls are being driven out of the classroom by deepfake pornography created for free by their male peers. Victims of domestic violence are being virtually stalked and harassed by their abusers, limiting their opportunities and ability to navigate the real world without fear.
Online gender-based harm does not exist in a vacuum: it is an act of gender-based violence that is deeply rooted in the inequality between men and women that persists worldwide. This must be explicitly acknowledged if we are to successfully tackle the range and extent of online violence perpetrated against women and girls. To make effective and meaningful progress in tackling the weaponization of technology against women and girls, we also need better training for police, better support for schools, better regulation of social media platforms advertising this technology, and better education for young people.
2: Do you have any comments on the nine proposed actions? Please provide evidence to support your answer.
At The Young Women’s Movement, we support, in principle, the nine proposed actions. We are concerned, however, that the guidance will primarily require service providers to continue to remove harmful content, rather than deterring or preventing users from posting it in the first place. While Actions 4-6 in Chapter 4 focus on actionable ways service providers can prevent online gender-based harms before they happen on their services, Ofcom notes that these steps require a ‘proactive approach’ to making changes to the functionalities of the service, and we are concerned that this allows most service providers to opt out of implementing preventative features on their platforms. This sends a message to perpetrators of online gender-based violence that there are limited repercussions for posting misogynistic messages or other forms of harmful content.
Action 7 seeks to give users better control online by providing them with greater and more granular control over their own experiences. But current safety features, such as setting safer defaults and blocking/muting other accounts, are clearly not working as intended. Young women and girls are already aware of the dangers they face in the digital sphere, with most already taking some form of precaution to stay safe online. Unfortunately, implementing safety features alone is not enough to protect them from online gender-based violence. For instance, in our Status of Young Women in Scotland report, 9 in 10 young women told us they already use passwords to stay safe online, while 81% of young women told us they block or report social media accounts that post harmful content. Most girls under 16 (70%) told us they keep their social media profiles private to stay safe online. Young women and girls who experience abuse online are often told that they should have done more to keep themselves safe and can be blamed for simply existing online, whilst the perpetrators of online gender-based violence often carry on with their lives facing limited consequences for their actions. More work needs to be done to shift the focus onto regulating the perpetrator of online gender-based harm, rather than the victim.
At The Young Women’s Movement, we agree with Ofcom’s suggestion to encourage service providers to adopt a safety-by-design approach, which would prevent abuse from happening in the first place. We do have concerns about how this might be implemented across existing services that have operated for many years, and ask whether it is possible to retrospectively introduce safety-by-design features on well-established platforms. Coming to a shared understanding of what ‘safety-by-design’ means to service providers covered within the scope of the guidance is also vital to delivering what Ofcom hopes to achieve.
3: Do you have any comments about the effectiveness, applicability or risks of the good practice steps or associated case studies we have highlighted in Chapters 3, 4 and 5? Are there any additional examples of good practices we should consider? Please provide evidence to support your comment.
At The Young Women’s Movement we support, in principle, Ofcom’s good practice steps and associated case studies. Case Study 1 notes that a ‘senior person within a service provider’ could be held accountable to ‘the most senior governance body for ensuring the service considers and addresses online gender-based risks’, and we wonder how Ofcom intends to encourage service providers to accept and delegate this level of responsibility without greater powers of enforcement.
In Chapter 3, Ofcom notes that service providers could implement policies that tackle the promotion of offsite abuse, such as promoting deepfake creation services like nudification apps, or forums sharing explanations of how to monitor your partner. As noted by feminist writer and activist Laura Bates, a simple Google search for deepfake porn videos brings up dozens of websites offering ‘services’ that will create deepfake images for you. A 2023 report by social media analytics firm Graphika argued that deepfake image abuse has shifted ‘from niche pornography discussion forums to a scaled and monetised online business’. Rather than encourage service providers to set policies that simply ‘define and prohibit’ such services and apps, Ofcom should recommend that service providers outright ban the promotion of such apps on their platforms, ensuring they cannot be downloaded at the click of a button. Without enforcement powers, it is likely that many service providers will continue to prioritise profits and views over the safety and protection of women and girls online.
We welcome Case Study 4, which includes examples of how service providers might conduct gender-sensitive risk assessments. As noted by Ofcom, young women aged 18-24 are particularly at risk from online gender-based harms, especially cyberflashing and intimate image abuse. But due to their age, they fall into a protection gap, as many safety features are aimed at girls under 18. Collecting demographic data is one way for service providers to assess risk, improve users’ experiences, and effectively implement safety-by-design features and protections for young women and girls.
4: Do you have any feedback on our approach to encouraging providers to follow this guidance, including our proposal to publish an assessment of how providers are addressing women and girls’ safety? Do you have any examples or suggestions of other ways we could encourage providers to take up the ‘good practice’ recommendations?
At The Young Women’s Movement, we agree with Ofcom’s proposal to publish an assessment of how providers are addressing women and girls’ safety, and believe this is a positive step towards holding service providers accountable for years of inaction in tackling online gender-based harms.
Overall, we are concerned that Ofcom is relying on optimistic assumptions that service providers will comply satisfactorily with the proposed guidance, without any robust regulatory enforcement. For example, we are alarmed by Ofcom’s admission that they do not expect ‘all content to be taken down or heavily policed’ by services, and that they ‘consider it right for providers to take a holistic view of the experiences of women and girls online when making decisions about their policies, tools, and features they offer users, and how those choices may impact women and girls’ safety’ (p.5). We worry that this choice of language ultimately exonerates service providers from effectively implementing the guidance, including the good practice steps, in any meaningful way.
Regulatory choices are crucial in determining how much impact the guidance will actually have, and how the legislation it covers is implemented in practice. The reality is that it is one thing to say how service providers should operate, and another thing entirely to actually make them do it. Without extensive training, resources and buy-in from service providers and platforms, it is difficult to imagine robust action being taken in the vast majority of reported cases of online gender-based harms, let alone systemic change in how service providers operate.
Ofcom and the UK and Scottish governments must work hard to make sure the guidelines are firmly enforced. While the actions proposed in the guidance are commendable and well-intended, they are ultimately voluntary, and there remains no actual requirement on service providers to put in place any of the recommended good practice. Key to enforcement of the guidelines will be the routes through which Ofcom can incentivise compliance and track take-up of the guidance.
Ultimately, the UK government must give Ofcom greater powers of enforcement to secure an internet that is freer and safer for women and girls. This includes introducing these recommendations into a Code of Practice for service providers, which would make the guidance legally binding and place a clear duty on tech companies to comply. Without these powers, it is likely that many service providers will continue to prioritise profits, clicks and views over the safety and protection of women and girls online.
Alongside implementing statutory guidance to enforce regulation of tech companies, more work must be done to increase women’s participation in the tech industry. Globally, women represent only 12 per cent of AI researchers, and female-led AI companies receive six times less funding than male-led ones. There is a clear lack of oversight, transparency and regulation of artificial intelligence, which has ultimately led to the creation of tools used to perpetrate gender-based violence. The interests of women and girls have been ignored and ultimately omitted from the policies, systems and structures of tech companies. This needs to change if we are to create a more equal, safe online world for women and girls.
Finally, The Young Women’s Movement encourages Ofcom to consider the set of recommendations produced by the NSPCC in their recent report on girls’ safety online as part of this portfolio of work. To support services to provide age-appropriate experiences for girls, Ofcom should develop best practice guidance for regulated services, which outlines how safety settings and other protections can be adapted based on children’s age (and gender). Ofcom should then work with service providers, especially those most popular with girls, to implement this guidance. Furthermore, when encouraging service providers to follow this guidance, Ofcom should recommend that service providers conduct their own ‘abusability studies’ to identify risky features and functionalities, as well as testing any new feature before rolling it out. These tests must include an assessment of likely risks.