
Creating a gender-responsive cybersecurity framework means working both in and outside standards-setting bodies to advocate for meaningful mechanisms that mitigate online harms. Standards-setting processes can only go so far, and different communities need to collaborate with others to push for the necessary basket of policies and regulations that respond to the specific and nuanced harms that threaten women and sexually diverse people in the global South. This was one of the key takeaways from a two-hour online roundtable on gender approaches to cybersecurity hosted by the Association for Progressive Communications (APC) at the end of September. 

About 20 experts in gender, feminist tech, cybersecurity policy and governance participated in the two-hour roundtable, titled: “Gender approaches to cybersecurity: Integrating policy, research and technical standards discussions”. A key aim of the meeting, which included presentations and a hands-on session to brainstorm ideas, was to create a space to connect stakeholders working in different fields and communities on the issue of gender and cybersecurity. This was done through sharing different experiences from Latin America, Africa and Asia, exchanging perspectives and insights, and mapping ways of potentially collaborating in the future. 

“Gendered fear”

Presenters shared research and cases that showed the breadth of online harms that need to be considered to properly address cybersecurity threats to women and sexually diverse people. These included technology-facilitated gender-based violence (TFGBV), gendered disinformation attacks, hate speech, smear campaigns, doxxing, loss of personal and sensitive data and privacy rights through “scooping”, surveillance and hacking, and harms arising from states enforcing overly restrictive laws.

An example was given of how disinformation campaigns affect women in politics in Uganda. Women politicians are targeted with “sexualised rumours” and “body shamed”. The intention behind these public attacks is to “weaken their credibility, and over time to erode public trust and delegitimise their leadership.” In one case the attacks were so destructive that the leader of the opposition was forced out of politics: “These attacks really pushed her out of the system such that there was no way for her to run for office again.” Strategies in that country to address this included working with the media and social media platforms. There was also a need for online gender-based violence to be written into the country’s Sexual Offences Bill.

A feminist helpline set up in Brazil – one of a number set up across Latin America – revealed that “the most common cases reported are related to social media.” Similar to Uganda, where women politicians were attacked through body shaming and false rumours about their sex lives, cyberattacks on human rights defenders were not just attacks on their political positions as activists, but focused on “identity and intimacy”.

In Thailand, attacks on women and sexual rights activists include “hateful and abusive speech”, smear campaigns and doxxing – or revealing personal information about someone without their consent. The documented impacts have included severe psychological harm and self-censorship. A “chilling effect” was created where “people withdraw from their activist work, and some people refrain from using social media at all.” 

As the presenter explained: “Activists are now cautious about having their intimate or families’ photos or videos on their phones. [They are] really nervous and panicking about who’s got their data: What did they take? How long will they have it? How long till they use it against them?” 

“Their experience of fear is gendered,” she said.

A need for state responses and regulation 

In Latin America and in Asia, a shared concern is the role of spyware in gender-related harms online, including the use of artificial intelligence (AI) in surveillance. Some organisations make a distinction between spyware and “highly invasive” spyware (such as Pegasus and Predator) that is designed to “scoop up all of the data by default.” Highly invasive spyware can never be human rights-compliant, because the digital tools used cannot be independently audited. There is therefore a need to consider technical amendments to regulations and standards to bring the use of these tools in line with the law. With any form of spyware, there is nevertheless a role and duty of the state to examine the different impacts on women and sexually diverse people. Currently this is not happening, one presenter said.

While in Latin America, as elsewhere, hacking is typically done to commit fraud or financial extortion, the hacking of profiles has become common for “sexual harassment, control and surveillance.” In this context, “stalkerware” was identified as a critical issue that needed more attention. This is often marketed as software that can be used for purposes such as parent-child monitoring and controlling employees, “both of which are problematic in themselves.” However, it can be adapted to facilitate online gender-based violence given its surveillance capacities. One presenter suggested that some stalkerware products are even marketed discreetly as being useful for intimate partner surveillance. These products also take advantage of gaps in the regulation of big business.

The need for a responsive private sector 

As one participant suggested, many in the private sector are aware of and responsive to the need for protections against gender-based violence in digital contexts. A number of businesses have redesigned their technologies, and incorporated mechanisms such as helplines and flag systems. However, more work needs to be done, including exploring the possibility of the private sector funding digital literacy programmes, and forcing some big tech corporations to take their reporting mechanisms more seriously. 

In Latin America, for instance, there is a general sense that when communities use these mechanisms to report abuse they are not effective: “These platforms should provide better reporting mechanisms that actually follow through and give users the control back; they should build this into their design [and be responsive to] what different user needs are,” one presenter said, adding: “You can report as many times as possible and they usually don't care.” 

Working with standards bodies and designing tech 

A key challenge with current cybersecurity discourse and practice is that, firstly, standards are focused too tightly on systems and IT devices, and secondly, cybersecurity is mostly considered a corporate and national security topic. To remedy this, one presenter argued, a human-centric rather than a systems-centric approach is necessary. Cybersecurity needs to be considered “as a societal security issue focusing on societal threats.” In this way, “a much broader idea of cybersecurity [is created] where gender issues are really at the heart of it rather than something that is tackled from the outside.”

Standards bodies such as the International Telecommunication Union (ITU) and the International Organization for Standardization (ISO) have taken cognisance of gender in their work, for instance, through the ITU’s Women in Standardization Expert Group and the ISO's Gender Action Plan. However, the extent to which these have meaningfully influenced specific discussions related to cybersecurity needs to be questioned.

A key challenge is the participation of marginalised communities in standards-setting discussions and in the design of technologies. As one presenter explained, the kinds of unique threats that different communities in the global South face are often unrecognised – or even unrecorded – and therefore not catered for in these discussions and processes. In effect, they are invisible. However, the “threats, models and actors are different in different communities. […] Even the idea of a perpetrator is different.”

The example was given of Apple releasing its location tracking service, which was seen as a useful tool for finding a lost phone. However, it can also be used for abusive surveillance, or to monitor community activists. “It turns out that none of the people who were sitting around the table when they were designing these location features […] questioned this technology, because they have never faced this kind of abuse, hence the limits of their world were too narrow,” the presenter said. The participatory design of technology is necessary, where “you have different experiences and communities in the room.” Inclusivity is similarly necessary when setting new technology standards.

Inclusive cybersecurity – working inside and outside 

Encouraging community participation in standards-setting processes will however entail making these processes understandable for non-technical participants. They will have to navigate complex and sometimes alienating institutional dynamics, and grapple with opaque technical jargon. A key challenge is the lack of interpretation and translation in standards bodies, even when it comes to the documents for foundational protocols such as the Internet Protocol. 

“This is a battle,” one participant said. “When we invite [participants from communities to the standards-setting forums] we provide live translation – but this is an individual effort, and there is no support from the standardisation body.” 

Power imbalances are also implicit depending on where the standards-setting discussions take place, since they are often geographically inaccessible and expensive for remote communities to participate in. While token representation needs to be guarded against, having the “right people at the table” is also not enough. “There have to be structures and processes that are affirmative, that make voices heard and ensure that they are taken into consideration. We have to find ways to make sure those voices are not just present, but actively participating.”

These power dynamics were important to address, because although some standards-setting bodies promote a “consensus-based” process, it is clear that “at the end of the day, if [a] big tech company wants to push one way, and a small human rights organisation wants to push another way, we will likely end up going the way big tech wants to go,” commented one participant. What is needed are structures and systems to address these inevitable imbalances in power as they occur.

However, standards-setting bodies are also limited. Standards are “base-level infrastructure”, cannot address all the prevalent harms, and can only respond to human rights concerns to a limited extent. To complement this, advocacy is needed to simultaneously push for appropriate regulations and other mechanisms such as “intersectional legal frameworks” to comprehensively address gender-based harms online. Evidence also needs to be produced through civil society research, and practical tools developed to support activists and victims of abuse.

Building long-term collaborative relationships

The need to build collaborative relationships among individuals and organisations working at the intersection of gender and cybersecurity policy and governance, as well as technical standards and technology design, was also emphasised. A cross-field approach is critical. This includes engaging organisations that “work on trade and other corporate accountability issues, because most of what we see is motivated by the dominance of the largest tech companies,” and working with technology service providers, who are seen to be best positioned to address gender-based harms.

Another key takeaway was that the voices of participants from the global South need to be amplified. Practical areas of collaboration in this respect include translation, localisation and outreach research, and even collaborating on administrative needs such as visas to help underrepresented communities attend standards-setting processes. 

Collaboration should be seen as a long-term commitment in terms of engagement and capacity building: “It is crucial to sustain influence across the standards or design life cycle,” one participant said. What was referred to as “post-standardisation corporate advocacy” is also necessary to ensure that companies actually implement the standards that civil society helps to create.

The need for intersectional research 

There is a growing need to raise awareness about the nuanced intersections of gender and cybersecurity, especially in the global South. Research is important because it provides the evidence necessary to influence standards-setting processes, and helps to ensure that the experiences of women online are not overlooked in the design of technology. Evidence is also necessary to build appropriate regulations, policies and laws, and to advocate for better responses from the private sector to online harms. 

It is also important to encourage community-led research, so that it is not only the perspectives and voices of experts that influence tech policy and design. This needs to take an intersectional approach that helps to understand how various identities, whether gender, race, socioeconomic status or others, intersect and impact cybersecurity experiences. As one participant put it, it is important to “frame cybersecurity as a human-centred issue, rather than merely a matter of disputes between companies or countries.” 

Priority topics for research 

The roundtable also identified priority research topics going forward. 

One approach proposed for prioritising research was, firstly, to focus on the “greatest harm” by identifying what can be considered the greatest and most widespread threat to women and sexually diverse people online at the moment, and secondly, to consider “ease of policy intervention”, or where the most effective change is likely to be possible. Research methods and outputs proposed included evidence-based participatory research, developing how-to guides, and storytelling, which was considered necessary to reach “unlikely” stakeholders, “not just highly visible internet users but people from other organisations, movements, and even those with limited tech use and presence.”

Proposals for research included the intersection between gender and spyware; guides that demystify the processes at standardisation and other technical bodies and processes; research visioning feminist digital futures for women in politics; and elaborating on the difference between “gendered targeting” and “gendered impacts”. 

The participants also pointed to the need for research that explores the connection between cybersecurity and broader gendered structures that produce discrimination and support patriarchy. As one participant put it, in Thailand, the impacts of online harms “are influenced by existing structural barriers and gender biases that women and queer people in [the country] already experience due to their gender and sexual orientation.” 

Resources

Several resources were shared by the participants, including: 

A framework for developing gender-responsive cybersecurity policy – APC 

A feminist conversation on cybersecurity – GenderIT (APC)

Amplified Abuse: Report on Online Violence Against Women in the 2021 Uganda General Election – Pollicy

Inclusive Cyber Norms Toolkit – Global Partners Digital 

Feminist Helplines Index

Maria d'Ajuda – The first digital security helpline created by feminists from Brazil 

Submission to call for input: The relationship between human rights and technical standard-setting processes for new and emerging digital technologies (2023) – WITNESS 

Request "Off the Record" – Brave Privacy Team 

“Being ourselves is too dangerous”: Digital violence and the silencing of women and LGBTI activists in Thailand – Amnesty International 

Gender Approaches to Cybersecurity – Katharine Millar, James Shires and Tatiana Tropina (UNIDIR) 

Navigating Human Rights in the Digital Environment: The World Telecommunications Standardisation Assembly – Global Partners Digital

A Guide to the Internet Engineering Task Force (IETF) for Public Interest Advocates – Center for Democracy & Technology and ARTICLE 19 

Internet Standards Almanac – ARTICLE 19 

Internet Exchange newsletter – Mallory Knodel 

Internet Draft: Intimate Partner Violence Digital Considerations – IETF 

Digital Security Resource Hub for Civil Society – Amnesty International

The role of the private sector in combatting gendered cyber harms – Chatham House