Technology-facilitated gender-based violence (TFGBV) remains a harsh reality for many individuals, especially those from marginalised communities. People are targeted in various ways due to their gender identity, sexuality, race, ethnicity, religion, political beliefs, and often the intersection of these identities. Everyday digital tools like Google accounts and Spotify playlists are being misused for stalking, control and harassment – frequently by former partners. This highlights that digital violence doesn’t always require advanced tools; the misuse of common technologies can be just as damaging. In response, affected individuals are developing diverse strategies to protect themselves in digital spaces, striving to stay safe while also maintaining their voices, sustaining activism, and navigating online interactions in an increasingly hostile environment. These strategies aim not only to safeguard those targeted but also to foster resilience, self-expression and a meaningful presence in these critical spaces.

At the recent Global Gathering event in Estoril, Portugal, hosted by Team CommUNITY, the Feminist Internet Research Network (FIRN) held a circle discussion on the nuanced experiences of TFGBV and hate speech targeting women, LGBTQIA+, feminist and sexual rights activists. The conversation centred on creative strategies for navigating these challenges, drawing on research and lived experiences from Brazil, Ghana, Kenya and India.

Brazil: Navigating hate speech with resilience

From Brazil, the researcher spoke about the experiences of Black Brazilian women and girls who face TFGBV and hate speech, particularly those who have public voices and/or occupy professional roles outside those stereotypically assigned to them. The hate speech they encounter falls into three main narratives: invaders, where they are accused of not belonging in these public spaces; criminals, questioning their legitimacy and success; and uncultured, dismissing their expertise and knowledge. These narratives reflect a systemic effort to silence and delegitimise Black Brazilian women.

While this type of violence often leads to a silencing effect, where individuals stop posting or abandon political careers due to threats, there is also a strong refusal to be silenced. After incidents of violence, many individuals go through a critical reconsideration of how they want to engage online. This involves openly discussing gender-based and racist violence to challenge its normalisation. Particularly for those who may be isolated – such as the only Black politician in their town – speaking out about their experiences becomes crucial in resisting the harmful message that such violence is simply part of political life. By vocalising these challenges, they create awareness and foster collective resistance.

Others are making nuanced and creative decisions about their online behaviour, often by creating multiple social media profiles and carefully determining how to use each one. This practice reflects a broader tradition of Black Brazilian women developing survival strategies in hostile environments. These practices allow them to remain active online and continue participating in political discourse, even in spaces marked by violence. This resilience, through careful adaptation, highlights innovative ways to sustain presence in digital spaces while protecting themselves.

Community, friendship and solidarity among Black Brazilian women have also played a crucial role in coping with this violence. Support from neighbours and friends – through simple gestures of sharing resources, from offering cake to providing access to helplines – has been essential in helping them navigate and resist online violence. This collective support strengthens resilience and offers a much-needed network of care.

Ghana: Technology opacity, safe visibility, and the intersection of queer identity and TFGBV

In Ghana, the researcher spoke to the “sexuality of technology”, and how queer people navigate violence and survival in online spaces through the concept of "technology opacity". Rather than completely withdrawing from the internet, they adopt strategies to remain visible but only to their community and intended audience, ensuring safety while still maintaining connections. This practice reflects a broader tradition of queer communities using creative approaches, such as storytelling, to share information and resist societal pressures while staying protected.

The research revealed that many participants encountered issues like infiltration and impersonation, but noted that these incidents often improved their digital security awareness. Despite assumptions that certain groups (like priests) might not be tech-savvy, they demonstrated a keen understanding of online risks, including spotting fake accounts impersonating queer people. Even when targeted, these individuals adapt by modifying their social media settings, such as refusing to be tagged in photos, to protect their privacy. Despite ongoing challenges, opacity remains central to their survival and engagement in digital spaces.

This concept of "opacity" also applies to queer Muslim individuals on platforms like TikTok, who resist both societal and cultural pressures through storytelling. Even as cultural norms attempt to silence them, these individuals use these platforms to confront threats and share more about their experiences. Opacity becomes a key strategy for protection, where the more information shared, the harder it becomes to target specific individuals. However, this raises questions about whether increased visibility through data actually offers more protection or makes individuals more vulnerable in machine learning contexts.

The discussion also emphasised the need for collective responses to TFGBV. When someone is targeted, follow-up should not rest solely on the victim; collective support is crucial to manage the aftermath, giving victims time to recover. Recovery is not immediate; the effects of violence can linger for years, impacting victims' lives long after the attack. Beyond legal measures, there is a need for holistic approaches to help victims regain control of their online presence. Community and friendship play a vital role in navigating these challenges, as seen in cases where group chats or live streams exposed identities, showing that security and privacy are collective responsibilities, not just individual concerns. In smaller, tight-knit communities, such as queer Catholics using platforms like Roblox, solutions to security challenges are deeply rooted in collective safety. These communities demonstrate resilience, relying on each other for protection and support, illustrating that collective strategies are key to surviving attacks from multiple angles.

Kenya: Visible resistance and collective action

In Kenya, the researcher spoke of how active civil society groups, such as the Coalition of Women Human Rights Defenders, are addressing structural issues surrounding gender-based violence through creative and bold forms of resistance. One example is the Pussy Power campaign, whose controversial but powerful name challenges norms in its cultural context. This campaign advocates for sexual and health rights while resisting the backlash that comes from both the name and its intersectional feminist approach, which includes gender-diverse and non-binary individuals. The name itself is a declaration: "We are here, we will be heard." It symbolises the feminist movement’s refusal to hide or shy away from asserting its power.

While much of this advocacy occurs online through active hashtag activism, it is also deeply rooted in grassroots efforts. However, online activism attracts significant backlash, particularly around the misconception that feminist activists, especially those with diverse identities, are involved in "recruiting" queer communities. There is also a prevalent belief that activists should not be paid for their work, fuelling distrust and hate speech. Despite these challenges, the movement remains resilient, blending both digital and community-based work.

In recent months, Kenya has seen protests against the government and against rising cases of femicide. These protests have been marked by an interesting form of resistance: leaderless organising. For example, during the #EndFemicideKE protests, those calling for action were highly visible and faced personal attacks. In anti-government protests such as #RutoMustGo, however, no specific leaders could initially be targeted, as the movement is collectively owned. This strategy makes hate speech and personalised attacks less effective, allowing the movement to thrive in a more decentralised and community-driven way.

India: Combating image-based abuse

In India, activists are finding innovative ways to combat image-based abuse by leveraging copyright laws – originally not designed for personal protection – to remove harmful content from platforms like Pornhub, Facebook and Instagram. This resourceful approach demonstrates how existing legal tools can address violations, even as challenges persist in fighting hate speech.

Feminist helplines and “survivor tech”: Rethinking documentation and support

The discussion then turned to the complexities surrounding the documentation of gender-based violence and the need for a survivor-centred approach that prioritises consent, emotional safety and community support. Many survivors face immense pressure and shame, often leading them to delete evidence of their experiences rather than document them. In this situation, feminist helplines must navigate the challenge of ensuring that documentation serves as a tool for empowerment rather than further victimisation.

There was discussion on how effective documentation requires a thoughtful, sensitive approach, emphasising the importance of not re-traumatising survivors while recognising patterns of violence. Establishing secure systems that protect sensitive data is crucial, as is maintaining transparency with survivors about how their information will be used. The need for community support cannot be overstated; having a network of trusted individuals can significantly enhance a survivor’s ability to navigate the legal landscape and gather necessary evidence.

A question was raised on whether there is “survivor tech” used to share information with law enforcement or medical facilities. It was stated that there is no standardised “survivor tech” that universally shares this information, but feminist helplines can implement ticketing systems that facilitate necessary information flow while ensuring survivors retain control over their narratives. Furthermore, the development of survivor-centric policies must be informed by user-based research that accurately reflects the realities of those most affected by TFGBV, rather than relying solely on high-profile cases that may skew the conversation.

Ultimately, the future of effective TFGBV documentation lies in creating systems that are flexible, survivor-led and deeply rooted in the principles of consent and support. By fostering a culture that values community, respect and empowerment, we can better serve those who have experienced violence and help pave the way for meaningful change in how we respond to it.

Conclusion

As we navigate the complexities that technology presents, it’s clear that the fight against TFGBV and hate speech demands a multifaceted approach. By centring the experiences of marginalised communities, fostering collective resilience, and advocating for inclusive policies, we can work towards a future where everyone feels safe and empowered in digital spaces.

The insights shared during this circle discussion highlight not only the challenges but also the incredible creativity and resistance of those combating violence. Moving forward, strengthening community connections and understanding the intricate interplay of gendered identity, sexuality, race, ethnicity, religion, political views and technology will be crucial in tackling the violence that persists in digital spaces.

Diana Bichanga works at APC as the Feminist Internet Research Network (FIRN) project administrator. In this role, she provides support to research partners undertaking data-driven research aiming to change policy and the discourse on internet rights. She is passionate about technology, how it intersects with different societal settings, and how it can be used to propel social justice.