Spotlight Interview with… Gabriela de Oliveira, Glitch

In our second spotlight interview, Ben Whitelaw, editor of Everything in Moderation, spoke with Gabriela de Oliveira, Head of Policy, Research and Campaigns at Glitch, who discussed the importance of creating inclusive online experiences.

Gabriela de Oliveira, Head of Policy, Research and Campaigns, Glitch

Glitch is an award-winning UK charity that tackles online abuse and champions digital citizenship, with a particular focus on Black women and marginalised people. Through training, policy, research, workshops, and community building, they are working to build an online world that is safer for all.


What does inclusion look like when applied to online communities? What are the characteristics of such a space?

We don’t like to think of inclusion as an add-on to improve a community space online – though platforms often work like that. We believe it should be built in from the outset. Safety and inclusion by design is absolutely possible and is the best way to create open, creative and safe spaces online. In practice, that means a few things:

Number one is embedding safety and accessibility by design throughout the business, from design and development through to organisational KPIs like engagement. If safety and accessibility are in conflict with a platform’s core business model, it will be a constant battle to create online communities that are safe and inclusive for all. If these elements are built into the model and design of an online community from the start, you’re a step ahead. It’s an upfront investment of time and resources, so investors have an important role to play in asking the right questions of platforms from the start.

Secondly, we know that groups who are marginalised offline often experience abuse and harm online. So part of the solution is having diverse teams with a specific mandate to protect the groups most likely to experience abuse. For example, Black women have been found to experience 84% more abuse on Twitter than their white counterparts. An online community therefore needs to consider how Black women and other marginalised groups might need a different approach to ensure they can participate fully and without being at risk of harm. That means centering Black women as the group most likely to be harmed or excluded from online communities [ensuring these spaces are designed with marginalised communities in mind from the start] – rather than advocating for the “inclusion” of this group into spaces that are already “inclusive” for other groups.

Finally, it’s about creating accountability among tech companies, which we’d love to see as a default but which often requires legislation and regulation. Given the lack of transparency from many platforms on their policies and the level of harm occurring on them, regulation and legislation are a really important way to raise the standard of inclusive practice and hold platforms to account.


Which groups and communities tend to be most targeted online, and can you talk about how it impacts people in the real world?

At Glitch, we focus on the specific impacts of online abuse on Black women, and the many intersecting identities that can be held alongside Blackness and womanhood, with an understanding that Black women face a disproportionate amount of online abuse. We believe that a safer online experience for Black women will mean a safer experience for all.

Online abuse includes a broad range of harms, such as the non-consensual sharing of images, targeted disinformation campaigns, doxxing, cyberstalking and harassment or hate speech. It can have devastating impacts on individual wellbeing and access to wider society, and on wider structures like functioning democracy, community building and change making.

It can prevent people from marginalised communities from raising our voices online, from engaging with local politics and democratic processes, from becoming activists and change makers. It can reduce our likelihood of promotion or access to job offers, at a time when a visible presence online is often required for career success, particularly for journalists and public figures. It can make us fearful in the offline world, knowing that threats and abuse online can (and do) translate into physical harm. It can cause psychological harm, leaving victims of abuse experiencing anxiety, stress, fear, panic attacks and apprehension when using the internet and social media platforms.

We believe it is fundamental to understand online abuse as a manifestation of existing forms of discrimination and violence that women and other marginalised people face offline. It is part of a continuum of structural inequality, translating patriarchal and white supremacist society into the online space – because our online and offline experiences all interlink, and all affect us in real ways.


What techniques have you developed to keep yourself safe online? Are there any you’ve found to be particularly effective?

Online abuse is a constantly shifting and changing issue, so it’s important to keep on top of ways to stay safer online while working to create a web that is safer and more inclusive. We have a reporting resource on our website that helps users record online abuse and its impact, which can be particularly useful for abuse faced at work, or if you wish to report it to the police.

Glitch workshops and free resources include a range of tips and advice for individuals and groups on countering online abuse. Through our workshops we share the most recent and accurate advice – we would recommend attending! And in the meantime, we believe strongly in being the change you wish to see – being an active online bystander for those facing abuse helps to create safety and security within your online communities. Our A Little Means A Lot campaign provides some concrete advice on how to be there for your community, recognising that online abuse can feel isolating, but community kindness and support can break through the noise and ground you.


Glitch has focused on addressing these problems through educational programmes about digital citizenship, self-care, and safety. Who do you work with, and what impact have you seen that work have?

Yes, we work a lot with organisations and individuals who want to learn more about how to stay safe online and support others. We believe this is all part of Digital Citizenship, which is about supporting our communities to engage with digital spaces in safe, responsible, and respectful ways. Effective digital citizenship depends on the norms, behaviours and practices upheld by three key groups: communities, government bodies and technology companies. So, we work with all three!

Our #DrawtheLine campaign in partnership with BT Sport is a great example of this. The campaign’s “Spot. Support. Report” tagline helped promote the kinds of behaviours that can make online communities safer and more inclusive. The campaign took a firm stance on online abuse, effectively drawing the line between what is and is not acceptable online behaviour at a time when England’s men’s football players were being targeted with abuse during the Euros championship in 2021. This included our Founder and CEO Seyi Akiwowo speaking publicly with BT as part of the campaign, as well as supporting the creation of a Top Tech Tips awareness-raising video.


Which services and platforms are you aware of that take a radical approach to ending online abuse?

There are a number of organisations doing radical work to make our online communities safer – and I’m sure there are many we aren’t aware of too! Some that come to mind are the Awhi app, Supernova, Chayn, All Tech is Human and Block Party.


For a company at the start of its journey to make its product or service more inclusive, where do you recommend starting?

We’re big believers in companies adopting a values-driven and transparent approach to community building. This includes clear codes of conduct and moderation policies which are publicly available, kept up to date and implemented. It’s important to remember companies can create any community they want – if they want to allow, or stop, certain behaviours or content then they can do it. Transparency of policies and values is fundamental, so users can choose where and how they engage.

Importantly, these policies need to be implemented across teams within a platform; values and policies should be embedded throughout, not left to moderation teams alone. For Glitch, a values-driven approach would mean explicitly naming abuse as unacceptable, and demonstrating a clear knowledge of what misogyny, racism and other forms of bigotry can look like and their impact on your community.

This is harder than it sounds, but well-resourced and supported moderation teams are absolutely key to implementing content policies and ensuring your community members are getting the feedback and action they need when they’ve reported something and aren’t feeling safe.

Also, there’s a lot of benefit in giving users more control over what they see and how they can engage. Advanced filtering options, transparency around algorithms such as “why you’re seeing this” features, and downvoting content are some examples of features that give users more control and provide a company with more feedback on how users want to engage in a space. For these features to work, however, users need to be aware of what tools exist and how they can be used.


Finally, apart from Glitch, what other organisations are doing important work in this space that you’d like to call out?

There are so many amazing organisations working in this space and finding really creative ways to collaborate with each other. Some organisations that believe in supporting and working with Black women and other groups most likely to experience online abuse are: Alaàṣẹ lab, Black Girls Code, Global Tech Advocates Black Women In Tech, Milk and Honeybees, European Digital Rights initiative, End Violence Against Women coalition and European Network Against Racism.
