From Bystander to Ally: Six steps for safer online spaces


On this International Women’s Day, it feels especially difficult to be optimistic. The dust has yet to settle after the Grok non-consensual intimate image abuse scandal, and the pursuit of justice in the wake of the Epstein files release feels slow, uncertain, and compromised. 

Every day, our social media feeds fill with more than just updates and entertainment. We also see many forms of technology-facilitated gender-based violence (TFGBV), including harassment, image abuse, deepfake pornography, and gendered threats. Online harm moves fast. It spreads wide. It lingers. Anonymity can embolden perpetrators, while survivors are often left feeling isolated in very public spaces. Many people scroll past, unsure what to do next.

But scrolling past also leaves harm unchallenged. When we intervene thoughtfully, we can interrupt harm. When we show solidarity, we can reduce isolation. When we challenge abuse, we can shift norms. Ending digital abuse isn’t just about stopping perpetrators; it’s also about empowering allies.

That’s why, this International Women’s Day, CIR is launching The Six R’s of Digital Bystander Intervention – a practical, accessible guide designed to turn digital bystanders into digital allies. Part of our forthcoming Cyber Allies Toolkit, developed in collaboration with Media Smart Youth Ethiopia, this how-to guide offers clear, proportionate, and safety-centred actions anyone can take when witnessing online abuse – whether it targets women and girls or people of any gender.

The Six R’s adapt established bystander intervention models (see, for example, Right To Be) for the realities of digital life – where risks are different, visibility is amplified, and intervention must be both strategic and self-protective. The guide offers simple ways to respond when we witness online harm – without escalating risk to ourselves or others.

The guide recognises an essential truth: there is no obligation to intervene. Allyship should be intentional, informed, and optional. You also don’t have to do everything. You choose what feels safe.

Before intervening, it is important to recognise potential risks:

  • Could you become a target of harassment yourself?
  • Could the abuse escalate or intensify?
  • Will you be exposed to harmful content that could cause trauma?

This International Women’s Day, we’re inviting you not just to witness – but to become an ally; to act, support, and help build online communities rooted in dignity, accountability, and solidarity.

 


The Six R’s of digital bystander intervention

1. REDIRECT

Shift the focus without engaging the harasser

Redirecting subtly disrupts harassment by changing the tone or focus of a conversation – without engaging with the abusive person. How to do this online:

  • Do not respond directly to the harasser.
  • Avoid engaging with their language or arguments.
  • Post positive, affirming comments under the original content.
  • Shift attention to neutral or positive elements (e.g. the setting, the message, the achievement).
  • Like and amplify supportive or respectful comments.
  • Create pathways to alternative, constructive content.

 

Tips:

  • Keep language calm and non-provocative.
  • Avoid sarcasm or wording that could inflame the situation.
  • Remember: abusers often feed off attention, and their intention is usually to trigger responses.

 


2. RECORD

Document harm responsibly and ethically

Documentation can support accountability, research, or survivor-led reporting – when done carefully. Unfortunately, methods that work today may not work tomorrow, and platforms may restrict access, downloads, or archiving. With this in mind, effective documentation of TFGBV requires flexibility and caution. How to do this online:

  • Archive content URLs (e.g. social media posts) using trusted tools (e.g. the Internet Archive’s Wayback Machine).
  • When capturing screenshots, try to: (1) take multiple screenshots and (2) capture the full screen, including the URL bar, timestamps, and any account information or usernames (if on social media).
  • Good documentation requires more than a screenshot or archive link: careful record-keeping makes the documentation process more robust and credible. Build a database containing the following information about each archived link, screenshot, or download (a minimal sketch of one possible log follows this list):
    • Where to find the abusive content itself (e.g. an archive link, a folder containing screenshots, a folder containing direct downloads, a table containing textual data)
    • Date and time captured
    • Platform
    • URL
    • Timestamps 
    • Information about the account/username or source (where safe and appropriate)
    • Notes on context, such as related posts or patterns of behaviour
  • Store files in secure, access-controlled databases. Be mindful of who has access. If data is shared within a team or with partners, ensure clear protocols are in place for handling, storage, and deletion.
  • Avoid unnecessary duplication, the sharing of raw abusive content, or the amplification of harmful content.
  • Do not publish sensitive materials without consent.
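
To make the record-keeping described above concrete, here is a minimal sketch of one way a documentation log could be kept: each captured item gets a row in a CSV file covering the fields listed above, with an optional best-effort archiving step via the Internet Archive’s Save Page Now endpoint. The field names, file name, and example values are illustrative assumptions rather than part of the toolkit, and the archiving endpoint may be rate-limited or change over time.

```python
# Minimal sketch of a documentation log for recorded content.
# Field names, the file name, and the archiving step are illustrative
# assumptions, not part of the Cyber Allies Toolkit itself.
import csv
from dataclasses import asdict, dataclass, fields
from datetime import datetime, timezone
from pathlib import Path

import requests  # used only for the optional archiving step

LOG_FILE = Path("documentation_log.csv")  # keep in an access-controlled location


@dataclass
class Record:
    evidence_location: str   # archive link, screenshot folder, download folder, etc.
    captured_at: str         # date and time of capture (UTC)
    platform: str
    url: str
    content_timestamp: str   # when the abusive content was posted
    account_or_source: str   # only where safe and appropriate
    context_notes: str       # related posts, patterns of behaviour


def archive_url(url: str) -> str:
    """Ask the Internet Archive's Save Page Now endpoint to archive a URL.

    Best effort only: the endpoint may be rate-limited or unavailable,
    so keep your own screenshots as well.
    """
    response = requests.get(f"https://web.archive.org/save/{url}", timeout=60)
    response.raise_for_status()
    return response.url  # address of the archived snapshot


def append_record(record: Record) -> None:
    """Append one record to the CSV log, writing a header row if the file is new."""
    is_new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="", encoding="utf-8") as handle:
        writer = csv.DictWriter(handle, fieldnames=[f.name for f in fields(Record)])
        if is_new_file:
            writer.writeheader()
        writer.writerow(asdict(record))


if __name__ == "__main__":
    post_url = "https://example.com/post/123"  # placeholder URL
    append_record(Record(
        evidence_location=archive_url(post_url),
        captured_at=datetime.now(timezone.utc).isoformat(),
        platform="ExamplePlatform",
        url=post_url,
        content_timestamp="2026-03-01T10:15:00Z",  # placeholder timestamp
        account_or_source="withheld",
        context_notes="Third abusive reply under the same post this week.",
    ))
```

However the log is kept – spreadsheet, database, or script – the key point is that every piece of evidence gets a consistent, dated entry stored somewhere access-controlled.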

 

Critical principles:

  • If safe and appropriate, check whether the survivor wants documentation to exist.
  • Never post documentation publicly without explicit permission.
  • Use trauma-informed and consent-based approaches.
  • Follow quantitative and ethical data-collection standards.

 


 

3. REPORT

Involve platforms and support systems

Reporting delegates responsibility to those with the authority to act. How to do this online:

  • Report abusive or harmful content directly to the platform.
  • Be as specific as possible when choosing a reason for your report, as this helps moderators understand the issue (the forthcoming toolkit contains a guide you can use). Where possible, report anonymously for your own safety.
  • Before you report the content, record the information (save, archive, screenshot, and build a corresponding dataset) in case you need it later. URLs and screenshots will also strengthen your report.
  • Use platform tools to flag non-consensual sexual content, harassment, or threats.
  • Use specialist tools like StopNCII.org. It creates a ‘hash’ (a digital fingerprint) of the intimate image or video, which is then shared with companies, including major social media platforms. The platforms can use the hash to identify matching images online and delete them (the idea of a hash is sketched after this list).
  • In serious cases, reporting may include law enforcement or specialist organisations.
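
To illustrate the idea of a ‘hash’ mentioned above: a hash is a short fingerprint computed from a file, which can be shared and matched without ever sharing the file itself. The sketch below uses a standard cryptographic hash (SHA-256) purely to make the concept concrete; StopNCII and the platforms use their own image-matching technology, which is not reproduced here.

```python
# Illustration only: compute a short fingerprint ("hash") of a local file.
# StopNCII and platform matching systems use their own hashing technology;
# SHA-256 is used here purely to make the concept concrete.
import hashlib
from pathlib import Path


def fingerprint(path: Path) -> str:
    """Return the SHA-256 hex digest of the file at `path`.

    Only this short string would need to be shared for matching;
    the file itself never has to leave the device.
    """
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    print(fingerprint(Path("example_image.jpg")))  # placeholder file name
```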

 

Tips:

  • Familiarise yourself with reporting tools and policies in advance.
  • Reporting is a valid form of intervention – even if it feels invisible.
  • You do not need to inform the perpetrator.

 


4. REACH OUT

Offer support after the incident

Even when you cannot intervene publicly, private support can significantly reduce harm. How to do this online:

  • Send a private message or supportive comment (if safe).
  • Acknowledge what happened and affirm that it was not okay.
  • Ask what support – if any – would be helpful.
  • Share relevant resources or support services.
  • Invite them into supportive communities or networks.

 

Do not:

  • Contact the perpetrator.
  • Create additional content about the survivor’s experience.
  • Encourage public disclosure.
  • “Pile on” (a rapid, collective, and often intense attack or wave of criticism directed at an individual, brand, or group by a large number of users).


5. REACT

Name the harm briefly and calmly

A short, direct response that sets a boundary – used only when it is safe to do so. Before reacting, ask yourself:

  • Am I digitally safe?
  • Could I take measures to improve my own digital security prior to engaging?
  • Is escalation likely?
  • Does it seem the person being targeted would welcome support?

 

How to intervene directly online:

  • Keep it short and factual.
  • Name the behaviour, not the person.
  • Avoid debate, arguments, or prolonged engagement.
  • Examples:
    • “That’s harassment.”
    • “That’s not okay.”
    • “That comment is sexist.”
    • “This violates platform rules.”
  • Then disengage. Do not enter a comment war. Do not amplify the abuser.

 

Additional protective steps:

  • Report abusive accounts.
  • Block abusive accounts.
  • Mute or filter harmful words using features on social media (the basic idea is sketched after this list):
    • TikTok: Creator Care Mode, keyword filter
    • YouTube: Hold for Review, Blocked Words, Hide User
    • Instagram/Facebook: Hidden Words, Profanity Filter, Moderation Assist
    • X: Hide replies, mute words, Quality Filter
  • Adjust your privacy settings.
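
The keyword-filtering features listed above all rest on the same basic principle: incoming comments are compared against a list of muted words and hidden when they match. The sketch below shows that idea generically, with made-up comments and keywords; it is not tied to any platform’s actual tools, which typically also catch variants and misspellings.

```python
# Generic illustration of how a keyword mute/filter works.
# The muted words and comments are made up; real platform filters apply
# their own matching rules and usually catch variants and misspellings too.
MUTED_WORDS = {"insult", "slur"}  # words you never want to see


def is_muted(comment: str) -> bool:
    """Return True if the comment contains any muted word (case-insensitive)."""
    lowered = comment.lower()
    return any(word in lowered for word in MUTED_WORDS)


comments = [
    "Great work on this project!",
    "What an insult to everyone watching.",
]
visible = [comment for comment in comments if not is_muted(comment)]
print(visible)  # only the supportive comment remains
```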

 

6. RESET

Aftercare for digital bystanders

Intervening – at any level – can be emotionally and digitally taxing. Looking after yourself is part of responsible bystander action. After intervening:

  • Stay alert for suspicious messages or links.
  • Review your privacy and security settings.
  • Take breaks from harmful content.
  • Seek peer or professional support if needed.

 


More on our Cyber Allies Toolkit

This guide was developed as part of the forthcoming Cyber Allies Toolkit that focuses on tackling technology-facilitated gender-based violence (TFGBV) by equipping young people with practical knowledge, critical thinking skills and real-world strategies. Its mission is simple but urgent: to empower young people to create safer digital spaces and become cyber allies to their peers.

By localising expertise within young people themselves, the toolkit supports the development of responsible technology users, emerging TFGBV researchers and everyday digital defenders who understand how to identify harm, respond safely and challenge injustice online. To maximise accessibility and reach, the toolkit will be available in six languages: English, Amharic, Afaan Oromo, Tigrigna, Somali and Afar.

While the core toolkit is a comprehensive self-guided resource – complete with activities and reflection exercises – it will be released alongside a dedicated facilitation guide. This companion resource transforms the toolkit into a structured, interactive training programme, offering additional exercises, discussion prompts and facilitation strategies tailored for workshops, classrooms and group learning environments.

The toolkit has been co-authored by Media Smart Youth Ethiopia and the Centre for Information Resilience. It represents a collaborative effort shaped by educators, researchers, youth advocates and – most importantly – young people themselves. Illustration adaptations by Temar Ermias (original resources from Freepik).

Get involved

  • Would you like to learn more about the Cyber Allies Toolkit?
  • Interested in organising a training session?
  • Looking to partner to adapt the toolkit to your country, culture or language?

We welcome collaboration. Get in touch to explore how we can work together to build safer digital communities – locally and globally: [email protected]

 
