Use of coded language by far-right extremists online: Emojis, numbers, and symbols reinforce in-group identity

Spotlight on Extremism


By Francesca Gentile and Isabella Gomez O’Keefe

This piece was originally published on VOX-Pol as part of their series featuring contributions from presenters at the VOX-Pol Next Generation Network Conference 2025, held at Charles University in Prague, Czech Republic.

Our recent research, presented at the VOX-Pol Next Gen Conference in Prague, explored how far-right extremist accounts on TikTok use coded language to evade content moderation systems in order to establish and reinforce in-group and out-group identities; recruit, radicalise, and mobilise individuals; and spread disinformation, conspiracy theories, and hateful and violent speech.

While coded language, and dog-whistles specifically, are not new, many scholars argue that they are rapidly evolving in online spaces. They are also increasingly deployed in association with undemocratic and extremist ideologies, allowing these perspectives to become more prevalent in mainstream spaces and demonstrating the need for up-to-date studies of their usage on social media.

Emojis, symbols, numbers, and context-specific terms can all be used to reference neo-Nazi and/or white supremacist tropes – creating a language that is understood by those who “belong to the in-group”. This language is also often used to portray negative, or even antagonistic, depictions of a perceived out-group. For example, a number of dog-whistles used racialised emojis, such as a ‘taco’ to represent Hispanic communities and a ‘gorilla’ and ‘watermelon’ to represent Black communities. These reinforce out-group stereotypes and can even result in dehumanisation. Ultimately, this type of language reinforces collective identity, making individuals feel as if they are the only ones “in the know”.

The use of in/out-group portrayals online has commonly been linked to collective action and mobilisation. More specifically, many of the framing techniques outlined in Benford and Snow’s (2000) model of collective action framing (CAF) were observed in the comments, and were even used to glorify or call for collective violent action. In particular, historical references and glorification of the in-group were used to demonstrate capacity for action, while out-group depictions often engaged the antagonistic, diagnostic, and prognostic framing methods laid out by the CAF model.

Dog-whistles were also used as a means of welcoming others to the in-group, with some users in the comment section even explaining to others how to use them correctly. This demonstrates how dog-whistles, and the process of social grouping used within them, may at times result in radicalisation or extremism.

Coded language and dog-whistles are also used to spread extremist ideologies and ideas while evading content moderation, making it challenging for social media companies, policymakers, and lawmakers to regulate their usage.

 

Case study: coded language on TikTok

In January 2025, the Daily Mail posted a TikTok video of Elon Musk’s speech at President Trump’s inauguration, which included his controversial gesture that sparked widespread online debate. Musk’s actions dominated online discourse for several months, with people on both sides of the political spectrum debating his intentions and motivations. However, the focus on Musk distracted from a key question that needed to be analysed: how extremists exploited the public debate to share far-right and radical ideologies in online spaces.

The video posted by the Daily Mail was selected for its high engagement: 70M views, 3.2M likes, 131K comments, and 800K shares as of the 26th of January. By limiting the research to one video, we were able to carry out a deep-dive analysis of the comment section. We manually collected, analysed, and divided comments into four categories:

  1. General praise and support for Elon Musk’s actions – accounts praised his actions using terms such as ‘GOAT’ (greatest of all time), ‘the best,’ ‘sigma’ (meaning successful and popular), ‘based’ (slang for someone who speaks the truth, especially on a controversial topic), and ‘legend.’
  2. International responses – users commented in various languages, including Spanish, French, Russian, and German. German in particular was used for direct references to Nazi Germany, Adolf Hitler, and the Alternative for Germany (AfD) party.
  3. Comments reinforcing in-group/out-group identities – classic “us vs. them” narratives that delineate insiders from outsiders.
  4. Far-right extremist coded language – using emojis, symbols, numbers, and historical references to promote hate and extremist ideologies and to call for violence.

We identified several coded terms being used to spread hate and avoid content takedowns; these included the following (a sketch of how such a lexicon might be used for detection follows the list):

  • Numbers: several numbers appeared in the comment section, including 88, 6, 23, and 271. The number 88 is used by neo-Nazis and far-right extremists to refer to the 8th letter of the alphabet, H; 88 therefore stands for ‘HH’, which in this context means ‘Heil Hitler.’ One account mentioned that he “watched the video 88 times, because 6 weren’t enough”, referring to the six million Jews who died during the Holocaust and alluding to the idea of carrying out more violence against the Jewish population. The number 23 was also mentioned in one of the comments, representing the 23rd letter of the alphabet, ‘W.’ One comment paired 23 with a DNA emoji, possibly referring to the supposed superiority of the white race. Another comment alluded to the neo-Nazi conspiracy theory that only 271,000 Jews died during the Second World War.
  • Context-specific terminology: references to Nazi Germany or the Roman Empire, including terms such as “Heil Hitler”, “Ave Cesare”, and “Sieg Heil”. To avoid content takedowns, accounts also used altered spellings such as ‘hail’ or “zigal alitla” (approximating ‘Sieg Hitler’). The German song “Erika,” a marching song popular in Nazi Germany, was also referenced throughout the comment section; comments mentioning it would quote its opening words, “Auf der Heide” (‘on the heath’).
  • Repurposed emojis: ordinary emojis used to spread hate. These included the salute and raised-hand emojis, used to show support for Musk’s gesture; the black, white, and red square emojis, referencing flags of Nazi Germany; and the ninja emoji, used to spread hate towards the Black community.
  • Neo-Nazi related imagery: some accounts had images of Adolf Hitler as their profile pictures; others had swastikas or pictures of concentration camps. Usernames also included swastikas and lightning bolts referring to the SS (Schutzstaffel).
  • Calls to action: some comments contained calls to action, with accounts stating that they are “ready to rise.” At times this phrase was accompanied by a rising-sun emoji, echoing another sentence common in the comment section, “the sun is rising” – referring to the idea that a new dawn is coming.
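
To illustrate how such a lexicon could feed automated detection – an approach our manual analysis points towards but does not implement – here is a minimal, hypothetical sketch in Python. The lexicon entries and the names `flag_comment` and `letter_code` are illustrative assumptions drawn from the examples above, not a deployed system or any platform’s API.

```python
# Hypothetical sketch: flag comments containing coded terms from a small
# lexicon so they can be routed to human review. Illustrative only – the
# lexicon, names, and logic are assumptions, not the authors' actual method.

import re

# Meanings as described in this article.
CODED_NUMBERS = {
    "88": "HH ('Heil Hitler')",
    "23": "W (as in 'white')",
    "271": "Holocaust-denial casualty figure",
}
CODED_PHRASES = {
    "sieg heil", "heil hitler", "ave cesare",
    "auf der heide", "ready to rise", "the sun is rising",
}
REPURPOSED_EMOJIS = {"🫡", "✋", "⬛", "⬜", "🟥", "🥷", "🧬", "🌅"}


def letter_code(n: int) -> str:
    """The alphabet-position decoding described above, e.g. 8 -> 'H', 23 -> 'W'."""
    return chr(ord("A") + n - 1) if 1 <= n <= 26 else "?"


def flag_comment(text: str) -> list[str]:
    """Return reasons this comment might deserve human review.

    Note: a bare '6' is far too ambiguous to flag on its own – exactly the
    context-dependence that lets these codes slip past moderation."""
    reasons = []
    lowered = text.lower()
    for phrase in CODED_PHRASES:
        if phrase in lowered:
            reasons.append(f"coded phrase: {phrase!r}")
    for num in re.findall(r"\d+", text):
        if num in CODED_NUMBERS:
            reasons.append(f"coded number {num}: {CODED_NUMBERS[num]}")
    for ch in text:
        if ch in REPURPOSED_EMOJIS:
            reasons.append(f"repurposed emoji: {ch}")
    return reasons


if __name__ == "__main__":
    print(letter_code(8), letter_code(23))  # H W – the 88/23 scheme
    # Paraphrase of a comment pattern described above:
    print(flag_comment("watched the video 88 times, 6 weren't enough 🧬"))
    # -> ["coded number 88: HH ('Heil Hitler')", 'repurposed emoji: 🧬']
```

A static lexicon like this ages quickly – as noted above, altered spellings such as ‘hail’ and “zigal alitla” mutate precisely to defeat exact matching – so any real pipeline would need continuous updating and human review of flagged content.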

 

Conclusion

Our research has shown that far-right extremist communication is evolving. Coded language is not just a method of communication: it is a tool for recruitment and radicalisation. It enables far-right extremist communities to reinforce their identity, normalise hate and violence, and operate beneath the radar of platform moderation systems. If platforms are unable to detect and remove this coded language, hate, disinformation, and conspiracy theories will continue to spread.

To better detect and disrupt extremist content online, this coded language needs to be understood and analysed. Future research could combine qualitative and ethnographic approaches with the creation of a database of coded terms that examines this language in more depth.


 

Francesca Gentile is an open-source researcher at the Centre for Information Resilience, specialising in disinformation narratives, social media monitoring, information operations, and far-right extremism.

Isabella Gomez O’Keefe is a PhD student and Researcher at the University of Cambridge & Queens’ College, where she specialises in socio-cognitive framing of social identity, ‘us & them,’ and calls for violent action on mainstream social media.
