In this blog, one of our experienced Tradecraft Advisors explores how the evolution and widespread migration of language and meme culture – within both niche communities and mainstream online audiences – make identifying risks from violent extremism challenging, and how OSINT can help.
Since its inception, social media has facilitated instantaneous dialogue across the globe, bringing unprecedented levels of connectivity and communication to the digital age. While these platforms have been integral in disseminating valuable information – and directly shaping global movements and narratives – they have also become breeding grounds for extremist views, propagating hate speech and extremist ideologies across class, creed, and country.
Similar to how cultural and regional differences change how people speak and interact with each other, online spaces have their own slang, jokes, and intra-societal norms that dictate the way individuals interact. Language and slang allow “cultural groups to create their own kind of territory through language.” Below are two examples of the normalization of terminology and trends migrating from extremist communities into the mainstream.
Incels (involuntary celibates) are an extremist group made up of heterosexual men who define themselves by their inability to have sex or relationships with women. Incels believe that women, genetics, and societal pressures predetermine men’s ability to form heterosexual relationships. In their view, a large portion of men struggle to form relationships, while women largely have the freedom to form relationships with the most desirable men. Incels feel entitled to sex and relationships, and these feelings lead to an outpouring of aggression and hate toward women.
Incels, like other insular groups, have developed their own terms and meme culture. Examples include shorthand such as “MGTOWs” (Men Going Their Own Way), “Chads” and “Gigachads,” “Staceys,” “Redpill,” and “based.” The term “based” is a classic exemplar of the dynamic way terms spread throughout the internet. “Based” originally referred to someone addicted to smoking crack cocaine. Over time, the term expanded its applicability and is now used by Incels, racists, and violent extremists as a form of admiration for an individual who does not care “about others’ opinions or being politically correct,” posting and behaving in accordance with their extremist beliefs.
Initially linked with the Incel community and other extremist ideologies, “based” has now found widespread use among younger generations, often devoid of its initial connotations. An article from Fox News highlights that the term “based” was even included in an FBI watch list due to its association with extremism.
However, “based” is now commonly used simply to signify agreement or appreciation. This migration and evolution of meaning underscore the complexities of online linguistics and the need to stay current as terms shift in application. A fundamental unawareness of such shifts could lead to incorrect, outdated, and potentially dangerous assumptions about how a term is being used.
As slang and memes are co-opted and modified by extremist groups, the risks of false positives also rise. Labeling an innocent user as extremist because they used a term like “based” without understanding its changing connotations can be extremely detrimental to an individual or an agency.
A recent article by the Global Network on Extremism and Technology (GNET) examined a TikTok trend known as “gnome hunting” – an antisemitic trope that refers to Jewish people as “gnomes.” On the surface, the trend seemed innocent, and many users engaged in it without realizing its original meaning. Those unfamiliar with the trend’s background might inadvertently perpetuate harmful stereotypes or expose themselves to extremist ideologies. Even as the meaning becomes more widely known, extremist users continue to rely on such tropes and memes to conceal the concept of hunting for Jewish people. The normalization of trends and language in the mainstream needs to be considered: what was a unique phrase in a niche corner of the internet may take on a life of its own, and monitoring that phrase may become too noisy to be useful.
While language inevitably evolves over time on its own, extremist communities also actively modify terms and memes to frustrate detection and investigations. They continuously devise methods to bypass detection, including new codes, symbols, and narratives, or alphanumeric substitution – for example, $ for S, or 5 for the Arabic letter ح.
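To illustrate why simple keyword filters miss these substitutions, the sketch below normalizes common character swaps before matching. The substitution map and the watchlist term are illustrative assumptions for this example, not an actual detection ruleset or any vendor's implementation.

```python
# Minimal sketch: undo simple alphanumeric substitutions (e.g. $ -> s)
# before keyword matching. The mapping and watchlist are hypothetical.

SUBSTITUTIONS = {
    "$": "s",
    "5": "s",
    "3": "e",
    "1": "i",
    "0": "o",
    "@": "a",
}

def normalize(text: str) -> str:
    """Lower-case the text and replace known substitute characters."""
    return "".join(SUBSTITUTIONS.get(ch, ch) for ch in text.lower())

def matches_watchlist(text: str, watchlist: set[str]) -> bool:
    """Return True if any watchlist term appears in the normalized text."""
    normalized = normalize(text)
    return any(term in normalized for term in watchlist)

watchlist = {"based"}  # hypothetical term of interest
print(matches_watchlist("b@5ed", watchlist))  # True: "b@5ed" normalizes to "based"
print(matches_watchlist("hello", watchlist))  # False
```

A naive filter matching the raw string “based” would miss “b@5ed” entirely; normalizing first closes that gap, though real systems must also handle Unicode look-alikes and deliberate misspellings.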
The Power of OSINT
National security and law enforcement analysts cannot afford to miss crucial insights or mistake innocent terminology for indicators of criminal intent. Through powerful OSINT platforms like Fivecast ONYX, analysts can gain insights into emerging extremist trends and narratives.
AI-enabled risk analytics are built into Fivecast ONYX to assist in identifying and monitoring developing narratives, sentiment, and emotion over time. Customizable risk detectors enable analysts to define keywords and phrases based on memes or online culture, as well as detect text in images through Optical Character Recognition (OCR).
To combat the spread and adoption of extremist ideologies online, law enforcement agencies, governmental bodies, and cybersecurity experts must provide analysts with the tools they require to identify and address potential threats as they migrate across platforms and before they escalate to violent action.