
Violent Extremist Language in Incel Communities Has Steadily Increased Over the Past 6 Years


A comprehensive study examining violent extremist language in online incel communities has revealed a worrying trend of increasing extremism over time. The researchers analysed posts from various platforms frequented by incel communities to track changes in language and the prevalence of violent extremist terms.

The study aimed to provide a more nuanced understanding of the incel subculture by examining the evolution of its language across multiple platforms over several years. The researchers used a custom-built incel violent extremism dictionary (IVED) to measure the salience of violent extremism in these communities’ discussions.

The incel (involuntary celibate) subculture has gained notoriety for its misogynistic and violent rhetoric, often blaming women and society for the perceived injustices faced by its members. This study sought to determine whether this rhetoric has become more extreme over time and how it varies across different online spaces.

The researchers compiled a vast corpus of incel online content, spanning 2014 to 2022 and covering major incel forums, subreddits, and other online spaces. They used computational methods to analyse this content, focusing on the frequency and context of violent extremist language. The IVED included terms explicitly associated with violence, dehumanising labels, and derogatory language aimed at outgroups, particularly women.
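In essence, this kind of dictionary-based approach measures how large a share of all words posted in a given period match entries in the dictionary. The sketch below illustrates that general idea in Python; the placeholder terms, function names, and data format are illustrative assumptions, not the study's actual dictionary or analysis pipeline.

```python
import re
from collections import defaultdict

# Hypothetical placeholder entries -- NOT the study's actual IVED terms.
DICTIONARY = {"term_a", "term_b", "term_c"}

def tokenise(text):
    """Lowercase a post and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def yearly_salience(posts):
    """Return, per year, the share of tokens matching dictionary terms.

    `posts` is an iterable of (year, text) pairs.
    """
    hits = defaultdict(int)    # dictionary-term tokens per year
    totals = defaultdict(int)  # all tokens per year
    for year, text in posts:
        tokens = tokenise(text)
        totals[year] += len(tokens)
        hits[year] += sum(1 for token in tokens if token in DICTIONARY)
    return {year: hits[year] / totals[year]
            for year in sorted(totals) if totals[year]}

# Toy usage example
posts = [(2018, "a post containing term_a"), (2019, "another post")]
print(yearly_salience(posts))
```

Tracking this proportion year by year, and separately for each platform, is what allows a rising or falling trend in extremist language to be compared across the incelosphere.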

The findings were published in the journal Terrorism and Political Violence.

The study found that the use of violent extremist language in incel communities has steadily increased over the past six years. This trend was particularly pronounced on major platforms like Incels.is, which emerged as a primary hub for incel discussions following the shutdown of earlier subreddits like /r/Incels and /r/Braincels.

One of the critical insights from the study was the responsiveness of incel communities to real-world events. For example, the 2018 Toronto van attack by Alek Minassian, an incident widely celebrated within incel circles, led to a noticeable spike in violent extremist language. Similarly, the COVID-19 lockdowns and subsequent offline events were correlated with increased extremism in incel discussions, though this was not uniformly observed across all platforms.

The research highlighted significant differences in the level of violent extremist language across various incel platforms. Forums like Incels.is exhibited higher levels of such language than subreddits and other online spaces. This finding underscores the heterogeneity within the incelosphere, suggesting that some platforms are more conducive to extremist rhetoric than others.

For instance, newer subreddits created after the shutdown of /r/Braincels showed significantly lower levels of violent extremist language compared to the forums. This suggests that while the incelosphere as a whole has become more extreme, this trend is not uniformly distributed across all platforms.

The findings of this study have important implications for law enforcement and policymakers. As extremism rises in online communities, offline violence inspired by online rhetoric becomes a growing threat. The study underscores the need for a nuanced approach to monitoring and addressing the activities of these communities, one that takes into account the variations in extremism across different platforms.

The researchers also call for further investigation into the dynamics of incel communities, including cross-platform migrations and the potential emergence of new hubs for incel discussions. Understanding these dynamics is crucial for developing effective strategies to counter the spread of violent extremist ideologies.

The study suggests several avenues for future research. One area of interest is the exploration of visual tropes used in incel communities, such as the use of images and avatars that glorify violence and extremist figures. Additionally, the researchers highlight the need to examine the role of mental health issues within the incel subculture, particularly the prevalence of discussions around suicide and self-harm.

Another important direction for future research is to analyse the impact of specific events and interventions on the levels of violent extremist language. This could help identify effective strategies for mitigating the influence of extremist rhetoric in online communities.
