Dr. Elisa Orofino is the Academic Lead for Extremism and Counter-Terrorism at The Policing Institute for the Eastern Region (PIER) at Anglia Ruskin University (ARU). She can be reached at: [email protected]
In a post-pandemic society, online radicalization stands as a real and pressing threat.
The use of the internet by extremist groups to disseminate their ideologies has been amplified by the COVID-19 pandemic, which kept people confined to their homes, where most of their interactions happened in front of a screen. Recent studies have highlighted the role of specific online ecosystems as fertile ground for radicalization. More precisely, social media platforms such as Facebook, Twitter and YouTube have worked (and continue to work) as efficient tools for extremist and terrorist groups to recruit and train new people while disseminating specific information.
Gaming Environment
Relevant research also suggests that various extremist, hate and disinformation actors and communities are increasingly transitioning from social media towards a more diverse range of online spaces that offer even less moderation but greater privacy, security or anonymity. Some of these platforms are connected to the gaming environment, and they include Reddit, 4chan (and 8chan), Discord, and Twitch. The extremism found on these platforms is mostly far-right/white supremacist, often intertwined with typical concepts of the manosphere (e.g. misogynist attitudes) as well as the glorification of Islamist terror practices (e.g. beheadings), and the platforms have already worked as virtual means to broadcast horrors around the world. For example, the Christchurch (New Zealand) attack in 2019 was livestreamed on Facebook, while Twitch was used in 2022 to broadcast the Buffalo attack in New York. Both terror acts had a far-right ideological reference. Once uploaded, such videos are very hard to remove. They were watched by thousands of individuals around the world and continue to inspire acts of terrorism to date.
The online gaming ecosystem also appears to be quite prolific for radicalization and extremism. Referrals to the Prevent scheme, one of the earliest de-radicalization support programs in the UK, are characterized by a young population (under 18), and this seems to be a predominant characteristic all over the world. Extremist groups are using video games to appeal to young people in a language and form that is very accessible to them. Popular games such as Call of Duty and Medal of Honor foster an “us vs them” narrative in which players collaborate to fight a common enemy, often producing a sense of shared thrill for the mission. War-play has worked very well for armed groups like Hezbollah, which have created their own games to recruit members and share their worldview. More precisely, in 2007, Hezbollah released Special Force 2: Tale of the Truthful Pledge as a follow-up to Special Force (released by Hezbollah in 2003). The game is set during the 2006 Lebanon War between Hezbollah and Israel, and players take on the role of Hezbollah fighters in a first-person shooter.
While, in theory, games can support violent radicalization processes by normalizing violence, fostering a sense of self-efficacy in the individual and encouraging his/her identification with violent avatars, there is no empirical evidence that violent games foster violent behavior in their users. Besides war-play, other gaming styles have also been used by extremists. One example is a Roblox driving game titled “Become a Racist”, which invites players to simulate the murder of people belonging to ethnic minorities by running them over with a car.
While much extremist content, like the examples highlighted above, is out in the open, some extremist material can also be well hidden in the online ecosystem. Fake news, personal blogs, conspiracy theories, emerging YouTube channels, social media profiles and group pages can all feature messages and conversations about topics that can be deemed “extreme”.
Defining Extremism
But what does “extremism” mean? Although the term is much contested, I would describe it, in line with the UK government’s definition, as “vocal or active opposition to fundamental values such as democracy, the rule of law, individual liberty and the mutual respect and tolerance of different faiths and beliefs”. While this government definition is intentionally quite broad, some useful elements can be extracted from it to help identify extreme content online.
Regardless of the specific ideology and narratives opposing the “fundamental values” mentioned above, extremists usually have three elements in common: an enemy to fight, a group of victims to protect, and a desired change. All extreme ideologies (and the groups associated with them) have a clear enemy in mind that needs to be annihilated. Whether it is a specific group in society (e.g. ethnic minorities, women, LGBTQ+ people), political actors (e.g. states, governmental organizations) or private companies (e.g. cosmetics or fashion brands), the enemy needs to be actively opposed. The methods used to fight the enemy vary from group to group, even within the same ideological trend, and this is how we differentiate between violent and vocal extremism. To give an example, according to Islamists, the enemy to fight is the West, which stands as a corrupt political, social, economic and cultural system.[1] Groups like ISIS or Boko Haram encourage their members and followers to physically fight the enemy, using violence and supporting the killing of innocents to disseminate terror. Conversely, groups like Hizb ut-Tahrir and Tablighi Jama’at encourage their members to fight the enemy intellectually, refusing violence as a viable means to achieve the desired change.
Besides an identifiable enemy to blame, extremist groups also clearly address a group of victims they allegedly advocate for. The victims are presented as being in serious danger, and the extremist group is portrayed as the only actor able to offer them knowledge and protection.[2] Extremist groups exploit specific grievances in society to incite people. Those who feel disenfranchised or lonely, who feel they do not fit into the society in which they live, or who have (multiple) mental health conditions constitute a particularly vulnerable group that can be radicalized more easily than others. These people are often desperate for a change, both in their personal lives and in the broader context surrounding them. The choice to join a certain extremist group or to espouse a specific set of extremist ideas therefore provides them with a purpose in life and a sense of excitement about the desired change.[3]
Preventing Terror
A recent study by Moonshot has shown how peer support can play a pivotal role in disengaging people from extremist views and preventing acts of terrorism. Moonshot highlighted how several mass shooters, such as the 19-year-old Parkland, Florida shooter in 2018 and the Red Lake, Minnesota attacker in 2005, had posted cries for help online before committing their acts. The study also highlights the need for more specific bystander interventions to help prevent acts of terror by intervening early.
In line with Moonshot’s recent findings, previous research on the linguistic features of terrorist manifestos has identified a “leakage” warning behavior, defined as “the intentional or unintentional signalling of planned violence in public or non-public communications”.[4] This leakage can be linked to the individual’s bid for attention or to a need for help in coping with anxiety about the act he/she intends to commit. In the UK, a number of resources exist to help educators and parents spot the signs of radicalization in young people. Some of these consist of toolkits offering guidance on the signs of radicalization and the drivers of extreme ideologies, as well as classroom resources for discussing uncomfortable topics at school. However, there is a lack of bystander interventions and resources, especially for young people who wish to help their peers disengage from extreme ideas.
Conclusion
In the UK, if a young person is concerned about one of their friends, they can speak to the safeguarding team at school or to the family. The young person “at risk” can then be referred to the Prevent scheme, which was conceived to assess the risk an individual poses as a potential terror threat. Channel, the main program within Prevent, is a voluntary support program for individuals who are at risk of becoming terrorists. Support programs are tailored to the specific case and are regularly reviewed to check how the person is progressing. Although Prevent has been at the center of a harsh debate over the years for allegedly stigmatizing the Muslim community as the “dangerous other”, individuals engaged in Channel support programs, according to my observations and interviews with practitioners, rarely refuse the support offered and normally feel well supported throughout the program.
Last but not least, it is important to note that all support tools and programs work best when they are part of an integrated approach that involves the people the individual cares about and is bonded with, and when they address the grievances that led him/her to engage with specific groups and ideologies, both online and offline.
European Eye on Radicalization aims to publish a diversity of perspectives and as such does not endorse the opinions expressed by contributors. The views expressed in this article represent the author alone.
____________________________
References
[1] Orofino, E. 2020. Citizenship in the Minds of Political Islamists. Australian Journal of Islamic Studies 5 (2): 29-50.
[2] McCauley, C. and Moskalenko, S. 2008. Mechanisms of Political Radicalization: Pathways Toward Terrorism. Terrorism and Political Violence 20 (3): 415-433. doi: 10.1080/09546550802073367.
[3] Al-Attar, Z. 2020. Autism spectrum disorders and terrorism: how different features of autism can contextualise vulnerability and resilience. The Journal of Forensic Psychiatry & Psychology 31 (6): 926-949. doi: 10.1080/14789949.2020.1812695.
[4] Ebner, J., Kavanagh, C. and Whitehouse, H. 2022. Is There a Language of Terrorists? A Comparative Manifesto Analysis. Studies in Conflict & Terrorism: 1-27. doi: 10.1080/1057610X.2022.2109244.
© 2018 EER – Copyright © European Eye on Radicalization.