Technology and social media provide extremist groups with a unique global platform to spread their ideology and indoctrinate young recruits. The Islamic State exemplifies the use of technology to disseminate high-quality propaganda that emulates Hollywood movies and popular video games. Using a variety of social media platforms, the Islamic State has reached millions of people and encouraged tens of thousands to leave their homes and join the Caliphate.
As a result, companies like Facebook and Google unexpectedly found themselves at the forefront of the global war on terror. Following billions of dollars of investment in human capital and technology, Silicon Valley is finally succeeding at flagging and removing terrorist content. Despite these successes, evidence suggests that online gaming is becoming a new modus operandi for extremist groups to communicate with and attract young recruits.
Online gaming is one of the fastest growing industries. In 2018, more than 200 million people viewed the League of Legends World Championship final, more than twice the number who watched the NFL Super Bowl. According to Pew Research Center polling, 97 percent of teenage males and 83 percent of teenage females play video games.
Even as video games become ubiquitous in modern society, controls around their content have grown less clear. Video games provide an unmonitored environment where extremists, from the Islamic State to neo-Nazis, can contact and groom potential recruits from around the world.
Governments and intelligence services have expressed concern in recent years that video games, particularly in-game chat and messaging services, are tools that extremist groups use to communicate propaganda and groom young recruits.
In 2013, documents leaked by National Security Agency (NSA) contractor Edward Snowden revealed that both the NSA and Britain's Government Communications Headquarters (GCHQ) were tracking Islamic extremists using video games such as World of Warcraft and Second Life. In 2016, officials in the United Arab Emirates claimed that young Emiratis were being recruited by ISIS using online games.
There is also evidence of this trend on the Deep and Dark Web. As recently as January 2019, Islamic State outlets used Telegram groups to provide supporters with specific instructions for how to use gaming platforms to recruit new members. The use of video games is a sufficiently prominent recruitment strategy that a member of the Islamic State's deep web forum recently requested that the Flames of War 2 propaganda video be formatted to run on PlayStation Portable devices.
The content of modern video games also generates concerns. One game in particular, Counter-Strike, allows players to play as terrorists attempting to perpetrate an attack. In what has been dubbed the "Gaming Jihad," terrorist organizations have exploited violent multiplayer first-person shooter games and violent imagery to attract young recruits. In 2014, the Islamic State even developed a propaganda film designed to look like the popular video games Call of Duty and Grand Theft Auto, appealing to young gamers by glorifying and romanticizing video game violence.
The Lebanese group Hezbollah has developed its own video games, in which players can do battle against the Israel Defense Forces or kill ISIS fighters in Syria.
While Islamist groups have been at the forefront of exploiting electronic media, growing evidence indicates that online gaming platforms are becoming a haven for white-nationalist indoctrination efforts across Europe and the United States.
Starting in 2017, the Valve Corporation and its online gaming platform Steam have been in the spotlight for the ubiquitous presence of neo-Nazi groups operating on the platform. Steam offers community-based online gaming services, where gamers can find one another and connect using profiles, messaging, and voice and video chat. Steam currently has over 125 million active users; however, the community is almost entirely unregulated.
Recent media coverage of Steam has revealed thousands of users and community pages that support neo-Nazi white supremacist groups. There are similarly thousands of pages dedicated to glorifying school shootings.
Given the volume of this toxic content, it is concerning that video games and online gaming communities have largely flown under the radar compared to other social media messaging and communication platforms. As a result, gaming is providing extremists with an appealing alternative platform to communicate, disseminate propaganda, and conduct targeted recruitment.
EXTREMIST RECRUITING IN VIDEO GAMES
Online gaming is a perfect extremist incubator, allowing recruiters to identify potential recruits, make contact, develop rapport, and then indoctrinate them. There are a number of factors that make online gaming a unique alternative space for recruitment.
Online gaming communities represent a pool of potential recruits from which extremists can identify and target individuals who are more likely to be sympathetic to their cause. In online communities like Steam, extremists can easily search among users and groups, or even advertise their ideologies on community pages and forums. Video games can also be a self-selecting environment. For example, gamers drawn towards violent games may be more susceptible to discussing or committing real acts of violence.
One of the critical steps in radicalization is finding recruits who perceive injustice, feel powerless, or have a crisis of identity. Hundreds of groups on Steam idolize mass school shooters because their users are often victims of bullying. Extremist groups can manipulate these feelings to misplace blame or aggression. For example, a neo-Nazi recruiter might try to convince a Steam community of bullied teenagers that ethnic and religious minorities are to blame for how they feel. The Islamic State has used this same tactic to target immigrant youth in Europe and translate their feelings of dislocation into a hatred of the West.
The online gaming space provides a unique environment where people are in contact remotely and anonymously. Online gaming is designed to foster closeness with friends and teammates whose real identities often remain anonymous. This allows extremists to build rapport with potential recruits. Video game platforms are also innately less suspicious than other forms of communication, and propaganda can be slowly socialized without users becoming alarmed.
Gamers often substitute socialization on the platform for engagement in their surrounding communities. As extremists develop relationships with online recruits, they can start to introduce increasingly radical ideas to test and vet their candidates. Without outside intervention, people who are isolated and vulnerable to radicalization tend to validate the extremist messaging they receive. They also have fewer opportunities to challenge the rhetoric they are hearing. Over time, their isolation leads to normalization of extremist views and hate speech. One clear concern on a platform like Steam is that if thousands of pages of "toxic" white supremacist content are allowed to remain active, this starts to normalize neo-Nazi behavior on online gaming platforms. Given the volume of teenagers playing video games in the United States, this could have an enormous effect on shaping dialogue around white nationalism.
A WAY FORWARD FOR THE INDUSTRY
In the future, video game platforms will remain an appealing tool for extremist groups to communicate their message and recruit young gamers. There is already growing evidence of a nexus between video games, online isolation in gaming communities that espouse radical and violent views, and a network of white nationalists that are engaging vulnerable youth to groom and indoctrinate a future generation of white nationalists.
Currently, the gaming industry is trying to self-police this problem. The media company Riot Games relies on volunteers to moderate game-related chats, while chat platforms such as Discord actively ban far-right extremist communities.
However, banning all types of "toxic behavior" is a daunting technical challenge. Microsoft, PlayStation, and Steam host hundreds of millions of active monthly users and house tens of thousands of users and groups engaged in extremist content.
Effective solutions will require a holistic approach that blends innovative analytic tools with clear policy guidelines. The online gaming industry needs analysts and AI tools that can find and flag extremist content. These tools need to be robust and capable of monitoring enormous volumes of in-game text chat and voice chat that is often ambiguous, coded, and occurring in hundreds of different languages.
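To illustrate the scale of the flagging problem, consider the simplest possible approach: matching chat messages against a watchlist of coded terms. The sketch below is a hypothetical, minimal baseline in Python (the watchlist terms are invented for illustration); real moderation systems rely on trained multilingual classifiers precisely because coded, ambiguous language defeats keyword matching like this.

```python
import re

# Hypothetical watchlist of coded recruitment terms (invented for illustration).
# A production system would use trained, per-language classifiers instead,
# since extremists rotate coded vocabulary faster than lists can be updated.
WATCHLIST = {"join_the_cause", "recruit_here"}

def flag_message(message: str) -> bool:
    """Return True if a chat message contains any watchlisted token."""
    tokens = re.findall(r"[a-z_]+", message.lower())
    return any(token in WATCHLIST for token in tokens)

def flag_batch(messages: list[str]) -> list[str]:
    """Filter a batch of chat lines down to those needing human review."""
    return [m for m in messages if flag_message(m)]
```

Even this toy filter hints at the core difficulty the paragraph above describes: each new coded term, misspelling, or language requires updating the rules, which is why the industry needs adaptive AI tools rather than static lists.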
Effective strategies also need to go beyond simply policing the platforms. De-platforming toxic actors is important, but a whack-a-mole approach to counter-extremism is reactive. Having clearer policies and enforcement mechanisms to prosecute content violators, or even coordinate with law enforcement, is equally important for deterring extremist activities over the long term.