At the beginning of December, the Bavarian Regulatory Authority for New Media (Bayerische Landeszentrale für neue Medien, BLM) and the Bavarian Ministry for Family presented a new study, commissioned by the two state entities, on extremism, especially right-wing extremism, in and around online gaming. The study, titled „(Rechts)Extremismus und Online-Games“, was carried out by modus – Zentrum für angewandte Deradikalisierungsforschung (modus zad) and the Peace Research Institute Frankfurt (Leibniz-Institut für Friedens- und Konfliktforschung, PRIF). While the study's results are published in German, Dr. Linda Schlegel of modus zad has provided an English executive summary of the results for GamesMarket, along with ten recommendations for the games industry distilled from the report.
Research Report on Extremist Influence in Digital Gaming Spaces: Key Findings and Recommendations for the Gaming Sector
Although extremists have been developing bespoke propaganda games since the 1980s, the potential nexus between gaming and extremism has only gained heightened attention since the livestreamed and gamified right-wing extremist terror attack on two mosques in Christchurch (New Zealand) in 2019. Because the perpetrator made several references to games and gaming culture, researchers began exploring extremist influences in digital gaming spaces, uncovering that the gaming sphere is exploited by extremist actors in various ways.
In addition to the continuous production of bespoke propaganda games, this includes the instrumentalization of commercial games - for example by creating modifications with extremist content and utilizing in-game communication features - the dissemination of propaganda material on gaming platforms, the appropriation of gaming aesthetics in propaganda videos and memes, as well as the use of gamification mechanics in extremist forums and servers.
These are not isolated incidents. Surveys of gamers across the world show that many have been exposed to hate, toxicity, and extremist material in online gaming spaces, particularly right-wing extremism and identity-based hate such as racism and antisemitism. Among those affected by such material are children and teenagers, including in Germany. Extremist influence in digital gaming spaces is therefore an increasingly relevant issue for German policymakers, regulatory authorities, youth protection experts, and actors seeking to prevent and counter (violent) extremism (P/CVE).
To explore the issue in depth, the Bavarian Regulatory Authority for New Media (Bayerische Landeszentrale für Neue Medien; BLM) in cooperation with the Bavarian Ministry for Family, Work, and Social Affairs (Bayerisches Staatsministerium für Familie, Arbeit und Soziales; StMAS) commissioned a study on extremist activities in digital gaming spaces, implications for youth protection, and avenues to use gaming for political education and P/CVE. The research report “Wenn Hass mitspielt: Forschungsgutachten (Rechts-)Extremismus im Gaming-Bereich” was carried out by modus - Centre for Applied Research on Deradicalisation, supported by researchers from the Peace Research Institute Frankfurt.
Key Findings of the Report
The study combined a literature review, interviews with both gamers and experts, participant observation in eight online games, and data collection on gaming and gaming-adjacent platforms. Key findings include:
- Digital gaming spaces of all types are used to disseminate toxicity, hate, and extremist content. This includes games, chats, user-generated content, livestreams, forums, mods, and user profiles. Because such content is so prevalent, users may stumble upon it without searching for it.
- Real-life events, including elections, attacks, protests, and international conflicts, but also seemingly apolitical events such as the release of a new game, can lead to an uptick in propaganda material in gaming spaces. For instance, after an act of mass violence such as a terror attack or a school shooting, hundreds of profiles glorifying the attacker(s) are created within a short timespan.
- Experts assess that hateful and extremist material in the gaming sphere is most likely created and disseminated both strategically and organically. That means that it is likely that extremist groups are trying to exploit gaming for strategic reasons, while, simultaneously, users without connections to such groups also create and disseminate extremist content.
- Hateful and extremist material can and does influence opinion-building, particularly in young users who have not yet developed their own, secure worldview. However, there is no indication that gaming ‘causes’ radicalization in any shape or form or that the gaming sphere is generally more susceptible to extremist influence than other digital spaces. A revival of the moral panic surrounding video games in the 1990s is neither useful nor does it correspond to the evidence on extremist influence in gaming spaces so far. Nevertheless, the normalization of toxicity and identity-based hate in some gaming communities can provide a fertile ground for extremist influence and may discourage users from speaking up against such content.
- Gaming itself and connecting with others in gaming communities often has far-reaching positive effects on users. Concerns surrounding extremist influence should not overshadow these effects. Rather, the positive power of gaming should be used to support efforts in preventing and countering extremism as well as in political education. Gaming content and spaces provide such efforts with crucial opportunities to increase the appeal of P/CVE approaches and utilize gaming as a new avenue to support democratic, inclusive discourses online.
Ten Recommendations for the Gaming Sector
The expert interviewees commended the gaming sector as a whole for its recent efforts to improve measures against extremist influence: “It's actually quite important to emphasize and celebrate that the [gaming] industry is actually doing quite a lot here. The industry, I think, is already taking quite a lot of positive and preventative steps.” However, they also identified crucial gaps and areas in need of improvement to effectively curb extremists’ exploitation of digital gaming spaces. Key recommendations include:
- Learning from social media companies: For the past two decades, social media companies have faced many of the same challenges and observed extremist activities similar to those seen in digital gaming spaces today. The gaming sector does not need to reinvent the wheel when it comes to P/CVE but can benefit tremendously by drawing on this knowledge. “There is a wealth of policy, tooling, and safety-by-design knowledge developed by traditional social media platforms that can be reappropriated and adapted to gaming surfaces,” write Erin Saltman and Nagham El Karhili from the Global Internet Forum to Counter Terrorism, an organization that brings together tech companies from all sectors to share knowledge and support counter-extremism efforts.
- Improved community guidelines: While many companies provide community guidelines that explicitly prohibit the dissemination of extremist material, such guidelines are often perceived as lengthy, difficult to access, and challenging to read, particularly by young users. Community guidelines should be made more accessible and easier for young audiences to understand, should specify the types of extremist materials and activities prohibited by the platform or game company, and should be brought to users' attention repeatedly rather than merely once at sign-up.
- Reporting: Many users complain that reporting mechanisms are difficult to find, cumbersome to use, lengthy, and incomprehensible, that they sometimes do not cover all types of content (for example, profile pictures cannot be reported), and that users receive no information about the outcomes of their reports. Improved reporting systems should be easy to navigate even for minors, permit the flagging of all types of content, and provide users with follow-up information afterwards. In addition, the gaming sector should encourage its users to report extremist content more often, since surveys show that willingness to report is currently low.
- Moderation: Community guidelines should be enforced through swift and effective moderation that ideally includes both reactive and proactive measures. Experts lamented the current lack of proactive moderation, which would be necessary to effectively curb extremist influence in gaming spaces. However, they also warned against an over-reliance on the automated tools and artificial intelligence often used in proactive moderation, because this could lead to over-blocking of content; contextualization and the necessary degree of nuance can only be provided by human moderators. In addition, experts criticized many companies for relying on self-made keyword lists to identify and sanction extremist material and users. They recommended continuous collaboration with a group of experts on different types of extremism and related phenomena, who can provide insight into current trends and activities to improve such keyword lists.
- Community management: Experts warned that the current community management infrastructure is often insufficient. Community management teams, the critics argue, are often understaffed, over-burdened with other tasks, and not properly trained in identifying and handling hate and extremism - a perception backed up by international research. However, community management is a key tool in mitigating extremists’ influence in digital gaming spaces. It should therefore be treated as a priority and should, according to the experts, include both sanctions for users who disrespect community rules and rewards for prosocial behavior, so as to support the development of a positive user community.
- Safety by design: It is by now common knowledge that there is no such thing as neutral design; every design choice incentivizes or disincentivizes certain user actions. It is therefore crucial that safety by design is incorporated from the very start of a new gaming-related product or service and that this process specifically includes discussions on safeguarding against extremist exploitation of the product or service. This may mean both including certain features (such as the opportunity to reward other players for positive interactions) and leaving out certain features (such as a chat that cannot be adequately moderated). This must be implemented in a nuanced way. For instance, experts state that user-generated content (UGC) is both one of the best features of games and, simultaneously, allows for the creation of propaganda material: “This is what's really tricky about games. UGC is one of the best parts of games, you can create anything you want. It's pretty amazing. But also, you can create terrible experiences. We’ve seen concentration camp style maps being made. (...) We've seen school shooting simulators being made.” At the same time, “the flip side of the coin is also true, right? (...) [Look at] Luc Bernard who created the Holocaust museum in Fortnite.” Effective counter-extremism efforts do not simply disable features such as UGC, but recognize the necessity of implementing measures to prevent exploitation by extremist actors.
- Cross-company cooperation: Extremist material and users do not stay on one platform, but migrate across the digital ecosystem. For instance, research shows that users in extremist Roblox groups are sometimes redirected to a Discord server and a recent Arte documentary demonstrates how in-game footage recreating terror attacks is disseminated on TikTok. While removing extremist content from one digital gaming space is the first step, experts call for cross-company cooperation to disrupt the spread of extremist content and networks across the gaming ecosystem and beyond. The Global Internet Forum to Counter Terrorism has piloted such efforts with a hash-sharing database, but further collaboration is necessary.
- Code of ethics: A shared, cross-company code of ethics encompassing P/CVE efforts but also issues such as child protection, predatory practices, data protection, ecological footprints in game production, and worker protection could support the creation of less toxic and more inclusive gaming communities in the long run, as game psychologist Celia Hodent explained in her keynote at the 2025 Game Developers Conference. While first steps toward the creation of such a code have been taken, the gaming sector currently lacks shared standards regarding trust and safety, particularly when it comes to extremist influence.
- Cooperation with P/CVE actors: Although recent years have seen an increase, the use of gaming-related content and spaces in projects aimed at preventing and/or countering extremism or in political education efforts is still exceedingly rare. This is, in part, due to a lack of knowledge, access, and technical expertise among P/CVE practitioners. The gaming sector could make a positive impact on international efforts against extremism by supporting P/CVE practitioners and their projects. This could take various forms, ranging from boosting the visibility of P/CVE content in digital gaming spaces to sharing game design expertise to support the creation of high-quality games against hate and extremism.
- Societal responsibility: Gaming is the biggest entertainment industry in the world and wields considerable societal influence. Millions of users congregate in digital gaming spaces not only to play, but to talk about social and political issues. The gaming sphere has become a crucial communication space. Such tremendous popularity and reach, according to the experts, must be accompanied by an awareness of the social responsibility that the gaming sector as a whole bears. This responsibility should include not only measures against extremism, but also a commitment to utilize the positive power of gaming for the greater social good and the protection of democratic discourses.