The upcoming roll-out of global age verification at Discord has many game studios worried. The platform is one of the main ways studios build up their communities: they run their own Discord servers to share screenshots and videos, find playtesters, and much more. People who do not verify their age through a face scan will still be able to access Discord in a PG-13 version, but for many game studios that means they cannot be sure whether their game's visuals fall within that rating. What are the guidelines for cartoon violence in screenshots, for example? Will users be locked out of an entire server if the game it promotes features references to alcohol and drinking?

We put these questions to an official representative of Discord. "We do not automatically age-gate servers or content related to a specific game based on its rating alone", they stated. However, "our Violence and Graphic Content Policy does not allow the uploading or sharing of any content depicting real violence, gore, or animal cruelty. Discord’s content safety filters are part of our broader Teen Safety Assist and safety-by-default approach. They help reduce exposure to certain categories of potentially sensitive image-based media, especially for teens."

These filters will be applied "with a combination of automated detection with AI validation and human review to proactively identify and age-gate servers". That means game servers could indeed be gated off from parts of their community, whether deliberately or through a false positive in the automated systems.