Better Game Moderation... Finally


The Challenges

Toxic behavior in games is a serious problem that can ruin the experience for many players. What's more, it's often bad for business: a new player who experiences toxicity within their first few sessions often never comes back, and victims of toxicity have been shown to suffer drops in 90-day retention as high as 25%.

The traditional approach to moderation relies mostly on letting users flag bad actors through a player reporting system. In theory this is a great idea, since it empowers players to help police the community. The reality is often quite different: false reports abound (a toxic player who's been told they're being reported will often report their victims in turn), and the sheer volume of reports quickly overwhelms even the most committed moderation teams. As a result, games often manage to respond to less than 0.1% of the reports that are filed, leaving many players feeling that reporting is essentially useless.

The Basics

When most teams first think about moderation, it typically revolves around chat. This makes sense: chat is the most accessible way for players to vent their frustrations at one another, so it accounts for a fair bit of the toxicity players experience. However, just like in real life, where actions speak louder than words, some of the most disruptive behaviors are things players do to one another in-game. For example, having a teammate intentionally leave a match early or go AFK can be incredibly frustrating in a competitive game, and a single cheater in a match can ruin the experience for everyone else.

Trust and Safety teams recognize that effective moderation needs to cover all of these areas: chat, in-game behavior, and player reports. It should also be holistic, balanced and fair, taking into account both positive and negative actions, and giving players feedback to incentivize pro-social behavior. It can be a daunting task, so let’s take a look at some key strategies that can help.


Key Strategies to Build a Holistic Video Game Moderation System


Set and communicate Community Guidelines

Acceptable behavior varies depending on the game type and should be communicated clearly, not in fine print but in easy-to-read, concise, and ideally engaging text. For example, the Fortnite code of conduct has just four simple rules, while League of Legends offers a comprehensive explanation of the principles behind its code of conduct, along with a list of specific unacceptable behaviors and potential consequences.

Start with Content Moderation

This includes both text and voice content in the game. Focusing solely on this aspect, as many games initially do, leaves you with an incomplete moderation process and an incomplete picture of a user's overall conduct in your ecosystem. It is, however, an essential foundation. Stopping toxicity as it happens, by filtering unacceptable words and phrases and muting players who repeatedly say toxic things over text or voice, is a visible way to both signal that the standards are taken seriously and shield bystanders from exposure to toxic messages.
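To make this concrete, here is a minimal sketch of a keyword filter with escalation for repeat offenders. Everything in it is a simplification: the `BLOCKLIST` terms, the strike threshold, and the `mute_player` hook are hypothetical, and modern systems (including GGWP's) rely on context-aware models rather than plain keyword matching.

```python
import re
from collections import defaultdict

# Placeholder blocklist; real systems use context-aware models, not keywords.
BLOCKLIST = {"badword1", "badword2"}   # hypothetical terms
REPEAT_THRESHOLD = 3                   # strikes before a temporary mute

strikes = defaultdict(int)

def mute_player(player_id: str, minutes: int) -> None:
    # Hypothetical hook into the game server's mute mechanism.
    print(f"Muting {player_id} for {minutes} minutes")

def moderate_message(player_id: str, message: str) -> str | None:
    """Return the message if clean; return None to drop it before bystanders see it."""
    words = set(re.findall(r"[a-z']+", message.lower()))
    if words & BLOCKLIST:
        strikes[player_id] += 1
        if strikes[player_id] >= REPEAT_THRESHOLD:
            mute_player(player_id, minutes=15)
        return None
    return message

# Example: offending messages are dropped, and the third offense triggers a mute.
for text in ["gg", "badword1!!", "badword1 again", "you badword2"]:
    print(moderate_message("player_42", text))
```

Note that the filter drops the message outright rather than just flagging it, which is what prevents bystander exposure in the first place.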

Get the most out of community policing and player reports

The concept of a 'neighborhood watch' can be a potent tool for maintaining a positive gaming environment. However, it comes with a major challenge: a high number of reports are false or illegitimate, sometimes filed in bad faith and sometimes because certain offenses are genuinely hard to report accurately (can you always tell whether a perfect headshot came from a cheater or from a really skilled player?).

The fact that reports are notoriously unreliable (we've heard games cite false-report rates from 10% to 80%, depending on genre!) means that games ignore the vast majority of them. This is a miss: while the problem is beyond human scale, AI-based triage can ensure that every report gets analyzed and processed. GGWP's approach aims to finally make reports useful in the following ways (a simplified scoring sketch follows the list):

  • Report credibility: we assess the credibility of each report using a variety of factors, including corroboration from other reports, player reputations, auto-detections (e.g. toxic chat in the case of verbal-abuse reports), and in-game events.
  • Player credibility: over time, we learn which players are the most reliable at filing accurate reports and weight their voices more heavily, helping you cut through the noise.
  • AI triage: Our systems auto-validate incidents and triage reports based on severity, urgency, or impact for either automated action or human review. Our triage ensures your moderation team can effectively allocate scarce manual resources and respond more quickly to serious, legally risky events like child grooming.
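To illustrate the idea (this is not GGWP's actual model), here is a simplified sketch of how report and reporter credibility might be blended into a triage decision. The `reporter_accuracy` table, score weights, and thresholds are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Report:
    reporter_id: str
    target_id: str
    category: str        # e.g. "verbal_abuse", "cheating", "afk"
    auto_detected: bool  # did an automated detector corroborate it?

# Hypothetical track record: fraction of a player's past reports that were upheld.
reporter_accuracy = {"veteran_reporter": 0.9, "serial_false_reporter": 0.2}

def report_credibility(report: Report, corroborating_reports: int) -> float:
    """Blend reporter history with corroborating signals into a 0-1 score."""
    score = reporter_accuracy.get(report.reporter_id, 0.5)  # unknown reporter -> neutral prior
    if report.auto_detected:
        score += 0.3   # e.g. toxic chat was detected alongside a verbal-abuse report
    score += 0.1 * corroborating_reports
    return min(score, 1.0)

def triage(report: Report, corroborating_reports: int) -> str:
    """Route every report somewhere instead of dropping it on the floor."""
    if report.category == "child_grooming":   # legally risky: always escalate
        return "urgent_human_review"
    cred = report_credibility(report, corroborating_reports)
    if cred > 0.8:
        return "automated_action"
    if cred > 0.4:
        return "human_review"
    return "deprioritized"   # still logged; repeated patterns can resurface it

print(triage(Report("veteran_reporter", "p9", "verbal_abuse", True), 2))  # automated_action
```

The key design point is that low-credibility reports are deprioritized rather than deleted, so even a noisy reporter's filings still contribute to longer-term patterns.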

Auto-Detect Unacceptable In-Game Behavior

Monitoring and moderating player actions during the game is an important and underused component of moderation. In competitive games, for example, players can be automatically penalized for going AFK instead of waiting for player reports to pile up. Larger games may want to invest in custom models for their gameplay, like friendly-fire detection, in order to rapidly punish and ultimately prevent griefing.

When looking at in-game actions, it's critical to take into account contextual factors like a player's skill level and whether the game is a ranked match, so that any sanctions are appropriately tailored to their context.
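As a rough illustration of context-aware sanctioning, the sketch below picks an AFK penalty based on how long a player was away, whether the match was ranked, and their skill level. The thresholds and multipliers are hypothetical and would need to be tuned per game and mode.

```python
from dataclasses import dataclass

@dataclass
class MatchContext:
    ranked: bool
    player_skill: float          # 0-1 percentile in the matchmaking pool
    afk_seconds: float
    match_length_seconds: float

def afk_sanction(ctx: MatchContext) -> str:
    """Pick a sanction for time spent AFK, tailored to match context."""
    afk_fraction = ctx.afk_seconds / max(ctx.match_length_seconds, 1.0)
    if afk_fraction < 0.05:
        return "none"            # brief disconnects happen to everyone
    severity = afk_fraction
    if ctx.ranked:
        severity *= 1.5          # AFK in ranked hurts teammates' standing
    if ctx.player_skill < 0.2:
        severity *= 0.5          # new players disconnect by accident more often
    if severity > 0.5:
        return "matchmaking_ban_24h"
    if severity > 0.2:
        return "warning"
    return "nudge"               # gentle reminder, no formal sanction

print(afk_sanction(MatchContext(ranked=True, player_skill=0.7,
                                afk_seconds=600, match_length_seconds=1800)))  # warning
```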

Don’t forget positive behavior

A complete system will celebrate positive behavior and allow toxic players to have a "path to redemption" if they change their ways. Games have started to understand the value of a commendation or endorsement system, but it may seem like a daunting task requiring extensive resources, feasible only for large games like Overwatch. In our view, it is worth implementing in multiplayer games from the start, especially if you have access to a system like GGWP which automatically processes and incorporates positive incidents into player reputations out-of-the-box.
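One simple way to model a path to redemption is to let every incident, positive or negative, decay over time. The sketch below shows that idea; it is not GGWP's actual reputation model, and the half-life and weights are made up for illustration.

```python
import time

# Hypothetical reputation model: positive and negative incidents both count,
# and old incidents decay so reformed players can recover over time.
HALF_LIFE_DAYS = 30.0

def reputation(events: list[tuple[float, float]], now: float | None = None) -> float:
    """events: (unix_timestamp, weight) pairs, with weight > 0 for commendations
    (e.g. +1.0 per endorsement) and weight < 0 for confirmed offenses."""
    now = now if now is not None else time.time()
    score = 0.0
    for ts, weight in events:
        age_days = (now - ts) / 86400.0
        score += weight * 0.5 ** (age_days / HALF_LIFE_DAYS)  # exponential decay
    return score

# A 60-day-old offense counts only a quarter of its original weight, so two
# recent commendations pull this player back to a positive score.
now = time.time()
events = [(now - 60 * 86400, -4.0), (now - 86400, +1.0), (now, +1.0)]
print(round(reputation(events, now), 2))  # ~0.98
```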

Use Nudges, Sanctions and Rewards

Creating a healthy community is about using each incident as a teachable moment, sanctioning bad behavior, and celebrating those who lead by example. Sanctions, ranging from temporary mutes and suspensions to bans, are necessary and have long been the foundation of moderation.

However, a great program will start with warnings before sanctions, and with "nudge" messages that remind players to cool down and be good teammates, so that bad behavior is prevented rather than just punished.

In addition, recognizing and rewarding positive, respectful gameplay can be as crucial as penalizing negative actions. An effective system can encourage cooperative behavior with a mix of internal and external rewards, including positive messages and nudges (the opposite of the “warnings” above) as well as more public rewards like Mentor badges.
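Putting these ideas together, a minimal escalation ladder might look like the sketch below: first offenses get nudges, repeat offenses escalate toward bans, and sustained good behavior walks a player back down. The rungs and the recovery rate are hypothetical.

```python
# Hypothetical escalation ladder: each confirmed incident takes the action for
# the player's current rung, then moves them up one; clean play moves them down.
LADDER = ["nudge", "warning", "mute_1h", "suspension_24h", "suspension_7d", "ban"]

def apply_incident(rung: int) -> tuple[str, int]:
    """Return (action to take now, player's new rung)."""
    action = LADDER[min(rung, len(LADDER) - 1)]
    return action, min(rung + 1, len(LADDER) - 1)

def apply_clean_streak(rung: int, clean_matches: int) -> int:
    """Step a player back down one rung per 50 clean matches (a 'path to redemption')."""
    return max(0, rung - clean_matches // 50)

# First offense gets a nudge, not a ban; repeat offenses escalate.
rung = 0
for _ in range(3):
    action, rung = apply_incident(rung)
    print(action)                                    # nudge, warning, mute_1h
rung = apply_clean_streak(rung, clean_matches=120)   # back down two rungs
```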

Give players transparency

While it can feel scary to introduce, some games have had success letting their users know that this holistic moderation system is in place, sometimes going as far as revealing player scores (e.g. Overwatch endorsement levels). This transparency builds trust and fosters a sense of accountability, adding the type of social observation that happens in real life and going beyond simple carrots and sticks.

Learn more about our approach

GGWP was founded with the vision of making a truly comprehensive game moderation system available to games of all sizes. Our system was designed to be easy to get started with, requiring just a few data elements and minimal configuration, yet powerful enough to scale to the largest games.

Our chat subscription makes both positive and negative context-aware detections available out-of-the-box, and our player-report assessments automatically gauge both report and reporter credibility to create a comprehensive view of a player's contribution to the community. Combined with username detections and a view of your community's Discord channel, our products give you a truly complete picture of your community's health and help you take action on each player with confidence.

To learn more about how GGWP's AI-based tools can help you moderate your game, contact us.