Protecting Moderators from the Effects of Distressing Content

Written by Kimberly Voll and George Ng

Introduction

In the dynamic world of online gaming, player support teams are at the forefront of protecting players from harmful and illegal content. In order to keep their communities safe, teams are often exposed to the most extreme types of distressing content, including graphic violence, child sexual abuse materials (CSAM), and severe harassment. Understanding and addressing the psychological impact of such exposure is crucial for fostering a supportive work environment and ensuring that the people who prevent others from being harmed by this content aren’t harmed themselves in the process.

 

Background and Impact

Continuous exposure to distressing content can significantly affect mental health, leading to trauma, insecurity, and long-term health issues. The growing body of documentation of these risks underscores the importance of comprehensive support and protective measures for moderators.

Understanding and addressing these impacts is not just a moral endeavor; it also has significant implications for organizations. Notably, the Scola v. Facebook case, which was settled for $85 million in 2020, shed light on the profound mental health challenges moderators face due to their exposure to distressing content. This landmark settlement underscored the financial and reputational risks of inadequate support mechanisms. More recently, a 2024 ruling by a Spanish court further emphasized the corporate responsibility to compensate and care for moderators, reinforcing the need for thoughtful consideration and action in these matters.

These cases serve as a compelling reminder to protect the well-being of those who protect our online realms, and to evaluate potential risks and remediation strategies thoughtfully. Not every organization will be able to implement every measure below, but the evolving landscape invites careful consideration of how we can collectively foster a supportive environment for our moderators.

 

Practical Advice with Examples

1. Prepare teams with specialized training:

Implement a comprehensive training program¹ on digital safety and psychological resilience, including simulations of moderation scenarios specific to gaming content. Regular updates and refreshers can help teams address new and emerging types of online behavior.

¹ The Fair Play Alliance (FPA) is a good resource that can work with companies to create tailor-made solutions – www.fairplayalliance.org

2. Reduce exposure by using smart content filters:

Integrate AI-driven software that pre-screens game chats and user uploads for CSAM and violence, using image recognition to blur out potentially harmful content automatically. This initial filter significantly reduces direct exposure for human moderators and grants them a buffer to prepare for what they might see.
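As a rough illustration of what this pre-screening step can look like, here is a minimal sketch in Python. The classifier callable, the 0.7 threshold, and the ScreenedUpload wrapper are assumptions for illustration (any in-house model or vendor API could sit behind the classifier); only the blur itself uses a real library (Pillow).

    from dataclasses import dataclass
    from PIL import Image, ImageFilter

    BLUR_THRESHOLD = 0.7  # assumed risk score above which previews are pre-blurred

    @dataclass
    class ScreenedUpload:
        original_path: str
        preview: Image.Image  # what the moderator sees first
        risk_score: float
        blurred: bool

    def screen_upload(path: str, classifier) -> ScreenedUpload:
        """Pre-screen a user upload before it reaches a human moderator.

        `classifier` is a placeholder for whatever model or vendor API
        returns a 0-1 risk score for an image; it is not a real library call.
        """
        image = Image.open(path)
        score = classifier(image)  # hypothetical call returning a float in [0, 1]

        if score >= BLUR_THRESHOLD:
            # Show a heavily blurred preview by default; the moderator opts in
            # to the full image only once they are ready to review it.
            preview = image.filter(ImageFilter.GaussianBlur(radius=24))
            return ScreenedUpload(path, preview, score, blurred=True)

        return ScreenedUpload(path, image, score, blurred=False)

The key design choice is that the moderator, not the system, decides when to reveal flagged content, preserving the buffer to prepare described above.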

3. Enforce breaks and rotation:

Schedule mandatory rotations where moderators shift between content moderation and other tasks, such as community engagement or game testing, to reduce burnout. Implement an automated alert system to ensure breaks are taken regularly. It’s important to recognize that, although moderators might occasionally perceive these rotations as unnecessary, the gradual nature of desensitization can obscure the true impact of distressing content on their well-being. 
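The automated alert system can be as simple as a timer that tracks continuous review time per moderator. The sketch below is illustrative only; the 45-minute and 10-minute thresholds and the class name are assumptions, not a recommended policy.

    from datetime import datetime, timedelta
    from typing import Dict, Optional

    MAX_CONTINUOUS_REVIEW = timedelta(minutes=45)  # assumed: prompt a break after 45 minutes
    MIN_BREAK = timedelta(minutes=10)              # assumed: minimum break before the clock resets

    class BreakTracker:
        """Tracks each moderator's continuous review time and flags overdue breaks."""

        def __init__(self) -> None:
            self._session_start: Dict[str, datetime] = {}

        def record_review(self, moderator_id: str, now: Optional[datetime] = None) -> bool:
            """Record a completed review; return True if a break alert should fire."""
            now = now or datetime.now()
            start = self._session_start.setdefault(moderator_id, now)
            return (now - start) >= MAX_CONTINUOUS_REVIEW

        def record_break(self, moderator_id: str, started: datetime, ended: datetime) -> None:
            """Reset the session clock once a sufficiently long break has been taken."""
            if ended - started >= MIN_BREAK:
                self._session_start.pop(moderator_id, None)

The moderation tool would call record_review after each case and surface an alert (and, ideally, pause the queue) whenever it returns True.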

4. Offer trauma-informed care and psychological support:

Establish an in-house wellness program with access to on-demand counseling sessions or, if that is infeasible, look for ways to cover external counseling or support through benefits. For instance, after encountering a distressing case, a team member could immediately connect with a psychologist specializing in occupational stress via a dedicated app.

5. Establish peer support systems:

Create a peer support network where employees can share experiences and coping strategies in a safe, confidential environment. This could be facilitated through regular meetups or an online forum moderated by trained staff.

6. Provide clear resources and escalation pathways:

Develop a clear, accessible protocol for handling severe cases, displayed on moderators’ dashboards. This could include a direct hotline to legal teams for immediate CSAM reporting and a step-by-step guide for escalating threats of violence. Ensure regular training and review of these runbooks.
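One lightweight way to keep such a protocol consistent and easy to render on a dashboard is to store it as structured data rather than a document. The sketch below is purely illustrative; the categories, contacts, and steps are placeholders, not an actual runbook.

    # Illustrative escalation runbook a moderator dashboard could render.
    # All contacts and steps below are placeholders, not a real protocol.
    ESCALATION_RUNBOOK = {
        "csam": {
            "severity": "critical",
            "first_action": "Preserve evidence per legal-hold policy; do not share or re-view.",
            "contact": "legal-hotline@example.com",  # placeholder hotline
            "steps": [
                "Lock the offending account immediately.",
                "File the internal incident report within one hour.",
                "Legal team reports to the relevant authority (e.g., NCMEC in the US).",
            ],
        },
        "violent_threat": {
            "severity": "high",
            "first_action": "Assess whether the threat is specific, credible, and imminent.",
            "contact": "trust-safety-oncall@example.com",  # placeholder contact
            "steps": [
                "Escalate imminent threats to the on-call lead and legal.",
                "Document usernames, timestamps, and chat logs.",
                "Notify law enforcement through the agreed regional contact.",
            ],
        },
    }

Keeping the runbook in version control also makes the regular review step concrete: every change is visible and dated.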

7. Collaborate with law enforcement when appropriate:

Form partnerships with local cybercrime units to offer specialized training sessions on handling online crimes. Create a resource hub with contacts for regional law enforcement bodies, ensuring quick and accurate reporting of illegal activities. Knowing the terminology law enforcement uses is also important, so that issues are escalated through the right channels.

8. Provide access to self-care resources:

Offer subscriptions to meditation and wellness apps, encouraging moderators to engage in self-care activities. Organize monthly wellness workshops focusing on stress management, mindfulness, and physical health.

9. Listen to your team and respond with customized support: 

Each team member may react differently to exposure to distressing content, which underscores the need for a nuanced approach to support. Recognizing individual needs and fostering open communication leads to more effective, personalized strategies for mitigating the psychological impact. This should include:

  • Personalized Check-ins: Regular one-on-one meetings to discuss challenges and feedback.
  • Adaptive Workload Management: Adjusting workloads based on individual sensitivities and stress levels.
  • Training for Emotional Intelligence: Equipping leaders with the skills to support their teams effectively.
  • Customizable Support Options: Providing a variety of support resources to meet diverse needs.
  • Feedback-Driven Improvements: Incorporating team feedback into policy and program enhancements.
  • Encouraging Peer Support: Facilitating informal support networks among team members.
  • Building Rapport: Investing in trust and getting to know your team to make difficult topics easier to address.

 

Conclusion

The well-being of player support teams is essential for the vibrant health of online gaming communities. By prioritizing protective measures such as advanced content filtering, comprehensive support, specialized training, and wellness initiatives, organizations can cultivate a resilient and supportive environment for their frontline staff.

The views expressed in this blog post are those of the authors and do not necessarily reflect the official policy or position of GGWP. The content provided is for informational purposes only.


About Kimberly Voll

Kimberly Voll – LinkedIn

Kim Voll is a veteran game designer and developer with a passion for creating inclusive online gaming experiences. Her background in cognitive science and computer science (AI) informs her work, blending technical expertise with a player-centric design approach. Formerly a principal designer at Riot Games, Kimberly is now CEO of Brace Yourself Games, and is the Co-Founder of the Fair Play Alliance, a global coalition dedicated to healthier game communities.

About George Ng

George Ng – LinkedIn

George is Co-Founder and Chief Technology Officer of GGWP.