Alexandria, VA (OCTOBER 28, 2024) — The International Centre for Missing & Exploited Children (ICMEC), a global leader in child protection, has released a Model Framework that provides a comprehensive set of employer best practices to strengthen resilience and support the mental health of human social media content moderators who are exposed to child sexual abuse material (CSAM).
ICMEC’s A Model Framework for Employers of Content Moderators: The First Line of Defense in Online Child Protection is designed to minimize the risks of secondary trauma from repeated exposure to distressing content. The Model Framework provides a clear, practical approach for employers to implement trauma-informed policies that both protect employees and improve the overall effectiveness of content moderation.
Globally, approximately 100,000 content moderators work for social media giants like Meta, TikTok, and YouTube, often through third-party contracting firms. Studies show that up to 50% of content moderators experience mental health issues due to frequent exposure to harmful content and that up to 80% of moderators leave the job within the first two years. ICMEC’s Model Framework, which was developed in collaboration with market-leading digital service providers and trade associations, represents a significant step forward in improving conditions for this vital yet often overlooked group of workers in the digital age.
“ICMEC recognizes the immense mental health challenges content moderators face, especially when working with highly sensitive material,” said Bindu Sharma, ICMEC Vice President, Global Policy & Industry Alliances. “This Model Framework establishes a new benchmark that guides best practices and fosters consistency and clarity.”
The Model Framework, which spans the moderator’s entire employment cycle from hiring and retention through post-employment, emphasizes the importance of comprehensive training, a safe and supportive work environment, and a strong commitment to prioritizing mental health. Effective content moderation is an increasingly critical factor in helping online platforms protect their users from potential abuse. Organizations that adopt ICMEC’s best practices can reduce burnout, build resilience, and help ensure the success of their content moderation efforts while minimizing legal risks.
For more information about ICMEC reports and initiatives, please visit www.icmec.org or contact ICMEC at information@icmec.org or +1-703-837-6313. To discover more about ICMEC’s impactful approach and the child protection innovations that ICMEC is driving, visit ICMEC’s CSAM Model Legislation page.