Content Moderation For a Social Media Platform

Prioritizing Employee Wellness and Resiliency in Content Moderation Practices

The need for rigorous content moderation practices on social media has never been more pressing than it is today. Social media content has become even more diverse as the global number of users rises. Bad actors are becoming more calculated and consistent in their attempts to publish harmful content that violates or circumvents these platforms’ community guidelines. This highlights that creating policies and using automation is simply not enough: effective content moderation still requires a human moderator’s critical thinking skills.

As platforms continue to develop their content moderation programs, human moderators become more exposed to harmful content that often depicts extreme hate speech, violence, and sexual exploitation, among others. This prolonged exposure to harmful content makes content moderators more prone to psychological health and safety concerns¹.


A comprehensive health and resiliency program is, without a doubt, crucial to any business looking to improve its content moderation practices. By putting human moderators’ overall well-being at the forefront, we enable them to make sound decisions and deliver excellent results. This is what we accomplished for a leading social media platform with over 200 million daily active users worldwide after the company partnered with Us to improve its moderation policies.

The Challenge

The massive volume of user-generated content that can easily go viral, combined with the platform’s live-streaming capabilities, made it crucial for the client to implement a strengthened response to content violations.

The Solution

We provide content moderation of both live and recorded content around the clock, looking out for possible content violations including, but not limited to, offensive terms or gestures, targeted abuse, hate speech, and violence. Our human moderators use the platform’s playbook and media triage tool, flagging and banning users for content violations that fall under several categories: Sexual Content and Nudity, Suicide and Self-Harm, Hate Speech, Violence and Abuse, and Child Sexual Exploitation.


Our content moderation Teammates review user reports, determining whether a particular user’s post or message has violated the platform’s guidelines. Once a violation is found and flagged, it is subjected to further review to determine whether it is actionable. Violation cases are prioritized based on their potential for harm or a user’s violation history. Our Teammates act as Digital First Responders, triaging violations and providing timely support and intervention for the most urgent ones, such as victims of abuse or potentially life-threatening situations, by forwarding them to the client and reporting them to the necessary authorities.
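As a rough illustration of how this kind of triage prioritization could work, the sketch below ranks flagged reports by a combined score of harm potential and the reporting user’s violation history. The category weights, field names, and scoring formula are illustrative assumptions, not the client’s actual playbook or tooling.

```python
from dataclasses import dataclass, field
import heapq

# Hypothetical severity weights per violation category; the platform's
# real priorities are not public, so these values are assumptions.
SEVERITY = {
    "child_sexual_exploitation": 100,
    "suicide_self_harm": 90,
    "violence_abuse": 80,
    "hate_speech": 60,
    "sexual_content_nudity": 40,
}

@dataclass(order=True)
class QueuedReport:
    sort_key: int  # negated score, because heapq is a min-heap
    report_id: str = field(compare=False)
    category: str = field(compare=False)

def triage_score(category: str, prior_violations: int) -> int:
    """Higher score = reviewed sooner: harm potential plus user history."""
    return SEVERITY.get(category, 10) + 5 * prior_violations

queue: list = []

def enqueue(report_id: str, category: str, prior_violations: int) -> None:
    score = triage_score(category, prior_violations)
    heapq.heappush(queue, QueuedReport(-score, report_id, category))

enqueue("r-101", "hate_speech", prior_violations=2)        # score 70
enqueue("r-102", "suicide_self_harm", prior_violations=0)  # score 90
print(heapq.heappop(queue).report_id)  # r-102 is reviewed first
```

In a real pipeline the queue would sit on a durable store and the weights would be tuned by policy teams; the point here is simply that potential for harm and violation history can be combined into a single review order.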

Behind these solutions is our end-to-end Wellness & Resiliency program for content moderators.

Each TaskUs content moderation Teammate gets a total of 90 minutes of microbreaks per week, regular rotating schedules, and overall wellness monitoring. We have Mood Meter tools that alert our dedicated psychologist and on-call psychiatrist should an urgent matter arise. We also conduct regular group wellness sessions and pre- and post-production debriefing sessions, provide necessary training courses for Teammates and Team Leaders, and gather feedback via quarterly assessments to ensure adequate support is given as we continuously improve our wellness programs. To cap it off, we provide year-end wellness packages for all Teammates to show our gratitude to our content moderators.
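To make the Mood Meter escalation path concrete, here is a minimal sketch, assuming a simple 1-to-5 check-in scale and a pluggable notification hook; the threshold, scale, and alerting mechanism are hypothetical, not TaskUs’s actual tooling.

```python
from datetime import datetime, timezone

ALERT_THRESHOLD = 2  # hypothetical cutoff on an assumed 1-5 mood scale

def record_checkin(teammate_id: str, mood: int, notify) -> dict:
    """Record a mood check-in and alert the dedicated psychologist /
    on-call psychiatrist when a score signals an urgent matter."""
    if not 1 <= mood <= 5:
        raise ValueError("mood must be on the 1-5 scale")
    entry = {
        "teammate": teammate_id,
        "mood": mood,
        "at": datetime.now(timezone.utc).isoformat(),
    }
    # Persisting the entry to a wellness dashboard is out of scope here.
    if mood <= ALERT_THRESHOLD:
        notify(f"Urgent wellness check-in from {teammate_id}: mood={mood}")
    return entry

# Any callable works as the notification hook, e.g. a pager or chat bot.
record_checkin("tm-042", mood=1, notify=print)
```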

Our content moderation practices and efforts, in tandem with our wellness initiatives, have resulted in our client achieving the following:

  • 99.8% Moderation Accuracy Score
  • 95% Email Accuracy Score
  • 6.3 Hours of Email Response Time

Employee-first Content Moderation Practices

Find out more about how TaskUs puts employee wellness at the forefront of our Trust and Safety solutions by downloading the full case study, Content Moderation and Wellness for a Leading Social Media Platform.


    References