Content moderation—the process of reviewing content created and shared by users to see if it complies with a digital platform’s policies—is a challenging and complex undertaking, especially at scale. But before we dive deeper into content moderation, let’s take a moment to review some of the key takeaways covered in early T&S Curriculum chapters:
- Trust & Safety (T&S) is the term for the teams at internet companies and service providers that work to ensure users are protected from harmful and unwanted experiences.
- T&S has evolved from removing bad content to playing a much larger role in creating the content and product policies governing a company’s products and services, as well as developing the tools, systems, and techniques for enforcing those policies.
- The advent of social media has increased the need for user-generated content (UGC) to be moderated, and this moderation is a major focus of T&S work.
- Policy is the set of rules and guidelines that a platform uses to govern the conduct of its users.
- Policy often exists in two forms: an external document providing an overview of the company’s expectations of user behavior, and an internal document detailing exactly how and when to apply the policies in making specific decisions.
- Policies change in scope over time due to a variety of factors, including but not limited to changes in societal or user behavior, shifts in the regulatory landscape, and improvements in detection capabilities.
- Policy models vary in scope, and different types are used by different websites and platforms. Similarly, the abuse that policy intends to mitigate varies across platforms; some examples include Violent and Criminal Behavior, Regulated Goods and Services, Offensive & Objectionable Content, User Safety, Scaled Abuse, and Deceptive & Fraudulent Behavior.
Content Moderation and Operations
- Content moderation is the process of reviewing online user-generated content (UGC) for compliance with a digital platform’s policies regarding what is and is not allowed to be shared on the platform.
- Moderation is done through human review, automated tooling, or a combination of both (see the sketch after this list).
- Content moderation can result in a number of different enforcement actions, such as: content deletion; banning; temporary suspension; feature blocking; reducing visibility; labeling; demonetization; withholding payments; and referral to law enforcement.
- Content moderation is important for many reasons, including ensuring safety and privacy, supporting free expression, and generating trust, revenue, and growth.
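To make the combination of human review and automated tooling concrete, here is a minimal sketch in Python of a hybrid first-pass router: a classifier assigns a violation score, high-confidence violations are actioned automatically, ambiguous items are queued for human review, and the rest are allowed. Every name, score, and threshold here is a hypothetical assumption for illustration, not any platform’s actual system.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Decision(Enum):
    """Possible outcomes of an automated first pass over new content."""
    AUTO_REMOVE = auto()    # high-confidence violation: act without human review
    HUMAN_REVIEW = auto()   # ambiguous: route to a moderator queue
    ALLOW = auto()          # high-confidence non-violation: no action


@dataclass
class ModerationItem:
    content_id: str
    text: str


def automated_first_pass(item: ModerationItem, violation_score: float) -> Decision:
    """Route content based on a classifier's violation score in [0.0, 1.0].

    The thresholds are illustrative assumptions; a real platform would tune
    them per policy area based on classifier precision and review capacity.
    """
    if violation_score >= 0.95:
        return Decision.AUTO_REMOVE
    if violation_score >= 0.40:
        return Decision.HUMAN_REVIEW
    return Decision.ALLOW


# A borderline item gets escalated to the human review queue.
item = ModerationItem(content_id="post-123", text="example post text")
print(automated_first_pass(item, violation_score=0.62))  # Decision.HUMAN_REVIEW
```

The two cutoffs encode a trade-off: a high auto-remove threshold limits false positives from automation, while the review threshold determines how much volume reaches human moderators.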
Policy Tiers, Complexity, and Enforcement Actions
Not all policies are created or enforced equally. Most platforms rank certain violations higher than others due to a variety of factors. These policies are enforced as P0 and receive the highest priority in review, which may include a quicker turnaround time, stricter service-level agreements (SLAs), and a more specialized moderation skillset. Certain policies may also be more complex than others: some require a binary or straightforward decision, while others require several steps to reach a determination. Still others may require a more contextual review, cross-reference several policies, or involve additional operational nuance.
Most platforms also have different types of enforcement actions that correlate to the severity and priority of the violation. If content violates more than one policy, a moderator may need to choose among several enforcement options. Enforcement actions include, but are not limited to: temporary suspensions, permanent bans, temporary removal of access to parts or features of a product, rate limiting, and shadow banning.
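As a toy illustration of one way to resolve overlapping violations, the sketch below orders a few hypothetical enforcement actions by severity and, when content violates several policies at once, applies the harshest of their default actions. The policy names, the severity ordering, and the "most severe wins" rule are all assumptions made for illustration; real platforms define their own escalation logic.

```python
from enum import IntEnum


class Enforcement(IntEnum):
    """Enforcement actions ordered by severity (higher value = more severe).

    This ordering is an illustrative assumption; each platform defines
    its own severity ladder.
    """
    LABEL = 1
    REDUCE_VISIBILITY = 2
    FEATURE_BLOCK = 3
    TEMPORARY_SUSPENSION = 4
    PERMANENT_BAN = 5


# Hypothetical mapping from a violated policy to its default action.
POLICY_ACTIONS = {
    "spam": Enforcement.REDUCE_VISIBILITY,
    "harassment": Enforcement.TEMPORARY_SUSPENSION,
    "violent_threats": Enforcement.PERMANENT_BAN,
}


def resolve_enforcement(violated_policies: list[str]) -> Enforcement:
    """When content violates several policies at once, apply the most
    severe of the corresponding default actions."""
    return max(POLICY_ACTIONS[policy] for policy in violated_policies)


# Spam plus harassment resolves to the harsher of the two actions.
result = resolve_enforcement(["spam", "harassment"])
print(result.name)  # TEMPORARY_SUSPENSION
```

Using an IntEnum makes the severity comparison explicit: max() over the violated policies’ actions returns the most severe one.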
By the end of this module, you will:
- Apply the concepts previously learned in the T&S Curriculum to understand what content moderation looks like in context;
- Understand the different enforcement actions taken in response to various moderation scenarios;
- Engage in moderation of fictitious scenarios to make enforcement decisions.
Acknowledgements
Authors: James Gresham, Abigail Schmidt, Noor Haneyah, Allison Weilandics
Contributors: Harsha Bhatlapenumarthy, Dona Bellow, Sarah Godlewski
Special Thanks: Automattic Team