Although no two trust and safety teams are exactly alike, certain key functions and roles are common across most of them. On this page, we profile common functions within trust and safety, with a high-level description of what each team does and examples of roles within it. Smaller companies may have teams that perform multiple functions or team members who play multiple roles. This list is not exhaustive.
Content Policy
Content Policy is responsible for developing content policies that outline what types of content are and are not allowed on a platform. These policies usually reflect the company’s values and users’ sensibilities. They also attempt to comply with legal and regulatory requirements while ensuring that the community’s voice is protected. Where relevant, Content Policy teams may also produce and present internal communications that inform senior leadership of ongoing policy situations and provide recommendations and support for decisions.
Content Policy Manager, Policy Analyst, Knowledge Management, Public Policy Manager
Content Design and Strategy
Content Design and Strategy identifies the optimal strategy for user-facing content and develops effective language to communicate with users. User-facing content includes user-education material, help center articles, product interventions, etc.
Content Strategist, Content Designer
Data Science and Analytics
Data Science and Analytics teams build measurement methods to understand the extent of policy violations present on a platform. They analyze the impact of content moderation, proactive abuse detection, and various other efforts to curb violating content and behavior. They also predict policy violation trends through data analysis, and develop creative ways to address adversarial behavior.
Data Scientist, Data Engineer, Data Analyst
Engineering
Engineering is responsible for developing machine learning models to scale and/or automate enforcement against policy-violating behavior. They build and maintain the systems and tools that support user-facing reporting and enforcement options (e.g., DMCA notice-and-takedown), internal review tools, and other technical aspects of the content moderation and policy enforcement processes.
Software Engineer, Security Engineer
Legal
Legal teams are responsible for creating, managing, and articulating responses to requests from law enforcement, regulatory bodies, and government agencies. They proactively identify and mitigate potential issues and advise on legal risks. They also counsel product teams and broad cross-functional teams on a wide range of legal issues regarding existing and planned products.
General Counsel, Cybersecurity Law and Investigations, Regional Experts, or Subject Matter Experts (copyright, privacy, etc.)
Law Enforcement Response and Compliance
Law Enforcement Response and Compliance teams are responsible for reviewing and accurately assessing legal requests from law enforcement officials, ensuring compliance with applicable law and the platform’s terms of service. They respond to sensitive or urgent escalations and to process and policy inquiries from law enforcement, government agencies, and relevant internal parties. They may also coordinate with internal partners, such as Legal and Content Policy, and with external law enforcement agencies to provide real-world assistance to people in crisis.
Law Enforcement Response Analyst, Investigations Analyst, Incident Response Analyst
Operations
Operations teams are involved with all aspects of day-to-day moderation activities, and are responsible for developing scalable and efficient processes to support content moderation and policy enforcement. Content moderation professionals are typically housed within operations teams; if a platform chooses to outsource moderation to an external vendor, the platform’s operations team often manages the relationship with that vendor. Other tasks that fall under this group include quality assurance, training, capacity and workflow management, change delivery, developing review protocols, and crisis or incident response and management. Their proximity to content moderation means that operations teams play a key role in generating insights, predicting problem areas, and influencing the future direction of trust and safety work.
Project Manager, Program Manager, Vendor Manager, Analyst, Investigator, Specialist
Product Policy
Product Policy teams are responsible for developing and refining principles and policies specific to a particular product (e.g., Ads). Usually a sub-team within Content Policy (which outlines what is or is not allowed on a given platform at a high level), Product Policy teams may introduce additional policy nuances to cater to individual products (e.g., additional restrictions to nudity policies in the context of Ads or Sponsored Content). They also counsel internal product teams and often provide practical product strategies across multiple jurisdictions based on policy considerations.
Product Policy Manager, Product Policy Associate
Product Management
Product Management partners with engineering, investigators, analysts, operations, policy, and cross-functional leadership to drive the strategy, vision, and execution for preventing and reducing policy-violating content and behavior on the platform. “Product,” in this case, may include a policy area such as hate speech, or cross-policy focus areas such as quality assurance or transparency reporting. A product manager for hate speech, for example, is responsible for developing the strategy for addressing hate speech on the platform and is accountable for executing that strategy by collaborating with cross-functional teams.
Public Policy and Communications
Public Policy and Communications is responsible for building and maintaining partnerships with key external stakeholders such as NGOs, governments, and regulatory bodies. They advise internal teams on regional public policy matters to guide the broader development of products, services, and policies. They manage strategic outreach, designing and leading strategies and campaigns to shape public and political opinion about the platform.
Public Policy Manager
Sales and Advertiser Support
Although Sales and Advertiser Support is not usually considered a Trust and Safety function, these teams play a key role in addressing advertisers’ concerns about abusive or policy-violating content and behavior (e.g., a brand appearing alongside objectionable or extremist content).
Client Partners, Industry Managers, Vertical Leads
Threat Discovery and Research
Threat Discovery and Research teams are responsible for investigating and analyzing networks of abuse, researching bad-actor behavior, and synthesizing insights in collaboration with internal teams and relevant external parties, such as law enforcement, to address criminal and/or large-scale abuse.
Abuse Investigators, Threat Intelligence Investigators
Author │ Harsha Bhatlapenumarthy
Contributors │ Eric Davis, Daphne Keller
Special Thanks │ Kaofeng Lee, Charlotte Willner