This content was reproduced from the employer’s website on August 17, 2022. Please visit their website for the most up-to-date information about this position.
What’s so interesting about this role?
Grindr is looking for a Content Moderation Operations Manager to drive the vision and execution for content moderation operations at scale, overseeing the team dedicated to investigating and resolving issues related to policy enforcement, fraud, safety, and other sensitive and complex scenarios.
Reporting directly to the VP of Customer Experience, you’ll manage the day-to-day operations of content moderation, building scalable policies and processes while promoting an inclusive and collaborative culture. You’ll also work cross-functionally across the organization, helping to contribute to a culture of safety by design.
This is a unique opportunity to work at a company with a focus on the global LGBTQ+ community, where Trust and Safety is seen as a strategic asset rather than a cost center. In this role you will have plenty of opportunities to significantly contribute to Grindr’s mission of being a safe and welcoming space for millions of LGBTQ+ users around the world to find love and connection.
What’s the job?
- Oversee vendor operations for a global team of content moderators. Forecast volume and headcount needs for content moderation, balancing high-quality work, adherence to production/metrics expectations, and moderator health and wellbeing.
- Supervise moderator training and quality assurance, ensuring that policies and processes are applied effectively, fairly, and consistently, and are supported by strong reporting and analytics.
- Develop and operationalize principles and policies for Community Guidelines enforcement, and enhance workflows via process improvements and tooling recommendations.
- Deliver reports and recommendations on content moderation and enforcement strategy, including at the executive level.
- Collaborate and advise on rollout of new features that require content moderation, helping to contribute to a culture of safety by design.
- Mitigate high-severity crisis incidents that pose a risk to the company brand or to our users, weighing the balance between safety and privacy. Make final decisions and act as the escalation point where needed, including for time-sensitive escalations from operations teams, internal stakeholders, and outside groups (law enforcement, government, NGOs).