Trust and Safety Content Policy Manager

  • People Management
  • TSPA Members
  • New York, NY, USA, San Francisco, CA, USA, Seattle, WA, USA, or Los Angeles, CA, USA

This content was reproduced from the employer’s website on December 9, 2021. Please visit their website below for the most up-to-date information about this position.

The Trust & Safety team at Niantic, together with the broader Operations department, serves a critically important role in defining, implementing, and scaling outstanding policies and processes in service of our player, advertiser, and emerging developer needs.

As a Trust & Safety Manager, you will be responsible for leading our content moderation processes and workflows. You will partner closely with peers across product, engineering, legal, and communications to ensure Niantic remains trusted, safe, and fair. As part of the team, you will define our approach to content moderation across products and games, ensuring alignment with Niantic’s strategic direction and goals. You will also develop and enforce clear and consistent policies and processes for moderating various forms of User Generated Content, in accordance with Niantic’s values and mission.

Responsibilities

  • Work closely with key partners including Legal, Comms, Eng, and Product to implement new processes, contribute to ongoing process improvements, and offer input on risk assessments for new projects and features.
  • Provide efficient and effective issue management in line with the established policies to manage time sensitive concerns. Collaborate with Legal to develop strategies for handling requests from law enforcement and security officials, government authorities, and other industry coalitions.
  • Serve as the cross-functional subject matter authority on content policy issues.
  • Act as the voice of our customer. Advocate for our most significant player challenges and drive prioritization with our product and engineering teams.
  • Evaluate, select and develop outsourced vendor partners. Partner with vendors to deliver upon our moderation goals and meet our performance standards.
  • Evolve reporting to assess the impact of customer experience metrics on Niantic’s business and brand.
  • Provide day-to-day oversight and leadership to ensure objectives and key results are met across quality, productivity, and other success measures.
  • Lead design-centric conversations and adversarial planning sessions on content moderation issues to inform product strategy.

Qualifications

  • Master’s degree or higher from an accredited institution.
  • Experience in content moderation operations, both in crafting and carrying out policies.
  • Ideally, expertise in one or more sub-areas of content policy, such as hate speech, CVE, child safety, etc.
  • Motivated by Niantic’s goal of using technology to get people outside into the physical world to experience real world social interaction.
  • Strong critical-thinking and problem-solving ability to effectively analyze data, regulatory requirements, and product limitations.
  • Previous experience building and scaling an integrity, user-facing support or services function within a tech company.
  • Previous experience leading a globally-distributed, hard-working team in a fast-paced, high growth environment.
  • Excellent collaborator with ability to maintain strong internal and external relationships.
  • Familiarity with the product development cycle, and with adversarial planning processes.
  • Demonstrated passion for gaming and/or digital safety.
  • Interest in and familiarity with mapping, augmented reality, and gaming.
  • Excellent written and verbal communication skills.

To apply for this job, please visit careers.nianticlabs.com.