This content was reproduced from the employer’s website on April 23, 2023. Please visit their website below for the most up-to-date information about this position.
Earlier this year, we began building a healthy and sustainable federated social space that thrives on its own terms, independent of profit- and control-motivated tech firms. Using the Mozilla Manifesto as guiding principles, our Trust and Safety team is hard at work developing policies, processes, and human-centered solutions for user and community safety, product experience, and sustainability.
The Trust and Safety Operations Lead will be responsible for designing, implementing, and overseeing a scalable and effective content moderation program for Mozilla.social, Pocket, and future social-sharing products and features. You will work as part of a cross-functional team ensuring that our efforts hosting user-generated content are aligned with our Pledge for a Healthy Internet.
Does the opportunity to design, implement, and lead a brand new content moderation program excite you? Are you dedicated to building a safer, healthier, and more inclusive online space for marginalized communities? If so, we invite you to apply!
Trust & Safety is a core component of the product experience, and content moderation is one of the most important aspects of this work. This role is focused on launching the program and addressing the operational, experiential, and trustworthiness challenges inherent in social products.
As Trust and Safety Operations Lead at Mozilla you will:
- work cross-functionally with product, engineering, legal, and privacy to cultivate our practice of content moderation across a range of issues, including harassment, abuse, hate speech, mis- and disinformation, intellectual property infringement, and illegal content
- manage operational relationships with tooling and content moderation vendors
- create and document processes to manage and scale our operations
- create training materials and lead training sessions for new content moderators
- act as a mentor and adviser for new moderators, and a point of escalation for complex situations
- moderate content and ensure consistency across products and decisions
- coordinate and scale the on-call team schedule, spanning time zones and regions
- work with data and product to drive efforts around data labeling and annotation, including creating training materials, crafting rater tasks, and maintaining quality
Please note that this position involves exposure to sensitive and graphic content, including but not limited to harmful or offensive language, violent threats, pornography, and other graphic images. Mozilla is committed to providing specialized wellness support for teams working in content moderation.
Your Professional Profile
- You have 6+ years of professional experience, with at least 3 years of experience implementing and enforcing scalable operations for content moderation on a public-facing social product or platform.
- You lead with empathy and are invested in knowledge sharing and learning from others.
- You work positively to cultivate an equitable team culture.
- You are dedicated to building a safer, healthier, and more inclusive online space for marginalized communities.
- You have hands-on experience in an operations-heavy role, including data labeling and annotation.
- You have owned complex projects from inception to completion in a hands-on role.
- You’ve implemented shared roadmaps in collaboration with cross-functional teammates.
- You take cross-functional needs into account and are a highly-skilled communicator.
Bonus points for:
- You have people management experience. While this is an individual contributor role, there may be future opportunities to grow a small team.
- You have experience building out a new operational support function for products or services from the ground up.