This content was reproduced from the employer’s website on October 10, 2022. Please visit their website below for the most up-to-date information about this position.
Figma is growing our team of passionate people on a mission to make design accessible to all. Born on the Web, Figma helps entire product teams brainstorm, create, test, and ship better designs, together. From great products to long-lasting companies, we believe that nothing great is made alone—come make with us!
As one of the first members of our Community Content Moderation team, you’ll have the exciting opportunity to help build processes and shape the team culture. You’ll be the voice of Figma as you help moderate files, plugins, and widgets, creating a safe and engaging platform for our community of creators. Interacting with our customers requires critical thinking, an investigative approach, and outstanding communication skills. We’re looking for a great teammate who is technically inclined, has a passion for quality and for creating great customer experiences, and is comfortable collaborating in a fast-paced startup environment.
This is a full-time role that can be held from one of our US hubs or remotely in the United States.
What you’ll do at Figma:
- Support our growing community across files, plugins, and widgets, driving content moderation excellence and promoting continuous improvement
- Create an excellent community support experience, maintaining healthy communication and timely engagement with our customers
- Interpret and apply existing policies and community guidelines to ensure consistency with Figma’s values and business strategy
- Work closely with cross-functional partners, such as product, legal, and marketing, on complex issues to ensure we are providing swift and quality solutions
- Act as a point person for all things community content related, such as publishing flows, review guidelines, and standards
- Create and maintain our internal knowledge base, ensuring workflows, macros, and saved replies are made readily available
- Review Community content that has been triggered by internal systems or flagged by users to ensure the content adheres to established community guidelines and policies
- Serve as subject matter expert and escalation point for investigations, disputes, and communications on policy issues
- Stay informed of evolving trends in platform abuse, abuse tactics, regulatory requirements, and industry standards
- Collect, analyze and share community-related insights with relevant internal teams and collaborators
We’d love to hear from you if you have:
- 2+ years’ experience working in risk, fraud, trust and safety, or a content moderation environment, preferably for a technical SaaS product
- Experience reviewing and handling sensitive content
- Strong interpersonal, verbal, and written communication skills
- Excellent judgment in carrying out policy decisions
- Outstanding organizational skills and strong attention to detail
- Experience handling projects that include cross-functional partners