Scaled Intel Collections Misinformation Analyst, Trust and Safety

  • Individual Contributor
  • TSPA Members
  • CA, GA, TX, NY, Washington D.C., US
  • This position has been filled
  • Experience level: 4+ years

Website: Google

This content was reproduced from the employer’s website on October 10, 2022. Please visit their website below for the most up-to-date information about this position.

Minimum qualifications:

  • 4 years of experience working in Trust and Safety, policy, government, or the technology sector.
  • Experience with the Intelligence Cycle or using Intelligence products to drive scaled abuse-fighting programs.

Preferred qualifications:

  • Experience with understanding and synthesizing user needs, writing product requirements, and prioritizing features.
  • Experience working with third-party provider, vendor, or contractor teams.
  • Experience working with Trust and Safety enforcement or policy teams.
  • Experience assessing, analyzing, and resolving complicated issues, and distilling that complexity into concise, actionable concepts.
  • Expertise in dis/misinformation abuse trends.
  • Familiarity with Apps Script, SQL, Python, or another scripting language, as well as familiarity with Google Data Studio.

About the job

Trust & Safety team members are tasked with identifying and taking on the biggest problems that challenge the safety and integrity of our products. They use technical know-how, excellent problem-solving skills, user insights, and proactive communication to protect users and our partners from abuse across Google products like Search, Maps, Gmail, and Google Ads. On this team, you’re a big-picture thinker and strategic team player with a passion for doing what’s right. You work globally and cross-functionally with Google engineers and product managers to identify and fight abuse and fraud cases at Google speed, with urgency. And you take pride in knowing that every day you are working hard to promote trust in Google and ensure the highest levels of user safety.

As an analyst within Trust and Safety’s Intelligence Collection team, you’ll be the glue that keeps Intel Collections connected to Google’s most pressing dis/misinformation priorities. You’ll help identify new intelligence sources that provide an information advantage to a broad set of partners, and ensure existing sources reach maximum impact. You’ll use the intelligence cycle to understand partner needs and collaborate with other intelligence functions and teams to keep Google aware of, and ahead of, the latest trends and developments affecting our users. You’ll draw on your experience building scalable programs, along with your data and scripting skills, to build solutions that support multiple teams and stakeholders.

At Google we work hard to earn our users’ trust every day. Trust & Safety is Google’s team of abuse-fighting and user trust experts working daily to make the internet a safer place. We partner with teams across Google to deliver bold solutions in abuse areas such as malware, spam, and account hijacking. A diverse team of Analysts, Policy Specialists, Engineers, and Program Managers, we work to reduce risk and fight abuse across all of Google’s products, protecting our users, advertisers, and publishers across the globe in over 40 languages.


Responsibilities:

  • Build and manage scaled intelligence collection initiatives on dis- and misinformation, leveraging support from a 24/7 global team and external intelligence providers.
  • Maintain expertise in new dis/misinformation trends, global hotspots, and anti-abuse efforts across industry and research organizations. Leverage this expertise to build new scaled intelligence collection workflows that keep Google’s Intelligence Collections offerings on the cutting edge.
  • Understand Google Trust and Safety’s dis/misinformation landscape, including cross-functional initiatives, policies, counter-abuse workflows, and current gaps. Leverage knowledge of partner intelligence needs to ensure Intelligence Collection programs address Google’s most pressing priorities.
  • Provide custom intelligence reports and briefs for peers and executive stakeholders.
  • Review or be exposed to sensitive or violative content as part of your core role.