Trust & Safety Coffee Hours are one-on-one conversations designed to provide mentorship to current and aspiring trust and safety professionals. Hosts volunteer their time and expertise to offer advice to job seekers and T&S professionals. Hosts are not conducting formal interviews, offering referrals, providing product feedback, participating in user research, or acting on behalf of their employers or TSPA. If you schedule time and you’re not a current or aspiring T&S professional, a host reserves the right to cancel the appointment without notice.
Here’s how it works:
- Search for a host based on what you want to talk about. Sign up using the host’s “Schedule a chat” link.
- Be respectful of our hosts and their time. Remember the TSPA Code of Conduct applies to all resources, spaces, and programming, even if you’re not a TSPA member.
- Because these one-on-one conversations are brief, come prepared with questions and have a specific goal in mind. If you’re just getting started in T&S and want to learn more, we recommend reviewing Careers in T&S and the T&S Curriculum before scheduling.
If you’re a TSPA member looking for a way to help other folks on their journey in T&S, sign up to be a T&S Coffee Hours host!
Aaron Berman is a technology policy and national security leader focused on making online spaces safe and secure. He has held a series of senior roles, first at the Central Intelligence Agency and now at Meta, including more than 10 years leading large teams. His substantive expertise is in mis- and disinformation, cyber threats, counter-extremism, and the Middle East and South Asia. He has a Master’s in Public Policy from Georgetown and a BA in computer science and music from Williams College. He’s based in Seattle.
As Pinterest’s former Head of Policy, Adelin led the team that developed the company’s principles and core values around content moderation, covering a range of issues from hateful speech to medical (mis)information to dank memes. Prior to Pinterest, she ran Twitter’s Legal Ads Policy team, guiding policy and operations for Twitter’s self-serve and international advertising products.
As the Lead Software Engineer on the Growth and Trust & Safety team at Yik Yak, Adrian helped scale Yik Yak to 1 million DAUs, build internal trust & safety infrastructure, and close a $6 million seed round. With his background in engineering, he has had the pleasure of working alongside the brightest minds in the industry throughout his career, and he has found his passion for building a better internet, from his time at Yik Yak to his current role at Red Hat.
With a PhD in Communication, and as a trust & safety professional with over a decade of experience in industry and academic research, Alex has led and conducted product, policy, and strategic research on key societal harm topics like misinformation, polarization, content quality, and crisis response. Alex is an expert on social and informational needs in online communication platforms and emergent technologies, and their research spans both qualitative (interview & ethnography) and quantitative (surveys, experiments, & computational social science) methodologies.
Alicia recently joined Twitch, a livestreaming platform owned by Amazon, as the Senior Director for Content Policy and Policy Outreach. Prior to coming to Twitch, she worked at YouTube and Google on the Product/Compliance and Public Policy teams for four years. In her time at YouTube she helped create resources to support families, like the YouTube parental supervised experience and Best Practices for family creators, as well as launch compliance programs for new laws like the Age-Appropriate Design Code. While she has worked in industry T&S for a little over four years, her career has spanned almost 20 years of work to support youth and caregivers on and offline. Before working in the tech industry she was an academic and media literacy educator, and she co-authored the book Parenting for a Digital Future: How Hopes and Fears About Technology Shape Children’s Lives. She has appeared in press including NPR, the BBC, the Wall Street Journal, Fast Company, and more.
Bethany started her tech policy career as a Fellow on the Twitter Public Policy team. While there, she discovered her love for Trust and Safety and was hired as a Site Policy Specialist. After almost three years, she joined Pinterest’s Content Policy team as a Content Policy Manager and then Senior Manager. She’s currently a Content Policy Manager at Niantic, the company that created Pokémon Go. She is located in New York City (EST).
TSPA Members Only
Brian’s mission is to help educate others on how to gain trust and better understand their customers. He’s a Trust & Safety leader specializing in building scalable teams and programs from scratch. Brian’s career has been built by simplifying complex ideas and incessantly asking, “Why?” Currently, he’s the Head of Trust & Safety at Dodgeball, where he is an internal advocate and educator for the Trust & Safety community.
Cathryn Weems is the T&S Coffee Hours program mentor. She’s been working in Trust & Safety for over 20 years – before it was even called Trust & Safety! Originally from the UK, she moved to the San Francisco Bay Area over 20 years ago while working at Yahoo!, and has worked at a variety of other tech companies (Flickr, YouTube/Google, Dropbox, and Twitter) throughout her career. She has experience in child safety, people management, operations, and policy development, but most recently has specialized in leadership, global content regulation, and legal requests from law enforcement, governments, and rights holders. Her most recent role is Senior Director, Legal Policy at Twitter, where she leads a global team of more than 125 people, along with two vendors, handling a variety of legal requests and regulatory compliance matters sent to the company. She is passionate about promoting TSPA (and T&S as an industry in general), as well as connecting people across the industry, and is looking forward to chatting!
TSPA Members Only
Christopher manages the Content Moderation Ops team for Mercari, a peer-to-peer e-commerce marketplace. In his time in the T&S space, he would consider his major areas of expertise to be vendor management/alignment, people management, and organizing knowledge. Before his current role, Christopher worked in enhanced due diligence for a bank, then spent about a year in the BSA/AML space at Mercari, so he’s happy to discuss the differences and relationships between BSA compliance work and Trust and Safety/Content Mod. Christopher has also performed enhanced due diligence work on high-risk customers and worked on numerous money laundering and consumer protection engagements. He is currently located in Albuquerque, New Mexico (Mountain Time, USA) and working remotely, so he’s happy to commiserate about working far outside a tech hub, though he’s afraid he won’t be able to offer too many solutions. He has two school-age kids and has been in his current role for about four years. He looks forward to chatting with folks!
Based in Dublin, Daniel has been a Project Manager focusing on harmful content at Meta for more than four years. Working with a variety of cross-functional teams, Daniel’s main areas of expertise include improving media detection and machine learning processes for many T&S problems. His key skills include influencing technical teams and product goals and supporting new hires in T&S. Daniel holds a Bachelor’s degree in Applied Psychology from University College Cork (UCC). He also holds a Master’s degree in Public Relations, a postgraduate certificate in Data Science, and a Master of Science in Computer Science, all from Technological University Dublin (TUD). Before Meta, Daniel worked in the youth-focused NGO sector in campaigns and comms for four years.
TSPA Members Only
Dave Willner started his career at Facebook in 2008, helping users reset their passwords. He went on to join the company’s original team of moderators, write Facebook’s first systematic content policies, and build the team that maintains those rules to this day. After leaving Facebook in 2013, he consulted for several startups before joining Airbnb in 2015 to build the Community Policy team. While there, he also took on responsibility for Quality and Training for the Trust team. After leaving Airbnb in 2021, he began working with OpenAI, first as a consultant and then as the company’s first Head of Trust and Safety.