Trust & Safety Coffee Hours are one-on-one conversations designed to provide mentorship to current and aspiring trust and safety professionals. Hosts are volunteering their time and expertise to offer advice to job seekers and T&S professionals. Hosts are not conducting formal interviews, offering referrals, providing product feedback, participating in user research, or acting on behalf of their employers or TSPA. If you schedule time and you’re not a current or aspiring T&S professional, a host reserves the right to cancel the appointment without notice.
Here’s how it works:
- Search for a host based on what you want to talk about. Sign-up using the host’s “Schedule a chat” link.
- Be respectful of our hosts and their time. Remember that the TSPA Code of Conduct applies to all resources, spaces, and programming, even if you’re not a TSPA member.
- Because these one-on-one conversations are brief, come prepared with questions and have a specific goal in mind. If you’re just getting started in T&S and want to learn more, we recommend reviewing Careers in T&S and the T&S Curriculum before scheduling.
If you’re a TSPA member looking for a way to help other folks on their journey in T&S, sign up to be a T&S Coffee Hours host!

Laura Markell
Laura has 14 years of experience in trust & safety, online advertising, and content policy, as well as four years of experience in people management. Since July 2022, she’s been a Senior Program Manager in Google’s Trust & Safety team, spearheading strategic org-wide policy programs within Google’s highest priority trust risk areas (Kids & Family, Civics, Health, Information Quality).
TSPA Members Only

Maggie Kerr
Maggie has been in Trust & Safety for over 12 years between Facebook (Meta), DoorDash, and TSPA. Within that time she’s worked in Operations, Content Moderation, Investigations, Programs, Product Strategy, Analysis, and Product Policy. Her expertise includes people management, interview prep, program management, strategic partnerships, resume review, and values-based job hunting.

Mark Reitblatt
Mark is a software engineering manager with seven years of professional experience (five as an Individual Contributor), joining Facebook (now Meta) directly after completing a PhD in Computer Science at Cornell. Before his PhD, he worked brief stints at a number of companies, including doing pre-silicon verification at Intel. Mark has been working in the Integrity org at Meta since 2018. In that time, he has primarily worked on Crisis Response processes, programs, and tooling. In the course of responding to crises, he’s had the opportunity to touch almost every aspect of Trust and Safety, ranging from policy, operations, product management, scaled review, measurement, ML and automation, to internal tooling and infrastructure. As a general rule, if it can break and cause an issue in a crisis, he’s probably had to deep dive and firefight on it. Mark is mainly looking to host other engineers or engineering managers interested in or working in the T&S space, but is happy to talk to anyone. He’s currently working remotely out of Seattle, but was previously in the SF Bay Area for several years.

Matt Ramirez
Hi friends! Matt has spent much of his career working in electoral politics, capping off his public service experience as an advisor to Speaker Pelosi. He’s worn a number of hats across policy, communications, strategy, and coalition management. While on Capitol Hill, Matt served as Vice President of the LGBTQ Staff Association, where he helped place over 100 staffers in various offices through interview prep, resume review, and other professional development resources. Now, he works with Meta’s content policy organization on transparency and policy development. He lives in San Francisco and has two years of formal trust and safety experience and nine years of formal political experience.

Matthew Soeth
Matt serves as Head of Trust & Safety at Spectrum Labs AI, supporting the #TSCollective community as well as many of its partner companies across gaming, dating, and social apps. He also serves as an advisor for All Tech is Human, focusing on responsible innovation in technology. As a former member of the Global Trust & Safety Team at TikTok, Matt worked on cross-functional teams to develop safety resources on that platform around bullying & harassment, suicide and self-harm, hate speech, election integrity, and media literacy education tools. Prior to entering tech, Matt spent 15 years in education working with diverse high school populations as a teacher and administrator. He is a co-founder of the nonprofit #ICANHELP, and co-creator of #Digital4Good. In 2015, Matt helped start the Social Media Helpline, the first social media helpline for schools in the United States.

Michael Swenson
Michael Swenson most recently led the Policy Programs team at Discord, where they oversaw a variety of programs that facilitated direct connections between Discord’s many safety- and policy-focused teams and users, moderators, academics, and industry practitioners. Their team engaged in dialogue, research, and education surrounding topics of platform governance, policy acumen and praxis, as well as education and co-creative approaches to facing the platform’s toughest safety challenges. Previously, Michael worked in the digital marketing space and in academia, and has also spent many years in online community spaces, helping lead dozens of communities, most recently on the Reddit platform. They studied religion and ethics in modern society at Duke University, where they researched religious and political extremism through the lens of post-colonial era critical social theory and anthropological approaches to communities of practice. They also spent several years doing Computer Science work at the Georgia Institute of Technology, where they focused on human-centered design, ethics in AI and ML, and educational technology at scale.

Nitesh Shivapooja
Nitesh has been in the Trust & Safety space for over 10 years, all at Google. He has had the opportunity to work as an IC and manager, and has built teams from the ground up. Nitesh has also developed and executed anti-abuse strategies for various categories of products including Search, G+, Google Drive, Calendar, and others. His team also works on identifying abuses ranging from Chia mining on Cloud and scaled abuse like spammy notifications and emails, to sensitive and egregious content such as terrorist content, sexually explicit content, and hate speech.

Nivedita Mishra
Nivedita has expertise in: Internet Trust and Safety, Content Moderation, Operations, Mentoring, Team Management, Vendor Management, and Project Management. She has more than 15 years of experience in the tech industry, with 12+ years at Meta across different T&S Operations teams, almost two years at Deloitte in business analysis and content management, and about 14 months at Magna Infotech in content management. Nivedita is located in Hyderabad, India (India Standard Time).

Paola Maggiorotto
Paola Maggiorotto currently works at Teleperformance as T&S Process Director, and is based out of Medellin, Colombia. Originally from Italy, she spent 12 years in Dublin, Ireland. Her last role there was at Meta as T&S Global Process Manager (specializing in violent, graphic, and highly egregious content), where she led a team of Project and Program Managers and scaled solutions that helped prevent online and real-world harm: building fast, scalable support systems, optimizing processes to address safety-related incidents on Facebook and Instagram, reducing risk, and influencing collaboration on cross-functional initiatives. Prior to that, Paola worked at Microsoft as EMEA Technical Support Manager and Search Editorial Specialist, ensuring a high-quality and safe user experience on Microsoft Advertising and partner sites. She has 10 years of experience in T&S, spanning people, project, and program management, stakeholder engagement, operations, and communication. Paola is also super passionate about DEI and human rights and has been involved in leading multiple global DEI and ERG initiatives, focused on guaranteeing a gender- and culturally-balanced recruiting pipeline and workforce and advocating for greater inclusion of women and underrepresented groups in technology and the corporate world.

Paul Janzer
In 2005, Paul was hired as Facebook’s first customer support rep. Beyond providing support to a rapidly growing user base, he was also asked to help build out the team (hiring, management, etc.) and establish operations strategy (workflows, policies, tooling, goals, etc.). By 2007, Paul was responsible for the entire Bay Area User Operations team (~80 people), and, shortly thereafter, supported an expansion to three other offices as part of the global leadership team. When he left UO in 2011, it had grown to over 250 employees. Paul spent much of his time in Facebook UO focused on Trust & Safety, leading teams that were responsible for issues such as spam, account integrity, authenticity, pornography, hate speech, and harassment. He worked with these groups to establish content policies, continuously improve workflows and tooling, and evolve models to be more proactive. After leaving UO, Paul transitioned to Product Management. He worked for four years as a PM on Facebook Growth and four years as a Product Lead at Airbnb. In these roles, he interfaced regularly with T&S teams to ensure products prioritized user safety and wellbeing. Since mid-2020, Paul has been taking time off to be with his family and work on some of his own projects. He’d love to be of help to people who are still on the front lines, tackling difficult T&S issues across the industry. Paul is located in the San Francisco Bay Area.

Rolando Vega
Rolando currently leads the Legal Response Team at Discord. He has more than eight years of experience working in Trust & Safety, and his career has focused on legal compliance in relation to law enforcement, child safety, intellectual property, and privacy operations. He has worked to develop processes and teams at companies like Pinterest, TikTok, Snap, and Twitter. His previous background in legal support has aided him in developing a career in Trust & Safety legal operations, and his passion for preventing real-world harm has allowed him to continue to grow as a leader in this space.

Rosanna Rafel-Rix
Rosanna has been at YouTube for over two years and is a Program Manager for YouTube Trust and Safety, having formerly been a Policy Enforcement Manager tackling hate speech for YouTube. Prior to joining YouTube, she held a series of roles in the third sector, working on antisemitism and racism, particularly online, for almost a decade. She currently works cross-policy with external organizations, managing the Trusted Flagger program for YouTube. She has a Masters in Criminology, focused on hate speech, from Birkbeck College, University of London, and a Masters in Near and Middle Eastern Studies from the School of Oriental and African Studies. She is based in London.
TSPA Members Only