Why External Engagement Is Necessary

Why do platforms need to do external engagement?

Trust & Safety teams often deal with extremely nuanced, culturally specific, and sensitive work related to content, behavior, and actors on platforms. Decisions that these teams make, ranging from the policies they create to the way those policies are enforced, can have a major impact on people's lives both online and offline. Understanding the range of behaviors of global users who engage with a product, service, or platform is critical to designing effectively for them. This work cannot be done solely through quantitative analysis of internal data or by reviewing outputs from user surveys. One additional way Trust & Safety teams can incorporate different perspectives into their work is to engage with a wide variety of external stakeholders.

Engaging with external stakeholders can also build internal and external legitimacy for Trust & Safety teams' efforts. As these issues come under greater external scrutiny, internal teams increasingly want to know that the work they are planning and about to commit resources to is validated by external experts on the topic. These internal stakeholders can range from individual teams or product managers all the way up to executives at an organization. Demonstrating engagement with external stakeholders, and showing that recommendations are backed by their insights, allows Trust & Safety teams to make a stronger case for a particular approach within their organization. Externally, regulators are also beginning to demand more transparency into companies' decision-making processes and to ask for evidence of stakeholder engagement. Demonstrating that engagement will be an important legitimacy win for Trust & Safety teams.

With which external groups do they engage?

By collecting and integrating input from outside parties, Trust & Safety professionals can gain insight and knowledge from a few different types of stakeholders.

Civil Society

Many platforms operate globally while their employees are based primarily in a small number of locations. Given these limitations, it is critical to understand how a team’s policies may work (or not work) in a given cultural context. Trust & Safety teams can partner with local organizations across the world to get a better sense of how safety issues can manifest differently across regions. Working with local partners on these issues can also help teams understand how to operationalize policies to ensure that the spirit of the policy is being effectively enforced despite regional variation in speech and other norms.

Civil society organizations (CSOs), particularly NGOs, represent the interests of many communities, especially marginalized populations around the world, and they support activism around particular topics of interest. CSOs develop deep expertise in the needs and opportunities of diverse communities. There is a growing number of CSOs dedicated to specific areas of harm, ranging from child sexual exploitation to eating disorders. Some examples (though not exhaustive) include Thorn, Papyrus, and the Centre for Social Research. These organizations help give voice to those who have been directly impacted by these harms, while also organizing for change on specific issues relevant to Trust & Safety teams. Engagement with these organizations can include soliciting feedback to refine policies, receiving resources that help teams better understand issues, or even partnering directly to create new safety products for a platform.

For example, Meta regularly engages with CSOs to understand the impact of its products and policies on local communities. The company's Stakeholder Engagement team is tasked with presenting product and policy proposals to a multidisciplinary range of civil society members, from NGOs to activists to grassroots organizations. These engagements help the company identify blind spots before launch that might otherwise inadvertently harm users.

Academic Researchers & Experts

Most problems faced by platforms have been studied to varying degrees by academic institutions and independent researchers. In fact, some aspects of communication technologies and their impact on society have been studied for decades in certain academic disciplines. Scholars can often provide extremely helpful insights into problems faced by Trust & Safety teams, particularly by conducting novel research that could inform a platform's strategy or investigations, or by synthesizing a variety of other research into a review that informs knowledge in a given topic area. While most researchers operate from inside traditional academic institutions, many also work within NGOs and civil society groups, as independent researchers, or collectively as community researchers.

For example, the Twitter Trust & Safety Council historically sought feedback from academic experts and researchers in the space of bullying and harassment as the platform worked to curb these abuses. The perspective of this stakeholder group helped inform new policies, gut-check new product developments, and ensure that the final outputs were grounded in an expert-backed approach to the issue. When Twitter became X, the Council was disbanded, and future policies and products could no longer benefit from the expertise of its former members.

Policymakers

As new technology emerges, regulation often follows. Regulatory bodies around the globe are working to manage online harms through new regulation. Giving policymakers the opportunity to better understand an organization and its approach to Trust & Safety helps ensure that regulation is more informed and effective.

Regulators are interested in what tangible work companies and their broader industries have done on the issues relevant to the regulator, what is being done to further improve, and whether these efforts point to a need for government intervention through new policies and laws or show that industry is adequately addressing the issues on its own. Public Policy teams may also ask Trust & Safety stakeholders to engage directly with regulators in closed-door or open meetings to communicate the platform's Trust & Safety work, progress, and vision.

Trust & Safety engagement with regulators will typically occur through Public Policy or Government Affairs teams, which are responsible for maintaining and developing relationships with government institutions and elected officials. Proactively establishing relationships, building rapport, and educating regulators in partnership with Public Policy helps ensure that regulatory stakeholders have the information they need to develop balanced and effective policies and legislation. Trust & Safety related reports, such as transparency reports, and press releases are also often leveraged to communicate with government stakeholders.