Despite the benefits of transparency, operationalizing scaled reporting comes with significant challenges. To start, defining meaningful metrics is difficult in practice. Reporting also carries a significant operational cost, particularly as reports become more extensive and detailed. Designing and implementing a transparency report requires an ongoing investment of human and technical resources across functions, including operations analysts, data analysts, engineers, technical program managers, lawyers, policy and communications managers, and more. In addition, this level of investment may not consistently produce the desired outcomes without an equal investment in contextualizing the information to support public understanding. For trust and safety professionals, building a report is therefore a balancing act between reporting metrics at scale and investing in the narrative around those metrics to build trust and credibility with the public. This section provides a detailed overview of these key challenges.
Because of the lack of standardization across the industry, trust and safety professionals developing transparency reports must make important decisions about how to define relevant metrics for their platform. For example, in reporting on legal content takedown requests, a company may count the number of items of content it was asked to remove, or it might instead count the number of requests it received to remove content. This decision leads to very different metrics because a single request may specify multiple items of content. In many cases, internet companies have legitimate reasons to track and define different metrics; for example, because of unique product features or company policies.
However, these differences can reduce the practical value of transparency reporting for the public by making “apples to apples” comparisons between companies impossible. In the comparatively new and evolving category of reporting on policy enforcement, these challenges may compound further with rising demands for companies to include metrics that go beyond content removal to cover other moderation actions, such as algorithmic ranking, promotion, or demotion, if such actions play a role in their moderation model. Metrics can create important accountability mechanisms, so it’s critical that companies set the right incentives by which to measure progress and improvement. Daphne Keller, TSPA advisor and Director of Platform Regulation at Stanford University’s Cyber Policy Center, explores these challenges and has assembled a list of open questions on content transparency reporting.
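The requests-versus-items distinction above can be made concrete with a small sketch. The log structure and field names here are hypothetical, invented purely for illustration; the point is that the same underlying data supports two different, equally defensible metrics.

```python
# Hypothetical takedown-request log: each legal request may name several
# items of content, so counting "requests" and counting "items" produce
# different headline numbers from identical data.
requests = [
    {"id": "req-1", "items": ["post-10", "post-11", "post-12"]},
    {"id": "req-2", "items": ["post-20"]},
]

# Metric A: number of removal requests received.
requests_received = len(requests)

# Metric B: number of individual items of content requested for removal.
items_requested = sum(len(r["items"]) for r in requests)

print(requests_received, items_requested)  # 2 4
```

A company reporting Metric A and another reporting Metric B could describe the same enforcement activity with numbers that differ by a large factor, which is exactly why cross-company comparisons break down without a shared definition.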
Any company that seeks to publish a transparency report must also recognize the costs associated with measuring and reporting, which not only include designing and launching a report, but also maintaining and expanding it. When a team decides upon the relevant transparency metrics discussed above, they must also consider what systems, tooling, and workflows will be required to power the transparency report they envision. The design of any given reporting system may require that a company first invests in solving broader operational and infrastructure issues, such as redesigning tools to record the right data points, auditing manual workflows to reduce the opportunity for human error, and building dashboards and websites to display the data.
The less obvious cost of reporting comes from the continuous obligation to publish recurring updates, which may require two or more releases per year. Each launch cycle can take multiple months of combined effort, requiring an investment of human resources across teams and departments. For trust and safety teams that already operate with constrained budgets, these investments in transparency reporting directly compete with the core work of the team, creating a difficult balancing act. All of these costs can also be compounded unexpectedly as global legal requirements evolve; countries may introduce new reporting requirements that require a company’s data collection and reporting tools to be adapted, sometimes drastically, in a timespan that typical engineering roadmaps are not designed to accommodate. As more governments introduce their own mandatory transparency reporting laws, companies with limited resources and time may have to reduce their investment in other areas, such as improving the precision of their enforcement systems, in order to comply with those demands.
Transparency should therefore not be considered a “one and done” cost, but rather a critical and ongoing investment, requiring dedicated resources to both maintain the existing standard of reporting and innovate upon it.
Transparency reports are, by their nature, designed to help the public understand how companies operate, but that doesn’t mean transparency reports are always easy for the public to understand. Good reporting requires a conscious effort to ensure that disclosed metrics are not only precise and meaningful, but also easy for the general public to interpret. Numbers presented without the appropriate context and background information can result in conflicting or entirely inaccurate interpretations. For example, Facebook removed 31.8 million pieces of content for violating its adult nudity and sexual activity policies from January to March 2021, representing a 13% increase from the previous three months. With a number that large, the public might reasonably wonder whether Facebook has a significant and growing pornography problem. That interpretation falters when you read the contextualizing information Facebook presents alongside that number: “In Q1, we adjusted our media-matching technology and were able to take action on old, violating content.” Furthermore, Facebook’s accompanying prevalence measurement explains that this content has a 0.03-0.04% prevalence rate—that is, of every 10,000 views of content on Facebook, only 3-4 would have shown content containing adult nudity or sexual activity. Contextual metrics and information give the public a better picture of the situation. Therefore, a cross-functional, proactive effort between a company’s communications team and trust and safety team is critical to ensure that this context is understood and communicated alongside the report.
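The prevalence arithmetic above can be sketched in a few lines. This is an illustrative sketch, not Facebook’s actual methodology: the sample sizes and counts below are hypothetical, and only the 3-4 views per 10,000 figure comes from the published report.

```python
# Illustrative sketch of a view-based prevalence metric (hypothetical inputs,
# not Facebook's actual methodology): the share of sampled content views that
# were of violating content, expressed per 10,000 views.

def prevalence_per_10k(violating_views: int, sampled_views: int) -> float:
    """Violating views per 10,000 sampled content views."""
    if sampled_views <= 0:
        raise ValueError("sample must be non-empty")
    return 10_000 * violating_views / sampled_views

# Suppose a hypothetical audit samples 1,000,000 content views and labels
# 350 of them as showing adult nudity or sexual activity:
rate = prevalence_per_10k(350, 1_000_000)
print(rate)  # 3.5, i.e. ~0.035%, within the reported 0.03-0.04% range
```

The design point is that prevalence is denominated in views rather than removals: a raw count like 31.8 million items says nothing about how often users actually encountered the content, which is what a per-10,000-views rate communicates.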
The challenges noted above should not dissuade companies from pursuing transparency reporting; rather, they are necessary realities companies must consider when designing and publishing these critical reports. As many governments around the world look to mandate different forms of reporting, it is more important than ever to invest in building a sustainable reporting infrastructure. Whether voluntary or mandated, transparency reporting will continue to be a valuable opportunity for trust and safety professionals to pull back the curtain on these complex operations—and create accountability and legitimacy with the public on the decisions and systems that govern their digital lives.
- Transparency Center, Electronic Frontier Foundation
- Transparency Reporting Index, Access Now
- Case Study #3: Transparency Reporting, New America
Authors│Jan Eissfeldt, Kate Jung
Contributors│Daphne Keller, Charlotte Willner, Dave Willner
Special Thanks│Harsha Bhatlapenumarthy, Kaofeng Lee, Megan McClellan, Nancy Stone