Industry Response & Transparency Efforts

Transparency has become a major demand from lawmakers and other stakeholders. Because it is central to understanding how online platforms make decisions, recent laws have imposed transparency requirements, and collaborative initiatives have recommended them, as a way of involving stakeholders in helping companies keep users safe on their platforms. This section provides an overview of the most relevant regulatory frameworks, principles, and collaborative initiatives on transparency that T&S teams should consider.

Platform Transparency Frameworks

The Manila Principles

In response to a trend toward more stringent, broadly scoped, and complex platform liability regimes that began taking shape in the 2010s, a coalition of civil society organizations, led by the Electronic Frontier Foundation, collaborated to produce the Manila Principles on Intermediary Liability in 2015. This framework includes the following recommendations for regulators when developing platform intermediary laws, and for governments and platforms when operating within intermediary liability regimes:

  • Intermediaries should be shielded from liability for third-party content. This covers both hosting content and restricting or moderating it (proposing a liability regime similar to Section 230 of the U.S. Communications Decency Act), provided the platform has not “modified” the content (meaning transformed or edited the original content, rather than removed or restricted it). The authors of the Manila Principles also specified within this principle that platforms should never be required to proactively monitor content;
  • Content must not be required to be restricted without an order by a judicial authority—building on the near-blanket liability shield proposed in the first principle, this advocates for a court-adjudicated system on top of it, in which governments could still compel the removal of illegal content through a notice-and-takedown process;
  • Requests for restrictions of content must be clear, be unambiguous, and follow due process—introducing a variety of requirements for judicial authorities submitting requests, to ensure that each request is lawful and sufficiently detailed for the platform to act upon;
  • Laws and content restriction orders and practices must comply with the tests of necessity and proportionality—establishing that any content restrictions requested by government authorities are limited in scope to the specific piece of content concerned and to the jurisdiction where the relevant law applies;
  • Laws and content restriction policies and practices must respect due process—recommending that platforms have the ability to appeal requests through an appropriate legal process, and that users have the ability to appeal the consequences of a platform’s action through a corresponding redress mechanism or legal process;
  • Transparency and accountability must be built into laws and content restriction practices—recommending various practices for both governments and platforms (including that both publish clear and timely terms, policies, and transparency reports, and maintain objective oversight mechanisms); that platforms publish notices of removed content; and that systematic reviews of intermediary liability-related rules and guidelines take place.

The Santa Clara Principles

In response to regulatory, industry, and public concerns over platforms’ inconsistent moderation practices, NGOs and academics within the T&S ecosystem, in collaboration with a group of large social platforms, developed the Santa Clara Principles in 2018 (revised and expanded in 2021) to recommend best practices for platforms to maintain transparency, accountability, and fairness in their policies and operations. The Santa Clara Principles are divided into “Foundational” principles—essentially, guiding values—and “Operational” principles related to establishing mechanisms for transparency.

The Foundational Santa Clara Principles are:

  • Human rights and due process—recommending practices to ensure platforms can carry out unbiased and accurate review of, and action on, content, including publishing information for users about how moderation decisions are made;
  • Understandable rules and policies—“clear and precise” rules and policies for users should be published on the platform in an “easily accessible” place;
  • Cultural competence—recommending that language knowledge, cultural context, and diversity among a platform’s moderation teams are sufficient to allow moderators to make accurate and unbiased decisions on content, and that users are able to access information related to content moderation in a language they understand;
  • State involvement in content moderation—where possible, platforms should notify users when a state actor has requested or been involved in an action taken on their content or account;
  • Integrity and explainability—platforms should ensure the accuracy and reliability of their content moderation systems, operational processes, and policies, such as through third-party auditing.

The Operational Santa Clara Principles are:

  • Numbers—introducing a set of recommended metrics related to actions taken on content, automated actions, and government requests received, for platforms to include in their transparency reports (see the illustrative sketch after this list);
  • Notice—recommending that users are consistently notified of actions taken on their content or account, and specifying the details that should be included in those notices;
  • Appeal—recommending that platforms consistently make an appeal mechanism available to a user whose content has been actioned, and describing best practices for ensuring a fair appeal system.
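
To make the “Numbers” recommendations more concrete, the sketch below shows one way a platform might structure the metrics collected for a periodic transparency report. It is a minimal illustration in Python; the names used (ReportingPeriodMetrics, actions_by_policy_area, and so on) are hypothetical and are not drawn from the Santa Clara Principles themselves.

    from dataclasses import dataclass, field

    # Hypothetical, minimal structure for the metrics a transparency report
    # might aggregate each reporting period, loosely following the "Numbers"
    # principle: actions on content, automated actions, and government requests.
    @dataclass
    class ReportingPeriodMetrics:
        period: str                                 # e.g. "2023-H1"
        actions_by_policy_area: dict = field(default_factory=dict)   # e.g. {"spam": 1200}
        automated_actions: int = 0                  # actions taken without human review
        appeals_received: int = 0
        appeals_reinstated: int = 0                 # appeals that led to reinstatement
        government_requests_by_country: dict = field(default_factory=dict)

        def total_actions(self) -> int:
            # Total pieces of content actioned across all policy areas.
            return sum(self.actions_by_policy_area.values())

A real report would carry many more dimensions (by country, by detection source, by appeal outcome), but even this small structure maps directly onto the categories the principle asks platforms to disclose.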

Recent Developments & Regulatory Frameworks

Platforms have improved their transparency practices in different ways over the past few years. In some cases, platforms publish blog posts explaining in plain language changes to their community policies or specific significant events they believe their users should know about. Most platforms have also begun publishing transparency reports to let users know how they moderate certain types of content and conduct. In addition, these transparency practices have been included as obligations for platforms in recent laws. For example, Article 13 of the EU Digital Services Act proposal (Article 15 of the adopted regulation), on transparency obligations for providers of intermediary services, establishes that “Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period.” The article also specifies the information these reports must include.

When processing requests from governments, such as requests for the disclosure of user data, platforms also aim to follow applicable legal provisions that help them maintain transparency and fairness and uphold users’ privacy and human rights throughout the handling of legal requests. For example, platforms may generally disclose to users that government officials have requested their data for an investigation where jurisdictional laws allow, and where no legal basis for withholding notice from the user has been established for the associated criminal investigation.

Many laws governing the disclosure of user data, including the U.S. Stored Communications Act (SCA), include transparency requirements detailing if and when platforms should notify a user that their data has been requested by a government entity or law enforcement. Laws may also give users a mechanism to challenge a disclosure order in court once they have been notified of the request. Under the SCA, for example, the platform may be required to provide notice to the user depending on the type of court order that has been obtained. The SCA also allows a government entity to obtain an additional non-disclosure order compelling the platform not to notify the user in certain scenarios where notification could jeopardize an investigation.

Platforms also use established industry transparency practices, including transparency reporting, to provide data and insights related to government requests—for example, highlighting where a region has submitted a disproportionately large number of requests for user data relative to its population size, or a disproportionate number of requests related to a specific category of content. Transparency reporting requirements related to legal requests have begun to be incorporated into regulations (such as the EU Digital Services Act), but historically platforms have generally not been legally required to publicly disclose data about the legal requests they receive. Larger social media platforms, led by early pioneer Google in the early 2010s, established legal requests reporting as an industry-wide practice, in part to shed light on jurisdictional trends and challenges with compliance related to the disclosure of user information.
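
As one illustration of the kind of per-capita comparison described above, the short sketch below (in Python) computes government data requests per 100,000 residents by region and flags outliers. The regions, figures, and the two-times-median threshold are made up for the example and do not reflect real platform data.

    # Hypothetical example: flag regions whose rate of government requests for
    # user data looks disproportionately high relative to population size.
    requests_by_region = {"Region A": 1200, "Region B": 300, "Region C": 6000}   # made-up counts
    population_by_region = {"Region A": 50_000_000, "Region B": 4_000_000, "Region C": 30_000_000}

    # Requests per 100,000 residents for each region.
    rates = {
        region: requests_by_region[region] / population_by_region[region] * 100_000
        for region in requests_by_region
    }

    # Flag any region whose rate is more than twice the median rate.
    median_rate = sorted(rates.values())[len(rates) // 2]
    for region, rate in sorted(rates.items(), key=lambda kv: kv[1], reverse=True):
        flag = "  <- disproportionately high" if rate > 2 * median_rate else ""
        print(f"{region}: {rate:.1f} requests per 100k residents{flag}")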

Other industry-wide initiatives have emerged to promote transparency around legal requests from government entities, such as the Lumen Database, which aggregates legal request data and details such as compliance outcomes from participating tech platforms.

Conclusion

Legal and regulatory considerations are increasingly significant for both established and emerging platforms and services. The general overview above is intended to help young professionals interested in the field, the interested public, and founders confronting such issues for the first time navigate some foundational aspects of this complex and fast-evolving area of the industry. This is especially important as sovereigns pursuing a wide range of public policy objectives introduce regulations that go beyond the traditional core issues of intermediary liability, privacy and data protection, copyright and intellectual property, and content moderation. As the density of relevant material increases, the field is inevitably becoming somewhat less accessible, and the authors hope this chapter helps interested readers pursue the themes that interest them in greater depth.

Resources


Acknowledgements

Authors: Maia Levy Daniel, Rhett King, Jan Eissfeldt
Contributors: Jess Miers, Wesley Rich
Special Thanks: Brian Fishman, Catherine Silva, Karen Maxim, Amanda Menking, Harsha Bhatlapenumarthy