It is vitally important to understand the body of legislation that governs trust and safety policy creation. This section focuses on a U.S. regulatory perspective, as many global online service providers are based in the United States. Many countries have legislation similar to the U.S. laws covered in this section, but it is always advisable to research the specific laws and requirements that may apply in a given country and to discuss their implications with corporate legal teams.
Communications Decency Act, Section 230
Section 230 of the Communications Decency Act (CDA) of 1996 (CDA 230) essentially provides immunity from liability for both providers and users of an “interactive computer service” that publish information provided by third parties. This is particularly important for content moderation because it allows services to moderate and remove content without assuming liability for content that was not removed.
The statute specifically states:
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
In other words, any online service that publishes third-party content is protected against a range of laws that might otherwise be used to hold it legally responsible for what its users post. There are, however, important exceptions for certain criminal and intellectual property claims. Essentially, CDA 230 creates strong, broad protection for U.S. service providers, which allows platforms to give the public the means to broadly share information and access communication services.
Stop Enabling Sex Traffickers Act (SESTA) and Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA)
The Stop Enabling Sex Traffickers Act (SESTA) is a significant amendment to CDA 230 that requires the removal of content that violates federal and state sex trafficking laws. It was packaged with the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) and signed into law in 2018. The amendment removed the liability protections otherwise provided by CDA 230 for knowingly assisting, facilitating, or supporting sex trafficking. For platforms, this legislation creates a strong disincentive to host content that promotes or advertises most kinds of sex work or other sex-related services: because platforms usually cannot reliably determine whether a sex worker is a trafficking victim, they must choose either to risk violating FOSTA-SESTA or to prohibit any content that might violate it.
Children’s Online Privacy Protection Act (COPPA)
The Children’s Online Privacy Protection Act (COPPA) is legislation aimed at protecting the privacy of children. It requires verifiable parental consent for the collection or use of any personal information from users of an online service who are under the age of 13. COPPA was passed in 1998 to address the rapid increase of online marketing specifically targeting children.
Digital Millennium Copyright Act (DMCA)
The Digital Millennium Copyright Act (DMCA) is a U.S. copyright law that addresses a number of issues specific to intellectual property rights and protections online. Subsection 512(c) of the DMCA protects online service providers from liability for copyrighted material posted or transmitted by users, if certain requirements are met.
The online service provider must not have actual knowledge of a specific instance of copyright infringement, or awareness of facts or circumstances from which the infringement is apparent. Upon being made aware of an alleged infringement, the provider must act expeditiously to remove or disable access to the infringing material.
An online service provider must also have a designated agent responsible for receiving notifications of alleged infringement, with contact details registered with the U.S. Copyright Office and posted publicly. This requirement ensures there is always a clear way for copyright holders to notify the provider of alleged infringement.
Trademark
Under U.S. trademark law, it is unlawful to use a trademark in a manner that confuses consumers about the source or sponsorship of goods or services. In other words, it is prohibited to use a mark that suggests an association between your product and another business where none exists. Online service providers are expected to take action on reports of trademark infringement.
As we discussed in the Industry Overview chapter, many functions of trust and safety, including policy development, continue to evolve as the field matures. There are a number of approaches to creating and enforcing policies, based on a wide range of factors, from how the product or service is used to its scale, maturity, and core values. Core values and principles play an especially critical role in setting a policy’s direction as it matures in response to emerging content trends, legal and regulatory requirements, user behavior, and enforcement issues. Clear frameworks and scalable processes for developing, revising, and auditing policies and enforcement practices serve as an essential touchstone when working in this challenging area.
Authors│v1. James Gresham, Jan Eissfeldt, Dave Willner, Shannon Berri; v2. Jeff Lazarus, Harsha Bhatlapenumarthy
Contributors│Charlotte Willner, Kaitlin Sullivan
Special Thanks│Adelin Cai, Amanda Menking, Kaofeng Lee, Cathryn Weems, Pia Shah