The rights that creators or owners have over their creative works are defined through the concept of intellectual property (IP). Intellectual property includes:
- Copyrights, which define creators’ exclusive rights (such as the rights to reproduce, distribute, publicly perform, or adapt their works) and apply to creations including visual artwork, literature, and musical works;
- Trademarks, which are typically associated with registered businesses, and may include word marks such as brand names or slogans, or design marks such as logos;
- Patents, which define protections related to inventions.
Infringements and Platform Responsibilities
IP issues arise frequently on online platforms hosting user-generated content. Online content may originate from unknown or anonymous creators, and may be widely re-shared across platforms in its entirety or in modified form (clipped, edited, commented upon, or otherwise transformed), including through multi-layered content formats such as memes, sound bites, and video clips spliced together into new narratives.
IP owners may provide notice to an intermediary platform of unauthorized uses of content to which they hold some degree of rights or ownership, and expect the platform to remove the content or otherwise resolve the dispute. This precedent for platforms to act as intermediaries in resolving disputes has been encoded into many countries’ IP laws. A key example is the United States’ copyright law amendment, the Digital Millennium Copyright Act (“DMCA,” 1998, 17 U.S. Code § 512). The DMCA establishes a “safe harbor” protection from secondary liability for infringement for platforms that meet certain requirements, including:
- Accommodating “standard technical measures” (measures used by rights holders to identify or protect their copyrighted works, such as monitoring the platform for infringement);
- Managing a structured “notice-and-takedown” system (a simplified workflow is sketched after this list), including receiving and acting on valid notices of infringement from rights holders on an ongoing basis (“expeditiously” removing infringing content), as well as actioning any counter-notices submitted to challenge previous takedowns (e.g., restoring content where infringement is disputed);
- Having a designated DMCA agent to manage notices; and,
- Adopting and implementing a repeat infringer policy.
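As a rough operational illustration of how the notice-and-takedown and repeat infringer requirements above might fit together, the sketch below models a minimal workflow in Python. The class, field names, and three-strike threshold are illustrative assumptions, not anything prescribed by the DMCA (which does not define a specific strike count), and real systems would also track timelines, counter-notice waiting periods, and rights holder responses.

```python
from dataclasses import dataclass, field

# Illustrative only: the DMCA requires a repeat infringer policy but does not
# prescribe a strike count; three is an assumed policy value.
REPEAT_INFRINGER_THRESHOLD = 3


@dataclass
class TakedownSystem:
    removed_content: set = field(default_factory=set)   # content IDs currently taken down
    strikes: dict = field(default_factory=dict)         # uploader ID -> infringement strikes
    terminated_users: set = field(default_factory=set)  # repeat infringers

    def handle_notice(self, content_id: str, uploader_id: str) -> None:
        """Expeditiously remove content named in a facially valid infringement notice."""
        self.removed_content.add(content_id)
        self.strikes[uploader_id] = self.strikes.get(uploader_id, 0) + 1
        if self.strikes[uploader_id] >= REPEAT_INFRINGER_THRESHOLD:
            self.terminated_users.add(uploader_id)       # repeat infringer policy

    def handle_counter_notice(self, content_id: str, uploader_id: str) -> None:
        """Restore content when a valid counter-notice challenges the takedown."""
        self.removed_content.discard(content_id)
        if self.strikes.get(uploader_id, 0) > 0:
            self.strikes[uploader_id] -= 1               # assumed: the strike is withdrawn


# Usage: a notice removes the content; a valid counter-notice restores it.
system = TakedownSystem()
system.handle_notice("video-123", "user-42")
system.handle_counter_notice("video-123", "user-42")
print(system.removed_content, system.strikes)            # set() {'user-42': 0}
```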
While U.S. trademark law does not establish a defined system comparable to copyright’s, with specific requirements for notice submission and processing, platforms have generally developed similar intake processes, notice-and-takedown workflows, repeat infringement policies, and other components paralleling their copyright systems in order to standardize IP policies, streamline operations, and manage complaint volume.
Countries have largely adopted similar IP law frameworks; however, each country has defined its own distinct set of laws. When assessing notices and counter-notices, platform teams should consider the jurisdictions of both the IP owner and the allegedly infringing users, along with the applicable IP laws, as various nuances will apply: international treaties determine which nations have “opted in” to certain common IP frameworks, and in certain cases rights may or may not extend across borders.
Assessment of whether or not content is infringing is highly nuanced, and copyright and trademark infringement cases have subjective and unpredictable outcomes. Two concepts within copyright law that make infringement assessments particularly complex are fair use (applicable to U.S. law) and fair dealing (incorporated into various countries’ copyright laws). Both cover a wide array of unlicensed partial or full uses of a work (such as use for research, or the replication or transformation of certain works) that are judged to be “fair” and non-infringing when challenged in a court of law.
Within U.S. copyright law, the rights holder, rather than the platform, is responsible for ensuring they have met all applicable requirements when submitting an infringement notice, and must provide a sworn statement and signature affirming that they are the rights holder and that their complaint is valid. More importantly, U.S. rights holders bear the overall responsibility for monitoring for content that infringes upon their copyright and providing notice to the platform; the platform has no specific monitoring requirements encoded into law.
Platforms may incorporate some limited infringement assessment or due diligence into their operational policies when assessing the validity of a notice, such as requesting any available evidence or documentation related to a user’s copyright (e.g., a link to where a digital work is hosted). However, they generally do not assess infringement to the degree that counsel or a judge would. Platforms aim to act merely as intermediaries, and generally move forward with taking action once a notice or counter-notice has been received and confirmed to be valid.
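As one hedged sketch of what that limited due diligence might look like, the check below confirms that an incoming notice contains the elements a DMCA notice is expected to include before any action is taken. The field names are assumptions made for illustration; this is a completeness check only and does not judge whether the content actually infringes.

```python
# Assumed field names loosely mirroring the required elements of a DMCA notice:
# identification of the work, identification of the allegedly infringing
# material, contact details, a good-faith/accuracy statement, and a signature.
REQUIRED_NOTICE_FIELDS = [
    "work_identified",           # the copyrighted work claimed to be infringed
    "infringing_material_url",   # where the allegedly infringing content appears
    "contact_information",
    "good_faith_statement",      # sworn statement that the complaint is valid
    "signature",                 # physical or electronic signature of the rights holder
]


def notice_is_facially_valid(notice: dict) -> bool:
    """Return True if every required element is present and non-empty.

    This checks completeness only; assessing actual infringement is left to
    the parties and, ultimately, the courts.
    """
    return all(notice.get(key) for key in REQUIRED_NOTICE_FIELDS)


# Usage
notice = {
    "work_identified": "Song X, released 2020",
    "infringing_material_url": "https://example.com/uploads/abc",
    "contact_information": "rights@label.example",
    "good_faith_statement": True,
    "signature": "Jane Doe",
}
print(notice_is_facially_valid(notice))  # True
```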
Platforms that receive a larger volume of IP complaints (particularly those hosting video or audio content) may nonetheless take on some proactive monitoring and removal burdens through their operational processes, identifying and actioning infringing content even before it is reported to them. Frequently this is achieved by implementing software tools to automatically detect and remove content, including image hash matching tools to detect copyrighted or trademarked images and audio detection software such as Audible Magic that can identify copyrighted music. Platforms may also implement user- or partner-facing tools allowing rights owners to monitor for “matches” to their IP on the platform and file complaints more quickly and directly, with YouTube’s copyright management tool suite as one example.
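As a minimal sketch of the matching step such tools perform, the snippet below compares an upload’s fingerprint against a registry of rights-holder-supplied reference hashes using Hamming distance. It assumes 64-bit perceptual hashes have already been computed (real systems rely on dedicated fingerprinting tools), and the registry contents and match threshold are illustrative assumptions.

```python
from typing import Optional

# Assumed: hashes within 6 differing bits of a reference are treated as a match.
HAMMING_THRESHOLD = 6

# Hypothetical registry mapping reference perceptual hashes to the claimed work.
REFERENCE_HASHES = {
    0xA5F3C2D1E4B69870: "Film still (Studio A)",
    0x1B2C3D4E5F607182: "Album art (Label B)",
}


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")


def find_match(upload_hash: int) -> Optional[str]:
    """Return the claimed work if the upload's hash is near any reference hash."""
    for ref_hash, work in REFERENCE_HASHES.items():
        if hamming_distance(upload_hash, ref_hash) <= HAMMING_THRESHOLD:
            return work
    return None


# Usage: a near-duplicate (one flipped bit) of the first reference still matches.
print(find_match(0xA5F3C2D1E4B69871))  # Film still (Studio A)
```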
Global Frameworks
While many global IP frameworks historically followed the DMCA in providing platforms with significant protections from liability, more recently amended copyright laws are beginning to introduce greater liability risks for platforms, as well as additional questions around which party should carry the monitoring burden for IP infringement.
A key example is the EU’s updated Copyright Directive, which was adopted by EU member states in 2019 and began to be implemented into member states’ legal codes starting in 2021. The EU framework limits liability where platforms are judged to have made “best efforts” to prevent or halt infringement, with “best efforts” assessed in terms of “proportionality” relative to the platform’s size, resources, and other factors. To earn the liability shield, platforms are required to “make best efforts” to:
- “Obtain authorization” from rights holders (broadly, to ensure content on their platforms is not infringing);
- “Ensure the unavailability of works” (which could be interpreted as requiring platforms to have some degree of mandatory content filtering sufficient to identify infringing content); and,
- Expeditiously remove content in response to infringement notices, as well as prevent that content from being re-uploaded (a minimal “stay-down” check is sketched after this list).
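As a minimal illustration of what the re-upload (“stay-down”) requirement could entail operationally, the sketch below retains a fingerprint for each work removed after a notice and rejects later uploads whose fingerprint matches. The fingerprints and function names are assumptions; as discussed below, what metadata such a system may retain, and for how long, is precisely where the privacy concerns arise.

```python
# Hypothetical "stay-down" check: only content fingerprints are retained here,
# though real systems may also need uploader metadata, which raises the GDPR
# questions discussed below.
removed_fingerprints: set = set()


def record_removal(fingerprint: str) -> None:
    """Remember the fingerprint of a work removed after an infringement notice."""
    removed_fingerprints.add(fingerprint)


def allow_upload(fingerprint: str) -> bool:
    """Reject new uploads whose fingerprint matches previously removed content."""
    return fingerprint not in removed_fingerprints


# Usage: once a work is removed, an identical re-upload is blocked.
record_removal("sha256:3b7a1f0c")
print(allow_upload("sha256:3b7a1f0c"))  # False
print(allow_upload("sha256:90d2e4aa"))  # True
```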
These new requirements introduced by the EU Copyright Directive were met with concern from a number of industry stakeholder groups, particularly regarding the subjectivity of judging whether or not a platform made “best efforts” to take certain actions, and the feasibility and risks of both mandating the detection of infringing works via content filtering and requiring the prevention of re-uploads.
Namely, the potential automated content filtering and removal requirements were criticized as likely to have negative impacts on online free expression, for example by causing non-infringing content to be accidentally removed more frequently. The re-upload prevention requirement was also called into question by legal experts as potentially conflicting with EU users’ privacy rights under the GDPR, since the metadata required to identify content that had previously been uploaded would presumably contain user data. Even if it were possible to anonymize that metadata sufficiently to meet the GDPR’s requirements, the proposed requirement was also examined under the lens of whether it would meet the GDPR’s requirement of a lawful basis for processing data.
Regardless, challenges to the EU Copyright Directive have so far failed, and with its implementation into most member states’ legal codes starting in 2021, a new precedent has been set for intermediary liability within the realm of intellectual property law. The EU Copyright Directive also continues a more recent regulatory trend of imposing additional monitoring and technical implementation requirements on platforms in order to earn protection from intermediary liability.