The U.S. Supreme Court is scheduled to hear two cases right after Presidents’ Day that could significantly impact how online service providers must handle third-party content. At issue in both cases is Section 230 of the Communications Decency Act (“CDA 230”), which some commentators credit as one of the most important laws in the development of the Internet and which has been the subject of significant policy attention over the past few years.

U.S. courts have traditionally interpreted CDA 230 in a manner that grants robust immunity to online service providers against various legal claims relating to third-party content. This immunity is subject to statutory exceptions for intellectual property infringement, federal criminal violations and certain other claims, but courts have otherwise routinely applied CDA 230 to reject a wide range of claims seeking to hold providers liable for hosting, displaying, removing or blocking third-party content, including claims under contract, defamation, tort and civil rights laws.

Some advocates of the traditional interpretation of CDA 230 argue that it would be unduly burdensome to expect online service providers to review, assess and moderate all of the content that users independently create and upload. At the same time, numerous public figures have called for changes to CDA 230 (although not always for the same reasons), including President Trump, President Biden and the attorneys general of more than half of the U.S. states.

The upcoming cases focus on the interplay between CDA 230 and online platforms that allegedly used algorithms to amplify terrorist content. If the Supreme Court finds that CDA 230 does not protect online service providers from liability for third-party content that they amplify, online platforms should consider whether and how to change their content promotion and moderation policies and protocols. Regardless of the outcome of the cases, companies should consider reviewing their policies, protocols and algorithms to ensure they do not recommend content that promotes terrorism or violence or that is otherwise harmful.

What is CDA 230?

CDA 230 has two main provisions: one that shields online platforms from liability for content exclusively provided by third parties, and one that shields them from liability when they block or remove content in good faith.

First, under 47 USC § 230(c)(1), a provider or user of an interactive computer service will not be treated as the publisher or speaker of any information provided by another information content provider. Historically, a “speaker” or “publisher” of content could be held liable in much the same way as the content’s original author, unlike a mere “distributor.” Courts have therefore interpreted this provision as protecting online platforms from liability for content exclusively generated by third parties. For example, if a user of a social media platform posts a defamatory statement, the platform hosting that content would generally not be liable for the user’s defamation.

Second, under 47 USC § 230(c)(2), a provider or user of an interactive computer service will not be liable for voluntarily restricting access to or removing objectionable content if the action is taken in good faith. This provision generally immunizes platforms against liability for policing their platforms and engaging in content moderation, such as when a streaming platform chooses to remove violent or obscene videos from its platform to enforce its community standards.

CDA 230 defines “interactive computer service” broadly as any “information service, system, or access software provider that provides or enables computer access by multiple users to a computer server,” a definition that encompasses many varieties of online platforms, such as social media platforms, streaming platforms and search engines.

How has CDA 230 been historically interpreted?

Courts have construed CDA 230 broadly in favor of online service providers over the years. In one of the earliest and most significant court cases to interpret the statute, Zeran v. America Online, Inc. (1997), the court held that Congress intended CDA 230 immunity to apply broadly to online service providers because, without such broad immunity, providers would severely restrict user content to avoid potential liability, which would impede free speech interests. This set a precedent for future cases and, in the decades since, many courts have cited the law to shield online service providers from a wide range of civil claims related to third-party content.

Although courts have generally interpreted CDA 230 broadly, the statute does not provide absolute immunity. It expressly carves out certain matters, such as violations of federal criminal laws and intellectual property infringement. Courts have also recognized limits on CDA 230 protection in certain circumstances. For example, if the defendant induced or helped to develop the unlawful content, CDA 230 immunity does not apply (see, e.g., Fair Housing Council of San Fernando Valley v. Roommates.com (2008) and FTC v. Accusearch (2009)).

What are the upcoming Supreme Court cases about?

In one of the cases, the plaintiffs claim that the defendant’s content streaming platform aided and abetted a terrorist organization in violation of the Anti-Terrorism Act by hosting recruitment videos and algorithmically recommending those videos to users based on their viewing histories, allegedly radicalizing users susceptible to terrorist messaging into carrying out coordinated attacks. One of the key questions in this case is whether an online platform becomes a content developer, and therefore loses CDA 230 immunity, when it algorithmically recommends harmful third-party content.

In the other case, the plaintiffs filed a lawsuit against three social media companies, alleging that they aided and abetted an act of terrorism in violation of the Anti-Terrorism Act by failing to take sufficient measures to prevent a terrorist organization from using their platforms for recruitment and messaging. One of the key questions in this case is what measures online platforms must implement with respect to third parties and their content to avoid being found to have knowingly aided and abetted terrorism.

Together, these cases mark the first time the Supreme Court will interpret the scope of CDA 230. Depending on their outcome, platform providers may face significantly greater responsibilities for moderating terrorism-related and other content on their platforms. Beyond the courts, numerous lawmakers have proposed bills or initiatives to amend or abrogate CDA 230, although none has so far gained sufficient traction to become law. We recommend monitoring developments in this space, as the stakes are high for many online service providers.