States and Congress have been enacting or debating different approaches to online “content moderation” by social media and other internet platforms. California’s “Content Moderation Requirements for Internet Terms of Service” bill (“AB 587”) goes into effect on January 1, 2024. In short, AB 587 requires social media companies to disclose their processes for taking down or managing content and users on their platforms. AB 587 takes a somewhat different approach to social media content regulation than previously enacted laws in Texas and Florida. The Texas and Florida laws also address the content management practices of social media companies, but go beyond requiring disclosures and also prohibit specific conduct in order to restrict putative viewpoint discrimination. The Eleventh Circuit partially enjoined enforcement of the Florida law, holding that its content moderation requirements violated social media companies’ First Amendment right to exercise editorial judgment on their platforms. The Fifth Circuit, on the other hand, upheld a similar Texas law, reasoning that content moderation based on viewpoint constitutes censorship and that a platform’s content moderation activity is not speech protected by the First Amendment.
Section 230 of the Communications Decency Act has been read to provide social media companies broad immunity to manage third-party posts or content on their platforms. Importantly, Section 230 protects social media companies from liability for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”
Florida and Texas sought to regulate the management of third-party posts or content by social media companies by passing “content moderation” or “transparency” laws. Both laws focus on (1) content moderation; (2) disclosure requirements regarding the internal guidelines under which content moderation and censorship decisions are made; and (3) the blocking, banning, or deplatforming of users and their content from the platform.
As noted above, both laws were challenged on constitutional grounds. In May 2022, the Eleventh Circuit upheld a district court’s entry of an injunction barring enforcement of certain provisions of the Florida law relating to content moderation. The Eleventh Circuit’s core holding was that the platforms’ content moderation was protected speech, similar in kind to the editorial powers exercised by traditional media companies. In contrast, in September 2022, the Fifth Circuit held that a district court had erred in enjoining the Texas law, reasoning that the Texas law does not violate the First Amendment rights of tech platforms. The Fifth Circuit’s lengthy opinion is not easily summarized, but at its core the opinion “reject[s] the idea that corporations have a freewheeling First Amendment right to censor what people say.” The court likewise stated that the Texas law “does not chill speech; if anything, it chills censorship,” and “does not regulate the Platforms’ speech at all; it protects other people’s speech and regulates the Platforms’ conduct.” The Fifth Circuit also likened social media companies to “common carriers,” drawing on this analogy to hold that the government may broadly limit their content moderation activities. The Fifth Circuit therefore rejected the challengers’ contention that the Texas law conflicts with the First Amendment. Notwithstanding this ruling, the Texas law has not yet taken effect. The Fifth Circuit granted a stay of its ruling in mid-October, agreeing to suspend enforcement of the law pending potential further review by the Supreme Court.
While California’s new law is crafted to require transparency regarding content moderation practices, rather than restrict editorial decisions, the California law may also face litigation challenges under the First Amendment and perhaps Section 230 as well.
AB 587 is intended to cover Social Media Companies, a term broadly defined to mean “a person or entity that owns or operates one or more social media platforms.” A Social Media Platform is defined as “a public or semipublic internet-based service or application that has users in California” and both aims to connect users within the service or application and allows users to construct a profile to interact with others and consume user-generated content.
This definition would appear to capture a large swath of companies, including message board operators and platforms like Instagram, TikTok, and Meta. However, the legislative history suggests that AB 587 should be construed more narrowly. Specifically, the legislative history indicates that the legislature did not intend AB 587 to apply to an internet-based service or application for which interactions between users are limited to direct messages, commercial transactions, consumer reviews of products, sellers, services, events, or places, or any combination thereof. This could mean that even prominent companies such as Amazon or Yelp would not be considered Social Media Companies that own Social Media Platforms.
Requirements of AB 587
Under AB 587, a Social Media Company must post its Terms of Service and include (1) contact information for users to ask questions about the Terms of Service; (2) a description of its content moderation policies; and (3) a list of potential actions the Social Media Company may take against prohibited content. Users of the platform have broad rights under AB 587, including the right to flag content they believe violates the Terms of Service and to obtain an explanation of the Social Media Company’s decision. Notably, AB 587 contains no private right of action; provisions that would have created one were stripped out during the legislative process. No regulations or guidance regarding implementation of the bill appear to have been issued at this time, although nothing indicates that such guidance or regulations are precluded or would not be provided at a future date.
Under AB 587, a Social Media Company must provide a report regarding its Terms of Service to the California Attorney General every six months. The report must include the Terms of Service, any changes made to the Terms of Service since the last report, and a statement of whether the Terms of Service define certain categories of speech. In addition, AB 587 requires disclosure of the content moderation practices the Social Media Company uses in its day-to-day operations.
There is a $15,000 fine per violation of the requirements of AB 587. A Social Media Company shall be considered in violation of AB 587 for each day it does any of the following:
(A) Fails to post Terms of Service.
(B) Fails to timely submit to the Attorney General a required report concerning its Terms of Service.
(C) Materially omits or misrepresents required information in a report concerning its Terms of Service.
State legislatures and federal courts are adopting different approaches to regulating social media. The litigation in the Fifth and Eleventh Circuits has produced a circuit split on “content moderation” laws that could lead the Supreme Court to take up these issues soon. In the meantime, other courts and state legislatures, and possibly the U.S. Congress, may weigh in with additional laws and legal rulings. Each of these developments will shape the legal landscape and inform how laws governing social media content moderation are applied and analyzed under the First Amendment and Section 230 of the Communications Decency Act. Social media companies should continue to monitor developments in this area and prepare to respond to a variety of content moderation laws, accounting for the possibility that these laws may impose differing, and perhaps squarely conflicting, legal obligations.