Since 1996, internet and social media companies have relied on Section 230 of the Communications Decency Act, 47 U.S.C. § 230, to obtain dismissal of claims based on user-generated content posted on their sites. Section 230 provides broad immunity for interactive computer services when a plaintiff’s claims treat that service as “the publisher or speaker” of third-party content. But, as two recent cases demonstrate, the line can be fuzzy between a claim that targets a social media platform as a “publisher or speaker” and one that targets the platform in some other role. For this reason, a defendant should also rely on traditional principles of tort and commercial law to seek dismissal.

Cohen v. Facebook, Inc.

In a recent decision in the Eastern District of New York, the court held that Section 230 shielded Facebook from liability for claims alleging violations of federal anti-terrorism statutes and tort claims under New York and Israeli law. The court found that the plaintiffs' claims implicated Facebook's role as a publisher or speaker of content created by third-party users and thus fell within Section 230's broad immunity. In 2016, two groups of plaintiffs sued Facebook, alleging that Facebook's social media platform allowed terrorists to incite violence and organize attacks and that Facebook had failed to take adequate steps to address this problem. See Cohen v. Facebook, Inc., Nos. 16-CV-4453 (NGG) (LB), 16-CV-5158 (NGG) (LB), 2017 WL 2192621 (E.D.N.Y. May 18, 2017). Facebook moved to dismiss because, among other reasons, the plaintiffs' claims were barred by Section 230.

The court granted Facebook's motion. After dismissing one group of plaintiffs for lack of standing, the court considered the application of Section 230 to the remaining plaintiffs' claims. The court found that the key question was whether those claims arose from Facebook's role as a "publisher or speaker" of third-party content. Id. at *11. The plaintiffs argued that their claims fell outside Section 230 because they related to Facebook's "provision of services" to suspected terrorists while "refus[ing] to use its available resources" to remove those accounts. In other words, the plaintiffs sought to frame their claims as challenging Facebook's policing of accounts rather than its policing of content, which would fall within Section 230. Id. at *12.

The court rejected this distinction, finding that "Facebook's choices as to who may use its platform are inherently bound up in its decisions as to what may be said on its platform, and so liability imposed based on its failure to remove users would equally 'derive[ ] from [Facebook's] status or conduct as a "publisher or speaker."'" Id. (quoting FTC v. LeadClick Media, LLC, 838 F.3d 158, 175 (2d Cir. 2016)). Moreover, the plaintiffs' injuries derived from the nature of the content (posts inciting or organizing terrorist attacks), not purely from the ability of such individuals to obtain Facebook accounts. Id. at *13. Thus, the court concluded that Section 230 applied and dismissed the remaining plaintiffs' claims. The plaintiffs have not yet noticed an appeal.

Beckman v. Match.com, LLC

In contrast, Match.com had to turn to traditional tort principles to obtain dismissal in a failure-to-warn case after the Ninth Circuit held that the plaintiff's claims did not seek to impose liability against Match.com for its role as a publisher or speaker of third-party content. In Beckman v. Match.com, LLC, No. 2:13-cv-97-JCM-NJK (D. Nev. filed Jan. 18, 2013), Mary Kay Beckman sued Match.com for, among other things, failure to warn after she was the victim of a brutal attack by a man she met through the site, Wade Ridley. Beckman argued that Match.com failed to protect her from dangerous individuals like Ridley, who used the website to facilitate criminal behavior.

Match.com moved to dismiss on the basis that Beckman's claims were barred by Section 230, and the district court granted the motion. On appeal, the Ninth Circuit disagreed with the district court that Beckman's failure-to-warn claim was barred by Section 230. Beckman v. Match.com, LLC, No. 13-16324, 2016 WL 4572383 (9th Cir. Sept. 1, 2016). Citing its recent holding in Doe No. 14 v. Internet Brands, Inc., the Ninth Circuit stated that Section 230 "did not preclude a plaintiff from alleging a state law failure to warn claim against a website owner who had obtained information 'from an outside source about how third parties targeted and lured victims.'" Id. at *1 (quoting Doe No. 14 v. Internet Brands, Inc., 824 F.3d 846, 851 (9th Cir. 2016)). Holding a website owner liable for a failure to warn did not implicate its role as a "publisher or speaker" of third-party content. Id.

On remand, Match.com was nonetheless able to secure dismissal of Beckman's claims. Without relying on Section 230, the district court held that Beckman failed to state a claim for failure to warn. See Beckman v. Match.com, LLC, No. 2:13-cv-97-JCM-NJK, 2017 WL 1304288 (D. Nev. Mar. 10, 2017). Under Nevada law, no duty to warn exists in the absence of a special relationship between the parties, such as innkeeper and guest, teacher and student, or employer and employee. Such a relationship may also exist where "the ability of one of the parties to provide for his own protection has been limited in some way by his submission to the control of another." Id. at *3 (quoting Sparks v. Alpha Tau Omega Fraternity, Inc., 255 P.3d 238, 245 (Nev. 2011)). The court found Beckman's allegations insufficient to satisfy this element. Id. at *4-5. Beckman has appealed the ruling.

* * *

Social media and similar platforms have complicated the law related to Section 230 in ways likely unanticipated in 1996. As a result, it can be difficult for providers to anticipate when a claim treats them as a "publisher or speaker" of third-party content and when a claim treats them in some other capacity. But, as Beckman shows, failure to secure immunity under Section 230 is not the end of the road. Even where Section 230 does not require dismissal of a claim, the absence of a tort duty still might. These traditional tort principles should be part of a defense strategy in cases involving user-generated content, and a company would be wise to consider them in drafting terms and conditions describing its relationship with users. For example, disclaiming the formation or existence of a "special relationship" with users could make explicit what the court in Beckman had to infer.