In a case that has gone through some interesting procedural twists, the Second Circuit has confirmed (for a second time in this case) that social media platforms can terminate accounts, at least when they do so in good faith.

James Domen is a self-identified "former homosexual" and the founder of Church United ("CU"), a California non-profit that seeks to "equip pastors to positively impact the political and moral culture in their communities"--among other things, by sharing stories like Domen's. In service of this mission, Domen posted 89 videos to CU's Vimeo account, some of which CU described as "addressing sexual orientation as it related to religion." In late 2018, after Vimeo issued a warning and CU failed to remove the videos, Vimeo deleted CU's account for violating its Terms of Service and Guidelines, which specifically prohibit videos that promote sexual orientation change efforts.

Domen brought suit in the Southern District of New York, alleging that Vimeo's actions constituted censorship and barred him from speaking about his preferred sexual orientation and religious beliefs. He brought claims under California's Unruh Act, New York's Sexual Orientation Non-Discrimination Act, and Article I, Section 2 of the California Constitution. The District Court granted Vimeo's motion to dismiss, finding that Vimeo's actions were protected by Section 230(c)(2) of the Communications Decency Act, which provides certain protections for interactive computer service providers.

Domen appealed, and, in March, the Second Circuit affirmed the dismissal. The Court found that Section 230 provides "platforms like Vimeo with the discretion to identify and remove what they consider objectionable content from their platforms without incurring liability for each decision," at least where that decision is made in good faith. The decision was novel. Although courts in California had addressed the issue, the Second Circuit had not yet ruled on a platform's ability to remove content under Section 230; it had only held that platforms were shielded from liability for allowing third-party content to remain on their sites.

The Court based its decision on Section 230(c)(2)(A), which provides that "[n]o provider or user of an interactive computer service shall be held liable on account of...any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected." In focusing on this section, instead of on Section 230(c)(1), which provides broad immunity for posts made by third parties, the Court confirmed that identifying something as "objectionable" depends on the platform's subjective intent, and here was governed by Vimeo's policies. It also found unavailing evidence that other, similar videos remained on the platform despite arguably violating Vimeo's policies, confirming that a platform's enforcement efforts need not be perfect to take advantage of the safe harbor. As the Court noted, "the purpose of Section 230 was to insulate interactive computer services from liability for removing some content, but not other content...There simply are no substantive allegations to support the notion that Vimeo somehow was targeting Domen because he is a 'former homosexual,' as Plaintiffs posit." Moreover, the Court provided some guidance on the "good faith" requirement, finding that element satisfied because Vimeo's removal was "not anti-competitive conduct or self-serving behavior," but a straightforward application of its stated policies.

Undeterred by the decision, Domen sought rehearing. He argued--as many have lately--that Section 230 effectively shields platforms from lawsuits, and that the Court had failed to conduct any analysis confirming that Vimeo had acted in good faith, as Section 230(c)(2)(A) requires. In a relatively uncommon move, the Second Circuit granted the petition and vacated its initial opinion. However, just yesterday--and a mere week after granting rehearing--the Second Circuit delivered an opinion that...(surprise!) largely adopted its prior one. Aside from stylistic changes, the Court changed references from "immunize" to "protect"--even though the case was resolved on a motion to dismiss, which suggests that Section 230 operates more like an immunity than an affirmative defense. The Court also deleted references to its opinion being "broad," indicating that its decision would not apply to "providers acting in circumstances far afield from the facts of this case." It also added a section dismissing the claim on the merits. The overall result, however, remained the same.

Aside from the interesting procedure, this case may prove important as the first out of the Second Circuit to uphold a platform's ability to terminate user accounts based on their use (or possibly abuse) of the platform. It also signals that, even if Section 230(c)(2)(A)--and not the more forgiving Section 230(c)(1)--applies to account terminations, a plaintiff must do more than make conclusory allegations of bad faith to avoid dismissal. That said, the opinion does provide some guidance on what might count as bad faith, including anti-competitive or self-serving behavior.