On February 27, 2019, the Federal Trade Commission (“FTC”) announced that it settled with the operators of a video social networking app for a record civil penalty of $5.7 million under the Children’s Online Privacy Protection Act (“COPPA”). This FTC COPPA action was notable not just for the size of the penalty, but also because of the joint statement by the two Democratic Commissioners, Rebecca Slaughter and Rohit Chopra, that future FTC enforcement should seek to hold corporate officers and directors accountable for violations of consumer protection law.
COPPA applies to operators of websites and online services directed to children under 13 that collect personal information from children, as well as to operators that have actual knowledge that they are collecting personal information from children under 13. It requires, among other things, notices to parents or guardians and verifiable parental consent before collecting personal information from children under 13. The app at issue, Musical.ly, also known as TikTok, allows users to create lip-syncing videos, which they can share publicly. Users can interact with each other by “following” accounts, commenting on videos, and sending direct messages.
According to the FTC, Musical.ly collected personal information from users including first and last name, online contact information, a short bio, the content of direct messages between users, and photos and videos containing users’ images and voices. User profiles were set to public by default. Even if a user changed the default setting from public to private, the user’s profile picture and bio remained visible to other users, and the user could still receive direct messages from any other user. In addition, the FTC claims that, for a time, the app collected device location information to enable users to see a list of other users within a 50-mile radius.
The FTC alleged that the app was governed by COPPA because:
- Musical.ly directly targeted children under 13 as an audience. The FTC alleged the app is an “online service directed to children under 13” because the app centers on a child-oriented activity (creating short lip-syncing videos), its music options included categories that appeal to children such as “Disney” and “school,” the app allowed users to send colorful emojis to each other, and a large percentage of users were found to be under the age of 13.
- Musical.ly had actual knowledge that the app collected information from children under 13. The app provided guidance to parents on its website about monitoring their children’s use of the app. Musical.ly allegedly received thousands of complaints from parents that their children under the age of 13 had created accounts without their knowledge. Though the app began age screening new users in 2017, it did not do so for existing users. Moreover, the FTC claimed that the age of the user base is readily apparent from profile pictures and bios that are visible to all users even if the default setting is changed from public to private. The FTC’s complaint referenced “many public reports of adults trying to contact kids” via the app.
The FTC alleged that Musical.ly violated COPPA by:
- collecting information from children under the age of 13 without providing prominent notice on the service or direct notice to parents of the app’s practices regarding children’s personal information;
- not obtaining verifiable parental consent before collecting or using the personal information of children under 13;
- failing to delete personal information of such children when directly requested by parents; and
- retaining personal information of children for longer than reasonably necessary to fulfill the purpose for which it was collected. In response to parental complaints regarding the creation of accounts by children under 13, Musical.ly allegedly disabled the accounts, but did not delete the content associated with the accounts.
In addition to the $5.7 million monetary penalty, Musical.ly has agreed to either destroy all personal information in its possession, custody, or control that is associated with the user accounts of children under the age of 13, or obtain verifiable parental consent for those users.
In a joint statement, Commissioners Rebecca Slaughter and Rohit Chopra stated that “individuals at large companies have often avoided scrutiny” during FTC investigations. Slaughter and Chopra called for change, urging the Commission to “prioritize uncovering the role of corporate officers and directors” and holding responsible individuals accountable. None of the three Republican Commissioners joined in the statement. Notably, the joint statement coincides with an uptick in at least some consumer protection authorities pursuing theories of individual, and not just organizational, liability for violations of privacy and other consumer protection laws.