Children are present online at an ever-younger age, and the risks associated with the processing of their personal data by platforms such as TikTok have recently come under increasing regulatory scrutiny in Europe. Following the death of a 10-year-old girl in Italy, TikTok was banned from processing the data of users whose age could not be definitively established.
Under the GDPR, children have the same rights as adults over their personal data, and a lawful basis is required in order to process a child's personal data. In September 2020, the UK's Information Commissioner's Office (the "ICO") launched its "Age appropriate design: a code of practice for online services" and was the first European data protection authority ("DPA") to identify the protection of children's data as "a regulatory priority".
Even with the GDPR in force, and growing media scrutiny surrounding the use of children's personal data, it appears that, in some cases, websites and apps popular with children are overlooking the requirement for a lawful basis. The legal and regulatory risk for those who operate these sites is increasing, and some EU DPAs have recently imposed bans on, or launched investigations into, social media apps that overlook such requirements.
TikTok finds itself at the forefront of regulatory and legal challenges
As highlighted above, earlier this year the Italian DPA, the Garante, imposed an immediate limitation on data processing performed by TikTok with regard to the data of users whose age could not be established with certainty. The Garante took urgent regulatory action (under Articles 58 and 66 GDPR) following the death of a 10-year-old girl from Palermo. The ban will apply provisionally until 15 February, by which date the Garante plans to conclude its further assessment.
The Garante had separately opened an investigation into TikTok over its handling and protection of children's data. The Garante alleged that TikTok's practices are "non-compliant with the new regulatory framework for the protection of personal data", claiming in particular that TikTok's safeguards do not adequately address situations involving minors, and that there may be issues with its data retention and consent practices.
In 2019, the ICO confirmed an investigation into TikTok for its practices in handling children’s data. The ICO launched its investigation following the U.S. Federal Trade Commission’s decision to fine TikTok $5.7 million for “infractions related to the collection of personal information of users 13 and under”. The ICO’s investigation is still ongoing, with the intention to publish its findings later this year.
TikTok: the new representative action
TikTok Inc. and its parent company ByteDance Ltd are facing a legal challenge in the UK. The lead claimant is an anonymous 12-year-old girl who uses TikTok, with Anne Longfield OBE acting as her litigation friend. The claim alleges that TikTok illegally collects children's private information in the UK and the European Economic Area ("EEA").
The press release outlines the personal information allegedly collected by TikTok and ByteDance, which includes children's telephone numbers, videos, pictures, and exact location, along with biometric or facial recognition data. More specifically, the claim alleges that TikTok and ByteDance have violated the UK Data Protection Act 2018 and Articles 5, 12, 14, 16, 25, 35, and 44 of the GDPR.
The legal claim is being pursued as a ‘representative action’, and the website (TikTok Data Claim UK) describes the ‘class’ by stating that “all children in the UK and the EEA that used the TikTok app since 25 May 2018, regardless of whether or not they had an account, may have had their private information illegally collected and transferred to unknown third parties for profit.” To date, research conducted by the claimants suggests there are over 3.5 million affected children in the UK alone.
The claim was filed in the High Court of England and Wales in December 2020. The claim is currently ‘stayed’ pending the outcome of the Lloyd v Google case, which is being heard by the Supreme Court at the end of April 2021. The outcome of Lloyd v Google will no doubt have an impact on how the case is pursued.
Instagram for Under-13s?
Despite the current legal and regulatory action facing TikTok, Mark Zuckerberg, the chief executive of Facebook, announced in April 2021 that Facebook is developing an Instagram service for under-13s. Instagram is currently unavailable to under-13s because children have special legal protections, making it harder for businesses to collect their data and to target them with ads.
Facebook has announced that its proposed work-around is for the app to be “managed by parents”, meaning that it could require parental consent at sign-up. A Facebook spokeswoman said that the app is in its “early stages” and that Facebook wants to “help children connect in a way that is safe and age-appropriate”.
To date, an international coalition of 35 children’s and consumer groups has called on Instagram to scrap its plans, with many privacy advocates voicing concerns about children’s privacy and safety, particularly around self-image and mental health.
With the growing scrutiny facing social media giants over their use of children’s data, it is positive to see the importance that EU DPAs are placing on the protection of children’s privacy rights. This is an area to watch over the coming months, especially for companies that promote or sell their services on social media.