Online services have until 31 May to respond to 16 draft standards of age-appropriate design.
The ICO is required by s123 of the Data Protection Act 2018 to prepare a code of practice which contains guidance on standards of age-appropriate design of relevant information society services likely to be accessed by children. On 15 April, the ICO published a draft code of practice on age-appropriate design for online services (the Code). A copy of the Code can be found here.
Who does the Code apply to?
The Code is aimed at Information Society Services (ISS), defined as “any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services”. In practice, this definition extends to almost all online services, including apps, websites, social media platforms, online messaging services, online marketplaces, content streaming services, and even news and educational websites.
The reference to “remuneration” is often seen as confusing. However, the ICO has clarified that remuneration covers services funded by advertising, including those provided to end users free of charge.
ISS should also note that the Code applies if children (i.e. persons under 18) are likely to use the service. This includes services that are designed specifically for children, as well as those that may appeal to children or those that were designed for adults but have, in fact, attracted children.
What is the legal status of the Code?
The ICO must take the Code into account when considering whether an online service has complied with its data protection obligations under the GDPR and the Privacy and Electronic Communications Regulations (PECR), and any failure to act in accordance with the Code will inevitably make it difficult to demonstrate compliance with both. The ICO will also take the Code into account when considering questions of fairness, lawfulness, transparency, and accountability under the GDPR, as well as in the use of its enforcement powers. The Code can also be used in evidence in court proceedings.
When will a final draft of the Code be published?
The ICO intends that the Code will be finalised by the end of the year. However, the Code must be approved by Parliament before the final version is published.
Requirements of the Code
The Code sets out practical measures that ISS should comply with in order to ensure processing of children’s personal data is deemed “fair” under the GDPR. The Code centres around 16 standards of age-appropriate design for ISS:
- A child’s best interests should be a primary consideration for ISS in developing online services. This is the predominant standard of age-appropriate design: the best interests of children should be a primary consideration when designing and developing online services that children are likely to use. The Code focuses on the needs of children, including (amongst others) ensuring that children are safe from exploitation and that ISS protect their development. If the best interests of children conflict with an ISS’ commercial interests, then the children’s best interests should be the primary consideration.
- ISS should consider the age range of their audience and the needs of children of different ages. If ISS cannot determine which users are children, the same safeguards must be applied to all users. In practice, an ISS that cannot distinguish between adult and child users should either (i) consider putting robust age-verification mechanisms in place; or (ii) apply the same safeguards to all users by default. In the case of the latter, the ISS has the option to offer robust age-verification mechanisms to allow adults who can prove their age to opt out of some or all of the relevant safeguards.
- ISS should provide concise, prominent, and clear privacy information that is suited to the age of a child. The Code recommends providing “bite-sized” explanations about the use of children’s personal data at the point they enter their personal data. Helpfully, the Code also includes suggestions on how to meet the transparency obligations for various age ranges; for example, the Code suggests that if a child aged between six and nine attempts to change a default high privacy setting, the ISS should tell the child to either leave the setting as it is or seek help from a parent before changing the setting.
- ISS must not use children’s personal data in ways that have been shown to be detrimental to their wellbeing. The use of children’s personal data should not be detrimental to their wellbeing or run counter to government advice, regulatory provisions, or industry codes of practice (such as the CAP guidance on online behavioural advertising). Where there is currently no formal government position on a particular matter, ISS should take a precautionary approach; for example, the Code specifically notes that strategies used to extend user engagement should not be used with children until a formal position is adopted.
- ISS must ensure that settings are high privacy by default (unless there is a compelling reason to the contrary, taking account of a child’s best interests). The Code provides numerous practical examples on how organisations can comply with this requirement. For example, the Code states that children’s personal data should only be visible to other users to the extent that a child allows this visibility and, unless the settings are changed, ISS should only use children’s personal data if the use is essential for providing the service. In addition, if a child attempts to change a privacy setting, ISS should provide age-appropriate explanations and prompts.
- ISS should collect and retain only the minimum amount of personal data needed to provide the service. The Code notes that ISS should avoid collecting real world identifiers if possible, using options such as usernames and avatars instead. The collection of children’s personal data should not be “bundled” in order to provide an enhanced service alongside the core service.
- ISS should not disclose children’s personal data unless they can demonstrate a compelling reason to do so, taking account of the best interests of a child. ISS should not share personal data if they can reasonably foresee that third parties will use this data in ways that are detrimental to the wellbeing of a child. This measure is likely to result in ISS requiring assurances from third parties as to their use of children’s personal data and undertaking extensive due diligence checks as to the adequacy of third party data protection practices.
- ISS should ensure that geolocation is switched off by default. ISS must comply with this standard, unless there is a compelling reason to the contrary, taking account of the best interests of a child. In addition, an obvious sign should appear on a user’s screen when location tracking is active. Options that make a child’s location visible to others must default to off at the end of each session.
- ISS must give children age-appropriate information about parental controls. ISS must make it clear to a child whether parental controls are in place and whether an ISS is tracking or monitoring their online activity. The Code provides guidance for various age ranges on how an ISS can demonstrate that they have fulfilled this requirement.
- Any options that involve profiling children should be switched off by default (unless there is a compelling reason for profiling, taking into account the best interests of the child). This means that the legal basis for profiling will, in the majority of circumstances, be consent rather than legitimate interests with a right to object. That said, as profiling can serve a wide range of purposes, there may be circumstances in which ISS can demonstrate a compelling reason for profiling; the ICO cites child protection and safeguarding as examples. In these circumstances, profiling can be switched on by default and, the ICO states, there need not be an option to switch it off. Where an ISS does not have a compelling reason for profiling, the Code requires the ISS to be clear about the purposes for which the service uses personal data at the point at which a child can turn profiling on, and to provide prompts to seek assistance from an adult. If an ISS uses profiling, it must put in place appropriate measures to safeguard the child and ensure they are not “fed” or presented with content that is detrimental to their health. These measures could include contextual tagging, robust reporting procedures and elements of human moderation. The Code includes several examples of content that may be detrimental to children’s health and so should not be presented to them, including violent content, music labelled as explicit and strategies used to extend user engagement.
- ISS should not use nudge techniques to lead or encourage children to provide unnecessary personal data, turn off privacy protection, or extend use. ISS should not use nudge techniques such as reward loops or positive reinforcement techniques (such as likes and streaks) that lead or encourage children to provide unnecessary personal data, extend their use or weaken or turn off their privacy protections. That said, the Code is in favour of using “pro-privacy” nudges which, for example, encourage children to maintain a high level of privacy.
- Connected toys and devices must include effective tools to enable compliance with the Code. Connected toys and devices should make clear which organisation is processing personal data. If third parties are involved in providing the product, an ISS should carry out due diligence to mitigate security risks. Of particular significance to ISS providing connected toys and devices is that clear information about the product’s use of personal data should be provided at the point of purchase, before the user buys or sets up the device.
- ISS must provide prominent and accessible tools to help children exercise their data protection rights and report concerns. The Code focuses on ISS giving prominence to reporting tools during the registration or set-up process. Tools enabling data subject rights should be tailored to the age of a child, should be specific to the rights they support, and should include mechanisms for tracking progress and facilitating communication between the ISS and its users so that, for example, a user can indicate if there is an urgent or safeguarding issue.
- Data protection impact assessments should be undertaken to assess and mitigate risks to children. The GDPR requires a data protection impact assessment (DPIA) to be carried out where a controller begins any type of processing that is likely to result in a high risk to the rights and freedoms of individuals. The ICO clarifies in the Code that the nature and context of online services within the scope of the Code inevitably involve a type of processing likely to result in a high risk to the rights and freedoms of children. This aligns with the ICO’s list of circumstances in which a DPIA is required, published pursuant to Article 35(4) of the GDPR (see here for more information). This list supplements the GDPR criteria and European guidelines and includes a requirement to undertake a DPIA for “the use of personal data of children or other vulnerable individuals for marketing purposes, profiling or other automated decision making, or if you intend to offer online services directly to children”. On this basis, ISS should carry out a DPIA in respect of new functionalities, specifically to assess and mitigate risks to children who are likely to access the service, taking into account differing ages, capacities and development needs.
- ISS should ensure policies and procedures are in place to demonstrate compliance with data protection obligations, including data protection training for all staff involved in the design and development of online services likely to be accessed by children. Such a compliance programme should involve reporting against the standards in the Code, developing policies that demonstrate compliance with data protection obligations, and adequately training staff.
The consultation remains open until 31 May 2019. For more information, see here.