Regulators in Europe and beyond have been ramping up their efforts to protect children online, both through new legislation and guidance and by promoting self-regulatory tools. We discuss below recent developments in the EU and UK on age verification online.
Currently, the options available to companies to verify users’ age online are limited. Available methods include self-declaration, parental consent, use of biometrics (including facial recognition technologies to ascertain physical features of the user’s face, or to check their correspondence with an ID picture), analysis of online usage patterns, and reliance on digital ID verification systems offered by some governments.
At the EU level, there is not yet a common stance on the best way to verify users’ age online, nor is there likely to be any time soon, as the methods selected should be calibrated to risk. There is, however, an increasing body of EU law requiring organizations to adopt some appropriate measures to verify age. For instance, the Audiovisual Media Services Directive requires the adoption of appropriate measures to protect children from harmful content, including through age verification. The GDPR contains rules on obtaining parental consent where consent is relied on as the legal basis for processing children’s personal data in the context of providing online services. The recently adopted Digital Services Act (“DSA”) also contains rules on protecting children online – including by not serving them targeted advertising based on profiling. (For more information on the DSA, see our previous blogpost here).
In May 2022, the European Commission announced its Strategy for a better internet for kids (BIK+), listing as a priority the strengthening of effective age verification. To this end, the Commission proposes to:
- develop an EU code for age-appropriate design by 2024, building on the framework of the DSA; and
- establish a European standard on online age verification in the context of the European Digital Identity (“eID”) proposal. The eID proposal would also enable minors to use their digital identity wallet to prove their age without disclosing other personal data.
In addition, the European Data Protection Board (“EDPB”) is expected to issue guidelines on children’s data and on the use of technologies for detecting and reporting online child sexual abuse, as announced in its 2023/2024 Work Programme (see our previous blogpost here).
National-level developments on age verification
National-level guidance on protecting children’s privacy – such as that issued by the French and UK data protection authorities – has identified age verification as an important safeguard. Below, we set out some recent developments relating to age verification at a national level:
The French Parliament is currently examining a legislative proposal to establish an age of “digital consent”. The current text of the proposal would require social network providers to implement certified technical solutions to verify users’ age and parental consent. The certifying authority would be the newly created ARCOM (Autorité de régulation de la communication audiovisuelle et numérique), which has competence over the audiovisual and digital communications sectors. ARCOM is expected to create a repository of tools, in consultation with the CNIL. The proposal has been adopted by the French National Assembly, and is currently being discussed within the Senate.
To date, the Italian Garante has not issued specific guidance on age verification, but is increasingly active in campaigns on children’s privacy more broadly. In a statement, the Garante emphasized that age verification systems are “indispensable” to protect children, particularly when accessing social networks.
In a recent investigation in 2023, the Garante ordered the temporary limitation of processing by a company operating an AI-powered chatbot. The Garante found that the chatbot’s inappropriate content posed concrete risks for minors and vulnerable subjects, and that the company had not established any age verification procedures, nor any mechanisms to ban or block access, even after the user explicitly declared that they were underage. The Garante did not, however, recommend a specific, effective method of age verification in its decision.
Earlier, in October 2021, the UK Information Commissioner’s Office (“ICO”) issued its opinion on Age Assurance for the Children’s Code, setting out guidance on the effectiveness and privacy concerns associated with various age assurance methods. The ICO calls for organizations to take a “risk-based” approach to age assurance, and stresses that any processing of personal data in connection with age assurance technologies should comply with applicable data protection law. The ICO recommends the use of appropriately certified suppliers – for example, providers approved by the Age Check Certification Scheme (“ACCS”), which checks that providers meet the current industry standard. (See our previous blogpost here). The ICO is likely to further clarify its views on the appropriate application of the Code in the months ahead.
Given the increasing focus of regulators across Europe on protecting children online, we expect further guidance to be issued and concrete schemes to be set up to help organizations verify the age of the users of their online services.