Data protection is just one area of regulation which helps to protect children in the online environment. It's no good getting data protection right while ignoring rules on marketing and advertising to children, or failing to protect them from other online harms such as online gambling or access to inappropriate content.
There are a number of areas where clear rules do exist and these must be considered in addition to special rules on the protection of children's data.
Governments around the world are still struggling to deal with how to protect individuals, in particular children, online without impinging on freedom of speech and without making platforms entirely responsible for third party content. There is a lot of activity in this space which is likely to lead to further guidance and regulation. We look at some key areas, both current and emerging.
The Audiovisual Media Services Directive 2018/1808 (AVMS) establishes the regulatory framework for broadcasters and providers of audiovisual content in the EU, and extends Directive 2010/13/EU to cover video-sharing platforms. The Directive takes into account that the protection of children must always be balanced against other important values such as freedom of expression, and recognises that it cannot work without parental responsibility. Member States have until September 2020 to implement the updated provisions to safeguard children from harmful content. Key provisions include:
- Member States shall ensure that media service providers provide sufficient information to viewers about content which may impair the physical, mental or moral development of minors. This may include a system of content descriptors, an acoustic warning, a visual symbol or any other means, describing the nature of the content.
- Member States shall take appropriate measures to ensure that audiovisual media services provided by media service providers under their jurisdiction which may impair the physical, mental or moral development of minors are only made available in such a way as to ensure that minors will not normally hear or see them. Such measures may include selecting the time of the broadcast, age verification tools or other technical measures. They shall be proportionate to the potential harm of the programme.
- The most harmful content, such as gratuitous violence and pornography, shall be subject to the strictest measures. These include encryption and effective parental controls.
- Personal data of minors collected or otherwise generated by media service providers shall not be processed for commercial purposes, such as direct marketing, profiling and behaviourally targeted advertising.
- Audiovisual commercial communications for alcoholic beverages should not be aimed specifically at children and Member States must encourage codes of conduct to reduce the exposure of children to these communications.
- Audiovisual commercial communications shall not directly exhort children to buy or hire a product by exploiting their inexperience or credulity, directly encourage them to persuade their parents or others to purchase the goods or services being advertised, exploit the special trust children place in parents, teachers or other persons, or unreasonably show children in dangerous situations.
- Children must also be protected from harmful content and hate speech on video-sharing platform services.
The UK government is currently consulting on how to implement the Directive.
The Advertising Standards Authority (ASA) is the UK's independent advertising regulator. The Committee of Advertising Practice (CAP) is the ASA's sister organisation, responsible for writing the advertising codes. The ASA ensures that advertisements across UK media adhere to the Advertising Codes.
The UK CAP Code sets out rules that apply to adverts and other marketing communications. The UK Code of Broadcast Advertising (the BCAP Code) applies to all advertisements on radio and television services licensed by Ofcom, including teleshopping, content on self-promotional television channels, television text and interactive TV ads. Key rules relevant to children include:
- In general, marketers must take into account children's reduced ability to recognise and critically assess the purposes behind the processing and the potential consequences of providing their personal data. In particular, marketers should not exploit any lack of understanding or vulnerability on the child's part.
- A Data Protection Impact Assessment (DPIA) must be completed before sending direct marketing messages to individual children or using personal data to display targeted adverts in an online context.
- Following guidance issued in January 2018, advertisements in the UK should not portray or represent anyone who is or seems to be under 18 years old in a sexual way.
- Following guidance issued in June 2019, advertisements can still be targeted at and feature a specific gender, but they should take care not to explicitly convey that a particular children's product, pursuit or activity (including choice of play or career) is inappropriate for any gender.
- Both the BCAP Code and the CAP Code state that marketing communications or advertisements must not exploit susceptibilities, aspirations, credulity, inexperience or lack of knowledge on the part of children or young persons. Both Codes state that marketing communications or advertisements must not be likely to be of particular appeal to children or young persons by reflecting or being associated with youth culture.
CAP has also released new standards which aim to further protect children and young people from irresponsible gambling advertisements. For the purposes of these rules, a child is an individual under the age of 16 and a young person is an individual aged 16 or 17. The new guidance came into effect on 1 April 2019 and applies to marketing communications in all media, including online channels such as social media. The guidance is based on previous ASA rulings; it does not create new rules and does not bind the ASA Council when considering complaints. It does, however, set out the relevant CAP Code and BCAP Code rules in greater detail.
- The Codes advise marketers to exercise caution in preparing communications to ensure they abide by the relevant standards. These rules apply in spirit as well as in letter, whether or not the gambling product is shown or referred to, and they apply to 'free play' as well as 'pay to play' games.
- CAP Code rules state that marketing communications must be socially responsible and must ensure that children and young persons are protected from being harmed or exploited by advertisements that feature or promote gambling.
- The BCAP Code states that advertisements must not portray, condone or encourage gambling behaviour that is socially irresponsible or could lead to financial or emotional harm.
The guidance goes into further detail and provides tangible examples of what will be considered inappropriate when marketing gambling products to children.
- Content will be considered irresponsible if it features under 18s playing a significant role, or if it is directed at under 18s by being placed in media aimed at that group or a sub-age category.
- Content must not appear in media for children or young persons (for which they make up more than 25% of the audience).
- Similarly, content must not otherwise encourage under 18s to engage directly in potentially harmful behaviour. This includes avoiding colourful, exaggerated, animated characters (such as children's cartoon characters, animals, pirates or fairy tale characters), childlike imagery or narratives such as nursery rhymes or children's stories, and tropes or specific characters familiar to children.
- The guidance also recommends considering carefully whether humour used in content resonates with under 18s, referring specifically to slapstick and juvenile humour.
- Marketers and advertisers should be cautious about using sports people or celebrities in marketing communications, adverts or through endorsement agreements. The guidance recommends considering whether they have a significant profile among under 18s, particularly where they are sports or reality TV stars.
- Additionally, the guidance states that nobody who is (or seems to be) under 25 should appear in any gambling content that is targeted at children.
Age verification rules for adult content
The UK intends to bring in age verification for online adult content. Commercial providers of online pornography, where more than a third of the content is pornographic, will be required by law to carry out age verification checks to ensure users are 18 or older. Websites that fail to implement age verification technology face sanctions, including having payment services withdrawn or blocked.
This new approach is the first of its kind in the world, leading the way in internet safety for children and aiming to replicate online the protections that exist offline. However, implementation has been delayed because the government failed to notify the European Commission of the plan. Notification takes at least six months, so it remains to be seen when the rules will come into force.
The rules have been widely criticised as unworkable and easy to circumvent and may yet change again as technology develops.
It's worth remembering that terms and conditions and policies which apply to children need to be written in a way they can understand. This can obviously be a challenge: it is hard enough to produce clear terms and conditions which adult users will actually read online, let alone children, and the added complication is that understanding can vary dramatically depending on which age group is targeted. In England and Wales, children have capacity to enter into a contract from the age of seven, although they can also withdraw from it at any point. This means that particular care has to be taken where children are likely to engage with terms and conditions.
The UK Science and Technology Committee has concluded that social media companies must be subject to a legal duty of care to protect young people's health and wellbeing when accessing their sites, following the findings of its report published on 31 January 2019.
The Committee reports that there is a "standards lottery" among social media companies whose practices differ widely across the industry, and the Committee has advised establishing a regulator to address this. It suggests the regulator would be responsible for:
- Providing guidance on how to spot and minimise the harms social media presents.
- Taking enforcement action when warranted.
- Establishing a strong sanctions regime in order to be effective.
UK government White Paper on online harms
The UK government published a White Paper on online harms in April 2019. It sets out the government's plans for a world-leading package of online safety measures which also supports a thriving digital economy and technological innovation. The White Paper proposes legislative and non-legislative measures to make businesses more responsible for their users' safety online, with a particular focus on promoting and protecting the safety of children.
Recommendations made in the White Paper include appointing an independent regulatory body to implement, oversee and enforce a new regulatory framework, establishing appropriate enforcement powers, implementing potential redress mechanisms for online users, and measures to ensure that regulation is targeted and proportionate for the industry.
The proposed regulatory framework is intended to apply to companies which allow users to share or discover user-generated content or interact with each other online. This includes social media companies, public discussion forums, retail sites which allow online product reviews, file sharing sites, instant messaging services, search engines and cloud hosting providers.
Just reading the introduction to the White Paper makes you aware of how complex this space is. The concept of online harms ranges from currently illegal activities to unacceptable content, covering online radicalisation, dissemination of online child sexual abuse, online disinformation, use of social media to promote gang violence and incite violence, and harmful online behaviours including bullying or promotion of content relating to self-harm or suicide.
Deciding what is unacceptable without infringing free speech is one of the most difficult challenges to overcome, but whatever the approach, it is certain that more care will need to be taken in all of these areas where children are concerned. We expect further developments here, both in legislation and in guidance.
Looking out for children
As with the GDPR, other rules around protecting children online are often enhanced versions of the rules which apply more generally. It is not always clear where the boundaries lie, especially when it comes to user generated content and private communications. Businesses need to follow a patchwork of applicable legislation, rules and guidance, but they always need to consider that the interests of the child are paramount.