IP & IT analysis: How far will the adoption of the Article 29 Working Party's Opinion (02/2013) on apps held on smart devices, which cites a lack of transparency and of free and informed consent, go in alleviating users' and regulators' concerns? Philip James, Partner in Pitmans SK Sport and Entertainment Group, considers the future for app developers and site operators.

Original news

Article 29 Working Party Opinion 02/2013 on apps held on smart devices

This opinion clarifies the legal framework surrounding the processing of personal data in the development, distribution and usage of apps on smart devices. Particular focus is placed on the requirements surrounding consent, the principles of purpose limitation and data minimisation, the adequacy of security measures and the fair processing of data collected from and about children.

What are the obligations of app developers and platforms (such as Google and Facebook) in relation to the personal data of users?

Apps can collect large quantities of data, including sensitive personal data, stored on the user's device, and can also store and access metadata on that same device. When it comes to accessing, storing and passing on this data, app developers are bound by a number of obligations.

Consent

Their primary obligation is to obtain valid consent under the Data Protection Directive 95/46/EC (in relation to personal data) and, more recently, under the ePrivacy Directive 2002/58/EC in relation to data which may potentially infringe an individual's privacy and private sphere. The opinion advises that this is more than merely clicking an 'I accept' button: users must be provided with more granular information explaining:

  • what data will be collected
  • who will access it
  • what it will be used for

Layered consent mechanisms and transparency are recommended and 'just-in-time' consents are advised, although, used on their own, these will not necessarily comply with legal requirements. It is important to bear in mind, however, that consent is not always required (for instance, personal data may be processed where it is necessary for legitimate commercial interests or, say, for technical purposes or to satisfy a contract at the request of the user).
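
By way of illustration only, the sketch below (plain Kotlin, with hypothetical names such as ConsentPurpose and ConsentManager that are not drawn from the opinion) shows one way an app might model granular, per-purpose consent: each purpose records what data is collected, who will access it and what it will be used for, and the user is prompted just in time, with a genuine Accept/Cancel choice, rather than being shown a single 'I accept' button.

```kotlin
// Hypothetical sketch of granular, per-purpose consent; not a definitive implementation.

// Each purpose spells out what is collected, who accesses it and why,
// mirroring the granular information the opinion says users must be given.
data class ConsentPurpose(
    val id: String,
    val dataCollected: String,
    val accessedBy: String,
    val usedFor: String
)

// A consent decision is stored per purpose, with a timestamp so it can be evidenced later.
data class ConsentRecord(
    val purposeId: String,
    val granted: Boolean,
    val timestampMillis: Long
)

class ConsentManager {
    private val records = mutableMapOf<String, ConsentRecord>()

    // Privacy by default: absent an explicit, affirmative choice, consent is treated as refused.
    fun hasConsent(purposeId: String): Boolean =
        records[purposeId]?.granted ?: false

    // Just-in-time prompt: ask at the moment the data is first needed,
    // offering a real choice (Accept / Cancel), not a single 'I accept' button.
    fun requestConsent(purpose: ConsentPurpose, askUser: (ConsentPurpose) -> Boolean): Boolean {
        if (hasConsent(purpose.id)) return true
        val granted = askUser(purpose)
        records[purpose.id] = ConsentRecord(purpose.id, granted, System.currentTimeMillis())
        return granted
    }

    // Consent should be as easy to withdraw as it was to give.
    fun withdraw(purposeId: String) {
        records[purposeId] = ConsentRecord(
            purposeId = purposeId,
            granted = false,
            timestampMillis = System.currentTimeMillis()
        )
    }
}

fun main() {
    val locationForAds = ConsentPurpose(
        id = "location-advertising",
        dataCollected = "Approximate location",
        accessedBy = "App developer and named advertising partner",
        usedFor = "Showing location-relevant advertisements"
    )
    val manager = ConsentManager()
    // In a real app this lambda would display a dialog; here we simulate the user declining.
    val permitted = manager.requestConsent(locationForAds) { p ->
        println("${p.dataCollected} will be accessed by ${p.accessedBy} and used for ${p.usedFor}. Accept?")
        false
    }
    println("Processing permitted: $permitted")
}
```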

Targeted collection   

Developers must ensure that the data collected is minimised, that its processing is limited to the purpose for which the app requires it, and that adequate systems are in place to protect the personal data they hold. In this regard, organisations should seek guidance from the ICO's Anonymisation Code. Reducing the quantity of data reduces cost (eg storage, distribution, analysis, mining), improves efficiency, protects privacy and increases accuracy, whilst reducing potential liability and risk.

Developers and platform operators alike should embed:

  • privacy by default in technical mechanisms
  • privacy by design in research and product development

This makes for good business practice: it can significantly reduce cost, drive product improvements, build trust and, in turn, protect reputation.
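
As a purely illustrative sketch (the names are hypothetical, not taken from the opinion), 'privacy by default' can be as simple as shipping every optional data flow switched off until the user opts in, and 'privacy by design' as collecting only the coarsest data a feature actually needs:

```kotlin
// Hypothetical sketch of privacy by default and privacy by design; names are illustrative only.
import kotlin.math.pow
import kotlin.math.round

// Privacy by default: every optional data flow ships disabled, so nothing is
// processed until the user actively opts in.
data class PrivacySettings(
    val personalisedAds: Boolean = false,
    val analytics: Boolean = false,
    val shareWithPartners: Boolean = false
)

// Privacy by design / data minimisation: if a feature only needs city-level
// location, collect coarse coordinates rather than a precise fix.
fun coarsen(latitude: Double, longitude: Double, decimals: Int = 1): Pair<Double, Double> {
    val factor = 10.0.pow(decimals)
    return Pair(round(latitude * factor) / factor, round(longitude * factor) / factor)
}

fun main() {
    val defaults = PrivacySettings()
    println(defaults) // all optional processing disabled until the user chooses otherwise
    println(coarsen(51.507351, -0.127758)) // (51.5, -0.1): enough for local content, far less intrusive
}
```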

What are the implications for other parties involved in the development, distribution and processing of mobile apps in the EU?

Other parties, such as app stores, operating system providers and device manufacturers, have an obligation to regulate developers, as well as to put in place and improve standards for application programming interfaces (APIs) to control access to sensitive information. These requirements can be enforced through such parties' admission policies.

Advertisers and analytics providers fall into one of two categories: they either process data on behalf of the app owner or process data for their own purposes. If the latter, as with providers involved in behavioural targeting, they will share with the app developer the obligations to obtain appropriate consent and to have appropriate security measures in place. Site owners and platform operators will need to consider carefully how they contract with third-party advertisers and analytics providers, so as to manage risk and liability and to delineate clearly the responsibility for achieving, and reporting on, compliance.

Why has the Article 29 Working Party considered it necessary to address apps specifically in its recent opinion on apps held on smart devices?

Apps are considered to have a particularly close relationship with the operating system of a device compared to the traditional web browser. This enables apps to collect large quantities of data from the device, much of which is personal.

The risk from a data protection perspective arises from the fragmentation of the app provider landscape, which spans device manufacturers, operating system providers, platform developers, app developers and other third parties.

This fragmentation can result in:

  • poor security measures
  • a lack of transparency as to responsibility
  • a lack of appropriate consent
  • indiscriminate data-gathering

Given the proliferation of apps and the success of many developers, the challenge is to police privacy without restricting or impeding innovation. This is a constant juggling act, but the scales are currently tipped too far in favour of developers. It is argued that platform owners and network providers should act as gatekeepers to help manage data privacy and users' trust in apps.

What steps should app developers and others be taking to ensure they comply with the law? 

Appropriate consent forms the backbone of the opinion.

Freely given consent means that, if an app needs to process personal data, the user should not be confronted with a single 'I accept' button, but should be given a choice to 'Accept' or to (eg) 'Cancel'. Informed consent means the data subject must have the necessary information to make an informed decision before any data is processed. The consent must also be specific, meaning the label of the button or link to be clicked must be specific to the category of data to be processed.

How is the issue of informed user consent likely to be dealt with in practice, particularly in relation to children?

When it comes to children, developers should be aware of their potentially limited understanding of data protection issues. Because of this vulnerability, the principles of data minimisation and purpose limitation apply even more strictly to children. For example, developers and third parties should not process children's personal data for behavioural targeting, as this is likely to be beyond the scope of a child's understanding. The opinion also specifically prohibits collecting any data relating to the parents or family members of the young user.

Security is also a fundamental consideration and those commissioning, operating or licensing apps should conduct due diligence as part of the procurement process to manage security and provide for management audit and reporting requirements.

Interviewed by Nicola Laver. The views expressed by our Legal Analysis interviewees are not necessarily those of the proprietor.

This article was originally published on the Current Awareness service on LexisLibrary on 9th April 2013.