Pete (Sales): “Hi. I want to email our Loyalty Programme members so they know about this AMAZING new deal.”
In-house: “Do we have their consent?”
Pete (Sales): “Their what?”
In-house: “Their consent.”
Pete (Sales): “Dunno. But the deal is fantastic. They’ll definitely want to hear about it. And we have such rich data these days we can really target this one. It’s a WIN WIN!”
Sound familiar? The explosion of marketing initiatives which use highly targeted information is now a daily challenge for many in-house counsel.
This update examines two key questions which are central to this challenge, in light of evolving privacy and consumer protection legislation:
Can customer data be used for a new purpose that was not contemplated (or disclosed) at the time that the data was collected?
How can privacy policies and ‘consent clauses’ be ‘future proofed’?
Marketing initiatives that present challenges for privacy often stem from the continual emergence of new technology to monetise customer databases and loyalty programmes, such as:
Mobile apps - the ubiquitous use of mobile apps has changed the way that customer information is collected, disclosed and used. For example, geotracking data and user analytics may be passively provided by users and this information may be shared with other traders, as well as cloud and other service providers. Customer information may also be used to personalise a user's experience and for real-time location based marketing. Last week the Privacy Commissioner released new ‘Need to know or nice to have’ guidelines which are aimed at helping app developers understand the application of the Privacy Act.
‘Big Data’ and data mining - exponential growth in the volume and detail of information captured by businesses has been fuelled by the rise of social media, mobile apps and the ‘Internet of Things’. The sharing of data between businesses, to gain a broader, more complete perspective on consumer behaviour, is also increasing. For example, there are a number of shared loyalty scheme cards which provide aggregated consumer data to non-competing businesses.
The cloud - cloud-based storage and cloud services present novel privacy issues, particularly where the location of data servers necessitates the transfer of personal information outside New Zealand and where cloud-based service providers have the ability to use personal information for their own purposes (such as the improvement of their services and big data analytics).
1. Can customer data be used for a new purpose that was not contemplated (or disclosed) at the time that the data was collected?
The general obligation to obtain individuals’ prior consent to use of their personal information is well known. It stems from the:
Privacy Act 1993 (Privacy Act);
Unsolicited Electronic Messages Act 2007 (UEMA);
Fair Trading Act 1986 (FTA) (as a result of new ‘uninvited direct sales’ provisions); and
obligations of confidentiality (contractual and under common law).
It is best practice to ensure that individuals’ prior informed express consent is obtained before their personal information is collected, disclosed or used.
However, obtaining customers’ consent, in a way that is sufficiently broad and forward-thinking to adapt to evolving marketing and technology, is a constant challenge. One key step in the process is to understand and utilise the flexibility that is offered through the following exemptions to the general consent obligation.
Is there any risk of harm to individuals?
A legally enforceable breach of the Privacy Act only crystallises when there has been an “interference with privacy”.
This requires that, in the opinion of the Privacy Commissioner or the Human Rights Review Tribunal (as the case may be), the breach has caused or may cause actual harm to the individual concerned, such as detriment, loss, significant humiliation, or any adverse effect on their rights or interests.
The requirement for actual harm is also emphasised in specific principles. For example, there is an exemption to the general principle that information must be collected directly from the individual concerned if “non-compliance would not prejudice the interests of the individual concerned”.
The nature and sensitivity of the personal information that is being used will be critical to assessing whether there is a risk of harm for the purpose of the Privacy Act. It is also essential to consider whether:
irrespective of an actionable breach of privacy, the project may nonetheless lead to a perceived non-compliance with the Privacy Act and undermine brand values and trust; and
the intended use involves electronic messages, in which case the express or inferred consent of users is required under the UEMA (irrespective of whether there is a risk of harm).
Can the information be anonymised?
Data in a truly anonymised form is not “personal information” (as defined under the Privacy Act) and its use is therefore not subject to the requirements of the Privacy Act. Accordingly, there is much greater flexibility to use data (such as ‘Big Data’) if it can be limited to anonymised data which identifies trends and patterns, without the potential to identify any particular individual.
If information is not “personal information” for the purposes of the Privacy Act, it will also not be subject to the UEMA and, generally, it will not be “confidential information” (subject to specific contractual confidentiality terms).
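To make the anonymisation point concrete, the following is a minimal sketch of the kind of aggregation that yields trend data without the potential to identify any individual. The field names, the region grouping and the suppression threshold are illustrative assumptions only, not requirements drawn from the Privacy Act or any guideline.

```python
# Illustrative sketch only: reducing a customer dataset to aggregated
# trend data. Field names and the k=5 threshold are hypothetical.
from collections import Counter

def anonymise(records, k=5):
    """Drop direct identifiers and keep only counts per region,
    suppressing any group too small to hide an individual."""
    counts = Counter(r["region"] for r in records)
    # Groups smaller than k are suppressed, since very small groups
    # risk re-identifying the individuals within them.
    return {region: n for region, n in counts.items() if n >= k}

customers = (
    [{"name": "A. Smith", "region": "Auckland"}] * 7
    + [{"name": "B. Jones", "region": "Otago"}] * 2
)
print(anonymise(customers))  # {'Auckland': 7}
```

Note that whether a dataset is “truly anonymised” is a legal and factual question (re-identification risk must be assessed), not something a simple transformation guarantees.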
Is the information limited to corporate information?
If information can be limited to corporate information, and not information about individuals, it will not be “personal information” (as defined under the Privacy Act) and will therefore fall outside the requirements of the Privacy Act. However, if the message is an electronic message, the UEMA may still apply.
Can consent be inferred? Or is the purpose directly related to the purpose for which the information was collected?
The Privacy Act and UEMA permit consent to be inferred in some circumstances. For example, the UEMA includes an acknowledgement that consent can reasonably be inferred from the conduct, business and other relationships of the persons concerned. Similarly, the Privacy Act includes a number of exemptions for the use of personal information for a “directly related” purpose.
If the information is disclosed to a third party, are they acting solely as an agent?
The Privacy Act ‘looks through’ agents and deems them to be the principal for the purposes of the Privacy Act, provided that any use by the agent is limited to the safe custody or processing of the information and the agent does not use or disclose the information for its own purposes. Therefore, provided a third party service provider is solely acting as an agent, disclosure to the service provider is permitted under the Privacy Act, even if express consent to such disclosure has not been obtained.
Applying these exemptions to new initiatives for loyalty programmes and customer databases may provide some flexibility to the scope of a seemingly narrow and outdated privacy ‘consent clause’. However, the application of these exemptions will not always be straightforward. They should be considered alongside the Privacy Commissioner’s obligation under the Privacy Act to have due regard to the right of businesses to achieve their objectives in an efficient way, as well as the evolving philosophical shift of privacy law from human rights to consumer protection legislation.
2. How should privacy policies and ‘consent clauses’ be updated?
Despite the potentially flexible parameters of ‘consent’ discussed above, it is still ‘best practice’ to ensure that individuals’ prior informed express consent is obtained before their information is used.
Accordingly, it is important to ensure that ‘consent clauses’ keep abreast of:
new and anticipated means of collecting, using, disclosing and sharing information about customers (including those highlighted above); and
recent and proposed amendments to the applicable legal framework, including the upcoming reform of the Privacy Act (discussed in our previous update here), new guidance from the Privacy Commissioner (including the new app development guidelines released last week) and recent changes to the FTA (discussed in our previous update here).
Three key questions which should be raised when drafting or updating privacy policies and ‘consent clauses’ (including clauses that cover the Privacy Act, UEMA and confidentiality) are:
WHAT information is being collected? For example, are customers passively providing information such as geo-location and analytics information through the use of an app?
HOW will the information be used? For example, if it will be used for direct marketing (including cross-selling) by phone, consider whether consent can be obtained to limit the requirements of the new uninvited direct sales provisions of the FTA. Further, will data be aggregated or used in an anonymised form, such as ‘Big Data’?
WHO will it be disclosed to? For example, will any cloud services be used? Could the information be transferred overseas? Will it be shared and combined with any third party data?
Obtaining customer consent to the use of their information does not necessarily require this level of detail under the current legislation.