Privacy is being talked about everywhere these days – in the media, in new legislation and in the boardroom. And as the data being collected becomes more extensive, and the use of that data more widespread, the conversation about privacy needs to mature. Organisations should move beyond focusing merely on consent and de-identification and begin conversations about governance, about ethics, and about what they “should” use data for, rather than what they “can” use data for.

So how do organisations move beyond the paradigm of consent and start thinking about the data use and governance models of the future? KWM recently hosted two events on the Future of Privacy (in Sydney on 8 August 2018, and in Melbourne on 15 August 2018) which asked what should – and shouldn’t – matter when it comes to privacy. Panellists included Tom Daemen (Head of Corporate, External and Legal Affairs, Microsoft Australia), Rajiv Cabraal (Legal, Governance and IP Director, Data61), Sarah Turner (General Counsel and Company Secretary, REA Group) and Nigel Phair (Director of UNSW Canberra Cyber), along with KWM partners Patrick Gunning and Cheng Lim.

In polls carried out in the rooms (anonymously of course!), attendees were most keen to hear about whether people actually cared about privacy at all, and the trade-offs that individuals make between privacy and convenience.

Some of the key takeaways from our panellists include the following:

Embed privacy by design

  • The nature of personal information is changing with new technology – what was once not personal information (e.g. a photograph) might now be personal information (e.g. because of the increasing capabilities of facial recognition software). At the same time, increased tracking and analytics capabilities mean things like MAC addresses and IP addresses are increasingly used to profile individuals, but may not be covered by existing definitions of personal information.
  • This means we may need to rethink how we conceptualise privacy, particularly as genuine privacy and anonymity become harder to achieve. This is where privacy by design becomes an important concept.
  • Privacy by design is a central concept in the EU General Data Protection Regulation (GDPR), and it is beginning to gain traction here in Australia. One key tenet of privacy by design is data minimisation – collecting and keeping only what you need, rather than scooping up everything you can get and finding a use case for it later. For some, that means balancing consumers’ expectation of a ‘curated experience’ against consumer views of what is acceptable and what might be ‘creepy’.
  • For others, ‘research approval by design’ and ‘privacy by design’ are equally important. A root cause of regulation failing to achieve its desired outcomes is the assumption that people can control their own privacy through the consent process. But once consent has been given, the data collector can do anything it wants within the bounds of that consent. Sites can be designed to take advantage of predictable human behaviours and biases: well-designed technology cues people to give consent instinctively, without really considering or understanding it. For example, people are inclined to tick checkboxes because they subconsciously know that is what checkboxes are for – something should go in them. Data collectors will do everything they can to direct people to click “I consent”. Often, the risks are not clear to users, who don’t necessarily understand how much information is being collected about them, or what kind.

So perhaps the conversation should move from consent to use – to expectations, and to the ethics and values around the use of information.

Free is not free. If you aren't paying for a product, then you are the product.

  • When a product or service offered for “free” is extremely convenient or useful, that is often compelling for consumers. It is difficult to convince people to pay for a similar service that has stronger privacy protections and doesn’t seek to monetise their behaviour. And while some may intellectually understand that the provider has to earn revenue in order to provide the service, the fact that the service is “free” and “useful” seems to overwhelm any concerns that users may have about what is done with their information. Added to this, consent to the privacy terms is often a condition of receiving the service, with no ability to negotiate the terms of that consent.
  • Having said that, many of the global platforms do understand that trade-off, and do attempt to put privacy protections in place so that users do not become so concerned about how their information is used that they switch platforms. By giving users more granular control over the use and sharing of their data, these platforms are starting to move the conversation towards managing data use and users’ expectations about that use.

Pay attention to data governance

  • In a world where we need to embed privacy by design, think about consent differently, and ask what we “should” do rather than what we “can” do, we need a way to have those conversations at the right level within our organisations. Technology has reduced our ability to conduct our affairs anonymously, with real-world consequences for all of us – so we need a new way to think about privacy, and about the ethics of how we deal with information.
  • We are starting to see organisations create data governance structures to begin having these conversations. For example, one global software firm has an ethics committee made up of lawyers, engineers, technologists and others, which determines whether the organisation should assist customers to undertake particular kinds of projects using its technologies. While we hope that organisations will start to put such bodies in place in the near future, even without them, organisations need to have conversations at the highest levels about the norms they will apply to the use of the data they hold. This is critical to building trust with consumers, customers and regulators.
  • Privacy is not just about compliance; it is about trust, and trust and compliance are different things. Forward-thinking organisations treat privacy not as a compliance issue but as a strategic trust issue. At a fundamental level, the more your customers trust you, the better you will do – how many customers will stay with you if they don’t trust you? Your competitors are only a couple of clicks away. And if customers do trust you, they are more likely to stay with you even after a privacy breach, because they will trust you to do the “right thing” and to do your best to ensure the breach is not repeated.

GDPR as a global standard

  • GDPR is currently the “gold standard” in privacy regulation. We are starting to see Australian companies, particularly those with global operations, look at implementing uniform privacy practices that are consistent with GDPR. But this is hard to do, and requires not just resources and effort – it requires organisations to make strategic design decisions. For example, should they implement the right to be forgotten even where local laws do not require it? And the more that organisations implement GDPR practices, the more GDPR resembles a de facto global privacy regime.
  • It is important to remember that GDPR is not the only major standard in privacy law. China’s cybersecurity laws are another precedent gaining traction in Asia, having been followed in Vietnam and other countries.

What can we do?

We can start having conversations within our organisations about these matters. As has been said in the context of the Banking Royal Commission, we should be talking about “what we should do”, rather than “what we can do”. And we can start building our customers’ and clients’ trust in us and in how we treat their data.