The promise of big data carries with it levels of personalisation and problem-solving that the general populace could not have envisioned even a few short decades ago. Indeed, retailers use personal data to meaningfully enhance the customer experience: tracking past purchases, drastically simplifying transactions and returns, and building a profile that lets them react to what consumers want – and even anticipate it. It’s a far cry from walking into a store as a ‘new’ shopper each time.
But with new technology comes new trade-offs, and it was ever thus. Complying with legislation such as the Data Protection Act is just one aspect of this; customers are increasingly aware of how their personal data is used, and bad practice can damage customer relationships even when it is legal. Reconciling data security with data-driven innovation is a delicate balance.
Tech giants such as Google, Amazon and Facebook thrive upon what the data tells them, but virtually all businesses now have access to sensitive customer data. Sharing sales, location-based and psychographic insight between departments underpins business growth quite directly. Myriad modern tools analyse everything from who your best customers are to the demographic appeal of particular products. We live in a world driven by marketing campaigns that depend on sharing personal data.
Data security concerns
But after high-profile mishaps involving organisations as diverse as the NHS, HMRC and T-Mobile, which saw the data of 15 million of its customers exposed in a hack last year, some consumers are jittery. In a study by Digital Catapult, 60 per cent of consumers admitted discomfort with the personal data they provided being shared with other businesses.
Consumers are often not even aware of what exactly their data is being used for, even if they know it is being used. Certain approaches to data usage may contravene codified human rights. Proposed changes to EU data protection regulation would allow companies to store personal health and genetic data without consent or knowledge. Although such proposals attempt to balance the risk by ‘pseudo-anonymising’ the data – that is, omitting names and other identifying features – the potential for exploitation remains.
Whatever our position on the issue, we are already constantly sharing information about ourselves, and that information is being exploited – sometimes for our benefit, sometimes not so much. What is vital is that data collection and usage are informed by a healthy institutional respect. If consumers even suspect their data is being exploited or sold to third parties, they will lose faith – which is not what marketers are after. Openness may reap its own rewards.