This update outlines a regulator's approach to Artificial Intelligence (AI) and will be of interest to any company that makes or uses automated data systems.

On 5 June 2018, the Personal Data Protection Commission (PDPC) in Singapore released a discussion paper on AI and Personal Data. The PDPC is Singapore's data protection regulator and oversees how organizations collect, use and disclose personal data. The discussion paper takes a technology-neutral and sector-agnostic approach and is intended to apply to a wide cross-section of organizations and industry bodies.

The framework divides the AI ecosystem into three components: (i) those who make AI (AI Developers); (ii) those who use AI processes or sell AI-enabled devices (User Companies); and (iii) consumers. The obligations under the framework rest primarily on AI Developers and User Companies.

The obligations set out by the framework broadly fall into the following categories:

  • The ability to explain how an AI-enabled product works (Explainability) or, where that is not possible, to supervise the AI system to ensure that its results are accurate (Verifiability).  
  • Good data practices for organizations. This includes knowing the provenance of data and its movement (data lineage), keeping good records throughout the AI value chain, as well as minimizing the risk of inherent or latent biases in the dataset.  
  • Open and transparent communication, both between AI Developers and User Companies as well as with consumers, with a view towards building trust in the AI ecosystem.

The discussion paper also outlines governance measures for organizations to consider that would allow them to be accountable to a regulator for their AI decision-making processes. It also suggests measures for building trust and managing relationships with consumers who interact with AI decision-making.

While no binding requirements have been imposed at this stage, the discussion paper provides insight into potential regulatory touchpoints and considerations. It may also serve as a basis for legal counsel and compliance officers to secure resources to improve data practices in organizations that invest significantly in, or rely heavily on, AI.

The issuance of a discussion paper makes clear that the regulator has its eye on the increasing use of AI across various sectors. The regulator's current opinion is that "governance frameworks around AI should be 'light-touch'". The industry's response will determine how this view develops in future.

You will find a copy of the PDPC's discussion paper on "Artificial Intelligence (AI) and Personal Data - Fostering responsible development and adoption of AI" here or on the PDPC's website.