The White Paper on Artificial Intelligence (the "AI White Paper"), recently released by the European Commission, provides the clearest indication yet that the EU is seriously considering regulating the development and deployment of artificial intelligence ("AI").
If adopted, the Commission's proposals would likely increase the already significant compliance burden on technology-focused and technology-dependent businesses operating in the EU, and may lead to significantly divergent practices between the EU and the rest of the world. It is evident from the text of the AI White Paper that the Commission considers that its proposals would help to establish the EU at the centre of global AI technology development and deployment, and to encourage investment in the EU in this space. It is not entirely clear, however, that a lack of additional regulation is currently holding the EU back.
The AI White Paper charts a rough course for the potential future regulation of AI technologies in the EU. It will require much greater detail and refinement before it can realistically progress to full-blown legislation. However, at this early stage, there are several issues for businesses involved in the development or use of AI technologies to consider.
One of the fundamental issues for any future AI regulation, acknowledged by the Commission in the AI White Paper, is the need to provide a clear and precise definition of AI. This is essential so that businesses are able to understand and identify which AI technologies fall within the regulatory framework.
The Commission recognises that a precise definition must be established. However, it has also stressed the need for the definition to be "sufficiently flexible to accommodate technical progress". The definition used in the White Paper is not particularly precise, and if this definition were to be adopted, and form the basis of a future regulation, it could result in a much wider application than intended.
The Commission has proposed the adoption of a risk-based framework for any future regulation of AI. At its core, the Commission's proposal is that AI technologies deployed in high-risk sectors (e.g., energy, transport, healthcare) for high-risk uses (e.g., uses that may cause significant material or immaterial damage) should be regulated. In certain exceptional instances, the Commission proposes that the deployment of AI outside of high-risk sectors may be subject to regulation due to the risks associated with its use. Low-risk AI uses would not be subject to mandatory compliance requirements.
Prior to the adoption of any AI regulation, greater clarity and certainty will be required with respect to: (1) the identified "high-risk sectors"; (2) what constitutes a "high-risk use"; and (3) what "exceptional instances" would be considered high-risk.
Material issues for businesses to consider
The proposals in the AI White Paper give rise to a number of potential issues which businesses with an interest in AI should consider. In particular:
- Application to personal and non-personal data: the proposals suggest that the use of non-personal data may be regulated, and the use of personal data could be subject to new and/or additional requirements in the context of AI technologies (i.e., in addition to the GDPR and other existing data protection laws).
- Biometric data use is always high-risk: the proposals indicate that using biometric information (e.g., physical attributes, physiological attributes, behavioural characteristics) in connection with AI technologies will always be considered high-risk and therefore subject to regulation.
- Additional data-related obligations: the proposals could result in further compliance burdens regarding data quality and traceability (i.e., ensuring the compliant collection of data), in addition to the obligations to which businesses are already subject when using data (e.g., data protection, privacy and confidentiality requirements).
- Record keeping obligations: the proposals suggest that businesses should be required to maintain accurate records of the data sets used to test and train AI systems and, in some situations, to maintain the entire data set and disclose that data set to regulators. The AI White Paper does not explain how such large datasets should be stored, nor does it address the corresponding increased privacy risks that would result from storing such additional data.
- Impact on existing AI technologies: if the proposals are adopted, many current uses of AI technologies are likely to become non-compliant unless additional steps are taken by affected businesses – in some cases, this will require wholesale changes to the way in which AI products are developed and implemented. It is also possible that some current AI technologies will simply not be capable of being compliant, and their use will have to be discontinued.
- Energy requirements: the proposals suggest that a future regulation could require AI technologies to meet certain energy consumption and resource usage standards.
There is little doubt that, if the proposals in the White Paper are adopted in the form of laws that seek to regulate AI, affected businesses will need to allocate substantial additional resources to meeting the increased compliance burdens. It will likely be necessary to adopt and maintain new policies, procedures and expertise in this area.
Business strategy and response
Businesses with an interest in AI technologies should review the AI White Paper and assess the potential impact of the proposals on current and future operations.
The Commission has also issued a call for comments on the AI White Paper, to be received by 19 May 2020, so there is an immediate opportunity for interested businesses to voice any concerns they may have at this stage.
Beyond the deadline for comments to be received, the Commission has given no indication of a timetable for progressing its proposals. As such, and in light of the potentially significant impact that the adoption of the proposals in the AI White Paper could have, businesses are advised to maintain a watching brief for further developments.