AI is a hot topic in the financial sector, among legislatures, regulators and financial institutions alike, especially given the popularity of solutions such as ChatGPT. In this series of Q&AIs, we discuss what financial institutions need to consider when using AI in the performance of their regulated activities and when developing AI solutions for that purpose.

Although the EU AI Act is still in the legislative process, EU and Dutch regulators point out that existing regulatory frameworks already impose standards on financial institutions when they use and/or develop AI systems. They have also published reports, principles and other guidance on the use and development of AI systems. In this series of Q&AIs, we focus on the standards arising from the existing regulatory framework. In this respect, we discuss the use of AI in the context of third-party service providers and the management of ICT risks. Furthermore, we set out how the use of AI affects the product distribution chain and the client relationship, including client due diligence and transaction monitoring. Finally, we discuss what the AI Act means for financial institutions.

Question 1. May financial institutions use AI when performing regulated activities?

Financial institutions may use AI in relation to regulated activities. While AI may enable them to enhance their business processes, it also has the potential to cause incidents that could harm a financial institution and/or its clients. The use of AI must therefore comply with existing regulatory requirements. Both EU regulators and the Dutch regulators – the Dutch Central Bank (DNB) and the Netherlands Authority for the Financial Markets (AFM) – have published reports, principles and other guidance on the use of AI by financial institutions.

DNB, for instance, issued its General Principles for the Use of Artificial Intelligence in the Financial Sector in July 2019. Soundness, accountability, fairness, ethics, skills and transparency (or ‘SAFEST’) form a framework within which financial institutions can responsibly shape their deployment of AI. The AFM acknowledges the use of AI in, among other publications, its supervision forecast report ‘Trendzicht 2024’ of November 2023. The AFM notes that the use of AI can contribute to efficiency in the financial sector. However, besides the positive effects of increased supply and a greater diversity of providers, the digitalisation of financial markets also creates new risks, for example where AI is used in an uncontrolled manner in advising on and distributing financial products.

Specifically for (re)insurers, EIOPA published its Artificial Intelligence Governance Principles in June 2021, which it plans to update. EIOPA has also announced in its work programme that it will develop a sound regime for the use of AI by the insurance sector, complementary to the AI Act. In these reports and principles, the regulators stress that the existing regulatory framework also applies to the use of AI by financial institutions. Examples of relevant regulatory requirements include: requirements relating to ethical business operations, sound and controlled business operations, outsourcing, ICT risk management, product approval and review processes, and customer due diligence and transaction monitoring. We elaborate on these in more detail in this Q&AI series.

Question 2. Third-party service providers: what do financial institutions (already) need to consider when they use third-party AI solutions?

The use of third-party AI solutions is likely to be captured by the outsourcing rules as well as the Digital Operational Resilience Act (DORA). The main characteristics of an outsourcing arrangement are (i) the engagement of a service provider, (ii) that service provider performing services that are part of the financial institution’s business operations/regulated business or that support its essential business processes, and (iii) the activities performed by the service provider otherwise being performed by the financial institution itself. In addition, third-party AI solutions are captured by DORA. DORA applies as of 17 January 2025 and covers all (existing and new) ICT contracts, whether or not such contracts constitute outsourcing.

Key requirements in relation to arrangements with third-party service providers following from the outsourcing rules and DORA are (i) management of the risks arising from the arrangement with the service provider, (ii) monitoring of the arrangement and (iii) inclusion of specific provisions in the contract with the service provider. If the arrangement is deemed critical or important, the regulator must be notified and additional requirements apply, aimed at ensuring the business continuity of the financial entity and at keeping the financial institution in control.

Finally, and irrespective of whether an arrangement is captured by the outsourcing rules or DORA, financial institutions need to ensure sound and controlled business operations. While this is a very broad requirement, regulators tend to use it as a legal basis for guidance on topics for which there is no specific legislation (yet) for financial institutions. Existing guidelines also already include specific requirements on managing third-party risks, which could be extended to cover risks arising from the use of AI when outsourcing. Relevant examples are the EIOPA Guidelines on outsourcing to cloud service providers for (re)insurers and the ESMA Guidelines on outsourcing to cloud service providers for entities within the scope of MiFID, the AIFMD, UCITS, etc. For credit institutions and payment service providers, the EBA has published Guidelines on outsourcing arrangements. The existing outsourcing guidelines are expected to continue to co-exist with DORA (as their scopes only partly overlap), but they will need to be adapted to align with DORA.

This page is part of an ongoing series of articles that will be regularly updated throughout the week.