Significant risk management obligations are on the horizon under the proposed AI Act. Although implementation is still a couple of years away, businesses must consider these incoming obligations today. The AI Act is clear that many elements of the legislation, particularly those concerning risk, will need to be addressed at the design and development stages of specific AI systems. AI being developed today needs to be future-proofed in line with the requirements of the proposed AI Act.

Article 9 of the proposed AI Act (Article 9) deals with risk management. The obligations created by Article 9 should give businesses pause for thought. Any organisation using or producing systems that could be classified as high-risk AI (HRAI) under the proposed legislation will have significant risk management obligations, and the legislation may reopen liabilities and warranties in existing contracts that once seemed airtight.

Mandatory Risk Management Systems

Under Article 9, a risk management system must be put in place for HRAI. It will be a continuous and iterative process running throughout the entire lifecycle of the HRAI system, including mandatory regular, systematic updating. The risk management system will comprise:

  1. Identifying and analysing the known and foreseeable risks of each HRAI system.
  2. Estimating and evaluating the risks that may emerge when the HRAI system is used for its intended purpose, but notably, also under conditions of reasonably foreseeable misuse.
  3. Evaluating new risks based on post-market monitoring.
  4. Adopting risk management measures.

According to Article 9(4) of the AI Act, any residual risks associated with the intended use or reasonably foreseeable misuse of an HRAI system will have to be communicated to the user of the HRAI.

Impact on Businesses

In identifying the most appropriate risk management measures, organisations must eliminate or reduce risks as much as possible through good design and development of HRAI systems. Where organisations cannot eliminate risk, they must implement adequate mitigation and control measures.

Importantly, producers of HRAI systems will be required to provide adequate transparency information to users under Article 13 of the AI Act and, "where appropriate", to provide training on AI risks to users of those systems.

To eliminate or reduce the risks of HRAI systems, Article 9(4) of the AI Act states that "due consideration shall be given to the technical knowledge, experience, education, training to be expected by the user".

What Does This Mean?

Not only will organisations need to identify the foreseeable risks of each HRAI system, but they will also need to work out the ways the system could be misused and the risks associated with each form of misuse. They will also need to consider potential users' knowledge, experience and training. Further, AI producers may also need to provide training to users about risk.

Critically, businesses will also need to consider the users of HRAI systems. What if a company's employees use an HRAI system? How does the company ensure that the system has an adequate risk management system in place? What if staff should have received, but did not receive, training on the HRAI? How are companies meant to assess these risks?

Businesses must now ask themselves the following questions:

  • Are your policies and procedures for risk assessment and risk management sufficient to deal with the obligations under the AI Act?
  • Are you equipped to provide the information and training required to comply with your obligations?
  • Have the risks outlined in Article 9 been eliminated as far as possible in the design and development of the HRAI systems?

According to Article 9(4)(a) of the AI Act, risk must be eliminated "as far as possible through adequate design and development". The AI systems being designed and implemented today may therefore be subject to the AI Act in the coming years, and they should be at the forefront of regulatory and compliance planning now.