This article forms part of a report on the EU AI Act by Paul Foley Law.
Read the main article HERE >
A point that may not be obvious from the extracts below on conformity assessment is that, for stand-alone high-risk AI systems referred to in Annex III, a new compliance and enforcement system will be established. This follows the model of the New Legislative Framework legislation and is implemented through internal control checks by providers, with the exception of remote biometric identification systems, which would be subject to third-party conformity assessment.
Just to recall: conformity assessment means the process of demonstrating whether the requirements set out in Title III, Chapter 2 of the EU AI Act relating to an AI system have been fulfilled (Art 3(1)(20)).
High-risk AI systems and foundation models which are in conformity with harmonised standards or parts thereof, the references of which have been published in the Official Journal in accordance with Regulation (EU) No 1025/2012 (on European standardisation), must be presumed to be in conformity with the requirements of Chapter 2 (Title III: requirements for high-risk AI systems) or Article 28b (foundation models), to the extent those standards cover those requirements (Art 40 truncated).
The Commission must issue standardisation requests covering all requirements of the EU AI Act, in accordance with Article 10 of Regulation (EU) No 1025/2012 (on European standardisation; Article 10 covers standardisation requests to European standardisation organisations), within two months of the date of entry into force of the EU AI Act. When preparing a standardisation request, the Commission must consult the AI Office and the Advisory Forum (Art 40(1a)).
When issuing a standardisation request to European standardisation organisations, the Commission is required to specify that standards have to be consistent, including with the sectorial law listed in Annex II, and aimed at ensuring that AI systems or foundation models placed on the market or put into service in the Union meet the relevant requirements laid down in the EU AI Act (Art 40(1b)).
The actors involved in the standardisation process must, amongst other things, take into account the general principles for trustworthy AI set out in Article 4(a), and ensure a balanced representation of interests and effective participation of all relevant stakeholders in accordance with Articles 5, 6, and 7 of Regulation (EU) No 1025/2012 (Art 40(1c) extract only).
The Commission may, by means of an implementing act adopted in accordance with the Article 74(2) examination procedure and after consulting the AI Office and the Advisory Forum, adopt common specifications in respect of the requirements set out in Chapter 2 of this Title III or Article 28b (foundation models), where all of the conditions set out in that provision are fulfilled (Art 41(1a) extract only).
Where the Commission considers there to be a need to address specific fundamental rights concerns, common specifications adopted by the Commission in accordance with paragraph 1a (Art 41(1a)) must also address those specific fundamental rights concerns (Art 41(1b)).
The Commission must develop common specifications for the methodology to fulfil the reporting and documentation requirement on the consumption of energy and resources during development, training and deployment of the high-risk AI system (Art 41(1c)).
The Commission must, throughout the whole process of drafting the common specifications referred to in paragraphs 1a (Art 41(1a)) and 1b (Art 41(1b)), regularly consult the AI Office and the Advisory Forum, the European standardisation organisations and bodies or expert groups established under relevant sectorial Union law, as well as other relevant stakeholders (Art 41(2) extract only).
High-risk AI systems which are in conformity with the common specifications referred to in paragraphs 1a (Art 41(1a)) and 1b (Art 41(1b)) must be presumed to be in conformity with the requirements set out in Chapter 2 of this Title III, to the extent those common specifications cover those requirements (Art 41(3)).
Where a harmonised standard is adopted by a European standardisation organisation and proposed to the Commission for the publication of its reference in the Official Journal of the European Union, the Commission must assess the harmonised standard in accordance with Regulation (EU) No 1025/2012. When the reference of a harmonised standard is published in the Official Journal, the Commission must repeal the acts referred to in paragraphs 1 (Art 41(1)) and 1b (Art 41(1b)), or parts thereof which cover the same requirements set out in Chapter 2 of this Title III (Art 41(3a)).
Where providers of high-risk AI systems do not comply with the common specifications referred to in Article 41(1), they must duly justify that they have adopted technical solutions that meet the requirements set out in Chapter 2 of this Title III to a level at least equivalent thereto (Art 41(4)).
Taking into account their intended purpose, high-risk AI systems that have been trained and tested on data concerning the specific geographical, behavioural, contextual and functional setting within which they are intended to be used shall be presumed to be in compliance with the respective requirements set out in Article 10(4) (Art 42(1)).
For high-risk AI systems listed in point 1 of Annex III, where, in demonstrating the compliance of a high-risk AI system with the requirements set out in Chapter 2 of this Title III, the provider has applied harmonised standards referred to in Article 40 or, where applicable, common specifications referred to in Article 41, the provider must opt for one of two procedures: the conformity assessment procedure based on internal control referred to in Annex VI, or the conformity assessment procedure referred to in Annex VII, which involves a notified body (Art 43(1) extract only).
In demonstrating the compliance of a high-risk AI system with the requirements set out in Chapter 2 of this Title III, the provider must follow the conformity assessment procedure set out in Annex VII where, broadly, the relevant harmonised standards or common specifications do not exist or have not been applied in full (Art 43(1) extract only).
For high-risk AI systems referred to in points 2 to 8 of Annex III, providers must follow the conformity assessment procedure based on internal control as referred to in Annex VI, which does not provide for the involvement of a notified body. For high-risk AI systems referred to in point 5(b) of Annex III, placed on the market or put into service by credit institutions regulated by Directive 2013/36/EU (CRD IV), the conformity assessment shall be carried out as part of the procedure referred to in Articles 97 to 101 of that Directive (Art 43(2)).
For high-risk AI systems, to which legal acts listed in Annex II, section A, apply, the provider must follow the relevant conformity assessment as required under those legal acts. The requirements set out in Chapter 2 of this Title shall apply to those high-risk AI systems and shall be part of that assessment. Points 4.3., 4.4., 4.5. and the fifth paragraph of point 4.6 of Annex VII shall also apply. For the purpose of that assessment, notified bodies which have been notified under those legal acts shall be entitled to control the conformity of the high-risk AI systems with the requirements set out in Chapter 2 of this Title, provided that the compliance of those notified bodies with requirements laid down in Article 33(4), (9) and (10) has been assessed in the context of the notification procedure under those legal acts. Where the legal acts listed in Annex II, section A, enable the manufacturer of the product to opt out from a third-party conformity assessment, provided that that manufacturer has applied all harmonised standards covering all the relevant requirements, that manufacturer may make use of that option only if he has also applied harmonised standards or, where applicable, common specifications referred to in Article 41, covering the requirements set out in Chapter 2 of this Title (Art 43(3)).
High-risk AI systems must undergo a new conformity assessment procedure whenever they are substantially modified, regardless of whether the modified system is intended to be further distributed or continues to be used by the current user. For high-risk AI systems that continue to learn after being placed on the market or put into service, changes to the high-risk AI system and its performance that have been pre-determined by the provider at the moment of the initial conformity assessment and are part of the information contained in the technical documentation referred to in point 2(f) of Annex IV, shall not constitute a substantial modification (Art 43(4)).
The specific interests and needs of SMEs must be taken into account when setting the fees for third-party conformity assessment under this Article, reducing those fees proportionately to their size and market share (Art 43(4a)).
The Commission is empowered to adopt delegated acts in accordance with Article 73 for the purpose of updating Annexes VI and VII in order to introduce elements of the conformity assessment procedures that become necessary in light of technical progress. When preparing such delegated acts, the Commission shall consult the AI Office and the stakeholders affected (Art 43(5)).
The Commission is empowered to adopt delegated acts to amend Art 43(1) and Art 43(2) in order to subject high-risk AI systems referred to in points 2 to 8 of Annex III to the conformity assessment procedure referred to in Annex VII or parts thereof (Art 43(6) extract only).
Certificates issued by notified bodies in accordance with Annex VII must be drawn up in one or several official Union languages determined by the Member State in which the notified body is established, or in one or several official Union languages otherwise acceptable to the notified body (Art 44(1)).
Certificates will be valid for the period they indicate, which shall not exceed four years. On application by the provider, the validity of a certificate may be extended for further periods, each not exceeding four years, based on a reassessment in accordance with the applicable conformity assessment procedures (Art 44(2)).
Where a notified body finds that an AI system no longer meets the requirements set out in Chapter 2 of this Title, it must suspend or withdraw the certificate issued or impose any restrictions on it, unless compliance with those requirements is ensured by appropriate corrective action taken by the provider of the system within an appropriate deadline set by the notified body. The notified body shall give reasons for its decision (Art 44(3)).
The provider must draw up a written, machine-readable, physical or electronic EU declaration of conformity for each high-risk AI system and keep it at the disposal of the national supervisory authority and the national competent authorities for 10 years after the high-risk AI system has been placed on the market or put into service. A copy of the EU declaration of conformity shall be submitted to the national supervisory authority and the relevant national competent authorities upon request (Art 48(1)).
The EU declaration of conformity must state that the high-risk AI system in question meets the requirements set out in Chapter 2 of this Title. The EU declaration of conformity must contain the information set out in Annex V and must be translated into an official Union language or languages required by the Member State(s) in which the high-risk AI system is placed on the market or made available (Art 48(2)).
Where high-risk AI systems are subject to other Union harmonisation legislation which also requires an EU declaration of conformity, a single EU declaration of conformity may be drawn up in respect of all Union legislation applicable to the high-risk AI system. The declaration shall contain all the information required for identification of the Union harmonisation legislation to which the declaration relates (Art 48(3)).
By drawing up the EU declaration of conformity, the provider must assume responsibility for compliance with the requirements set out in Chapter 2 of this Title. The provider shall keep the EU declaration of conformity up-to-date as appropriate (Art 48(4)).
After consulting the AI Office, the Commission will be empowered to adopt delegated acts in accordance with Article 73 for the purpose of updating the content of the EU declaration of conformity set out in Annex V in order to introduce elements that become necessary in light of technical progress (Art 48(5)).
The physical CE marking must be affixed visibly, legibly and indelibly to high-risk AI systems before they are placed on the market. Where that is not possible or not warranted on account of the nature of the high-risk AI system, it must be affixed to the packaging or to the accompanying documentation, as appropriate. It may be followed by a pictogram or any other marking indicating a special risk of use (Art 49(1)).
For digital-only high-risk AI systems, a digital CE marking shall be used only if it can be easily accessed via the interface from which the AI system is accessed or via an easily accessible machine-readable code or other electronic means (Art 49(1a)).
The CE marking referred to in paragraph 1 (Art 49(1)) of this Article will be subject to the general principles set out in Article 30 (General Principles of the CE Marking) of Regulation (EC) No 765/2008 (Art 49(2)).
Where applicable, the CE marking must be followed by the identification number of the notified body responsible for the conformity assessment procedures set out in Article 43. The identification number of the notified body must be affixed by the body itself or, under its instructions, by the provider’s authorised representative. The identification number must also be indicated in any promotional material which mentions that the high-risk AI system fulfils the requirements for CE marking (Art 49(3)).
Where high-risk AI systems are subject to other Union law which also provides for the affixing of the CE marking, the CE marking shall indicate that the high-risk AI system also fulfils the requirements of that other law (Art 49(3a)).
The provider must, for a period ending 10 years after the AI system has been placed on the market or put into service, keep at the disposal of the national competent authorities: (a) the technical documentation referred to in Article 11; (b) the documentation concerning the quality management system referred to in Article 17; (c) the documentation concerning the changes approved by notified bodies, where applicable; (d) the decisions and other documents issued by the notified bodies, where applicable; (e) the EU declaration of conformity referred to in Article 48 (Art 50).