
Foundation Models, Generative AI and the EU AI Act

By
Paul Foley
Foundation models, including generative AI: the proposed EU legal requirements

This article forms part of a report on the EU AI Act by Paul Foley Law.
Read the main article HERE >


A foundation model is an AI system model that is trained on broad data at scale, is designed for generality of output, and can be adapted to a wide range of distinctive tasks (Art 3(1)(1c) EU AI Act).

Generative AI is a foundation model used in an AI system which is specifically intended to generate, with varying levels of autonomy, content such as complex text, images, audio or video (Art 28b(4) paraphrase of EU AI Act).

Generative AI (FTC definition) is a category of AI that empowers machines to generate new content rather than simply analyze or manipulate existing data. By using models trained on vast amounts of data, generative AI can generate content—such as text, photos, audio, or video—that is sometimes indistinguishable from content crafted directly by humans. Large language models (LLMs), which power chatbots and other text-based AI tools, represent one common type of generative AI (source: Federal Trade Commission, https://www.ftc.gov/policy/advocacy-research/tech-at-ftc/2023/06/generative-ai-raises-competition-concerns).

Foundation models are often trained on a broad range of data sources and large amounts of data to accomplish a wide range of tasks, including some for which they were not specifically developed and trained (Recital 60e extract and paraphrase).

AI systems with a specific intended purpose or general purpose AI systems can be an implementation of a foundation model, which means that each foundation model can be reused in countless downstream AI or general purpose AI systems (Recital 60e extract and paraphrase).
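By way of illustration only (this is not text from the Act), the Python sketch below shows the reuse pattern the Recital describes: one pre-trained "foundation" encoder serving, unchanged, behind several task-specific downstream systems. All class names, dimensions and labels here are hypothetical.

    import torch
    import torch.nn as nn

    class FoundationEncoder(nn.Module):
        """Stand-in for a large pre-trained foundation model (hypothetical)."""
        def __init__(self, vocab_size: int = 10_000, dim: int = 256):
            super().__init__()
            self.dim = dim
            self.embed = nn.Embedding(vocab_size, dim)
            layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)

        def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
            # Mean-pool the encoded sequence into one vector per input.
            return self.encoder(self.embed(token_ids)).mean(dim=1)

    class DownstreamClassifier(nn.Module):
        """A downstream AI system: the shared encoder plus a small task-specific head."""
        def __init__(self, foundation: FoundationEncoder, num_labels: int):
            super().__init__()
            self.foundation = foundation
            for p in self.foundation.parameters():   # the base model is reused as-is
                p.requires_grad = False
            self.head = nn.Linear(foundation.dim, num_labels)

        def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
            return self.head(self.foundation(token_ids))

    # The same frozen encoder can back many distinct downstream systems.
    base = FoundationEncoder()
    sentiment_system = DownstreamClassifier(base, num_labels=2)
    topic_system = DownstreamClassifier(base, num_labels=20)

Because only the small task-specific head differs between downstream systems, downstream providers depend on the foundation model provider's documentation and cooperation, which is the dependency the Recitals and Art 28b address.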

In the case of foundation models provided as a service, such as through API access, the cooperation with downstream providers should extend throughout the time during which that service is provided and supported (Recital 60f extract and paraphrase).

There is significant uncertainty as to the way foundation models will evolve, both in terms of typology of models and in terms of self-governance. Therefore, it is essential to clarify the legal situation of providers of foundation models (Recital 60g extract and paraphrase).

Providers of foundation models should be subject to information obligations and should prepare all necessary technical documentation so that potential downstream providers are able to comply with their obligations under the EU AI Act (Recital 60g extract and paraphrase).

Generative foundation models must ensure transparency about the fact that the content is generated by an AI system, not by humans (Recital 60g extract).
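As a purely illustrative sketch of how such transparency might be implemented (the Act does not prescribe any particular format, and the field names below are assumptions), generated output could carry a machine-readable disclosure alongside the content itself:

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class GeneratedContent:
        """Generated output bundled with an explicit AI-origin disclosure (illustrative)."""
        body: str
        ai_generated: bool = True
        generator: str = "example-foundation-model"   # hypothetical model identifier
        disclosure: str = "This content was generated by an AI system, not by a human."

    def render_with_disclosure(text: str) -> str:
        # Emit the content together with its machine-readable disclosure.
        return json.dumps(asdict(GeneratedContent(body=text)), indent=2)

    print(render_with_disclosure("Draft summary of the proposed contract terms..."))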

As foundation models are a new and fast-evolving development in the field of artificial intelligence, it is appropriate for the Commission and the AI Office to monitor and periodically assess the legislative and governance framework of such models, and in particular of generative AI systems based on such models, which raise significant questions related to the generation of content in breach of Union law, copyright rules, and potential misuse (Recital 60h extract).

Deployers who are public authorities or Union institutions, bodies, offices and agencies, or deployers acting on their behalf, and deployers who are undertakings designated as a gatekeeper under Regulation (EU) 2022/1925 (DMA), must also register in the EU database before putting into service or using a high-risk AI system for the first time and following each substantial modification. Other deployers should be entitled to do so voluntarily. Any substantial modification of a high-risk AI system must also be registered in the EU database (Recital 69 extract).

Obligations of the provider of a foundation model (Art 28b)

A provider of a foundation model must, prior to making it available on the market or putting it into service, ensure that it is compliant with the requirements set out in Art 28b, regardless of whether it is provided as a standalone model or embedded in an AI system or a product, provided under free and open source licences, provided as a service, or made available through other distribution channels (Art 28b(1)).

For the purpose of Art 28b(1), the provider of a foundation model must (extracts and paraphrase):

  1. demonstrate, through appropriate design, testing and analysis, the identification, reduction and mitigation of reasonably foreseeable risks to health, safety, fundamental rights, the environment, democracy and the rule of law, prior to and throughout development, with appropriate methods such as the involvement of independent experts, as well as the documentation of remaining non-mitigable risks after development;

  2. process and incorporate only datasets that are subject to appropriate data governance measures for foundation models, in particular measures to examine the suitability of the data sources and possible biases and appropriate mitigation;

  3. design and develop the foundation model in order to achieve throughout its lifecycle appropriate levels of performance, predictability, interpretability, corrigibility, safety and cybersecurity;

  4. design and develop the foundation model, making use of applicable standards to reduce energy use, resource use and waste, as well as to increase energy efficiency. Foundation models must be designed with capabilities enabling the measurement and logging of the consumption of energy and resources (a minimal logging sketch follows this list);

  5. draw up extensive technical documentation and intelligible instructions for use, in order to enable the downstream providers to comply with their obligations pursuant to Arts 16 and 28(1);

  6. establish a quality management system to ensure and document compliance with this Article, with the possibility to experiment in fulfilling this requirement; and

  7. register that foundation model in the EU database referred to in Article 60, in accordance with the instructions outlined in Annex VIII point C (Art 28b(2)).
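On point 4 above, the following minimal Python sketch shows one way the required measurement and logging capability could look. It is an assumption-laden illustration, not a method prescribed by the Act, and read_power_watts is a placeholder for real hardware telemetry.

    import json
    import time
    from contextlib import contextmanager

    def read_power_watts() -> float:
        """Placeholder for a real power reading (e.g. GPU telemetry); value is assumed."""
        return 300.0

    @contextmanager
    def energy_log(run_name: str, path: str = "energy_log.jsonl"):
        """Record wall-clock time and a rough energy estimate for a training or inference run."""
        start = time.monotonic()
        power_w = read_power_watts()
        try:
            yield
        finally:
            elapsed_s = time.monotonic() - start
            record = {
                "run": run_name,
                "elapsed_seconds": round(elapsed_s, 1),
                # Energy (kWh) = average power (kW) x time (h); a crude estimate.
                "estimated_kwh": round(power_w / 1000 * elapsed_s / 3600, 6),
            }
            with open(path, "a") as f:
                f.write(json.dumps(record) + "\n")

    with energy_log("fine-tune-demo"):
        time.sleep(0.5)   # stand-in for a training step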

Providers of foundation models must, for a period ending 10 years after their foundation models have been placed on the market or put into service, keep the technical documentation referred to in Art 28b(2)(e) (point 5 above) at the disposal of the national competent authorities (Art 28b(3)).
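Purely as an illustration of how a provider might keep that technical documentation in a machine-readable, retrievable form for the 10-year period (the Act and its Annexes, not this sketch, define the required content; every field name below is an assumption):

    import json

    # Simplified, illustrative record only; the Act and its Annexes define the
    # actual required content of technical documentation and instructions for use.
    technical_documentation = {
        "model_name": "example-foundation-model",        # hypothetical
        "provider": "Example Provider Ltd",              # hypothetical
        "intended_capabilities": ["text generation", "summarisation"],
        "known_limitations": ["may produce inaccurate or biased output"],
        "training_data_overview": "publicly available web text (summary only)",
        "performance_and_testing": {"benchmark": "internal evaluation v1"},
        "instructions_for_use": "Downstream providers must review outputs before deployment.",
        "retention_period": "10 years from placing on the market or putting into service",
    }

    with open("technical_documentation.json", "w") as f:
        json.dump(technical_documentation, f, indent=2)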

Providers of foundation models used in generative AI, and providers who specialise a foundation model into a generative AI system, must in addition:

  1. comply with the transparency obligations outlined in Article 52(1);

  2. train, and where applicable, design and develop the foundation model in such a way as to ensure adequate safeguards against the generation of content in breach of Union law in line with the generally acknowledged state of the art, and without prejudice to fundamental rights, including the freedom of expression; and

  3. without prejudice to national or Union legislation on copyright, document and make publicly available a sufficiently detailed summary of the use of training data protected under copyright law (Art 28b(4)). An illustrative sketch of such a summary follows below.
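The following sketch shows one hypothetical shape for such a training-data summary; the Act does not prescribe a format, and the sources and fields shown are invented for illustration:

    import csv

    # Illustrative entries only; a real summary would list the provider's actual
    # copyright-protected training data sources in sufficient detail.
    copyright_training_data_summary = [
        {"source": "Licensed news archive (example)", "type": "text", "licence": "commercial licence"},
        {"source": "Public-domain book corpus (example)", "type": "text", "licence": "public domain"},
        {"source": "Stock image library (example)", "type": "images", "licence": "provider agreement"},
    ]

    with open("training_data_copyright_summary.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["source", "type", "licence"])
        writer.writeheader()
        writer.writerows(copyright_training_data_summary)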

Get in Touch

This article forms part of a report on the EU AI Act by Paul Foley Law.
Read the main article HERE >

For legal and regulatory advice, drafting and know-how related to the development, deployment and otherwise making available of foundation models and related documentation, contact paul@paulfoleylaw.ie

© Paul Foley, Paul Foley Law 2023 - All Rights Reserved.
17.7.2023

Full copyright policy HERE >