Provider Obligations under Art. 16–25 EU AI Act

What legal and organisational obligations apply to providers of AI systems under the EU AI Act? Overview of quality management, technical documentation, logging, transparency, CE marking and post-market monitoring.

11 February 2026 · 5 min read
EU AI Act · Provider · Compliance · CE Marking · Technical Documentation

Overview

Anyone who develops an AI system and places it on the market under their own name is classified as a provider under the EU AI Act. For high-risk systems, providers face comprehensive obligations covering organisational, technical and documentation requirements.

Many companies fail to recognise that they legally qualify as providers, even when they build on existing models or third-party APIs.

This article explains:

  • Who legally qualifies as a provider
  • What core obligations Art. 16–25 impose
  • What documentation requirements exist
  • How CE marking works
  • How provider obligations differ from deployer obligations

Who Is a "Provider" under the EU AI Act?

A provider is any natural or legal person who:

  • Develops an AI system, or has one developed, and
  • Places it on the market under their own name or trademark, or
  • Puts it into service in the EU under their own name or trademark

The provider role can also be triggered by:

  • Re-branding of a third-party AI system
  • Substantial modification of functionality
  • Own integration with significant system modification

Role Shift to Watch

A company can shift from deployer to provider if it substantially modifies an AI system's functionality or distributes it under its own brand.

Overview of Provider Obligations (High-Risk Systems)

Obligation | Legal Basis | Core Content
Risk management | Art. 9 | Continuous risk identification & mitigation
Data governance | Art. 10 | Quality & representativeness of training data
Technical documentation | Art. 11 | Complete system description
Logging | Art. 12 | Automated logging
Transparency | Art. 13 | User information
Human oversight | Art. 14 | Intervention capabilities
Accuracy & robustness | Art. 15 | Technical performance requirements
Quality management system | Art. 17 | Formal governance system
Conformity assessment | Art. 43 | CE marking
Post-market monitoring | Art. 72 | Ongoing surveillance

Risk Management System (Art. 9)

Providers must establish systematic risk management.

This includes:

  1. Identification of potential risks
  2. Assessment of likelihood of occurrence
  3. Assessment of severity of potential harm
  4. Implementation of countermeasures
  5. Continuous updating

Risks may relate to:

  • Fundamental rights
  • Discrimination
  • Erroneous decisions
  • Manipulation
  • Security vulnerabilities
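The Act does not prescribe a format for these five steps, so many teams maintain a risk register. A minimal sketch in Python; the field names and the scoring scheme are our own illustrative assumptions, not terms from Art. 9:

```python
from dataclasses import dataclass, field

# Illustrative risk-register entry; fields are our own, not mandated by Art. 9.
@dataclass
class RiskEntry:
    risk: str                 # identified risk (step 1)
    likelihood: int           # 1 (rare) .. 5 (frequent)   (step 2)
    severity: int             # 1 (minor) .. 5 (critical)  (step 3)
    mitigations: list = field(default_factory=list)  # countermeasures (step 4)

    @property
    def score(self) -> int:
        # simple likelihood x severity rating used to prioritise reviews (step 5)
        return self.likelihood * self.severity

register = [
    RiskEntry("Discriminatory scoring outcome", likelihood=3, severity=5,
              mitigations=["bias testing", "human review of edge cases"]),
    RiskEntry("Model manipulation via adversarial input", likelihood=2, severity=4,
              mitigations=["input validation", "robustness testing"]),
]

# Re-assess the highest-rated risks first during each update cycle.
for entry in sorted(register, key=lambda e: e.score, reverse=True):
    print(f"{entry.risk}: score {entry.score}")
```

Keeping the register in code or version control makes the "continuous updating" step reviewable over time.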

Data Governance (Art. 10)

Training, validation and testing data must be:

  • Relevant
  • Sufficiently representative
  • Free of errors to the best extent possible
  • Free from systematic biases (as far as possible)

The following must be documented:

  • Data sources
  • Selection criteria
  • Cleaning steps
  • Bias tests
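One way to keep these four points auditable is a structured record per dataset, versioned alongside the training code. A minimal sketch; the schema and values are assumptions for illustration, as Art. 10 does not prescribe a format:

```python
import json

# Hypothetical dataset record covering the documentation points above;
# keys and values are illustrative, not prescribed by Art. 10.
dataset_record = {
    "name": "loan_applications_2024",
    "sources": ["internal CRM export", "credit bureau feed"],
    "selection_criteria": "applications from EU customers, 2020-2024",
    "cleaning_steps": ["deduplication", "removal of incomplete rows"],
    "bias_tests": [
        {"attribute": "gender", "method": "disparate impact ratio", "result": 0.91},
    ],
}

# Persisting the record as JSON keeps it versionable and machine-checkable.
print(json.dumps(dataset_record, indent=2))
```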

Practical Relevance

Data governance is one of the most common compliance weak points in high-risk AI.

Technical Documentation (Art. 11)

The technical documentation must enable authorities to assess conformity.

It must include, among other things:

  • System architecture
  • Intended purpose
  • Model description
  • Training data description
  • Performance metrics
  • Testing methods
  • Human oversight concept
  • Safety measures

This documentation must be created before market placement.
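A simple completeness check can verify that every required section exists before that deadline. A hedged sketch; the section names mirror the list above, and the structure is our own assumption rather than the Annex IV template:

```python
# Illustrative completeness check for an Annex IV-style documentation set;
# section names follow the article's list, the structure is our own assumption.
REQUIRED_SECTIONS = [
    "system_architecture", "intended_purpose", "model_description",
    "training_data_description", "performance_metrics", "testing_methods",
    "human_oversight_concept", "safety_measures",
]

def missing_sections(documentation: dict) -> list:
    """Return the sections still absent before market placement."""
    return [s for s in REQUIRED_SECTIONS if not documentation.get(s)]

# Hypothetical half-finished documentation set:
draft = {"system_architecture": "see architecture.md", "intended_purpose": "credit scoring"}
print(missing_sections(draft))
```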

Logging (Art. 12)

High-risk AI systems must automatically:

  • Log relevant system events
  • Make decision processes traceable
  • Document interventions

Purpose: Traceability and subsequent review.
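In practice these requirements are often met with structured, append-only audit logs. A minimal sketch using Python's standard logging module; the event names and fields are illustrative assumptions, not a schema from Art. 12:

```python
import json
import logging

# JSON-formatted audit logger; the event schema is our own, not taken from Art. 12.
audit = logging.getLogger("ai_audit")
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(message)s"))
audit.addHandler(handler)
audit.setLevel(logging.INFO)
audit.propagate = False

def log_event(event: str, **fields) -> dict:
    """Record a system event so decisions remain traceable after the fact."""
    record = {"event": event, **fields}
    audit.info(json.dumps(record))
    return record

# Example: log a model decision and a later human intervention on the same input.
decision = log_event("decision", model_version="1.4.2",
                     input_id="req-1093", output="rejected")
override = log_event("human_override", operator="reviewer-7",
                     input_id="req-1093", new_output="approved")
```

Shipping these records to tamper-evident storage is what makes subsequent review credible.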

Transparency Obligations (Art. 13)

Providers must ensure that deployers:

  • Receive clear instructions
  • Understand the purpose and limitations of the system
  • Are aware of known risks
  • Can implement necessary monitoring measures

Human Oversight (Art. 14)

The system must be designed so that:

  • Human intervention is possible
  • Shutdown is possible
  • Malfunctions can be detected
  • Automation does not proceed unchecked

Accuracy, Robustness & Cybersecurity (Art. 15)

Requirements:

  • Minimisation of systematic errors
  • Protection against manipulation
  • Resilience against attacks
  • Monitoring of performance deviations

Quality Management System (Art. 17)

Providers must introduce a formal management system covering:

  • Compliance processes
  • Documentation standards
  • Responsibilities
  • Change management
  • Supplier oversight

This is similar to existing standards such as ISO/IEC 42001.

Conformity Assessment & CE Marking

A conformity assessment must be conducted before placing on the market:

Two Routes:

  1. Internal self-assessment (under certain conditions)
  2. Assessment by a notified body

After successful assessment:

  • EU Declaration of Conformity
  • CE marking
  • Registration in EU database

Market Ban Possible

Without CE marking, a high-risk AI system may not be made available on the EU market.

Post-Market Monitoring (Art. 72)

Providers must:

  • Monitor system performance
  • Report incidents
  • Document updates
  • Implement corrective measures

Compliance does not end with market launch.
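A common building block for post-market monitoring is an automated check of live performance against the level declared in the technical documentation. A hedged sketch; the metric, baseline and threshold are assumptions for illustration:

```python
# Illustrative drift check: compare a live accuracy estimate against the
# baseline declared at conformity assessment; both numbers are hypothetical.
BASELINE_ACCURACY = 0.92   # value from the technical documentation (assumed)
ALERT_MARGIN = 0.05        # tolerated degradation before escalation (assumed)

def needs_corrective_action(live_accuracy: float) -> bool:
    """True if performance has degraded enough to trigger incident handling."""
    return live_accuracy < BASELINE_ACCURACY - ALERT_MARGIN

print(needs_corrective_action(0.90))  # within tolerance
print(needs_corrective_action(0.80))  # triggers the corrective-measure process
```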

Provider vs. Deployer Distinction

Role | Responsibility
Provider | Development & conformity
Deployer | Deployment & monitoring

A company can hold both roles.

Connection to GPAI

Providers of General Purpose AI are subject to additional obligations (Art. 51 ff.), particularly in the case of systemic risk.

Practical Implementation

Step 1 -- Role Clarification

  • Are we a provider?
  • Or a deployer?
  • Or both?

Step 2 -- Identify Documentation Gaps

  • Is technical documentation available?
  • Is risk management established?
  • Is data governance documented?

Step 3 -- Implement QMS

  • Appoint compliance officers
  • Define processes
  • Establish change management

Step 4 -- Plan Conformity Strategy

  • Is self-assessment possible?
  • Is a notified body required?

Need help implementing?

Work with Creativate AI Studio to design, validate and implement AI systems — technically sound, compliant and production-ready.

Need legal clarity?

For specific legal questions on the AI Act and GDPR, specialized legal advice focusing on AI regulation, data protection and compliance structures is available.

Independent legal advice. No automated legal information. The platform ai-playbook.eu does not provide legal advice.

Next Steps

  1. Clarify your role under the EU AI Act.
  2. Check whether your system is high-risk.
  3. Begin with technical documentation.
  4. Establish a formal quality management system.
  5. Plan the conformity assessment early.
