Conformity Assessment under the EU AI Act

How does the conformity assessment for high-risk AI systems work under the EU AI Act? Overview of self-assessment, third-party procedures, CE marking and practical implementation steps.

11 February 2026 · 5 min read
EU AI Act · Conformity Assessment · CE Marking · High-Risk AI · Compliance · Notified Body

Overview

The conformity assessment is the central market access instrument of the EU AI Act for high-risk AI systems. Without a successful assessment, such a system may not be placed on the market or put into service in the European Union.

It is similar to well-known procedures from product safety law (e.g. CE marking for medical devices), but is specifically tailored to AI systems.

This article explains:

  • When a conformity assessment is required
  • What procedures are available
  • When a notified body must be involved
  • How CE marking works
  • What typical mistakes should be avoided

When Is a Conformity Assessment Required?

A conformity assessment is required for:

  • All high-risk AI systems listed in Annex III
  • AI systems that are safety components of products covered by EU harmonisation legislation (Annex I)

It is not required for:

  • Minimal-risk AI systems
  • Systems subject to transparency obligations (e.g. chatbots without high-risk relevance)
  • GPAI models as such (unless they become part of a high-risk system)

Core Rule

Only high-risk AI systems are subject to the mandatory CE conformity procedure.

The Two Routes of Conformity Assessment

The EU AI Act provides for two possible procedures:

Procedure | When Applicable | Third-Party Involvement
Internal self-assessment | Harmonised standards are fully applied | No
Third-party assessment | Standards are not fully applied or special risks exist | Yes (notified body)

Internal Self-Assessment

This procedure is possible when:

  • The company fully complies with harmonised standards
  • No significant deviations exist
  • The system does not qualify as a safety component of a regulated product

Prerequisites:

  • Complete technical documentation
  • Risk management system
  • Quality management system
  • Internal audit records

The company declares conformity under its own responsibility.

High Responsibility

Internal self-assessment does not mean less liability. Market surveillance authorities can review the documentation at any time.

Conformity Assessment by Notified Body

A notified body is required when:

  • Harmonised standards are not fully applied
  • Safety components are involved
  • Special technical risks exist
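The conditions above can be condensed into a simple decision helper. This is an illustrative simplification of the legal test, not a substitute for it; the parameter names are assumptions:

```python
def assessment_route(standards_fully_applied: bool,
                     is_safety_component: bool,
                     special_risks: bool) -> str:
    """Illustrative sketch of the route choice described above.

    The real legal assessment under the AI Act is more nuanced;
    this only mirrors the three conditions listed in the text.
    """
    if standards_fully_applied and not is_safety_component and not special_risks:
        return "internal self-assessment"
    # Any deviation from harmonised standards, safety-component status,
    # or special technical risk triggers third-party involvement.
    return "notified body"
```

In practice this decision should be documented with its reasoning, since market surveillance authorities may later ask why a given route was chosen.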

The notified body reviews:

  • Documentation
  • Risk management
  • Tests
  • Governance structure
  • Quality assurance

The CE marking may only be affixed after a positive assessment.

Conformity Assessment Process

Step 1 -- Preparation

  • Complete risk classification
  • Clearly define intended purpose
  • Structure documentation

Step 2 -- Technical Documentation

The documentation must include, among other things:

  1. System description
  2. Architecture & model description
  3. Training & test data analysis
  4. Risk management documentation
  5. Human oversight concept
  6. Logging mechanisms
  7. Accuracy metrics
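The completeness of these documentation items can be tracked programmatically. A minimal sketch, assuming the section names and `TechDoc` structure as illustrative labels (the Act does not prescribe this format):

```python
from dataclasses import dataclass, field

# Illustrative labels for the seven documentation items listed above.
REQUIRED_SECTIONS = [
    "system_description",
    "architecture_and_model",
    "training_and_test_data",
    "risk_management",
    "human_oversight",
    "logging_mechanisms",
    "accuracy_metrics",
]

@dataclass
class TechDoc:
    """Tracks which technical-documentation sections are complete."""
    completed: set = field(default_factory=set)

    def mark_done(self, section: str) -> None:
        if section not in REQUIRED_SECTIONS:
            raise ValueError(f"Unknown section: {section}")
        self.completed.add(section)

    def missing(self) -> list:
        """Sections still outstanding, in the order listed above."""
        return [s for s in REQUIRED_SECTIONS if s not in self.completed]
```

A checklist like this does not replace the documentation itself, but it makes gaps visible before an internal audit or a submission to a notified body.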

Step 3 -- Internal Review or Submission to Notified Body

  • Internal audit or
  • Review by notified body

Step 4 -- EU Declaration of Conformity

After a successful assessment, the provider:

  • Draws up an EU Declaration of Conformity
  • Affixes the CE marking

Step 5 -- Registration

High-risk AI systems must additionally be registered in the EU-wide database for high-risk AI systems.

CE Marking

CE marking signifies:

  • Compliance with all relevant AI Act requirements
  • Market authorisation within the EU

It is not a quality award but a regulatory confirmation.

Unauthorised CE Marking

Unauthorised CE marking can result in substantial fines.

Harmonised Standards

Applying recognised standards (e.g. ISO/IEC 42001 for AI management systems, alongside the harmonised standards being developed for the AI Act) can:

  • Facilitate conformity evidence
  • Support internal assessment
  • Increase regulatory legal certainty

Standards do not replace the legal obligations themselves, but they provide a structured way to demonstrate compliance.

Common Mistakes in Practice

Mistake | Risk
Unclear intended purpose | Incorrect risk classification
Incomplete training data documentation | Rejection by the notified body
Missing human oversight processes | Market ban
No post-market strategy | Breach of reporting obligations
Confusion of provider and deployer roles | Liability risks

Post-Market Monitoring

The obligation does not end after market launch.

Providers must:

  • Establish performance monitoring
  • Document incidents
  • Implement corrective measures
  • Inform authorities where applicable
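These monitoring duties can be supported by a simple incident register. A minimal sketch; the field names and severity labels are assumptions for illustration, not categories prescribed by the AI Act:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Incident:
    """One post-market incident record (illustrative fields)."""
    occurred_at: datetime
    description: str
    severity: str                  # e.g. "minor" or "serious"
    corrective_action: str = ""
    reported_to_authority: bool = False

class IncidentLog:
    """Collects incidents and surfaces those still needing a report."""

    def __init__(self) -> None:
        self._records: list[Incident] = []

    def record(self, incident: Incident) -> None:
        self._records.append(incident)

    def serious_unreported(self) -> list[Incident]:
        # Serious incidents that have not yet been reported
        # to the competent authority.
        return [i for i in self._records
                if i.severity == "serious" and not i.reported_to_authority]
```

Keeping such a register up to date makes it easier to demonstrate, during a market surveillance review, that corrective measures and reporting duties were handled systematically.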

The conformity assessment is therefore not a one-off act but part of a continuous compliance process.

Connection to the GDPR

Where personal data is involved, the following must additionally be considered:

  • Data Protection Impact Assessment
  • Transparency obligations
  • Data subject rights
  • Third-country transfers

Isolated AI Act conformity is not sufficient.

Strategic Preparation

Start Early

Conformity processes frequently require:

  • 6--18 months lead time
  • Interdisciplinary teams (legal, technical, compliance)
  • Budget planning

Build Governance Structure

  • Appoint AI compliance officers
  • Establish documentation standards
  • Implement change management

Need help implementing?

Work with Creativate AI Studio to design, validate and implement AI systems — technically sound, compliant and production-ready.

Need legal clarity?

For specific legal questions on the AI Act and GDPR, specialized legal advice focusing on AI regulation, data protection and compliance structures is available.

Independent legal advice. No automated legal information. The platform ai-playbook.eu does not provide legal advice.

Next Steps

  1. Classify your AI system by risk level.
  2. Check whether a notified body is required.
  3. Structure your technical documentation.
  4. Implement a formal quality management system.
  5. Plan CE marking and registration early.

