AI Compliance in Healthcare

What requirements apply to AI in healthcare? Art. 9 GDPR (health data), DPIA obligations, high-risk classification under the EU AI Act and overlaps with MDR/IVDR – explained practically.

11 February 2026 · 5 min read
Healthcare · AI · GDPR · Art. 9 · EU AI Act · MDR · Compliance

Overview

AI systems in healthcare promise better diagnoses, more efficient workflows and new medical insights. At the same time, healthcare is among the most heavily regulated fields: health data is specially protected under Art. 9 GDPR, and many medical AI applications qualify as high-risk systems under the EU AI Act.

An additional regulatory framework applies: depending on the use case, an AI application may also qualify as a medical device and therefore fall under the Medical Device Regulation (MDR) or the In Vitro Diagnostic Regulation (IVDR).

This article explains:

  • Data protection requirements for health data
  • Typical DPIA triggers for medical AI
  • AI Act classification and obligations
  • Interfaces with MDR/IVDR
  • Practical action steps for projects

1. Data Protection: Health Data as a Special Category (Art. 9 GDPR)

What Counts as Health Data?

Health data includes, among others:

  • Diagnoses, findings, laboratory values
  • Medication plans
  • Treatment histories
  • Data from wearables (e.g. heart rate), where a health context exists
  • Patient identifications in a clinical context

As a special category, health data is subject to a general prohibition on processing -- unless an exception under Art. 9(2) GDPR applies.

High Protection Standard

Health data is particularly sensitive. Even minor governance errors can trigger significant risks.

Common Exceptions in the Healthcare Context

  • Art. 9(2)(h): Healthcare / treatment
  • Art. 9(2)(i): Public interest in the area of public health
  • Art. 9(2)(j): Scientific research (subject to additional conditions)
  • Art. 9(2)(a): Explicit consent -- in practice often difficult, as withdrawal and purpose limitation are complex

2. Typical AI Use Cases in Healthcare

Clinical Decision Support Systems (CDSS)

  • Suggestions for diagnoses or therapies
  • Risk: Over-reliance on AI, bias, lack of traceability

AI-Assisted Imaging

  • Radiology, pathology, dermatology
  • Risk: Misclassifications, data drift, bias from training data

Operational AI (Hospital Processes)

  • Capacity planning, patient flow, resource optimisation
  • Risk: Indirect effects on treatment, prioritisation

Drug Discovery and Research

  • Pattern recognition in data, drug screening
  • Risk: Change of purpose, data interfaces, anonymisation questions

3. DPIA: Why It Is Almost Always Relevant in Healthcare (Art. 35 GDPR)

A DPIA is frequently required because:

  • Special categories are processed (Art. 9)
  • Large-scale processing of medical data is possible
  • Profiling and scoring can take place in a clinical context
  • Individuals can be significantly affected

Typical DPIA risks for medical AI:

  • Bias / systematic disadvantage of certain groups
  • Lack of transparency (black-box models)
  • Incorrect decisions with health consequences
  • Loss of control in automated workflows
  • Third-country transfers with cloud services

Practical Rule of Thumb

If AI output influences medical decisions or health data is processed at scale, a DPIA is very likely required.
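The rule of thumb above can be sketched as a simple screening helper. This is an illustrative sketch only -- the class and function names are assumptions, not a substitute for a legal assessment; the boolean triggers mirror the DPIA criteria listed in this section.

```python
from dataclasses import dataclass


@dataclass
class DpiaScreening:
    """Hypothetical screening record; fields mirror the DPIA triggers above."""
    processes_art9_data: bool          # special-category (health) data involved
    large_scale: bool                  # large-scale processing of medical data
    profiling_or_scoring: bool         # profiling/scoring in a clinical context
    influences_medical_decisions: bool # AI output affects clinical decisions


def dpia_likely_required(s: DpiaScreening) -> bool:
    # Rule of thumb from the text: if AI output influences medical decisions,
    # or health data is processed at scale, a DPIA is very likely required.
    if s.influences_medical_decisions:
        return True
    if s.processes_art9_data and s.large_scale:
        return True
    # Profiling of individuals in a clinical context is a classic Art. 35 trigger.
    return s.profiling_or_scoring


result = dpia_likely_required(DpiaScreening(True, True, False, False))  # True
```

In practice the screening result and its reasoning should be documented, whichever way it comes out -- "DPIA not required" is itself a decision that supervisory authorities may ask you to justify.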

4. EU AI Act: High-Risk Classification in Healthcare

Many medical AI systems qualify as high-risk, particularly when they:

  • Support or influence diagnoses
  • Perform treatment prioritisation
  • Assess patient risks
  • Intervene in clinical decisions

Depending on the system, classification can occur via:

  • Annex III (e.g. systems with significant fundamental rights relevance)
  • Or as a safety component of a product

Typical High-Risk Obligations (Simplified)

  1. Risk management system
  2. Data governance
  3. Technical documentation
  4. Logging
  5. Transparency
  6. Human oversight
  7. Accuracy, robustness, cybersecurity
  8. Conformity assessment and CE marking where applicable
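Obligations 4–6 (logging, transparency, human oversight) converge in practice on an audit trail for every AI-assisted decision. A minimal sketch, assuming a JSON-per-line audit log; all field and function names are illustrative, not mandated by the AI Act:

```python
import json
import logging
from datetime import datetime, timezone
from typing import Optional

logger = logging.getLogger("ai_decision_audit")


def log_ai_decision(model_version: str, input_ref: str, output: str,
                    confidence: float, reviewer: Optional[str]) -> dict:
    """Record what the system produced, with which model version, and whether
    a human reviewed it. Supports logging, transparency and human oversight."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "input_ref": input_ref,      # a reference, not raw health data (data minimisation)
        "output": output,
        "confidence": confidence,
        "human_reviewer": reviewer,  # None = not yet reviewed by a clinician
    }
    logger.info(json.dumps(record))
    return record


entry = log_ai_decision("v1.2.0", "case-ref-001", "suspected finding", 0.91, None)
```

Logging a stable `input_ref` instead of the input itself keeps health data out of the audit log; the reference can be resolved through access-controlled clinical systems when an incident is investigated.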

5. MDR/IVDR: When Does AI Become a Medical Device?

Whether an AI system qualifies as a medical device depends not on the technology but on:

  • Intended purpose
  • Medical function
  • Use in a clinical context

Examples that frequently have medical device character:

  • Diagnostic support
  • Therapy recommendations
  • Clinical decision assistance

Examples that are not necessarily medical devices:

  • Purely administrative process optimisation
  • Generic text assistance without medical purpose

Interface Reality

Many healthcare AI systems lie in an overlap zone: GDPR + EU AI Act + MDR/IVDR. An integrated compliance approach is more effective than individual assessments.

6. Practical Implementation: Checklist for AI Projects in Healthcare

A) Clarify Scope and Intended Purpose

  1. What is the medical purpose?
  2. Does AI influence clinical decisions directly or indirectly?
  3. Is it a medical device (MDR/IVDR relevance)?

B) Lay the Data Protection Foundation

  1. Classify data categories (Art. 9?)
  2. Establish legal basis + Art. 9 exception
  3. Define transparency obligations (Art. 13/14)
  4. Establish retention and deletion concept

C) Integrate Risk and Quality Processes

  1. Carry out a DPIA screening, then create a DPIA if required
  2. Plan bias and robustness tests
  3. Define human oversight (who may override?)
  4. Implement logging and monitoring

D) Technical and Organisational Measures (TOMs)

  1. Access controls (role-based)
  2. Encryption (at rest / in transit)
  3. Pseudonymisation / data minimisation
  4. Define incident response process
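Measure 3 (pseudonymisation) is often implemented with keyed hashing: the same patient always maps to the same token, but the mapping cannot be reversed without the key. A minimal sketch using Python's standard `hmac` module; the identifiers are hypothetical:

```python
import hashlib
import hmac


def pseudonymise(patient_id: str, secret_key: bytes) -> str:
    """HMAC-SHA256 pseudonymisation: deterministic per key, irreversible without it.
    The key must be stored separately from the data (e.g. in a KMS) -- otherwise
    this degrades to plain hashing, which GDPR does not treat as pseudonymisation
    for guessable identifiers."""
    return hmac.new(secret_key, patient_id.encode("utf-8"), hashlib.sha256).hexdigest()


token = pseudonymise("patient-12345", b"example-key-held-in-a-kms")
```

Note that pseudonymised data remains personal data under the GDPR as long as re-identification is possible with the key; it reduces risk but does not remove the data from scope the way anonymisation would.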

7. Common Sources of Error

  • Unclear intended purpose -- MDR/IVDR and AI Act risk
  • Training data without bias analysis -- discrimination and liability risk
  • Cloud/LLM without transfer assessment -- third-country transfer violation
  • No DPIA -- high GDPR risk indicator
  • Missing monitoring after go-live -- security and quality risk

Need help implementing?

Work with Creativate AI Studio to design, validate and implement AI systems — technically sound, compliant and production-ready.

Need legal clarity?

For specific legal questions on the AI Act and GDPR, specialized legal advice focusing on AI regulation, data protection and compliance structures is available.

Independent legal advice. No automated legal information. The platform ai-playbook.eu does not provide legal advice.

Something novel or high-risk?

Collaborate with Creativate AI Studio and a network of industry experts and deep-tech researchers to explore, prototype and validate advanced AI systems.

Next Steps

  1. Clarify intended purpose and product category (including MDR/IVDR review).
  2. Define legal basis + Art. 9 exception for health data.
  3. Carry out a DPIA screening and create a DPIA if required.
  4. Review AI Act high-risk classification and start documentation and risk management.
  5. Validate your approach with qualified experts.

