Art. 13/14 GDPR – Information Obligations for AI Systems

What information obligations apply under Art. 13 and 14 GDPR when deploying AI systems? Transparency requirements, automated decisions, logic explanation and sample wording for AI privacy notices.

11 February 2026 · 5 min read
GDPR · Art. 13 · Art. 14 · Transparency · Automated Decisions · AI Compliance

Overview

Transparency is one of the central principles of the GDPR. Art. 13 and 14 oblige controllers to comprehensively inform data subjects about the processing of their personal data.

When deploying AI systems, this obligation becomes particularly demanding. Data subjects must not only be informed about data collection, but -- in the case of automated decisions -- also about the "logic involved" and possible effects.

This article explains:

  • The difference between Art. 13 and Art. 14
  • Which mandatory information is required
  • Special requirements for AI and automated decisions
  • How transparency can be practically implemented with complex models
  • Example wording templates

Art. 13 vs. Art. 14 GDPR

Provision | When Applicable | Example
Art. 13 | Data collected directly from the person | Online form
Art. 14 | Data obtained from a third-party source | Data purchase, scraping, external databases

For AI systems, Art. 14 is frequently relevant, for example when:

  • customer data is enriched with external information
  • external datasets are used
  • training data originates from third-party sources

Timely Information

Information must in principle be provided at the time of data collection -- or at the latest within one month for third-party sources.

Mandatory Information under Art. 13/14 GDPR

The privacy notice must contain, among other things:

  1. Name and contact details of the controller
  2. Contact details of the Data Protection Officer
  3. Processing purposes
  4. Legal basis
  5. Recipients or categories of recipients
  6. Third-country transfers
  7. Storage period
  8. Data subject rights
  9. Right to lodge a complaint with a supervisory authority

Additionally for automated decisions:

  • Meaningful information about the logic involved
  • Significance and envisaged consequences
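The list above can be turned into a simple automated completeness check for privacy notice drafts. This is a minimal sketch; the field names (`controller_contact`, `logic_involved`, etc.) are our own shorthand, not terms taken from the regulation:

```python
# Sketch of a completeness check for the mandatory disclosures listed above.
# Field names are illustrative, not an official taxonomy.

MANDATORY_FIELDS = [
    "controller_contact",       # 1. name and contact details of the controller
    "dpo_contact",              # 2. Data Protection Officer
    "purposes",                 # 3. processing purposes
    "legal_basis",              # 4. legal basis
    "recipients",               # 5. recipients or categories of recipients
    "third_country_transfers",  # 6. third-country transfers
    "storage_period",           # 7. storage period
    "data_subject_rights",      # 8. data subject rights
    "complaint_right",          # 9. right to lodge a complaint
]

AUTOMATED_DECISION_FIELDS = [
    "logic_involved",                 # meaningful information about the logic
    "significance_and_consequences",  # significance and envisaged consequences
]

def missing_disclosures(notice: dict, automated_decision: bool) -> list:
    """Return the mandatory fields absent from a privacy notice draft."""
    required = list(MANDATORY_FIELDS)
    if automated_decision:
        required += AUTOMATED_DECISION_FIELDS
    return [field for field in required if not notice.get(field)]

draft = {"controller_contact": "Example GmbH", "purposes": "credit scoring"}
print(missing_disclosures(draft, automated_decision=True))
```

Such a check only verifies that each point is addressed at all; whether the content of each disclosure is adequate remains a legal question.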

Automated Decisions & Profiling (Art. 22 GDPR)

When an AI system makes a decision based solely on automated processing that:

  • produces legal effects concerning the individual
  • or similarly significantly affects them

additional transparency requirements must be met.

Examples:

  • Credit rejection
  • Job application rejection
  • Insurance premium calculation
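For a first screening of whether cases like these fall under Art. 22, the conditions can be expressed as a small helper function. This is a rough sketch, not a substitute for legal assessment; borderline questions, such as how much human involvement takes a decision out of "solely automated" processing, need case-by-case review:

```python
def art22_applies(solely_automated: bool,
                  legal_effect: bool,
                  similarly_significant_effect: bool) -> bool:
    """Rough screening for Art. 22 GDPR: a decision based solely on
    automated processing that produces legal effects or similarly
    significantly affects the person."""
    return solely_automated and (legal_effect or similarly_significant_effect)

# Fully automated credit rejection:
print(art22_applies(True, legal_effect=True, similarly_significant_effect=False))   # True

# The system only prepares a recommendation that a human then reviews:
print(art22_applies(False, legal_effect=True, similarly_significant_effect=False))  # False
```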

Right to Human Intervention

Data subjects have, under certain conditions, the right to request human review.

What Does "Meaningful Information about the Logic Involved" Mean?

The GDPR does not require disclosure of:

  • Source code
  • Mathematical models
  • Complete training data

What is required, however:

  • Description of the functionality at an understandable level
  • Explanation of the main factors
  • Presentation of the possible effects

Example: AI-Assisted Credit Scoring

A transparent description could include:

  • Which data categories are included (e.g. income, creditworthiness, payment behaviour)
  • That an algorithmic scoring model is used
  • That certain thresholds lead to the decision
  • What consequences a rejection has
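A description of this kind can be generated from the model's documented main factors, which helps keep notices in sync with the model actually deployed. A hypothetical sketch (`describe_logic` and the factor weights are invented for illustration):

```python
# Hypothetical sketch: turning a scoring model's documented main factors
# into a plain-language "logic involved" explanation for a privacy notice.

def describe_logic(factors: dict, threshold: float) -> str:
    """Render the factors (name -> relative weight) as an explanation,
    ordered by descending influence on the score."""
    ordered = sorted(factors, key=factors.get, reverse=True)
    return (
        "An algorithmic scoring model evaluates the following data "
        f"categories, in descending order of influence: {', '.join(ordered)}. "
        f"Scores below {threshold} lead to a rejection, which you may "
        "ask a human to review."
    )

print(describe_logic(
    {"payment behaviour": 0.5, "income": 0.3, "existing creditworthiness": 0.2},
    threshold=600,
))
```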

Transparency for Generative AI

Particular challenges:

  • Training data not fully known
  • Dynamic model adaptations
  • Complex black-box structures

Possible solution:

  • Description of the general model function
  • Note on potential error-proneness
  • Explanation of intervention and correction possibilities

Sample Wording Template (Example)

Use of AI Systems

"We use an algorithmic assessment system to analyse your information. This processes in particular the following data categories: [...]. The analysis serves the purpose of [...]. The decision is made on the basis of predefined assessment logic. You have the right to request a review by a human."

This text is only an example and must be adapted to the specific use case.
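Filling the bracketed gaps programmatically keeps the wording consistent across products and forces each deployment to supply its specifics. A minimal sketch using Python's `string.Template`; the placeholder names are our own:

```python
# Sketch: filling the sample wording with case-specific details.
# The placeholders mirror the [...] gaps in the template above.
from string import Template

NOTICE = Template(
    "We use an algorithmic assessment system to analyse your information. "
    "This processes in particular the following data categories: $categories. "
    "The analysis serves the purpose of $purpose. The decision is made on "
    "the basis of predefined assessment logic. You have the right to "
    "request a review by a human."
)

text = NOTICE.substitute(
    categories="income, payment behaviour",
    purpose="assessing creditworthiness",
)
print(text)
```

`Template.substitute` raises an error if a placeholder is left unfilled, which prevents a notice from shipping with gaps.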

Case-by-Case Assessment Required

The specific wording depends on the respective AI system and its function.

Art. 14 for Training Data

When personal data:

  • originate from third-party sources
  • have been collected from public sources
  • have been aggregated automatically

it must be assessed:

  • Is individual information possible?
  • Does an exception apply (e.g. disproportionate effort)?

Disproportionate Effort (Art. 14(5))

Information may be omitted if:

  • it is impossible
  • or would require disproportionate effort

However, this is to be interpreted narrowly and requires documented justification.

Connection to the EU AI Act

The AI Act additionally requires:

  • Transparency obligations for high-risk systems
  • Labelling obligations for AI interactions
  • Information for deepfakes

Both regulatory frameworks complement each other.

Practical Implementation

Step 1 -- Transparency Check

  • Are all mandatory disclosures included?
  • Are AI-specific notices integrated?

Step 2 -- Automation Review

  • Is there an automated individual decision?
  • Are individuals significantly affected?

Step 3 -- Comprehensibility Review

  • Is the description understandable for non-experts?
  • Are technical terms explained?

Step 4 -- Documentation

  • Versioning of privacy notices
  • Proof of publication
  • Archiving of previous versions
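The documentation step can be supported by a simple versioned archive that records a content hash and publication date for each notice, making later proof of the exact wording easier. A minimal sketch:

```python
# Sketch: versioning published privacy notices with a content hash,
# so earlier wording can be evidenced later (archiving requirement).
import datetime
import hashlib

archive = []

def publish(notice_text: str) -> dict:
    """Append a new notice version to the archive and return its record."""
    record = {
        "version": len(archive) + 1,
        "published": datetime.date.today().isoformat(),
        "sha256": hashlib.sha256(notice_text.encode("utf-8")).hexdigest(),
        "text": notice_text,
    }
    archive.append(record)
    return record

v1 = publish("We use an algorithmic assessment system ...")
v2 = publish("We use an algorithmic assessment system ... (revised)")
print(v1["version"], v2["version"], v1["sha256"] != v2["sha256"])
```

In practice this record would live in a document management system; the point is simply that every published version remains reconstructible.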

Common Errors

Error | Risk
Generic "AI is used" wording | Insufficient transparency
No logic explanation for scoring | Art. 22 violation
Missing information for third-party sources | Art. 14 violation
Overly complex technical description | Lack of comprehensibility

Governance Recommendation

AI transparency should:

  • be integrated early in product development
  • be coordinated with data protection and compliance teams
  • be regularly reviewed and updated

Need help implementing?

Work with Creativate AI Studio to design, validate and implement AI systems — technically sound, compliant and production-ready.

Need legal clarity?

For specific legal questions on the AI Act and GDPR, specialized legal advice focusing on AI regulation, data protection and compliance structures is available.

Independent legal advice. No automated legal information. The platform ai-playbook.eu does not provide legal advice.

Next Steps

  1. Review your privacy notice for AI references.
  2. Assess the Art. 22 relevance of your systems.
  3. Add clear logic explanations.
  4. Document exceptions under Art. 14(5).
  5. Conduct a DPIA for complex systems.

