Overview
Transparency is one of the central principles of the GDPR. Art. 13 and 14 oblige controllers to comprehensively inform data subjects about the processing of their personal data.
When deploying AI systems, this obligation becomes particularly demanding. Data subjects must not only be informed about data collection, but -- in the case of automated decisions -- also about the "logic involved" and possible effects.
This article explains:
- The difference between Art. 13 and Art. 14
- Which mandatory information is required
- Special requirements for AI and automated decisions
- How transparency can be practically implemented with complex models
- Example wording templates
Art. 13 vs. Art. 14 GDPR
| Provision | When Applicable | Example |
|---|---|---|
| Art. 13 | Data collected directly from the person | Online form |
| Art. 14 | Data obtained from a third-party source | Data purchase, scraping, external databases |
For AI systems, Art. 14 is frequently relevant, for example when:
- customer data is enriched with external information
- external datasets are used
- training data originates from third-party sources
Timely Information
Under Art. 13, information must be provided at the time of data collection; under Art. 14, within a reasonable period after obtaining the data, and at the latest within one month.
Mandatory Information under Art. 13/14 GDPR
The privacy notice must contain, among other things:
- Name and contact details of the controller
- Contact details of the Data Protection Officer
- Processing purposes
- Legal basis
- Recipients or categories of recipients
- Third-country transfers
- Storage period
- Data subject rights
- Right to lodge a complaint with a supervisory authority
Additionally for automated decisions:
- Meaningful information about the logic involved
- Significance and envisaged consequences
Automated Decisions & Profiling (Art. 22 GDPR)
When an AI system:
- makes a decision based solely on automated processing, and
- that decision produces legal effects or similarly significantly affects the individual,
additional transparency requirements must be met.
Examples:
- Credit rejection
- Job application rejection
- Insurance premium calculation
Right to Human Intervention
Under Art. 22(3), data subjects have the right to obtain human intervention on the part of the controller, to express their point of view and to contest the decision.
What Does "Meaningful Information about the Logic Involved" Mean?
The GDPR does not require disclosure of:
- Source code
- Mathematical models
- Complete training data
It does, however, require:
- Description of the functionality at an understandable level
- Explanation of the main factors
- Presentation of the possible effects
Example: AI-Assisted Credit Scoring
A transparent description could include:
- Which data categories are included (e.g. income, creditworthiness, payment behaviour)
- That an algorithmic scoring model is used
- That certain thresholds lead to the decision
- What consequences a rejection has
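The threshold logic described above can be sketched, purely for illustration, as a toy scoring function. All factor names, weights and the cut-off value below are invented for this example; a real credit model would be far more complex and subject to validation:

```python
# Illustrative only: a toy scoring model with a decision threshold.
# Factor names, weights and the 0.6 cut-off are invented for this sketch,
# not taken from any real scoring system.

def credit_score(income: float, credit_history: float, payment_behaviour: float) -> float:
    """Weighted combination of normalised input factors (each in [0, 1])."""
    weights = {"income": 0.4, "credit_history": 0.35, "payment_behaviour": 0.25}
    return (weights["income"] * income
            + weights["credit_history"] * credit_history
            + weights["payment_behaviour"] * payment_behaviour)

def decide(score: float, threshold: float = 0.6) -> str:
    """Scores at or above the threshold are approved; all others rejected."""
    return "approved" if score >= threshold else "rejected"

score = credit_score(income=0.8, credit_history=0.7, payment_behaviour=0.9)
print(round(score, 3), decide(score))
```

A privacy notice would not disclose the code itself, but this level of structure (which factors, roughly how they are weighted, that a threshold triggers the decision, and what a rejection means) is what "meaningful information about the logic involved" points to.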
Transparency for Generative AI
Particular challenges:
- Training data not fully known
- Dynamic model adaptations
- Complex black-box structures
Possible solution:
- Description of the general model function
- Note on potential error-proneness
- Explanation of intervention and correction possibilities
Sample Wording Template (Example)
Use of AI Systems
"We use an algorithmic assessment system to analyse your information. This processes in particular the following data categories: [...]. The analysis serves the purpose of [...]. The decision is made on the basis of predefined assessment logic. You have the right to request a review by a human."
This text is only an example and must be adapted to the specific use case.
Case-by-Case Assessment Required
The specific wording depends on the respective AI system and its function.
Art. 14 for Training Data
When personal data:
- originate from third-party sources
- have been collected from public sources
- have been aggregated automatically
the controller must assess:
- Is individual information possible?
- Does an exception apply (e.g. disproportionate effort)?
Disproportionate Effort (Art. 14(5)(b))
Information may be omitted if:
- it is impossible
- or would require disproportionate effort
However, this is to be interpreted narrowly and requires documented justification.
Connection to the EU AI Act
The AI Act additionally requires:
- Transparency obligations for high-risk systems
- Labelling obligations for AI interactions
- Information for deepfakes
Both regulatory frameworks complement each other.
Practical Implementation
Step 1 -- Transparency Check
- Are all mandatory disclosures included?
- Are AI-specific notices integrated?
Step 2 -- Automation Review
- Is there an automated individual decision?
- Are individuals significantly affected?
Step 3 -- Comprehensibility Review
- Is the description understandable for non-experts?
- Are technical terms explained?
Step 4 -- Documentation
- Versioning of privacy notices
- Proof of publication
- Archiving of previous versions
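The completeness check in Step 1 can also be automated. The sketch below assumes a privacy notice represented as a simple dictionary; the field names are an invented internal representation, not a legal standard:

```python
# Hypothetical completeness check for a privacy notice (Step 1).
# The required keys mirror the Art. 13/14 mandatory information listed
# above; the dict format is an assumed internal representation.

REQUIRED_DISCLOSURES = {
    "controller_contact", "dpo_contact", "purposes", "legal_basis",
    "recipients", "third_country_transfers", "storage_period",
    "data_subject_rights", "complaint_right",
}
# Additional items for automated decisions (Art. 13(2)(f) / 14(2)(g)).
AUTOMATED_DECISION_DISCLOSURES = {"logic_involved", "significance_and_consequences"}

def missing_disclosures(notice: dict, automated_decisions: bool) -> set:
    """Return the disclosure keys absent or empty in the notice."""
    required = set(REQUIRED_DISCLOSURES)
    if automated_decisions:
        required |= AUTOMATED_DECISION_DISCLOSURES
    return {key for key in required if not notice.get(key)}

notice = {"controller_contact": "...", "purposes": "...", "legal_basis": "..."}
print(sorted(missing_disclosures(notice, automated_decisions=True)))
```

Such a check only verifies that each item is present, not that its wording is adequate; the comprehensibility review in Step 3 still requires human judgement.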
Common Errors
| Error | Risk |
|---|---|
| Generic "AI is used" wording | Insufficient transparency |
| No logic explanation for scoring | Art. 22 violation |
| Missing information for third-party sources | Art. 14 violation |
| Overly complex technical description | Lack of comprehensibility |
Governance Recommendation
AI transparency should:
- be integrated early in product development
- be coordinated with data protection and compliance teams
- be regularly reviewed and updated
Need help implementing?
Work with Creativate AI Studio to design, validate and implement AI systems — technically sound, compliant and production-ready.
Need legal clarity?
For specific legal questions on the AI Act and GDPR, specialized legal advice focusing on AI regulation, data protection and compliance structures is available.
Independent legal advice. No automated legal information. The platform ai-playbook.eu does not provide legal advice.
Next Steps
- Review your privacy notice for AI references.
- Assess the Art. 22 relevance of your systems.
- Add clear logic explanations.
- Document exceptions under Art. 14(5).
- Conduct a DPIA for complex systems.