Art. 25 GDPR – Privacy by Design and Privacy by Default

What do Privacy by Design and Privacy by Default mean under Art. 25 GDPR? Technical and organisational requirements for AI systems, architecture principles and a practical developer checklist.

11 February 2026 · 4 min read
GDPR · Art. 25 · Privacy by Design · Privacy by Default · AI Architecture · Data Protection

Overview

Art. 25 GDPR obliges controllers to integrate data protection not retroactively, but from the outset into systems and processes. This principle is known as Privacy by Design. Additionally, Privacy by Default requires that privacy-friendly settings are activated by default.

For AI systems, this article is particularly relevant, as training data, model architecture and decision logic are already determined during the development phase.

This article explains:

  • The requirements of Art. 25 GDPR
  • The seven foundational principles of Privacy by Design
  • Technical implementation approaches for AI systems
  • The difference between Design and Default
  • A practical checklist for AI developers

Requirements of Art. 25 GDPR

Controllers must:

  • take appropriate technical and organisational measures
  • take into account the state of the art
  • weigh the cost of implementation against the risks for the rights and freedoms of data subjects
  • effectively implement data protection principles

Early Integration

Privacy by Design is not merely a documentation obligation; it is an architectural decision.

The Seven Principles according to Cavoukian

  1. Proactive not reactive
  2. Privacy as the default setting
  3. Privacy embedded into design
  4. Full functionality (no "either-or")
  5. End-to-end security
  6. Transparency
  7. User-centricity

These principles are not enshrined word-for-word in the law, but serve as a recognised reference framework.

Privacy by Design in the AI Context

Data Collection

  • Minimisation of collected data
  • Clear purpose definition
  • Avoidance of unnecessary sensitivity
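In code, minimisation can be enforced with an explicit allow-list of fields tied to a documented purpose, so anything not needed for that purpose is dropped before storage. The purposes and field names below are illustrative assumptions, not requirements of the Regulation:

```python
# Illustrative purpose registry: each processing purpose maps to the
# fields strictly necessary for it (Art. 5(1)(c), Art. 25 GDPR).
PURPOSE_FIELDS = {
    "order_fulfilment": {"name", "address", "order_id"},
    "model_training": {"order_id", "product_category"},
}

def minimise(record: dict, purpose: str) -> dict:
    """Keep only the fields allow-listed for the given purpose."""
    allowed = PURPOSE_FIELDS[purpose]
    return {k: v for k, v in record.items() if k in allowed}

record = {"name": "A. Example", "address": "Example Street 1",
          "order_id": 42, "email": "a@example.org",
          "product_category": "books"}

# For training, only the two allow-listed fields survive.
print(minimise(record, "model_training"))
```

The point of the registry is that each allow-list is a documented design decision that can be reviewed, rather than an implicit side effect of whatever the pipeline happens to collect.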

Training Data

  • Representativeness assessment
  • Bias analysis
  • Documented selection criteria
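As one simple, illustrative bias metric among many, the demographic parity difference compares positive-outcome rates between two groups; the sample data below are invented for the example, and no threshold here is a legal standard:

```python
def demographic_parity_diff(predictions, groups, group_a, group_b):
    """Absolute difference in positive-outcome rates between two groups."""
    def rate(g):
        members = [p for p, grp in zip(predictions, groups) if grp == g]
        return sum(members) / max(1, len(members))
    return abs(rate(group_a) - rate(group_b))

preds  = [1, 0, 1, 1, 0, 1, 0, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

diff = demographic_parity_diff(preds, groups, "a", "b")
print(f"parity difference: {diff:.2f}")  # 0.75 vs 0.25 -> 0.50
```

A metric like this belongs in the documented selection criteria: which groups were compared, which metric was used, and what deviation triggers remediation.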

Model Architecture

Technical measures may include:

  • Federated Learning
  • Differential Privacy
  • On-device processing
  • Encryption during training and inference
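A minimal sketch of one of these techniques, the Laplace mechanism of differential privacy, applied to a count query with sensitivity 1. The epsilon value and the data are illustrative; production systems should use a vetted DP library rather than hand-rolled noise:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(values, predicate, epsilon: float) -> float:
    """Count query with noise calibrated to sensitivity 1 (epsilon-DP).

    Larger epsilon means less noise and weaker privacy protection.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

ages = [23, 31, 45, 52, 29, 61, 38]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=1.0)
print(f"noisy count of customers aged 40+: {noisy:.1f}")
```

The released value is the true count (here 3) plus calibrated noise, so no individual's presence in the data set can be confidently inferred from the output.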

Access Controls

  • Role-based access systems
  • Separation of training and production data
  • Access logging
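These three controls can be combined in one place: a role table that separates training from production data, and a check function that logs every attempt. The role names and permission strings are illustrative assumptions:

```python
# Illustrative role model: training and production data are separated,
# and every access attempt is logged (accountability, Art. 5(2) GDPR).
ROLE_PERMISSIONS = {
    "ml_engineer": {"training_data:read"},
    "ops_engineer": {"production_data:read"},
    "dpo": {"training_data:read", "production_data:read", "audit_log:read"},
}

access_log: list[tuple[str, str, bool]] = []

def check_access(role: str, resource: str) -> bool:
    """Return whether the role may access the resource; log the attempt either way."""
    allowed = resource in ROLE_PERMISSIONS.get(role, set())
    access_log.append((role, resource, allowed))
    return allowed

print(check_access("ml_engineer", "training_data:read"))    # True
print(check_access("ml_engineer", "production_data:read"))  # False
```

Logging denied attempts as well as granted ones is what makes the log useful for audits.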

Transparency Mechanisms

  • Model cards
  • Data provenance documentation
  • Versioning
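A model card can be kept as structured data next to the model artefact, so documentation and versioning travel together. The fields below are an illustrative subset, not a fixed standard:

```python
from dataclasses import dataclass, field, asdict

@dataclass
class ModelCard:
    """Minimal model card: an illustrative subset of documentation fields."""
    name: str
    version: str
    intended_purpose: str
    training_data_sources: list[str] = field(default_factory=list)
    known_limitations: list[str] = field(default_factory=list)

card = ModelCard(
    name="churn-predictor",
    version="1.2.0",
    intended_purpose="Predict contract churn for retention offers",
    training_data_sources=["crm_export_2025Q3 (pseudonymised)"],
    known_limitations=["Not validated for customers under 18"],
)
print(asdict(card))  # serialisable, e.g. for a model registry
```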

Privacy by Default

By default, only the personal data necessary for the respective specific purpose of the processing may be processed (Art. 25(2) GDPR).

Examples:

  • Tracking features deactivated by default
  • Profiling only after activation
  • Minimal data storage as the default setting
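These defaults can be encoded directly in the configuration type, so the privacy-friendly option holds for every user unless they explicitly opt in. The setting names and the 30-day retention value are illustrative:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Privacy by Default: every optional processing is off until opted in."""
    tracking_enabled: bool = False   # tracking deactivated by default
    profiling_enabled: bool = False  # profiling only after explicit activation
    retention_days: int = 30         # minimal storage period as the default

settings = PrivacySettings()                         # new users get strict defaults
opted_in = PrivacySettings(profiling_enabled=True)   # explicit, recorded user choice
```

Because the defaults live in one type rather than being scattered across the codebase, a misconfigured deployment cannot silently fall back to a permissive state.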

Avoid Misconfiguration

Data protection must not depend on users actively adjusting their settings; protective defaults must apply from the outset.

Technical Measures for AI Systems

  • Pseudonymisation: reduction of identifiability
  • Encryption: protection against unauthorised access
  • Differential Privacy: protection of individual data traces
  • Federated Learning: data remain local
  • Access Controls: limitation of internal use
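As a sketch of the first measure, pseudonymisation can be implemented with a keyed hash (HMAC): the same identifier always maps to the same token, but re-identification requires the key, which must be stored separately as the "additional information" of Art. 4(5) GDPR. The key value here is illustrative only:

```python
import hashlib
import hmac

# The key must be kept separately from the pseudonymised data set
# (e.g. in a KMS), so that tokens alone do not identify anyone.
SECRET_KEY = b"example-key-stored-in-a-separate-kms"  # illustrative only

def pseudonymise(identifier: str) -> str:
    """Deterministic pseudonym: same input yields the same token;
    not reversible without the key."""
    digest = hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

token = pseudonymise("user-4711")
print(token)                                 # stable 16-hex-char token
assert token == pseudonymise("user-4711")    # stable across calls
assert token != pseudonymise("user-4712")    # distinct users stay distinct
```

An unkeyed hash would not suffice here, since common identifiers could be recovered by brute force; the secret key is what makes the mapping one-way in practice.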

Connection to the EU AI Act

Privacy by Design complements:

  • Risk management obligations
  • Data governance requirements
  • Transparency obligations

Both legal frameworks require a structured development process.

Practical Checklist for AI Developers

Conception Phase

  1. Document purpose definition
  2. Define data categories
  3. Identify sensitive data

Development Phase

  1. Implement minimisation strategy
  2. Conduct bias tests
  3. Integrate security measures

Implementation Phase

  1. Review default settings
  2. Create transparency information
  3. Set up access controls

Operational Phase

  1. Regular review
  2. Establish incident management
  3. Document versioning

Common Errors

  • Data protection considered only after development: costly system modifications become necessary
  • Unclear purpose definition: violation of purpose limitation
  • Missing documentation: violation of the accountability principle
  • Comprehensive data collection by default: violation of Privacy by Default

Governance Recommendation

Privacy by Design should:

  • be part of the product development process
  • be integrated into architectural decisions
  • be audited regularly

Interdisciplinary collaboration (legal, technical, compliance) is essential.

Need help implementing?

Work with Creativate AI Studio to design, validate and implement AI systems — technically sound, compliant and production-ready.

Something novel or high-risk?

Collaborate with Creativate AI Studio and a network of industry experts and deep-tech researchers to explore, prototype and validate advanced AI systems.

Next Steps

  1. Integrate data protection requirements into your development guidelines.
  2. Implement privacy-friendly default settings.
  3. Document technical protective measures in a comprehensible manner.
  4. Conduct regular data protection reviews.
  5. Assess high-risk or DPIA relevance.

Need legal clarity?

For specific legal questions on the AI Act and GDPR, specialized legal advice focusing on AI regulation, data protection and compliance structures is available.

Independent legal advice. No automated legal information. The platform ai-playbook.eu does not provide legal advice.
