Overview
The EU AI Act is the world's first comprehensive regulation for Artificial Intelligence. Unlike the GDPR, however, it does not enter full application all at once but follows a staggered timeline with clearly defined transition periods.
For companies this means the regulation is already taking effect -- even though individual obligations will not become fully applicable until 2025, 2026 or 2027. Anyone who develops, integrates or deploys AI systems should know the timeline precisely in order to minimise risks and build compliance in a structured manner.
This article provides a complete overview of:
- Entry into force
- Application dates for individual chapters
- Deadlines for GPAI models
- Deadlines for high-risk systems
- Action steps for companies
Entry into Force vs. Applicability -- The Difference
First, it is important to clearly distinguish two terms:
| Term | Meaning |
|---|---|
| Entry into force | The legal act is formally valid |
| Applicability | The specific obligations must be fulfilled |
The EU AI Act entered into force in August 2024. Most substantive obligations, however, apply only progressively.
Important to Understand
The EU AI Act already applies as binding EU law. Companies cannot claim they "don't need to do anything yet" simply because individual obligations become applicable later.
Complete Timeline of the EU AI Act
Overview of Key Milestones
| Date | Milestone | Content |
|---|---|---|
| August 2024 | Entry into force | Publication in the Official Journal + 20 days |
| February 2025 | Prohibited practices applicable | Art. 5 AI Act |
| August 2025 | GPAI obligations + governance structures | Art. 51--56 |
| August 2026 | High-risk systems (Annex III) | Core obligations for providers & deployers |
| August 2027 | Full application | Incl. high-risk as safety component |
Phase 1 -- February 2025: Prohibited AI Practices
From February 2025, the prohibitions under Art. 5 EU AI Act are binding.
These include, among others:
- Social scoring by public authorities
- Manipulative AI systems
- Exploitation of vulnerable persons
- Real-time remote biometric identification in public spaces (with narrow exceptions)
- Emotion recognition in the workplace or educational institutions
Fine Risk
Violations of Art. 5 can be sanctioned with fines of up to EUR 35 million or 7% of global annual turnover, whichever is higher.
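Because the cap is the higher of the two amounts, the turnover-based figure dominates for large companies. A minimal sketch of the cap calculation (illustrative figures only, not legal advice):

```python
def art5_fine_cap(global_annual_turnover_eur: float) -> float:
    """Maximum fine for Art. 5 violations: EUR 35 million or 7% of
    worldwide annual turnover, whichever is higher."""
    return max(35_000_000.0, 0.07 * global_annual_turnover_eur)

# A company with EUR 2bn turnover faces a cap of EUR 140M, not 35M.
print(art5_fine_cap(2_000_000_000))  # 140000000.0
```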
What Companies Should Do by Then:
- Take stock of all deployed AI systems
- Review for possible prohibited practices
- Document the intended purpose
- Adapt or shut down problematic functions
Phase 2 -- August 2025: GPAI Models & Governance
From August 2025, special rules apply for:
- Providers of General Purpose AI (GPAI)
- Providers of GPAI models with systemic risk
Two Tiers for GPAI:
| Category | Criteria | Obligations |
|---|---|---|
| Standard GPAI | General AI models | Technical documentation, training data summary |
| GPAI with systemic risk | Training compute >10^25 FLOPs | Adversarial testing, incident reporting, cybersecurity measures |
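The 10^25 FLOPs criterion refers to cumulative training compute. The Act does not prescribe an estimation method; a common rule of thumb (an assumption here, not from the Act) approximates dense-transformer training compute as 6 x parameters x training tokens. A rough sketch under that assumption, with hypothetical model sizes:

```python
def estimated_training_flops(params: float, tokens: float) -> float:
    # 6*N*D heuristic for dense transformer training compute --
    # an approximation, not a method prescribed by the AI Act.
    return 6 * params * tokens

SYSTEMIC_RISK_THRESHOLD = 1e25  # presumption threshold under the Act

# Illustrative, hypothetical figures:
mid_size = estimated_training_flops(70e9, 1.4e12)  # ~5.9e23, below threshold
frontier = estimated_training_flops(1e12, 20e12)   # ~1.2e26, above threshold
```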
Important: Companies using an API-based LLM (e.g. via cloud providers) are generally not providers of a GPAI model -- but may become deployers of a high-risk system.
Avoid the Misconception
API usage does not automatically make you a provider -- but functional changes or re-branding can shift this role.
Phase 3 -- August 2026: High-Risk AI Systems (Annex III)
From August 2026, the comprehensive compliance requirements for high-risk systems under Annex III apply.
Affected areas include:
- Biometric identification
- Critical infrastructure
- Education
- Employment
- Credit scoring
- Law enforcement
- Migration
- Justice
Core Provider Obligations
- Risk management system
- Data governance
- Technical documentation
- Logging
- Transparency obligations
- Human oversight
- Accuracy & robustness
- Declaration of Conformity + CE marking
Use the Preparation Time
High-risk compliance often requires 12--18 months of lead time. Companies should start gap analyses now at the latest.
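Working backwards from the Annex III application date makes the lead-time point concrete. A small sketch assuming an application date of 2 August 2026 and the 18-month upper estimate:

```python
from datetime import date

def months_before(d: date, months: int) -> date:
    # Naive month arithmetic; assumes the day exists in the target month.
    total = d.year * 12 + (d.month - 1) - months
    return date(total // 12, total % 12 + 1, d.day)

annex_iii_deadline = date(2026, 8, 2)  # assumed application date
latest_start = months_before(annex_iii_deadline, 18)
print(latest_start)  # 2025-02-02: an 18-month project had to start in early 2025
```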
Phase 4 -- August 2027: Full Application
From August 2027, the AI Act applies in full, including:
- High-risk AI as a safety component of products
- Full market surveillance
- EU-wide database obligations
- Post-market monitoring
Practical Implementation -- What Companies Should Do NOW
Step 1 -- AI Inventory
- What systems do we use?
- In-house development or third-party provider?
- In which business processes?
Step 2 -- Initial Risk Classification
- Prohibited practices?
- High-risk under Annex III?
- GPAI relevance?
Step 3 -- Gap Analysis
- Is technical documentation available?
- Is logging implemented?
- Is human oversight defined?
- Is training data documented?
Step 4 -- Governance Structure
- Has a responsible person been appointed?
- Is an AI compliance process defined?
- Is the data protection / AI governance interface clarified?
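The four steps above can be sketched as a minimal inventory record with a first-pass screening. All names, fields and categories here are illustrative assumptions, not terms defined by the Act; a real classification needs legal review:

```python
from dataclasses import dataclass

# Annex III areas (abbreviated labels from this article)
ANNEX_III_AREAS = {
    "biometric_identification", "critical_infrastructure", "education",
    "employment", "credit_scoring", "law_enforcement", "migration", "justice",
}

@dataclass
class AISystem:
    name: str
    in_house: bool                  # Step 1: own development vs. third party
    business_process: str
    application_area: str           # Step 2: input for risk classification
    has_tech_docs: bool = False     # Step 3: gap-analysis flags
    has_logging: bool = False
    has_human_oversight: bool = False

def first_pass_risk(system: AISystem) -> str:
    # Simplified screening only -- not a legal determination.
    if system.application_area in ANNEX_III_AREAS:
        return "potential high-risk (Annex III)"
    return "needs further review"

def gaps(system: AISystem) -> list[str]:
    missing = []
    if not system.has_tech_docs:
        missing.append("technical documentation")
    if not system.has_logging:
        missing.append("logging")
    if not system.has_human_oversight:
        missing.append("human oversight")
    return missing

# Hypothetical example: a third-party CV-screening tool used in recruiting
cv_tool = AISystem(
    name="cv-screening", in_house=False,
    business_process="recruiting", application_area="employment",
    has_logging=True,
)
```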
Timeline as a Decision Basis
Many companies underestimate the organisational effort:
| Measure | Average Implementation Time |
|---|---|
| AI inventory | 1--3 months |
| Risk classification | 1--2 months |
| Technical documentation | 3--6 months |
| Quality management system | 6--12 months |
| Conformity assessment | 3--6 months |
The regulatory clock is already ticking.
Connection to the GDPR
The AI Act does not replace the GDPR.
Both apply in parallel.
Typical interfaces:
- Data Protection Impact Assessment (DPIA)
- Transparency obligations
- Purpose limitation
- Data minimisation
- Data subject rights
Companies therefore need integrated governance structures.
Need help implementing?
Work with Creativate AI Studio to design, validate and implement AI systems — technically sound, compliant and production-ready.
Something novel or high-risk?
Collaborate with Creativate AI Studio and a network of industry experts and deep-tech researchers to explore, prototype and validate advanced AI systems.
Next Steps
- Conduct a structured AI inventory.
- Classify your systems according to AI Act risk classes.
- Identify regulatory gaps.
- Develop an internal AI governance roadmap.
- Plan budget and resources for 2025--2027.
Need legal clarity?
For specific legal questions on the AI Act and GDPR, specialized legal advice focusing on AI regulation, data protection and compliance structures is available.
This refers to independent legal advice, not automated legal information. The platform ai-playbook.eu itself does not provide legal advice.