Apr 8, 2022
Governance you can run: turning AI Act duties into everyday operations
The EU AI Act is built on a simple idea: governance is not a document — it’s a daily practice. But most organizations still treat compliance as a project to finish, not a process to run. If your AI governance doesn’t fit into your operational rhythm — product releases, data updates, incident handling — it won’t hold up to review. The goal is not more paperwork, but evidence that someone owns every duty, every day.

Stella Manga Chesnay
Lawyer | GDPR & AI Act
1. Roles and RACI: make governance visible
The AI Act makes accountability explicit. Articles 9–15 define risk management, data governance, human oversight, and transparency duties — all of which require someone to be responsible.
Start with a RACI grid that maps every compliance duty to your operating model:
Product: ensures intended purpose and performance are documented (technical documentation, Article 11).
Data: maintains lineage and data-quality controls (Article 10).
Compliance: owns the conformity assessment and evidence pack.
Security: manages incident response and technical safeguards.
Your reviewer’s first question will be: Who does what?
If your RACI can’t answer that immediately, your governance isn’t operational yet.
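To make that answer immediate, it can help to keep the RACI as structured data rather than a slide. Here is a minimal sketch, assuming a simple in-house registry; the duty names, role assignments, and helper function are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass

# One RACI entry per compliance duty (R = Responsible, A = Accountable,
# C = Consulted, I = Informed). Duties and assignments are illustrative.
@dataclass
class RaciEntry:
    duty: str                # e.g. "Risk management (Art. 9)"
    responsible: str         # team that does the work
    accountable: str         # single owner who signs off
    consulted: list[str]
    informed: list[str]

RACI = [
    RaciEntry("Risk management (Art. 9)",  "Product", "Compliance", ["Security"], ["Data"]),
    RaciEntry("Data governance (Art. 10)", "Data",    "Compliance", ["Product"],  ["Security"]),
    RaciEntry("Human oversight (Art. 14)", "Product", "Compliance", ["Security"], ["Data"]),
]

def owner_of(duty_keyword: str) -> str:
    """Answer the reviewer's question 'Who does what?' for a given duty."""
    for entry in RACI:
        if duty_keyword.lower() in entry.duty.lower():
            return f"{entry.responsible} (accountable: {entry.accountable})"
    raise LookupError(f"No owner mapped for: {duty_keyword}")

print(owner_of("data governance"))  # -> "Data (accountable: Compliance)"
```

If a duty has no owner, the lookup fails loudly, which is exactly the gap a reviewer would find anyway.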
2. Policy coverage targets: measure what’s governed
Policies are only effective if they actually cover the decisions they are meant to govern.
A practical benchmark: aim for ≥90% policy coverage for all critical decisions that rely on AI.
Critical means decisions that:
Affect individual rights or access (credit, hiring, healthcare).
Are classified as high-risk under Annex III of the AI Act.
Trigger overrides or escalation paths.
Each policy should be versioned, linked to a model or process, and logged as part of your compliance evidence pack.
In practice, “governance coverage” becomes a KPI you can track quarter to quarter.
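Tracking that KPI does not require tooling beyond the registry itself. A minimal sketch, assuming you keep a list of critical AI-dependent decisions and the policy each one is linked to (decision IDs, policy references, and the 90% target check are illustrative):

```python
# Illustrative coverage KPI: share of critical AI-dependent decisions that are
# linked to a versioned policy. Decision IDs and policy references are made up.
critical_decisions = [
    {"id": "credit-scoring-v3", "policy": "POL-012 v2.1"},
    {"id": "cv-screening",      "policy": "POL-027 v1.4"},
    {"id": "triage-routing",    "policy": None},  # gap: no policy linked yet
]

TARGET = 0.90
covered = sum(1 for d in critical_decisions if d["policy"] is not None)
coverage = covered / len(critical_decisions)

print(f"Policy coverage: {coverage:.0%} (target {TARGET:.0%})")
if coverage < TARGET:
    gaps = [d["id"] for d in critical_decisions if d["policy"] is None]
    print("Finding for quarterly review, uncovered decisions:", gaps)
```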
3. Incident and change management: control the moving parts
Every AI system changes — new data, retraining, model updates, third-party APIs.
Under Article 61 (post-market monitoring), you’re required to monitor your systems in use and to document substantial modifications that affect compliance.
Operationally, that means:
Maintaining model version logs and rollback points.
Capturing incident reports that include cause, impact, and corrective action.
Reviewing retraining events to ensure they don’t shift performance beyond acceptable risk.
Think of this as DevOps for compliance: your change log is your audit trail.
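In code, that change log can be as simple as one structured record per model change. A hedged sketch, assuming an in-house log rather than any specific MLOps tool (all field names and values are illustrative):

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative change-log entry: just enough fields to serve as an audit trail.
@dataclass
class ModelChange:
    model: str
    version: str
    change_type: str              # "retraining", "data update", "API swap", ...
    rollback_to: str              # version to restore if the change is reverted
    performance_delta: float      # shift on the monitored metric
    within_risk_tolerance: bool
    incident_refs: list[str] = field(default_factory=list)
    logged_on: date = field(default_factory=date.today)

change = ModelChange(
    model="credit-scoring",
    version="3.2.0",
    change_type="retraining",
    rollback_to="3.1.4",
    performance_delta=-0.012,
    within_risk_tolerance=True,
)

# A retraining that drifts beyond tolerance should block release and open an incident.
if not change.within_risk_tolerance:
    change.incident_refs.append("INC-2022-041")
```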
4. Vendor governance and evidence exchange
AI supply chains are complex — and your responsibility doesn’t end with your own models.
Articles 24–28 of the AI Act distribute compliance obligations along the value chain, so providers and deployers each remain accountable for their part.
You need structured vendor governance that includes:
Evidence exchange protocols (e.g., model cards, bias testing summaries).
Contract clauses defining incident notification timelines and data responsibilities.
Shared risk registers that both parties update.
Auditors increasingly ask not only “Where’s your model card?” but also “Where’s your vendor’s?”
If you can’t access your supplier’s compliance evidence, you can’t prove your own.
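One way to keep that evidence exchange honest is a checklist that flags what a vendor has not supplied or has let go stale. A small sketch under the assumption that you track evidence age in days (item names and age limits are illustrative, not drawn from the Act):

```python
# Illustrative vendor evidence checklist: what you expect to receive, and how
# old it may be before it counts as stale (None = contractual, no expiry).
REQUIRED_EVIDENCE = {
    "model_card": 90,
    "bias_testing_summary": 180,
    "incident_notification_sla": None,
}

def missing_or_stale(received: dict) -> list[str]:
    """Return evidence items the vendor has not supplied or has let go stale."""
    gaps = []
    for item, max_age in REQUIRED_EVIDENCE.items():
        age = received.get(item)
        if age is None:
            gaps.append(f"{item}: not provided")
        elif max_age is not None and age > max_age:
            gaps.append(f"{item}: {age} days old (limit {max_age})")
    return gaps

print(missing_or_stale({"model_card": 120, "incident_notification_sla": 0}))
# -> ['model_card: 120 days old (limit 90)', 'bias_testing_summary: not provided']
```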
5. Quarterly review cadence and metrics
Governance must run on a clock.
A quarterly cadence keeps your evidence pack current and ensures findings turn into improvements.
Track operational governance metrics such as:
Policy coverage (%)
Incident closure rate
Override frequency
Average latency from change to policy update
These aren’t theoretical — they’re your live compliance health dashboard.
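Pulling the four metrics into one place is straightforward once the underlying logs exist. A minimal sketch, with illustrative numbers standing in for your own registries:

```python
# Illustrative quarterly dashboard: the four metrics above, computed from
# whatever incident, override, and change logs you already keep.
incidents = [{"id": "INC-07", "closed": True},
             {"id": "INC-08", "closed": True},
             {"id": "INC-09", "closed": False}]
overrides = 4                       # human overrides of AI decisions this quarter
change_to_policy_days = [3, 11, 6]  # lag between each change and its policy update

dashboard = {
    "policy_coverage_pct": 92.0,    # from the coverage check in section 2
    "incident_closure_rate_pct": 100 * sum(i["closed"] for i in incidents) / len(incidents),
    "override_frequency": overrides,
    "avg_change_to_policy_days": sum(change_to_policy_days) / len(change_to_policy_days),
}

for name, value in dashboard.items():
    print(f"{name}: {value:.1f}" if isinstance(value, float) else f"{name}: {value}")
```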
From compliance to culture
AI governance that lives in binders fails under pressure.
AI governance that lives in operations builds trust.
When every role, policy, and incident is logged and reviewed, governance becomes something your organization runs, not something it documents.
That’s the mindset shift the AI Act is pushing the industry toward — and the one that turns compliance into a competitive advantage.
👉 Download the Policy & RACI Templates (PDF)
Turn the AI Act’s duties into a playbook your teams can actually run.
