3. April 2026

EU AI Act vs. GDPR: Differences and Overlaps

Witness Team · 7 min read

Two Regulations, Different Targets

The GDPR (General Data Protection Regulation) and the EU AI Act are both EU regulations with extraterritorial reach, significant penalties, and compliance obligations that require documented evidence. But they regulate different things.

GDPR regulates the processing of personal data. It applies whenever an organization collects, stores, uses, or shares information that identifies or can identify a natural person.

EU AI Act regulates AI systems. It applies whenever an organization develops or deploys a machine-based system that operates with autonomy and generates outputs through inference — regardless of whether personal data is involved.

An AI system that processes personal data falls under both regulations simultaneously. An AI system that analyzes satellite imagery with no personal data falls under only the AI Act. A database of customer names with no AI processing falls under only the GDPR.
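The applicability logic above reduces to two independent tests. A minimal sketch (class and field names are illustrative, not from either regulation):

```python
from dataclasses import dataclass

@dataclass
class System:
    uses_personal_data: bool  # triggers GDPR
    is_ai_system: bool        # triggers the EU AI Act

def applicable_regulations(s: System) -> set[str]:
    """Return which of the two regulations apply. Simplified: real scoping
    also depends on territorial reach, exemptions, and system definitions."""
    regs = set()
    if s.uses_personal_data:
        regs.add("GDPR")
    if s.is_ai_system:
        regs.add("AI Act")
    return regs

# The three examples from the text:
applicable_regulations(System(True, True))    # AI on personal data -> both
applicable_regulations(System(False, True))   # satellite-imagery AI -> AI Act only
applicable_regulations(System(True, False))   # customer database, no AI -> GDPR only
```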

Key Differences

| Aspect | GDPR | EU AI Act |
|---|---|---|
| Subject | Personal data processing | AI systems |
| Core approach | Lawfulness of data processing | Risk classification of AI systems |
| Risk assessment | Data Protection Impact Assessment (DPIA) | Risk classification (4 tiers) + FRIA |
| Roles | Controller / Processor | Provider / Deployer / Importer / Distributor |
| Maximum penalty | €20M or 4% of global annual turnover | €35M or 7% of global annual turnover |
| Documentation | Records of Processing Activities (RoPA) | Technical Documentation (Annex IV) |
| Supervisory authority | National Data Protection Authorities | National AI Supervisory Authorities (often different bodies) |
| In force since | May 2018 | Phased: Feb 2025 to Aug 2027 |
| Personal data required? | Yes (that's the point) | No (applies to all AI systems) |
| Consent model | Legal basis required for processing | No consent model; obligations are structural |

The Penalty Gap

The AI Act's penalties are significantly steeper than the GDPR's. Prohibited AI practices can draw fines of up to €35 million or 7% of global annual turnover — nearly double the GDPR maximum. Even standard high-risk obligation violations carry penalties of up to €15 million or 3% of turnover, comparable to the GDPR's upper tier.
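Both regimes cap fines at the higher of a fixed amount or a percentage of global annual turnover, so the effective maximum depends on company size. A sketch of the mechanics (the turnover figure is hypothetical):

```python
def max_fine(fixed_cap_eur: int, pct: int, turnover_eur: int) -> int:
    """Fine cap = the HIGHER of a fixed amount or pct% of global annual turnover."""
    return max(fixed_cap_eur, turnover_eur * pct // 100)

turnover = 2_000_000_000  # hypothetical €2bn global annual turnover

gdpr_cap = max_fine(20_000_000, 4, turnover)    # GDPR top tier: €20M or 4%
ai_act_cap = max_fine(35_000_000, 7, turnover)  # AI Act prohibited practices: €35M or 7%
```

For this hypothetical company the AI Act cap (€140M) is 1.75× the GDPR cap (€80M); for smaller companies below the turnover thresholds, the fixed amounts dominate instead.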

Different Roles, Different Obligations

GDPR assigns obligations based on who controls data processing decisions (controllers) and who processes data on their behalf (processors). The AI Act assigns obligations based on who develops AI systems (providers) and who uses them (deployers).

These roles do not map one-to-one. A company can be a GDPR controller and an AI Act deployer simultaneously — common when you use a third-party AI tool to make decisions about your customers. Or a GDPR processor and an AI Act provider — common for SaaS companies that build AI tools for other businesses to use.

Each regulation's obligations attach independently based on your role under that regulation.
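The independent role assignment can be pictured as a per-regulation lookup. A sketch with the two combinations from the text (company descriptions are illustrative):

```python
# Roles attach independently per regulation (controller/processor under GDPR,
# provider/deployer under the AI Act). Entries are illustrative examples.
roles = {
    "SaaS vendor building an AI scoring tool": {"GDPR": "processor", "AI Act": "provider"},
    "Bank using that tool on its customers":   {"GDPR": "controller", "AI Act": "deployer"},
}

def role_under(company: str, regulation: str) -> str:
    """Which role, and hence which obligation set, applies under each regulation."""
    return roles[company][regulation]
```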

Where They Overlap

Despite their different targets, the two regulations share significant common ground.

Impact Assessments

Both regulations require structured assessments of risk:

  • GDPR Article 35 requires a Data Protection Impact Assessment (DPIA) when processing is likely to result in a high risk to individuals' rights and freedoms
  • AI Act Article 27 requires a Fundamental Rights Impact Assessment (FRIA) for certain deployers of high-risk AI systems

Crucially, Article 27(4) of the AI Act explicitly states that the FRIA shall complement any existing GDPR DPIA. The two assessments are designed to work together, not duplicate each other. If you have already conducted a DPIA for an AI system, your FRIA builds on that foundation.

Transparency Requirements

Both regulations mandate transparency about automated decision-making:

  • GDPR Articles 13-14 require informing data subjects about automated decision-making, including profiling
  • GDPR Article 22 gives individuals the right not to be subject to solely automated decisions with legal or significant effects
  • AI Act Article 50 requires disclosure when people interact with AI systems or view AI-generated content
  • AI Act Article 26(11) requires deployers of high-risk systems to inform affected persons

If your AI system makes decisions about people using their personal data, both sets of transparency requirements apply simultaneously.

Extraterritorial Reach

Both regulations apply beyond EU borders:

  • GDPR applies to organizations outside the EU that process personal data of individuals in the EU (when offering them goods or services or monitoring their behavior)
  • The AI Act applies to organizations outside the EU whose AI system output is used in the EU

A US-based company selling an AI-powered service to European customers falls under both regulations.

Data Governance

The AI Act's data governance requirements (Article 10) must comply with GDPR. When training, validating, or testing AI systems with personal data, all GDPR principles apply: lawful basis, purpose limitation, data minimization, accuracy, storage limitation, and security. Article 10(5) specifically permits processing special category data (health, biometrics, etc.) for bias detection and correction, but only under strict conditions and with appropriate safeguards.

Profiling

The GDPR definition of profiling (Article 4(4)) is directly imported into the AI Act. Under Article 6(3), an AI system that profiles natural persons is always classified as high-risk if it falls into an Annex III category — the exception that allows downgrading does not apply when profiling is involved. This creates a direct link between GDPR concepts and AI Act classification.
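The Article 6(3) interaction can be sketched as a short decision function (a simplification; the actual derogation conditions are more detailed than a single flag):

```python
def is_high_risk(in_annex_iii: bool, profiles_persons: bool, derogation_met: bool) -> bool:
    """Sketch of the AI Act Art. 6(3) rule: an Annex III system may escape
    high-risk classification under narrow conditions, but never when it
    performs profiling of natural persons (GDPR Art. 4(4) definition)."""
    if not in_annex_iii:
        return False  # this rule only concerns Annex III categories
    if profiles_persons:
        return True   # profiling blocks the downgrade exception
    return not derogation_met
```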

If You're Already GDPR Compliant

GDPR compliance gives you a head start on the AI Act, but it does not get you across the finish line. Here is what you likely already have and what you still need:

What GDPR Compliance Gives You

  • Data governance practices — GDPR requires documented data processing activities, lawful bases, and data protection measures. These feed directly into AI Act Article 10 data governance requirements.
  • Impact assessment experience — If you have conducted DPIAs, you understand the methodology for FRIAs. Article 27(4) lets you build on existing DPIAs.
  • Transparency infrastructure — Privacy notices, cookie banners, and data subject information processes can be extended to cover AI Act transparency requirements.
  • Documentation culture — GDPR's accountability principle (Article 5(2)) means you are accustomed to maintaining evidence of compliance.
  • DPO expertise — Your Data Protection Officer likely understands the regulatory landscape well enough to coordinate AI Act compliance.

What You Still Need to Do

  • Classify your AI systems — GDPR does not require you to inventory and classify AI systems by risk level. The AI Act does.
  • Technical documentation (Annex IV) — This is entirely new. GDPR's RoPA covers data processing; Annex IV covers the AI system's design, development, testing, accuracy metrics, cybersecurity measures, and more.
  • Risk management system (Article 9) — A continuous, documented risk management process specific to AI systems, covering identification, estimation, evaluation, and mitigation of risks.
  • Conformity assessment (Article 43) — Self-assessment or third-party assessment that your high-risk AI system meets all requirements. No GDPR equivalent exists.
  • CE marking and EU database registration — Providers must affix CE marking and register in the EU database before market placement.
  • Human oversight design (Article 14) — Documented measures enabling humans to understand, interpret, override, and halt AI system operations.
  • Post-market monitoring (Article 72) — A documented system for collecting and analyzing AI system performance data after deployment.
  • AI literacy (Article 4) — Staff training on AI systems, already enforceable since February 2025.

The Biggest Gap: Technical Documentation

For most GDPR-compliant organizations, the largest new compliance burden is Annex IV technical documentation. This requires detailed descriptions of system architecture, development methods, training data, accuracy metrics, bias detection and mitigation, cybersecurity measures, and testing procedures. Nothing in GDPR comes close to this level of technical system documentation.

DPO and AI Act Compliance: Same Person or Different?

There is no legal requirement for an "AI Act compliance officer" analogous to the GDPR's DPO. However, someone needs to own AI Act compliance.

In many organizations — especially SMEs — the DPO is the natural candidate. They already understand regulatory compliance, impact assessments, and documentation requirements. The AI Act adds a technical dimension (system architecture, accuracy metrics, risk management for AI) that may require upskilling or support from technical staff.

For larger organizations with multiple high-risk AI systems, a dedicated AI compliance function makes sense. For SMEs, combining the DPO role with AI Act compliance responsibility is pragmatic, provided the person has access to the technical teams who build and maintain the AI systems.

Practical Recommendations

  1. Do not treat the AI Act as "GDPR 2.0" — The regulations have different scopes, different roles, and different requirements. Attempting to shoehorn AI Act compliance into your existing GDPR framework will leave gaps.

  2. Leverage what you have — Your GDPR DPIAs, data governance documentation, and transparency practices are genuine building blocks for AI Act compliance. Do not start from scratch.

  3. Inventory and classify first — Before creating any documentation, determine which AI systems you have, their risk levels, and your role (provider/deployer) for each. This scoping exercise determines the entire compliance workload.

  4. Coordinate assessments — When conducting a FRIA, explicitly reference your existing DPIA for the same system. Article 27(4) expects this complementary approach.

  5. Start with the free tools — The Witness classifier determines your AI system's risk level and your role in minutes. From there, guided compliance tools walk you through the specific documentation your situation requires — building on the compliance foundation you already have from GDPR.
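The scoping exercise in recommendation 3 can be sketched as a minimal inventory record; field names and example systems are illustrative, not prescribed by the AI Act:

```python
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    name: str
    role: str       # your role for this system: "provider" or "deployer"
    risk_tier: str  # "prohibited", "high", "limited", or "minimal"
    has_dpia: bool  # existing GDPR assessment to build a FRIA on (Art. 27(4))

inventory = [
    AISystemRecord("CV screening tool", "deployer", "high", has_dpia=True),
    AISystemRecord("website chatbot", "deployer", "limited", has_dpia=False),
]

# High-risk systems drive most of the workload (Annex IV docs, FRIA, oversight).
high_risk = [s.name for s in inventory if s.risk_tier == "high"]
```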
