EU AI Act for SMEs: What Small Businesses Need to Know
No Size Exemption — But Don't Panic
The EU AI Act applies to every organization that develops or uses AI systems affecting people in the EU. There is no minimum revenue, no employee threshold, and no "too small to regulate" clause. A three-person startup and a 50,000-employee corporation face the same core obligations for the same AI system.
That said, the regulation is not blind to the realities of smaller businesses. Several provisions specifically account for SMEs, and the risk-based structure means that most small business AI use cases fall into categories with minimal or no compliance burden.
Understanding where your AI systems land on the risk spectrum is the single most important step. It determines whether you need to do nothing, add a disclosure notice, or build a full compliance program.
The Four Risk Levels in Plain Language
Minimal Risk — No Mandatory Requirements
Your AI system does not fall into any regulated category. The EU encourages voluntary codes of conduct but imposes no obligations beyond general AI literacy (Article 4). Most AI systems fall here.
Limited Risk — Tell People It's AI
Your system interacts with people or generates content that could be mistaken for human-made. You must disclose the AI's involvement. This means adding notices like "This response was generated by an AI" or labeling synthetic content. Implementation is straightforward and low-cost.
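To give a sense of how small this obligation is in practice, an Article 50 disclosure for a chatbot can be as little as a wrapper that prepends a notice to every reply. A minimal sketch; the notice wording, function name, and structure are illustrative choices, not prescribed by the Act:

```python
# Illustrative only: prepend an AI-involvement disclosure to chatbot replies.
# The exact wording is up to you; the Act requires that people are informed
# they are interacting with an AI system.
DISCLOSURE = "This response was generated by an AI."

def with_disclosure(reply: str) -> str:
    """Return the chatbot reply with the AI disclosure notice prepended."""
    return f"{DISCLOSURE}\n\n{reply}"

print(with_disclosure("Our store opens at 9:00."))
```

The same pattern applies to labeling generated content: attach the label at the point where the output leaves your system.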
High Risk — Full Compliance Program
Your system falls into one of the eight Annex III categories (biometrics, critical infrastructure, education, employment, essential services, law enforcement, migration, justice) or is a safety component of a product under EU harmonisation legislation. You need technical documentation, risk management, conformity assessment, and more.
Prohibited — Cannot Be Deployed
Your system performs one of the eight banned practices under Article 5, such as social scoring, subliminal manipulation, or untargeted facial image scraping. These are illegal in the EU, full stop.
Common SME Use Cases and Their Risk Levels
Here is where it gets practical. These are AI use cases small businesses actually deploy, mapped to their most likely risk classification:
| AI Use Case | Risk Level | Why | What You Must Do |
|---|---|---|---|
| Chatbot on your website | Limited | Interacts with people (Art. 50) | Disclose it's an AI |
| Product recommendation engine | Minimal | No regulated category | Nothing mandatory |
| AI-powered email marketing | Minimal | No regulated category | Nothing mandatory |
| Content generation (blog, social) | Limited | Generates synthetic text (Art. 50) | Label as AI-generated if published without human editorial control |
| AI customer support agent | Limited | Interacts with people (Art. 50) | Disclose it's an AI |
| AI-powered HR screening / CV filtering | High | Employment decisions (Annex III, Area 4) | Full compliance: documentation, risk management, conformity assessment |
| AI credit scoring | High | Essential services (Annex III, Area 5(b)) | Full compliance + FRIA |
| AI-based employee performance monitoring | High | Employment (Annex III, Area 4(b)) | Full compliance |
| Predictive maintenance | Minimal | No regulated category | Nothing mandatory |
| Fraud detection | Minimal | Explicitly exempted from Annex III 5(b) | Nothing mandatory |
| AI exam proctoring | High | Education (Annex III, Area 3(d)) | Full compliance |
| AI-powered insurance pricing (life and health) | High | Essential services (Annex III, Area 5(c)) | Full compliance + FRIA |
The pattern is clear: most common SME AI use cases fall into minimal or limited risk. High-risk classification is triggered by specific use cases — primarily those involving decisions about people's employment, creditworthiness, education, or access to essential services.
SME-Friendly Provisions in the Act
The regulation includes several provisions that specifically benefit smaller businesses:
Reduced Penalties (Article 99(6))
For SMEs (including startups and micro-enterprises), the lower of the two penalty alternatives applies; for large companies, it is the higher of the fixed cap or the percentage of turnover. For SMEs:
- Prohibited practices: the lower of €35 million or 7% of worldwide annual turnover
- High-risk violations: the lower of €15 million or 3% of worldwide annual turnover
- Supplying incorrect or misleading information: the lower of €7.5 million or 1% of worldwide annual turnover
In practice, the percentage is almost always the lower figure for an SME, so the percentage applies. A company with €5 million in annual revenue faces a maximum high-risk violation penalty of €150,000 (3% of turnover), not €15 million.
Simplified Technical Documentation (Article 11(2))
The Commission is empowered to establish a simplified form of technical documentation for SMEs and startups. This reduces the burden of Annex IV compliance while still meeting the regulation's requirements.
Regulatory Sandboxes (Articles 57-62)
Member States must establish AI regulatory sandboxes that provide a controlled environment for developing and testing AI systems before market placement. Crucially, Article 62 requires that SMEs, including startups, be given priority access to these sandboxes, and that participation fees take account of their size and financial capacity, meaning reduced or waived fees in practice.
Proportionality Principle
Throughout the regulation, obligations are framed with proportionality in mind. Risk management must be "proportionate to the risk" (Article 9). Monitoring must be "proportionate" (Article 72). This gives SMEs legitimate room to implement compliance measures that fit their scale.
The Compliance Timeline
Key dates every SME should know:
| Date | What | Impact on SMEs |
|---|---|---|
| Feb 2, 2025 | Prohibited practices ban + AI literacy obligation | Already in force. Ensure no prohibited AI use. Start AI literacy training. |
| Aug 2, 2025 | GPAI obligations + penalty rules | Mostly affects large AI model providers. SMEs using GPAI models (e.g., GPT, Claude) are not directly affected by GPAI obligations. |
| Aug 2, 2026 | High-risk system obligations, deployer duties, FRIA, transparency | The big deadline. If you have high-risk AI systems, all documentation, risk management, and conformity assessment must be complete. |
| Aug 2, 2027 | Annex I product-embedded AI | Only relevant if your AI is a safety component of products under EU harmonisation legislation (medical devices, machinery, etc.). |
For most SMEs, August 2, 2026 is the date that matters. That gives you approximately four months from the publication of this article.
Practical Steps for an SME with Limited Resources
Step 1: Inventory Your AI Systems (Week 1)
List every AI-powered tool, feature, or service your company develops or uses. Include third-party AI tools — if you deploy them professionally, you may be a deployer with obligations.
Step 2: Classify Each System (Week 1-2)
For each system, determine:
- Is it actually an AI system under Article 3(1)?
- What risk level does it fall into?
- Are you the provider, deployer, or both?
An interactive classifier can do this in minutes per system.
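As a rough illustration of what such a classifier does, here is a toy rule-based screen. The use-case keys and the mapping are simplifications drawn from the table earlier in this article; a real classification requires working through Article 6 and Annex III, so treat this as a sketch, not a legal determination:

```python
# Toy first-pass risk screen mirroring this article's use-case table.
# Keys and mappings are illustrative simplifications, not legal advice.
ANNEX_III_TRIGGERS = {
    "hr_screening": "High",       # employment decisions (Annex III, Area 4)
    "credit_scoring": "High",     # essential services (Annex III, Area 5(b))
    "exam_proctoring": "High",    # education (Annex III, Area 3(d))
}
TRANSPARENCY_TRIGGERS = {
    "chatbot": "Limited",             # interacts with people (Art. 50)
    "content_generation": "Limited",  # synthetic content (Art. 50)
}

def screen(use_case: str) -> str:
    """Return a first-pass risk level for a known use-case key."""
    if use_case in ANNEX_III_TRIGGERS:
        return ANNEX_III_TRIGGERS[use_case]
    if use_case in TRANSPARENCY_TRIGGERS:
        return TRANSPARENCY_TRIGGERS[use_case]
    return "Minimal"  # no regulated category matched

print(screen("chatbot"))  # Limited
```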
Step 3: Address AI Literacy (Week 2-3)
This obligation is already enforceable. Ensure that employees who work with AI systems understand the basics: what the systems do, their limitations, and the regulatory framework. Document the training.
Step 4: Handle Limited-Risk Obligations (Week 3-4)
If you have limited-risk systems, implement transparency measures. Add disclosure notices to chatbots, label AI-generated content, document your transparency approach.
Step 5: Tackle High-Risk Documentation (Weeks 4-12)
If you have high-risk systems, this is the substantial work:
- Prepare Annex IV technical documentation
- Establish a risk management system (Article 9)
- Document data governance (Article 10)
- Design and document human oversight (Article 14)
- Conduct conformity self-assessment (Article 43 / Annex VI)
- Conduct FRIA if applicable (Article 27)
- Register in the EU database (Article 49)
This looks daunting, but self-service compliance tools break each requirement into guided fields with explanations. A technically competent person who understands the AI system can work through the documentation in days, not months.
Step 6: Maintain Compliance (Ongoing)
Set up post-market monitoring (Article 72). When the system changes, update documentation. Keep logs for at least six months (Article 19). Report serious incidents within 15 days (Article 73).
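The two hard time limits above lend themselves to simple bookkeeping. A sketch, assuming calendar days and an approximate 183-day reading of "six months"; the helper names are illustrative:

```python
from datetime import date, timedelta

# Illustrative deadline bookkeeping for the ongoing duties named above:
# the 15-day serious-incident window (Art. 73) and the six-month minimum
# log retention (Art. 19). Helper names and the 183-day approximation are ours.

def incident_report_deadline(became_aware: date) -> date:
    """Latest date to report a serious incident: 15 days after becoming aware."""
    return became_aware + timedelta(days=15)

def log_retention_until(log_created: date) -> date:
    """Earliest date a log may be discarded: ~six months (183 days) after creation."""
    return log_created + timedelta(days=183)

print(incident_report_deadline(date(2026, 8, 2)))  # 2026-08-17
```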
The Self-Service Approach
The compliance industry has traditionally been built for enterprises with six-figure budgets. SMEs need a different path: tools that are affordable, self-service, and guided enough that a CTO or compliance officer can do the work without hiring external consultants.
The Witness platform was built for exactly this scenario. Start with a free AI system classification, then use guided compliance tools to generate the documentation your risk level requires. The entire workflow — classification, documentation, risk management, FRIA, conformity assessment — is available at SME-friendly pricing with no enterprise sales process.
Check if the EU AI Act applies to you
Free classification in 3 minutes. No signup required.
Get Started