EU AI Act compliance for small businesses is the practice of meeting the transparency, documentation, and risk-management duties of Regulation (EU) 2024/1689, scaled to a team of 5 to 50 people rather than a global enterprise. For most SMBs, it means producing five lightweight documents and keeping them up to date, not building a GRC platform.

Here is the uncomfortable news. Most small businesses think the AI Act is a problem for Meta, OpenAI, and Microsoft. It is not. The regulation applies to providers (those who build AI systems) and deployers (those who use them in a professional capacity). If your 10-person recruiting agency uses an AI CV screener, your 25-person clinic uses AI triage, or your marketing shop uses AI for A/B testing on customer decisions, you are a deployer. Deployer duties are lighter than provider duties, but they are not zero.

The good news. The regulation does not require enterprise-grade governance for an SMB. The European Commission guidance explicitly acknowledges proportionality: your documentation can be shorter, your risk assessment simpler, your audit trail smaller. This playbook shows what proportional actually looks like for a small team.

If you want the enterprise-focused version, see our GDPR + AI Act compliance checklist. This article is specifically for SMBs who need the same compliance outcome in a fraction of the effort.

- €35M or 7% of global turnover: maximum fine for prohibited AI practices (EU AI Act Art. 99(3))
- Aug 2026: high-risk AI obligations start applying (HR, credit, education) (Regulation (EU) 2024/1689)
- 5: documents most SMB deployers actually need (AI Act Ch. III, Section 3)
- < 25%: share of EU SMBs with a documented AI policy today (US Chamber / Eurostat, 2025)

What AI Compliance Actually Means for a Small Business

AI Act compliance for an SMB deployer breaks into three duties: know what AI systems you use, be transparent about them to affected people, and keep basic records so you can prove both if asked. That is 80% of the obligation, and it is achievable by a single person in a week.

The remaining 20% depends on whether any of your AI uses qualify as high-risk. The AI Act Annex III lists the categories that do: employment-related AI (recruiting, evaluation, promotion decisions), creditworthiness assessment, access to essential services, law enforcement, migration, administration of justice, democratic processes, and certain biometric and educational uses. Most SMBs either do not use high-risk AI at all, or use it in one area (recruiting is the single most common case).

If you do use high-risk AI, your obligations step up: a Data Protection Impact Assessment, human oversight procedures, a record of the system's logs, and a disclosure to the person being assessed. Still lighter than a provider's duties, still proportionate, but concrete.

The AI governance assessment is designed to tell you in under 10 minutes whether your current AI usage is high-risk under the Act, what your deployer duties are, and where the gaps are right now.

You may already be a deployer without knowing it

A deployer is anyone who uses an AI system under their own authority for a professional purpose. A bookkeeper using an AI-powered expense categoriser? Deployer. A sales team using an AI lead-scoring tool? Deployer. An HR person letting Copilot draft rejection emails? Probably deployer. You inherit deployer duties the moment you point an AI system at your operational data, even if you did not build the tool.

The 2026 to 2027 Enforcement Timeline

The AI Act came into force on 1 August 2024, but its obligations apply in stages. This matters for SMB planning: you do not need every document on day one, but you do need to know which deadlines apply to which of your AI uses. The tier that bites small businesses first is the high-risk category, starting August 2026. For the full authoritative text, see the consolidated version of Regulation (EU) 2024/1689 on EUR-Lex, and the European Data Protection Board publishes ongoing opinions on how GDPR and the AI Act interact for SMB deployers.

| Date | What starts applying | Who it hits |
| --- | --- | --- |
| 2 Feb 2025 | Prohibited AI practices + AI literacy duty | Everyone (all deployers must ensure staff have adequate AI literacy) |
| 2 Aug 2025 | General-Purpose AI (GPAI) provider rules, governance bodies, penalties | GPAI providers (OpenAI, Anthropic, Google); indirectly affects SMBs via vendor docs |
| 2 Aug 2026 | High-risk AI obligations, most transparency rules, conformity assessment | SMBs using AI in recruiting, credit, worker evaluation, education, essential services |
| 2 Aug 2027 | Full application, including high-risk AI embedded in regulated products | Remaining deployer categories, Annex I products |
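One way to keep the staged deadlines actionable is to encode them next to your AI register, so each use maps to the date its duties start applying. The dates below come from the timeline above; the tier assignments for specific uses are an illustrative assumption, not a legal classification, and each use should be confirmed against Annex III:

```python
from datetime import date

# Staged application dates from Regulation (EU) 2024/1689 (see timeline above)
DEADLINES = {
    "prohibited_or_literacy": date(2025, 2, 2),  # prohibited practices + AI literacy duty
    "gpai_provider": date(2025, 8, 2),           # GPAI provider rules (reaches SMBs via vendors)
    "high_risk": date(2026, 8, 2),               # Annex III high-risk deployer duties
    "full_application": date(2027, 8, 2),        # remaining categories, Annex I products
}

# Illustrative mapping of common SMB uses to a tier -- an assumption,
# not legal advice; confirm each use against Annex III.
USE_TIER = {
    "ai_cv_screening": "high_risk",
    "credit_scoring": "high_risk",
    "worker_evaluation": "high_risk",
    "customer_chatbot": "full_application",
    "email_drafting": "full_application",
}

def deadline_for(use: str) -> date:
    """Return the date by which duties for this AI use start applying."""
    return DEADLINES[USE_TIER[use]]

print(deadline_for("ai_cv_screening"))  # 2026-08-02
```

An unmapped use raises a `KeyError`, which is the behaviour you want: a new tool should fail loudly until someone classifies it.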

Check your AI governance maturity in 10 minutes

A structured assessment that tells you which of your AI uses fall under the AI Act, which deployer duties apply, and where the fastest compliance wins are. Free, no signup, benchmark report included.

Try It Free

The 5 Documents Every SMB Deployer Needs

1. AI Acceptable Use Policy (AUP)

One document, 2 to 4 pages, telling your team which AI tools are approved, for which purposes, with which data. This is the foundation. It also satisfies the AI literacy duty under Article 4 of the Act: your staff must know what they are using and why.

2. AI register (record of processing)

One spreadsheet or table listing every AI system in use, what data it processes, the vendor, the legal basis, risk classification, and who owns it internally. For an SMB, 10 to 30 rows is typical. This doubles as your Article 30 GDPR record for AI systems.

3. DPIA-lite for high-risk uses

A Data Protection Impact Assessment is already required under GDPR for high-risk processing. Extend it with the AI Act's Fundamental Rights Impact Assessment points: purpose, affected persons, risks, mitigations. For an SMB, 2 to 5 pages per high-risk system. Skip if you have no high-risk AI uses.

4. Disclosure notices

When AI interacts with a customer, candidate, or employee, and the AI's role is not obvious from context, you must tell them. For SMBs, this is often one paragraph in your job ad ('we use AI to screen CVs'), one note on your chatbot ('you are chatting with an AI'), or one clause in your customer terms. Not hard, but easy to forget.

5. Vendor log + DPAs

A short list of every AI vendor you use, with the date of the signed Data Processing Agreement, the data categories shared, and the hosting region. This is the single most likely thing a data protection authority will ask for in an inspection, and the single least prepared item in most SMBs. Our guide to European AI data sovereignty covers which EU hosting regions and DPA terms to prefer.

Writing Your AI Acceptable Use Policy

An AUP is a one-document gate between your team and the wide-open world of AI tools. For a small business, the goal is not a 40-page legal document. It is a clear, 2-to-4-page document that anyone can read in 5 minutes and act on. If it takes longer than 5 minutes to read, nobody reads it, and that is how shadow AI starts.

The document does two jobs: it tells your team what is allowed (so they have a clear yes for common cases) and what is not (so shadow AI has a clear no). Both matter equally. A policy that only says no drives usage underground. A policy that says yes without conditions makes you liable for whatever happens next.

Building a Lightweight AI Register

The AI register is the single highest-leverage compliance document. It is also the one most SMBs do not have. The purpose is simple: a supervisor should be able to read one document and know every AI system your business uses, on what data, with what risk, and who owns it. A spreadsheet works. A Notion or Airtable table works. A Word table works. You do not need a dedicated GRC tool to start.

The minimum columns:

AI register: minimum viable columns

- System name (e.g. Gmail Smart Compose)
- Vendor (Google)
- Purpose (email drafting)
- Data categories (customer PII, internal docs, none)
- Legal basis (legitimate interest / contract)
- Risk class under AI Act (minimal / limited / high)
- Hosting region (EU / US / unknown)
- DPA signed (Y/N + date)
- Internal owner (name)
- Last reviewed (date, quarterly cadence)
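If you keep the register as a CSV export, a short script can check every row has the minimum columns before each quarterly review. This is a sketch under assumed conventions: the column names mirror the list above, and the two validation rules (known risk class, signed DPA for high-risk systems) are illustrative choices, not regulatory requirements:

```python
import csv
import io

# Minimum columns from the list above (snake_case names are an illustrative choice)
REQUIRED_COLUMNS = [
    "system_name", "vendor", "purpose", "data_categories", "legal_basis",
    "risk_class", "hosting_region", "dpa_signed", "owner", "last_reviewed",
]
VALID_RISK_CLASSES = {"minimal", "limited", "high"}

def validate_register(csv_text: str) -> list:
    """Return a list of human-readable problems found in the register."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = [c for c in REQUIRED_COLUMNS if c not in (reader.fieldnames or [])]
    if missing:
        return ["missing columns: " + ", ".join(missing)]
    for i, row in enumerate(reader, start=2):  # row 1 is the header
        if row["risk_class"] not in VALID_RISK_CLASSES:
            problems.append(f"row {i}: unknown risk class '{row['risk_class']}'")
        if row["risk_class"] == "high" and row["dpa_signed"].upper() != "Y":
            problems.append(f"row {i}: high-risk system without a signed DPA")
    return problems

sample = """system_name,vendor,purpose,data_categories,legal_basis,risk_class,hosting_region,dpa_signed,owner,last_reviewed
CV Screener,Acme,recruiting,candidate PII,contract,high,EU,N,Jane,2026-01-15
"""
print(validate_register(sample))  # flags the high-risk system missing its DPA
```

The point is not automation for its own sake: a 30-row register is easy to eyeball, but a check like this catches the row someone added in a hurry between reviews.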

Populating the register usually takes 2 to 4 hours for a 20-person business. Run a quick poll of your team (the ai-usage survey is built exactly for this step), cross-check with your finance records (every paid AI tool leaves a receipt), and write the register. The hard part is not building it, it is keeping it updated when someone adds a new tool. Put the review on a recurring calendar invite, quarterly.

Map your team's AI usage to populate the register

A structured survey that surfaces every AI tool your team uses, including shadow AI you may not know about. Use the results as the first draft of your AI register.

Try It Free

Disclosure Notices for Customers and Candidates

Article 50 of the AI Act requires disclosure in four situations: when a person is interacting with an AI system (unless obvious from context), when content is AI-generated and could be mistaken for human, when emotion recognition or biometric categorisation is used, and when synthetic media is created. For an SMB, the practical obligations are narrower than they look.

| Use case | Disclosure required? | SMB-ready wording |
| --- | --- | --- |
| AI CV screening in recruiting | Yes (high-risk + transparency) | One line in the job ad and application acknowledgement |
| Customer-facing chatbot | Yes (Art. 50) | 'You are chatting with an AI assistant' on first message |
| AI drafting internal emails (reviewed by human) | No (not user-facing, human in loop) | N/A |
| AI-generated marketing image | Yes (synthetic content) | Alt-text or caption: 'AI-generated' |
| Emotion detection in customer calls | Often prohibited; check scope | Get legal advice before enabling |
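A practical way to keep notices consistent is one lookup that every customer-facing surface pulls its disclosure line from, rather than each team remembering Article 50 separately. The use-case keys and wording below follow the table above but are illustrative assumptions, not official Article 50 text:

```python
from typing import Optional

# Disclosure wording per use case, following the table above.
# Keys and phrasing are illustrative; adapt the wording to your own flows.
DISCLOSURES = {
    "cv_screening": "We use AI to screen CVs; a human reviews all decisions.",
    "chatbot": "You are chatting with an AI assistant.",
    "generated_image": "AI-generated",
    "internal_draft_reviewed": None,  # not user-facing, human in the loop
}

def disclosure_for(use_case: str) -> Optional[str]:
    """Return the disclosure line for a use case, or None if none is required.

    Unmapped use cases raise KeyError so a new AI feature fails loudly
    instead of silently shipping without a notice.
    """
    if use_case == "emotion_detection":
        raise ValueError("often prohibited under the AI Act; get legal advice first")
    return DISCLOSURES[use_case]

print(disclosure_for("chatbot"))  # You are chatting with an AI assistant.
```

Treating emotion detection as an error rather than a lookup entry mirrors the table: it is the one row where the right answer is to stop and ask a lawyer, not to pick wording.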

Common Traps, and How to Dodge Them

DIY SMB Compliance

  • Cost: a few hours of your time plus a €0 to €200 template pack

  • Proportional to actual risk of a 15-person team

  • You keep the knowledge in-house and understand what you signed

  • Ships in a week, not a quarter

Enterprise GRC Platform

  • Cost: €20,000 to €100,000 per year in licences and consulting

  • Designed for banks and hospitals, not 20-person recruiting agencies

  • Tooling outgrows team: the dashboard no one logs into

  • Ships in a quarter, outdated in a year

Five traps that hit small businesses harder than enterprises:

1. Treating compliance as a one-time project. A register you wrote in March and never updated is worse than no register, because it creates a false sense of security. Put the review on a quarterly calendar and keep it short enough that updating takes 30 minutes, not 3 days.

2. Assuming free-tier AI is out of scope. The AI Act does not care about your vendor's pricing plan. If Gemini writes a candidate rejection email, Gemini is in scope. Consumer accounts tend to be worse for compliance, not better, because of training-data defaults and missing DPAs. Shadow AI at small businesses is almost always a consumer-account story.

3. Writing a policy nobody reads. A 40-page AUP is worse than a 2-page one because compliance by avoidance is not compliance. Keep the tone conversational. Put one example per rule. Link out to deeper resources if someone wants them.

4. Forgetting the employee side. Your team is a data subject under GDPR. They have a right to know what AI touches their work, and especially what AI evaluates them. Employee trust in AI is not just nice-to-have; it becomes a GDPR + AI Act obligation the moment you use AI in performance reviews, scheduling, or disciplinary context.

5. Mistaking the 2027 deadline for a breather. August 2026 is the date that hits recruiters, HR tech users, and credit-adjacent businesses. If your AI does anything with hiring decisions, that is your deadline, not 2027. The full AI adoption change management plan covers how to sequence governance work so you are not building documents in a panic the week before.

The synthetic compliance trap

The worst compliance outcome is not having no documents. It is having documents that look polished but describe a process nobody actually follows. Authorities check whether your register matches your real tool usage, whether your AUP matches what people do on a Tuesday afternoon, whether disclosure notices actually appear in customer flows. If the paperwork says one thing and your reality says another, the paperwork makes things worse.

Start with a readiness assessment

Before you write the documents, map what you actually need. The AI readiness assessment tells you your deployer duties, which tier your current AI uses fall into, and the fastest path to each of the 5 documents.

Try It Free

AI Act compliance for a small business is a living practice, not a one-off project. The five documents are the floor. The culture of updating them is what turns floor into foundation. For a deeper look at the embedded-workflow pattern these documents support, see our companion guide on how small businesses actually use AI.

AI Act SMB compliance in 5 bullets

- You are likely a deployer. That triggers transparency, documentation, and human-oversight duties, scaled to your size.
- August 2026 is when high-risk obligations apply. Recruiting AI, credit AI, worker evaluation AI all fall in this tier.
- 5 documents cover 80% of SMB duties: AUP, AI register, DPIA-lite, disclosure notices, vendor log.
- Size the documents for your team: 2 to 4 pages each, not 40. Keep them readable in 5 minutes.
- Quarterly review beats one-off perfection. Schedule it, keep it short, actually do it.