EU AI Act Compliance Checklist for UK Small Businesses
A practical EU AI Act compliance checklist for UK small businesses. Find out exactly what documentation you need before the 2 August 2026 enforcement deadline.
By Clausely Team
What you need to know before you start
The EU AI Act's main enforcement deadline is 2 August 2026, when the bulk of its obligations become enforceable. If your UK business uses AI tools and has any connection to EU customers, staff, or users, you need to be compliant by then.
This checklist covers the core documentation and governance steps that most UK small businesses need to take. Work through each item and note what you have in place and what is still outstanding.
Step 1 — establish whether the EU AI Act applies to you
- [ ] Your business uses at least one AI tool (ChatGPT, Claude, Microsoft Copilot, Midjourney, GitHub Copilot, or any other AI-powered software) in a professional capacity.
- [ ] Your business has at least one of the following: EU-based customers, EU-based employees or contractors, EU-based users of your product or service, or content or outputs that reach EU audiences.
If both boxes apply, the EU AI Act applies to your business. Continue through the checklist.
If only the first box applies and you have no EU connection whatsoever, you are likely outside the Act’s current scope — but should still consider best-practice AI governance given UK legislation may follow.
Step 2 — identify your AI risk level
- [ ] Review the AI tools you use and the purposes for which you use them.
- [ ] Determine whether any of your AI use falls into a high-risk category under Annex III of the EU AI Act. High-risk categories relevant to UK SMEs include: recruitment and CV screening, candidate ranking or shortlisting, credit or financial risk assessment, access to essential services, and educational assessment or evaluation.
- [ ] If you have high-risk AI use, note the specific use case and the categories of people affected.
Step 3 — core documentation for all deployers
Every business that uses AI in an EU-facing context needs these two documents as a minimum.
- [ ] AI Acceptable Use Policy. A written policy governing which AI tools are approved, what they can be used for, what is prohibited, and who is accountable. This policy must be communicated to all staff who use AI tools.
- [ ] AI Literacy Policy (Article 4). A documented framework for ensuring staff who use AI systems have a sufficient level of AI literacy — understanding what the tools can and cannot do, the risks involved, and how to use them responsibly. Training records must be kept.
Step 4 — additional documentation for EU-facing businesses
If your business serves EU customers or has EU exposure, you also need:
- [ ] Article 50 Transparency Disclosures. Where you use AI to interact with customers, generate content, or make recommendations, you may need to disclose that AI was involved. This includes chatbot notices, AI-generated content labels, and email footer disclosures.
- [ ] Human Oversight SOP. A documented standard operating procedure setting out how human oversight is exercised over AI outputs before they are acted upon, shared externally, or used in decisions affecting individuals.
- [ ] AI Incident Response Procedure. A documented procedure for detecting, responding to, and reporting incidents involving AI systems — including data leaks, harmful outputs, bias events, and system failures.
- [ ] Vendor AI Risk Register. A register of all AI tools and vendors used, including their risk classification, data processing agreement status, certifications, and residual risk ratings.
Step 5 — additional documentation for high-risk AI use
If your business uses AI in a high-risk context (recruitment, credit, essential services, education), you also need:
- [ ] Fundamental Rights Impact Assessment (Article 27). A formal assessment of the impact of your high-risk AI system on the fundamental rights of affected individuals — covering discrimination risk, privacy, human dignity, and the right to effective remedy.
- [ ] AI Risk Management Plan (Article 9). A documented framework for identifying, analysing, evaluating, mitigating, and monitoring risks arising from your high-risk AI use throughout its lifecycle.
- [ ] Conformity Self-Assessment Checklist. An article-by-article self-assessment of your compliance with the EU AI Act’s requirements for high-risk AI deployers.
Step 6 — governance and accountability
- [ ] Designate an accountable person responsible for AI governance in your organisation — someone with the authority and competence to oversee AI use, approve new tools, and respond to incidents.
- [ ] Ensure your privacy notice and any candidate- or customer-facing information disclose your use of AI tools where required.
- [ ] Implement a process for reviewing and updating your AI governance documentation whenever you adopt new tools, change use cases, or when relevant legislation or guidance is updated.
- [ ] Set a calendar reminder to review all documentation annually and specifically before 2 August 2026.
How many items do you have in place?
If your answer is very few, you are not alone. Most UK small businesses have not yet started their EU AI Act compliance work — but the deadline is approaching fast.
Clausely generates all of the documents in this checklist, tailored to your specific business. Fill in a short intake form about your company, your AI tools, and your use cases. We generate your full compliance pack — editable Word documents, ready to sign — in under 10 minutes.
Recommended next step
Start with the Essentials intake.
The Essentials pack puts your baseline AI Acceptable Use Policy and AI Literacy Policy in place; you can then step up if your business has EU-facing or high-risk AI use.
Start your compliance pack from £399
This article was written with AI assistance and reviewed for accuracy against current UK and EU regulatory guidance. It does not constitute legal advice. If you require specific legal guidance, please consult a qualified solicitor.