Does the EU AI Act Apply to My UK Business?

The EU AI Act affects UK businesses that use AI and serve EU customers. Find out if it applies to you, what you need to do, and how to get compliant before 2 August 2026.

By Clausely Team

What is the EU AI Act?

The EU AI Act (Regulation (EU) 2024/1689) is the world’s first comprehensive legal framework for artificial intelligence. Most of its obligations become enforceable on 2 August 2026. If you’re a UK business owner who uses AI tools in your operations, you need to know whether it applies to you — and what happens if you’re not compliant.

The short answer: if your business uses AI and has any connection to EU customers, staff, or users, it almost certainly applies.

The EU AI Act is a regulation adopted by the European Parliament and the Council of the EU that sets out rules for how AI systems can be developed, deployed, and used. It categorises AI systems by risk level — from minimal risk (spam filters) to high risk (recruitment tools, credit scoring) to unacceptable risk (social scoring systems, which are banned outright).

For most UK small businesses, the relevant obligations fall under the deployer category. A deployer is any organisation that uses an AI system in a professional context — which includes using ChatGPT, Claude, Microsoft Copilot, or any other AI tool in your business operations.

Does Brexit mean it doesn’t apply to UK businesses?

This is the most common misconception. Brexit removed the UK from the EU’s legislative jurisdiction — but it did not remove EU law’s reach over businesses that serve EU markets.

The EU AI Act follows the same extraterritorial logic as the GDPR. It applies based on where the AI system’s output is used and who it affects, not where the business is incorporated. If your AI system affects EU users, EU customers, or EU employees — even indirectly — the Act applies to your use of that system.

Practically speaking, this means:

  • A UK recruitment agency using AI to screen candidates for EU-based roles is in scope
  • A UK SaaS company whose platform uses AI and has EU subscribers is in scope
  • A UK marketing agency using AI to produce content for EU clients is in scope
  • A UK healthcare provider using AI tools with no EU-facing activity is likely out of scope

If you’re unsure, the safer assumption is that the Act applies. The fines for non-compliance are significant — up to €35 million or 7% of global annual turnover for the most serious breaches.

What does the EU AI Act actually require?

For most UK SMEs deploying AI tools (rather than building them), the core obligations are:

  1. AI Literacy (Article 4) — You must ensure that staff who use AI systems have a sufficient level of AI literacy, meaning they understand what the tools can and cannot do, the risks involved, and how to use them responsibly. This needs to be documented.
  2. Acceptable Use Policy — You need a written policy governing how AI tools are used in your business, which tools are approved, what they can be used for, what’s prohibited, and who is accountable.
  3. Transparency Obligations (Article 50) — Where you use AI to interact with customers, generate content, or make recommendations, you may need to disclose that AI was involved. This includes chatbots, AI-drafted emails, and AI-generated reports.
  4. Human Oversight — For higher-risk AI uses, particularly in recruitment, HR, and customer-facing decisions, you need documented human oversight procedures. AI must assist human decisions, not replace them without accountability.
  5. Fundamental Rights Impact Assessment (Article 27) — If your AI use falls into certain high-risk categories, you may be required to complete a Fundamental Rights Impact Assessment before deploying the system. This obligation chiefly applies to deployers that are public bodies, provide public services, or use AI for credit scoring or insurance risk assessment.

What counts as high-risk AI use?

The EU AI Act’s Annex III lists specific high-risk categories. For UK SMEs, the most relevant are:

  • Recruitment and HR — any AI used to screen CVs, rank candidates, or assist in hiring decisions
  • Access to essential services — AI used in credit, insurance, or financial decisions
  • Education — AI used to assess or evaluate students
  • Law enforcement and justice — AI used in legal or regulatory contexts

If your business uses AI in any of these areas, you face additional compliance obligations beyond the baseline.

What happens if I don’t comply?

Enforcement begins 2 August 2026. The EU AI Act creates a tiered penalty structure:

  • Up to €35 million or 7% of global turnover for prohibited AI practices
  • Up to €15 million or 3% of global turnover for other violations
  • Up to €7.5 million or 1.5% of global turnover for providing incorrect information to regulators

For a UK business with EU customers, enforcement action can be taken by the relevant EU member state’s national supervisory authority. You don’t need to be incorporated in the EU to be subject to enforcement.

What should I do now?

The deadline is 2 August 2026. Here’s what to do:

  1. Audit which AI tools your business uses and whether any EU-facing activity is involved.
  2. Identify whether any of your AI use falls into a high-risk category.
  3. Get the documentation in place — AI Acceptable Use Policy, AI Literacy Policy, transparency disclosures, and human oversight procedures at minimum.
  4. If you have high-risk AI use, complete a Fundamental Rights Impact Assessment and Risk Management Plan.

Get compliant before 2 August 2026.

Most UK SMEs don’t have a compliance team or a law firm on retainer. That’s exactly why Clausely exists.

Clausely generates tailored EU AI Act compliance document packs for UK small businesses. Fill in a short form about your business and receive editable Word documents — ready to sign — in under 10 minutes.

Recommended next step

Start with the Professional intake.

This article points most businesses toward the Professional pack for Article 50 disclosures, human oversight, and the core AI governance documents you’ll likely need next.

Start the Professional Pack for £899

This article was written with AI assistance and reviewed for accuracy against current UK and EU regulatory guidance. It does not constitute legal advice. If you require specific legal guidance, please consult a qualified solicitor.