EU data residency and AI tools: what every Irish SME needs to know

Eileen Weadick

Founder, Clear Gate Systems • 10 Apr 2026 • 8 min read

There is a rule that sits quietly underneath the EU AI Act, one that most AI governance conversations skip straight past. It is not the headline obligation about audit logs or human oversight. It is more foundational than that. It governs where your data lives, how it moves, and whether the AI tools your business uses every day are operating inside the law or silently outside it. The rule is EU data sovereignty. And for Irish SMEs using commercial AI tools (Microsoft Copilot, ChatGPT, Google Gemini, recruitment platforms, loan decisioning software), it is now one of the most practical compliance questions you face.

This article explains what EU data residency means for AI tools in practice, why vendor GDPR claims are not sufficient assurance, and what three criteria any compliant AI tool must meet.

What EU data residency actually means (and what it does not)

The EU does not legally require you to store data within EU borders. What it prohibits is transferring personal data outside the European Economic Area without a valid legal mechanism in place. The distinction is clear: GDPR is a transfer law, not a localisation law.[5]

That said, the practical reality for AI systems is more complex. When your staff use an AI tool that processes prompts, queries, or document content containing personal or sensitive data, that processing is a data transfer if it occurs on servers outside the EEA, even temporarily. This becomes acute with generative AI. A prompt containing a client's name, a patient's details, or an employee's performance history can trigger GDPR's cross-border transfer obligations in real time, without anyone in your business being aware.

The EU AI Act adds obligations on top of GDPR. Under Article 10, providers of high-risk AI systems must ensure training data is governed, documented, and traceable.[4] As a deployer, your obligation is to verify that any high-risk provider you use can demonstrate that compliance. Separately, if an AI tool's inference-time processing involves your personal data and occurs outside the EEA without adequate safeguards, that triggers GDPR transfer obligations. Fines under the EU AI Act reach 7% of global annual turnover for prohibited AI practices, exceeding even GDPR's 4% cap.[2]

In summary

Under GDPR, AI processing of personal data outside the EEA is a transfer that requires safeguards such as Standard Contractual Clauses. The AI Act adds specific duties for high-risk AI providers and deployers, but defers to GDPR on data flows.

The Microsoft Copilot problem: a live example

In April 2026, a concrete illustration of exactly this problem emerged. Microsoft quietly enabled "Flex Routing" for all EU/EFTA Microsoft 365 Copilot tenants: a configuration that allows Copilot's LLM inferencing to occur outside the EU Data Boundary during periods of peak demand.[1] The setting goes live by default on 17th April 2026, meaning every Irish business using Copilot will have their AI processing temporarily shifted outside EU jurisdiction, with no opt-in required.

Microsoft's own documentation confirms that LLM inferencing calls (the process of generating a response from a prompt) may route to servers in the US, Canada, or Australia.[1] Microsoft maintains that data at rest continues to reside within the EU. But the processing (the actual AI inference) may now occur outside it. For organisations subject to GDPR, DORA (the Digital Operational Resilience Act, which applies to financial entities), or sector-specific data residency rules in financial services or healthcare, the compliance question is pointed: can you demonstrate that no personal data was processed outside the EU Data Boundary during a given period? With Flex Routing enabled and not reviewed, the honest answer is no.

The mitigation is straightforward. Sign into the Microsoft 365 Admin Centre using an account with the AI Administrator role. Navigate to Copilot > Settings > Flexible inferencing during peak load periods and select Do not allow flex routing. The action takes minutes. But the fact that it needs to be taken at all illustrates the broader point: EU data sovereignty for AI tools is not a configuration that vendors manage for you. It requires active, documented governance from within your organisation.

It is also worth noting that the Data Protection Commission has ongoing scrutiny of Microsoft Ireland's EU data practices. No specific guidance on Flex Routing has been issued as of publication.

In summary

A vendor shipping a default configuration that routes your data outside the EU is not a rare event. It happened with Microsoft Copilot in April 2026. The configuration can be corrected, but only by someone in your organisation who knows to look for it.

Why this matters specifically for Irish SMEs

Ireland has already designated approximately 15 national competent authorities for EU AI Act enforcement under Statutory Instrument 366/2025, effective from late 2025.[6] They include the Data Protection Commission for data governance, the Central Bank for financial AI, and the Workplace Relations Commission for HR AI. The Regulation of Artificial Intelligence Bill 2026, currently at General Scheme stage and not yet enacted, would give these authorities specific enforcement powers and establish an AI Office of Ireland by 1st August 2026, one day before high-risk AI obligations become enforceable.[3]

These authorities would have powers that extend explicitly to online tools and digital interfaces. The proposed enforcement framework would allow authorities to conduct unannounced inspections and require access to source code. They could also order the removal of non-compliant AI functionality from apps and platforms used by Irish organisations.[2] For an SME that has never formally assessed which AI tools it uses, the compliance risk is real. Deployers of high-risk AI systems face additional obligations on top of data residency, including Fundamental Rights Impact Assessments.

The practical implication is this: before you can build an AI Quality Management System (the governance framework required under Article 17 of the EU AI Act for high-risk AI providers), before you can design audit logs or human oversight mechanisms, you need to answer a more basic question. Are the AI tools you use permitted to process this data in the first place? If the answer is uncertain, your governance architecture is built on sand.

The EU AI Act itself applies directly to Irish organisations whether or not the Bill passes into law. The authorities are already designated. The Bill would provide the specific enforcement powers and establish the AI Office; without it, those operational mechanisms are not yet in place.

In summary

Ireland is establishing one of the most structured AI enforcement frameworks in the EU. The organisations best placed to respond are those that have already answered the foundational data residency question.

The three criteria for a compliant AI tool

A Clear Gate Systems Green List assessment is not a marketing category. It is a structured framework that determines whether a commercial AI tool meets the minimum requirements for compliant use under the EU AI Act, GDPR, and sector-specific regulatory obligations. A tool meets the standard by satisfying all three of the following criteria.

Criterion 1: EU Data Residency Guarantee. The tool must process and store data within the EEA, or the vendor must provide a fully documented legal mechanism (Standard Contractual Clauses, Binding Corporate Rules, or an EU adequacy decision) for any data that leaves the EEA. Vague contractual assurances are insufficient. The vendor must be able to confirm in writing where inference-time processing occurs, not just where data is stored at rest.

Criterion 2: No Training on Customer Data. The tool must not use customer data, prompts, queries, or outputs to train or improve its underlying model without explicit, documented, revocable consent. This flows directly from GDPR Article 25 (data protection by design)[5] and the EU AI Act's Article 10 data governance obligations.[4] Tools that use client content to improve their models are not eligible unless a Zero Data Retention (ZDR) agreement is in place: a contractual commitment from the vendor that no data from your prompts or queries is stored or used after the session ends.

Criterion 3: Signed Data Processing Agreement. The vendor must be willing and able to sign a GDPR-compliant DPA that includes EU AI Act-specific clauses: categories of data processed, purpose and legal basis, sub-processor locations and restrictions, retention and deletion timelines, and incident notification obligations. A vendor that declines to sign a DPA, or provides one that predates the EU AI Act, does not meet this criterion.

In summary

Three criteria, all three required. Meeting two out of three is not sufficient for regulated contexts.
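The all-three-required logic can be made concrete as a simple data check. The sketch below is illustrative only: the field names and the `meets_green_list` function are hypothetical constructs for this article, not part of any vendor's API or the formal Green List assessment process, and a real assessment rests on written vendor confirmations, not a script.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class AIToolAssessment:
    """Hypothetical record of a vendor's written answers to the three criteria."""
    name: str
    inference_in_eea: bool                # Criterion 1: where inference-time processing occurs
    transfer_mechanism: Optional[str]     # e.g. "SCCs", "BCRs", "adequacy" if data leaves the EEA
    trains_on_customer_data: bool         # Criterion 2: model training on customer content
    zdr_agreement: bool                   # Zero Data Retention agreement in place
    dpa_signed: bool                      # Criterion 3: GDPR-compliant DPA signed
    dpa_covers_ai_act: bool               # DPA includes EU AI Act-specific clauses


def meets_green_list(tool: AIToolAssessment) -> bool:
    """All three criteria must hold; two out of three is a fail."""
    residency_ok = tool.inference_in_eea or tool.transfer_mechanism is not None
    training_ok = (not tool.trains_on_customer_data) or tool.zdr_agreement
    dpa_ok = tool.dpa_signed and tool.dpa_covers_ai_act
    return residency_ok and training_ok and dpa_ok
```

Note that a documented transfer mechanism only substitutes for EEA residency on Criterion 1; nothing substitutes for the signed, AI Act-aware DPA on Criterion 3.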

A note on "GDPR compliant" vendor claims

One of the most important things to understand before assessing any AI tool is that vendor GDPR compliance claims are not verification. Every major AI vendor states they are GDPR compliant. What this typically means is that they have signed the European Commission's Standard Contractual Clauses or self-certified under the EU-US Data Privacy Framework, the legal minimum mechanisms for transferring data to the US. It does not mean their data processing meets the additional requirements of the EU AI Act. It does not mean their inference-time processing stays in the EU. And it does not mean they will not change their data routing configuration without prominent notice, as Microsoft's April 2026 Flex Routing update demonstrated.

Tool assessments should be reviewed at least every six months as vendor policies change. Any tool used in a regulated context should have a signed, current DPA in place. DPAs written before 2025 are unlikely to address AI-specific obligations adequately.

In summary

"GDPR compliant" typically means the vendor has a minimum legal transfer mechanism in place. It does not indicate that their AI processing meets the full requirements of the EU AI Act or stays within EU borders.

How to audit your AI tools for EU data residency compliance

If you are not certain which AI tools your organisation is using, where their processing occurs, or whether your current DPAs address AI-specific obligations, the right starting point is a structured AI system inventory.

If you want to begin before engaging outside help, start by listing every AI tool your organisation uses and identifying which vendor handles the data processing. That inventory is the foundation for everything else.
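A starting inventory can be as simple as one row per tool, capturing the residency questions a later assessment will need answered. The sketch below is a minimal illustration; the entries and column names are placeholders, not statements about any vendor's actual configuration.

```python
import csv
import io

# Minimal AI tool inventory: one row per tool, with the questions that feed
# a later compliance assessment. All entries are illustrative placeholders.
inventory = [
    {"tool": "Microsoft 365 Copilot", "vendor": "Microsoft",
     "data_types": "emails; documents; chat prompts",
     "inference_location": "unknown - check Flex Routing setting",
     "dpa_status": "needs review"},
    {"tool": "ChatGPT", "vendor": "OpenAI",
     "data_types": "ad-hoc staff prompts",
     "inference_location": "unknown",
     "dpa_status": "none on file"},
]


def to_csv(rows):
    """Serialise the inventory as CSV so it can be shared with an assessor."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()


print(to_csv(inventory))
```

Even an "unknown" in the inference location column is useful: it tells you exactly which vendor question to ask next.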

A Clear Gate Systems EU AI Act Compliance Audit identifies every AI system your organisation is using, classifies each one by risk level, and assesses your data governance position against GDPR and EU AI Act requirements. It produces a prioritised action plan that your team can implement.

For a full overview of what the EU AI Act requires from Irish businesses and when, see what the August 2026 EU AI Act deadline means for Irish SMEs.

Book a discovery call to discuss what this would involve for your organisation.

FAQ


What is Microsoft Copilot Flex Routing?
Flex Routing is a Microsoft 365 Copilot configuration, enabled by default from 17th April 2026 for EU and EFTA tenants, that allows Copilot's LLM inferencing to occur outside the EU Data Boundary during periods of peak demand. Data at rest stays in the EU, but the actual AI processing may be routed to the US, Canada, or Australia.
How do I disable Flex Routing for Microsoft 365 Copilot?
Sign into the Microsoft 365 Admin Centre using an account with the AI Administrator role. Navigate to Copilot > Settings > Flexible inferencing during peak load periods and select Do not allow flex routing. This setting must be configured before 17th April 2026 to prevent Flex Routing activating by default.
Does using a GDPR-compliant AI tool mean my data stays in the EU?
No. A vendor claiming GDPR compliance typically means they have signed Standard Contractual Clauses or self-certified under the EU-US Data Privacy Framework, the legal minimum mechanisms for transferring data to the US. These mechanisms permit data transfers outside the EU; they do not prevent them. For regulated Irish SMEs, the relevant question is whether the vendor can confirm where inference-time processing occurs, not just where data is stored at rest.
Which Irish businesses are most at risk from EU data residency issues with AI tools?
Regulated Irish SMEs in financial services, insurance, health technology, and HR technology carry the highest exposure. These sectors handle the categories of data most often subject to sector-specific residency requirements, and their AI systems are most likely to fall within the EU AI Act's Annex III high-risk categories. DORA-regulated entities face additional data governance obligations on top of GDPR and the AI Act.
What is the difference between data residency and GDPR compliance for AI tools?
GDPR is a transfer law, not a localisation law. It regulates how data can be moved outside the EEA, not where it must be stored. Data residency is about where processing actually occurs. An AI tool that processes EU personal data on US servers can be GDPR compliant if it has Standard Contractual Clauses in place, but that data is not residency-protected. For AI systems, the distinction matters: inference-time processing in the US is a transfer, even if data at rest is held in the EU.

Clear Gate Systems provides technical governance architecture. This article is for informational purposes only and does not constitute legal advice. Clients requiring legal interpretation of the EU AI Act or other regulation should engage a qualified legal practitioner.