

As AI Goes Global, So Do AI Regulations
Key Takeaway: Global AI regulations are constantly evolving, with the EU leading the way through DORA, the AI Act, and the Digital Services Act. These rules have extraterritorial reach, creating compliance challenges for businesses worldwide. Staying informed, implementing proactive governance, and partnering with experts mitigate risk and enable you to harness AI responsibly.
Every company using AI is on the regulator’s radar. Three new EU regulations signal a new regulatory paradigm. Governments worldwide are applying new AI laws and regulations extraterritorially and retroactively, often with hefty fines attached. It was once enough to comply with your own country’s regulations; those days are at an end, and a new approach to compliance is now required.
Many organizations still believe their geographic distance keeps EU regulators at bay, but today’s reality tells a different story. A closer look at these emerging EU regulations clarifies the key compliance approaches companies should adopt to adhere to these new requirements and prepare for borderless AI regulation.
Three Key EU Regulations Driving Global Compliance Challenges
The EU is leading the privacy and AI regulatory charge. Three new acts, each targeting various aspects of technology and compliance, together constitute a new framework with far-reaching implications, imposing a complex web of obligations and stiff penalties.
Digital Operational Resilience Act
The Digital Operational Resilience Act (DORA) addresses the digital resilience of financial entities within the EU. It establishes a framework for managing Information and Communication Technology (ICT) risks and helps ensure that institutions can withstand, respond to, and recover from operational disruptions, including cyber-attacks.
Notably, DORA applies to more than just financial institutions. It also extends to their third-party suppliers and agents. For example, if a financial institution uses an AI system like ChatGPT for critical business functions, both the institution and the AI provider could fall under DORA’s scope.
The EU AI Act
The EU AI Act (EUAIA) introduces a comprehensive framework for governing AI development and use. It takes a risk-based approach, categorizing AI systems into four levels: minimal, limited, high, and unacceptable. Requirements vary by risk level, and the Act includes specific rules for general-purpose AI models such as ChatGPT. High-risk systems are subject to heightened requirements, such as:
- A risk management system throughout the high-risk AI system’s lifecycle.
- Technical documentation that demonstrates compliance and provides authorities with the information to assess that compliance.
- Record-keeping capabilities that automatically log events relevant to identifying national-level risks.
- A quality management system to ensure compliance.
The unacceptable risk level covers systems considered a clear threat to people’s safety, livelihoods, and rights. The AI Act prohibits eight practices, including:
- Social scoring.
- Individual criminal offence risk assessment or prediction.
- Emotion recognition in workplaces and educational institutions.
- Biometric categorization to deduce certain protected characteristics.
Digital Services Act
The Digital Services Act (DSA) focuses on moderating online content, giving regulators authority to assess and remove content they deem false or harmful. Its subjective criteria raise complex questions:
- Who is responsible for reviewing AI content: the prompter, the poster, the model owner, or the data source?
- If your content is used by an AI model without your intent, can you still be liable?
Penalties for violations of these regulations can reach hundreds of millions of dollars, creating strong incentives for platforms to monitor and restrict content.
The Long Arm of the EU
Nothing is more unsettling for compliance professionals than uncertainty. Whether being outside the EU constitutes a safe harbor remains to be seen; in the meantime, assuming that your location puts you beyond the EU’s reach could prove a critical misstep. What is clear is that all three regulations have extraterritorial reach. Under the DSA, for example, if an EU resident can access your content, the Act applies. The EUAIA likewise applies even if your company isn’t based in the EU.
In today’s global digital environment, simply having content accessible in the EU can trigger compliance obligations. Pushback exists, but for now, businesses must comply or face significant risk.
Practical Steps for Compliance
Despite the complexity, businesses can take several proactive measures to ensure adherence to emerging global regulations.
- Stay educated by leveraging continuing legal or professional training programs focused on AI and compliance to keep pace with evolving requirements.
- Partner with a trusted supplier who can vet technology and confirm adherence to regulatory standards, reducing risk and ensuring accountability.
- Appoint a dedicated Data Privacy Officer (DPO) to monitor compliance and manage privacy initiatives.
- Engage local legal privacy counsel in the jurisdictions where you operate or where your data may be accessed, as regional expertise is critical for interpreting nuanced regulations.
- Implement and enforce corporate policies that clearly address AI use and data handling, ensuring they are communicated, monitored, and updated regularly.
Finally, familiarize yourself with client and supplier policies. Asking critical questions helps you avoid conflicts, maintain transparency, and build trust across all relationships:
- Does the LLM train on client data?
- Where is data stored and accessed?
- What security safeguards exist?
- Is the model private or public?
- Can the system detect hallucinations and cite sources?
For legal teams, managed eDiscovery services offer a strategic advantage. Centralizing data storage reduces supplier risk, improves control over security settings, and enables better oversight of new tools and features. This approach helps address overlapping compliance requirements across multiple regulations.
Proceed With Clarity
Technology is evolving rapidly, and regulation is racing to keep up. The EU’s DORA, AI Act, and DSA are just three examples of a growing global trend toward stricter oversight. Businesses face unprecedented opportunities and risks with AI adoption. Staying informed, implementing strong governance, and partnering with experts are essential steps to navigate this landscape while harnessing the benefits of automation.

Brandon Hollinder, Vice President, eDiscovery and Cyber Solutions
As Vice President, eDiscovery and Cyber Solutions at Epiq, Brandon Hollinder partners with clients to ensure their solutions are effectively designed and expertly implemented. With over 15 years of experience in eDiscovery and cyber solutions, Brandon leads Epiq’s go-to-market strategy for its eDiscovery Managed Services and Cyber Incident Response business lines.
The contents of this article are intended to convey general information only and not to provide legal advice or opinions.