

How To Escape AI Pilot Purgatory
A Q&A Guide for Corporate Legal Departments Ready To Move From Experiment to Execution
Key Takeaway: The overwhelming majority of legal department AI pilots stall out. For AI to deliver on its promise, legal operations and department leaders need to understand the key steps to cross the chasm between piloting AI and adopting it.
Legal departments worldwide are buying AI, yet adoption still lags. Nearly 50% of in-house legal teams report being in the exploration phase, but most pilots never graduate to scaled deployment. Industry research puts the failure rate of generative AI pilots at roughly 95%.
The technology works, so why do legal organizations continue to struggle to move from piloting it to transformative, impactful use? Simply put, they're approaching things in the wrong order: trying to prove value without first retooling how work gets done. They box AI into narrow success criteria, limit access to a handful of users, and test against one-off workflows that never reveal what full adoption would look like. Below are common questions from legal departments working to deploy AI successfully.
Our AI pilot showed promising results. Why hasn’t it scaled?
Promising results in a controlled setting and sustainable adoption are two different things. Most pilots are designed to validate a tool’s capability. Can it summarize a contract? Flag a risk clause? Those questions are answered quickly.
However, the questions that determine whether AI sticks are harder. Who owns the workflow change? How are we measuring behavioral shift? What does “good” look like at the team level? Those considerations rarely make it into the pilot design. Scaling requires attention to change management, A/B testing of workflows, and honest reviews of end users’ daily habits. Without that rigor, teams are pushing a proof of concept rather than a solution.
We’ve given our team access to AI tools, but usage is low. What are we missing?
Access and adoption are not the same thing. Giving someone a login is one thing; giving them a reason to work differently is another. With 60% of legal professionals citing “lack of trust in AI outputs” as a major barrier, making a tool available and then relying on the assumption that “if you build it, they will come” is a recipe for shelfware.
The same gaps appear again and again: no guided use cases mapped to day-to-day work, absent or half-hearted executive sponsorship, and no feedback loops that let users shape how the tool evolves. One of the enduring lessons from all technological inflection points is that people don’t adopt tools because you tell them to. They adopt tools that make their Tuesday afternoon easier.
How do we measure whether AI adoption is actually working?
Measuring the adoption and impact of AI is where most organizations trip up. They default to tool-level metrics: queries run, documents processed, time saved per task. These are easy to measure, and they have a place, but they only tell you whether someone opened the tool. What matters, and what they don’t measure, is whether the work changed.
An AI Adoptability Index is a quantifiable framework that measures your organization’s actual capacity to adopt and sustain AI across legal operations. Readiness is scored across five dimensions: workflow integration maturity, change management infrastructure, data governance and quality, user proficiency and confidence, and leadership alignment. A low score on any single dimension can stall an otherwise promising initiative. The Index tells you where friction lives and what to address before you spend another dollar on technology.
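As a rough illustration, the scoring logic behind such an index can be sketched in a few lines of code. The five dimensions come from the description above; the 1-to-5 scale, equal weighting, and stall threshold are illustrative assumptions, not a prescribed methodology:

```python
# Sketch of an AI Adoptability Index scorer.
# Assumptions: each dimension is rated 1-5, all dimensions are
# weighted equally, and a score of 2 or below flags a friction point.

DIMENSIONS = [
    "workflow_integration_maturity",
    "change_management_infrastructure",
    "data_governance_and_quality",
    "user_proficiency_and_confidence",
    "leadership_alignment",
]

def adoptability_index(scores: dict[str, int], stall_threshold: int = 2) -> dict:
    """Average the five dimension scores and flag any dimension
    low enough to stall an otherwise promising initiative."""
    missing = [d for d in DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"Missing dimension scores: {missing}")
    overall = sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)
    friction = [d for d in DIMENSIONS if scores[d] <= stall_threshold]
    return {"overall": overall, "friction_points": friction}

baseline = adoptability_index({
    "workflow_integration_maturity": 3,
    "change_management_infrastructure": 2,
    "data_governance_and_quality": 4,
    "user_proficiency_and_confidence": 3,
    "leadership_alignment": 5,
})
print(baseline)  # overall 3.4, with change management flagged as friction
```

The point of the sketch is the shape of the output: a single headline number for trend tracking, plus the specific low-scoring dimensions that tell you where to invest before spending more on technology.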
What can we do in this quarter to make progress?
Assign accountable ownership. Every AI initiative needs a named owner with operational authority, not an innovation committee. If no single person with real decision-making power has skin in the game, the pilot drifts.
Set guardrails, then widen the aperture. Define at the outset what’s in and out of bounds. Then set your team loose to explore within those boundaries. Overly restrictive pilots kill the experimentation that drives adoption.
Run a baseline AI Adoptability Index assessment. Without a baseline, every future investment is a guess. Before you proceed, score where you stand today across the five dimensions above.
Is the technology ready for legal workflows?
Yes, and that’s what makes this conversation difficult; it takes the technology excuse off the table. Modern AI platforms are demonstrating that they accelerate contract review, cut privilege analysis time, and support knowledge management at a level that would have seemed absurd five years ago. The bottleneck is the gap between what the technology can do and what the organization is willing to change.
Successful implementations tend to allocate roughly 10% of their effort to algorithms, 20% to infrastructure, and 70% to people and process retooling. The organizations breaking out of pilot mode are pouring energy into governance, training, and alignment.
So where does that leave you?
Pilot purgatory is a leadership problem, not a software problem. Getting out requires being willing to stop tinkering with the pilot and start transforming the work. Clear guardrails, accountable ownership, a structured AI Adoptability Index, and real hands-on training move AI from “interesting experiment” to “how we work now.”
The legal departments that figure this out won’t be the ones with the biggest budgets. They’ll be the ones that change how work gets done, and then measure how much the work changed.
Learn more about Epiq AI Enablement for Corporate Legal Departments.

Kenzo Tsushima, Principal of Managed Solutions and AI Programs, Legal Solutions
Kenzo Tsushima partners with corporate legal department and law firm leaders to scale legal throughput via AI consulting, enablement, and implementation. He also works alongside legal operations and business stakeholders to design, optimize, and deploy solutions that integrate people, process, technology, and data, with a focus on outcome-driven legal functions and emerging technology that produce measurable business value.
The contents of this article are intended to convey general information only and not to provide legal advice or opinions.