- The Foundation from Limestone Digital
The ROI Problem: AI Adoption Is Scaling Faster Than Value
Episode 18
Hi there,
Over the past year, AI adoption has accelerated across industries. First, the focus was on models. Then, on data. Now, a different question is emerging:
Is AI actually delivering measurable value?
While more organizations than ever are deploying AI, the results are less clear.
Adoption is increasing.
Capabilities are improving.
But business impact is not scaling at the same pace.
This week, we look at why the gap between AI adoption and ROI is becoming one of the defining challenges of this phase.
Inside the Issue
Adoption vs value: what the data shows
Why most AI initiatives stall before ROI
The difference between deployment and impact
Where value is actually being captured
What this means for teams building AI systems
Adoption Is Not the Problem
A growing share of organizations now use AI in at least one business function, with generative AI adoption accelerating across industries.
But adoption alone is no longer a meaningful signal.
At the same time, only a smaller subset of organizations reports measurable financial impact, and an even smaller group is able to scale that impact across multiple functions.
In other words:
AI is being deployed widely — but not yet translated into consistent business value.
Where the ROI Gap Comes From
The disconnect is consistent across multiple sources.
Deloitte highlights that many organizations remain in pilot or early deployment stages, with ROI still difficult to quantify or validate.
Gartner similarly points out that many AI initiatives do not progress into production environments that deliver sustained value.
At a macro level, Stanford HAI shows that while investment and technical capability continue to increase, economic impact remains uneven and difficult to attribute across industries.
This suggests that the challenge is not access to AI, but the ability to operationalize it effectively.
Deployment vs Impact
One of the key distinctions emerging from recent research is the gap between deploying AI and generating measurable business outcomes.
Many teams successfully integrate models into products or internal workflows.
However, fewer organizations manage to connect AI outputs directly to decision-making, redesign workflows around AI capabilities, or define clear performance metrics linked to financial outcomes.
As a result, AI often remains an enhancement — rather than a driver of measurable change.
Where Value Is Actually Being Captured
Across the data, a consistent pattern appears.
Organizations that report meaningful ROI tend to:
prioritize a limited number of high-impact use cases
integrate AI into core operational workflows
invest in data pipelines, system integration, and governance
establish ownership and accountability for outcomes
Findings from McKinsey & Company indicate that value is typically concentrated in organizations that move beyond isolated use cases and embed AI into broader business processes.
This reinforces a key point:
AI does not create value in isolation. Systems and workflows do.
Why This Is Not Just a Maturity Issue
It may be tempting to view the ROI gap as a temporary stage of adoption.
However, the pattern suggests something more structural.
Organizations often underestimate the complexity of integrating AI into real operational environments. Scaling requires coordinated changes across workflows, governance, and measurement — not just improvements in models.
This shifts the perspective:
The primary challenge is not technical capability, but system-level execution.
What This Means for Teams
For teams building AI-enabled systems, several practical implications follow:
defining success in terms of outcomes, not usage
designing systems around AI, not just with AI
establishing measurement early
planning for continuous iteration after deployment
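The first and third points can be made concrete with a small sketch. The example below (all names and numbers are hypothetical, invented for illustration) contrasts a usage metric with an outcome metric for an AI-assisted support workflow: adoption can look healthy while the outcome the business actually cares about tells a different story.

```python
from dataclasses import dataclass

@dataclass
class Ticket:
    ai_assisted: bool      # did the agent use the AI suggestion?
    resolved: bool         # was the ticket resolved?
    handle_minutes: float  # time spent handling the ticket

# Hypothetical sample data for illustration only.
tickets = [
    Ticket(True, True, 12.0),
    Ticket(True, False, 30.0),
    Ticket(True, True, 10.0),
    Ticket(False, True, 18.0),
    Ticket(False, True, 20.0),
]

# Usage metric: share of tickets where the AI feature was used.
usage_rate = sum(t.ai_assisted for t in tickets) / len(tickets)

# Outcome metric: average handle time on resolved tickets,
# compared with and without AI assistance.
def avg_handle(assisted: bool) -> float:
    matched = [t.handle_minutes for t in tickets
               if t.resolved and t.ai_assisted == assisted]
    return sum(matched) / len(matched)

print(f"usage rate: {usage_rate:.0%}")
print(f"avg handle time (AI-assisted): {avg_handle(True):.1f} min")
print(f"avg handle time (unassisted): {avg_handle(False):.1f} min")
```

The point is not the metric itself but the habit: instrumenting the outcome from day one, so that "usage is up" can never stand in for "value is up."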
The systems that generate value are not the ones that demonstrate capability, but the ones that are embedded into how work actually gets done.
Closing
AI capabilities are advancing rapidly.
But the ability to convert those capabilities into measurable outcomes is developing more slowly.
This gap is not about access to technology.
It is about how systems are designed, integrated, and operated in real environments.
If AI is not delivering value, the issue is rarely the model itself.
It is how the model fits into the system around it.
Working With AI in Production
At Limestone Digital, we work with teams building AI systems that operate in real environments — with real constraints, real data, and real users.
That work is not about models in isolation.
It is about designing systems that produce consistent, measurable outcomes.
If you are navigating similar challenges, we are always open to continuing the conversation.
Sources & Further Reading
McKinsey & Company — The State of AI
https://www.mckinsey.com/capabilities/quantumblack/our-insights/the-state-of-ai
Deloitte — State of AI in the Enterprise
https://www.deloitte.com/us/en/what-we-do/capabilities/applied-artificial-intelligence/content/state-of-ai-in-the-enterprise.html
Gartner — Hype Cycle for Artificial Intelligence
https://www.gartner.com/en/articles/hype-cycle-for-artificial-intelligence
Stanford HAI — AI Index Report 2025
https://hai.stanford.edu/ai-index/2025-ai-index-report
Thank you for joining us for another edition of The Foundation.
P.S. We want to make sure this newsletter hits the mark. So reply to this email and let us know what you think.