
Generative AI may feel like it’s been around forever, but we’re just getting started.
It’s been a little less than three years since ChatGPT arrived. We’re now at the point where businesses must sharpen their strategies and move beyond first-generation adoption. The most pressing challenge isn’t adding tools or even deploying them; it’s developing the observability, sophistication, and strategy to capture AI value that is often widely distributed across the organization.
Estimates vary, but larger businesses are spending anywhere from $600 to $1,400 per employee on AI tools this year. Yet 80% report no significant bottom-line impact from generative AI use.
This isn’t a technology failure—it’s a measurement failure. More specifically, it’s a sign that organizations are still struggling to see the value that AI is creating.
The problem is not, in fact, that AI doesn’t create value. It’s that most AI usage is invisible to the systems designed to measure that value, and a sound strategy going forward creates ways for your business to observe, measure, and take advantage of AI’s potential.
The Visibility Gap
McKinsey’s latest State of AI report shows 78% of organizations use AI in at least one business function. There’s a different struggle, though, that many larger enterprises and Fortune 500 companies encounter: far more AI usage (by some estimates the overwhelming majority) happens outside of IT visibility.
Employees often use personal accounts for AI tools, and AI features are quietly embedded in existing SaaS platforms. New tools are discovered and shared organically across teams. And IT sees very little of it, if any.
The result is a massive gap between what leadership thinks is happening and what is actually happening. This organic, distributed adoption represents AI maturation in action, but it challenges traditional enterprise management approaches and raises hard questions about which company information stays private and which is being fed into AI training models.
One healthcare system I encountered, for instance, approved five AI tools but discovered 34 in active use. Dig deeper and you find this isn’t an outlier: it’s an indicator of organizations transitioning from centralized AI experimentation to decentralized AI integration.
The ROI Measurement Problem
Traditional ROI measurement assumes you can track inputs and outputs. But how do you measure the business impact of AI tools you don’t know exist? This represents a classic challenge facing any organization moving from pilot-phase technology adoption to enterprise-scale integration. Which is exactly where we are with generative AI.
Companies meticulously track ChatGPT Enterprise usage while missing the GitHub Copilot alternatives, document AI tools, and embedded intelligence that employees actually use to get work done. They optimize spending on what’s visible, yet estimates of the share of AI usage that’s “invisible” to IT run as high as 89%. And it’s this invisible portion that drives or destroys actual productivity.
This creates a paradox.
The more effective your AI adoption, the less visible it becomes to measurement systems. Success looks like failure in executive dashboards. Organizations stuck in first-generation measurement and observability will systematically underestimate their AI ROI and remain oblivious to much of the activity, while more mature competitors develop comprehensive visibility strategies.
The Three-Layer Solution
Mature AI organizations recognize that sustainable competitive advantage requires visibility across three layers:
Discovery Layer: What AI tools are employees actually using? Most companies discover 5-10x more tools than they’ve approved when they implement comprehensive monitoring.
Usage Layer: How are these tools being used? A sales team might use AI for email drafting, competitive research, and proposal generation, but traditional monitoring only sees “sales team uses AI.”
Impact Layer: Which usage patterns correlate with business outcomes? Does AI-assisted email drafting actually improve close rates? Does AI-generated research reduce deal cycle time?
Without all three layers, ROI optimization is impossible. You can’t improve what you can’t measure, and you can’t measure what you can’t see. This is the next phase of thoughtful AI strategy, and a comprehensive approach separates organizations treating AI as a collection of tools from those building AI as an integrated capability.
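To make the three layers concrete, here is a minimal sketch in Python of how discovery, usage, and impact data might fit together. Every class, field, and data source named here is an illustrative assumption, not a reference to any specific product or vendor API.

```python
from dataclasses import dataclass

# Discovery layer: every AI tool actually in use, sanctioned or not.
@dataclass
class DiscoveredTool:
    name: str
    approved: bool        # is it on the sanctioned list?
    source: str           # where it was spotted, e.g. "SSO logs", "expense reports"

# Usage layer: how a team applies a tool inside a specific workflow.
@dataclass
class UsageRecord:
    tool: str
    team: str
    workflow: str         # e.g. "email drafting", "competitive research"
    weekly_active_users: int

# Impact layer: the business outcome a workflow is supposed to move.
@dataclass
class ImpactMetric:
    workflow: str
    kpi: str              # e.g. "close rate", "deal cycle time (days)"
    baseline: float
    current: float

def visibility_gap(discovered: list[DiscoveredTool], usage: list[UsageRecord]) -> list[str]:
    """Tools in active use that were never approved: the invisible spend."""
    in_use = {u.tool for u in usage}
    return [t.name for t in discovered if t.name in in_use and not t.approved]
```

The design choice worth noting is that impact is keyed to workflows rather than tools: the same tool can appear in several workflows with very different outcomes, which is exactly why tool-level dashboards miss the story.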
Implementation Principles
So, how do you put this in place?
First, assume AI usage is 5-10x broader than current visibility suggests. Most organizations dramatically underestimate actual adoption.
Second, implement discovery before optimization. Don’t try to improve AI ROI until you can see the full landscape of usage across the organization. This discipline reflects the shift from reactive AI management to proactive AI strategy.
Third, measure at the workflow level, not the tool level. The same AI tool might accelerate productivity in one context while creating risk in another. Context determines value.
Fourth, design for continuous discovery. New AI tools launch weekly. Static tool lists become obsolete before implementation is complete. Building adaptive measurement systems is a hallmark of strategic AI maturity.
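As a hedged sketch of that fourth principle, continuous discovery can be as simple as a recurring job that rebuilds the tool inventory from whatever signals you have and flags what changed since the last run. The signal sources and function below are hypothetical, written under the assumption that each source can be reduced to a set of tool names.

```python
def refresh_inventory(signal_sources, previous_inventory):
    """Rebuild the AI tool inventory from scratch on every run.

    signal_sources: callables that each return a set of tool names seen in one
    data source (SSO logs, expense lines, browser telemetry, vendor invoices).
    previous_inventory: the set of tool names from the last run.
    All of this is illustrative; adapt the sources to what your IT stack exposes.
    """
    current = set()
    for fetch in signal_sources:
        current |= fetch()

    newly_discovered = current - previous_inventory   # candidates for review
    gone_quiet = previous_inventory - current         # tools that fell out of use
    return current, newly_discovered, gone_quiet

# Example: run weekly and feed newly_discovered into the approval and
# measurement process rather than blocking those tools by default.
```

Running something like this on a schedule, rather than maintaining a static approved-tools spreadsheet, is what keeps the discovery layer from going stale.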
The Business Case
The companies solving AI measurement first will have a sustained competitive advantage over those still optimizing individual tool deployments. While competitors optimize spending on visible AI usage, these organizations will optimize the entire AI investment.
Early results support this approach.
Organizations with comprehensive AI visibility are 3x more likely to exceed AI ROI expectations. They’re not using different tools; they’re seeing the tools differently. This measurement sophistication will become the defining characteristic separating AI leaders from AI laggards.
In other words, your AI investment isn’t failing to deliver value; it’s delivering value that current measurement systems can’t capture.
That means the solution isn’t better AI, per se. It’s evolving from first-generation AI adoption to mature AI operations, with the measurement systems to match. That evolution is what gets you out of AI’s current ROI paradox.
