The ROI from AI tools can be hard to measure – but that’s no excuse for a lack of transparency

By Brook Downton, Head of Innovation at Ralph, on why experimentation with AI is no longer enough

For the last two years, many organizations have been on an AI spending spree. New roles, new tools, new partnerships, new decks about transformation. Labs were set up. Workshops were run. LinkedIn was flooded with posts about “the future of creativity.” Everyone was experimenting, and experimentation was deemed enough.

It’s not enough any longer. 2026 is the year that someone with clout – a CFO, a client, a board member – is likely to ask the question that’s been building quietly in the background: “What did we actually get for all of that?” Not, “What did we learn?” or “What’s the potential?” But rather, “What did we actually get?”

AI measurement requires a holistic approach

When trying to answer this question, there is a trap to avoid: measuring AI ROI at the tool level. Questions like "What's the return on our ChatGPT Enterprise license? How many hours did the image generator save? Did the AI writing tool reduce our content costs?" may seem reasonable at first glance, but measuring AI ROI tool by tool is like measuring the value of electricity by looking at one light bulb in isolation. You'll get a number, but it's far from the full picture.

The value of AI isn't in any single application. It's in the system you build around it. The companies seeing real returns right now are the ones that treated AI as infrastructure: proprietary data sets, connected workflows, custom knowledge bases that make every subsequent tool smarter and faster to deploy. The approach is iterative and its effect is cumulative.

I learned this firsthand. Late last year, I built a growth forecasting dashboard. No brief. No additional budget. Just a proactive experiment using infrastructure we'd already invested in. That dashboard ended up being a meaningful factor in winning a significant new engagement. But if you tried to calculate its ROI in isolation, the math wouldn't make sense. The tool took days to build and contributed to a six-figure outcome, yet the return wasn't on the dashboard. It was on the months of foundational work that made it possible to build that quickly.

That’s the ROI most companies can’t see, because they’re not looking at the system as a whole. They’re looking at the light bulbs.

Panic is subsiding into practicality

Remember 2023? Half the advertising industry, where I’m from, was convinced AI was about to replace creative directors. The other half was convinced it was a glorified autocomplete. Both were wrong, and the panic has mostly subsided into something more practical.

The companies seeing real returns from AI aren’t the ones that automated decisions. They’re the ones that gave experienced people better inputs, faster. 

Today, the actual divide isn’t AI versus humans. It’s AI-informed intuition versus uninformed intuition. We presented an AI-powered strategy tool to a client recently. Their immediate reaction: “I’m a gut person. I hate the idea of AI controlling creative and strategic ideation.” Fair enough. But the conversation that followed wasn’t about replacing their judgment. It was about what happens when you give someone with 20 years of pattern recognition access to better raw material. The truth is that the gut still picks. The gut still shapes. The gut just has more to work with.

Human intuition still matters

The AI tools that survive this year of truth are the ones designed to collaborate with experienced humans, rather than bypass them. The moment you position AI as a replacement for expertise, you’ve lost the room and missed the point. During the experimentation phase, nobody interrogated the outputs too closely. But that era is finally coming to an end.

This new reality means scrutiny. And this is why 2026 is likely to be a year of truth for AI and its outputs. When a forecast lands on a decision-maker’s desk, someone is going to ask how you got that number. When an AI-generated strategy informs a campaign, someone is going to want to see the methodology. When a tool recommends a creator or predicts a trend, someone is going to ask what data it’s drawing from and whether that data is any good.

Credibility requires transparency

Black box AI is becoming a liability. Not because the outputs are wrong – though sometimes they are – but because "trust me, the AI said so" is never a good position, especially when real money is on the table. It's important to build methodology transparency into every tool from day one. Not because anyone asked you to, but because it's sensible to anticipate the question. Data sources, assumptions and calculation logic should all be visible and documented. When the chairman walks through a growth forecast, the credibility may not be in the projection itself, but in being able to open the hood and show exactly how you got there: every input, every assumption, every weighting decision. Transparency takes extra effort to build, but it's the difference between a tool that survives and one that gets quietly shelved (or worst case, drops you in the ****).

Connected systems are harder to measure

This is the hardest part of the AI ROI conversation, and it’s the part most organizations are going to struggle with in 2026 more than ever. Individual AI tools may have modest, measurable impact. A content tool saves ‘X’ hours. An analytics tool surfaces ‘Y’ insights. A forecasting model predicts ‘Z’ with some accuracy. All fine and justifiable. But connected systems of AI tools have exponential impact that’s extremely difficult to attribute cleanly. And this creates a problem: the most valuable AI investments are the hardest to justify in a spreadsheet. But this doesn’t mean we don’t try.

Over the past 18 months, amongst many fun side quests, I’ve built a trend intelligence tool, a creator relationship database, a storytelling platform, a synthetic audience panel and multiple client-facing dashboards. Evaluated individually, each one solved a specific problem. The trend tool tracked what was happening across platforms. The creator database organized relationships. The storytelling platform accelerated ideation. The audience panel validated concepts. The dashboards forecasted growth. 

But the real value emerged when those systems started connecting. When trend data began informing creator recommendations. When creator insights fed into story frameworks – and when campaign strategy was connected to performance forecasting. New tools that would have taken weeks to build can be set up in days, because the underlying infrastructure already exists.

Innovators must articulate value

The sum of all those connections doesn't show up on any single project's ROI calculation. And that's exactly the trap. If you evaluate AI project by project, the infrastructure investments – the data layers, the connected workflows – look like they're underperforming, right up until the moment everything clicks. The year of truth will force a reckoning: either organizations learn to articulate value, or they'll defund the infrastructure that was about to pay off. That would be a shame.

This year of truth won’t provide us with a single lightbulb moment. It’s just what happens when the novelty starts to wear off, tooling is all over the map and someone asks to see the working. So, by all means build fast, break stuff – but make sure your foundation is good. And, perhaps most crucially of all, do show your workings!
