From seeing to reasoning: why event-driven data is the missing link for driverless cars

By Tom Fairbairn, Distinguished Engineer, Solace

When Nvidia revealed its Alpamayo platform at CES 2026, and Waymo began rolling out its driverless cars on London’s streets, it felt like a genuine turning point. Autonomous vehicles have moved beyond spotting pedestrians or recognising road signs. They’re starting to reason about the world around them. 

That’s a big leap. But bringing that capability to the UK, particularly to a city like London, raises an uncomfortable question: are we actually building the foundations these vehicles need to cope with reality? 

From where I sit, the industry has poured huge effort into the “brain” of the car, while paying far less attention to the nervous system that feeds it. And on British roads, that imbalance matters. Real-time data keeps autonomous systems from grinding to a halt the moment something unexpected happens. 

The messy reality of UK roads 

Anyone who drives in London knows the rules are… flexible. Roads narrow without warning. Temporary lights appear overnight. Lanes vanish for weeks because someone decided to dig them up. This is a very different challenge to navigating wide, predictable grid systems. 

For a human driver, that chaos is manageable because we react instinctively to what’s happening now. For an autonomous vehicle, relying on a map from yesterday, or even ten minutes ago, is simply not good enough. 

We’ve already seen what happens when autonomy meets disruption. In San Francisco, a power outage once took out traffic lights across the city. Self-driving cars, suddenly unsure how to proceed, stopped dead. Precision engineering turned into gridlock almost instantly. 

There’s a reason Elon Musk talks about the “last 1%” being the hardest for autonomous vehicles. Cars don’t fail at driving down a clear road. They struggle when something changes unexpectedly. As Waymo expands into the UK, success won’t come from having the best pre-loaded data, but from how quickly vehicles can respond to what just changed two streets away.

Reasoning needs live context 

This is where a lot of existing data infrastructure shows its age. Too many systems still work by asking questions and waiting for answers: Is the road clear? Has anything changed? By the time the response comes back, it already belongs to the past. 

An autonomous vehicle doesn’t have that luxury. When you’re edging onto a busy roundabout, you need to know what’s happening right now, not what was true a few minutes earlier. 

Platforms like Nvidia’s Alpamayo point towards a future where vehicles can understand cause and effect, as well as patterns. But even the smartest reasoning model is only as good as the information it receives. If the data arrives late, the conclusions will be wrong. Real intelligence depends on systems that can listen continuously to what’s happening and react the moment something changes. 
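To make that contrast concrete, here is a minimal sketch in Python. The in-memory event bus, topic names and callbacks are purely illustrative assumptions for this article, not any vendor’s API; the point is simply the difference between asking on a schedule and being told the moment something changes.

```python
import time
from collections import defaultdict

# Purely illustrative in-memory event bus; a real deployment would use a broker
# or event mesh, but the contrast with polling is the same.
class EventBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, event):
        # Every subscriber hears about the change the moment it is published.
        for callback in self.subscribers[topic]:
            callback(event)

def act_on(update):
    print(f"acting on: {update}")

# Request/response: ask on a fixed schedule and act on a possibly stale answer.
def poll_road_status(fetch_status, interval_s=60):
    while True:
        status = fetch_status()   # reflects the moment the answer was produced
        act_on(status)            # ...which may already belong to the past
        time.sleep(interval_s)    # anything that changes now waits a full cycle

# Event-driven: react as soon as the change itself is published.
bus = EventBus()
bus.subscribe("roadworks/temporary-lights", act_on)
bus.publish("roadworks/temporary-lights",
            {"location": "two streets away", "status": "lights just appeared"})
```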

Why event-driven thinking matters 

To make this work at scale, the architecture underneath autonomous vehicles has to change. Static integrations and batch updates won’t cut it. What’s needed is an event-driven approach, where changes are shared instantly across vehicles, infrastructure and fleet systems. 

A city-wide diversion, for example, isn’t one big problem; it’s hundreds of small ones happening at once. Those signals need to flow to the right agents in real time, without overwhelming onboard systems or wasting compute power.
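As a rough illustration of what that routing might look like, here is a short Python sketch. The topic hierarchy, wildcard rules and subscriber names are all hypothetical assumptions for the example, not Solace’s or any broker’s actual syntax; the idea is that a city-wide diversion becomes many small, addressable events, and each vehicle or service hears only the ones that matter to it.

```python
from fnmatch import fnmatch

# Hypothetical events: one city-wide diversion decomposed into small, routable changes.
events = [
    ("traffic/diversion/camden/road-closed",    {"road": "A400", "reason": "burst main"}),
    ("traffic/diversion/islington/lane-closed", {"road": "A1", "lanes_remaining": 1}),
    ("traffic/signals/westminster/lights-out",  {"junction": "Parliament Square"}),
]

# Each subscriber registers only the topic patterns relevant to it, so onboard
# systems aren't flooded with every change happening across the city.
subscriptions = {
    "vehicle-042": ["traffic/diversion/camden/*"],   # a car currently routed through Camden
    "fleet-ops":   ["traffic/#"],                     # assume '#' means "everything below"
}

def matches(pattern, topic):
    # Hypothetical wildcard rules: '#' matches a whole subtree, '*' the rest of a path.
    if pattern.endswith("/#"):
        return topic.startswith(pattern[:-1])
    return fnmatch(topic, pattern)

for topic, payload in events:
    for subscriber, patterns in subscriptions.items():
        if any(matches(p, topic) for p in patterns):
            print(f"{subscriber} <- {topic}: {payload}")
```

The design point is that publishers never need to know who is listening: a new consumer, whether another vehicle, a fleet dashboard or a council system, can subscribe to the events it cares about without anything upstream being rewired.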

The encouraging sign is that the wider ecosystem is moving in this direction. Nvidia’s work around simulation and physical AI reflects a growing recognition that agentic systems need their own environments and guardrails. We’re seeing similar moves across the enterprise world, as organisations look for better ways to govern and deploy AI agents responsibly. 

At Solace, that’s exactly the gap we focus on: providing the real-time event mesh that allows these systems to communicate and adapt without forcing organisations to rebuild everything from scratch. 

What this really comes down to 

Waymo’s arrival in the UK and Nvidia’s latest announcements make one thing clear: real-time data is far more than a technical detail. It’s the difference between autonomy that works in theory and autonomy that works on London’s streets. 

As the ‘Knight Rider’ showrunners predicted, driverless cars will become part of everyday life. But if they’re going to earn trust here, they have to do more than see what’s in front of them. They have to understand the moment they’re in and respond to it instantly. 

Autonomy on UK roads won’t fail because cars can’t see. It will fail if they can’t react fast enough. 
