
The Cloud Cost Crisis Is Undermining AI Innovation

By John Bradshaw, Director of Cloud Computing Technology and Strategy, EMEA, Akamai

AI promises to transform every sector of the economy, but the costs of delivering that promise are beginning to undermine it.  

In the UK and across Europe, cloud spending has grown so rapidly, and with so little flexibility, that many businesses are now unable to realise the value of their AI investments.

Cloud services remain essential to the development and deployment of AI models. Yet today, cloud economics is forcing hard decisions. 

Egress charges, rigid contracts, and deep technical dependencies on proprietary infrastructure have shifted AI from a strategic priority to a budgetary burden. 

AI Budgets Diverted to Cover Rising Cloud Costs 

Funds originally allocated for innovation are being redirected toward basic operational continuity, and many organisations are delaying or cancelling plans to expand their use of AI altogether. 

These are not isolated experiences: the UK’s Competition and Markets Authority (CMA) recently raised concerns that limited competition in the £9 billion domestic cloud market could lead to reduced innovation, higher prices, and lower quality of service.

Gartner now forecasts that one in four businesses will experience frustration with cloud adoption within the next three years. 

Cloud costs are constraining budgets across departments, delaying new hires, and narrowing the scope of innovation initiatives. 

Why Cloud Costs Escalate So Quickly 

For many organisations, the return on AI investment is proving difficult to quantify – and even harder to justify – given the current pricing structure of most hyperscale cloud providers. 

One major contributor is egress pricing: the fees charged to move data out of a provider’s cloud.  

These charges, often overlooked at the contract stage, can become a persistent drag on performance and cost planning.  

Businesses accustomed to evaluating IT spend in terms of productivity or efficiency now find themselves facing charges they cannot easily model or recover. 

The logical response might appear to be switching providers. But in practice, vendor lock-in, contract length, and architectural complexity make this move costly and slow.  

Vendor Lock-In Limits Flexibility and Raises Cloud Migration Risks 

Many cloud contracts are structured around three-year terms, with technical integration that ties core systems to the provider’s proprietary technologies.  

Migration introduces operational risk, and the cost of re-engineering infrastructure often outweighs the anticipated benefits. 

In reality, vendor lock-in is not just a contractual problem: it includes data lock-in, where the scale and complexity of transferring large datasets becomes prohibitive, and architectural lock-in, in which the organisation’s systems are deeply embedded in a specific provider’s ecosystem.  

Once all three constraints are in place, strategic mobility is almost impossible. 

How to Reclaim Control Over AI Investment 

Despite these challenges, there are viable alternatives. Companies can avoid lock-in by planning workload portability early, selecting providers that offer open architectures, and reducing exposure to high egress fees.  

Choosing a vendor that enables more flexible data movement and supports distributed computing models can help restore control over both performance and cost. 

Edge computing, in particular, offers a path forward – rather than scaling AI vertically through larger centralised resources, edge architectures scale horizontally, placing AI workloads closer to where data is generated and decisions are made.  

This approach reduces latency, lowers bandwidth consumption, and makes real-time responsiveness possible at scale.  

Dynamic Cloud Models Enable Flexible, Cost-Effective Workload Deployment 

Crucially, it also allows businesses to shift from fixed infrastructure commitments to a more dynamic and adaptive model. 

The key is to match each workload with the environment best suited to its operational and financial requirements.  

This means looking beyond traditional hyperscaler models and moving toward providers who support edge-native deployment and cost transparency as a standard, not a premium. 

Building a Sustainable Future for AI 

The opportunity for AI remains immense, but it will not be realised through legacy infrastructure alone. 

Businesses need a fundamentally different relationship with their cloud providers: one built on open standards, measurable outcomes, and architectural flexibility.

By aligning investment strategy with a more distributed, edge-oriented cloud model, companies can reduce vendor dependency, regain budget control, and advance AI initiatives without compromise. 

The next phase of cloud computing will not be defined by scale alone, but by adaptability, efficiency, and strategic alignment – those who make that shift early will be in the strongest position to lead. 
