The exponential growth of AI adoption across industries is placing enormous demand on global data infrastructure. While businesses rush to implement AI systems and machine learning platforms, traditional data centres are rapidly approaching their technological ceiling. As investment in data centres continues on a massive scale, these facilities will require fundamental transformation to keep up with the cooling demands of advanced AI technologies.
Traditional air-cooling systems that have served data centres for decades are reaching their fundamental limits. Some of the latest generations of AI chips generate unprecedented amounts of heat that air alone cannot effectively dissipate. The average rack density in data centres has also more than doubled in just two years, rising from 8 kW to 17 kW per rack, and is expected to reach as much as 30 kW by 2027. Without innovative cooling solutions, the full potential of AI remains locked away, and data centre operators risk being left behind as clients demand capabilities their infrastructure simply cannot deliver.
The Computational Bottleneck
As data centres scale to meet AI demands, performance bottlenecks have become a primary concern. Traditional air cooling simply cannot handle the heat output of modern AI workloads, limiting computational capacity. Electricity demand from data centres worldwide is set to more than double by 2030 to around 945 terawatt-hours (TWh), slightly more than the entire electricity consumption of Japan today.
With AI technology continuing to progress at speed, these thermal management challenges, if left unaddressed, threaten to make cooling infrastructure the limiting factor in computational progress.
Overcoming Infrastructure Barriers
For AI to reach its full potential, these performance challenges must be resolved, and cooling infrastructure must advance to manage thermal loads efficiently. As a result, liquid cooling technologies are seeing rapid adoption across the industry, with the global liquid cooling market expected to reach $10 billion in revenue over the next five years. These technologies offer dramatically improved thermal management that can handle the extreme demands of AI hardware.
Direct-to-chip cooling is one of the most proven approaches. This method uses cold plates attached directly to CPUs, GPUs, or other heat-generating components. A coolant – whether water or a specialised dielectric fluid – is circulated through these plates, absorbing heat at the source and carrying it away.
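To put those thermal loads in perspective, a simple energy-balance sketch shows how much coolant a cold-plate loop actually needs compared with air. The short Python example below is purely illustrative: the 1,000 W chip power, the 10 °C coolant temperature rise and the water-like fluid properties are assumed figures chosen for a rough comparison, not specifications of any particular product.

# Back-of-the-envelope sizing of a direct-to-chip cold-plate loop.
# Every figure here is an illustrative assumption, not a vendor specification.

CHIP_POWER_W = 1_000              # assumed heat load of one AI accelerator (W)
DELTA_T_K = 10.0                  # assumed coolant temperature rise across the cold plate (K)

WATER_CP_J_PER_KG_K = 4186        # specific heat of water
WATER_DENSITY_KG_PER_L = 1.0      # density of water (kg per litre)

AIR_CP_J_PER_KG_K = 1005          # specific heat of air
AIR_DENSITY_KG_PER_M3 = 1.2       # density of air at room conditions

# Energy balance: Q = m_dot * c_p * dT, so m_dot = Q / (c_p * dT)
water_flow_kg_s = CHIP_POWER_W / (WATER_CP_J_PER_KG_K * DELTA_T_K)
water_flow_l_min = water_flow_kg_s / WATER_DENSITY_KG_PER_L * 60

air_flow_kg_s = CHIP_POWER_W / (AIR_CP_J_PER_KG_K * DELTA_T_K)
air_flow_m3_min = air_flow_kg_s / AIR_DENSITY_KG_PER_M3 * 60

print(f"Water needed: {water_flow_l_min:.1f} litres per minute")       # ~1.4 L/min
print(f"Air needed:   {air_flow_m3_min:.1f} cubic metres per minute")  # ~5 m3/min

Scaled to a 30 kW rack, the same arithmetic gives roughly 43 litres of water per minute under these assumptions, versus around 150 cubic metres of air, which illustrates why airflow alone struggles at the rack densities described above.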
Direct-to-chip cooling is 3,000 times more effective than air cooling alone for HPC infrastructure, enabling data centres to handle far greater compute densities and maintain optimal chip temperatures for maximum performance. Chips such as NVIDIA’s Blackwell Ultra GB300 are now being designed to be liquid-cooled from conception, featuring integrated cold plates and optimised thermal interfaces, signalling where the industry is headed.
While direct-to-chip cooling offers significant improvements, immersion cooling goes a step further, submerging entire servers or components in a specialised dielectric liquid that absorbs heat directly. This approach eliminates the need for fans entirely and can support much higher rack densities. Immersion cooling provides superior thermal performance that maximises operational efficiency, making it ideal for AI and HPC workloads that demand consistent, high-performance operation.
Industry research strongly supports this technological shift. Of the available options, immersion cooling provides the most comprehensive thermal management solution for data centres. In a recent study by Castrol, detailed in the Dipping Point Report, 74% of data centre leaders surveyed believe that immersion cooling is now the only option for data centres to meet current computing power demands, and 90% are considering switching to this method between now and 2030. This consensus across the industry signals that the days of air-based cooling are numbered, with immersion cooling emerging as the solution for next-generation computing. The transformation is already underway, driven by necessity rather than preference.
The Data Centre of the Future
It’s becoming clear that the future of data centre cooling is liquid. As AI capabilities continue to expand, the infrastructure supporting them must evolve at an equally rapid pace. Forward-thinking organisations are already making the shift to immersion cooling to keep up with increasing AI workloads, ensuring their operations can handle the thermal demands of next-generation computing. The choice facing data centre operators is straightforward: implement liquid cooling systems now, or watch computational capacity become the limiting factor in an AI-driven economy.