The advent of generative AI tools has dramatically and permanently changed the way all of us produce written work, images and videos. It has changed how we interact with search engines to find information, and it has changed how the essential (and non-essential) platforms and applications we use every day learn about us and tailor their services to us.
AI has many deeper applications too — from improving the efficiency of industry and logistics, to finding patterns in biological data that could help us diagnose and treat life-threatening diseases far more effectively than ever before.
But all of these benefits come with costs, the most prominent being power consumption on a massive scale. Put simply, AI consumes vast amounts of compute and data storage, and both demand electricity. In 2024, the International Energy Agency suggested that global data centre electricity use could double by 2026, reaching a staggering 1,000 terawatt-hours, equivalent to the annual electricity consumption of Japan. As of February this year, Goldman Sachs predicted that AI alone will drive a 165% increase in data centre power demand by 2030.
Always growing, always on
Data centres now draw power like small cities, requiring specialist grid connections to meet their enormous energy needs. At the larger end, a single facility can demand over 100 megawatts.
The demand for data centres is growing faster than new facilities can be built, so operators are accommodating more compute capacity with high-density racks. But densely packed racks dramatically increase the risk of overheating. So, while the power demand comes directly from servers, it is compounded by the complex cooling systems needed to keep them within safe temperatures, whether air-based (HVAC) or liquid.
Data centres are also always on, and necessarily so; our lives increasingly depend on connectivity and access to digital assets. Meanwhile, the grid is ageing, and our ability to generate and supply energy without interruption is wavering. On top of this, climate change is driving more frequent extreme weather events with the potential to disrupt power supply.
Therein lies one of the major challenges facing society today. AI is driving an ever-increasing need for power and, more critically, power continuity, yet our ability to supply that power is declining. How are we supposed to manage that?
The bane and the antidote
Well, ironically, AI itself could well be an answer, particularly if we can harness it to optimise data centre cooling and power distribution.
Data centre automation, powered by AI, improves efficiency in energy management, maintenance, and performance monitoring. AI-driven cooling systems adjust dynamically, using only as much energy as necessary by increasing output only when required. Condition monitoring provides a detailed view of equipment health, offering insights beyond those available to individual engineers.
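To make the dynamic-cooling idea concrete, here is a deliberately minimal sketch of the kind of feedback rule such a system generalises: cooling effort rises only as temperature climbs above a setpoint. All names, setpoints and gains are hypothetical illustrations, not the behaviour of any specific product; real AI-driven systems learn far richer models.

```python
# Illustrative only: a proportional cooling rule. Setpoint and gain are
# hypothetical values chosen for the example.
def cooling_output(inlet_temp_c: float,
                   setpoint_c: float = 24.0,
                   gain: float = 0.2) -> float:
    """Return a cooling duty cycle in [0, 1], proportional to how far the
    rack inlet temperature sits above the setpoint."""
    error = inlet_temp_c - setpoint_c
    return max(0.0, min(1.0, gain * error))

print(cooling_output(24.0))  # at setpoint: no extra cooling -> 0.0
print(cooling_output(29.0))  # 5 degrees over: full duty cycle -> 1.0
```

The point of the sketch is the shape of the behaviour: energy is spent only when conditions require it, rather than running cooling flat out around the clock.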
By analysing real-time asset data, AI enables predictive maintenance that reduces downtime and operational costs. AI also contributes to power optimisation, scanning electrical distribution systems for power quality issues and taking corrective action. It can forecast power demand 24 hours in advance, allowing for better energy distribution planning, and it can continuously monitor and compare performance over time, using Energy Performance Indicators (EnPIs) to track efficiency and identify areas for improvement.
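As a toy illustration of 24-hour-ahead forecasting, the sketch below builds a forecast from per-hour historical averages. This is a hypothetical baseline for the sake of the example; production systems use much more sophisticated models.

```python
# Hypothetical sketch: forecast each hour of the next day as the average of
# historical load seen at that hour. Function and variable names are
# illustrative, not any vendor's API.
from collections import defaultdict

def forecast_next_24h(history):
    """history: list of (hour, load_kw) samples.
    Returns a 24-entry forecast of load per hour of day."""
    buckets = defaultdict(list)
    for hour, load in history:
        buckets[hour % 24].append(load)
    return [sum(buckets[h]) / len(buckets[h]) if buckets[h] else 0.0
            for h in range(24)]

# Two days of synthetic data with an office-hours peak
history = [(h, 400 + 50 * (9 <= h % 24 <= 17)) for h in range(48)]
print(forecast_next_24h(history)[12])  # midday forecast -> 450.0
```

Even a crude per-hour profile like this shows why advance forecasts help: knowing tomorrow's peak hours lets operators plan energy distribution before demand arrives, instead of reacting to it.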
Case studies
Lakeland Community College in Ohio, for example, needed a new data centre with more space, greater flexibility for future expansion, and improved energy efficiency. The college also had to comply with new regulations requiring energy usage reporting.
ABB stepped in to deliver full data centre automation. The combination of a redesigned facility and automated systems allowed for an optimised layout and more efficient cooling without affecting uptime. The college also moved many of its servers to the cloud, reducing the number of physical servers that required cooling and lowering energy costs. Since adopting these changes, the data centre has cut its energy use by more than 53%.
Likewise, the Lefdal Data Center in north-west Norway, located in a former mine 100 metres underground, aims to be one of Europe’s most energy-efficient data facilities. It runs entirely on renewable hydropower and uses cold water from a deep fjord for cooling.
ABB automation solutions and medium voltage switchgear integrate with these natural conditions, creating a stable and efficient operating environment. The result is a Power Usage Effectiveness (PUE) of between 1.08 and 1.15 for a 5 kW rack, cutting energy consumption by up to 40% compared with traditional data centres. Waste heat from the facility is redirected to the local community, while lower operating costs allow for more competitive pricing for customers.
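For readers unfamiliar with the metric, PUE is simply total facility energy divided by the energy that reaches IT equipment, so 1.0 is the theoretical ideal. The numbers below are illustrative, chosen to show what a figure in the 1.08 to 1.15 range means in practice.

```python
# PUE = total facility energy / IT equipment energy.
# Input values here are illustrative examples, not measured data.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

print(round(pue(1100.0, 1000.0), 2))  # 1.1: only 10% overhead beyond IT load
print(round(pue(1700.0, 1000.0), 2))  # 1.7: closer to a typical legacy facility
```

At a PUE of 1.1, almost every kilowatt-hour drawn from the grid does useful computing; at 1.7, roughly 40% of the energy goes to cooling and other overheads instead.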
AI is itself causing a huge spike in energy demand, but we can harness its benefits to minimise the impact. By incorporating AI, data centres can operate with greater reliability and lower energy consumption. I’ll be explaining this in more detail at Data Centre World as part of Tech Show London. Hope to see you there!