The energy usage of datacentres, particularly for AI applications, has been covered extensively, and for good reason. AI consumes more power and runs hotter than standard computing loads. In 2022, the IEA reported that the total power used by datacentres, including for AI and cryptocurrency, was around 460TWh.
Estimates suggest this could grow to 945TWh by 2030; to put that in context, electric vehicles are predicted to consume around 780TWh by 2030. Looking at AI specifically, Schneider Electric has estimated that AI's share of this power consumption is currently around 8% and may grow to 15-20% by 2028.
Even these estimates may prove too high. Koomey's Law tells us that computing becomes more efficient over time: specifically, that the number of calculations per unit of energy keeps increasing. For example, between 2010 and 2018, the amount of computing done in datacentres increased by over 500%, yet the energy used increased by only 6%.
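As a rough, back-of-the-envelope illustration, the short Python snippet below turns those 2010-2018 percentages into an implied efficiency gain; reading "over 500%" as roughly a sixfold rise in compute is an assumption made purely for the sake of the arithmetic.

```python
# Back-of-the-envelope reading of the 2010-2018 figures quoted above.
# "Increased by over 500%" is taken as roughly a 6x rise in compute and
# "increased by 6%" as a 1.06x rise in energy - both readings are assumptions.

compute_multiplier = 1 + 5.00   # compute up ~500%, i.e. roughly 6x overall
energy_multiplier = 1 + 0.06    # energy up ~6%, i.e. roughly 1.06x overall

# Koomey-style efficiency: useful compute delivered per unit of energy.
efficiency_gain = compute_multiplier / energy_multiplier
print(f"Compute per unit of energy improved roughly {efficiency_gain:.1f}x over the period")
```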
However, although the amount of energy used by AI is considerable, AI can also return the favour.
AI: Water and Chips with that?
AI's contribution to human endeavour is already significant. Perhaps the most high-profile example is AlphaFold, which helps us predict protein structures, improving drug discovery and our understanding of diseases.
But we've seen many other applications, including improving chilli yields in India, reducing conflict between humans and snow leopards, and supporting better risk modelling for insurance companies.
AI lives in the cloud, so the most logical place to use AI to reduce water usage is the datacentre. Datacentres have historically been cooled with air conditioning. With AI workloads, cloud companies are rapidly realising that air alone is insufficient and that the future will revolve around liquid cooling.
The reason for this is simple: the thermal conductivity of water is about 23 times that of air, and its volumetric heat capacity is over 3,000 times greater, so once factors like flow rate are taken into account, water can carry away far more heat in an industrial setting.
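As a quick sanity check, the sketch below compares commonly quoted room-temperature values for the two fluids; the property figures are textbook approximations rather than measurements from any particular facility.

```python
# Rough sanity check of the water-vs-air ratios quoted above, using commonly
# quoted property values at around room temperature (approximate figures).

water_conductivity = 0.60         # W/(m·K)
air_conductivity = 0.026          # W/(m·K)

water_vol_heat_capacity = 4.18e6  # J/(m³·K), i.e. ~4.18 MJ per cubic metre per kelvin
air_vol_heat_capacity = 1.2e3     # J/(m³·K)

print(f"Thermal conductivity ratio:     ~{water_conductivity / air_conductivity:.0f}x")
print(f"Volumetric heat capacity ratio: ~{water_vol_heat_capacity / air_vol_heat_capacity:.0f}x")
```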
On this basis alone, it's a no-brainer to use water to cool technology infrastructure. Better conductivity means more power efficiency and, ultimately, less power used to remove more heat.
And we're still seeing innovation in this field. Historically, cloud companies and gamers alike have attached cold plates to CPUs (and often GPUs) and used water to remove the heat. This is known as direct liquid-to-chip cooling.
We are now starting to see immersion cooling techniques emerge, where the entire server is submerged in fluid. Although this has implications for unit maintenance, immersed servers are not only more power-efficient; the fluid also keeps dust away from the hardware, improving component lifespans.
So how do we use AI to further improve this efficiency?
Air, water and changing priorities
AI's core strength lies in pattern recognition: analysing complex data sets and finding links. Most servers can measure their own workloads and temperatures, and this data can be fed back to data lakes, where AI systems can learn how to optimise cooling and power requirements.
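As a minimal sketch of that feedback loop, the hypothetical Python below fits a simple model to synthetic server telemetry and then looks for the lowest coolant flow that keeps chips under a target temperature. The field names, numbers and linear model are illustrative assumptions, not any vendor's actual system.

```python
# Hypothetical sketch of the telemetry feedback loop: server readings land in
# a data lake, a simple model learns how workload and coolant flow relate to
# chip temperature, and the operator then pumps no more coolant than needed.
# All data here is synthetic and the linear model is purely illustrative.

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
workload = rng.uniform(0.2, 1.0, 500)    # server utilisation, 0-1
flow_rate = rng.uniform(0.5, 2.0, 500)   # coolant flow, litres per minute
chip_temp = 35 + 30 * workload - 8 * flow_rate + rng.normal(0, 1, 500)  # °C

model = LinearRegression().fit(np.column_stack([workload, flow_rate]), chip_temp)

# For a forecast workload, find the lowest flow that keeps chips under target
# (this simple search assumes at least one candidate flow meets the target).
target_temp, forecast_workload = 55.0, 0.9
flows = np.linspace(0.5, 2.0, 50)
predicted = model.predict(np.column_stack([np.full(flows.size, forecast_workload), flows]))
minimal_flow = flows[np.argmax(predicted <= target_temp)]
print(f"Lowest flow keeping chips under {target_temp}°C: {minimal_flow:.2f} l/min")
```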
Sensors can also be fitted to the servers themselves, measuring water flow and so providing more detailed information about each server's temperature and cooling requirements.
It's important to remember that cloud servers don't exist in isolation. Local weather affects cooling: many datacentres use "free air cooling", relying on the ambient temperature to cool the servers, which is more effective in Iceland than in Florida, for example. At the same time, many datacentres use outdoor dry coolers with evaporative (adiabatic) assistance, but this is less effective in areas of high humidity.
Balancing these factors is where AI excels. It can analyse not only the temperature and power consumption of the servers but also the environment around them, including data from weather stations. This helps operators not only react to local conditions but also predict them, streamlining water usage now and in the future.
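To make that concrete, here is an entirely hypothetical decision rule of the kind such a system might converge on, choosing a cooling mode from ambient temperature and humidity; the thresholds and mode names are invented for illustration only.

```python
# Entirely hypothetical stand-in for a learned cooling policy: pick the mode
# that removes the heat for the least water and power, given local weather.
# The thresholds and mode names below are invented for illustration only.

def choose_cooling_mode(ambient_c: float, relative_humidity: float) -> str:
    if ambient_c < 18:
        # Cool outside air can carry the heat on its own, so no water is needed.
        return "free air cooling"
    if relative_humidity < 0.6:
        # Warm but dry: evaporation still has headroom to do the work.
        return "evaporative cooling"
    # Hot and humid: fall back to mechanical chillers and liquid loops.
    return "mechanical (chilled liquid) cooling"

print(choose_cooling_mode(ambient_c=12, relative_humidity=0.40))  # a mild, dry day
print(choose_cooling_mode(ambient_c=32, relative_humidity=0.85))  # a hot, humid day
```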
Conversely, the datacentre may not be in an area of water scarcity, in which case AI can be tailored to optimise server performance or the power usage of the pumps and other equipment instead. Datacentres in urban areas may prioritise noise reduction to avoid disturbing local residents, which AI can also help with, tuning systems to reduce the noise from mechanical operations.
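One way to picture these shifting priorities is as site-specific objective weights; the sketch below is hypothetical and none of its weights or readings come from the article.

```python
# Hypothetical illustration of site-specific priorities expressed as objective
# weights: a water-scarce site penalises water use heavily, while an urban
# site penalises noise. All weights and readings below are invented.

from dataclasses import dataclass

@dataclass
class SitePriorities:
    water: float   # weight per litre of water consumed
    power: float   # weight per kWh drawn by pumps and fans
    noise: float   # weight per dB(A) at the site boundary

def weighted_cost(p: SitePriorities, water_l: float, power_kwh: float, noise_db: float) -> float:
    return p.water * water_l + p.power * power_kwh + p.noise * noise_db

arid_site = SitePriorities(water=10.0, power=1.0, noise=0.1)
urban_site = SitePriorities(water=1.0, power=1.0, noise=5.0)

# The same operating point scores very differently at the two sites, so an
# optimiser would steer each site's cooling controls towards different choices.
print(weighted_cost(arid_site, water_l=500, power_kwh=120, noise_db=60))   # water dominates
print(weighted_cost(urban_site, water_l=500, power_kwh=120, noise_db=60))  # noise weighs heavily
```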
Self-optimising technologies
The technology industry is always moving forwards, and although the AI industry has seen considerable backlash, it also has enormous potential to improve our lives and the world around us. However, we should always have sustainability in mind, considering how to provide for today's needs while still safeguarding the world of tomorrow.
This does require bringing several worlds together: AI needs data to operate, which means combining IoT and industrial expertise with data analysis techniques. But with the right skills, vision and commitment, we can not only benefit from AI directly, but also use it to streamline its own resource consumption, driving a self-improving virtuous circle.



