AI now underpins much of modern life, from the way we discover medicines through to everyday searches online, and this rapid adoption has delivered significant productivity gains across a wide range of industries.
However, training and running AI models is incredibly intensive in both energy and computation. Large-scale workloads such as model training and inference demand unprecedented computing power: a single AI GPU today draws 1,200W, around 12 times the power draw of an office laptop, and AI data centres can hold thousands of GPUs at a time.
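To put those figures in perspective, a rough back-of-envelope calculation shows the scale involved. The per-GPU and laptop wattages come from the comparison above; the GPU count and the overhead multiplier (a power usage effectiveness, or PUE, factor covering cooling and other facility loads) are illustrative assumptions, not figures from this article:

```python
# Back-of-envelope estimate of an AI data centre's power draw.
# GPU_POWER_W and LAPTOP_POWER_W reflect the figures cited above;
# GPU_COUNT and PUE are assumed for illustration only.

GPU_POWER_W = 1200      # per-GPU draw cited above
LAPTOP_POWER_W = 100    # implied by the ~12x laptop comparison
GPU_COUNT = 10_000      # assumed: "thousands of GPUs" per site
PUE = 1.5               # assumed power usage effectiveness (cooling etc.)

gpu_load_mw = GPU_POWER_W * GPU_COUNT / 1e6   # megawatts of GPU load alone
facility_mw = gpu_load_mw * PUE               # total draw at the wall
laptops_per_gpu = GPU_POWER_W // LAPTOP_POWER_W

print(f"GPU load: {gpu_load_mw:.1f} MW")
print(f"Facility draw (PUE {PUE}): {facility_mw:.1f} MW")
print(f"One AI GPU ~ {laptops_per_gpu} office laptops")
```

Under these assumptions, a single 10,000-GPU site draws on the order of tens of megawatts continuously, which is why siting and energy sourcing matter so much.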
While demand for AI continues to grow, so does the strain it places on energy supplies. Data centres already account for 1.5% of global electricity use, a figure that is only set to rise in the coming years. Without action from hyperscalers and industry leaders, data centres risk becoming a major obstacle to sustainability efforts globally.
To prevent this, organisations must ensure that their computing infrastructure aligns with environmental standards. This involves exclusively choosing data centre sites with an abundance of local renewable energy, improving energy efficiency, and implementing transparent sustainability practices. Data centre operators, AI providers, and businesses need to work together to build infrastructure that balances performance with sustainability.
The foundations of sustainable AI infrastructure
To work with AI sustainably, two principles must be followed: building data centres only where there is enough local renewable energy, and demanding transparency from hyperscalers.
Unlike traditional infrastructure, data centres are not bound by location. As long as a site has the connectivity needed to handle the massive quantities of data involved in training and managing AI workloads, it can be situated far from major cities without affecting performance. A data centre serving London, for example, could be hosted as far away as Scotland, or even further afield, without users noticing any difference.
This flexibility gives hyperscalers far more freedom to choose sites with a local abundance of renewable energy. However, this isn't happening: data centres are currently driving a surge in natural gas power. Part of the problem comes down to transparency, which customers can help solve by taking more of an interest in what powers their AI workloads. Customers should demand to know both where their AI workloads are hosted and what energy sources power them.
Unfortunately, many major public cloud providers do not reveal how efficiently they use energy and water. Although they all publish some data, it isn't consistent and is rarely comprehensive enough to cover every region they operate in. It is also heading in the wrong direction: as hyperscalers have scaled up energy and water consumption to support AI-capable hardware, some have even removed the regional data that showed their power and water efficiency.
If hyperscalers were compelled to accurately disclose their environmental impact, it could even spark a boom in renewable energy projects. A constant struggle with renewables is that they don't generate power at convenient times. In the UK, power consumption spikes twice a day: once in the morning as people head out to work, and again when they return home. As these peaks are completely independent of when the wind blows or the sun shines, national power systems struggle to match fluctuating supply to demand.
Data centres, with their constant power requirements, could provide the baseload demand that renewables need. It is difficult to justify building a new wind farm when nobody will use the energy it produces. If hyperscalers were more transparent about the energy they use and where it comes from, this could give energy providers the confidence to build more renewable projects.
Optimising energy consumption in data centres
Beyond powering data centres with renewables, hyperscalers should ensure that they use energy as efficiently as possible. This involves deploying the best hardware available for the job and making sure that it is cooled effectively. AI workloads have unique computing requirements beyond what traditional data centres can provide, yet large public clouds regularly host their customers' AI models on legacy hardware.
While older GPUs can handle AI training, they do so inefficiently. The hardware that performs AI workloads most efficiently will always be the latest generation of AI GPUs. To future-proof their infrastructure, hyperscalers should design new data centre sites with retrofitting in mind. Modular data centre constructions, where clusters of GPUs can be swapped out easily for newer versions when needed, help optimise data centre energy consumption.
Furthermore, the more advanced AI GPUs become, the more waste heat they produce. Traditional air-cooling methods are no longer sufficient, so liquid cooling is a necessity. But not all liquid cooling systems are created equal: a single data centre can use up to 26 million litres of water, posing a risk to local water supplies.
To avoid impacting drinking water supplies, hyperscalers also need to choose advanced, closed-loop liquid cooling for new data centres. Closed-loop systems recycle cooling fluids, significantly reducing water usage while maintaining optimal operating temperatures. The benefits of this go further than sustainability – water costs money, and it’s cheaper for hyperscalers if they use less.
Accountability and action for hyperscalers
It's true that demand for AI will continue to grow, but this need not have a detrimental impact on the environment. Data centre operators, hyperscalers, and the customers that use cloud computing must all take action to make the industry more sustainable. Businesses should demand to know where their AI workloads are hosted and how much energy they use.
Hyperscalers, for their part, should hold themselves accountable for their environmental impact. Ultimately, AI does not need to come at the cost of the planet; implemented correctly, it could even help the transition to renewable energy. But the industry must align itself with sustainable practices to achieve that.