
The Cooling Issue: Seeking All-Encompassing AI Sustainability

By Salomé Beyer Vélez, Espacio Media Incubator

From the Dot-com Bubble of the late 1990s and the smartphone revolution of the mid-2000s to the more recent social media and cloud computing booms, the tech industry has played a huge part in driving social change – shifting labor markets, democratizing information, sparking stock market frenzies, and giving rise to gig economies. 

Now, with the introduction of artificial intelligence (AI), what experts have dubbed the Dot-AI Bubble has prompted similar shifts, underscored by unprecedented investment and multi-sectoral integration. 

AI’s critics abound. As recently as December 2024, the United Nations Security Council debated the technology’s potential to exacerbate armed conflicts. And in 2021, the European Union proposed its AI Act, the world’s first comprehensive legislation on the technology, seeking to ensure that AI systems are “safe, transparent, traceable, non-discriminatory, and environmentally friendly.” 

This last concern – AI’s ever-present environmental impact – has mostly slipped under legislators’ radar, as they have turned to other issues, from data usage in the United Kingdom to accountability in Canada and transparency in South Korea.

Data centers in particular, currently essential to the effective operation of AI, have become one of the fastest-growing industries in the world, as the Electric Power Research Institute has noted. 

Driven in large part by the release of OpenAI’s ChatGPT in November 2022, this growth has led the International Energy Agency (IEA) to project that global data center electricity demand – currently around 1-1.5% of global electricity use – will more than double over the next five years. By 2030, data centers are expected to use as much electricity as the whole of Japan does today. 

This increase in power demand will not be felt uniformly around the world, however. In the United States, 15 states account for 80% of the national data center load. Globally, the Asia-Pacific region hosts roughly 30% of all data centers, followed by Europe and North America. 

It is also imperative to understand where that energy goes within data centers themselves: an estimated 45% is consumed by IT equipment, 15% by power provisioning and infrastructure, and a critical 40% by cooling systems. 

Emerging calls for sustainability encompass both environmental and social concerns, with communities near data centers experiencing strained power grids, increased emissions and water contamination. These topics were top of mind at NTT’s recent Upgrade 2025 Conference in San Francisco.

“Traditionally, data centers used air cooling, with lots of air conditioners needed. But with AI, density goes up significantly, so both air and liquid cooling are needed,” Bob Thronson, Vice President of Marketing and Business Development at data center optimization company Vigilant, told me. 

“And data centers don’t use dirty water… they use potable water! In areas like California, Texas, South Africa, India or Australia, communities have water problems because aquifers are being sucked dry by these data centers,” Thronson added. 

With AI’s seemingly unstoppable surge over the coming decade and its reliance on data centers, embracing these sustainability challenges – and addressing each of them – is vital. 

Innovation at the forefront 

Conventional cooling methods use air conditioning, compressor-based refrigeration, water pumps and cooling towers to keep IT equipment at safe temperatures. And although these methods are reliable and relatively inexpensive in terms of initial investment and maintenance costs – with such systems particularly common across developing countries – they are wasteful. 

According to a 2024 report, cooling systems have evolved to incorporate airflow management elements that address this issue, including hot- and cold-aisle configurations, diversion barriers, and server mode models. 

Particularly with the rise of AI and its reliance on data centers, more energy is being wasted now than at any point in the previous 25 years, despite earlier technological booms. As such, optimizing cooling within data centers has become a necessity both for environmentally concerned actors and for companies seeking performance improvements.

“Data centers are very power hungry and current ones are so big and complex that it is very important to have multi-sectoral collaboration, especially in developing countries,” noted Yosuke Argane, Vice President of NTT’s Innovative Optical and Wireless Network Office (IOWN), in conversation with The AI Journal.

Similarly, Thronson noted the importance of cross-team collaboration within companies – something that was rarely necessary before the current AI boom. 

“While historically it hasn’t been that important for the IT and the cooling and power side to communicate, now with the big changes to the IT side – much higher density equipment, much greater variability – it is critical that both sides work hand in hand,” he stated. 

Beyond the human variable, a crucial fact remains: cooling technologies are being outpaced by the rate at which AI companies are scaling up their use of data centers. High energy consumption, inefficient airflow management, non-stop operation and infrastructure limitations make the current model unsustainable – a reality that will become increasingly evident as energy use continues to skyrocket over the next five years. 

Researchers, however, have developed sustainable alternatives: in 2012 they applied hot water to refrigeration systems, and in 2016 they showed that energy efficiency could be increased substantially through low-grade energy recovery. 

Free cooling has emerged as a way to reduce the load on electrical chillers, and it can be as simple as drawing in fresh outside air directly. And although this approach has several limitations – including the need for additional ventilation systems and dehumidifiers, as well as possible dust accumulation – other alternatives have emerged. 

NTT experts suggest that, because the issue arises from heat generated by electrical signaling, alternative means of data transmission – traditionally carried out over copper wires – must be found. Optical data transmission via fiber-optic cables, on the other hand, offers the possibility of keeping data centers cooler. 

“The signal loss of copper wires generates heat, and also means that not all the electricity contributes to the transmission itself. So, for sustainability purposes, electrical wiring has lots of problems that need to be resolved, especially for the ultra-high speed transmission that AI calls for,” Argane shared. 
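As a rough, simplified illustration of the physics behind Argane’s point, the electrical power lost as heat in a copper conductor follows Joule’s law,

$$ P_{\text{heat}} = I^{2} R, $$

where I is the signaling current and R the resistance of the wire – energy that warms the equipment rather than carrying data, which is the loss an optical link largely avoids.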

A paradoxical solution has also been proposed: using AI to solve the very cooling problem it has helped create. 

In 2016, years before AI became a prevalent topic of public discourse, Google announced that it had reduced its data centers’ cooling energy consumption by 40% by applying machine learning developed by DeepMind. 

“Every improvement in data center efficiency reduces total emissions into our environment and with technology like DeepMind’s, we can use machine learning to consume less energy and help address one of the biggest challenges of all – climate change,” the company stated at the time. 

In this case, DeepMind’s system used machine learning algorithms to identify cooling actions, model their impact, and predict future temperature and pressure readings. It then suggested real-time adjustments, from fan speeds to water temperatures, which human operators implemented after review. 
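To make the approach concrete, the sketch below shows, in hypothetical form, what such a recommendation loop can look like – the telemetry fields, model choice and function names are illustrative assumptions, not a description of DeepMind’s actual system:

```python
# Hypothetical sketch of an ML-based cooling recommendation loop: a regressor
# is trained on historical telemetry to predict cooling energy, candidate
# setpoints are scored, and the cheapest option is surfaced for operator review.
from dataclasses import dataclass
from typing import List

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor


@dataclass
class Snapshot:
    outside_temp_c: float   # outdoor air temperature
    it_load_kw: float       # current server (IT) load
    fan_speed_pct: float    # current or candidate fan speed
    water_temp_c: float     # current or candidate chilled-water setpoint

    def features(self) -> List[float]:
        return [self.outside_temp_c, self.it_load_kw,
                self.fan_speed_pct, self.water_temp_c]


def train(history: List[Snapshot], cooling_kwh: List[float]) -> GradientBoostingRegressor:
    """Fit a regressor mapping telemetry and setpoints to observed cooling energy."""
    model = GradientBoostingRegressor()
    model.fit([s.features() for s in history], cooling_kwh)
    return model


def recommend(model: GradientBoostingRegressor, candidates: List[Snapshot]) -> Snapshot:
    """Return the candidate setpoint with the lowest predicted cooling energy;
    a human operator reviews the suggestion before it is applied."""
    predicted = model.predict([c.features() for c in candidates])
    return candidates[int(np.argmin(predicted))]
```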

More broadly, a recent study from Cornell University found that data centers could cut energy use by 14-21% by integrating AI, while keeping operations safe and within operational constraints. 

“What you need is a more mission-critical type of AI that isn’t in the cloud – because of data security concerns – that is limited in its hallucinations to allow some exploration within safety constraints, and that incorporates fail-safes so that if there’s any issue, automation can be switched back on,” Thronson urged. 
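In code, the fail-safe pattern Thronson describes might look something like the sketch below, where the bounds, threshold and function names are purely illustrative assumptions rather than any vendor’s product:

```python
# Illustrative fail-safe wrapper: AI-suggested setpoints are clamped to hard
# safety limits, and an anomalous reading hands control back to conventional
# automation. All bounds and thresholds below are assumptions.
from typing import Callable, Tuple

SAFE_FAN_PCT = (30.0, 100.0)       # assumed safe fan-speed range (%)
SAFE_WATER_TEMP_C = (7.0, 20.0)    # assumed safe chilled-water range (deg C)
MAX_INLET_TEMP_C = 32.0            # assumed rack-inlet alarm threshold


def clamp(value: float, bounds: Tuple[float, float]) -> float:
    lo, hi = bounds
    return max(lo, min(hi, value))


def apply_recommendation(fan_pct: float,
                         water_temp_c: float,
                         rack_inlet_temp_c: float,
                         fallback: Callable[[], Tuple[float, float]]) -> Tuple[float, float]:
    # Fail-safe: if the rack inlet is already too hot, ignore the AI suggestion
    # and return control to the conventional automation.
    if rack_inlet_temp_c > MAX_INLET_TEMP_C:
        return fallback()
    # Otherwise allow exploration, but only within hard safety constraints.
    return (clamp(fan_pct, SAFE_FAN_PCT),
            clamp(water_temp_c, SAFE_WATER_TEMP_C))
```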

Corporations have made commitments to answer this call for sustainability and to integrate researchers’ findings, and governments have begun to address AI, but the daunting figures remain: data center energy use is set to skyrocket by 2030. 

“Climate change is real. Greenhouse gases are warming the planet, oceans are acidifying. Others will do renewable fuels for airlines or nuclear power, but if we had 500 companies [committed to sustainable cooling] deployed across the entire economy, we could greatly offset [this environmental toll],” Thronson concluded. 
