As artificial intelligence (AI) reshapes industries and transforms economies, it is also driving change in data centre design and operation, particularly in cooling technology.
GPUs (Graphics Processing Units) are at the heart of this evolution. Once the realm of video games and visual effects, GPUs are now used in high performance computing (HPC). This encompasses much of what we think of as ‘AI’, from machine learning and speech recognition to the training of large language models (LLMs).
Companies active in this field include NVIDIA, which is currently the market leader in terms of technology. However, others are close behind, including Intel, Google, Graphcore (a UK-based company), Microsoft and Amazon.
Liquid cooling will become the gold standard, with air-cooled systems still vital for full environmental control
Significant heat generation
High-performance computing workloads such as training deep neural networks generate significant amounts of heat. GPUs run hot because of their dense transistor layouts and high power consumption: a single NVIDIA A100 GPU, for example, can draw around 250 watts. Deployed at scale in a hyperscale data centre, that adds up to an enormous heat load.
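To put that in perspective, here is a rough, back-of-envelope sketch of the heat load from a single rack of GPU servers. The server and rack figures are assumptions for illustration only, not vendor specifications.

```python
# Back-of-envelope heat load for a GPU rack (all figures are illustrative assumptions).
GPU_POWER_W = 250          # approximate draw of a single NVIDIA A100 (PCIe) GPU
GPUS_PER_SERVER = 8        # assumed dense GPU server configuration
SERVERS_PER_RACK = 4       # assumed number of GPU servers in one rack
OVERHEAD_FRACTION = 0.3    # assumed share for CPUs, memory, fans and PSU losses

gpu_heat_w = GPU_POWER_W * GPUS_PER_SERVER * SERVERS_PER_RACK
rack_heat_kw = gpu_heat_w * (1 + OVERHEAD_FRACTION) / 1000

print(f"GPU heat alone: {gpu_heat_w / 1000:.1f} kW per rack")       # 8.0 kW
print(f"Estimated total rack heat load: {rack_heat_kw:.1f} kW")     # 10.4 kW
# Almost every watt drawn is turned into heat, so hundreds of such racks
# add up to megawatts that the cooling plant must remove.
```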
Traditional air-cooling methods, which are widely used in legacy data centres, are being left behind by the cooling demands of these advanced systems. The urgent need for efficient heat management has placed liquid cooling technology at the forefront of the AI data centre evolution.
Liquid cooling offers an effective alternative to air-based systems. While air-cooling relies on the circulation of air to dissipate heat, liquid cooling transfers heat directly from high-temperature components into a cooling liquid. This liquid is then circulated to a heat exchanger, where the heat is expelled.
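A simple sketch of the physics shows why this works: the heat a coolant loop can carry away is the mass flow rate multiplied by the coolant's specific heat capacity and the temperature rise across the equipment. The flow rate and temperature rise below are assumed values for illustration, not figures from any particular system.

```python
# Heat removed by a liquid coolant loop: Q = m_dot * c_p * delta_T
# (illustrative figures; real loops are sized by the cooling system designer).
FLOW_RATE_L_PER_MIN = 10        # assumed coolant flow rate through the loop
COOLANT_DENSITY_KG_PER_L = 1.0  # water-based coolant
COOLANT_CP_J_PER_KG_K = 4186    # specific heat capacity of water
DELTA_T_K = 15                  # assumed temperature rise from inlet to outlet

m_dot_kg_per_s = FLOW_RATE_L_PER_MIN * COOLANT_DENSITY_KG_PER_L / 60
heat_removed_kw = m_dot_kg_per_s * COOLANT_CP_J_PER_KG_K * DELTA_T_K / 1000

print(f"Heat carried away by the loop: {heat_removed_kw:.1f} kW")   # ~10.5 kW
# A modest 10 litres per minute with a 15 K temperature rise removes roughly
# as much heat as the rack estimated above produces.
```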
Several advantages
There are several advantages to liquid cooling, particularly for AI and HPC workloads:
- Offers increased cooling efficiency: Liquid cooling systems can dissipate heat more effectively than air, reducing energy consumption and improving overall data centre efficiency (see the rough comparison after this list).
- Supports higher-density racks: Liquid cooling enables higher server density, allowing data centres to pack more powerful GPUs into smaller spaces.
- Delivers quieter operation: With fewer fans required for cooling, liquid-cooled systems are quieter, which is an added benefit in data centres with shared office space.
- Makes efficient use of water: Advanced liquid cooling systems use less water than traditional evaporative cooling, making them a more sustainable choice.
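To put the first of these points in context, a rough comparison of textbook property values shows how much more heat water can carry than the same volume of air. The figures below are approximate physical constants, not measurements from any particular system.

```python
# Volumetric heat capacity of water vs air (approximate textbook values).
WATER_CP_J_PER_KG_K = 4186    # specific heat capacity of water
WATER_DENSITY_KG_M3 = 1000
AIR_CP_J_PER_KG_K = 1005      # specific heat capacity of air at room temperature
AIR_DENSITY_KG_M3 = 1.2

water_per_m3_per_k = WATER_CP_J_PER_KG_K * WATER_DENSITY_KG_M3
air_per_m3_per_k = AIR_CP_J_PER_KG_K * AIR_DENSITY_KG_M3

print(f"Water carries ~{water_per_m3_per_k / air_per_m3_per_k:,.0f}x more heat "
      "per unit volume per degree than air")
# On the order of 3,500x, which is why a small coolant pipe can replace
# very large volumes of moving air (and the fans that move it).
```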
Direct-to-chip solutions
Liquid cooling is already being adopted in new data centres. For example, NVIDIA has partnered with data centre operators and cooling technology providers to integrate liquid cooling into its systems.
In 2023, the company announced plans to offer liquid-cooled versions of its A100 GPUs, recognising the critical role this technology will play in the future of AI data centres. AMD and Intel are also adopting liquid cooling as a solution to their thermal challenges.
Leading providers are deploying direct-to-chip liquid cooling solutions, in which coolant is circulated through cold plates mounted directly on the GPU and CPU packages. This removes the need for bulky air-cooled heatsinks and enables far more precise temperature control.
Beyond hardware, liquid cooling also supports the broader sustainability goals of AI data centres. Waste heat from liquid-cooled systems can be captured and reused for district heating or powering adjacent facilities. Examples in Finland and Denmark demonstrate the potential for data centres to act as heat suppliers, turning a challenge into an opportunity.
As AI adoption accelerates, the demand for liquid cooling will only grow. This technology will play a vital role in enabling data centres to keep pace with the heat generated by advanced GPUs while reducing environmental impact. Governments and organisations investing in AI infrastructure must consider liquid cooling not as an optional upgrade but as a critical enabler of future success.
Vital technologies
GPU and HPC technology is transforming our technical landscape, driving innovation across industries and setting new benchmarks for computational power. However, this progress comes with challenges that demand equally innovative solutions.
Liquid cooling is poised to become the gold standard for managing thermal loads in AI data centres, enabling operators to deliver high-performance computing while meeting sustainability goals. For data centre operators and decision-makers, the message is clear: the future of AI requires a new approach to cooling, and liquid cooling is the key to unlocking it.
However, air-cooled systems are still vital to delivering full environmental control in these buildings, often working in tandem with liquid cooling. Mitsubishi Electric has a range of products to meet the needs of data centres of all sizes, and we are already developing the next generation of liquid cooling, an exciting development that we will be talking about in more detail shortly.
Shahid Rahman is EMEA Data Centre Strategic Account Lead