AI supercharges the race for liquid cooling solutions: what’s in store for 2026?

key takeaways.

  • Liquid cooling is essential for AI-driven data centres, efficiently managing the extreme heat generated by high-density AI server racks. It offers up to 15% better energy efficiency and reduces cooling costs compared to traditional air-cooling systems
  • The technology also enables higher server density and extends hardware lifespan, while reducing space and maintenance requirements. Despite higher upfront investment, liquid cooling becomes cost-effective for power racks above 40 kW
  • Investment bottlenecks for liquid cooling vary by region, driven by supply pressures on core components and uncertainty around long-term power reliability, yet demand is expanding as businesses race to build the infrastructure required for advanced AI
  • As AI demand surges across business sectors, the computing infrastructure that powers these systems will require enhanced cooling to fuel further expansion. 

Liquid cooling has emerged as the new battleground for AI growth, now recognised as the essential technology powering the latest generation of AI systems. As companies race to stay ahead of the soaring power demand, it is quickly establishing itself as the industry-standard approach for maintaining performance at scale.

AI is also changing how data centres are built and operated, with high-power server racks becoming the “new norm”. Racks often require more than 120 kW each, rendering traditional air-cooling systems insufficient for today’s data-driven businesses.

A report published in April 2025 by the International Energy Agency forecasts that global electricity use by data centres will more than double by 2030, reaching approximately 945 terawatt-hours—surpassing the annual energy consumption of Japan.1 AI will be the principal driver of this surge, with AI-optimised facilities expected to see their power usage quadruple within five years.

Johan Nilerud, Chief Strategy Officer at Khazna Data Centres, notes that the rise of generative AI has triggered an "unprecedented demand for data centre capacity worldwide."2  In Europe especially, limited power availability has become one of the most significant constraints on data centre expansion. In 2024, vacancy rates fell below 10%, underscoring how demand is rapidly outpacing the infrastructure needed to support it. 

The global scramble for power is becoming increasingly visible, and not always successful. A recent example is Amazon’s decision to abandon plans for a $300 million data centre in Dublin, Ireland, after authorities could not guarantee the required power within the timeframe.3

Hitting the thermal wall

As AI-fueled demand for power accelerates, the industry is confronting a “thermal wall” – a point at which conventional cooling can no longer keep pace with the heat generated by advanced AI workloads. Pressure on data centres is intensifying, and with it, the urgency for cooling solutions that are effective, reliable and commercially viable at scale.

The next generation of servers has surpassed the limits of even the most sophisticated air-cooling systems. A September 2024 equity research paper on AI cooling solutions, “Liquid Cooling – A Song of Ice and Fire”, by research firm Jefferies, highlights that traditional air-cooling systems struggle to manage the thermal load generated by today’s high-performance chips.4 Liquid cooling, by contrast, introduces an entirely different playbook: it is nearly 25 times more efficient than air-based systems and, depending on the water volume used, can transfer heat up to 3,500 times more effectively.
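The “3,500 times” figure is roughly what comparing the volumetric heat capacity of water and air yields. A minimal sketch, using illustrative textbook property values at room temperature rather than figures from the Jefferies report:

```python
# Volumetric heat capacity (density x specific heat) measures how much heat
# one cubic metre of coolant absorbs per kelvin of temperature rise.
# Illustrative property values at ~25 C; assumptions, not cited data.
AIR_DENSITY = 1.2        # kg/m^3
AIR_CP = 1005.0          # J/(kg*K)
WATER_DENSITY = 997.0    # kg/m^3
WATER_CP = 4186.0        # J/(kg*K)

air_vhc = AIR_DENSITY * AIR_CP          # ~1.2e3 J/(m^3*K)
water_vhc = WATER_DENSITY * WATER_CP    # ~4.2e6 J/(m^3*K)

ratio = water_vhc / air_vhc             # roughly 3,500
print(f"Water absorbs ~{ratio:,.0f}x more heat per unit volume than air")
```

The exact multiple shifts with temperature and coolant chemistry, which is likely why the report hedges with “depending on the water volume”.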

Air cooling also faces structural limitations when confronted with the heat-dissipation requirements of modern AI workloads. Data from The Green Grid, a consortium focused on energy efficiency in data centre ecosystems, shows that traditional air-cooled racks typically reach their upper threshold at around 15-25 kW per rack, unless supplemented with additional, often costly, air-handling systems.6 Beyond this point, air simply cannot move heat fast enough to keep modern AI servers within safe thermal limits.
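To see why air runs out of headroom, a back-of-the-envelope sizing from the basic heat balance (flow rate = power / (specific heat × temperature rise)) is instructive. The rack powers and temperature rises below are illustrative assumptions, not data from the article:

```python
# Rough coolant sizing: mass flow needed to carry away P watts with a
# temperature rise of delta_t kelvin is m_dot = P / (cp * delta_t).
# All property values and delta_t choices are illustrative assumptions.
AIR_CP, AIR_DENSITY = 1005.0, 1.2        # J/(kg*K), kg/m^3
WATER_CP = 4186.0                        # J/(kg*K)

def air_flow_m3s(power_w: float, delta_t: float = 15.0) -> float:
    """Volumetric airflow (m^3/s) needed to absorb power_w."""
    return power_w / (AIR_CP * AIR_DENSITY * delta_t)

def water_flow_kgs(power_w: float, delta_t: float = 10.0) -> float:
    """Water mass flow (kg/s, roughly litres per second) for the same heat."""
    return power_w / (WATER_CP * delta_t)

for rack_kw in (25, 85):
    watts = rack_kw * 1000
    print(f"{rack_kw} kW rack: {air_flow_m3s(watts):.1f} m^3/s of air "
          f"vs {water_flow_kgs(watts):.1f} kg/s of water")
```

Under these assumptions a 25 kW rack already needs on the order of 1.4 m³ of air per second through a single cabinet, while an 85 kW rack needs only about two litres of water per second: the physical intuition behind the “thermal wall”.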

AI, however, is consuming power and producing heat at a far more aggressive pace. Very high-density racks today routinely draw around 85 kW per cabinet, more than triple what conventional designs were built to handle. Mitsubishi’s Spectra Magazine projects that, as AI becomes more advanced, each rack could require 200-250 kW to process the next generation of workloads.7 This scale of demand represents a profound shift for data centre operators. Managing this much power is rapidly becoming one of the defining engineering challenges of the AI era, underscoring why liquid cooling is the critical foundation for the future of AI.

Goldman Sachs forecasts that the share of AI servers using liquid cooling will rise from 15% in 2024 to 54% in 2025, and to 76% in 2026, driven largely by soaring demand for next-generation, full-rack liquid-cooling solutions.8

The cost of change

The financial implications of this transition are substantial. Analysts note that the initial capital expenditure for liquid-cooling deployment remains high.9 In China, costs can be 30-50% higher than for traditional air-cooled solutions, while in many Western markets the premium can reach 100-150%. Differences in lifecycle costs, power density and system scope all affect the investment profile.

But the economics of switching to liquid cooling are also reaching an inflection point. Given the pace and scale of power demands, liquid cooling is becoming not just a smarter operational choice but a compelling financial one. For racks drawing 40 kW and beyond, typical of AI workloads, liquid cooling is cost-efficient on several levels. At higher power densities, liquid-based systems reduce reliance on extensive mechanical infrastructure, such as chillers, air-handling units and the real-estate footprint they require. Schneider Electric modelling highlights this shift: 40 kW liquid-cooled racks allocate around 21% of capital expenditure to cooling, compared with 10% for 10 kW air-cooled racks.9 This is a clear sign that, as density rises, liquid cooling becomes economically advantageous.

However, challenges remain. The market still relies on a relatively small pool of certified suppliers, creating a constrained ecosystem struggling to meet rising demand. Supply-chain resilience, component availability, and standards harmonisation will become critical in determining how quickly liquid-cooling technology can scale.

Barriers to a cooler future in 2026 and beyond

Three barriers in particular will shape how quickly next-generation cooling can scale in the new year:

1. Sustainability: A Hewlett Packard Enterprise sustainability report warns of inefficiency risks when organisations overprovision hardware or energy infrastructure for “just in case” scenarios, leaving costly equipment underutilised and power wasted. “Equipment efficiency,” it notes, requires maximising output with minimal energy and heat.10 In 2026, companies that master this shift towards precision energy management will set the benchmark for sustainable AI operations.

2. Power supply: While investment often focuses on physical hardware required for liquid cooling systems, the biggest challenge may lie in the underlying electrical ecosystems. In high-density AI sites, power supply units (PSUs) and power distribution units (PDUs) can cost two to three times more than traditional units. And only four suppliers are certified by Nvidia for next-generation PSU supply, underscoring the supply chain's fragility.11 Strengthening these systems will be essential in 2026.

3. Quick disconnect couplings: QDCs – specialised fittings essential to liquid-cooling – remain concentrated among a handful of Western suppliers, including Staubli, Eaton, CPC and Parker-Hannifin. Although Chinese counterparts such as Envicool and AVIC Jonhon are emerging, stringent certification requirements are slowing down their entry.12 Expanding and diversifying QDC pipelines will be essential for scaling liquid-cooling infrastructure in 2026.

Value channels for investors

Regional dynamics are becoming increasingly important in identifying where investment opportunities will emerge. While global tech giants, such as Meta, Amazon, Google, and Microsoft, continue to drive early adoption of liquid cooling, Chinese firms are rapidly closing the gap in both investment and technological capability. Research from Goldman Sachs and Jefferies highlights two core forces behind this acceleration.13

First, China’s stringent regulatory framework gives state-owned enterprises the ability and incentive to scale liquid cooling infrastructure quickly. Penalties for energy-inefficient data centres have further strengthened the transition.

Second, there has been a quantum leap in Huawei's AI server chips, which, when coupled with China's renewed government-fuelled solar energy drive, is creating a multiplier effect on the country’s computational capacity. As a result, China’s data centre liquid-cooling market is forecast to rise from USD 498 mn in 2024 to an estimated USD 3,350 mn by 2032, an annual growth rate of around 27% over the period.14
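The growth rate implied by those two endpoints can be sanity-checked with simple compound-growth arithmetic on the figures quoted above:

```python
# Compound annual growth rate from the forecast endpoints:
# USD 498 mn (2024) to USD 3,350 mn (2032) is eight compounding years.
start, end, years = 498.0, 3350.0, 2032 - 2024

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # just under 27%, matching the forecast
```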

In addition, Chinese system suppliers such as Envicool, a Huawei-certified partner, and Megmeet, an Nvidia-certified PSU provider, are rapidly expanding as the domestic ecosystem matures. Meanwhile, Europe and North America remain focused on energy efficiency, sustainability metrics and regulatory compliance, all of which are becoming vital economic incentives for liquid-cooling adoption across the data-centre landscape.15

Research and market trends point firmly to the necessity, rather than the option, of investing in liquid cooling for high-density AI racks with power densities above 20 kW. Investors are gravitating towards established suppliers offering responsive service across QDCs, coolant distribution units (CDUs) and critical power infrastructure.16

Keeping ahead of the heat

AI liquid cooling can no longer be considered an emerging technology or a discretionary add-on. In 2026, the ability to deploy and scale advanced cooling infrastructure will be a defining competitive advantage. As AI accelerates, liquid cooling has become the critical foundation for performance, sustainability, and investment margins across next-generation data centre architectures. Organisations that invest early in resilient systems, diversified supply chains and forward-thinking thermal strategies will be the ones best positioned to thrive in a year when AI innovation and the heat it generates will only intensify.

view sources.

1 https://www.iea.org/news/ai-is-set-to-drive-surging-electricity-demand-from-data-centres-while-offering-the-potential-to-transform-how-the-energy-sector-works
2 https://datacentremagazine.com/technology-and-ai/ai-data-centres-power-cooling-sustainability-challenges
3 https://www.ainvest.com/news/amazon-withdrawal-dublin-blow-ireland-economy-2507/
4 https://datacenter.uptimeinstitute.com/rs/711RIA-145/images/2024.GlobalDataCenterSurvey.Report.pdf?version=0
5 Jefferies, Song of Ice and Fire, Page 6
6 https://blog.se.com/datacenter/architecture/2019/07/11/not-just-about-chip-density-five-reasons-consider-liquid-cooling-data-center/
7 https://spectra.mhi.com/data-center-cooling-the-unexpected-challenge-to-ai
8 Goldman Sachs, Equity Research, November 2024, Global Tech
9 Jefferies, Song of Ice and Fire, Executive Summary
10 https://www.hpe.com/us/en/newsroom/blog-post/2025/09/ai-sustainability-why-green-data-centres-arent-enough.html
11 Jefferies, Markets Equity Research, Electrics, September 2024
12 https://www.digitimes.com/news/a20240717PD200/ai-server-heat-dissipation-liquid-cooling-production-capacity.html
13 Goldman Sachs, Equity Research November 2024, Global Tech 
14 https://www.credenceresearch.com/report/china-data-center-liquid-cooling-market
15 Jefferies, Markets Equity Research, Electrics, September 2024
16 JP Morgan, Asia-Pacific Equity Research, September 2024

important information

This is a marketing communication issued by Bank Lombard Odier & Co Ltd (hereinafter “Lombard Odier”).
It is not intended for distribution, publication, or use in any jurisdiction where such distribution, publication, or use would be unlawful, nor is it aimed at any person or entity to whom it would be unlawful to address such a marketing communication.
