Highlights –

  • Compared with traditional air cooling, Nvidia says its new liquid-cooled cards can cut power consumption by around 30% while taking up 66% less rack space.
  • Liquid-cooled A100 GPUs will be available in the second half of this year, while the new H100 card will be available in the HGX H100 server from early next year.

At its Computex 2022 keynote, computing platform company Nvidia revealed its plan to make data centers more energy efficient by introducing liquid-cooled graphics cards. The firm announced two new liquid-cooled GPUs, but they won’t be making their way into your next gaming PC. The two cards – the H100 (announced at GTC earlier this year) and the A100 – will ship as part of HGX server racks toward the end of the year.

Besides the HGX server racks, Nvidia will offer liquid-cooled versions of the H100 and A100 as slot-in PCIe cards. The A100 will be released in the second half of this year, and the H100 will follow early next year. Nvidia expects “at least a dozen” system builders to offer these GPUs by the end of the year, including Asus, ASRock, and Gigabyte.

According to the firm, the liquid-cooled A100 consumes 30% less power than the air-cooled version. Nvidia says it already has more liquid-cooled server cards on its roadmap, and it hinted that it could bring the technology to other applications, such as in-car systems that need to stay cool in enclosed spaces.

According to Nvidia, minimizing the energy required to perform intricate computations can have a significant impact. It claims that over 1% of the world’s electricity is used by data centers, and 40% of that goes to cooling. Compared with traditional air cooling, Nvidia says its new liquid-cooled cards can cut power consumption by around 30% while taking up 66% less rack space.
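
To put those percentages in perspective, here is a minimal back-of-the-envelope sketch in Python; the figures are simply the shares quoted above, not independent measurements:

    # Back-of-the-envelope arithmetic using the shares quoted above
    data_center_share = 0.01  # data centers use roughly 1% of the world's electricity
    cooling_share = 0.40      # about 40% of that goes to cooling

    cooling_vs_world = data_center_share * cooling_share
    print(f"Data-center cooling accounts for about {cooling_vs_world:.1%} of global electricity use")
    # -> about 0.4%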

“Data center operators aim to eliminate chillers that evaporate millions of gallons of water a year to cool the air inside data centers. Liquid cooling promises systems that recycle small amounts of fluids in closed systems focused on key hot spots,” Nvidia explained.

Unlike liquid-cooled gaming GPUs, which typically use all-in-one coolers, the A100 and H100 use a direct liquid connection to the processing unit itself. Everything but the feed lines is hidden inside the GPU enclosure, which takes up just one PCIe slot (instead of two for the air-cooled versions).

To track efficiency, data centers rely on Power Usage Effectiveness (PUE) – the ratio between how much power a data center draws in total and how much of it the computing hardware actually uses. With an air-cooled data center, Equinix had a PUE of about 1.6. Liquid cooling with Nvidia’s new GPUs brought that down to 1.15, remarkably close to the 1.0 PUE data centers aim for.
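
As a minimal illustration of that ratio, here is a short Python sketch; the 1,000 kW IT load is hypothetical, and the overheads are chosen only to reproduce the PUE figures cited above:

    def pue(total_facility_kw: float, it_load_kw: float) -> float:
        """Power Usage Effectiveness = total facility power / power used by the IT equipment."""
        return total_facility_kw / it_load_kw

    it_load_kw = 1_000.0               # hypothetical compute load
    air_cooled_overhead_kw = 600.0     # cooling, power delivery, etc. -> PUE of 1.6
    liquid_cooled_overhead_kw = 150.0  # reduced overhead -> PUE of 1.15

    print(pue(it_load_kw + air_cooled_overhead_kw, it_load_kw))     # 1.6
    print(pue(it_load_kw + liquid_cooled_overhead_kw, it_load_kw))  # 1.15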

According to Asetek, a major manufacturer of water-cooling systems, liquid cooling is popular in high-performance use cases – supercomputers, custom gaming PCs, even a few phones – because liquid absorbs and carries heat far better than air. Rather than cooling the air in an entire building or pushing more airflow at specific components on a card, it’s relatively easy to soak up the heat with liquid and pump the warm liquid elsewhere to cool off.
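
For a rough sense of why liquid carries heat so much better, here is a small Python sketch comparing the volumetric heat capacity of water and air near room temperature (textbook constants, not figures from Nvidia or Asetek):

    # Approximate properties near 20 °C (standard textbook values)
    water_specific_heat = 4186.0  # J/(kg·K)
    water_density = 998.0         # kg/m^3
    air_specific_heat = 1005.0    # J/(kg·K), at constant pressure
    air_density = 1.204           # kg/m^3

    # Heat absorbed per cubic metre per degree of temperature rise
    water_volumetric = water_specific_heat * water_density  # ~4.2 MJ/(m^3·K)
    air_volumetric = air_specific_heat * air_density        # ~1.2 kJ/(m^3·K)

    print(f"Water stores roughly {water_volumetric / air_volumetric:,.0f}x more heat per unit volume than air")
    # -> on the order of 3,500x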

Liquid-cooled cards are not just more energy-efficient; they also take up less space than their air-cooled counterparts, meaning more of them can be fitted into the same rack.
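
As a simple sketch of that density argument, assuming a hypothetical chassis with eight PCIe slots and the one-slot versus two-slot widths described above:

    slots_per_chassis = 8         # hypothetical chassis

    air_cooled_slot_width = 2     # air-cooled versions occupy two PCIe slots
    liquid_cooled_slot_width = 1  # liquid-cooled versions occupy one

    print(slots_per_chassis // air_cooled_slot_width)     # 4 cards per chassis
    print(slots_per_chassis // liquid_cooled_slot_width)  # 8 cards per chassis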

Beyond energy efficiency, Nvidia says liquid cooling also helps conserve water. Data centers evaporate millions of gallons of water every year to keep air-cooled systems running. According to Zac Smith, head of edge infrastructure at Equinix, liquid cooling lets that water recirculate, turning “a waste into an asset.”

Although these cards won’t show up in the massive data centers run by Google, Microsoft, and Amazon – which are likely using liquid cooling already – that doesn’t mean they won’t have an impact. Banks, medical institutions, and data center providers like Equinix make up a large share of the data centers in operation today, and all of them could benefit from liquid-cooled GPUs.

Nvidia says this is just the start of a journey to carbon-neutral data centers. In a press release, Nvidia senior product marketing manager Joe Delaere wrote that the company plans “to support liquid cooling in our high-performance data center GPUs and our Nvidia HGX platforms for the foreseeable future.”

Nvidia’s push toward energy efficiency via liquid cooling comes at a time when companies are increasingly scrutinizing how much energy their servers use. Data centers are not big tech’s only source of carbon emissions and pollution, but they are too large to ignore. Critics have also emphasized that offsetting energy use through credits isn’t as impactful as reducing consumption altogether. Aiming to use less energy and water, several companies, including Microsoft, have experimented with submerging servers entirely in liquid and even placing whole data centers in the ocean.

Nvidia says that over a dozen server manufacturers – including Asus, Gigabyte, Supermicro, and others – plan to integrate the new cards into their products later this year, with the first systems hitting the market in Q3.