Nvidia is turning to liquid-cooled graphics cards to lower the amount of energy data centers consume while training AI models or processing huge amounts of data.
At Computex, the company announced a liquid-cooled variant of its A100 compute card, which it claims consumes 30 percent less power than the air-cooled version.
Nvidia has also pledged that this isn’t just a one-off, stating that it already has more liquid-cooled server cards on its roadmap. Additionally, the company has hinted that it may bring the technology to other applications, such as in-car systems, which require maintaining a cool temperature in enclosed spaces.
Even with liquid cooling, this can be a difficult problem to solve, as evidenced by Tesla’s recent chip recall due to overheating concerns. According to Nvidia, lowering the amount of energy required to carry out complicated computations might have a significant impact.
Peak performance and cooling in ever-smaller spaces matters to data centers, and Nvidia’s liquid-cooled A100 card targets that market. The chip giant also detailed new Jetson devices and designs for data-center systems. https://t.co/6FdlCCwYDD
— PCMag (@PCMag) May 25, 2022
Nvidia reports that data centers consume more than one percent of the world’s electricity, with 40 percent of that attributable to cooling. Graphics cards are only one part of the picture, but cutting that cooling bill by almost a third would have a huge effect.
Other components, like central processing units, storage devices, and networking equipment, also draw power and require cooling. Nvidia asserts that with liquid cooling, its GPU-accelerated systems can perform artificial intelligence and other high-performance workloads far more efficiently than servers that use CPUs alone.
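To put the figures above in perspective, here is a back-of-envelope sketch of the scale involved. The global electricity total is an assumed round number for illustration; the one-percent and 40-percent shares come from the article, and none of this is an Nvidia calculation.

```python
# Illustrative estimate of data-center cooling energy, using the article's shares.
# GLOBAL_ELECTRICITY_TWH is an assumed round figure, not a sourced statistic.
GLOBAL_ELECTRICITY_TWH = 25_000   # rough annual global electricity use (assumption)
datacenter_share = 0.01           # "more than one percent of the world's electricity"
cooling_share = 0.40              # "40 percent of that attributable to cooling"

datacenter_twh = GLOBAL_ELECTRICITY_TWH * datacenter_share
cooling_twh = datacenter_twh * cooling_share

print(f"Data centers: ~{datacenter_twh:.0f} TWh/yr")
print(f"Cooling alone: ~{cooling_twh:.0f} TWh/yr")
```

Even with a conservative global total, the cooling slice alone runs to hundreds of terawatt-hours a year, which is why a 30 percent efficiency gain on one component class still moves the needle.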
According to Asetek, a leading maker of water-cooling systems, liquids absorb heat more effectively than air, which is one reason liquid cooling is popular in high-performance use cases such as supercomputers, bespoke gaming PCs, and even a few phones.
And once you have warm liquid, it is relatively simple to transfer it elsewhere so that it can cool off. This is in contrast to trying to cool down the air in an entire building or increase airflow to the specific components on a card that are dumping out all of the heat, both of which are significantly more difficult.
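The physics behind that claim can be sketched with the standard heat equation Q = m·c·ΔT. The specific heat values below are common textbook figures, not numbers from Nvidia or Asetek, and the scenario is purely illustrative.

```python
# Why liquid beats air as a coolant: compare the heat a kilogram of each
# absorbs for a given temperature rise, using Q = m * c * dT.
SPECIFIC_HEAT_WATER = 4186  # J/(kg*K), textbook value for liquid water
SPECIFIC_HEAT_AIR = 1005    # J/(kg*K), textbook value for air at constant pressure

def heat_absorbed(mass_kg: float, specific_heat: float, delta_t_k: float) -> float:
    """Energy (joules) absorbed when `mass_kg` of coolant warms by `delta_t_k` kelvin."""
    return mass_kg * specific_heat * delta_t_k

# Warming 1 kg of each coolant by 10 K:
q_water = heat_absorbed(1.0, SPECIFIC_HEAT_WATER, 10.0)
q_air = heat_absorbed(1.0, SPECIFIC_HEAT_AIR, 10.0)
print(f"Water absorbs {q_water / q_air:.1f}x more heat than air, kilogram for kilogram")
```

Per kilogram, water holds roughly four times as much heat as air; and because water is also far denser, the advantage per unit volume flowing past a hot chip is larger still.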
Liquid-cooled graphics cards have a number of advantages over their air-cooled counterparts, one of which is that they take up significantly less space, letting operators fit more cards into the same rack volume.
Nvidia’s push to cut energy use with liquid cooling comes at a time when many businesses are scrutinizing how much power their servers consume. Data centers are not the only source of carbon emissions and pollution from large technology companies, but they are an integral part of the problem that cannot be overlooked.
Nvidia’s GPUs are becoming increasingly more power hungry, so the US giant is hoping to make datacenters using them “greener” with liquid-cooled PCIe cards that contain its highest-performing chips. At this year’s Computex event in Taiwan, the computer… https://t.co/IuBJJnepF4
— The Register: Summary (@_TheRegister) May 24, 2022
Critics have pointed out that offsetting energy use through credits is not as impactful as lowering consumption altogether. Companies such as Microsoft have conducted experiments in which they completely submerge servers in liquid and have even placed entire data centers in the ocean in an effort to reduce the amount of energy and water they use.
While the type of liquid cooling Nvidia is offering isn’t yet the norm for data centers, it’s not as out there as putting your servers in the ocean (though Microsoft’s experiments with that have been surprisingly successful so far). And rather than positioning the technology as cutting-edge, Nvidia pitches its liquid-cooled GPUs as suitable for “mainstream” servers.
WHEN CAN I GET A LIQUID-COOLED RTX CARD WITHOUT MODDING?
This does raise the issue of whether we could see Nvidia try to push liquid cooling even more mainstream by integrating liquid cooling into the reference designs for its gaming-focused cards. The company doesn’t mention any plans to do that, only saying that it plans to “support liquid cooling in our high-performance data center GPUs” for the “foreseeable future.”
However, server technology frequently makes its way down to home personal computer technology, and gaming cards coming straight from the factory with an all-in-one liquid cooler is not something that is completely unheard of — AMD has had a few reference designs that included a liquid-cooling loop, and third parties have sold liquid-cooled Nvidia cards in the past.
It wouldn’t surprise me if Nvidia announced an RTX 5000-series card that comes stock with a liquid cooler, given that the company’s cards continue to draw more and more power (a stock 3090 Ti can pull up to 450 watts).
When it comes to Nvidia’s cards designed specifically for use in data centers, the company claims that server manufacturers such as ASRock, Asus, and Supermicro will integrate liquid-cooled cards into their products “later this year,” and that slot-in PCIe A100 cards will be available in the third quarter of this year. In “early 2023,” the company plans to release a liquid-cooled PCIe version of its recently announced H100 card, which is the next-generation version of the A100.