The letters AI, standing for “artificial intelligence,” on display at the Amazon Web Services booth at the Hannover Messe industrial trade fair in Hanover, Germany, on March 31, 2025.
Julian Stratenschulte | Picture Alliance | Getty Images
Amazon’s cloud division said Wednesday that it has developed hardware to cool the next generation of Nvidia graphics processing units used for artificial intelligence workloads.
Nvidia’s GPUs, which have powered the generative AI boom, require enormous amounts of energy, which means companies using the processors need additional equipment to cool them down.
To get the most out of these power-hungry Nvidia GPUs, Amazon considered building data centers that could accommodate widespread liquid cooling. But that process would have taken too long, and commercially available equipment wouldn’t have worked, Dave Brown, vice president of compute and machine learning services at Amazon Web Services, said in a video posted to YouTube.
“They would take up too much data center floor space and significantly increase water usage,” Brown said. “And while some of these solutions could work at lower volumes for other providers, they simply wouldn’t provide enough liquid cooling capacity to support our scale.”
Instead, Amazon engineers devised the In-Row Heat Exchanger, or IRHX, which can be plugged into existing and new data centers alike. For previous generations of Nvidia chips, more traditional air cooling was sufficient.
Customers can now access the technology through AWS as computing instances that go by the name P6e, Brown wrote in a blog post. The new systems accompany Nvidia’s design for dense computing power: Nvidia’s GB200 NVL72 packs a single rack with 72 Nvidia Blackwell GPUs that are wired together to train and run large AI models.
Computing clusters based on Nvidia’s GB200 NVL72 were previously available through Microsoft or CoreWeave. AWS is the world’s largest supplier of cloud infrastructure.
Amazon has rolled out its own infrastructure hardware in the past. The company has custom chips for general-purpose computing and for AI, and it designs its own storage servers and networking routers. By operating homegrown hardware, Amazon is less dependent on third-party suppliers. In the first quarter, AWS delivered its widest operating margin since at least 2014, and the unit is responsible for most of Amazon’s net income.
Microsoft, the second-largest cloud provider, has followed Amazon’s lead in chip development. In 2023, the company designed its own cooling system, called Sidekicks, for the Maia AI chip it developed.
WATCH: AWS unveils its latest CPU chips, promising record networking speeds