Data center operators want to run chips at higher temps. Here’s why.

  • Higher-power chips are generating more heat, increasing cooling needs
  • Data center operators want chip designers to let them run chips at higher temperatures to balance performance with sustainability goals
  • The ecosystem conversation is slowly evolving to prioritize not just performance but also second- and third-order considerations

It might sound counterintuitive, but using hotter liquids to cool high-power chips could be the answer to sustainably supporting monster compute loads. The problem? Data center operators are still trying to convince chip-makers to get on board with the idea.

Liquid cooling is poised to become a standard cooling method in data centers across the country as demand for artificial intelligence (AI) and other high-performance compute workloads proliferates. Indeed, GPU giant Nvidia introduced its first liquid-cooled compute system earlier this year. And it’s already teasing next-gen chips that will likely push power demands beyond those of its 1,000-watt Blackwell GPUs.

My Truong is field CTO at data center operator Equinix. He told Fierce that silicon designers are still basing their products on “antiquated views” of the temperatures data centers should run at. And they’re asking for lower and lower water temperatures to deal with the thermal challenges presented by their higher-power chips. The floor for both – chip operating temperatures and supply water temperatures – needs to be higher, he argued.

“Instead of being able to say ‘I can give you 27-degree C water and you’ll have 3-degree C of heat gain across a heat exchanger,’ they’re coming back and asking for 15, 13, 12, 11, 10-degrees C water to the chip just to be able to dissipate the power, which is a trend line we are not excited about as an operator,” Truong said.
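The arithmetic behind that quote is a simple heat balance: for a given chip power and allowed temperature rise across the cold plate, the required coolant flow is flow = power / (specific heat × temperature rise). The sketch below is purely illustrative – it is not Equinix’s or Nvidia’s thermal model, and the figures are round-number assumptions – but it shows why a chip that can only tolerate a small heat gain pushes designers toward colder supply water or higher flow rates.

```python
# Rough heat-balance sketch for a liquid-cooled chip. Illustrative assumptions
# only -- not Equinix's or Nvidia's actual thermal model.

WATER_CP = 4186.0      # specific heat of water, J/(kg*K)
WATER_DENSITY = 0.997  # kg per liter, roughly, for warm water

def required_flow_lpm(chip_power_w: float, delta_t_c: float) -> float:
    """Coolant flow (liters/minute) needed to absorb chip_power_w with a delta_t_c rise."""
    kg_per_s = chip_power_w / (WATER_CP * delta_t_c)  # m_dot = P / (cp * dT)
    return kg_per_s / WATER_DENSITY * 60.0

# A 1,000 W (Blackwell-class) GPU with the 3 C heat gain Truong mentions,
# versus a looser 10 C rise:
print(f"{required_flow_lpm(1000, 3):.1f} L/min at a 3 C rise")    # ~4.8 L/min
print(f"{required_flow_lpm(1000, 10):.1f} L/min at a 10 C rise")  # ~1.4 L/min
```

The key point: the supply temperature itself doesn’t appear in that equation. What matters to the chip is its maximum case temperature; colder supply water is just one way to buy margin against that limit, and it shifts the energy burden onto the facility’s chillers.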

Why is this trend a bad thing?

Well, Truong explained that a data center operator’s entire cooling infrastructure – including its chillers – will have to work harder to achieve those water temperatures. That means even more power consumption.

Equinix is also working to recycle some of its waste heat by providing it (in the form of hot water) to local communities, which can use it to heat buildings and pools. By demanding lower supply temperatures in liquid cooling systems, chipmakers are making it harder for data center operators like Equinix to produce return water at a temperature high enough to be useful to those recycling partners.

However, if, for example, Equinix were permitted to run a maximum case temperature of 60 degrees C, it could use 50-degree C water for cooling. Equinix could probably supply water at that temperature across the bulk of its footprint without running chiller plants, and that water would exit the system at a temperature high enough to be useful to municipalities.
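A small sketch makes the contrast concrete. The thresholds below are assumptions chosen for illustration (not Equinix figures): roughly, supply water warm enough to be produced by dry coolers alone counts as “chiller-free,” and return water above a district-heating minimum counts as reusable.

```python
# Illustrative comparison of two supply-temperature regimes. The thresholds
# and the assumed loop temperature rise are placeholders, not Equinix data.

FREE_COOLING_SUPPLY_MIN_C = 40   # assumed: at or above this, dry coolers alone often suffice
DISTRICT_HEAT_MIN_RETURN_C = 45  # assumed minimum return temperature useful for heat reuse

def evaluate(supply_c: float, heat_gain_c: float = 10.0) -> None:
    """Print whether a given supply temperature avoids chillers and yields reusable heat."""
    return_c = supply_c + heat_gain_c
    chiller_free = supply_c >= FREE_COOLING_SUPPLY_MIN_C
    reusable = return_c >= DISTRICT_HEAT_MIN_RETURN_C
    print(f"supply {supply_c:>4.1f} C -> return {return_c:>4.1f} C | "
          f"chiller-free: {chiller_free} | useful for heat reuse: {reusable}")

evaluate(15.0)   # the colder water chipmakers are increasingly asking for
evaluate(50.0)   # the scenario Truong describes under a 60 C case-temp limit
```

Under those assumptions, the 15-degree supply case needs chillers and produces return water too cool to sell, while the 50-degree case does neither.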

“We need to continue having this conversation with the industry,” Truong said.

While silicon makers today are focused squarely on performance, he argued that once infrastructure operators start acquiring those chips and begin to feel the second- and third-order impacts of running them, the conversation will start to shift.

On a recent call, Fierce asked Dave Salvator, director of Accelerated Computing Products at Nvidia, whether today’s liquid cooling technology will be able to keep up with the cooling demands of its chips. Salvator acknowledged that the issue has not entirely been figured out but said the company is working with partners to come up with solutions. He added that beyond the direct-to-chip system it has already announced, Nvidia is also eyeing immersion cooling technology.

“We will look at any and/or all solutions that we think make sense to be able to deliver more AI capabilities into data centers,” Salvator stated.

Truong said that Equinix wants “to have that conversation, we are actively having that conversation… But the feedback loop that is currently missing is an option to go choose somebody else that provides a different outcome right now.”

“And that part of the ecosystem is developing – so what is AMD, what is Intel, what is SambaNova, what are all these other AI providers doing to go steer a different direction,” he concluded. “Once one of those catches on, I think that you’ll see the entire ecosystem go the right direction.”