Sandia LabNews

Supercool


New cooling method for supercomputers to save millions of gallons of water

Sandia engineer David J. Martinez examines the cooling system at Sandia’s supercomputing center. Photo by Randy Montoya

In different parts of the country, people discuss gray-water recycling and rainwater capture as ways to minimize the millions of gallons of ground water required to cool large data centers. But the simple answer in many climates, says David J. Martinez (9324), is to use liquid refrigerant.

Based on that principle, Dave — engineering project lead for Sandia’s infrastructure computing services — is helping design and monitor a cooling system expected to save 4 million to 5 million gallons annually in New Mexico if installed next spring at Sandia’s computing center, and hundreds of millions of gallons nationally if the method is widely adopted. The method is currently being tested at the National Renewable Energy Laboratory in Colorado, which expects to save a million gallons annually.

The system, built by Johnson Controls and called the Thermosyphon Cooler Hybrid System, cools like a refrigerator without the expense and energy requirements of a compressor.

My job is to eventually put cooling towers out of business.

Currently, many data centers use water to remove waste heat from servers. The warmed water is piped to cooling towers, where a separate stream of water is turned to mist and evaporates into the atmosphere. Like sweat evaporating from the body, the process removes heat from the piped water, which returns to chill the installation. But large-scale replenishment of the evaporated water is needed to continue the process. Thus, an increasing amount of water will be needed worldwide to evaporate heat from the growing number of data centers, which themselves are growing in size as more users put information into the “cloud.”
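The scale of that replenishment follows from simple physics: each kilogram of water that evaporates carries away its latent heat of vaporization. As a rough sketch (the heat load and constants here are illustrative assumptions, not Sandia measurements), the make-up water for a purely evaporative tower can be estimated like this:

```python
# Rough arithmetic for the evaporation loss described above: every kilogram
# of water that evaporates in a cooling tower carries away its latent heat.
# The 500 kW load is an assumed example, not a figure from Sandia.

H_FG = 2260.0      # kJ/kg, approximate latent heat of vaporization of water
GALLON_KG = 3.785  # kg, approximate mass of one gallon of water

def gallons_evaporated_per_day(heat_load_kw: float) -> float:
    """Gallons of make-up water needed per day if all heat leaves by evaporation."""
    kg_per_second = heat_load_kw / H_FG        # mass evaporated each second
    return kg_per_second * 86400 / GALLON_KG   # scale to a full day, in gallons

# A steady 500 kW load evaporates roughly 5,000 gallons a day,
# which over a year approaches the multimillion-gallon totals in the article.
print(f"{gallons_evaporated_per_day(500.0):,.0f} gal/day")
```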

“My job is to eventually put cooling towers out of business,” Dave says.

“Ten years ago, I gave a talk on the then-new approach of using water to directly cool supercomputers. There were 30 people at the start of my lecture, and only 10 at the end.

“‘Dave,’ they said, ‘no way water can cool a supercomputer. You need air.’

“So now most data centers use water to cool themselves, but I’m always looking at the future, and I see refrigerant cooling coming in for half the data centers in the US, north and west of Texas, where the climate will make it work.”

The prototype method uses a liquid refrigerant instead of water to carry away heat. The system works like this: Water heated by the computing center is pumped within a closed system into proximity with another system containing refrigerant. The refrigerant absorbs heat from the water so that the water, now cooled, can circulate to cool again. Meanwhile, the heated refrigerant vaporizes and rises in its closed system to exchange heat with the atmosphere. As heat is removed from the refrigerant, it condenses and sinks to absorb more heat, and the cycle repeats.
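The water side of that exchange is a straightforward energy balance: the heat the refrigerant must absorb equals the water's flow rate times its specific heat times the temperature drop. A minimal sketch, with assumed example numbers rather than the system's actual operating figures:

```python
# Illustrative energy balance for the closed water loop described above.
# Flow rate and temperatures are assumptions for the sketch.

def water_heat_load_kw(flow_kg_s: float, t_in_c: float, t_out_c: float) -> float:
    """Heat (kW) the refrigerant loop must absorb so the water returns cooled.

    Q = m_dot * c_p * (T_in - T_out), with c_p of water ~ 4.186 kJ/(kg*K).
    """
    c_p = 4.186  # kJ/(kg*K), specific heat of liquid water
    return flow_kg_s * c_p * (t_in_c - t_out_c)

# Example: 20 kg/s of water arriving at 30 C and leaving at 24 C
# must shed about 20 * 4.186 * 6 = 502 kW into the refrigerant loop.
load = water_heat_load_kw(20.0, 30.0, 24.0)
print(f"{load:.0f} kW")
```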

There’s no water loss, like there is in a cooling tower that relies on evaporation.

“There’s no water loss, like there is in a cooling tower that relies on evaporation,” Dave says. “We also don’t have to add chemicals such as biocides, another expense. This system does not use a compressor, which would incur more costs. The system instead uses a phase-changing refrigerant and only requires outside air that’s cool enough to absorb the heat.”

In New Mexico, that would occur in spring, fall, and winter, saving millions of gallons.

In summer, the state’s ambient temperature is high enough that a cooling tower, or some other method of evaporation, would be used. But more efficient computer architectures can raise the acceptable operating temperature for servers, making even occasional use of cooling towers less frequent.

“If you don’t have to cool a data center to 45 degrees Fahrenheit but instead only to 65 to 80 degrees, then a warmer outside air temperature — just a little cooler than the necessary temperature in the data center — could do the job,” Dave says.

For indirect air cooling in a facility, better design brings the correct amount of cooling to the right location, allowing operating temperature to be raised and allowing the refrigerant cycle to be used more during the year. “At Sandia, we used to have to run at 45 degrees F. Now we’re at 65-78 F. We arranged for air to flow more smoothly instead of ignoring whorls as it cycled in open spaces. We did that by working with supercomputer architects and manufacturers of cooling units so they designed more efficient air-flow arrangements. Also, we installed fans sensitive to room temperature, so they slow down as the room cools from decreased computer usage and go faster as computer demand increases. This results in a more efficient and economical way to circulate air in a data center.”
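The temperature-sensitive fan behavior Dave describes amounts to ramping fan speed across the room's operating band. A minimal sketch of that idea, using the article's 65–78 F band but with assumed minimum and maximum speeds:

```python
# Sketch of the temperature-sensitive fans described above: they slow as
# the room cools from decreased computer usage and speed up as demand rises.
# The 30%-100% speed range is an illustrative assumption, not Sandia's setting.

def fan_speed_pct(room_temp_f: float,
                  low_f: float = 65.0,   # bottom of the operating band (from article)
                  high_f: float = 78.0,  # top of the operating band (from article)
                  min_pct: float = 30.0,
                  max_pct: float = 100.0) -> float:
    """Linearly ramp fan speed as room temperature crosses the operating band."""
    if room_temp_f <= low_f:
        return min_pct   # cool room: fans idle at minimum speed
    if room_temp_f >= high_f:
        return max_pct   # hot room: fans run at full speed
    fraction = (room_temp_f - low_f) / (high_f - low_f)
    return min_pct + fraction * (max_pct - min_pct)

print(fan_speed_pct(60.0))   # below the band: minimum speed
print(fan_speed_pct(71.5))   # midpoint of the band: halfway ramp
print(fan_speed_pct(80.0))   # above the band: full speed
```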

In another smart water-saving procedure, big jobs that don’t need instant completion can be scheduled at night, when temperatures are cooler.
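That deferral policy is easy to picture in code. A toy sketch, assuming a hypothetical 10 p.m.–6 a.m. night window (the article does not specify one): urgent jobs run immediately, while everything else waits for cooler overnight air.

```python
# Toy scheduler for the idea above: big jobs that don't need instant
# completion are held until nighttime, when temperatures are cooler.
# The 10 PM - 6 AM window is an assumption for illustration.

from datetime import datetime, time

NIGHT_START = time(22, 0)  # 10 PM
NIGHT_END = time(6, 0)     # 6 AM

def next_run_time(now: datetime, urgent: bool) -> datetime:
    """Run urgent jobs now; defer others to tonight's cool window."""
    if urgent or now.time() >= NIGHT_START or now.time() < NIGHT_END:
        return now  # urgent, or it is already night
    # Daytime and deferrable: hold until the night window opens.
    return now.replace(hour=NIGHT_START.hour, minute=0, second=0, microsecond=0)

noon = datetime(2016, 6, 1, 12, 0)
print(next_run_time(noon, urgent=False))  # deferred to 10 PM the same day
print(next_run_time(noon, urgent=True))   # runs immediately at noon
```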

“Improving efficiencies inside a system raises efficiencies in the overall system,” Dave says. “That saves still more water by allowing more use of the water-saving refrigerant system.”