Hot Water Cooling? Three Projects Making it Work

The Dell modular data centers (MDCs) were customized for the eBay Project Mercury to allow them to use a “hot-water cooling” system in which air was cooled by a cooling loop at temperatures up to 87 degrees. (Photo: eBay)

The phrase “hot water cooling” seems like an oxymoron. How can hot water possibly help cool servers in high-density data centers?

Although the data center community has become conditioned to think of temperatures between 60 and 75 degrees as the proper climate for a server room, there are many ways to keep equipment running smoothly with cooling technologies featuring significantly higher temperatures.

There are three recent examples of this trend. Last week’s release of the Top 500 list of the world’s most powerful supercomputers highlighted the new IBM SuperMUC system at Leibniz Supercomputing Centre (LRZ) in Germany, which placed fourth on the list.

Another example is the eBay Project Mercury data center in Phoenix, which houses high-density racks of servers in rooftop containers that have continued to operate at high efficiency at exterior temperatures of up to 119 degrees. A micro-modular data center enclosure from Elliptical Mobile Systems has also shown the ability to cool components using warm water.

Benefits of Hot Water Cooling

The projects employ different approaches to hot water cooling, all of which benefit from tightly designed and controlled environments that focus the cooling as close as possible to the heat-generating components.

Using a higher water temperature in a cooling system lets you run your chiller less, or eliminate it altogether. Higher inlet water temperature maximizes the number of hours in which “free cooling” is possible through the use of water-side economizers. Many data center cooling systems set the chilled water temperature in a range between 45 and 55 degrees. Here’s a look at three projects that have pushed the boundaries on water temperature.
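The economizer benefit can be sketched with some simple arithmetic. In this hypothetical example, an hour counts as a “free cooling” hour when the outside wet-bulb temperature plus the cooling tower’s approach temperature stays at or below the chilled-water setpoint; the 7-degree approach and the toy weather profile are illustrative assumptions, not figures from any of these projects.

```python
# Hypothetical sketch: estimate annual "free cooling" hours for a
# water-side economizer. An hour qualifies when the cooling towers can
# produce water at or below the loop setpoint without running a chiller.

def free_cooling_hours(hourly_wet_bulb_f, setpoint_f, tower_approach_f=7.0):
    """Count hours where the economizer alone can meet the setpoint."""
    return sum(1 for wb in hourly_wet_bulb_f
               if wb + tower_approach_f <= setpoint_f)

# Toy year: the same hourly wet-bulb profile, judged against two setpoints.
profile = [55, 60, 65, 70, 75, 80] * 1460  # 8,760 hourly readings

conventional = free_cooling_hours(profile, setpoint_f=50)  # 45-55 F loop
hot_water = free_cooling_hours(profile, setpoint_f=87)     # 87 F loop
```

With the conventional setpoint the economizer never qualifies in this toy profile, while the 87-degree loop runs chiller-free all year, which is the basic argument for warmer water.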

IBM’s SuperMUC Supercomputer

The new LRZ SuperMUC system was built with IBM System x iDataPlex Direct Water Cooled dx360 M4 servers. IBM’s hot-water cooling technology directly cools active components in the system such as processors and memory modules with coolant temperatures that can reach as high as 113 degrees Fahrenheit.

By bringing the cooling directly to components, SuperMUC allows an increased inlet temperature. “It is easily possible to provide water having up to 40 degrees Celsius using simple ‘free-cooling’ equipment, as outside temperatures in Germany hardly ever exceed 35 degrees Celsius,” LRZ says. “At the same time the outlet water can be made quite hot (up to 70 degrees Celsius) and re-used in other technical processes – for example to heat buildings or in other technical processes.”
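Since LRZ quotes its figures in Celsius while the rest of the article uses Fahrenheit, a quick conversion puts the numbers side by side: the 40-degree-Celsius supply water is 104 degrees Fahrenheit, and the 70-degree-Celsius outlet water is 158 degrees Fahrenheit.

```python
# Converting the Celsius figures LRZ quotes into the Fahrenheit scale
# used elsewhere in this article.

def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9.0 / 5.0 + 32.0

supply_f = c_to_f(40)   # 40 C supply water  -> 104.0 F
return_f = c_to_f(70)   # 70 C outlet water  -> 158.0 F
```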

SuperMUC is based on the liquid cooling system developed for the Aquasar supercomputer at the Swiss Federal Institute of Technology Zurich (ETH) in 2010. The cooling system features capillary-like pipes that bring coolant to the components, remove the heat, and then return it to a passive cooling system that uses fresh air to cool the water. IBM has a video providing additional information on the SuperMUC cooling system.

eBay’s Project Mercury

In his vision for Project Mercury, eBay’s Dean Nelson sought a design that could run without chillers in even the most brutal climates – such as Phoenix, where daytime temperatures regularly exceed 100 degrees. Nelson, the Director of Global Foundation Services for eBay, wanted to test the limits of using air to cool servers.

In designing for year-round use of free cooling, eBay deployed data center containers from Dell that were able to use a water loop as warm as 87 degrees F and still keep servers running within their safe operating range. Dell warranties its servers for fresh-air cooling, certifying that they are capable of running at 104 degrees Fahrenheit for up to 900 hours per year and at 113 degrees Fahrenheit for up to 90 hours per year.

To make the system work at the higher water temperature, it was designed with an unusually tight “Delta T” – the difference between the temperature of air at the server inlet and the temperature as it exits the back of the rack. Nelson says eBay’s servers were designed to maintain a Delta T of 6 to 12 degrees. This allows eBay to raise the inlet temperature and still maintain the exhaust heat at a manageable level.
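The Delta T arithmetic above can be made concrete. In this sketch, the exhaust temperature is simply the inlet temperature plus the rack’s Delta T; pairing the 87-degree water loop with eBay’s stated 6-to-12-degree range shows why the exhaust stays manageable.

```python
# Sketch of the Delta T arithmetic: with a tight, known Delta T,
# raising the inlet temperature still leaves the exhaust air at a
# predictable level.

def exhaust_temp_f(inlet_f, delta_t_f):
    """Exhaust air temperature = inlet temperature + rack Delta T."""
    return inlet_f + delta_t_f

# eBay's servers hold a Delta T of 6 to 12 F. Even with a warm 87 F
# inlet, the worst-case exhaust stays just under 100 F.
worst_case = exhaust_temp_f(87, 12)   # 99 F
best_case = exhaust_temp_f(87, 6)     # 93 F
```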

Nelson discusses the approach to Project Mercury in this video from Dell DCS.

Elliptical Mobile Systems

Recent testing found that Elliptical Mobile’s newest enclosure can cool high-density loads using water in a range of 65 degrees all the way up to 85 degrees. The R.A.S.E.R. HD is a 42U enclosure designed to handle IT loads from 20 kW to 80 kW.

The testing was conducted at the United Metal Products facility in Tempe, Arizona, with the enclosures placed outdoors on a 100-degree day. The testing used a 23kW load bank to simulate IT loads, and found the unit was able to maintain a server inlet temperature around 85 degrees after the water temperature was raised to 85 degrees.

The cooling system for the R.A.S.E.R. HD consists of an air loop and a water loop. The fans of the cooling unit draw warm air from the rear section of the cabinet into an air/water heat exchanger. The air is cooled and then blown into the front area of the cabinet. Inside the heat exchanger, the heat energy of the warm air is transferred to the water. The heat exchanger is connected to an external reciprocating chiller unit, where the water is cooled again.
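The water loop described above can be roughly sized from first principles using Q = ṁ · cₚ · ΔT. This is a hypothetical back-of-the-envelope sketch, not a figure from the testing: the 8-degree-Celsius water temperature rise is an assumption, with only the 23 kW load taken from the article.

```python
# Hypothetical sizing sketch for the air/water heat exchanger: the water
# flow needed to carry away a given IT load follows Q = m_dot * cp * dT.

CP_WATER = 4186.0          # J/(kg*K), specific heat of water
LITERS_PER_KG = 1.0 / 0.998  # liters per kg of water at ~20 C (approx.)

def water_flow_lps(load_kw, water_delta_t_c):
    """Liters per second of water needed to absorb load_kw of heat."""
    kg_per_s = (load_kw * 1000.0) / (CP_WATER * water_delta_t_c)
    return kg_per_s * LITERS_PER_KG

# The 23 kW load bank from the test, assuming the water warms by 8 C:
flow = water_flow_lps(23, 8)   # roughly 0.69 liters per second
```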

In this video, Scott Good of gkworks provides an overview of the testing and a closer look at the enclosures in action.

DCK’s John Rath contributed to this story.


About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.

Add Your Comments



  1. MG

    The main information I am missing is what server technology is used so that they do not have problems with such high temperatures

  2. Thomas

    It would be nice to see all those temperatures in the format that the entire world except for the US uses. Though, I do see some Celsius sprinkled in there. A 100-degree day would mean water is boiling everywhere, and make for a pretty interesting day :)

  3. Barry Barlow

    Try looking at the Iceotope system from the UK. This will cool units with water temperatures of around 45 degrees C (circa 113 F) and still remove 20 kW per rack. See for full time free cooling that will even work in a desert climate

  4. Scot Heath

    Any server worth its salt will easily accommodate 85 degree F inlet temps. This by no means guarantees a minimum energy configuration due to the energy used by the server fans. I suspect that in the case of the solution mentioned which uses a chiller, the COP increase of the chiller is more than offset by the increased IT fan energy.