U.S. Defense Department to Cool Servers With Hot Water

A server tray using Asetek’s Rack CDU Liquid Cooling system, which is being implemented in a U.S. Department of Defense data center. The piping system connects to a cooling distribution unit. (Source: Asetek)

The U.S. Department of Defense (DoD) will soon begin cooling its servers with hot water. The DoD said this week that it will convert one of its data centers to use a liquid cooling system from Asetek Inc. The move could clear the way for broader use of liquid cooling in high-density server deployments at the DoD, which says it will carefully track the efficiency and cost savings from the project.

Asetek was selected for the $2 million project to retrofit a major DoD data center with its direct-to-chip liquid-cooling technology, called RackCDU (short for Rack Coolant Distribution Unit). RackCDU brings high-performance cooling directly to the hottest elements inside every server in a data center, removing processor heat without the use of traditional computer room air conditioners or water chillers.

Benefits of Hot Water Cooling

The RackCDU solution uses hot water cooling, which allows it to work using only outside ambient air (free cooling). While most air cooling systems use chilled water at temperatures as low as 45 degrees, higher water temperatures are possible in tightly designed and controlled environments that bring the cooling as close as possible to the heat-generating components. Turning off the chillers and CRAC units allows data center operators to slash the amount of power required to support their cooling systems. (See Hot Water Cooling: Three Projects Making it Work for more examples of this approach.)
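To see why eliminating chillers matters, here is a minimal back-of-the-envelope sketch. All numbers are illustrative assumptions (a 1 MW IT load, a chiller coefficient of performance of 4, an 80% liquid-capture fraction, and 3% pump overhead), not Asetek or DoD figures:

```python
# Hypothetical illustration: electrical power needed to reject the heat from a
# 1 MW IT load with a chiller plant versus a hot-water loop that only runs pumps.
# All figures below are illustrative assumptions, not Asetek or DoD data.

IT_LOAD_KW = 1000            # assumed IT load, kW
CAPTURE_FRACTION = 0.80      # share of heat captured by the direct-to-chip loop
CHILLER_COP = 4.0            # assumed chiller coefficient of performance
PUMP_OVERHEAD = 0.03         # assumed pump power as a fraction of heat moved

# Chiller-based cooling: electrical power = heat rejected / COP
chiller_power_kw = IT_LOAD_KW / CHILLER_COP

# Hot-water free cooling: captured heat is moved by pumps only;
# the remaining heat is still handled by conventional chilled-air cooling.
captured_kw = IT_LOAD_KW * CAPTURE_FRACTION
pump_power_kw = captured_kw * PUMP_OVERHEAD
residual_chiller_kw = (IT_LOAD_KW - captured_kw) / CHILLER_COP
free_cooling_power_kw = pump_power_kw + residual_chiller_kw

print(f"Chiller plant: {chiller_power_kw:.0f} kW")               # 250 kW
print(f"Liquid + free cooling: {free_cooling_power_kw:.0f} kW")  # 74 kW
```

Under these assumed numbers, cooling power drops by roughly 70%, which is the kind of saving that makes free cooling attractive; real savings depend heavily on climate, water temperatures, and plant design.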

“The Department of Defense has become very serious about improving data center efficiency, and they are seeking new approaches to address this mission-critical problem,” said Andre Eriksen, Asetek’s CEO and founder. “Hot water direct-to-chip liquid-cooling is a powerful approach that can capture more than 80% of the heat generated by a data center and remove it from the building, where it can be cooled for free by ambient air or even reused for building heating and hot water. No power whatsoever goes into actively chilling the water.”

Multiple federal mandates are driving the DoD to increase energy efficiency, increase the use of renewable energy and to consolidate data centers. Similar mandates affect data centers operated by other departments of the Federal government.

Extending Liquid Cooling to Pizza Boxes and Blades

Liquid cooling has been used in government data centers that house supercomputers, such as those operated by the Department of Energy (such as Oak Ridge National Laboratory) and the National Security Agency (NSA). The DoD initiative extends liquid cooling to rack servers and blade servers as well.

The project will convert an existing air-cooled enterprise data center into a liquid-cooled data center without disrupting operations during the transition, with significant improvements in energy consumption and density (enabling consolidation within existing facilities), and with opportunities to reuse energy by capturing the waste heat from servers.

Asetek has been a leading supplier of liquid cooling solutions for high-performance gaming PCs and workstations, and recently announced its entry into the data center market.

The National Renewable Energy Lab (NREL) will analyze the energy efficiency performance, savings, lifecycle cost, and environmental benefits of RackCDU, while McKinstry will install investment-grade monitoring and collect the resulting data. Measured, validated energy savings and performance results could qualify Asetek liquid cooling technology for broader adoption across the DoD.

Johnson Controls Federal Systems, a business unit of Johnson Controls, was chosen for the installation and integration of the system. “This new liquid cooling technology has the potential to shape the future of this industry and will provide a low-cost retrofit solution that can be applied to virtually all data centers,” said Mark Duszynski, vice president of Johnson Controls Federal Systems.

In this video from SC12, Asetek founder and CEO André Eriksen describes the company’s innovative hot water cooling technologies for HPC and cloud computing in an interview with Rich Brueckner of InsideHPC.


About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.

Add Your Comments



  1. Robert Daugherty

    I can't imagine the costs involved to us taxpayers for this or how their contractor could justify the approach. Most government systems aren't running at a level of utilization that would warrant either the up-front costs or the long-term operational expenses, given the additional complexity and maintenance. You can cool up to 30kW a cabinet with air and a proper design. No water or in-row cooling needed, and all at a fraction of the cost that this solution will run. As someone said at the Gartner Data Center conference last week, if you soak a duck long enough even it will leak. With the number of connections and fittings this system introduces, each one is a potential disaster.

  2. James

    At an estimated cost of $2 million with a projected savings of about 50% per year on cooling costs, this actually has great potential to save money in the long run. The excess heat can also be used to heat water or even other rooms in the building, potentially saving more money on heating costs. Simple air cooling in a medium to large server room still requires constant air conditioning. This is why many companies, including Facebook and Google, are building massive data centers at far northern latitudes: they can take advantage of similar passive cooling technologies that require far less energy, using the natural resource of "cold" provided by these higher latitudes.