Are data center operators ready to abandon hot and cold aisles and submerge their servers? Startup Green Revolution Cooling has developed a liquid cooling enclosure that it says can cool high-density server installations for a fraction of the cost of air cooling in traditional data centers.
The Austin-based company’s CarnotJet Dielectric Fluid Submersion Cooling System resembles a rack tipped over on its back, with servers inserted vertically into slots in the enclosure, which is filled with a coolant similar to mineral oil.
Green Revolution says its approach can produce large savings on infrastructure, allowing users to operate servers without a raised floor, CRAC units or chillers. The company launched at the SC09 conference last fall and plans to install its first units at the Texas Advanced Computing Center, home to the Ranger supercomputer.
In this video from Data Center World, Green Revolution co-founder Christiaan Best provides an overview of his company’s offering, along with a look at a demo of a “swimming server” at the company’s booth.
As Best indicates in the video, submersion cooling isn’t new. Mineral oil has been used in immersion cooling because it is not hazardous and transfers heat almost as well as water, but doesn’t conduct an electric charge. Many readers may be familiar with Fluorinert, a dielectric coolant from 3M that was used in the Cray 2 and other supercomputers. An immersion system with a different design was introduced by UK firm Iceotope at the SC09 event.
Green Revolution says its GreenDEF coolant has been “optimized for data centers” and can support heat loads of up to 100 kilowatts per 42U rack, far beyond current average heat loads of 4 to 8 kilowatts per rack and high-density loads of 12 to 30 kilowatts per rack. The company says the CarnotJet system is designed to comply with fire codes and the Clean Water Act, and integrates with standard power distribution units (PDUs) and network switches.
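To put those density claims in perspective, here is a quick back-of-envelope comparison. The figures come from the article; the per-rack-unit breakdown and the helper function are our own illustration, not anything Green Revolution publishes:

```python
# Back-of-envelope comparison of the rack heat densities cited above.
# Figures are the article's stated numbers; the per-U math is illustrative.

RACK_UNITS = 42  # standard 42U rack, as in the CarnotJet claim


def kw_per_u(rack_kw: float, units: int = RACK_UNITS) -> float:
    """Heat load per rack unit, in kilowatts."""
    return rack_kw / units


scenarios = [
    ("average (high end)", 8),        # 4-8 kW/rack typical range
    ("high-density (high end)", 30),  # 12-30 kW/rack range
    ("submersion (claimed)", 100),    # CarnotJet's claimed capacity
]

for label, kw in scenarios:
    print(f"{label}: {kw} kW/rack = {kw_per_u(kw):.2f} kW per U")
```

At the claimed 100 kW, every rack unit would dissipate roughly 2.4 kW, more than three times the per-U load of today's densest air-cooled racks, which is why the company argues conventional CRAC-based cooling can't keep up.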
Some mineral oil-style coolants can be messy to maintain. The company says the coolant can be drained for enclosure-level maintenance, and individual servers can be removed for work, a process documented by Green Revolution in this video: