Colocation Company Will Submerge Servers
August 25th, 2010 By: Rich Miller
Austin-based Midas Networks will soon offer colocation customers the opportunity to submerge their servers in a liquid cooling enclosure. Midas is the first hosting customer for Green Revolution Cooling, an Austin startup that says its approach can produce large savings on infrastructure, allowing users to operate servers without a raised floor, air conditioning units or chillers.
Midas Networks has purchased four of Green Revolution’s 42U CarnotJet racks, which submerge servers in a dielectric fluid. The CarnotJet system resembles a rack tipped over on its back, with servers inserted vertically into slots in the enclosure, which is filled with a coolant similar to mineral oil. The Midas installation is scheduled to be operational in the fourth quarter of this year.
Not Just for HPC Anymore
Green Revolution introduced its product last fall at the SC09 supercomputing show, targeting high performance computing (HPC) applications supporting power loads of 25 kilowatts per rack and beyond. Liquid cooling has been fairly common in HPC and supercomputing environments, but not in colocation facilities. But that’s changing as more companies pack their gear into high-density racks.
“The market is moving towards denser server applications,” said Ken Tooke of Midas Networks. “Now we will be one of the few colocation and hosting centers that can properly support blades, trays and other extremely dense computing platforms. With GR Cooling racks, our costs will be a lot lower too – for any type of server, low density or high density. We will pass that savings along to our customer.”
Dielectric Cooling Not New
Submersion cooling isn’t new. Mineral oil has long been used in immersion cooling because it is not hazardous and transfers heat almost as well as water, yet does not conduct an electric charge. Many DCK readers may be familiar with Fluorinert, a dielectric coolant from 3M that was used in the Cray-2 and other supercomputers. An immersion system with a different design was introduced by UK firm Iceotope at the SC09 event, while Hardcore Computer introduced its Liquid Blade immersion cooling unit this spring.
We spoke with Green Revolution co-founder Christiaan Best earlier this year at Data Center World, where he provided an overview of his company’s offering, along with a look at a demo of a “swimming server” at the company’s booth.
jeff hinkle | Posted August 25th, 2010
Did they get a permit from the fire department for this, and if so, did they receive much pushback from the Fire Marshal regarding storage of that much combustible oil on the premises?
I have been interested in this but have not spent time on it yet.
Mark Tlapak | Posted August 25th, 2010
While we try not to post company messages here, hopefully answering this question adds unbiased value for readers.
Under IFC 3402.1, Class IIIB liquids (such as our coolant, and almost certainly the competitor’s) that have no toxicity or reactivity concerns may be stored in unlimited quantities per IFC Table 2703.1.1(1), Footnote f, as long as the building is equipped with a fire suppression system such as sprinklers or an equivalent.
The Fire Marshal also likes to see other safety features built in, such as overflow protection, loss-of-circulation detection, and manual and automatic shutoff, that provide fail-safes.
Upon request, the company (GR Cooling) can send out a comprehensive third-party review of the IFC and IBC and how they apply specifically to our system. I would think most of it would apply to the competitors as well, for anyone looking at submersion cooling in general.
jeff hinkle | Posted August 25th, 2010
awesome – thank you for an insightful reply
Derek Johnson | Posted August 26th, 2010
How does the coolant impact optical connections?
Can you elaborate on the electrical distribution methodology for this? From the video and the website there is no mention of it, and it looks like you are using either low-voltage DC or 120V AC. Also, your website says you do not need generators for backup. I would assume that if you have low voltage and battery carry-through at the cabinet, this may be achievable.
The power of computing these days is amazing. I’ve known people who cool their own PCs with liquid nitrogen. They’re like supercomputers themselves.
Mark Tlapak | Posted August 31st, 2010
For the control system, we usually use 115V AC, with dual feeds for redundancy. The control computer also has battery backup internal to the control box.
The servers themselves will still of course need a generator, but that can be smaller since the data center itself uses less power.