Liquid Cooling
GRC's cooling tanks at a CGG data center in Houston (Photo: GRC)

Bitcoin Drove a Surge in Immersion Cooling Sales, But GRC is Eyeing More Stable Markets

Nearing its 10-year anniversary, the company formerly known as Green Revolution Cooling has yet to foment a revolution • A recent name change signals it’s no longer trying to • In an interview with Data Center Knowledge, GRC’s CEO Peter Poulin said new applications like AI and edge computing will drive demand for immersion cooling

If you pay attention to what’s happening in the data center cooling space, chances are you’ve heard of Green Revolution Cooling. Its seemingly radical approach of submerging servers in dielectric fluid to cut down the amount of energy used for cooling has earned it some press, a handful of innovation awards, and quite a few customers in both the public and private sectors.

The efficiency improvements over traditional air-based technologies are substantial. One high-frequency trading firm that’s been testing the solution recently saw not only a reduction in energy use from no longer having to push air around or chill water to below 50°F, but also a dramatic improvement from removing server fans. (Fans aren’t needed because the servers are submerged in tanks filled with coolant.)

The trading firm’s data center engineers ran an identical set of servers cooled by traditional means side-by-side with the pilot deployment and saw the servers dunked in oil use 30 percent less energy, according to the cooling company’s CEO, Peter Poulin. That was the highest energy savings from removing fans Poulin’s ever seen; 15 percent is more common.

Even at 15 percent, the load reduction usually outstrips the amount of power the cooling system itself requires. “So, it’s almost like you get the cooling for free from the energy perspective,” Poulin said in an interview with Data Center Knowledge.
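Poulin’s “cooling for free” point can be illustrated with a back-of-envelope calculation. The overhead and load figures below are illustrative assumptions, not GRC’s numbers: a conventional facility is assumed to spend roughly 40 percent extra on chillers and air handlers, while an immersion system is assumed to add only a small pumping overhead on top of an IT load already reduced 15 percent by fan removal.

```python
# Back-of-envelope comparison of air-cooled vs. immersion-cooled energy use.
# All figures are hypothetical assumptions for illustration, not GRC data.

def annual_energy_mwh(it_load_kw: float, cooling_overhead: float, hours: int = 8760) -> float:
    """Total annual energy (MWh) for an IT load plus its cooling overhead."""
    return it_load_kw * (1 + cooling_overhead) * hours / 1000

air_it_kw = 100.0          # assumed baseline air-cooled IT load
fan_savings = 0.15         # ~15% typical savings from removing server fans
immersion_it_kw = air_it_kw * (1 - fan_savings)

# Assumed overheads: 40% for chillers/CRACs vs. 5% for coolant pumps
air_cooled = annual_energy_mwh(air_it_kw, cooling_overhead=0.40)
immersion = annual_energy_mwh(immersion_it_kw, cooling_overhead=0.05)

print(f"Air-cooled: {air_cooled:.0f} MWh/yr")
print(f"Immersion:  {immersion:.0f} MWh/yr")
print(f"Savings:    {100 * (1 - immersion / air_cooled):.0f}%")
```

Under these assumed numbers, the fan-removal savings alone (15 kW) exceed the pump overhead (about 4.5 kW), which is the sense in which the cooling comes “for free.”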

He declined to name the pilot customer (this kind of information is often veiled by non-disclosure agreements) but said the firm was his first potential client in high-frequency trading. It’s a space where he hopes the company, whose solutions are currently installed in 20 to 30 data centers across various industry verticals, will see some traction soon as players’ computing requirements increase.

Green Revolution Cooling recently rebranded, becoming simply GRC. Poulin, who joined the company in 2016, said the rebrand was necessary because the product was mature and no longer revolutionary. The move is also probably smart because after nearly a decade in existence, the Austin-based company hasn’t exactly fomented a revolution. Air-based cooling still dominates in the data center industry, and whatever traction liquid-cooling designs have had to date has been driven primarily by the better understood direct-to-chip or rear-door heat exchanger approaches.

GRC hasn’t yet reached profitability. “We’re spending on growth,” Poulin said. “Because we think this market is going to be a lot bigger.”

The company will celebrate its 10-year anniversary next January, but at a recent data center industry event in San Francisco, the CEO still found himself fielding questions like, “Don’t you guys only support your own servers?” (GRC supports several well-known server brands) and, “Isn’t your oil a fire hazard?” (it isn’t, according to Poulin). A lot of misinformation about immersion cooling remains in the market, he said.

This year has been good for GRC in terms of sales – due primarily to cryptocurrency mining. In fact, the last big surge in bitcoin price, which peaked close to $20,000 in December, drove an unusual jump in revenue for the company in the first half of the year. It takes a lot of power to mine digital currency, and mining companies generally operate with tiny margins, making GRC’s promise of dramatic energy savings attractive.

As it happened, the company had launched a product line designed specifically for crypto last September. Orders for the single-rack immersion solution for mining – HashRaQ – and for the six-rack system inside a shipping container – HashTank – pushed its revenue for the first six months of 2018 to about five times what GRC had been turning over annually for the previous three years, Poulin said.

The cryptocurrency market’s volatility isn’t lost on him, of course. Green Revolution has seen this movie before: it sold a 10MW system to a crypto miner during the previous short-lived bitcoin surge, in late 2013 and early 2014, only to see the customer go out of business a year later and return the gear.

Poulin said Green Revolution didn’t lose any money on that deal, but the company now requires crypto customers to pay 80 percent of the contract cost upfront before a factory sees an order.

GRC’s sales in that space have “slowed considerably” since the price of bitcoin tanked earlier this year, and management’s sights are now set on the more stable enterprise data center markets.

Not surprisingly, most of the traction it’s had outside the cryptocurrency market has been in high-performance computing. HPC users, in academia and elsewhere, are intimately familiar with liquid cooling, although the traditional approach in HPC has been to bring coolant directly to processor heat sinks via piping.

To date, the company’s biggest non-crypto deployment is a 1.5MW system for CGG, a Houston-based company whose ships comb the oceans for geological data. CGG then crunches that data with a GRC-cooled supercomputer to produce reports that energy companies buy and use to look for oil deposits. Poulin expects more business to come from the oil and gas sector in the future.

In academia, GRC is used to cool supercomputers at the Texas Advanced Computing Center in Austin, the Vienna Scientific Cluster in Austria, and the Tokyo Institute of Technology.

But supercomputers are a niche market, and Poulin is hopeful that new, emerging applications that have been driving up power densities in enterprise data centers will also drive new business for GRC.

More specifically, training machine learning models for AI requires dense GPU clusters that aren’t much different from supercomputers, and hyperscale cloud operators like Microsoft, Google, and Facebook have been expanding that kind of infrastructure in their facilities. Google is already using direct-to-chip liquid cooling in its data centers to cool its custom TPU chips for AI, and at least some of the others are actively exploring the space.

“One of the hyperscale guys has just bought a pilot system from us for their AI cloud,” Poulin said. “They’re seeing the density challenge in front of them.” Again, he declined to name the pilot customer.

Chinese cloud giants Baidu and Alibaba both have been running immersion-cooling pilots. They’re not using GRC’s solution, but they have been sending engineers to study GRC’s design, Poulin said.

Another developing space the company is watching closely is edge computing. If companies are soon going to need servers running everywhere there’s a high concentration of end users, or in places like factory floors, offshore oil rigs, and retail stores, there will be demand for compact solutions for deploying lots of computing capacity. If these edge deployments crunch data in real time using heavy-duty analytics or AI software – the so-called “intelligent edge” – they just might be dense enough for immersion cooling to make sense.

It’s very early days for both AI and edge computing, and there’s little certainty about what the infrastructure supporting them will ultimately look like. Chances are there will be a variety of design approaches, at least for some time. There’s a lot of innovation still left to be done on both fronts.

If GRC is to break out of its current niche status, those and other applications, such as high-frequency trading, will have to reach a certain level of power density, while the company works to dispel the myths about its technology that continue swirling around the industry.

Today, “the majority of applications for cooling don’t require the things that can be accomplished with immersion,” Poulin admitted. “But we kind of see that turning a little bit.”
