GRC's cooling tanks at a CGG data center in Houston (Image: GRC)

Schneider and HPE Throw Their Weight Behind Data Center Liquid Cooling

Two major data center vendors enter separate partnerships around liquid cooling solutions.

Hewlett Packard Enterprise and Schneider Electric have thrown their weight behind data center liquid cooling in two separate partnership announcements – the latest signs that the emerging market is starting to gain momentum.

GRC (formerly Green Revolution Cooling) announced a partnership with HPE in late September to integrate HPE servers with its ICERaQ immersion cooling systems; the companies will sell them as all-in-one, pre-packaged systems.

“This is a big deal for us, and apparently it’s a big deal for HPE as the requests internally are starting to flow pretty strong,” said GRC’s chief revenue officer Jim Weynand. In an interview with Data Center Knowledge, he said the company saw 500 percent revenue growth last year.

In early October, Schneider Electric teamed up with liquid cooling company Iceotope and tech solutions provider Avnet to co-develop chassis-level data center liquid cooling solutions. They plan to release a product within the next 12 months, said Rob Bunger, Schneider’s US business development director for energy management.

“We are working together to leverage our strengths: Iceotope’s liquid cooling technology, Schneider Electric’s expertise in data center facilities, and Avnet’s expertise on the IT stack,” he told us.

Liquid cooling is primarily used for mainframes, gaming, and high-performance computing (HPC). But analysts say they are seeing growing interest from enterprises and hyperscale cloud platforms as those customers adopt faster processors and deploy machine learning workloads that draw more power and produce more heat. The backing of two big tech companies – HPE and Schneider – is raising awareness and should boost liquid cooling adoption, they say.

“For GRC, the big win is the additional validation and publicity the formal HPE partnership provides,” said Daniel Bizo, senior analyst for 451 Research’s data centers and critical infrastructure channel. “As one of the largest server brands, the availability of HPE systems will make more enterprises and mid-tier service providers consider liquid cooling and GRC.”

“The Schneider-Iceotope-Avnet announcement is a signal to the market that Schneider Electric, one of the world’s largest vendors of data center equipment, is now throwing its weight behind liquid cooling and specifically Iceotope’s approach,” Bizo added in an interview with Data Center Knowledge.

Growing Interest in Liquid Cooling

Liquid cooling in the data center provides numerous benefits over traditional air cooling: it significantly reduces data center power consumption and Power Usage Effectiveness (PUE), and it enables operators to increase cooling capacity in select places in the data center without overcooling the entire facility just to eliminate a few “hot spots.”
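
As a rough illustration of the PUE point (a sketch with assumed figures, not numbers from GRC or Schneider): PUE is total facility power divided by IT power, so cutting the mechanical cooling overhead pulls the ratio toward the ideal of 1.0.

```python
def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return (it_kw + cooling_kw + other_kw) / it_kw

# Hypothetical 1 MW IT load, before and after replacing most of the
# mechanical cooling plant with liquid cooling (illustrative values).
air_cooled = pue(it_kw=1000, cooling_kw=400, other_kw=100)  # 1.50
immersion = pue(it_kw=1000, cooling_kw=50, other_kw=100)    # 1.15
print(f"PUE air-cooled: {air_cooled:.2f}, immersion: {immersion:.2f}")
```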

Liquid cooling is ideally suited for data centers installing power-dense systems for machine learning (a subset of AI computing techniques), as well as for edge data centers, where traditional cooling may not be available and where self-contained immersion systems protect IT hardware from dust and debris, vendors and analysts say.

“What’s really driving it is new AI learning workloads that depend upon GPUs,” Jennifer Cooke, IDC’s research director for cloud and edge data center trends, told us. “IT is learning that high-performance compute and new AI workloads are really disrupting their data center environment with excess heat.”

The recent liquid cooling partnership announcements will help IT organizations new to liquid cooling overcome their fears about introducing liquid into a highly electrified environment, Cooke said. While she doesn’t expect the emerging technology to capture more than 10 percent of the market, more data center operators are exploring its viability.

“There is quite a bit of interest, and it will make sense for some data center and edge deployments that are trying to improve operational efficiency and tackle the challenges of HPC and GPU usage,” she said.  

Bizo, of 451, said he, too, has seen more data center operator interest and business activity around liquid cooling driven by a combination of business and technical factors.

On the business side, some large data center operators, such as hyperscale cloud providers, are running out of power capacity. Liquid cooling compresses the power footprint of a data center, allowing service providers to squeeze 20 to 30 percent more IT capacity out of the same utility substation, he said.
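
The arithmetic behind that range is straightforward: with the utility feed fixed, deliverable IT capacity scales inversely with PUE. A quick sketch, assuming a PUE of 1.5 for air cooling and 1.15 for liquid (both assumptions for illustration):

```python
# IT capacity available from a fixed substation feed = feed / PUE.
substation_kw = 10_000  # hypothetical utility feed

it_air = substation_kw / 1.5      # ~6,667 kW of IT load
it_liquid = substation_kw / 1.15  # ~8,696 kW of IT load

print(f"extra IT capacity from liquid cooling: {it_liquid / it_air - 1:.0%}")
# -> extra IT capacity from liquid cooling: 30%
```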

And on the technical side, as processors and servers get more powerful, so does their power consumption. Ten years ago, 90W to 100W server chips were standard, and two-socket servers dissipated about 200W of thermal power on average, he said. Server chips can now run north of 200W, and servers north of 400W.

“The power density of chips is nearing the limits of what’s practical with air,” Bizo said. “Liquid cooling helps expensive server processors to achieve higher levels of peak performance and sustain longer by virtue of much more efficient heat transfer than air cooling.”

However, liquid cooling remains very much a niche market with no dominant players, largely because a typical sale is a one-off custom project as opposed to repeat business in standardized products, Bizo said. “Liquid cooling as a category is similar to air cooling in the sense it comprises a wide array of engineering approaches, but it’s very different in that it integrates with IT hardware, which complicates procurement, operations and ultimately begs for the creation of standards, which hasn’t happened and creates fragmentation in the market,” he explained.

GRC, Iceotope, Asperitas, Submer, CoolIT, and ZutaCore are some of the key vendors in the data center liquid cooling market, and they all approach liquid cooling a little bit differently, Cooke said.

GRC’s Partnership with HPE

GRC sells liquid-immersion cooling solutions in which servers are fully submerged in a dielectric fluid coolant. As part of the new partnership, GRC will integrate HPE ProLiant and Apollo servers into its ICERaQ products by disabling the fans, said GRC’s Weynand. Running without fans, a typical customer will see a 10 to 30 percent reduction in compute power usage, according to him.
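
Some back-of-the-envelope math shows how removing fans produces savings in that range; the per-server wattage and fan shares below are assumptions used to bracket GRC’s figure, not measurements from the company.

```python
# Internal fans draw a meaningful share of an air-cooled server's
# input power; submerged servers do without them entirely.
server_watts = 500  # hypothetical two-socket server at load

for fan_share in (0.10, 0.30):  # assumed low/high fan power shares
    saved = server_watts * fan_share
    print(f"fan share {fan_share:.0%}: ~{saved:.0f} W saved per server")
```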

GRC said data center operators can deploy high-performance HPE servers combined with GRC’s cooling systems virtually anywhere – the edge, greenfield sites, and existing data centers – without having to alter the infrastructure. For example, GRC recently announced ICERaQ Micro, a smaller 24U system with an integrated cooling distribution unit that can handle more than 2,000 watts of heat load per U of rack space.
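
To put the ICERaQ Micro spec in perspective, 2,000-plus watts per U across a 24U enclosure works out to nearly 50 kW of heat load in a single unit, well beyond what a typical air-cooled rack handles; the arithmetic:

```python
# Heat load implied by GRC's stated ICERaQ Micro spec.
rack_units = 24
watts_per_u = 2000  # "more than 2,000 watts ... per U"

total_kw = rack_units * watts_per_u / 1000
print(f"heat load: {total_kw:.0f} kW in one 24U enclosure")  # 48 kW
```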

“You can put the ICERaQ Micro anywhere without changing anything about your existing data center infrastructure and essentially give it HPC capabilities,” Weynand said.

GRC’s partnership with HPE is important because GRC’s customers have long asked for it, Weynand said. In the past, when customers asked if GRC’s products worked with HPE servers, the company could only answer that everything was compatible. Now, GRC executives can tell customers they have an OEM partnership that integrates HPE servers with GRC’s cooling solutions. “They get the comfort and confidence of the HPE brand with the benefits of immersion cooling,” Weynand said.

GRC, which also has a longstanding OEM partnership with Supermicro, plans to announce more partnerships in the future, he added.

Bizo said the HPE partnership will allow GRC to reduce friction in procurement, installation, and support for customers. “The big question that remains is if such deals will be limited to high-performance computing applications or evolve into wider data center-scale installations,” he said.

Schneider’s Pact with Iceotope and Avnet

Schneider, which sells data center air cooling technology among many other things, had previously partnered with Iceotope in 2014, and together, they have sold some liquid cooling products to customers over the years, Bunger said. In the new partnership, they’ve added Avnet to the mix and are producing new chassis-level liquid cooling solutions together.

“Avnet is a big key here,” he said. “They have been in the industry a long time. As a large global company, they both do IT customization as well as full rack integration. They have these capabilities to take a technology like Iceotope to the broader market.”

Schneider sees demand for liquid cooling. When compared to air-cooled solutions, chassis-level immersion-cooled solutions can save 15 percent in capital expenditure and provide energy savings of at least 10 percent, the company said.

“HPC is probably the No. 1 use of liquid cooling. Hyperscalers are looking at it. But all sorts of companies are looking at AI applications, so we are starting to see the need for it beyond a handful of customers,” Bunger said. “And as you see higher-performance chips go mainstream, there will be a great application for liquid cooling for the masses.”

For Iceotope, the partnership shows they have the backing from two trusted providers in the industry, Cooke said. “This is huge for them.”

Bunger gave no details on the product Schneider is co-developing with Iceotope and Avnet. But Cooke recently wrote in a report that Schneider, Iceotope, and Avnet are developing a hybrid immersion cooling solution that “uses standard racks, housing standard hardware, that is partially immersed in fluid.” The solution can be deployed side-by-side with other equipment and does not require a retrofit of the data center, she wrote. A hybrid approach makes sense for more traditional enterprise data centers, she said.

“For them, a complete switch out of cooling infrastructure is not an option unless they are building a new data center,” Cooke told us. “For this reason the hybrid approach makes more sense. They can provide targeted cooling right where it’s needed and ensure that it doesn’t disrupt the rest of the environment.”

TAGS: Design