A data hall inside the eBay Phoenix data center uses water-cooled rear door cooling units to tame power densities of up to 35kW per rack.

eBay Shifts to Water-Cooled Doors to Tame High-Density Loads

Data center operators are seeking to pack more computing power into each square foot of space. As they add more servers and increase the power density of their racks, the economics of how best to cool these environments can shift.

A case in point: eBay recently reconfigured a high-density server room in its Phoenix data center, switching from in-row air cooling units to water-chilled rear door cooling units from Motivair. These units, which cool server exhaust heat as it exits the rear of the rack, have been in use for years (see our 2009 video overview for an early example) but tend to be more expensive than traditional air cooling.

As power density changes, so does the math driving investments in cooling, according to Dean Nelson, Vice President of Global Foundation Services at eBay. The data hall in Phoenix featured 16 rows of racks, with each rack housing 30kW to 35kW of servers – well beyond the 4kW to 8kW seen in most enterprise data center racks. Cooling that equipment required six in-row air handlers in each row, meaning eBay had to sacrifice six rack positions per row for cooling gear.

Switching to the rear-door units allowed eBay to recapture those six racks and fill them with servers, boosting compute capacity in the same footprint. The Motivair cooling doors are active units, with on-board fans to help move air through the rack and across the unit’s cooling coils. Some rear cooling doors are passive, saving energy by relying upon the fans on the servers for airflow. Nelson said that the high power densities in the eBay Phoenix installation required the active doors.
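To put those numbers together, here is a rough back-of-the-envelope sketch in Python of the capacity reclaimed across the hall. The per-rack figure is taken at the midpoint of the quoted 30kW to 35kW range, and the assumption that six positions were recovered in each of the 16 rows is ours, not the article's:

```python
# Back-of-the-envelope sketch of the capacity reclaimed in the Phoenix
# data hall. Assumes six rack positions were recovered in each of the
# 16 rows (the article describes six in-row air handlers per row) and
# uses the midpoint of the quoted 30-35kW per-rack density.

ROWS = 16
RACKS_RECLAIMED_PER_ROW = 6
KW_PER_RACK = 32.5  # midpoint of 30-35kW

racks = ROWS * RACKS_RECLAIMED_PER_ROW
capacity_kw = racks * KW_PER_RACK

print(f"Rack positions reclaimed: {racks}")                    # 96
print(f"Compute capacity added:  ~{capacity_kw / 1000:.1f} MW")  # ~3.1 MW
```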

Weighing Power Tradeoffs

The Motivair doors use high-efficiency EC (electronically commutated) fans, which means the increase in overall power usage (12.8kW, versus 11kW for the in-row units) was minimal. The system also uses less energy because it can run on warm water, operating at a water temperature of 60 degrees F rather than the 45 degrees seen in many chiller systems.
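The article doesn't say exactly what the 12.8kW and 11kW figures cover, but if we read them as per-row fan power, the penalty for the active doors looks small next to the compute they make room for. A minimal sketch under that assumption:

```python
# Hedged comparison: extra fan power of the active rear doors versus the
# compute capacity gained per row. Treating the 12.8kW / 11kW figures as
# per-row totals is our assumption; the article doesn't specify the scope.

REAR_DOOR_FAN_KW = 12.8
IN_ROW_FAN_KW = 11.0
RACKS_RECLAIMED_PER_ROW = 6
KW_PER_RACK = 32.5  # midpoint of 30-35kW

extra_fan_kw = REAR_DOOR_FAN_KW - IN_ROW_FAN_KW
gained_kw = RACKS_RECLAIMED_PER_ROW * KW_PER_RACK

print(f"Extra fan power:   {extra_fan_kw:.1f} kW")              # 1.8 kW
print(f"Compute gained:   ~{gained_kw:.0f} kW")                 # ~195 kW
print(f"Overhead vs gain:  {extra_fan_kw / gained_kw:.1%}")     # ~0.9%
```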

“We thought the rear doors were expensive, but when you added the six racks back into the row, it paid for itself,” said Nelson.
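Nelson doesn't give dollar figures, so any payback model is speculative, but the shape of the argument is easy to sketch: the doors pay off once the value of the reclaimed rack positions exceeds the cost premium of the doors. All cost inputs below are hypothetical placeholders, not eBay's numbers:

```python
# Purely illustrative payback sketch. door_premium_per_row and rack_value
# are hypothetical placeholders; the article gives no cost figures.

def row_breaks_even(door_premium_per_row: float,
                    rack_value: float,
                    racks_reclaimed_per_row: int = 6) -> bool:
    """True if the value of the reclaimed rack positions in a row covers
    the cost premium of fitting that row with rear-door units."""
    return racks_reclaimed_per_row * rack_value >= door_premium_per_row

# Example with made-up numbers: a $120k per-row door premium weighed
# against $30k of value per recovered rack position.
print(row_breaks_even(door_premium_per_row=120_000, rack_value=30_000))  # True
```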

The cost scenario also worked because eBay had pre-engineered the space to support water cooling. Nelson believes that major data centers are approaching the limits of conventional air cooling, and sees water cooling as critical to future advances in high-density equipment. eBay is using hot water cooling on the rooftop of its Phoenix data center, which has served as a testbed for new data center designs and technologies being implemented by the e-commerce giant.

In the rooftop installation, eBay deployed data center containers from Dell that were able to use a water loop as warm as 87 degrees F and still keep servers running within their safe operating range. To make the system work at the higher water temperature, it was designed with an unusually tight “Delta T” – the difference between the temperature of the air at the server inlet and the temperature as it exits the back of the rack.
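The physics behind that trade-off is the standard sensible-heat relation Q = ṁ·cp·ΔT: for a fixed rack load, a tighter air-side Delta T demands proportionally more airflow. A small illustrative sketch (the figures are ours, not from the article):

```python
# Sketch of the air-side heat-removal relationship behind the "Delta T"
# discussion: Q = m_dot * c_p * dT. A tighter (smaller) air-side Delta T
# means proportionally more airflow for the same rack load. Figures here
# are illustrative, not from the article.

AIR_DENSITY = 1.2  # kg/m^3, approximate
CP_AIR = 1005.0    # J/(kg*K), specific heat of air

def airflow_m3s(rack_kw: float, delta_t_c: float) -> float:
    """Volumetric airflow needed to carry rack_kw of heat at the given
    air-side temperature rise (server inlet to rack exhaust)."""
    mass_flow = rack_kw * 1000.0 / (CP_AIR * delta_t_c)  # kg/s
    return mass_flow / AIR_DENSITY

for dt in (20.0, 10.0):  # degrees C of air-side temperature rise
    print(f"30 kW rack at dT={dt:.0f}C: {airflow_m3s(30, dt):.2f} m^3/s")
# Halving the Delta T doubles the required airflow, one reason the
# high-density rows needed active (fanned) rear doors.
```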

The Phoenix facility also includes traditional indoor data halls, which is where the rear-door cooling units were implemented.

Here’s a look at the finished room:

[Photo: the finished data hall, with Motivair rear-door cooling units on the racks]

2 Comments

  1. Jeff

    It will be very interesting to see how (if) these pay for themselves in TCO over time. The big unknown is how long the assemblies will last before developing leaks, and what the right amount of maintenance is to keep them in shape. Scalable computing technologies are absolutely essential too, because if you need to shut off even one door for maintenance and start blowing 30kW into the room, you need to shut down that workload very fast so you don't end up with high inlet temps on the other side (unless you have room-based cooling as a backup, at extra expense).

  2. I had to do the math first since I am on the other side of the pond, but 60 degrees F is about 15 degrees C, and that is not hot or even warm water in my book. Even the 87F from eBay is far from warm or hot water cooling. The water loops in the DataTank immersion systems (the 3M stuff, not mineral oil) run at 50C and could be hotter. That is 122 degrees Fahrenheit. That's how you get a PUE of less than 1.01 even in Arizona.