A data hall inside eBay's Phoenix data center (Photo: eBay)

eBay Shifts to Water-Cooled Doors to Tame High-Density Loads

As power density changes, so does the math driving investments in cooling. An eBay project in Phoenix illustrates the tradeoffs in space, power and density as data center operators seek to optimize the cost of computing power.

Data center operators are seeking to pack more computing power into each square foot of space. As they add more servers and increase the power density of their racks, the economics of how best to cool these environments can change.

A case in point: eBay recently reconfigured a high-density server room in its Phoenix data center, switching from in-row air cooling units to water-chilled rear-door cooling units from Motivair. These units, which cool server exhaust heat as it exits the rear of the rack, have been in use for years (see our 2009 video overview for an early example) but tend to be more expensive than traditional air cooling.

As power density changes, so does the math driving investments in cooling, according to Dean Nelson, Vice President of Global Foundation Services at eBay. The data hall in Phoenix featured 16 rows of racks, each rack housing 30kW to 35kW of servers - well beyond the 4kW to 8kW seen in most enterprise data center racks. Cooling that equipment required six in-row air handlers in each row, meaning eBay had to sacrifice six rack positions per row for cooling gear.

Switching to the rear-door units allowed eBay to recapture those six racks and fill them with servers, boosting compute capacity in the same footprint. The Motivair cooling doors are active units, with on-board fans to help move air through the rack and across the unit's cooling coils. Some rear cooling doors are passive, saving energy by relying upon the fans on the servers for airflow. Nelson said that the high power densities in the eBay Phoenix installation required the active doors.
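
As a rough sketch of that space math (not eBay's internal model), the figures above are enough to estimate the recovered capacity in Python; assuming six positions regained in every one of the 16 rows, at the low end of the cited rack density:

    # Back-of-the-envelope capacity math from the figures in this article.
    # Assumption: six rack positions recovered in each of the 16 rows,
    # at the low end of the 30kW-35kW per-rack density cited above.
    ROWS = 16
    RECOVERED_PER_ROW = 6
    KW_PER_RACK = 30

    recovered_racks = ROWS * RECOVERED_PER_ROW          # 96 rack positions
    added_capacity_kw = recovered_racks * KW_PER_RACK   # 2,880 kW of compute

    print(f"Rack positions recovered: {recovered_racks}")
    print(f"Added compute capacity: {added_capacity_kw} kW")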

Weighing Power Tradeoffs

The Motivair doors use high-efficiency EC (electronically commutated) fans, which meant the increase in overall power usage (12.8kW, versus 11kW for the in-row units) was minimal. The system also uses less energy because it can run on warm water, working at a water temperature of 60 degrees rather than the 45 degrees seen in many chiller systems.

"We thought the rear doors were expensive, but when you added the six racks back into the row, it paid for itself," said Nelson.

The cost scenario also worked because eBay had pre-engineered the space to support water cooling. Nelson believes that major data centers are approaching the limits of conventional air cooling, and sees water cooling as critical to future advances in high-density equipment. eBay is using hot water cooling on the rooftop of its Phoenix data center, which has served as a testbed for new data center designs and technologies being implemented by the e-commerce giant.

In the rooftop installation, eBay deployed data center containers from Dell that were able to use a water loop as warm as 87 degrees F and still keep servers running within their safe operating range. To make the system work at the higher water temperature, it was designed with an unusually tight “Delta T” – the difference between the temperature of air at the server inlet and the temperature as it exits the back of the rack.
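
The standard airflow rule of thumb (not something stated in the article) shows why that Delta T mattered: the airflow a rack's fans must move is inversely proportional to Delta T, roughly CFM ≈ 3.16 × watts / Delta T (°F). A quick sketch with hypothetical Delta T values:

    # Rule-of-thumb airflow for a given heat load: CFM ~= 3.16 * W / dT(F).
    # The 30kW load echoes the rack densities above; the Delta T values
    # are hypothetical, chosen only to show the inverse relationship.
    def required_cfm(load_watts: float, delta_t_f: float) -> float:
        """Approximate airflow needed to carry a heat load at a given Delta T."""
        return 3.16 * load_watts / delta_t_f

    for delta_t_f in (15, 20, 25):
        print(f"Delta T {delta_t_f}F -> {required_cfm(30_000, delta_t_f):,.0f} CFM")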

The Phoenix facility also includes traditional indoor data halls, which is where the rear-door cooling units were installed.

Here's a look at the finished room:

[Image: The finished data hall, with Motivair rear-door cooling units on the racks]
