Facebook Revises its Data Center Cooling System


One of the huge chambers inside the “penthouse” cooling system used by Facebook in its new Oregon data center. Facebook has updated some elements of the system in its newest data centers. (Photo credit: Alan Brandt)

Facebook is updating its data center cooling system, swapping in evaporative media in place of misters. The changes to the social network’s multi-room “penthouse” cooling were based on lessons learned during the first year of operating its data center in Prineville, Oregon.

The changes are minor, but reflect Facebook’s constant refinement of its data center design in pursuit of improved efficiency and lower cost.

Facebook is adjusting a phase in its fresh air cooling system that manages humidity and heat removal. In the first phase of its Prineville data center, Facebook used a misting system, with small nozzles attached to water pipes that sprayed a fine mist across the air pathway, cooling the air and adding humidity.

In phase 2 of the Prineville project, Facebook has replaced the misters with an evaporative cooling system featuring adiabatic media made of fiberglass. Warm air passes through the media, which is dampened by a small flow of water introduced at the top. The air is cooled as it travels through the wet media.
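The physics behind this kind of direct evaporative cooling is commonly modeled with a simple effectiveness formula: the media cools incoming air toward its wet-bulb temperature, and the media’s effectiveness describes how much of that span is actually achieved. The sketch below illustrates the standard textbook model with hypothetical numbers — it is not based on Facebook’s actual media specs or setpoints.

```python
def supply_temp(dry_bulb, wet_bulb, effectiveness=0.9):
    """Leaving-air temperature for a direct evaporative cooler.

    Standard model: air is cooled toward its wet-bulb temperature;
    `effectiveness` (often quoted around 0.8-0.95 for rigid media)
    is the fraction of the dry-bulb/wet-bulb span that is achieved.
    All temperatures in degrees Celsius.
    """
    return dry_bulb - effectiveness * (dry_bulb - wet_bulb)

# Hypothetical dry-climate afternoon: 35 C dry-bulb, 18 C wet-bulb.
print(supply_temp(35.0, 18.0))  # about 19.7 C
```

Note that the drier the incoming air (the wider the gap between dry-bulb and wet-bulb), the more cooling the media can deliver — one reason evaporative designs suit high-desert sites like Prineville.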

Air Not Fully “Scrubbed”

The change followed an incident in which a plume of smoke from a fire spread across the area around the Facebook data center. Staff could smell the smoke inside the data center. That prompted Facebook’s data center team to examine other options for treating and “scrubbing” air as it makes its way into the data center.

“Our analysis of options focused on the amount of water usage,” said Jay Park, Director of Datacenter Engineering for Facebook. “The expectation was that the media would require a lot more water, but that turned out not to be true.

“(The media) won’t actually need more water, and eliminates the need for reverse osmosis,” said Park. “Any sediment is flushed out at the bottom of the media, just like blowdown from a cooling tower.”

Saving on Construction Cost

In its first data center in Prineville, Facebook had an entire room dedicated to reverse osmosis, a filtration process that removes small particles and impurities from water. The media serves a similar purpose without requiring that treatment (unlike misting), allowing Facebook to save construction costs by forgoing the reverse osmosis room.

Facebook’s data center design uses a two-tier structure, which separates the servers and cooling infrastructure and allows for maximum use of floor space for servers. Facebook opted to use the top half of the facility to manage the cooling supply, so that cool air enters the server room from overhead, taking advantage of the natural tendency for cold air to fall and hot air to rise – which eliminates the need to use air pressure to force cool air up through a raised floor.

The air enters the facility through an air grill in the second-floor “penthouse,” with louvers regulating the volume of air. The air passes through a mixing room, where cold outdoor air can be mixed with server exhaust heat to regulate the temperature. The cool air then passes through a series of air filters and the evaporative media, continues through another filter to absorb the moisture, and then through a fan wall that pushes the air through openings in the floor that serve as an air shaft leading into the server area.
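The mixing-room step described above is, at its core, a sensible-heat blend: the louvers set what fraction of warm server exhaust is recirculated into the cold outdoor stream to hit a target supply temperature. A minimal sketch of that calculation, assuming equal air densities and specific heats and using made-up temperatures rather than Facebook’s actual setpoints:

```python
def exhaust_fraction(t_outdoor, t_exhaust, t_target):
    """Fraction of server exhaust air to blend with outdoor air so the
    mixed stream reaches t_target.

    Simple sensible-heat mixing model: assumes both streams have equal
    density and specific heat, so the mixed temperature is just the
    flow-weighted average. Temperatures in degrees Celsius.
    """
    lo, hi = sorted((t_outdoor, t_exhaust))
    if not lo <= t_target <= hi:
        raise ValueError("target must lie between the two stream temperatures")
    return (t_target - t_outdoor) / (t_exhaust - t_outdoor)

# Hypothetical winter day: blend 5 C outdoor air with 35 C exhaust
# to produce 20 C mixed air entering the filters.
print(exhaust_fraction(5.0, 35.0, 20.0))  # 0.5
```

On a cold day a larger exhaust fraction is needed; on a mild day the dampers can pass mostly outdoor air and let the evaporative media handle the rest.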

About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.


2 Comments

  1. steven schulte

    I'm assuming this is a Tier IV Center. Can you describe the overall M&E redundancy? Thanks, Steve