How Facebook Tames High-Density Racks
The spectacular growth of Facebook’s audience has translated into growth in its data centers. Mostly it’s meant more servers – lots and lots of servers. As the social network’s audience has shot past 500 million, the company has sought to make the most of every inch of its data center space, filling every available server tray in its racks.
Those jam-packed racks translate into high-density heat loads that can be difficult to manage and costly to cool. After a period of breakneck growth, Facebook is now focusing on improving the energy efficiency of its data centers, and those high-density racks represent some of the biggest opportunities.
Facebook recently retrofitted one of its high-density data centers in Silicon Valley, implementing efficiency improvements that will reduce its annual energy usage by 2.8 million kilowatt hours. The company says it reduced its carbon output by 1,088 metric tons, the equivalent greenhouse gas emissions of 395 cars or 251 homes.
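A quick back-of-the-envelope check shows how the two headline numbers relate. The grid emissions factor below is an assumption inferred from the article's own figures, not an official value:

```python
# Back-of-the-envelope check of the reported carbon savings.
# The emissions factor is an assumption implied by the article's numbers
# (1,088 t / 2.8 GWh), not an official grid figure.
annual_kwh_saved = 2_800_000       # reported annual energy savings, kWh
emissions_factor = 0.3886          # assumed kg CO2e per kWh

tons_co2 = annual_kwh_saved * emissions_factor / 1000  # kg -> metric tons
print(f"{tons_co2:.0f} metric tons CO2e")  # roughly 1,088 metric tons
```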
Smaller Facility, Larger Gains
In its latest case study, Facebook retooled a 14,560 square foot facility in Silicon Valley, which was equipped to use fresh air to cool the servers (air economizers). Although the facility was smaller and newer than the 56,000 square foot facility featured in an earlier efficiency retrofit, Facebook actually realized greater total energy savings.
The high-density site was designed with a cold aisle that was eight feet wide, a broader configuration than Facebook normally uses, to provide additional cooling capacity. All of the floor tiles in the cold aisle were perforated to allow a higher volume of air into the space.
When the company took a closer look at the airflow in the facility, it found that the design delivered significantly more air than was needed to cool the space, even with 45U racks completely filled with servers. As with the previous retrofit, Facebook took a systematic approach that included airflow analysis, installing cold aisle containment, reducing fan energy in servers, and raising the temperature of supply air.
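An airflow analysis like this typically starts from the standard sensible-heat relationship, CFM ≈ 3.412 × watts / (1.08 × ΔT°F). The sketch below uses that general formula with hypothetical rack power and temperature-rise values; these are illustration numbers, not figures from the article:

```python
# Standard sensible-heat airflow estimate (general rule of thumb,
# not Facebook's specific analysis).
def required_cfm(watts: float, delta_t_f: float) -> float:
    """Airflow (CFM) needed to remove a sensible heat load at a given
    server inlet-to-outlet temperature rise (degrees Fahrenheit)."""
    return 3.412 * watts / (1.08 * delta_t_f)

# Hypothetical fully loaded rack: 10 kW load, 20 F temperature rise.
rack_cfm = required_cfm(watts=10_000, delta_t_f=20)
print(f"{rack_cfm:.0f} CFM per rack")
```

Comparing a figure like this against the measured delivery of the perforated tiles is what reveals oversupply of the kind Facebook found in its eight-foot cold aisle.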
Confronting the Cold Aisle
The first step was dealing with the problems in the cold aisle design. After tracking the air pressure in the aisle and under the raised floor, Facebook installed a cold aisle containment system, with a plastic roof overhead and sliding doors at the end of each aisle of racks. Director of Datacenter Engineering Jay Park and his team then swapped out tiles, replacing two rows of perforated floor tiles with solid ones. Facebook also took steps to seal every area where cold air could escape, using blanking plates, skirts for PDUs (power distribution units) and sealing cut-outs for cabling.
The next step was targeting fan energy. The containment system allowed greater control over the airflow, which let Facebook lower the speed of its server fans. That effort began with an analysis of fan performance.
“We monitored our Facebook application load for a long time under a variety of load conditions,” said Park. “We then invited our server manufacturers to come in and they provided us with an optimized fan speed algorithm.”
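The payoff from an optimized fan speed algorithm comes from the fan affinity laws: fan power scales roughly with the cube of fan speed, so even modest speed reductions yield outsized energy savings. This is a general principle, not Facebook's vendor-supplied algorithm; the 80% figure below is a hypothetical example:

```python
# Fan affinity law: power drawn scales roughly with the cube of speed.
# This illustrates the general principle, not Facebook's specific algorithm.
def fan_power_fraction(speed_fraction: float) -> float:
    """Fraction of full fan power drawn at a given fraction of full speed."""
    return speed_fraction ** 3

# Hypothetical example: slowing fans to 80% of full speed.
savings = 1 - fan_power_fraction(0.8)
print(f"~{savings:.0%} fan energy saved")  # roughly 49%
```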
Vendor Relationships an Enabler
One of Facebook’s motivations in releasing its retrofit case studies is to contribute to the industry knowledge base of best practices. While many of Facebook’s strategies – like cold aisle containment, replacing floor tiles and adjusting inlet temperatures – can be replicated by smaller companies, the server fan speed refinements are easier for large companies like Facebook, which purchase servers in volume.
But Park says server vendors need to hear feedback from smaller companies as well. “You can start a dialogue with your server suppliers and see if they are willing to do it,” said Park. “If more people ask, it will become common practice. Down the road, I believe this will happen.”
Once the airflow and fan speed issues were addressed, Facebook was able to raise the temperature of the supply air for the cold aisle from 51 to 67 degrees Fahrenheit. This in turn allowed Facebook to expand the window in which it can use fresh air to cool servers from 2,200 hours (about three months a year) to 6,700 hours (more than nine months).
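The article's economizer hours map cleanly onto the 8,760-hour year, which confirms the "three months" and "nine months" characterizations:

```python
# Convert the article's free-cooling (economizer) hours into months per year.
HOURS_PER_YEAR = 24 * 365  # 8,760

for label, hours in [("before retrofit", 2_200), ("after retrofit", 6_700)]:
    months = hours / HOURS_PER_YEAR * 12
    print(f"{label}: {hours} h = {months:.1f} months of free cooling")
```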
How was fire protection handled? There are no sprinkler heads visible. This has been a challenge in cold aisle containment, often solved by hinged ceilings which drop via a fusible link, or installation of additional heads within the containment. This can add considerable expense to these projects.
Hi David. With the aisle containment system, Facebook used fusible links that will allow the panels to swing open in the event the sprinklers are required.
J. Taylor, posted November 4th, 2010
This enclosure doesn’t appear to incorporate any fire detection or protection in the enclosed space. Was this construction permitted and inspected by the local AHJ?
As a manufacturer of CAC/HAC systems, we developed a fusible support four years ago, at very low cost, to support our ceiling panels when water-based fire suppression is installed.
The panels are split along the aisle length.
Each panel rests on the fusible support and is allowed to drop completely into the aisle. It is not hinged, as a hinged panel could in certain circumstances come down and block the server inlet. The panel is very lightweight, transparent, and removable.
The release temperature can be adjusted to suit the activation of the water sprinkler, generally 10 degrees C below it.
It's great to see an endorsement from Facebook of the major benefits of cold aisle containment, particularly in a retrofit application.
I am happy to share my knowledge of containment with anyone, via this site or directly at email@example.com