How Facebook Tames High-Density Racks

A look at one of the containment systems installed by Facebook in one of its Silicon Valley data centers.

The spectacular growth of Facebook's audience has translated into growth in its data centers. Mostly, that has meant more servers: lots and lots of servers. As the social network's audience has shot past 500 million users, the company has sought to make the most of every inch of its data center space, filling every available server tray in its racks.

Those jam-packed racks translate into high-density heat loads that can be difficult to manage and costly to cool. After a period of breakneck growth, Facebook is now focusing on improving the energy efficiency of its data centers, and those high-density racks represent some of the biggest opportunities.

Facebook recently retrofitted one of its high-density data centers in Silicon Valley, implementing efficiency improvements that will reduce its annual energy usage by 2.8 million kilowatt hours. The company says it reduced its carbon output by 1,088 metric tons, equivalent to the greenhouse gas emissions of 395 cars or 251 homes.
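For readers who want to check the math, the reported carbon figure is consistent with a grid emissions factor of roughly 0.39 kg of CO2 per kilowatt hour. The short Python sketch below back-calculates the savings; the emissions factor is an assumption, since the article does not say which factor Facebook used.

# Back-of-the-envelope check of the reported carbon savings.
# The emissions factor is an assumed grid average, not a figure from Facebook.
annual_kwh_saved = 2_800_000
emissions_factor_kg_per_kwh = 0.389  # assumption, roughly a U.S. grid average

co2_metric_tons = annual_kwh_saved * emissions_factor_kg_per_kwh / 1000
print(f"Estimated CO2 avoided: {co2_metric_tons:,.0f} metric tons")
# Prints roughly 1,089 metric tons, in line with the 1,088 metric tons reported.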

Smaller Facility, Larger Gains

In its latest case study, Facebook retooled a 14,560 square foot facility in Silicon Valley that was equipped with air economizers, which use fresh air to cool the servers. Although the facility was smaller and newer than the 56,000 square foot facility featured in an earlier efficiency retrofit, Facebook actually realized greater total energy savings.

The high-density site was designed with a cold aisle that was eight feet wide, a broader configuration than Facebook normally uses, to provide additional cooling capacity. All of the floor tiles in the cold aisle were perforated to allow a higher volume of air into the space.

When the company took a closer look at the airflow in the facility, it found that this configuration delivered significantly more air than was needed to cool the space, even with 45U racks completely filled with servers. As with the previous retrofit, Facebook took a systematic approach that included airflow analysis, installing cold aisle containment, reducing fan energy in servers, and raising the temperature of supply air.
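To see why fully perforated tiles in a wide cold aisle can oversupply air, it helps to compare delivered airflow against what a loaded rack actually needs. The sketch below uses the common rule of thumb CFM ≈ 3.16 × watts / ΔT(°F); the rack power, temperature rise and tile figures are illustrative assumptions, not numbers from Facebook's study.

# Rough comparison of airflow demand vs. supply for one rack.
# All inputs are illustrative assumptions.
rack_power_watts = 10_000   # assumed load for a fully populated 45U rack
delta_t_f = 25              # assumed air temperature rise across the servers

required_cfm = 3.16 * rack_power_watts / delta_t_f  # rule-of-thumb sensible-heat formula
print(f"Airflow needed per rack:   {required_cfm:,.0f} CFM")

tiles_per_rack = 2          # assumed perforated tiles in front of each rack
cfm_per_tile = 900          # assumed delivery of a typical perforated tile
supplied_cfm = tiles_per_rack * cfm_per_tile
print(f"Airflow supplied per rack: {supplied_cfm:,.0f} CFM")
# When supply comfortably exceeds demand, some perforated tiles can be swapped for solid ones.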

Confronting the Cold Aisle
The first step was dealing with the problems in the cold aisle design. After tracking the air pressure in the aisle and under the raised floor, Facebook installed a cold aisle containment system, with a plastic roof overhead and sliding doors at the end of each aisle of racks. Director of Datacenter Engineering Jay Park and his team then swapped out tiles, replacing two rows of perforated floor tiles with solid ones. Facebook also took steps to seal every area where cold air could escape, using blanking plates and skirts for PDUs (power distribution units), and sealing cable cut-outs.

The next step was targeting fan energy. The containment system gave Facebook greater control over airflow, which in turn allowed it to lower the speed of its server fans. The first task was analyzing fan performance.

"We monitored our Facebook application load for a long time under a variety of load conditions," said Park. "We then invited our server manufacturers to come in and they provided us with an optimized fan speed algorithm."

Vendor Relationships an Enabler
One of Facebook's motivations in releasing its retrofit case studies is to contribute to the industry knowledge base of best practices. While many of Facebook's strategies - like cold aisle containment, replacing floor tiles and adjusting inlet temperatures - can be replicated by smaller companies, the server fan speed refinements are easier for large companies like Facebook, which purchase servers in volume.

But Park says server vendors need to hear feedback from smaller companies as well. "You can start a dialogue with your server suppliers and see if they are willing to do it," said Park. "If more people ask, it will become common practice. Down the road, I believe this will happen."

Once the airflow and fan speed issues were addressed, Facebook was able to raise the temperature of the supply air for the cold aisle from 51 to 67 degrees Fahrenheit. This in turn expanded the window in which it can use fresh air to cool servers from 2,200 hours a year (about three months) to 6,700 hours (more than nine months).
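The link between a higher supply-air setpoint and more economizer hours is straightforward: the economizer can run whenever outdoor air is at or below the setpoint. The sketch below counts those hours from hourly temperature readings; the sample data is a placeholder, and a real estimate would use the site's full 8,760-hour climate record.

# Estimate annual air-economizer hours from a supply-air setpoint.
# hourly_outdoor_temps_f is a placeholder sample, not real climate data.
hourly_outdoor_temps_f = [52, 58, 61, 66, 70, 74, 68, 63, 59, 55]

def economizer_hours(temps_f, supply_setpoint_f):
    """Count hours when outdoor air is cool enough to use directly."""
    return sum(1 for t in temps_f if t <= supply_setpoint_f)

print("Economizer hours at a 51F setpoint:", economizer_hours(hourly_outdoor_temps_f, 51))
print("Economizer hours at a 67F setpoint:", economizer_hours(hourly_outdoor_temps_f, 67))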

Facebook discusses the retrofit project in additional detail on its Facebook Engineering blog, which is also featured on its Facebook Green page.
