Facebook Saves Big By Retooling its Cooling

These before-and-after CFD modeling diagrams show the difference in airflow management once Facebook implemented cold aisle containment.

Facebook’s efforts to make its data centers more energy efficient aren’t limited to its new facility in Oregon. The social network recently retooled the cooling system in one of its existing data centers in Santa Clara, Calif., slashing the facility’s annual energy bill by $229,000 and earning a $294,761 rebate from Silicon Valley Power.

The company shared details of its efficiency project today at the Data Center Efficiency Summit 2010 hosted by the Silicon Valley Leadership Group (SVLG) on the Brocade campus in Santa Clara. Facebook pursued a multi-faceted approach in its retrofit of a 56,000 square foot data center, in which it is the only tenant and holds a long-term lease. The facility did not offer the option of using economizers, which allow data center operators to save money by using fresh air for cooling in place of energy-intensive chillers.

Series of Best Practices
With economization off the table, Facebook implemented a series of best practices to dramatically reduce its energy use, including thermal modeling, installing cold aisle containment, reducing fan energy in servers, and raising the temperature of both supply air and chilled water. At the SVLG Summit, Facebook’s refinements were described by Director of Datacenter Engineering Jay Park and engineers Veerendra Mulay and Daniel Lee.

Here’s a look at the components of the project:

CFD Modeling: The first step in Facebook’s efficiency project was creating a thermal model of the data center using computational fluid dynamics (CFD) software. The software builds a 3D model of how cold air moves through the facility, identifying potential “hot spots” as well as areas that receive more cold air than needed, wasting cooling capacity and energy. The CFD study revealed that some of the cold air entering the room through the raised floor was bypassing servers, cooling the room rather than the IT equipment, while warm exhaust air from the hot aisle was mixing with cold air in key areas.

Cold Aisle Containment: Facebook took several steps to address the airflow problems identified in the CFD modeling. It began by installing a cold aisle containment system to isolate the hot and cold air in the data center. Roof panels were installed over the cold aisles, with fusible links to allow for adequate overhead fire suppression. Doors at each end of the aisle allowed access for tech staff. Facebook also took steps to seal every area where cold air could escape, using blanking plates, skirts for PDUs (power distribution units) and sealing cut-outs for cabling.

Reducing the Number of CRAH Units: Once the cold aisle was encapsulated, less airflow was required to cool the equipment. This allowed Facebook to turn off 15 computer room air handlers (CRAHs), saving the energy required to operate those excess units.
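As a rough illustration of what shutting down excess CRAH units is worth: the article does not give per-unit figures, so the fan power draw, duty cycle, and utility rate below are hypothetical assumptions, not Facebook’s numbers.

```python
# Hypothetical back-of-the-envelope estimate of CRAH shutdown savings.
# Only UNITS_TURNED_OFF comes from the article; the rest are assumptions.
CRAH_FAN_KW = 7.5        # assumed fan motor draw per CRAH unit, in kW
UNITS_TURNED_OFF = 15    # from the article
HOURS_PER_YEAR = 8760    # assumes continuous, year-round operation
RATE_PER_KWH = 0.10      # assumed utility rate, in $/kWh

annual_kwh = CRAH_FAN_KW * UNITS_TURNED_OFF * HOURS_PER_YEAR
annual_savings = annual_kwh * RATE_PER_KWH
print(f"{annual_kwh:,.0f} kWh/yr -> ${annual_savings:,.0f}/yr")
# → 985,500 kWh/yr -> $98,550/yr
```

Even with these made-up per-unit figures, the CRAH shutdown alone would plausibly account for a large share of the reported $229,000 annual saving.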

Reducing Server Fan Energy: Further savings were gained through adjustments to the server fans. “These fans are PWM fans – pulse width modulation,” Park explained. “They’re typically pre-set by the manufacturer to run at higher speeds. You modulate the fans to a lower speed and you bring less air through the servers. You can set this through software. Intel can tell you how to do this.”
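The payoff from slowing PWM fans follows from the fan affinity laws, under which fan power scales roughly with the cube of fan speed. A minimal sketch (the 70% speed used here is illustrative, not one of Facebook’s actual settings):

```python
def fan_power_fraction(new_speed_pct, base_speed_pct=100.0):
    """Fan affinity law: power scales with roughly the cube of the speed ratio."""
    return (new_speed_pct / base_speed_pct) ** 3

# Slowing a fan from 100% to 70% of rated speed:
print(f"{fan_power_fraction(70):.2f}")  # → 0.34, i.e. roughly a 66% fan-energy cut
```

This cubic relationship is why even modest speed reductions produce outsized fan-energy savings, provided airflow remains sufficient to cool the servers.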

Raising the Air Temperature: Facebook next concentrated on raising the rack inlet temperature as high as it could without triggering additional fan activity. Optimizing the cold aisle and server fan speed allowed Facebook to raise the temperature at the CRAH return from 72 degrees F to 81 degrees F.

Raising the Water Temperature: The higher air temperature then allowed Facebook to raise the temperature of the supply water coming from its chillers, requiring less energy for refrigeration. The temperature of chiller water supply was raised by 8 degrees, from 44 degrees F to 52 degrees F.
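A common industry rule of thumb (an assumption here, not a figure from the article) is that chiller energy drops by roughly 1.5–2% for every 1°F increase in chilled water supply temperature, so the 8°F reset implies on the order of a 12–16% reduction in chiller energy:

```python
# Rule-of-thumb estimate; the percent-per-degree range is an industry
# heuristic, not a number reported in the article.
SETPOINT_INCREASE_F = 52 - 44          # 8 F reset, from the article
SAVINGS_PER_DEGREE = (0.015, 0.02)     # assumed 1.5%-2% per degree F

low, high = (SETPOINT_INCREASE_F * s for s in SAVINGS_PER_DEGREE)
print(f"Estimated chiller energy reduction: {low:.0%} to {high:.0%}")
# → Estimated chiller energy reduction: 12% to 16%
```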

Facebook’s systematic application of best practices illustrated how energy efficiency projects can create a “waterfall effect” of cascading benefits. The company’s approach to the project made an impression upon its landlord, Digital Realty Trust.

“Facebook has distinguished itself as one of the leading efficiency teams among our global portfolio of thousands of customers,” said Jim Smith, CTO of Digital Realty Trust. “In addition, Facebook has been open and collaborative in their approach, enabling us to implement some of their strategies with our other customers. Thus, we have the potential to multiply the energy savings and environmental protection across the infrastructure of many other companies.”


About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.

Add Your Comments



  1. Love the systematic approach. First you set the standard for social media. Now you do data center cooling the smart way. What next? Bring it on! Way to go!

  2. Adam

    Data centres all over the world are doing this, and just because it’s Facebook doing it, it makes the news...

  3. Steve

    Now when one CRAC goes down the temp runs away quickly and thermal breakdown begins. The room heats up so quickly it takes forever to cool it back down. Maybe virtualization to bring the kW down at the server would stop the heat build-up from the start. But I guess when Facebook gets hot it will only have to deal with someone not getting the gossip.

  4. Joe

    I would like to know what material they used for the containment. Was it the meat locker type or solid doors?