Facebook Servers Get Hotter, But Run Fine in the South


An aerial view of the new Facebook data center in Forest City, North Carolina. (Photo: Facebook).

Facebook has been able to cool its servers through the North Carolina summer using only fresh air and no mechanical refrigeration, the company said today, even on days when the temperature reached 102 degrees.

The key to the cooling breakthrough was raising the temperature inside the Facebook data center, allowing servers to run at an inlet temperature of 85 degrees F. That’s about five degrees warmer than Facebook’s data center in Prineville, Oregon, where the company pioneered a design that relies entirely on outside air to cool its servers.

The new data, which will be discussed by Facebook engineers today at industry conferences in London and Phoenix, adds to the growing body of evidence that servers can run in warmer environments. It also offers fresh support for the use of fresh air cooling in more areas, including warmer climates where data centers normally rely on refrigeration.

Many data centers feel like meat lockers, as servers are kept in cool environments to offset the heat thrown off by components inside the chassis. Typical data center temperatures fall between 68 and 74 degrees. In recent years, rising power bills have prompted data center managers to try to reduce the amount of power used by cooling systems. Raising the temperature allows a data center to run with less cooling, or even none at all – which in a large facility can save millions of dollars in energy bills.

Bringing Free Cooling Further South

When Facebook built its second data center in Forest City, North Carolina, the company wasn’t sure whether the fresh air cooling strategy it relied upon in Prineville would work during summer in the South, where both the heat and humidity are higher than in Oregon. It turns out that Facebook’s servers did fine in a warmer, more humid environment, which allowed Facebook to stick with fresh air (free cooling) instead of the direct expansion (DX) cooling systems it installed on a “just in case” basis.

“The design we use in Prineville works great in central Oregon – a high desert plain with hot, dry summers and cool evenings and winters,” wrote Facebook Mechanical Engineer Dan Lee in a blog post. “These are ideal conditions for using evaporative cooling and humidification systems instead of the standard chillers you see in pretty much every other data center.”

“To try to make the free cooling system work in Forest City, we expanded the server environmental conditions on the high end,” Lee continued. “We set the upper end of the server inlet temperature range at 85°F, instead of at 80°F. And because of the higher humidity in North Carolina, we expanded the relative humidity (RH) maximum from 65% RH to 90% RH.”
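The change Lee describes amounts to widening the acceptable window for server inlet air. A minimal sketch of that idea (this is an illustration, not Facebook’s actual control logic; the function name and defaults are assumptions) shows how raising the limits from 80°F / 65% RH to 85°F / 90% RH widens the range of weather that qualifies for free cooling:

```python
# Sketch of an expanded environmental envelope for free cooling.
# Limits come from the figures in Lee's post; the decision logic
# itself is a simplification for illustration.

def within_envelope(inlet_temp_f: float, rh_percent: float,
                    max_temp_f: float = 85.0, max_rh: float = 90.0) -> bool:
    """Return True if outside air falls inside the allowed envelope,
    meaning it can be used directly to cool servers (free cooling)."""
    return inlet_temp_f <= max_temp_f and rh_percent <= max_rh

# A humid 82°F afternoon fails the original Prineville-style limits
# (80°F / 65% RH) but passes the expanded Forest City limits:
print(within_envelope(82, 75, max_temp_f=80.0, max_rh=65.0))  # False
print(within_envelope(82, 75))                                # True
```

Every hour that falls inside the envelope is an hour the backup DX coils can stay off.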

Unusual Heat Tests Cooling System

The weather ensured that it was a tough test of Facebook’s system. July 2012 was the second hottest month on record in North Carolina, with the temperature hitting 102 degrees on July 1. Surprisingly, the Power Usage Effectiveness (PUE) for Forest City clocked in at 1.07 this summer, versus 1.09 in Prineville during roughly the same period.
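PUE is the ratio of total facility power to the power delivered to IT equipment, so a PUE of 1.07 means only about 7 cents of overhead (cooling, power distribution, lighting) for every dollar of power spent on servers. The kilowatt figures below are hypothetical, chosen only to make the ratio concrete:

```python
# PUE = total facility power / IT equipment power.
# Example loads are illustrative, not Facebook's actual numbers.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: 1.0 is the theoretical ideal."""
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,070 kW in total to support 1,000 kW of IT load:
print(round(pue(1070, 1000), 2))  # 1.07
```

Conventional data centers of that era commonly ran at PUEs near 2.0, spending roughly as much on overhead as on computing.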

“Despite the record-breaking heat, we didn’t run the DX coils at all (in North Carolina) this past summer,” Lee wrote. “If you look at the trend data, it shows that when the record hot days occurred, relative humidity was low, allowing the misting system to provide all the needed cooling.”

Facebook’s cooling system uses the upper floor of the building as a large cooling plenum with multiple chambers for cooling, filtering and directing the fresh air used to cool the data center. The air passes through a series of air filters and a misting chamber where a fine spray is applied to further control the temperature and humidity.
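The misting stage works by evaporative cooling, which explains Lee’s point about the record hot days: when humidity is low, evaporating water can pull the air temperature down a long way. A common simplified model (an assumption here, not Facebook’s published math) is that misting moves the dry-bulb temperature toward, but never below, the wet-bulb temperature:

```python
# Simplified direct evaporative cooling model: supply temperature
# approaches the wet-bulb temperature, scaled by an effectiveness
# factor. The 0.9 effectiveness is an assumed, typical value.

def evaporative_supply_temp(dry_bulb_f: float, wet_bulb_f: float,
                            effectiveness: float = 0.9) -> float:
    """Approximate supply air temperature after a misting stage."""
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

# On a dry 102°F day (wet bulb assumed near 70°F), misting alone can
# deliver air well under the 85°F server inlet limit:
print(evaporative_supply_temp(102, 70))
```

On a humid day the wet-bulb temperature sits much closer to the dry-bulb temperature, so the same misting system removes far less heat – which is why low humidity on the hottest days worked in Facebook’s favor.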

The Facebook team will discuss its data center designs and their performance at the Open Compute Summit on January 16-17 in Santa Clara, Calif.

About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.
