The Facebook Data Center FAQ (Page 4)


We continue with page 4 of the Facebook Data Center FAQ (or Everything You Ever Wanted to Know About Facebook’s Data Centers).

How Energy Efficient Are Facebook’s Data Centers?

[Image: An overhead view of the Facebook server area, showing the cabling distribution.]

Facebook’s Prineville data center, the first data center the company designed in-house, operates at a facility-wide Power Usage Effectiveness (PUE) of 1.06 to 1.08, and the company said its North Carolina data center would have a similar efficiency profile. The PUE metric compares a facility’s total power usage to the amount of power used by the IT equipment, revealing how much is lost in distribution, conversion, and cooling. A PUE of 2.0 indicates that the IT equipment receives about 50 percent of the power delivered to the building.
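
To make that arithmetic concrete, here is a minimal sketch of how PUE translates into the share of power that actually reaches the servers. The helper function is illustrative, not anything from Facebook; the only inputs are the PUE figures cited above.

    def pue(total_facility_kw, it_equipment_kw):
        """Power Usage Effectiveness: total facility power / IT equipment power."""
        return total_facility_kw / it_equipment_kw

    # The share of power reaching IT equipment is simply 1 / PUE.
    for p in (2.0, 1.07):  # 1.07 is the midpoint of Prineville's 1.06-1.08
        print(f"PUE {p}: {1 / p:.1%} of facility power reaches the IT equipment")
    # PUE 2.0:  50.0%
    # PUE 1.07: 93.5%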

The cool climate in Prineville allows Facebook to operate without chillers, which refrigerate water for data center cooling systems and require a large amount of electricity to run. With the growing focus on power costs, many operators are now designing chiller-less data centers that use cool outside air instead of mechanical refrigeration. On hot days, the Prineville data center is designed to use evaporative cooling instead of a chiller system.
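
As a rough illustration of why evaporative cooling works in a dry climate, here is a minimal sketch using the standard wet-bulb effectiveness formula for direct evaporative cooling. The effectiveness value and the example temperatures are assumptions for illustration, not Prineville measurements.

    def evaporative_supply_temp(dry_bulb_f, wet_bulb_f, effectiveness=0.85):
        """Direct evaporative cooling: supply air approaches the wet-bulb
        temperature. 'effectiveness' is the fraction of the wet-bulb
        depression achieved (0.85 is an assumed, typical value)."""
        return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

    # Hypothetical hot, dry high-desert day: 100°F dry bulb, 65°F wet bulb.
    print(evaporative_supply_temp(100.0, 65.0))  # ~70.2°F supply air

The drier the air, the larger the gap between dry-bulb and wet-bulb temperature, and the more cooling a misting system can deliver without any refrigeration.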

In its cooling design, Facebook adopted the two-tier structure seen in several recent designs, which separates the servers from the cooling infrastructure and allows the floor space to be used entirely for servers. Facebook uses the top half of the facility to manage the cooling supply, so cool air enters the server room from overhead, taking advantage of the natural tendency of cold air to fall and hot air to rise. This eliminates the need to use air pressure to force cool air up through a raised floor.

Oregon’s cool, dry climate was a key factor in Facebook’s decision to locate its facility in Prineville. “It’s an ideal location for evaporative cooling,” said Jay Park, Facebook’s Director of Datacenter Engineering. The temperature in Prineville has not exceeded 105 degrees in the last 50 years, he noted.

The air enters the facility through a grille in the second-floor “penthouse,” with louvers regulating the volume of air. It passes through a mixing room, where cold winter air can be blended with server exhaust heat to regulate the temperature. The air then passes through a series of filters and a misting chamber, where a fine spray further controls temperature and humidity. It continues through another filter that absorbs the mist, and then through a fan wall that pushes it through openings in the floor, which serve as an air shaft leading into the server area.

“The beauty of this system is that we don’t have any ductwork,” said Park. “The air goes straight down to the data hall and pressurizes the entire data center.”
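
The mixing step lends itself to a simple energy balance. Below is a minimal sketch assuming a single-node mixing model; the function, the temperatures, and the target setpoint are hypothetical, not Facebook’s actual controls.

    def outside_air_fraction(t_outside_f, t_exhaust_f, t_target_f):
        """Mixing-box energy balance: the fraction of outside air that,
        blended with warm server exhaust, yields the target supply temp.
        T_mix = f*T_out + (1-f)*T_exh  =>  f = (T_exh - T_tgt) / (T_exh - T_out)
        """
        f = (t_exhaust_f - t_target_f) / (t_exhaust_f - t_outside_f)
        return min(max(f, 0.0), 1.0)  # clamp to the physically possible range

    # Hypothetical winter numbers: 20°F outside, 105°F exhaust, 65°F target.
    print(outside_air_fraction(20.0, 105.0, 65.0))  # ~0.47, roughly half outside air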

Testing at the Prineville facility has laid the groundwork for adopting fresh-air cooling extensively in North Carolina, even though the climate there is warmer than Oregon’s. “Comparing our first phase of Prineville with how we plan to operate Forest City, we’ve raised the inlet temperature for each server from 80°F to 85°, 65% relative humidity to 90%, and a 25°F Delta T to 35°,” wrote Yael Maguire on the Facebook blog. “This will further reduce our environmental impact and allow us to have 45% less air handling hardware than we have in Prineville.”

The Delta T is the difference between the temperature in the cold aisle and the hot aisle, meaning the hot aisles in Facebook’s new data center space will reach about 120°F (the 85°F inlet temperature plus the 35°F Delta T), which is not a pleasant work environment for data center admins. Mindful of this, Facebook designed its Open Compute servers with cabling on the front of the server, allowing them to be maintained from the cold aisle rather than the hot aisle. The contained hot aisles in Prineville are unlit, as the area was not designed to be staffed.
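
A back-of-envelope check shows why a bigger Delta T means less air handling hardware. This is a sketch of the scaling, not Facebook’s sizing math: for a fixed heat load Q = m_dot * c_p * ΔT, the required airflow scales as 1/ΔT.

    inlet_f = 85.0
    delta_t_old, delta_t_new = 25.0, 35.0

    print(inlet_f + delta_t_new)  # 120.0°F hot aisle, the figure cited above
    print(f"{1 - delta_t_old / delta_t_new:.0%} less airflow needed")  # ~29%
    # The quoted 45% reduction in air handling hardware also reflects the
    # higher inlet temperature and humidity limits, not Delta T alone.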

What’s Different About Facebook’s Data Center in Sweden?

[Image: A huge ice sculpture of the “Like” symbol, illustrating local enthusiasm for the new Facebook data center in Lulea. Economic development officials hope to leverage that success to win other data center projects.]

Facebook took a different approach to the electrical infrastructure design at its data center in Sweden, reducing the number of backup generators by 70 percent. Facebook says the extraordinary reliability of the regional power grid serving the town of Lulea allows the company to use far fewer generators than in its U.S. facilities.

Using fewer generators reduces the data center’s impact on the local environment in several ways. It allows Facebook to store less diesel fuel on site, and reduces emissions from generator testing, which is usually conducted at least once a month.

Local officials in Lulea say there has not been a single disruption in the area’s high-voltage lines since 1979. The city lies along the Lulea River, which hosts several of Sweden’s largest hydroelectric power stations; the plants along the river generate twice as much electric power as the Hoover Dam.

“There are so many hydro plants connected to the regional grid that generators are unneeded,” said Jay Park, Facebook’s data center design architect. “One of the regional grids has multiple hydro power plants.”

Park said Facebook configured its utility substations as a redundant “2N” system, with feeds from independent grids using different routes to the data center. One feed travels underground, while the other uses overhead utility poles.

Technical Presentations

For those interested in more detailed information, here are links to PDFs and videos of presentations about Facebook’s infrastructure and operations from members of the Facebook Engineering team.


