Facebook's Servers Stay Warm en Route to Arctic Circle

Facebook is planning to use environmentally controlled trucks to deliver equipment to its new data center in Sweden. The concern is not the cold so much as the humidity: rapid temperature swings as gear moves from outdoors to indoors during transport and installation can cause condensation.

Rich Miller

August 2, 2012


The cold temperatures in Lulea, Sweden, make it easy for the local community to greet the new Facebook data center with this huge ice sculpture of the "Like" symbol. But the climate is not as welcoming for servers and IT equipment.

It’s a long cold trip to Lulea, Sweden, a town on the edge of the Arctic Circle where Facebook is building its newest data center. But some of the company’s custom servers may be making the trip in style.

Facebook is planning to use environmentally controlled trucks to make deliveries to Sweden, according to Frank Frankovsky, VP of Hardware Design and Supply Chain for Facebook.

Why? It’s not the cold, it’s the humidity, which creates challenges as equipment is transported and installed, moving from outdoors to indoors.

“A rapid rate of change (in temperature) can create condensation on the electronics, and that’s no good,” said Frankovsky. “The transition is the important part. We want to make sure we don’t have a big rate of change” as servers are moved from transport vehicles into the data center.
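The physics behind Frankovsky’s concern is the dew point: moisture condenses on any surface colder than the dew point of the surrounding air, so a server chilled during transport can fog up the moment it enters a warm staging area. Here is a rough sketch of that check, using the Magnus approximation; the temperatures are illustrative examples, not Facebook’s figures:

```python
import math

def dew_point_c(air_temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point in deg C via the Magnus formula."""
    a, b = 17.62, 243.12  # Magnus coefficients for water vapor
    gamma = math.log(rel_humidity_pct / 100.0) + (a * air_temp_c) / (b + air_temp_c)
    return (b * gamma) / (a - gamma)

def condensation_risk(surface_temp_c: float, air_temp_c: float,
                      rel_humidity_pct: float) -> bool:
    """Condensation forms when a surface is colder than the dew point
    of the air around it."""
    return surface_temp_c < dew_point_c(air_temp_c, rel_humidity_pct)

# A server chilled to -20 C in an unheated truck, wheeled into a
# 22 C room at 45% relative humidity: the room's dew point is about
# 9.5 C, well above the server's skin temperature.
print(condensation_risk(-20.0, 22.0, 45.0))  # True
```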

Heated delivery trucks may seem like a luxury for equipment, but they are a small expense compared to Facebook’s investment in server and storage hardware.

What About Colder Data Centers?

Most of the discussion involving temperature and servers has focused on the impact of warmer data center environments. Companies like Google, Microsoft and Intel have been aggressive about operating their data centers at temperatures above 80 degrees F. This allows them to use outside air to cool servers, reducing the use of power-hungry air conditioners and chillers.

A recent study from researchers at the University of Toronto found that temperature fluctuation may matter more than heat itself in driving hardware failures.

“Even failure conditions, such as node outages, that did not show a correlation with temperature, did show a clear correlation with the variability in temperature,” the authors wrote. “Efforts in controlling such factors might be more important in keeping hardware failure rates low, than keeping temperatures low.”
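A practical corollary is to watch sensor logs for temperature swings rather than just high absolute readings. A minimal sketch of that kind of check follows; the step and variability thresholds are made up for illustration, not drawn from the Toronto study:

```python
from statistics import pstdev

def temperature_alerts(readings_c, max_step_c=2.0, window=12, max_stdev_c=1.5):
    """Flag abrupt step changes and windows of high variability in a
    series of inlet-temperature readings taken at fixed intervals."""
    alerts = []
    for i in range(1, len(readings_c)):
        if abs(readings_c[i] - readings_c[i - 1]) > max_step_c:
            alerts.append((i, "rapid change"))
    for i in range(len(readings_c) - window + 1):
        if pstdev(readings_c[i:i + window]) > max_stdev_c:
            alerts.append((i, "high variability"))
    return alerts

# Steady readings produce no alerts; a sudden 5-degree jump trips both checks.
print(temperature_alerts([21.0] * 6 + [26.0] * 6))
```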

Less attention has focused on operating servers and data centers in extremely cold environments. The issue is worth noting given the growing number of data center projects in the Nordic countries in recent years. Notable examples include Facebook’s Lulea data center, a Google project in Hamina, Finland, and a newly announced facility in Kajaani, Finland, where winter temperatures can drop as low as -45C (-49F).

The IT Center for Science Ltd. (CSC) said it picked SGI as its vendor for the Kajaani project because the company had developed an "extreme weather" version of its ICE Cube Air modular data center. CSC will transform a former paper warehouse into a data center. The first phase of the project will deploy 2 megawatts of IT capacity across 3,000 square meters.

Mixing Helps Modify Cold Air

The design of the ICE Cube Air allows it to recirculate server exhaust heat and mix it with incoming fresh air, raising the cold intake air into the servers’ operating range. The modules will draw air from within the building shell, rather than sitting directly outside in the cold.
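The blending itself is simple heat-balance arithmetic: treating the supply stream as a linear mix of outside air and recirculated exhaust, the exhaust fraction needed to reach a target temperature is (T_target - T_outside) / (T_exhaust - T_outside). A sketch with illustrative numbers (SGI’s actual control logic for the ICE Cube Air is not public; this is just the underlying mixing math):

```python
def recirculation_fraction(t_target_c: float, t_outside_c: float,
                           t_exhaust_c: float) -> float:
    """Fraction of the supply stream that must be recirculated server
    exhaust so that a linear mix with outside air hits the target
    supply temperature. Clamped to the feasible 0..1 range."""
    if t_exhaust_c <= t_outside_c:
        return 0.0  # outside air is already at least as warm as exhaust
    f = (t_target_c - t_outside_c) / (t_exhaust_c - t_outside_c)
    return min(max(f, 0.0), 1.0)

# On a -45 C Kajaani morning, with 35 C server exhaust and an 18 C
# supply target, roughly 79% of the mixed stream is recirculated air.
print(round(recirculation_fraction(18.0, -45.0, 35.0), 2))  # 0.79
```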

Facebook said its management challenges in Lulea won’t end once the servers arrive safe and warm inside the data center. Facebook Director of Datacenter Engineering Jay Park said the company will be carefully monitoring humidity levels during the winter months.

“It’s very cold and dry,” said Park. “It’s all about humidity during the winter. We don’t know how much humidity we’ll need to add (to the data center environment), but we’ll be watching it closely.”
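Park’s point follows from the same vapor physics: heating cold outside air without adding moisture collapses its relative humidity, because saturation vapor pressure climbs steeply with temperature. A quick illustration, again using the Magnus approximation and example temperatures rather than Lulea design figures:

```python
import math

def saturation_vapor_pressure_hpa(temp_c: float) -> float:
    """Magnus approximation of saturation vapor pressure over water."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def indoor_rh_after_heating(t_out_c: float, rh_out_pct: float,
                            t_in_c: float) -> float:
    """Relative humidity of outside air once heated indoors with no
    moisture added: the water content stays fixed while the saturation
    pressure rises with temperature."""
    vapor_hpa = rh_out_pct / 100.0 * saturation_vapor_pressure_hpa(t_out_c)
    return 100.0 * vapor_hpa / saturation_vapor_pressure_hpa(t_in_c)

# -25 C outside air at 80% RH, heated to 20 C indoors, lands near
# 3% RH, far below typical data center humidity floors.
print(round(indoor_rh_after_heating(-25.0, 80.0, 20.0), 1))  # 2.8
```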
