
Squeezing More Efficiency Out of Microsoft’s Cloud



The new data halls in Microsoft’s Dublin data center feature white cabinets and narrower hot aisle containment systems. Rows of cabinets are nestled into containment enclosures, the structures with the green end doors. Newly arrived cabinets are in place, waiting for the remainder of the row to be filled and then enclosed. The white cabinets are a new feature, reflecting available light and allowing Microsoft to use less energy on overhead lighting. (Photo: Microsoft)

DUBLIN, Ireland – After you’ve built one of the most efficient data centers on earth, how do you make it even better? One refinement at a time, as Microsoft has found in its data center in Dublin, the primary European hub that powers the company’s online services throughout the region.

When the $500 million Dublin facility came online in 2009, it was an early example of a data center operating with no chillers, relying almost totally upon fresh air to cool thousands of servers in the 550,000 square foot facility, which powers the company’s suite of online services for tens of millions of users in Europe, the Middle East and Africa.

Early last year Microsoft added a $130 million expansion that nearly doubled the capacity of the data center. The expansion allowed Microsoft to implement several new tweaks to its design that have allowed it to more than double the compute density of each server hall while using less power.

Along the way, Microsoft has also improved the facility’s energy efficiency, lowering the Power Usage Effectiveness (PUE) from 1.24 in the first phase to 1.17 in the newest data hall. The PUE metric compares a facility’s total power usage to the amount of power used by the IT equipment, revealing how much is lost in distribution and conversion. The average PUE for enterprise data centers is about 1.8.
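The PUE arithmetic described above is simple to sketch. The kilowatt figures below are illustrative, not from Microsoft; only the ratios (1.17, 1.24) come from the article:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power.

    A PUE of 1.0 would mean every watt drawn goes to the IT equipment;
    everything above 1.0 is overhead (cooling, power conversion, lighting).
    """
    return total_facility_kw / it_equipment_kw

# Illustrative 1,000 kW IT load: at a PUE of 1.17 the facility draws
# 1,170 kW in total, versus 1,240 kW at the first phase's PUE of 1.24 --
# a 70 kW reduction in overhead for the same compute.
print(pue(1170, 1000))  # 1.17
print(pue(1240, 1000))  # 1.24
```

The metric says nothing about how efficiently the servers themselves use their power, which is why Microsoft pairs PUE improvements with more efficient server hardware.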

Data-Driven Refinement: The Next Phase of Efficiency

Squeezing more efficiency and density out of bleeding-edge facilities is the next phase in the data center arms race. It’s a process that other leading players will be undertaking as they seek to get more mileage out of new server farms that came online in the huge construction boom from 2007 to 2010.

“We’re all moving towards constant evolution and improvement,” said David Gauthier,  Director of Data Center Architecture and Design at Microsoft, who helped design and launch the Dublin facility in 2009.

One key to improvement is relentless review of data from the early operations of new data centers, Gauthier said. As Microsoft studied the operating data it collected, it found that it could be more aggressive in its use of free cooling.

“We were being conservative at first, because it was new and we hadn’t done it before,” said Gauthier. Microsoft had installed a small number of DX (direct expansion) cooling units in the first phase to provide backup cooling if the temperature rose above 85 degrees. The climate in Dublin, which has ideal temperature and humidity ranges for data center operations, never tested those levels.  The DX units were retired, making additional power available, which was used to install more servers and cabinets in the data halls.

In place of the DX units, Microsoft added a less energy-intensive backup system to address “just in case” scenarios of unusually warm weather.  It used adiabatic cooling, in which warm outside air enters the enclosure and passes through a layer of media, which is dampened by a small flow of water. The air is cooled as it passes through the wet media.
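The adiabatic cooling process described above is commonly modeled with the standard saturation-effectiveness relation for direct evaporative coolers: the air is cooled toward its wet-bulb temperature, scaled by the wetted media's effectiveness. This is a generic textbook sketch, not Microsoft's control logic, and the 90% effectiveness and Dublin temperatures below are assumed for illustration:

```python
def evaporative_outlet_temp_c(dry_bulb_c: float, wet_bulb_c: float,
                              effectiveness: float = 0.9) -> float:
    """Approximate supply-air temperature from a direct evaporative
    (adiabatic) cooler. The wet-bulb temperature is the theoretical
    lower limit; real media reach some fraction (effectiveness) of
    the dry-bulb-to-wet-bulb span.
    """
    return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)

# Hypothetical warm Dublin day: 27 C dry bulb, 18 C wet bulb.
# With 90% effective media the supply air drops to about 18.9 C.
print(evaporative_outlet_temp_c(27.0, 18.0))
```

The appeal for a backup system is that the only energy inputs are fans and a small water pump, a fraction of what the retired DX compressors would have drawn.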

But Microsoft has now shelved the adiabatic systems in its most recent data halls, as Dublin’s weather simply doesn’t require it. “The climate in Dublin is awesome,” said Gauthier.

Greater Density, Same Power Footprint

Inside the data center, Microsoft is using more powerful and efficient servers, and configuring data halls to house more cabinets and servers. Each row of cabinets is housed in a “server pod” featuring a hot aisle containment system, with the cabinets housed in a fitted opening in the side of a fixed enclosure.

Microsoft designed the contained hot aisles so they could easily use cabinets of different heights, with one enclosure fitted with some 40U cabinets and some 48U cabinets, for example. This gives the company flexibility if it opts to use different server vendors. It has also narrowed the hot aisles themselves, which frees up more space for servers in each data hall.

These refinements, along with advances in processor power and efficiency, have helped Microsoft more than double the compute density of its data halls without expanding their power footprint.

Other recent refinements include the installation of energy-saving LED lights tied to motion sensors, meaning Microsoft uses less energy to power its lights, and only uses them when staff are present in a room. It has also adopted white cabinets, which can save on energy since the white surfaces reflect more light. This helps illuminate the server room with less intense lighting.

The focus on energy savings extends to the backup power systems. Microsoft uses short-duration UPS units, which provide about 1 minute of runtime during a utility outage before shifting load to the building generators. This approach allows Microsoft to forgo a huge battery room in favor of a smaller enclosure within its power room. Rather than cooling the entire power room to protect battery life, only the enclosure is air conditioned, using only enough energy to cool a small space instead of the entire room.
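A back-of-envelope calculation shows why a 1-minute ride-through shrinks the battery footprint so dramatically. The load figure and inverter efficiency below are hypothetical, not Microsoft's specifications; the comparison is against a conventional 15-minute battery plant:

```python
def ups_energy_kwh(it_load_kw: float, runtime_seconds: float,
                   inverter_efficiency: float = 0.95) -> float:
    """Battery energy (kWh) needed to carry a given IT load until the
    generators pick it up, accounting for inverter conversion losses."""
    return it_load_kw * (runtime_seconds / 3600.0) / inverter_efficiency

# Hypothetical 2 MW data hall:
short_ride = ups_energy_kwh(2000, 60)       # 60-second ride-through
long_ride = ups_energy_kwh(2000, 15 * 60)   # conventional 15 minutes
print(round(short_ride, 1))  # 35.1 kWh
print(round(long_ride, 1))   # 526.3 kWh
```

Roughly fifteen times less stored energy means a correspondingly smaller battery enclosure, and a smaller volume of temperature-sensitive cells to keep cool.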

Microsoft is not alone in the effort to pursue energy gains in company-built facilities. Google recently “gutted” the electrical infrastructure of its data centers in The Dalles, Oregon to upgrade it for more powerful servers. The facility in The Dalles was built in 2006.

About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.
