Microsoft data center campus in Quincy, Washington (Source: Microsoft video)

Latest Microsoft Data Center Design Gets Close to Unity PUE

Microsoft says it is now close to achieving unity PUE, or perfect infrastructure energy efficiency, with its latest data center design.

Innovation in data center design by cloud giants has widespread implications for their users as well as for the data center industry in general. Improving data center efficiency is one of the main levers cloud providers can pull to lower the price of their services. As they innovate in pursuit of lower and lower cost at greater and greater scale, many of those innovations eventually get adopted by companies designing data centers for other purposes.

Microsoft’s cloud now lives in more than 100 data centers in 34 regions around the world, a mix of facilities the company builds and operates itself and leased space.

The company is now on its fifth-generation data center design. In a rare look at the physical backend of its global cloud infrastructure, the company has produced a video tour (registration required) of its data center campus in Quincy, Washington, which showcases three design generations, including the latest one.

The fifth-generation facility is a culmination of everything the company’s infrastructure team has learned about data center design over the previous generations. “We’ve learned something from every step of the way,” Christian Belady, general manager of cloud infrastructure strategy and architecture at Microsoft, said in the video. “We’ve adopted it and brought it into this design.”

The facility’s PUE (Power Usage Effectiveness) is down to 1.1, and even below 1.1 at certain times of the year, he said. It’s “almost at unity, so it’s a major step.”

PUE has been championed by The Green Grid, the data center industry organization responsible for making it the most widely used data center energy efficiency metric in the world, but Belady created the concept about a decade ago while designing a data center for a customer. PUE measures the power overhead of electrical and mechanical infrastructure by comparing the total power that comes into a data center with the amount of power used by the IT equipment inside.
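The definition above can be expressed as a one-line calculation. This is an illustrative sketch, not anything from Microsoft or The Green Grid; the example figures are chosen only to produce the 1.1 PUE the article cites:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power.

    A PUE of 1.0 ("unity") means zero infrastructure overhead -- every
    watt entering the building reaches the IT equipment.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Example: a facility drawing 10,000 kW in total while delivering
# 9,091 kW to servers has a PUE of ~1.1, i.e. cooling, power
# distribution, and lighting add roughly 10% overhead.
print(round(pue(10_000, 9_091), 2))  # → 1.1
```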

Microsoft doesn’t appear to have come up with any groundbreaking innovations to achieve its extremely low PUE. The biggest differences between the latest design and the previous-generation one are a move away from using data center containers, or ITPACs, as the company called them, and a switch to waterside economization from a mix of outside air and adiabatic cooling that was employed in ITPACs.

Microsoft spokesman Alex Brady with ITPACs in the background on Microsoft's Quincy, Washington, data center campus (Source: Microsoft video)

As we reported earlier this year, the company has stopped using ITPACs because the model couldn’t support the rate of global data center capacity expansion it has been undertaking as it grows its cloud services business, namely Azure and Office 365.

The fifth-gen Microsoft data center design is a building where racks that ship to the site already packed with pre-tested servers are rolled in from the loading dock along a bare concrete floor. The lack of raised floor is a departure from Microsoft’s pre-ITPAC days, when it used traditional raised-floor-based cooling design.

The new design uses a closed-loop waterside economization system. Water is cooled in massive cooling towers outside the facility and pushed through the cooling loop. Inside the building, the loop passes through a heat exchanger, where a wall of fans pushes cold air into the IT hall.

Fan wall inside Microsoft's fifth-generation data center in Quincy, Washington (Source: Microsoft video)

The design also employs a few reliable efficiency best practices, such as hot-aisle containment and optimal inlet air temperature. Because Microsoft designs its own servers, it has been a priority to ensure they can run at relatively high temperatures, which reduces the demand for cooling capacity.

Belady did not specify the inlet air temperature inside the fifth-generation Microsoft data center, but servers inside its ITPAC-based gen-four data centers run without any air conditioning when the outside air temperature is around 80F. Once outside air rises above 90F or so, the cooling surface of the adiabatic coolers gets wetted to bring the temperature down, he said.
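The gen-four cooling behavior described above amounts to a simple temperature-driven mode switch. The sketch below is illustrative only: the 90F threshold is the approximate figure quoted in the article, not Microsoft’s actual control setpoint, and the real control system is certainly more sophisticated:

```python
def itpac_cooling_mode(outside_temp_f: float) -> str:
    """Hypothetical gen-four ITPAC cooling mode selection.

    Around 80F and below, servers run on outside air alone with no
    air conditioning; above roughly 90F, the adiabatic coolers'
    surfaces are wetted so evaporation brings the air temperature down.
    """
    if outside_temp_f > 90:
        return "adiabatic"     # wet the cooler surface to drop inlet temp
    return "outside-air"       # free cooling, no mechanical refrigeration

print(itpac_cooling_mode(78))  # → outside-air
print(itpac_cooling_mode(95))  # → adiabatic
```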

In general, the latest Microsoft data center design looks close to what Facebook and Google have been doing in their facilities. The overall theme is simplification and industrialization. These facilities look closer to manufacturing plants than to the lab-like raised-floor IT environments you will find inside the more traditional enterprise data centers. “It feels very industrial,” Belady said. “Everything’s very simple.”


About the Author

San Francisco-based business and technology journalist. Editor in chief at Data Center Knowledge, covering the global data center industry.


One Comment

  1. Steve S

    So nothing really earth-shaking, and not as good as what others (Facebook, Google) are claiming (designs below 1.1). Bottom line is that they may not be creating as much CO2 pollution as before, but they still create “tons” of thermal pollution, which no one is talking about. In the case of Facebook, Twitter, Instagram, etc., we’re polluting the planet so people can share cat videos and their latest trip to the grocery store or walk to the kids’ school. We’re turning into a world of narcissists while damaging our environment.