Microsoft’s Timmons: ‘Challenge Everything’

The building blocks for Microsoft’s data center of the future can be assembled in four days, by one person. The two proof-of-concept data center containers, known as IT PACs (short for pre-assembled components), are built entirely from aluminum. The first two units use residential garden hoses for their water hookups.

“Challenge everything you know about a traditional data center,” said Kevin Timmons, who heads Microsoft’s Global Foundation Services, in describing the company’s approach to building new data centers. “From the walls to the roof to where it needs to be built, challenge everything.”

Timmons, the keynote speaker at today’s DataCenterDynamics New York conference at the New York Hilton, discussed Microsoft’s design innovations for its next generation data center infrastructure, saying the industry is at an “inflection point.”

The Just-in-Time Data Center
“View your data centers as a traditional manufacturing supply chain,” said Timmons. “We’ve got PACs coming in from Singapore, others from Italy and others from the United States.” Those building blocks – which will include containers for electrical and mechanical support equipment as well as servers and storage – allow data centers to be assembled on a just-in-time basis. Once a site is selected and a steel frame deployed, the modular approach allows data center capacity to be deployed quickly, in cost-effective increments.

Microsoft plans to assemble its IT PACs in huge facilities built around a central power spine, with container shelters on either side. Diagrams from Timmons’ presentation depicted facilities with no side walls and a pointed roof with vents at the top, a design that appears similar to the “computing coops” being built by Yahoo at the company’s new data center in Lockport, New York.

The future Microsoft data centers will be fully air-cooled, with no mechanical cooling, Timmons said. The key to that approach is Microsoft’s updated container/IT PAC design, which functions as a huge air handler with racks of servers inside. The units are technically classified as air handlers instead of structures, a designation which may prove helpful in deploying capacity quickly.

PUE of 1.06 in Testing
Timmons said the latest container design is proving to be extraordinarily efficient, operating with a Power Usage Effectiveness (PUE) of 1.06 in testing. That would rank among the lowest scores reported, below even Google’s published PUEs, which average between 1.1 and 1.2 for most of its facilities.
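For readers unfamiliar with the metric: PUE is simply total facility power divided by the power delivered to the IT equipment, so a score of 1.06 means only 6 percent of the facility's power goes to overhead such as cooling and power distribution. A minimal sketch of the calculation, using illustrative numbers rather than Microsoft's actual measurements:

```python
# Power Usage Effectiveness: total facility power / IT equipment power.
# A perfect score is 1.0 (every watt reaches the servers).
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the PUE ratio for the given power draws (same units)."""
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 1,000 kW of IT load, 60 kW of overhead.
print(round(pue(1060.0, 1000.0), 2))  # 1.06 -- only 6% overhead
```

By comparison, a conventional data center with chillers might draw 2,000 kW to power a 1,000 kW IT load, for a PUE of 2.0.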

Running servers at a higher temperature is a contributor to that efficiency. “We’ve started to push our inlet temperatures up to 90 to 95 degrees,” said Timmons. “At that point, it’s really about humidity more than temperature.”

“Free cooling” using fresh air instead of chillers can save enormous amounts of energy, but also usually places limits on site location for data centers. Timmons said climate is a crucial factor in Microsoft’s site location decisions, but indicated that the new container designs may broaden its options.

“I haven’t yet found a place in the world where they won’t work,” he said. “We’re currently running a trial in Southeast Asia in a high-temperature, high-humidity environment, and I’m looking forward to the results.”

About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.

8 Comments

  1. Paul M

    I'm pretty sure that whilst motherboards, CPUs and RAM don't mind raised temperatures, the batteries in UPSs get shorter lives, and I wouldn't want to run my hard drives too warm either.

  2. Bob L

    very interesting. would love to see what these look like.

  3. Kevin Timmons is dreamy :-) Congrats for joining the party in popping the traditional data center bubble. Lower Cost. Faster Construction. Higher Performance. Yes, it's a rare land grab opportunity to have all three.

  4. It is a natural approach, but it requires a change of mindset about clean rooms and UFAD systems. Where can we see this product? Thanks

  5. Santiago Rex

    I used this technology 40 years ago, and after some months we decided to go back to the "classical system" due to the high cost of the high-efficiency filtering system. A pity, but our environment is too dusty. At this time we are using closed-circuit air conditioning, with great results and significant energy savings. Also, how is it possible to install an efficient fire-suppression system with this approach? With an open refrigeration system it's not possible!