New from Microsoft: Data Centers In Tents

[Photo: Microsoft data center in a tent]

This month we’ve already had a patent from Google for floating data centers, and a proof-of-concept from Intel for a data center with almost no air conditioning. What’s next?

It turns out that Microsoft’s Christian Belady and Sean James have run their own outside-the-box proof of concept in which they’ve run a rack of servers under a tent in the fuel yard for one of the company’s data centers. Christian provides some details about the experiment in a blog post at The Power of Software:

Inside the tent, we had five HP DL585s running Sandra from November 2007 to June 2008, and we had ZERO failures, or 100% uptime. In the meantime, there have been a few anecdotal incidents:

  • Water dripped from the tent onto the rack. The server continued to run without incident.
  • A windstorm blew a section of the fence onto the rack. Again, the servers continued to run.
  • An itinerant leaf was sucked onto the server fascia. The server still ran without incident.

While few data center managers would be brave enough to submit mission-critical apps to such conditions, the experiment builds on Intel’s work in suggesting that servers may be hardier than believed, leaving more room for optimizing cooling set points and other key environmental settings in the server room. Suffice it to say that Microsoft is also on the same page with Intel when it comes to increased use of air-side economizers, which have been incorporated into the design of data centers Microsoft has under construction in Chicago and Dublin.


About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.

Add Your Comments



  1. God


  2. Army Commo Guy

    Geez, we've been doing this for 5 years in Iraq.

  3. ron thompson

    This is wonderful, but the mean temp in Tukwila is 45 degrees during this timeframe. What happened from June to present? I do agree that servers are a lot more robust than we give them credit for. Most of them will run in 10-85% humidity and up to a 90 degree input temp now. RT

  4. Dubya

    Sure... This might prove that servers are well built... I don't see it convincing managers to go out and invest in tents any time soon, however, especially considering that the elements did make it in there and into contact with the servers in question.

  5. Chubblez

    @Ron Thompson - Actually, last I checked, they're still doing quite well. They're not in any real type of production environment, as it's still proof of concept. ~Chubblez

  6. Chris

    Run them for a year without the tents and then I'll be impressed.

  7. just another electronic engineer

    Let's look at this tent setup at a component level: the PCB, capacitors, resistors, inductors, heatsinks (copper or aluminum), wires (rubber, teflon or silicone) and ceramic ICs. In the tent, the electronic components are exposed to rapid and continuous wet/dry and hot/cool cycles (relative to a 24/7 A/C datacenter), which shortens the service life of the exposed parts. Under these conditions the parts will reach EOL sooner than in a conditioned environment: the PCB will warp, capacitors will lose their charge/discharge ability, inductors will rust (the sealed ones fare better), resistors will drift from their initial resistance, heatsinks will oxidize (raising thermal resistance), wire sleeves will become damaged (dry, crack, peel, etc.), and a ceramic IC, if heated after it has absorbed enough moisture, will crack; solder joints will weaken or break as the PCB and the parts above it expand and contract at different rates. If the point is trying to save money, then the balance should depend on the intended lifetime of a server. If its intended service life is, for example, 2 years, then the decrease in part life from its spec'd 10 years will not be observed; however, if the servers are intended to last until they die, that's where it will make a difference.

  8. anonymous

    Nothing says secure data like putting it in a tent.

  9. Jay

    Physical security??? :)

  10. michael

    Oh yea... but think of the DR possibilities!! I could set up DRP involving an old army tent in the carpark of my (now a shell of a burnt out) office; pull the fibre feeds back from the burnt office into the tent. No aircon costs. No paying hotel bills - the DR staff can sleep in tents too - or their cars! sweet! I'm off to book a meeting with the board to float that idea!

  11. AM

    This is new? Not so much...

  12. Cwilly

    This sounds okay if a company wants to do it on the cheap. Too many issues though. Physical security? Weather that includes snow or hail, and temps ranging from below freezing to above 100 degrees during the year? Electronic engineer brings up several points about the effects on the server electronics. Not really a good idea for a company serious about their data center.

  13. Lucy Stockton

    Hi Ron, Funny Comments, Lucy Stockton, ATS

  14. Wooden Monkey

    They should start building the server racks out of wood next

  15. mike

    Check out the temp section (3.4), figure 5.