
Moving Beyond 'Babysitting Servers'

After years of living in air-conditioned, brightly-lit rooms, servers are now seeing a change in their surroundings. Is the data center industry ready to put servers in warmer, darker environs? Or will uneasiness among management and customers limit the impact of this trend?

The interior of a 40-foot container inside the new Microsoft Chicago data center, packed with servers on either side of a center aisle.

Are servers spoiled? Microsoft CEO Steve Ballmer thinks so. "There shouldn't be people babysitting all these machines," Ballmer said recently in discussing Microsoft's push into cloud computing.

After years of living in air-conditioned, brightly-lit rooms, servers are now seeing a change in their surroundings. At the largest data center builders, servers now reside in warmer, darker environs, sometimes encased in shipping containers.

Is the data center industry ready to get out of the "babysitting" business? Or will management and customer uneasiness limit the impact of this trend? These were hot topics of discussion at both the Uptime Institute Symposium 2010 and the Tier 1 Datacenter Transformation Summit.

As mounting power bills push energy efficiency to the fore, data center designers continue to set aside long-held beliefs about operating environments for IT gear. As Ballmer's comments suggest, Microsoft has been among the most aggressive in pushing the boundaries, particularly with the temperature inside its ITPAC data center containers.

Pushing to 45 Degrees C


"We've gone all the way to 45 degrees C (about 113 degrees Fahrenheit) and seen no significant decrease in reliability," said Dan Costello, Microsoft's Director of Data Center Research.

Raising the baseline temperature inside the data center can save money spent on air conditioning. Data center managers can save as much as 4 percent in energy costs for every degree of upward change.
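To make that rule of thumb concrete, here is a minimal back-of-the-envelope sketch in Python. The dollar figure, the size of the setpoint change, and the simple linear treatment of the 4 percent figure are illustrative assumptions, not numbers reported by anyone quoted in this article.

```python
# Rough sketch of the "up to 4 percent per degree" rule of thumb.
# All inputs are illustrative assumptions, not figures from the article;
# real savings depend on climate, cooling plant design, and server fan behavior.

def estimated_annual_savings(annual_energy_cost, degrees_raised, savings_per_degree=0.04):
    """Linear estimate of savings from raising the data center temperature setpoint."""
    savings_fraction = min(savings_per_degree * degrees_raised, 1.0)
    return annual_energy_cost * savings_fraction

# Example: a facility spending $1,000,000 a year on energy that raises its
# setpoint by 5 degrees could save on the order of $200,000 a year.
print(f"Estimated savings: ${estimated_annual_savings(1_000_000, 5):,.0f}")
```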

While most of Microsoft's data center containers are managed remotely, the warmer temperatures require some design accommodations so that staff can maintain servers from the cold aisle rather than the hot aisle, where service access sits in many server and rack designs. "We’re working with front access designs," said Costello.

"Servers don’t need much air conditioning," agreed KC Mares of Megawatt Consulting, who has designed and built data centers for Yahoo and Google. "We humans like it pretty comfortable. Servers could care less. They're a block of metal. Most equipment is waranteed to 85 degrees and vendors will still support it."

Concerns About Recovery Time


But not all data center operators are willing to push those limits, fearing that high efficiency requires a trade-off on reliability. "One of the issues comes up when you start to run high density loads at that 80 degree temperature," said Jack McCarthy, a principal with Integrated Design Group. "Many customers aren’t comfortable with that level of risk."

McCarthy said a common customer concern is that a higher temperature leaves less time to recover from a cooling failure. Another panelist at the Tier 1 event, Joerg Desler, the VP of Engineering for cooling vendor Stulz, said proper monitoring is critical in those scenarios. "We believe that with controls you can address those risks and maximize safety," said Desler.

Raising the temperature to reduce the power required for cooling has also received a boost from ASHRAE, the industry group for heating and air conditioning professionals, which increased the top end of its recommended temperature range from 77 to 80 degrees.

Fighting the Fans


But nudge the thermostat too high, and those energy savings can evaporate in a flurry of fan activity. "At a certain (temperature) point your fan load goes up," said Mares. "The fans near servers consume about 3 to 4 times the power you really need to get the job done."

"I’m not a fan of fans," said Mares. "Blowing more air is usually not the right solution. What you want to do instead is shrink the room. You can do without fans in your data center, but it requires a lot of creative energy."

Mares said there are two approaches to "fanless data centers." One uses fans only in the servers or racks, and none in the data hall. The other puts no fans on the servers, but uses fans in the data center or container.

Educating the Customer


These new design concepts can be jarring for companies accustomed to data centers running between 68 and 72 degrees with lots of fan noise. "I sit down with my customer and say, 'What are your real needs?'" said Mares. "All of these technologies are backed up by real studies with real companies. Part of what I do is to educate the customer."

"We can do all this today," said Victor Avelar, Senior Research Analyst at APC by Schneider. "It’s about whether we can psychologically get over these barriers."

That's a particular challenge in multi-tenant environments such as colocation facilities. "It’s been a mixed uptake," said Dave Pickut, the Chief Technology Officer for colocation specialist Equinix. "We have some customers that want a traditional data center design. But we have others who are more open to new things."

Pickut said Equinix had recently widened the temperature range for its global footprint in line with the ASHRAE adjustments. While some customers remain cautious, Pickut said the efficiency and cost gains from new approaches such as the Yahoo Computing Coop are having an impact.

"I think you’ll see a much wider acceptance that the data center can look much different than it does today," said Pickut.
