Too Hot for Humans, But Google Servers Keep Humming


The Google data center in Belgium, which features no chillers and routes traffic to other facilities during hot spells. (Photo from Google)

Raising the temperature in server racks can make a data center more efficient. But what happens if the room gets too hot for people?

If you’re Google, the servers keep humming along while the humans retreat to climate-controlled sections of the building. That’s what’s been happening at Google’s data center in Belgium, which was the company’s first facility to rely entirely upon fresh air for cooling, instead of energy-hungry chillers. That approach has helped the facility become Google’s most efficient data center.

For the vast majority of the year, the climate in Belgium is cool enough that this design works without problems. But during hot spells, the temperature inside the data center rises beyond the facility’s desired operating range – periods the company refers to as “excursion hours.”

Staff Retreats From Heat

During these periods, the temperature inside the data center can rise above 95 degrees. That’s when the humans leave the server area.

“We’ve been operating in Belgium since 2008 with no chillers,” said Joe Kava, Senior Director of Data Center Operations for Google. “We’ve had very few excursion hours, and they don’t last long, so we let the site run right through them. We ask our employees to go in and do office work. It’s too warm for people, but the machines do just fine.”
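
To make the excursion-hour idea concrete, here is a minimal sketch of how a facility might count those hours and prompt staff to clear out while the servers ride through. The 95-degree threshold comes from the article, but the sensor read, the alerting stub and all of the names below are hypothetical – this is not a description of Google’s actual systems.

```python
# Hypothetical sketch of "excursion hour" tracking in a chiller-less data center.
# The 95 F threshold mirrors the figure in the article; the sensor read and the
# staff notification are stand-in stubs, not real Google APIs.
import random
import time

EXCURSION_THRESHOLD_F = 95.0   # above this, the room is too warm for people
POLL_INTERVAL_S = 60           # sample the server inlet temperature each minute


def read_inlet_temp_f():
    """Stand-in for a real sensor read (building-management system, IPMI, etc.)."""
    return random.uniform(70.0, 100.0)


def notify_staff_to_vacate(temp_f):
    """Stand-in for whatever alerting a real facility would use."""
    print(f"Inlet at {temp_f:.1f} F: staff move to office space; servers keep running.")


def monitor(samples=10):
    """Count excursion minutes; the servers ride through, only the humans react."""
    excursion_minutes = 0
    for _ in range(samples):
        temp_f = read_inlet_temp_f()
        if temp_f > EXCURSION_THRESHOLD_F:
            excursion_minutes += 1
            notify_staff_to_vacate(temp_f)
        time.sleep(POLL_INTERVAL_S)
    return excursion_minutes
```

A real facility would feed this from its building-management system and, as the photo caption suggests, could use the same signal to shift traffic to a cooler site.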

Google’s experience is the latest affirmation that servers are much tougher than we think. Many data centers feel like meat lockers, as servers are maintained in cool environments to offset the heat thrown off by components inside the chassis. Data center temperatures typically range between 68 and 72 degrees.

In recent years, rising power bills have prompted data center managers to try to reduce the amount of power used in cooling systems. Raising the temperature allows a data center to run with less cooling, or even none at all, as is the case at Google’s Belgium site.

Servers Handle Heat

Studies by Intel and Microsoft showed that most servers do fine with higher temperatures and outside air, easing fears about higher hardware failure rates. Dell recently said it would warranty its servers to operate in environments as warm as 45 degrees C (113 degrees F).

But this trend has implications for data center staff. Higher temperatures at the server inlet also mean more heat in the hot aisle, where staffers work on cabling and other connections at the rear of the servers. This is a special challenge when the hot aisle is contained to prevent waste heat from recirculating into the cold aisle.

The federal Occupational Safety and Health Administration (OSHA) sets guidelines on work conditions in the U.S. that call for specific ratios of rest time to work time when temperatures exceed acceptable limits, as they can in a contained hot aisle in a data center.

Some server techs simply dress for warmer conditions in the hot aisle. For some providers, the answer has been to shift cabling connections to the front of the server so techs can work in the cold aisle. Facebook’s Open Compute design, for example, places cabling on the front of the server, allowing the company to leave the hot aisle dark.

On-Demand Cooling for Hot Aisle

Vendors are starting to offer equipment that can address conditions in the hot aisle. Tate, which makes flooring tiles for raised-floor environments, recently introduced the SmartAire T, an in-floor damper that can provide on-demand cooling in hot aisles. When maintenance is required, cool air can be briefly redirected into the hot aisle to make the area tolerable for staff.

Before entering the hot aisle, a technician uses a supply trigger, typically a switch located outside the hot aisle, to activate the SmartAire T units. Cool air then enters the hot aisle until a comfortable temperature is established. SmartAire T units maintain this temperature until the technician completes the assigned work and deactivates the units.
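
As a rough illustration of that trigger-and-maintain sequence (and not Tate’s actual control logic), a controller for an on-demand damper might look something like the sketch below; the class names, the 80-degree comfort setpoint and the damper interface are all invented for the example.

```python
# Rough illustration of on-demand hot-aisle cooling as described above.
# This is not Tate's firmware: the damper interface, the 80 F comfort
# setpoint and the class names are invented for the sketch.

class Damper:
    """Stub for an in-floor supply damper."""
    def open(self):
        print("damper open: cool air redirected into the hot aisle")

    def close(self):
        print("damper closed: hot aisle back to normal operation")


class HotAisleCoolingController:
    """On-demand cooling for a contained hot aisle, triggered from outside it."""

    def __init__(self, damper, comfort_setpoint_f=80.0):
        self.damper = damper
        self.setpoint_f = comfort_setpoint_f
        self.active = False

    def trigger(self):
        """Technician hits the supply trigger before entering the aisle."""
        self.active = True
        self.damper.open()

    def regulate(self, aisle_temp_f):
        """Called on each sensor reading; hold the aisle near the setpoint."""
        if not self.active:
            return
        if aisle_temp_f > self.setpoint_f:
            self.damper.open()     # keep supplying cool air
        else:
            self.damper.close()    # comfortable; pause until it warms again

    def deactivate(self):
        """Technician finishes the work and shuts the cool-air supply off."""
        self.active = False
        self.damper.close()
```

In use, the technician’s switch outside the aisle would call trigger(), a temperature sensor would drive regulate() on each reading, and finishing the job would call deactivate().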

 

About the Author

Rich Miller is the founder and editor-in-chief of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.


24 Comments

  1. Wenda

    Why don't we get rid of the hot aisle altogether instead of reintroducing cooling or having employees leave the room?

  2. Nate

    95 degrees? Seriously? I work in a restaurant where the guys on the cook line are in 100+ degree areas for several hours. They seem to do ok with a pitcher of ice water.

  3. Karen

    Because that's a moronic idea and is vastly less efficient.

  4. David Sommers

    You can't really just get rid of it. All the servers blow air from front to back. You supply colder air to the front, the equipment heats it up as the air passes through the server, and then spits it out the back warm. By having a hot aisle, you can control airflow such that chillers chill just the hot air - they don't have to process the vast amount of air that's already cool - which in turn makes the system overall more efficient. The key is, if you only have to cool to 90 degrees instead of 70 degrees, you'd save a lot of energy. For the techs working in the 100+ degree hot aisle (because 90 degrees is the ambient cold-side air), they can bring it down to a manageable level.

  5. Wilgert

    Great solution Wenda, just turn off the servers: problem solved.

  6. Moschops

    "Why don't we get rid of the hot aisle altogether instead of reintroducing cooling or having employees leave the room?" Because the hot aisle is a good thing, as explained in the article. You can put all the heat in the hot aisle, and everything the meat-bags actually need to get to in the cool aisle.

  7. Le_Baron_Samedi

    I guess plenty of guys must have thought this through before. But it seems the game is to cool efficiently while still reaching "99.99%+" server reliability. Is that not simply cost over quality? Would anyone know what the failure rate increase is as temperature goes up? There could be an arbitrage to make between the total cost of server failure/interruption vs. the savings from higher air temperature. I would guess that if Dell guarantees 45C/113F, it means their gear is almost completely safe at those temperatures. And there is possibly little reason why other brands or older kit won't be able to support that heat as well.

  8. Mark

    Why stop at 95 degrees Fahrenheit? And I always wonder why they cool the air. Just route some outside air in, and then out a warm air vent, possibly with the heat being captured and used for some other purpose. They could install these in basements of apartment buildings and heat the buildings. Or in summer when the heat is not welcome, use it to heat water for use in the buildings. Make it a turnkey appliance that's free or subsidized by whoever is using the server resources, for a win-win.

  9. Wenda

    In response to Wilgert & Moschops: my comment was not meant in ignorance, but I should have been more clear. My intent was to suggest not having an inhabited hot aisle at all - contain the racks and pull the hot air into chimneys with active variable-speed fans, which makes the entire data center a cool, stabilized environment and reclaims wasted cooling.

  10. How long until they use the Bernoulli principle? Some decades ago, a German university researched how termites and prairie dogs ventilate their buildings this way. They also found that you can reduce your energy costs by 80% for cooling in summer and 40-60% for heating in winter. HOW LONG? There is only one South African architect using this for malls so far...

  11. James

    Why not close down the data centers and simply use paper? We could have web programmers and designers build web sites on pads of paper and everyone could just simply search through the stacks of paper to find what they need.

  12. PhelanPKell

    It's never as easy as several of you seem to think. Everyone seems to forget that it's not just the temperature they are controlling, but also things like moisture in the air. With the temperatures going on in a server farm, any moisture from the air could fry any of that hardware fast.

  13. I like Wenda's idea - eliminate the hot aisle and evacuate server exhaust heat out the top of the racks. ASHRAE accepts this, and it's called front-to-top airflow. There is a white paper available on this website, titled "Reclaim Wasted Cooling," that covers many of these details.

  14. JCL

    Wenda has redeemed herself with her second comment. She's on to us.

  15. JCL

    @ Le_Baron_Samedi - Check ASHRAE TC9.9 for x factor and other failure rate information as it correlates with temperature & humidity. You'd be surprised.

  16. JohnSmith

    @Dell warranty past 95F -- the ASHRAE spec states that the server can be throttled to achieve compliance. Does anyone know if Dell is throttling the power to comply, or can they run at 100% while the inlet air is over 95F? @Mark & Wenda -- the HP EcoPOD uses those principles to achieve some pretty impressive efficiency. They take outside air, measure its temp/humidity, and if it is within ASHRAE spec they feed it directly to a cold aisle and exhaust it out the top. If it's out of spec, they heat, cool or humidify the air as needed.

  17. That Guy

    @Nate: Yes, because having Data Center techs working in the server room with "pitchers of ice water" is a really fantastic idea.

  18. I'm sick of the raised heating - bottom line is, the life expectancies of hard drives, servers and switches are severely diminished with higher temps... Companies don't care, since they normally carry hefty hardware contracts/warranties and simply replace failing/overheated hardware and keep on rolling.