Questions Every IT Manager Should Ask About Thermal Management

A smart approach to data center cooling can reduce electricity bills by up to 50 percent, writes JP Valiulis of Emerson Network Power. But how do you identify the best thermal management strategy for your facility?

JP Valiulis, vice president of Marketing at Emerson Network Power, is a sales and marketing leader with a history of growing revenue and profitability at leading B-to-B and B-to-C companies.

Managing data centers to keep servers and their applications functioning properly is a core responsibility of every IT manager, as is keeping costs and energy usage to a minimum. CIOs and IT managers don’t have to be cooling experts, but they should know enough to make intelligent management-level decisions about the most cost-effective way to manage energy usage in their data center.

Today’s robust servers allow data centers to operate at hotter temperatures. You don’t need to run iceboxes anymore, nor should you need to wear a sweater inside your data center. With simple but smart adjustments or investments in the cooling system, many data centers can cut their cooling electricity bill by up to 50 percent.

The following questions and best practices will help you identify the best thermal management strategy for your data center, with the end goal of slashing energy costs without reducing availability of critical infrastructure.

Turn Up the Thermostat

Are you hot when you walk through your data center? You probably should be, a little. The old standard was 72 degrees for return air (the mixture of air returning from computers to the cooling unit) and 50 percent relative humidity. Today you can push return air temperatures as high as 95 degrees, and the change can be made over a few days with little risk to applications and IT equipment. It is best to raise temperatures in small increments to avoid unexpected humidity trouble and to confirm that all the IT equipment keeps functioning properly, so enlist your facilities manager or vendor partners to assess how to do it safely. Remember: for every 1 degree Fahrenheit increase in temperature, you will save 1.5-2.0 percent of your cooling energy costs.
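
To put that rule of thumb in numbers, here is a minimal Python sketch. The per-degree savings figure comes from the paragraph above; the annual cooling bill and the size of the set point increase are hypothetical inputs.

```python
# Back-of-envelope estimate of savings from raising the return-air set
# point, using the article's rule of thumb: roughly 1.5-2.0 percent of
# cooling energy costs saved per 1 degree Fahrenheit increase.

def setpoint_savings(annual_cooling_cost, degrees_raised, pct_per_degree=0.0175):
    """Estimated annual savings; 0.0175 is the midpoint of 1.5-2.0 percent."""
    return annual_cooling_cost * degrees_raised * pct_per_degree

# Hypothetical example: a $200,000/year cooling bill, with the return-air
# set point raised 8 degrees (e.g., 72 F to 80 F) in small increments.
print(f"${setpoint_savings(200_000, 8):,.0f} per year")  # $28,000 per year
```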

Raise chilled water temperatures. For many years, 45 degrees was the standard for water in the chiller. That’s changing: operating chillers at up to 55 degrees is possible today, reducing energy consumption by 20 percent. Every degree matters, since each 1 degree increase in water temperature reduces chiller energy consumption by 2 percent. This can make a huge difference, because the chiller is the heart of the cooling system and consumes approximately 75 percent of the system’s electricity. Work with your facilities manager here: raising chilled water set points can reduce the cooling capacity of your data center cooling units, which is fine only if you have some excess capacity.
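
The same kind of estimate works for the chilled water loop. A rough sketch using the article’s figures (2 percent chiller savings per degree, chiller drawing about 75 percent of the system’s electricity); the 380 kW plant size is a hypothetical input.

```python
# Estimated savings from a warmer chilled-water set point, using the
# article's figures: the chiller draws about 75 percent of the cooling
# system's electricity and saves roughly 2 percent per degree F raised.

CHILLER_SHARE = 0.75       # chiller's share of cooling-system power
SAVINGS_PER_DEGREE = 0.02  # fractional chiller savings per degree F

def chilled_water_savings_kw(cooling_system_kw, old_temp_f, new_temp_f):
    """Approximate kW saved across the cooling system."""
    chiller_kw = cooling_system_kw * CHILLER_SHARE
    return chiller_kw * (new_temp_f - old_temp_f) * SAVINGS_PER_DEGREE

# Hypothetical 380 kW cooling plant moving from 45 F to 55 F water:
print(f"{chilled_water_savings_kw(380, 45, 55):.0f} kW saved")  # 57 kW saved
```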

Put Cold Air in its Place

Do you know where your cold air is going? The physical arrangement of your data center can make a big difference. If you have a raised floor, keep it as uncluttered as possible. Areas beneath the raised floor are often jammed unnecessarily with wires that block cold air from efficiently reaching its target.

Make sure blanking panels are in place wherever there are unused spaces in racks. Plug gaps in floors, walls and ceilings to seal the room. Add return plenums (closed chambers that direct air flow) where appropriate. These tweaks have one goal: make your cooling unit work less, a sure-fire way to cut costs.

In temperature control, you don’t want hot air mixing with cold air in the data center aisle. By placing a pre-fab structure over the area that needs to be cooled – the aisle between two racks – you create a “cool room” within your data center. Warm air discharged from the back of servers can’t creep around the front to meet cool air, so cool areas stay cool and warm areas stay warm.

Adjust Cooling Capacity

Do your cooling units have the ability to ramp up and down with changes in your IT loads? Your cooling equipment should have variable capacity components (fans and, where applicable, compressors) to adjust cooling capacity up and down with your IT load. Constant speed fans are common, but they can’t adjust to a data center’s actual demand. A 10 HP fan motor draws 8.1 kW at 100 percent speed, but only 5.9 kW at 90 percent and 2.8 kW at 70 percent. Because fan power falls roughly with the cube of speed, savings are significant when fan speed can be matched to the data center’s actual requirements.
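
Those figures follow the fan affinity laws, under which power draw scales roughly with the cube of fan speed. A minimal sketch reproducing the numbers above (the 8.1 kW full-speed draw is the article’s figure):

```python
# Fan affinity law: power draw scales roughly with the cube of fan speed.
# The article's 10 HP fan figures follow this curve.

def fan_power_kw(full_speed_kw, speed_fraction):
    """Approximate power at a reduced speed (cube law)."""
    return full_speed_kw * speed_fraction ** 3

for pct in (100, 90, 70):
    print(f"{pct}% speed -> {fan_power_kw(8.1, pct / 100):.1f} kW")
# 100% speed -> 8.1 kW
# 90% speed -> 5.9 kW
# 70% speed -> 2.8 kW
```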

Is it Time for an Upgrade?

How old are your unit controls? If they are more than four or five years old, you may have an opportunity to upgrade your equipment controls and save up to 40 percent on your energy bills, with a rapid payback that energy utility rebates can make even better.

New controls – like a thermostat in a zoned house – give users more information and greater control than older versions. Controls can be networked together to prevent units from “fighting” each other – one heating, the other cooling. Small sensors placed strategically throughout a data center make the cooling process work smoothly. These sensors, generally part of a control upgrade, allow data centers to automatically optimize temperatures and airflow in different parts of a room and to isolate potential trouble quickly.
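As a loose illustration of how networking keeps units from fighting, consider a sketch in which every unit acts on a shared average of sensor readings rather than its own local probe. The set point, deadband, and coordination logic here are illustrative assumptions, not any particular vendor’s control scheme.

```python
# Hypothetical sketch: networked cooling units act on a shared average of
# sensor readings rather than each unit's local probe, so one unit never
# heats or humidifies to "correct" a neighbor that is actively cooling.
# Set point and deadband values are illustrative only.

SETPOINT_F = 80.0
DEADBAND_F = 2.0

def plan_unit_actions(sensor_temps_f, unit_count):
    """Choose one coordinated action for every unit from the aggregate."""
    avg = sum(sensor_temps_f) / len(sensor_temps_f)
    if avg > SETPOINT_F + DEADBAND_F:
        return ["cool"] * unit_count   # all units pull the same direction
    if avg < SETPOINT_F - DEADBAND_F:
        return ["idle"] * unit_count   # nobody reheats the room
    return ["hold"] * unit_count

print(plan_unit_actions([82.0, 84.5, 83.5], unit_count=3))
# ['cool', 'cool', 'cool']
```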

New data center smart technologies pay for themselves quickly through lower energy costs. How much? When new controls and variable capacity components are added to the operational tweaks described above, cooling power consumption in a typical enterprise data center with 500 kW of IT load drops by more than 50 percent, from 380 kW to 184 kW. That is $171,690 in annual energy savings at $0.10 per kilowatt-hour, and it drops the PUE from 1.76 to 1.37. That’s worth looking into.
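
The arithmetic behind those numbers can be reproduced directly. All inputs come from the paragraph above, with one simplifying assumption implied by the article’s PUE figures: total facility power is counted as IT load plus cooling.

```python
# Reproducing the article's savings and PUE arithmetic for a 500 kW IT load.

IT_LOAD_KW = 500
COOLING_BEFORE_KW = 380
COOLING_AFTER_KW = 184
RATE_PER_KWH = 0.10
HOURS_PER_YEAR = 8760

saved_kw = COOLING_BEFORE_KW - COOLING_AFTER_KW
annual_savings = saved_kw * HOURS_PER_YEAR * RATE_PER_KWH

# Simplified PUE that counts only IT plus cooling load, matching the article.
pue_before = (IT_LOAD_KW + COOLING_BEFORE_KW) / IT_LOAD_KW
pue_after = (IT_LOAD_KW + COOLING_AFTER_KW) / IT_LOAD_KW

print(f"${annual_savings:,.0f} per year")          # $171,696 per year
print(f"PUE {pue_before:.2f} -> {pue_after:.2f}")  # PUE 1.76 -> 1.37
```

(The article rounds the annual figure to $171,690.)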

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
