
It's That Time of Year: How to Prepare Your Data Center for 2014

How do you optimize your environment for 2014? Given the continued focus on improving computer room cooling efficiency, and given that the typical data center today has cooling capacity nearly four times its IT load, data centers could reduce their operating expense by an average of $32,000 annually simply by improving airflow management, writes Lars Strong of Upsite.

Lars Strong, senior engineer and a recognized thought leader on data center optimization, leads Upsite Technologies' EnergyLok Cooling Science Services, which originated in 2001 to optimize data center operations. He is a certified US Department of Energy Data Center Energy Practitioner (DCEP) HVAC Specialist.

LARS STRONG
Upsite Technologies

As 2013 draws to a close, demands remain as high as ever for increased data center efficiency, capacity, and reliability. Given this focus on improving computer room cooling efficiency, and given that the typical data center today has cooling capacity nearly four times its IT load, data centers could reduce their operating expense by an average of $32,000 annually simply by improving airflow management (AFM). In addition, the stranded cooling capacity released by these improvements can defer the capital expenditure that would otherwise be required to add cooling units or build a new data center. Moreover, releasing stranded capacity makes it possible to increase computer room density and reduce carbon emissions.

Upsite Technologies' recent research of 45 data center sites reveals there is much room for improvement. Because of poor AFM, nearly half of the conditioned air in these data centers escapes through unsealed cable openings and misplaced perforated tiles. Openings like these force you to run more fans to deliver vital conditioned air to your heat load. This state of cooling inefficiency is a prime example of bypass airflow, which is any conditioned air supplied by a cooling unit that does not pass through (bypasses) IT equipment before returning to a cooling unit. Cable openings in a raised floor and excessive volumes of cold air delivered to a cold aisle are two principal sources of bypass airflow.

As Power Usage Effectiveness (PUE) analysis reveals, the cooling infrastructure is the largest consumer of power in a data center aside from the IT load itself, and AFM remains the easiest and lowest-cost way to improve cooling infrastructure efficiency and capacity. However, even if your site improves AFM and does it well, those gains can erode over time, and some infrastructure components require performance validation that is often not part of a standard maintenance agreement.
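
PUE itself is simply total facility power divided by IT power. As a point of reference, the sketch below shows that calculation, along with the cooling share of the non-IT overhead, using purely hypothetical readings; the figures are illustrative assumptions, not measurements from any particular site.

```python
# Minimal sketch: PUE and the cooling share of facility overhead.
# All readings below are hypothetical examples, not measured values.

it_load_kw = 500.0   # power drawn by IT equipment
cooling_kw = 350.0   # power drawn by cooling units, pumps, and fans
other_kw = 100.0     # lighting, UPS losses, etc.

total_facility_kw = it_load_kw + cooling_kw + other_kw
pue = total_facility_kw / it_load_kw                        # PUE = total / IT
cooling_share = cooling_kw / (total_facility_kw - it_load_kw)

print(f"PUE: {pue:.2f}")                                    # 1.90 in this example
print(f"Cooling share of non-IT power: {cooling_share:.0%}")  # ~78%
```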

Is Your Site Calculating Key AFM Metrics Monthly?

You need to calculate your key AFM metrics monthly and analyze them annually for trends and capacity planning. Key AFM metrics include:

  • Cooling Capacity Factor (CCF) – The ratio of total running manufacturer’s rated cooling capacity to 110 percent of the critical load. Ten percent is added to the critical load to estimate the additional heat load of lights, people, etc. (A worked calculation appears in the sketch after this list.)
  • Perforated tile and grate placement – Perforated tiles and grates should only be located in front of equipment that requires conditioned air for cooling. The percentage of properly located perforated tiles and grates should be 100 percent. Place perforated tiles and grates to make all IT equipment intake air temperatures as low and even as possible. Replace all perforated tiles and grates located in dedicated hot aisles and open spaces with solid tiles.
  • IT equipment intake temperatures – The primary purpose of a computer room is to provide a stable and appropriate intake air temperature for IT equipment. As such, computer rooms are in either of two categories, those with and those without intake air temperature problems.
    Of the 45 sites that Upsite researched, 20 percent of cabinets on average had hot spots and 35 percent had cold spots. ASHRAE's recommended IT equipment intake air temperature range is 64.4°F (18°C) to 80.6°F (27°C). The percentage of cabinets with intake temperatures outside of the ASHRAE recommended range should be 0 percent.
  • Raised floor open area percentage – Raised floor bypass open area is made up of unsealed cable openings and penetrations, and perforated tiles placed in hot aisles or open areas. The percentage of raised floor bypass open area is calculated by dividing the total bypass open area by the total open area in the raised floor. The percentage of bypass open area should be less than 10 percent.
  • Blanking panel utilization – Install blanking panels that seal effectively, with no gaps between panels, in all open U spaces within cabinets. Seal the spaces between cabinets and under cabinets to retain conditioned air at the IT equipment face and to prevent hot exhaust air from flowing into the cold aisle. The percentage of open U spaces filled with blanking panels should be 100 percent. In short, close all open space in the vertical plane of the IT equipment intakes: install blanking panels, seal under cabinets, and seal between the mounting rails and the sides of cabinets.
  • Rack space utilization – Rack space utilization indicates how well the valuable floor space of a computer room is being used. Cooling capacity and planning are closely related to rack space utilization.
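
Two of these metrics, the CCF and the raised floor bypass open area percentage, reduce to simple ratios. The sketch below works them with hypothetical numbers; the capacities, loads, and areas shown are illustrative assumptions, not figures from Upsite's research.

```python
# Minimal sketch of two of the monthly AFM metrics described above.
# All input numbers are hypothetical placeholders.

# Cooling Capacity Factor (CCF):
# total running rated cooling capacity / (critical load * 1.10)
running_rated_cooling_kw = 1400.0   # sum of rated capacity of running cooling units
critical_load_kw = 350.0            # measured IT (critical) load
ccf = running_rated_cooling_kw / (critical_load_kw * 1.10)
print(f"CCF: {ccf:.2f}")            # ~3.6: far more cooling running than the load needs

# Raised floor bypass open area percentage:
# bypass open area / total open area in the raised floor
unsealed_cable_openings_sqin = 1800.0
misplaced_perf_tile_area_sqin = 2400.0    # perforated tiles in hot aisles or open areas
properly_placed_open_area_sqin = 30000.0  # perforated tiles and grates in cold aisles
bypass_open_area = unsealed_cable_openings_sqin + misplaced_perf_tile_area_sqin
total_open_area = bypass_open_area + properly_placed_open_area_sqin
bypass_pct = bypass_open_area / total_open_area * 100
print(f"Bypass open area: {bypass_pct:.1f}% (target: under 10%)")
```
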
Equipment Performance Validation

Another key aspect of your overall AFM improvement strategy is to regularly validate the performance of your cooling units:

  • Return air temperatures vs. standard rated conditions - Manufacturers rate their cooling units at standard return-air conditions, typically 75°F with 45 percent relative humidity (RH). However, since most sites run their cooling units with set points lower than standard conditions, the rated capacity cannot be delivered. This results in the very costly condition of more cooling units running, because a cooling unit's capacity decreases at lower return-air temperatures. For example, a common 20-ton (70 kW) cooling unit has 20 tons (70 kW) of total capacity at a 75°F return-air temperature and 45 percent RH, but at a 70°F return-air temperature and 48 percent RH the same unit has a sensible cooling capacity of only 17 tons (59.7 kW). (The arithmetic is worked in the short sketch after this list.)
  • Presence of latent cooling - In some configurations, high relative humidity (RH) can result in condensation forming on cooling unit coils (i.e., latent cooling). Moisture condensing on the coils gives off heat that consumes some of a cooling unit's capacity, stranding capacity that could otherwise be used to lower the temperature of the supply air to IT equipment.
  • Calibration of cooling unit return-air temperature and relative humidity sensors - To accurately assess cooling unit return-air temperatures and latent cooling conditions, ensure that you regularly calibrate all cooling unit return-air temperature and relative humidity (RH) sensors.
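
The capacity loss described in the first bullet is straightforward arithmetic. The sketch below uses only the 20-ton and 17-ton figures quoted above; everything else follows from them.

```python
# The derating example from the bullet above, worked as arithmetic.
rated_capacity_tons = 20.0    # at 75°F return air, 45% RH (standard rating conditions)
derated_capacity_tons = 17.0  # at 70°F return air, 48% RH (a typical lower set point)

stranded_tons = rated_capacity_tons - derated_capacity_tons
stranded_pct = stranded_tons / rated_capacity_tons * 100
print(f"Stranded capacity: {stranded_tons:.0f} tons ({stranded_pct:.0f}% of rating)")
# Roughly 15% of the unit's rated capacity is unavailable at the lower set point,
# which is why more units end up running to carry the same heat load.
```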

Infrared Temperature Surveys

Conducting an infrared (IR) temperature survey will help confirm that previously resolved thermal management issues stay resolved, as well as identify any new issues that arise.

  1. Use an infrared thermometer to measure the intake air temperatures. If they are all cool and the ceiling is cool, then more conditioned air is being delivered to the aisle than needed.
  2. Remove some perforated tiles and measure the intake air temperatures again.
  3. Repeat Step 2 until the intake air temperatures start to increase, then add tiles back until the problem is resolved. This process establishes the optimum airflow for that aisle (a simple way to log these tuning passes is sketched below).
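
One simple way to keep track of the tuning passes is to log the tile count and intake readings for each pass and flag when the aisle crosses your limit. The sketch below does this with hypothetical readings; the 80.6°F threshold is the top of the ASHRAE recommended range cited earlier.

```python
# Minimal sketch: track IR intake readings per tuning pass for one cold aisle.
# Readings are hypothetical; 80.6°F is the top of the ASHRAE recommended range.

ASHRAE_MAX_INTAKE_F = 80.6

# Each pass: (perforated tiles left in the aisle, intake temps read with the IR thermometer)
passes = [
    (10, [66.0, 65.5, 67.0, 66.5]),  # baseline: everything cool -> aisle is oversupplied
    (8,  [68.0, 67.5, 69.0, 68.0]),  # two tiles removed, intakes still comfortably cool
    (6,  [72.0, 74.5, 81.5, 73.0]),  # temperatures start to climb past the limit
]

for tiles, temps in passes:
    hottest = max(temps)
    status = "OK" if hottest <= ASHRAE_MAX_INTAKE_F else "ADD TILES BACK"
    print(f"{tiles} tiles: hottest intake {hottest:.1f}°F -> {status}")
```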

Keep Fine Tuning

A computer room is a dynamic environment, so it's unrealistic to expect these key AFM metrics not to drift over time. Closely tracking each one will help ensure that your cooling infrastructure operates at maximum capacity, maximum reliability, and the lowest operating cost (and best PUE) in 2014.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
