Energy Efficiency in Today’s Data Center

Ron Vokoun, DBIA, LEED AP BD+C, leads the Mission Critical Market for JE Dunn Construction. He was previously Director of Mission Critical for Gray Construction and also served in leadership roles with Qwest Communications and Aerie Networks. You can find him on Twitter at @RonVokoun.

RON VOKOUN
JE Dunn

At the end of my last column on renewable energy in today's data center, I stated that energy efficiency should always be addressed before implementing renewable energy. The reason is that energy efficiency is where sustainability pays for itself, with a return on investment (ROI) measured in weeks or months rather than years. Also, using less energy overall means less renewable energy will be required to power your data center.

In this column, I will elaborate on measuring and improving efficiency in energy, cooling, and energy-related water use.

Key Areas to Examine for Efficiency

Struggling with where to start your energy efficiency efforts? Look to these four key areas for improvement.

  • Cooling: Typically the lowest-hanging fruit.
  • Water: Don't overlook water use, both because of its scarcity in certain areas and because water is closely tied to energy.
  • Electrical Design: Recent engineering innovations offer new, more efficient options.
  • Incentives: Rebates and utility programs help offset the cost of efficiency improvements.

Measuring Efficiency

Power Usage Effectiveness (PUE) is the most popular industry metric for measuring the energy efficiency of data centers. Today, there appears to be an arms race for the lowest PUE.  Even if you aren’t one of the select few with the operational flexibility to participate, you can measure your PUE and work to improve efficiency relative to your own data center site. The industry group, The Green Grid, has many resources available on PUE.
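
For reference, PUE is simply total facility energy divided by IT equipment energy. A minimal sketch, with illustrative numbers, shows the math:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.
    A PUE of 1.0 would mean every kWh delivered goes to the IT load."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative numbers only: a facility drawing 1,500 MWh/yr to support
# 1,000 MWh/yr of IT load has a PUE of 1.5.
print(pue(1_500_000, 1_000_000))
```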

Green Cooling Techniques

The latest version of ASHRAE TC 9.9's thermal guidelines drastically expanded the recommended and allowable temperature and humidity ranges with the approval of the major server manufacturers. It is estimated that an energy savings of 2-4% can be realized for each degree Celsius the temperature is raised in a data center. Raising the temperature would seem to be low-hanging fruit, but I have seen very few operators do it to date.
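
As a rough illustration of that rule of thumb, here is a sketch that assumes 3% per degree (the middle of the cited range) and compounds the effect over a setpoint increase:

```python
def cooling_savings(degrees_c_raised: float, savings_per_degree: float = 0.03) -> float:
    """Estimated fractional cooling-energy savings from raising the setpoint,
    compounding the per-degree rule of thumb cited above (3%/C assumed)."""
    return 1 - (1 - savings_per_degree) ** degrees_c_raised

# Raising the setpoint 4 C at an assumed 3% per degree saves roughly 11.5%.
print(f"{cooling_savings(4):.1%}")
```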

Another undisputed, easy and inexpensive energy saver is hot or cold aisle containment. Preventing the mixing of cold and hot air results in a higher return air temperature that yields an increased efficiency of the cooling system. Many systems exist ranging from hard containment systems to simple refrigerator curtains that you might see in a meat locker. Have a limited budget? Hot or cold aisle containment provides a compelling financial argument for adoption.

Free cooling, via either air-side or water-side economization, is now a critical consideration. The expanded temperature and humidity ranges in TC 9.9 make free cooling feasible for a large part of the year in nearly any location, and it should be evaluated when designing a new data center or expanding an existing facility.
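
A quick way to gauge feasibility at your site is to count the hours in a typical weather year when outside air alone can meet your supply-air limit. A minimal sketch, assuming a 75°F limit loosely based on the expanded ranges:

```python
def free_cooling_hours(hourly_temps_f: list[float], max_supply_f: float = 75.0) -> int:
    """Count the hours when outside air alone meets the supply-air limit.
    The 75 F limit is an assumption loosely based on the expanded ranges."""
    return sum(1 for t in hourly_temps_f if t <= max_supply_f)

# Toy data standing in for a site's 8,760 hourly dry-bulb readings:
sample = [55, 62, 70, 78, 85, 74, 68, 60]
print(free_cooling_hours(sample) / len(sample))  # 0.75 -> 75% free-cooling hours
```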

Liquid cooling has been discussed a great deal of late, as liquids are far more efficient than air at removing heat. The immersion approach requires some modification of the server so that it can be submerged in the dielectric liquid, but studies have shown positive results.

Evaporative cooling is another energy efficient technique, especially applicable in dry climates. However, evaporative cooling often sparks a debate over the use of additional water, especially in water-constrained areas.

Water Efficiency

Water is a topic that is gaining increased attention and will continue to do so in the future. I once heard a “futurist” say that “water is the new oil.” In evaluating evaporative and other cooling techniques, many (myself included) have made the mistake of evaluating only the amount of water used in the respective cooling systems.

In order to determine the complete hydro-footprint of a system, you must also look at energy usage and how much water is used in the production of that energy. The National Renewable Energy Lab (NREL) published a study that analyzed how much water is used in the production of power per kilowatt-hour on a state-by-state basis. While not perfect, it provides a basis for analysis from an authoritative source. After taking the amount of water used in the production of energy into account in a particular geography, evaporative cooling can have a smaller hydro-footprint (use less total water) than a chilled water system due to the amount of energy saved.
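
The accounting itself is simple. A minimal sketch with hypothetical figures (the real inputs are your metered water and energy use plus the state-level liters-per-kWh factor from a source like the NREL study):

```python
def total_water_liters(direct_water_l: float, energy_kwh: float,
                       water_per_kwh_l: float) -> float:
    """Total hydro-footprint: on-site water use plus the water embedded in
    the electricity consumed (a state-level L/kWh factor, e.g. from NREL)."""
    return direct_water_l + energy_kwh * water_per_kwh_l

# Hypothetical annual figures for two cooling options at the same site:
evaporative = total_water_liters(direct_water_l=8_000_000,
                                 energy_kwh=2_000_000, water_per_kwh_l=2.0)
chilled = total_water_liters(direct_water_l=1_000_000,
                             energy_kwh=6_000_000, water_per_kwh_l=2.0)
print(f"{evaporative:,.0f} vs {chilled:,.0f} L")  # evaporative can use less in total
```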

There have been a few recent projects that use either sea water or ground water for cooling, which is very efficient because it eliminates the need for much of the conventional cooling equipment. A site in central Nebraska is pursuing this tactic, using irrigation wells with a flow of 1,000 gallons per minute (GPM) at 52°F as the source of groundwater for cooling and re-injecting the water back into the aquifer. This is not only very energy efficient, but uses little to no water for cooling, saving on both capital expenditures (CAPEX) and operating expenditures (OPEX) through the elimination of much of the cooling equipment. The net impact addresses both the energy and water sides of the equation for a very efficient, and therefore sustainable, cooling solution.
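
For a sense of scale, the heat a loop like that can reject follows from the flow rate and the temperature rise. A rough sketch, where the 1,000 GPM at 52°F comes from the project and the 72°F return temperature is my assumption for illustration:

```python
# Rough heat-rejection capacity of a groundwater loop like the one above.
# The 1,000 GPM at 52 F flow comes from the article; the 72 F return
# temperature (a 20 F rise) is an assumption for illustration.
GPM = 1_000
FLOW_KG_S = GPM * 3.785 / 60      # ~63 kg/s (1 gallon of water ~ 3.785 kg)
CP_WATER = 4186                   # specific heat of water, J/(kg*K)
DELTA_T_K = (72 - 52) * 5 / 9     # a 20 F rise is ~11.1 K

q_watts = FLOW_KG_S * CP_WATER * DELTA_T_K
print(f"{q_watts / 1e6:.1f} MW of heat rejection")  # ~2.9 MW
```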

Highly Efficient Electrical Solutions

Major efficiency gains have been made in recent years in electrical equipment that can improve your data center's PUE. Multiple manufacturers now offer UPS units that reach efficiencies of 96-98 percent at less than 50 percent load. This matters if you utilize A and B feeds to your equipment for redundancy, because each feed then typically carries no more than half of the load.
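
To see why partial-load efficiency matters, compare annual UPS losses on one leg of an A/B pair. A sketch with illustrative efficiency figures, not vendor data:

```python
def annual_loss_kwh(it_load_kw: float, efficiency: float, hours: float = 8760) -> float:
    """kWh lost in the UPS per year for a given IT load and efficiency."""
    input_kw = it_load_kw / efficiency
    return (input_kw - it_load_kw) * hours

# A 400 kW IT load on one leg of an A/B pair (illustrative numbers):
legacy = annual_loss_kwh(400, 0.90)  # older UPS, ~90% efficient at partial load
modern = annual_loss_kwh(400, 0.97)  # newer UPS, ~97% efficient at partial load
print(f"saves {legacy - modern:,.0f} kWh/yr")  # ~281,000 kWh/yr
```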

Another trend is for the UPS to operate in an eco (bypass) mode, which eliminates double-conversion losses. Many operators are not yet comfortable with this mode of operation, but it is another efficiency gain to consider in optimizing performance. Higher-voltage distribution and DC power are also evolving trends that offer efficiency gains worth mentioning.

Energy Efficiency Incentives and Rebates

Whether designing a new, energy efficient data center or upgrading your existing facility, there are many incentives available to help defray the cost and improve your ROI.

Power companies commonly provide incentives based on your performance compared to a baseline building or a baseline piece of equipment. Involve the power company as early in the design phase as possible to maximize the financial benefits; some utilities require approval of the incentive prior to ordering the equipment.

Additional Considerations

There are additional considerations beyond those mentioned above in optimizing your mission critical facility’s efficiency.

  • System modularity is an accepted practice that affects efficiency. Implementing modular and rapidly expandable designs in lieu of installing full density on day one typically results in higher efficiency through higher equipment utilization. This saves on CAPEX and OPEX, making for a smart business decision.
  • Cogeneration, also known as combined heat and power (CHP), has gained in popularity and can be as high as 60-80% efficient compared to the typical 30% efficiency of normal power plants.
  • Peak power shaving can also be achieved through thermal storage. This is done by creating ice at night when power rates are lower and utilizing the ice for cooling during the day, as the sketch after this list illustrates.
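
Here is a minimal sketch of the ice-storage arithmetic, with hypothetical rates and an assumed 10 percent efficiency penalty for making ice:

```python
def ice_storage_savings(cooling_kwh_day: float, peak_rate: float,
                        off_peak_rate: float, ice_penalty: float = 0.9) -> float:
    """Daily cost delta from making ice at night versus cooling directly
    at peak rates. Ice-making is assumed ~10% less efficient here."""
    direct_cost = cooling_kwh_day * peak_rate
    shifted_cost = (cooling_kwh_day / ice_penalty) * off_peak_rate
    return direct_cost - shifted_cost

# Hypothetical: 20,000 kWh/day of cooling energy, $0.14/kWh peak, $0.06 off-peak.
print(f"${ice_storage_savings(20_000, 0.14, 0.06):,.0f}/day")  # ~$1,467/day
```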

Measure, Improve, Monitor and Repeat

Regardless of the energy efficiency measures you select for your new or existing data center, make sure you measure your initial or existing condition so you have a baseline. After your improvements are made, measure again to determine your new condition and your ROI. In the case of a new data center, perform a total cost of ownership (TCO) analysis to guide your decisions. Continue to monitor your efficiency and make improvements to lower your PUE relative to your initial condition. As a reminder, measuring your outcomes against those in the industry under different operating conditions may not provide an apples-to-apples comparison.
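
The payback math that falls out of those before-and-after measurements is straightforward. A sketch with hypothetical figures:

```python
def simple_payback_months(baseline_kwh_yr: float, new_kwh_yr: float,
                          cost_per_kwh: float, project_cost: float) -> float:
    """Months to recoup an efficiency project from measured before/after energy use."""
    annual_savings = (baseline_kwh_yr - new_kwh_yr) * cost_per_kwh
    return project_cost / annual_savings * 12

# Hypothetical containment retrofit: 10M -> 9.2M kWh/yr at $0.08/kWh, $50k installed.
print(f"{simple_payback_months(10_000_000, 9_200_000, 0.08, 50_000):.1f} months")  # ~9.4
```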

In closing, remember that energy efficiency is always more cost effective than renewable energy in your mission-critical facility. My next column will look at the pros and cons of the USGBC’s LEED and the EPA’s Energy Star for Data Center programs.

7 Comments

  1. Ron - You are quite right about the water issue, but in my opinion, until an offsetting "Water Credit" can be given to offset Carbon Credits, water conservation in a data center environment will never be given proper consideration. In many of the data facilities I have performed major upgrades on, I have recommended to the client that they switch from your typical chilled water system using cooling towers to high-efficiency air-cooled chillers. In a significant number of sites, the economics come out very favorable for the air-cooled systems.

     In one site here in Northern VA, we did a study on a 100,000 sq. ft. facility with 30MW of power capacity. Ultimately, once we considered the cost of water ($4.50/thousand gallons) and the cost to treat the water, that data center would have saved over $2M per year in operating costs plus $6M in initial CAPEX, as the local water company wanted $2M per 8" water line and the location needed three such connections. Unfortunately, by the time we were asked to look at the project, they had already purchased the water-cooled chillers and cooling towers.

     Because water is more valuable in some regions and less so in others, it is hard to determine its overall value to the environment the way we do carbon footprints. In Arizona, for example, there is always a water shortage. The Colorado River has shrunk so much that by the time it reaches Yuma County, there is hardly anything left for irrigation. The Colorado River provides Phoenix with 36 percent of its water and the Salt River provides 54 percent. Both can be mere trickles after they leave the Phoenix area, especially in drought years, which occur more and more frequently.

     Now, if there was a "Water Credit" in Phoenix that was as attractive as "Carbon Credits"... How many data centers do you think would gladly use 10% more power to completely eliminate their water consumption? I think they all would! Let's not forget that the Palo Verde Nuclear Power Plant supplies all of southern Arizona and a large portion of Southern CA with zero-carbon-footprint power, and that supplemental power can be taken from the power plant at Hoover Dam, which is also green energy. So in a city that can have a zero carbon footprint, I think a water "footprint" should be just as important, and companies operating huge cooling plants in the middle of the water-stricken desert should be given incentives to get off the "Water Grid" and save precious water as well.

  2. Ron, Great point about the hydro footprint. We have seen several large clients take advantage of the opportunity to reduce not only their on-site utility costs by applying compressor-free cooling solutions, but to do it in an environmentally responsible way by reducing their overall hydro footprint.

  3. Ron, Completely agree that energy efficiency is the first step to an efficient data center. Efficiency applies to every kind of energy: solar, wind, coal... Efficiency is often referred to as the "fifth fuel." Even if the power comes at a low cost, using power to its highest potential not only makes sense, it makes cents! Dollars even! And as the cost of energy continues to rise, efficiency will only become more of a necessity for data centers, and everything else that draws power. As another poster said, this approach can have an ROI in months, not years.

  4. Ron - absolutely, it is more important to determine baseline measurements in your data center, so you know if your actions are having a positive effect (lowering your PUE) and can calculate your ROI, than to compare your PUE to the other guy's. Deploying wireless sensors is a low-cost way to get accurate and reliable measurements so you can gauge the ongoing effectiveness of your cooling operations.

  5. Ron - good article. One clarification on liquid and evaporative cooling: refrigerant-based cooling is technically liquid cooling and does not require server modification, as the submersion form of liquid cooling does. Refrigerant-based cooling also fits in the evaporative cooling category, with heat exchangers at the back of the rack. Liquid cooling fits any data center requirement and offers significant energy savings. I recommend the recent research from Henrique Cecci, Research Director at Gartner, "Plan Now for Liquid Cooling in Your Data Center," March 6, 2012.

  6. Raj Kapoor P.E.

     Ron: Fans and pumps follow the cube law of power for the air and water they provide and/or circulate in the data center. Ambient climatological conditions vary over 24 hours; in Texas in summer, say, from 50°F to 102°F or even more in some places. The load on the data center equipment also varies, from 15% to 50% in a Tier IV data center and from 30% to 95% in Tier I and Tier II data centers. Really good controls that adjust air and water volumes using VFDs, as well as modulating chillers with full-throttle use of the cooling tower to the extent allowed by the chiller manufacturer, can result in lots of energy savings. From a 1992 data center to a present-day data center, if we are not using less than half the energy (not comparing to Holland), then we need to look at it closely. We need to benchmark each data center and keep it in the top 10% for lowest energy use in the area. Free water-side or air-side cooling, or non-refrigerated cooling, should be the first thing to look at. Containment of aisles and use of the 6SigmaDC program to vary air and water quantity as a 24x7 operational tool with ambient temperature input is the best option to reduce PUE. I can take any data center, audit it, and bring it into the top 10% of data centers in the region for lowest energy use.