
Are You Suffering from PUE Envy?

You are not alone if you’re experiencing a touch of PUE or budget envy, writes Tom Roberts of AFCOM. Slashdot.org recently reported “green fatigue” among data center managers who were tiring of the constant PUE chase.

Tom Roberts is President of AFCOM, the leading association supporting the educational and professional development needs of data center professionals around the globe.

TOM ROBERTS
AFCOM

It’s kind of become an “our PUE is less than your PUE” world as companies battle it out for green data center efficiency bragging rights.

Of course, data centers with the most resources—financial, natural and manpower—have an advantage.

Here are some recent examples:

Yahoo! spent close to $200 million on its data center in Lockport, NY, which boasts a PUE of 1.07. Carbon-free hydroelectric power generated at Niagara Falls feeds its servers.

Google, with an average PUE of 1.12 for its data centers, just purchased a $200 million stake in a wind farm in west Texas to add to an already impressive green portfolio that includes offshore wind power and solar. This brings the Internet giant’s total investments in alternative energy to more than $1 billion.

Apple spent about $1 billion to build iDataCenter, its first data center facility in Maiden, North Carolina. With two massive solar arrays and a nearby fuel cell farm, it also manages a PUE of roughly 1.1.
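As a reminder of what those numbers mean, PUE (Power Usage Effectiveness) is simply total facility power divided by the power delivered to IT equipment. A minimal sketch, using illustrative numbers rather than figures from any of the facilities above:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt entering the building reaches the
    IT gear; everything above 1.0 is overhead (cooling, power conversion,
    lighting, and so on).
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: a 1,000 kW IT load drawing 1,070 kW at the meter
print(round(pue(1070, 1000), 2))  # 1.07
```

The closer the ratio gets to 1.0, the less energy is spent on anything other than compute, which is why a 1.07 is worth bragging about.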

You are not alone if you’re experiencing a touch of PUE or budget envy after reading those examples.

Let's Get Real

For most of us, building data centers next to magnificent rivers or buying huge chunks of real estate to place solar arrays is just pie-in-the-sky thinking. Our solutions must be much more grounded.

In fact, Data Center World keynote speaker Brian Janous from Microsoft addressed this very real-world frustration. At the end of his talk, someone asked him: “It is great that (Microsoft) can experiment in using other sources of fuel to power its centers, but how do we (smaller data centers) benefit from that?”

Well, because the Microsofts of the world can experiment with new ideas and renewable fuel sources, it takes the pressure off us to determine what is viable and what is not. If they find their experiment did not work as planned, they learn from it and try something new. Their experimentation turns into our future implementations.

Practical Lessons from the Megascale Projects

While we may not be able to match the scope of what these corporate giants achieve, we certainly can apply the lessons that make practical sense in our data centers. For example, you can thank the larger data centers for “discovering” the use of outside air and evaporative cooling to lower temperatures as well as establishing a safe threshold for raising them. Just choose the projects that make the most sense now, for your specific situation and budget.

Research firm Gartner suggests keeping these guidelines in mind:

  • More exotic projects, like alternative energy and green building design, may take a decade or longer.
  • Five-year paybacks are probable for projects that attempt to change employee behavior, and for lifecycle management programs and green legislative initiatives.
  • Two-year paybacks are possible for efficient facility designs, advanced cooling, processor and server designs, and heating and power issues.
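Those payback horizons come down to simple arithmetic: upfront cost divided by annual savings. A minimal sketch for ranking candidate projects, with all names and numbers hypothetical:

```python
def payback_years(upfront_cost: float, annual_savings: float) -> float:
    """Simple (undiscounted) payback period in years."""
    return upfront_cost / annual_savings

# Hypothetical candidates: (project, upfront cost $, annual energy savings $)
projects = [
    ("Hot-aisle containment", 40_000, 22_000),
    ("Airside economizer retrofit", 150_000, 35_000),
    ("Rooftop solar array", 500_000, 45_000),
]

# Shortest payback first -- the "do now" list floats to the top
for name, cost, savings in sorted(projects, key=lambda p: p[1] / p[2]):
    print(f"{name}: {payback_years(cost, savings):.1f} years")
```

Ranked this way, the quick containment and cooling projects land in Gartner's two-to-five-year band, while the alternative-energy project sits out past a decade, exactly the pattern the guidelines describe.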

Try these five methods (from AFCOM’s Communique newsletter) to make quick, cost-effective improvements in your energy usage:

1. Place Power Distribution Units (PDUs) in wider, warmer aisles in chimney-ducted rooms. Since they don’t need the low temperatures that cold aisles offer, PDUs shouldn’t be using up the colder air that other equipment requires.

2. Don’t use doors on the cold-aisle sides of cabinets. Fans generate heat during regular operation, and the easier time they have drawing air through cabinet door perforations, the less heat they produce, saving on required cooling.

3. Humidify using a combination of partial extraction of the warm return air, an atomizing spray of water, and a supply of air 10 degrees cooler delivered in back, overhead, and directly into the cross aisles. The natural vapor pressure will keep the area humidified.

4. Align cabinets with hot sides on the other side of a demising wall that forms the perimeter of the data center. Use a second concentric demising perimeter external to the first demising wall to form a security barrier and a warm air collection point for drawing heat in winter to warm office spaces. Finally, supply cooled air directly below exterior windows in an office area back to the interior of the computer room or NOC to assure a complete airflow/air replenishment circuit.

5. Use custom break-away ductwork in a chimney-ducted room. Affix the ductwork to the back doors of cabinets that vent only horizontally, allowing hot air to redirect up into the chimney when the back door is closed. The result: the warm aisle stays cooler, with less chance of warm air mixing with cold air in the data room.

Follow the Leaders

The challenges of lowering energy costs are here to stay. Whether you’re in a position to take on massive projects or nip away at smaller ones, keep your eyes and ears open for the next great PUE-lowering strategy from the leaders in this arena. You—and the next generation of data centers—are bound to get something out of it.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
