Capacity Utilization as a C-Suite Shocker


Are C-suite executives really comparing efficiency metrics like Power Usage Effectiveness (PUE) in evaluating their data center performance? The Uptime Institute says it sees this happening, but industry observers are skeptical. “I think it’s a bit of a stretch to assume C-Level execs are even aware of PUE (let alone calling data center staff out on the carpet about it),” writes Matt Stansberry.
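For readers unfamiliar with the metric, PUE is simply the ratio of total facility energy to the energy delivered to IT equipment, with 1.0 as the theoretical ideal. A minimal sketch (the power figures are hypothetical):

```python
# Power Usage Effectiveness (PUE) = total facility power / IT equipment power.
# An ideal facility, where every watt goes to IT gear, would score 1.0.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,800 kW overall to run a 1,000 kW IT load:
print(round(pue(1800, 1000), 2))  # 1.8
```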

Other industry veterans say executives are indeed pressuring data center managers, but the metric that has been a real shocker is capacity utilization – how much mileage the company is getting out of the hardware it has already bought.

“We’re going to see the C-suite pushing back down on asset utilization and asset management,” said Jack Pouchet, Director of Energy Initiatives at Emerson Network Power’s Liebert unit. Pouchet said there is low executive awareness of server utilization rates, which are often cited as averaging between 10 and 30 percent. “You’re going to start seeing aggressive policies coming down to change that.”
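The utilization figure Pouchet cites is just the average fraction of available compute capacity actually in use across a fleet. A quick illustration (the per-server numbers are hypothetical, chosen to land in the 10 to 30 percent range cited above):

```python
# Average server utilization: mean of per-server utilization fractions (0.0-1.0).
def avg_utilization(samples: list[float]) -> float:
    return sum(samples) / len(samples)

# Ten servers, each mostly idle:
fleet = [0.12, 0.08, 0.25, 0.15, 0.30, 0.10, 0.22, 0.18, 0.09, 0.11]
print(f"{avg_utilization(fleet):.0%}")  # 16%
```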


The server capacity utilization challenge is not new, and has been a driving force in data center consolidations that use virtualization to pack more computing resources onto fewer servers. In some cases, low utilization is driven by the need to dedicate servers to different operating systems or specific applications. Virtualization can address this by allowing multiple operating systems in virtual machines to share the same hardware.
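Consolidation is, at bottom, a packing problem: take many lightly loaded servers and fit their workloads onto as few hosts as possible. A simplified first-fit sketch (the VM loads are hypothetical, expressed as fractions of one host's capacity):

```python
# First-fit consolidation: place each VM on the first host with room,
# opening a new host only when none can accommodate it.
def consolidate(vm_loads: list[float], host_capacity: float = 1.0) -> int:
    hosts = []  # remaining free capacity per host
    for load in vm_loads:
        for i, free in enumerate(hosts):
            if load <= free:
                hosts[i] -= load
                break
        else:
            hosts.append(host_capacity - load)
    return len(hosts)

# Twelve VM workloads, each 10-30% of a host, fit on three hosts:
vms = [0.25, 0.10, 0.30, 0.20, 0.15, 0.25, 0.10, 0.30, 0.20, 0.15, 0.25, 0.10]
print(consolidate(vms))  # 3
```

Real hypervisor placement is far more constrained (memory, I/O, affinity rules), but the arithmetic above is why a fleet running at 20 percent utilization can often shrink severalfold.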

Power and cooling constraints are also a major factor in low utilization rates, preventing data center managers from pursuing high-density configurations that would fill racks with gear but create “hot spots” that are difficult to cool.

Pouchet believes the current capital environment will focus additional attention on management of data center assets. With many new data centers costing $100 million or more, decisions about additional data center space will likely be accompanied by more exhaustive scrutiny of existing operations. CFOs and CEOs are less likely to support new capital spending once they figure out that just 20 percent of their existing equipment is being used.

“In the last six months, people have been taking a harder look at asset utilization,” said Pouchet, who added that retrofitting an existing facility to support higher densities is “not as hard as it used to be. There are new technologies in the cooling area you can deploy without tearing your data center down.”

About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.
