Aperture: Aging Data Centers Won’t Scale

The latest data from the Aperture Research Institute reinforces the case that data center operators need to build new data center space and do a better job of managing capacity in their existing facilities. Aperture’s survey (PDF) of more than 600 data centers worldwide found that many facilities are aging and unable to support high-density computing environments.

Thirty-eight percent of those surveyed said their newest data center was at least four years old, which Aperture said makes them less likely to be able to support the latest technologies for conserving space and energy. “The current problem is that these data centers are not well-suited to meet these demands,” said Steve Yellen, principal of the Aperture Research Institute. “It’s put the average large company in a situation where they have older data centers and new equipment.”

Are data centers built in 2003 already outmoded? Earlier this year The Gartner Group said that any data center built prior to 2000 was likely to be obsolete. In some cases, the upgrade timetable is even shorter: in June Mellon Financial Corp. announced that it would spend $70 million to expand the power and cooling infrastructure at a Pittsburgh-area data center that was less than a year old.
Data center operators and analysts may differ about the best benchmark for when to replace or upgrade a data center. But Aperture said its latest research shows that not enough companies are planning or building new facilities, setting the stage for even more data center operators to find themselves running out of power or space, or both. Aperture found that 64 percent of participants in its study were not actively building or planning new data center space.

“Data center managers are already facing day-to-day challenges on managing increasingly complex technologies in old facilities,” said Yellen. “Installing state-of-the-art equipment in an aging facility will limit the benefits that can be delivered by the new technology, and in some cases, will overload the infrastructure to the point of failure.”

Yellen believes this “gap” between business IT needs and data center capabilities will continue, placing a premium on effective management of existing space. “We feel we’re in a period about three to five years where we’ll see this gap,” said Yellen, noting that ARI research indicates up to 20 percent of users have no current data that will help determine when they’ll run out of resources.

“First you have to have a handle on where you are today,” he said. “Then you need to start tracking the trends and determine your limiting factors. When will I run out of space? When will I run out of power? Capacity management is ensuring that the data center can economically meet the demands of the business. The IT and data center groups have to be able to say how they can adjust to any of these scenarios.”
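The trend-tracking Yellen describes can be sketched as a simple linear projection. The sketch below is illustrative only, with hypothetical power readings and capacity figures (none of these numbers come from the Aperture survey): fit the average month-over-month growth in power draw, then estimate how many months of headroom remain before the facility hits its provisioned limit.

```python
# Hypothetical monthly power-draw readings (kW) for one data center zone.
readings = [310.0, 322.0, 335.0, 349.0, 360.0, 374.0]  # one reading per month
capacity_kw = 500.0  # assumed provisioned power capacity for the zone

# Simple linear trend: average month-over-month growth across the window.
growth_per_month = (readings[-1] - readings[0]) / (len(readings) - 1)

# Remaining headroom and a straight-line estimate of months until exhaustion.
headroom_kw = capacity_kw - readings[-1]
months_left = headroom_kw / growth_per_month

print(f"Growth: {growth_per_month:.1f} kW/month")
print(f"Estimated months until power capacity is exhausted: {months_left:.1f}")
```

The same arithmetic applies to the other limiting factors Yellen names (floor space, cooling): pick the metric with the fewest projected months remaining, and that is the constraint driving the build-or-upgrade decision.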

About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.