(Photo by Sean Gallup/Getty Images)

One Data Center Standard to Rule Them All?

Mehdi Paryavi says people from every walk of data center life he’s met over the years call him for advice, ranging from operations staff to senior-level execs. “I have chiller technicians call me that know me from 15 years ago,” he says, adding that he’s as likely to get a call from a facilities manager as from someone configuring core switches and routers.

Paryavi says he started his career as a management and information systems engineer, became an IT manager, then learned about things like power and cooling, and eventually became a businessman. He declines to name the companies he worked for in those roles, however. “Honestly, I don’t want to get into that stuff,” he says.

He also declines to name customers of his data center consultancy, TechXact, which he co-founded in 2002. “Almost anybody you think about has been a customer of ours at some point,” he says. “I don’t want to name any customers. We have a boutique data center services company, and we don’t disclose references.” More often than not, companies like to keep their data center projects secret, and it’s common for contractors they hire for those projects to be bound by non-disclosure agreements.

Several years ago, TechXact founded an organization called International Data Center Authority to develop a standard that would give companies a way to assess the performance of their IT infrastructure, starting with power and cooling infrastructure and ending with software applications the infrastructure is built to support.

The technical committee IDCA put together to develop the standard consists of senior engineering and operations staff from several well-known companies, including eBay, LinkedIn, and AIG; an architecture branch chief who oversees cloud and hosting for US Courts; TechXact’s own employees; and several others. IDCA’s self-imposed deadline to deliver the standard is about four and a half months away.

Taking Aim at Tiers

On its website, IDCA in no uncertain terms attacks the Uptime Institute’s four-tier rating system for defining reliability of data center infrastructure as “outdated.” In interviews, Paryavi, who chairs the IDCA board, and Steve Hambruch, a data center architect at eBay who chairs its standards committee, explain that the Tier system is one of too many standards used in the data center industry, each of them addressing an individual component of the ecosystem without regard for the ecosystem as a whole.

Uptime, now owned by The 451 Group, developed the tier system and reserves the right to be the only organization that can certify facilities and assign them tier levels. A data center operator can pay hundreds of thousands of dollars and spend weeks on the certification process, which involves working with a team of Uptime experts onsite. Uptime has certified more than 700 data centers around the world, a count that includes design-document certifications and constructed-facility certifications, which are separate.

Read more: Explaining the Uptime Institute’s Tier Classification System

The problem, according to IDCA, is that the tier system focuses squarely on reliability of facilities infrastructure. “A data center is not a cooling center; it’s not a power center; it’s not a bunch of walls,” Paryavi says. It’s a good standard for what it is, he says – “Uptime has done a good job for their own space” – but it’s not enough if you want to truly rate how well your data center will do the job it was built to do.

“It’s not our mission to replace existing, well-established standards,” Hambruch says. “It doesn’t mean we don’t recognize where gaps exist.” It’s not enough to rate the level of redundancy in your power and cooling infrastructure using a tier rating or determine the efficiency of your facilities infrastructure using Power Usage Effectiveness (PUE), he explains. “That tells me nothing about whether or not I’m getting the maximum possible compute as a result of the power and cooling I’m consuming.”

He offers an example: you can have two Tier II data centers – the second-lowest availability rating – with a failover system between them, and together they will provide a higher level of availability than either of them individually.

Addressing the entire data center ecosystem was never Uptime’s aim, as there is still a big need to address just the underlying infrastructure, Julian Kudritzki, chief operating officer of the Uptime Institute, says. “All of our standards and certifications were never meant to address absolutely everything,” he says. “We focused on what we thought the need was.”

Just recently he was involved in a data center project that had the same problems as the first tier certification Uptime ever did. Different teams were working in isolation from each other and there was no coordination even on fundamental things like the order in which design, construction, and commissioning should happen.

Certification prevents things like that from happening, Kudritzki explains. “It addresses that human foible factor,” he says. “If someone thinks that that’s still not needed, then that person doesn’t understand our business.”

In his opinion, there really isn’t a need for an all-encompassing standard like the one IDCA is working on. “I don’t think that rolling it all up into one giant hairball is the way to do it. The industry needs simplicity, not further complexity and splintering.”

The Infinity Paradigm

To guide its standard development process, IDCA has created a framework it calls the Infinity Paradigm. It is represented as a seven-layer pyramid, whose bottom layer is “Topology” (individual data centers and how they interact) and whose tip is “Application” (the set of software services the company needs). The middle layers cover data center location, facilities infrastructure, physical IT infrastructure, compute resources in the abstract form in which they are presented to applications, and platform, or the specific methodology for delivering applications.

IDCA’s goal is to offer companies guidance in defining the output of this entire stack for themselves (that work unit will differ from industry to industry, and possibly from company to company within each industry) and then using the standard to fine-tune the stack to maximize that output.

“When the organization can identify what their work unit is, we can begin to measure the efficiency of the overall IT operation – including the very low-level KPIs at the site layer and so forth – against that work unit,” Hambruch says.

Too Many Standards, or Not Enough?

TechXact is already advertising data center audit and training services based on IDCA’s future standard. But unlike Uptime, which doesn’t allow anyone other than its own staff to certify data centers using its rating system, IDCA isn’t planning to retain the exclusive right to audit and issue certifications based on its standard, according to Paryavi. “We’re not keeping it exclusive to us,” he says, adding that he wouldn’t mind if an Uptime consultant learned the framework and the standard and went on to audit data centers on their own.

The reason he believes IDCA has a shot at making the data center industry adopt its standard is that he sees a need for it. “We’re addressing everybody’s pain point,” he says. “I never met a person who said [they were] happy with [the standards] they have.”

See also: Data Center Design: Which Standards to Follow?

IDCA’s technical committee members are volunteers, doing it out of passion, Paryavi says. “They care about the community. We are not another bunch of consultants sitting in a room making a recipe for everybody else in the world.”

Whether or not you think IDCA is actually capable of creating a universal standard that addresses every layer of the data center, from chillers to software, along with every inter-layer dependency – and then convincing the industry to accept and adopt it – the confusion caused by the profusion of standards is a problem people in the data center industry often complain about.

Depending on who you talk to, there are either too many standards or not enough. Whether it’s possible to create a single standard that effectively and elegantly solves this problem by covering the entire ecosystem IDCA’s framework describes is a different question. There is a countdown clock on the organization’s website. Once it gets down to zero, we’ll learn what they propose as the answer.


About the Author

San Francisco-based business and technology journalist. Editor in chief at Data Center Knowledge, covering the global data center industry.
