Hyperscale Cloud Platforms Changed Data Center Design and Function. Here's How

In their designs, they’ve taken a different approach to risk, reliability, and redundancy. Have their principles propagated outside their small club?

It was August 2014 when a team of Gartner analysts declared that original design manufacturers (ODMs) were having a greater influence on the implementation of data centers than traditional OEMs. The reason was the growing purchasing power of hyperscale customers: organizations looking to build big, but in modular segments that can scale rapidly with changing traffic demands.

Those observations about expenditures have since been validated by financial analysts. It makes sense that the biggest customer in a given market would set its rules, so when that customer’s preferences are motivated by design factors, those factors should take precedence in the market.

History informs us that innovation in any market starts at the premium tier and trickles down from there. “Hyperscale data centers are innovating solutions because of the sheer scale,” Joe Skorjanec, product manager at Eaton, writes in a note to Data Center Knowledge. “They have the buying power to justify custom solutions from vendors, so more often the innovation is occurring there first.”

Yet as other data center design practitioners and experts tell us, just because hyperscale is associated with “big” doesn’t mean it’s necessarily a premium tier. It is a way of implementing resource scalability on a very large scale, and it has been championed by the largest consumers of those resources. Facebook — among the world’s largest consumers of data and the resources that support it — is responsible for catalyzing the Open Compute Project, which has produced what many consider the formula for hyperscale computing.

There’s nothing about scalability as a science that lends itself, or should necessarily gravitate, to the biggest customers. Theoretically, the concept should apply to everyone.

Does it? Has hyperscale impacted the design and implementation of all data centers, everywhere in the enterprise, in the way that conventional wisdom anticipated?

“I think there’s a couple of things that are not so readily apparent, and maybe not so obvious that a lot of people talk about it,” said Yigit Bulut, partner at EYP Mission Critical Facilities. “One of the things most notable from my perspective is the whole approach to risk, reliability and redundancy, as it applies to the data center infrastructure and design. Hyperscale, just by necessity of economics, scale, and fast deployment, really has challenged on multiple levels this whole notion that every data center has to be reliable and concurrently maintainable. Because of that fact, it’s allowed the enterprise designers and operators to rethink their approaches as well.”

This is the point of impact, where the hyperscale blueprint collides with the standards and practices of the enterprise dating back to the client/server era. The influence is undeniable, but the directions that enterprise data centers have taken — which, in turn, define the data center industry as a whole — may not be what anyone predicted.
