Lex Coors is Chief Data Center Technology and Engineering Officer for Interxion.
Earlier this summer, Facebook broke ground on its fifth data center, a Fort Worth, Texas-based facility powered entirely by renewable energy and cooled using outdoor air. This is the tech giant’s latest energy-efficiency endeavor under the Open Compute Project (OCP), which Facebook and its partners launched in April 2011 to "develop the most efficient computing infrastructure possible."
Data centers constructed under initiatives like the OCP are notable because they use industry standards as a base but innovate from there, and that innovation generates long-term results. To cite another example from Facebook, the company designed a data center using custom-designed servers and racks, power supplies and battery backup systems, and generated energy savings of 38 percent and cost savings of 24 percent.
The key to the success of these prominent facilities is that, by challenging the status quo, they actually surpass industry standards around design and build, energy efficiency, information security and more.
So, what can data center owners do to achieve similar results?
For enterprises constructing in-house data centers, the ultimate goal should be to treat industry standards as merely a foundation on which to build new innovations, leading to increased operational efficiency and continued high availability.
However, while the figures generated by Facebook and its web-scale peers are impressive, they aren't necessarily achievable for the average corporate data center that follows industry standards to the letter at the expense of design excellence and efficiencies. This is prevalent among enterprises that build Tier IV data centers in order to comply with stringent security, legislative and governance requirements. It’s also becoming increasingly difficult for data center managers to deliver future-ready infrastructure, while under the pressure of declining budgets for design and build.
That is why many enterprises are turning to colocation providers that comply with Tier III as a minimum requirement, including multiple Tier IV features, while achieving 20 to 30 percent better PUEs. Typically, colocation providers build multiple large-scale facilities each year – versus one every 8 to 10 years in the enterprise world – and most importantly, use standards as a foundation for innovation to help enterprises run their data centers more efficiently.
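To see how a PUE advantage of this size translates into energy cost, the arithmetic can be sketched as follows. All of the inputs here – the IT load, the two PUE values and the electricity tariff – are illustrative assumptions, not figures from this article:

```python
# PUE (Power Usage Effectiveness) = total facility energy / IT equipment energy.
# Sketch of how a ~25 percent better PUE compounds into annual energy savings.
# IT load, PUE values and tariff below are assumed for illustration only.

HOURS_PER_YEAR = 8760

def annual_facility_kwh(it_load_kw: float, pue: float) -> float:
    """Total facility energy drawn over a year for a given IT load and PUE."""
    return it_load_kw * pue * HOURS_PER_YEAR

IT_LOAD_KW = 1000          # assumed: 1 MW of IT equipment load
ENTERPRISE_PUE = 1.8       # assumed: typical enterprise facility
COLO_PUE = 1.35            # assumed: ~25 percent better colocation facility
TARIFF_EUR_PER_KWH = 0.10  # assumed: flat electricity price

enterprise_kwh = annual_facility_kwh(IT_LOAD_KW, ENTERPRISE_PUE)
colo_kwh = annual_facility_kwh(IT_LOAD_KW, COLO_PUE)
savings_eur = (enterprise_kwh - colo_kwh) * TARIFF_EUR_PER_KWH

print(f"Enterprise facility: {enterprise_kwh:,.0f} kWh/yr")
print(f"Colocation facility: {colo_kwh:,.0f} kWh/yr")
print(f"Annual energy cost saving: EUR {savings_eur:,.0f}")
```

Under these assumed figures, the better PUE avoids roughly 3.9 million kWh per year; the point is only that the saving scales linearly with IT load and with the PUE gap.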
The History of Data Center Standards
Industry-wide data center standards were not established until 1995, when the Uptime Institute published the whitepaper, "Tier Classifications Define Site Infrastructure Performance." With that document, the data center industry had guidance around how to connect data center uptime with business service delivery.
Within the Uptime Institute's tier system, data centers are assigned progressive classifications based on performance of core infrastructure elements.
The Uptime Institute isn't the only purveyor of these standards. The Telecommunications Industry Association (TIA) and Building Industry Consulting Service International (BICSI) also contribute to the blueprint that data center operators follow as they design and build their facilities. On an ongoing basis, they also provide a methodology for comparing the capabilities of different data centers, along such metrics as operational procedures, levels of redundancy, energy efficiency and fault tolerance.
The conventional wisdom is that data centers that follow these standards will achieve their operational goals, all while reducing costs and minimizing waste; but for some data centers, strict adherence to industry standards may no longer be to their ultimate benefit.
One prominent example of innovation paying dividends involves a colocation provider that, in 2010, went so far as to depart from the Uptime Institute's pre-2010 standards, adopting an alternative design that delivered statistical availability comparable to Tier IV. This led to savings – for a generic 1,000 m² data center – topping out at €3.3 million in construction and replacement CAPEX, as well as nearly €250,000 in annual energy cost savings.
The IT landscape has evolved to the point where new data center approaches like this are needed.
New Standards to Meet New Demands
Today, data centers see efficiency as more of an imperative than ever before because they're managing input from an increasing number of connected devices, and they need to support more data-heavy computing activities. Big data, cloud, social media, mobile – it’s all made computing environments much more complex. Everyone knows the numbers – that 90 percent of the world's data has been created in the last two years. All that data needs to be managed and stored somewhere.
It's against this backdrop that in-house data center managers are being told to upgrade facilities to ones that are future-ready, even amidst declining operational budgets. What often happens is that data center managers fall back on industry standards during the design and build process, but what many don't realize is that in doing so, they may be inadvertently creating higher costs and lower efficiency for themselves. Many, especially in regulated industries, probably don't realize that the efficiency they lose in trying to follow these standards is entirely preventable.
Design excellence is within reach, but the equation doesn't need to include line-by-line compliance with industry standards as long as all local rules and regulations are met. There are ways to innovate beyond these standards, while maintaining efficiency and high availability, and reducing operating costs in the process.
What a Next Generation Data Center Looks Like
Time and again, innovations in data center design – beyond those that standards call for – have been proven to generate efficiency in a facility's operation, from the millions saved in CAPEX and OPEX to continued high availability. Particularly among colocation providers, which are able to roll in the newest efficiency-generating improvements to the multiple large-scale facilities they build each year, the benefits of deviating from industry standards can be considerable.
Take, for example, the use of sprinkler systems. While the TIA recommends pre-action sprinkler systems, innovative data centers have instead turned to newer methods like gas suppression for fire protection. These systems manage fires more efficiently and are less likely to damage IT infrastructure. Some data centers have also stopped using PVC-coated cables (to avoid extreme additional damage to IT infrastructure in a fire), as well as emergency power-off systems.
As these examples show, it's possible for data centers – even those that don’t match the scale of web-scale companies like Facebook and Amazon – to go beyond industry standards to increase efficiency and lower costs, all while maintaining security and governance requirements and high availability. Savvy enterprises can gain all these benefits by choosing colocation providers that are emerging as industry leaders in challenging the norm – improving and innovating upon typical design, build and operate requirements.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena.