Disruptive Forces Shaping the Next Generation of Data Centers

Jack Pouchet
Emerson Network Power

Jack Pouchet is Vice President of Market Development for Emerson Network Power.

Traditionally, the data center has evolved in response to technology innovation—mostly server-based—and the pace and direction has been somewhat predictable. Disruptive forces such as cloud computing, sustainability, cybersecurity and the Internet of Things are driving profound IT changes across all industries and creating opportunities and challenges in the process.

The data center, an enabler of disruption in many instances, is not immune. These forces are causing new archetypes to emerge that will change the data center landscape and improve productivity, drive down costs and increase agility. Four of these archetypes, in particular, will have a profound effect on the data center.

The Data Fortress

Cyber attacks have disrupted some of the world’s leading companies as our increasingly connected world creates more and more openings for hackers. The cost and frequency of these data breaches continue to rise, despite the billions spent annually on digital security. A Ponemon study of data center downtime commissioned by Emerson Network Power found that the share of downtime incidents attributed to security breaches rose from 2 percent in 2010 to 22 percent in 2015.

As a result, organizations are beginning to take a security-first approach to data center design, deploying out-of-network data pods for highly sensitive information—in some cases with separate, dedicated power and thermal management equipment. The next wave is the purpose-built, cold-storage facility with massive storage arrays protected by heavy investments in security systems and protected from access by all but authorized networks.

The Cloud of Many Drops

The reality of cloud computing today is that many enterprises are buying cloud capacity to bring applications online faster and more cheaply, even as their in-house computing resources sit underutilized. Despite virtualization-driven improvements, too many servers remain underutilized: some studies indicate servers use just 5 to 15 percent of their computing capacity and that 30 percent of all servers are “comatose.”

We see a future where organizations explore shared service models, such as those being applied to everything from personal taxi service to legal counsel, to tap into this unused capacity on-demand – and even sell the unused capacity on the open market. This shared services approach could even result in increased enterprise server utilization, extended life for existing data centers that move toward a self-support model, and the ability for enterprises to build new data centers based on average rather than peak demand.
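The sizing argument above can be made concrete with a little arithmetic. The sketch below is purely illustrative (the load figures are hypothetical, not from the article): if peaks can be offloaded to shared, on-demand capacity, a facility sized to average rather than peak demand can be substantially smaller.

```python
# Illustrative comparison of sizing a data center to peak vs. average load,
# assuming peak demand can be absorbed by shared/on-demand capacity.
# The hourly load samples below are hypothetical.

hourly_load_kw = [300, 280, 320, 900, 850, 400, 310, 290]  # one day's samples

peak_kw = max(hourly_load_kw)                        # capacity if sized to peak
average_kw = sum(hourly_load_kw) / len(hourly_load_kw)  # capacity if sized to average

# Built-capacity reduction from sizing to average instead of peak:
savings_pct = 100 * (peak_kw - average_kw) / peak_kw
```

With these sample numbers the average is roughly half the peak, which is the kind of gap that makes a shared-services model attractive.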

Fog Computing

Distributed architectures are becoming commonplace as computing at the edge of the network becomes more critical. Introduced by Cisco, fog computing is a distributed computing architecture that connects multiple small networks into a single large network. Application services are distributed across smart devices and edge computing systems to improve efficiency and concentrate data processing closer to devices and networks.

This provides a more efficient and effective method of dealing with the immense amount of data being generated by the sensors that comprise the Internet of Things (IoT). It also allows data to be aggregated and filtered locally to preserve bandwidth for actionable data.
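The local aggregate-and-filter step described above can be sketched in a few lines. This is a hypothetical illustration, not an actual fog computing implementation: the function name, thresholds and sensor values are all assumptions. An edge node summarizes a batch of readings and forwards upstream only the summary plus the out-of-band ("actionable") values, rather than the full raw stream.

```python
# Hypothetical sketch of edge-side filtering in a fog architecture:
# readings are aggregated locally, and only out-of-band values are
# treated as actionable and forwarded upstream. Names and thresholds
# are illustrative assumptions.

from statistics import mean

def filter_readings(readings, low=18.0, high=27.0):
    """Return (local summary, actionable readings) for a sensor batch."""
    summary = {"count": len(readings), "mean": mean(readings)}
    actionable = [r for r in readings if r < low or r > high]
    return summary, actionable

readings = [21.5, 22.0, 30.2, 21.8, 17.1, 22.3]  # e.g. temperature samples
summary, actionable = filter_readings(readings)
# Only the summary and the two out-of-band readings (30.2 and 17.1)
# would cross the network, instead of all six raw values.
```

The bandwidth saving scales with batch size: a node sampling thousands of normal readings per hour forwards only a small summary unless something goes out of band.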

The Corporate Social Responsibility Compliant Data Center

Energy efficiency continues to be important for an industry with seemingly limitless consumption needs, but other drivers—most notably an increased focus on reducing carbon footprint among some organizations—are pushing the focus toward sustainability and corporate responsibility.

Many organizations, including colocations and cloud service providers, will take a more aggressive approach to data center efficiency—adopting, for example, cooling with maximum economization and UPS systems that use active inverter eco-mode to provide high efficiency—while also pushing for increased use of alternative energy, such as wind and solar, to power data center operations and achieve carbon neutrality.

Speed, cost, security, sustainability, application availability and productivity must all be factored into future data center archetypes as operators deal with disruption from inside and outside their organizations. Software-defined management will increasingly provide the flexibility to move away from the single-instance data center, in which all data and applications receive the same level of resiliency and security, and to adopt the technologies and practices being pioneered in these archetypal data centers. The result is a data center ecosystem capable of accommodating all of these disruptive trends through a multimodal model, in which each environment is tailored to the specific needs of the data, applications and users it supports.

By working with application owners to meet their specific needs, data center operators will have the opportunity to build on their role as service providers to become trusted advisors.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena.
