Keeping Pace With the Cloud: How Enterprise Data Centers Can Compete
September 30th, 2013 By: Rich Miller
ORLANDO, Fla. - Developers and end users want the speed and convenience of cloud computing. If companies aren’t prepared to deliver cloud-style services from their own data centers, their users will seek them out from public cloud providers.
That creates a challenge for data center managers, according to Shannon Poulin, vice president of Intel’s Datacenter and Connected Systems Group. Poulin was the keynote speaker at the Data Center World Fall conference, and came bearing good news: new developments in both hardware and software can help rearchitect enterprise data centers for a service-oriented world.
Poulin said the rise of cloud computing has placed additional pressure on data center managers, who must sort out the cloud equation and remain relevant in a fast-changing environment for IT service delivery. It means balancing competing calls to be both more secure and more nimble.
Raising the Bar for Internal IT
“Business expectations are being altered by the rapid deployment of consumer services,” said Poulin. “There’s a lot of pressure to look at cloud environments. If I’m going to compete with a public cloud provider, I have to get better. That’s how to keep workloads on premises. We have to deliver services much more quickly than we currently do.”
Poulin said most CIOs want the economics of a public cloud, but they want it on premises. Keeping workloads on premises isn’t easy when any developer with a credit card can launch virtual servers on Amazon and a plethora of other cloud providers. Intel speaks from experience on this issue.
“Last year Intel spent $5 million at a public cloud provider,” said Poulin. “We have our own cloud environments, and we have a very clear policy, yet we still see this rogue cloud usage. It’s probably happening in your company as well.”
Software Defined Everything
The growth of cloud services is part of a broader shift toward a “software-defined data center.” But getting there will require new architectures, and new approaches to storage and networking.
“We believe the underlying infrastructure of the data center must be more cloud-friendly,” said Poulin, who said virtualization has made great headway on the server, but has a way to go in storage and networking. “In networking, there’s not much separation between hardware and software, between the control plane and the data plane.”
The rise of OpenFlow and other software-defined networking (SDN) technologies has laid the groundwork for a shift in the networking space. On the storage side of the house, there’s a growing trend toward tiered storage, which sorts data into hot, warm and cold buckets based on use profiles. This allows data center operators to segment their storage infrastructure, reserving expensive high-end hardware for priority data while shifting “cooler” data to commodity platforms.
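The hot/warm/cold sorting Poulin describes can be sketched as a simple policy function. This is a minimal illustration only, assuming access recency is the tiering criterion; the window lengths and tier names here are hypothetical, and real tiering policies would also weigh access frequency, object size, and cost.

```python
from datetime import datetime, timedelta

# Hypothetical thresholds for illustration; actual policies are site-specific.
HOT_WINDOW = timedelta(days=7)     # touched within a week -> high-end flash/SSD
WARM_WINDOW = timedelta(days=90)   # touched within 90 days -> mid-tier disk

def storage_tier(last_access: datetime, now: datetime) -> str:
    """Classify a data object as hot, warm, or cold by time since last access."""
    age = now - last_access
    if age <= HOT_WINDOW:
        return "hot"
    if age <= WARM_WINDOW:
        return "warm"
    return "cold"  # candidate for commodity or archival platforms
```

An operator could run such a check periodically and migrate "cooler" objects off premium hardware, which is the segmentation the article describes.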
Matching Processors to Workloads
These new architectures allow data center managers to squeeze more performance and efficiency out of their infrastructure. Poulin said Intel is working to match its processor offerings to a range of uses to support specialized computing. But these new approaches also require tools to manage and automate infrastructure, speeding the delivery of new offerings and allowing the creation of service catalogs.
“You need to have policies and some level of automation, so you can deploy in minutes instead of months,” said Poulin. “You have to provide tools, or else your users are going to go to a public cloud provider. We want to get to the point where these private clouds are competitive with public clouds.”
The shift to a cloud-powered, service-driven IT world provides both opportunities and challenges. Companies must decide what to outsource, and what to keep in-house.
“There’s going to be a shakeout,” said Poulin. “The companies that can use IT to solve their challenges will win.”