
Cloud Computing Trends to Watch in 2012 (Part 2)

Yesterday we brought you part 1 of Cloud Computing Trends to Watch in 2012. Today we continue with the second half of our predictions for 2012. Here are some of the trends our prognosticators see for cloud computing in the year ahead: a sharper focus on cloud management, growth in the use of cloud for disaster recovery, opportunities for systems integrators and colocation providers, and Big Data driving changes in network topologies as it moves into the enterprise.

Antonio Piraino, ScienceLogic

Antonio Piraino is CTO of ScienceLogic and previously tracked the cloud computing industry as an analyst for The 451 Group.

Cloud management moves up the priority list: Management always seems to be a secondary concern when hot new technologies come to the fore. Cloud computing is no different. This is not just about managing cloud resources but managing service delivery across both data center and cloud infrastructures. Harnessing these expansive, decentralized and fluid virtual environments to gain visibility and control is challenging but critical. Both cloud service providers and enterprise IT pros will be looking to separate fact from fiction among the management vendors’ claims in 2012.

Operational Business Intelligence expands: A popular use of cloud computing resources today is Big Data and business analytics, but the IT infrastructure itself can also be a great source of business intelligence if cultivated properly. This largely untapped resource will become one of the true differentiators of business operations in 2012, especially as the underlying infrastructure continues to become less and less differentiated in terms of baseline functionality. Correlating the right data points, especially around IT operations, can help organizations make more informed decisions about human resource needs, capacity planning, future spending and ability to increase business productivity.
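The idea of correlating IT-operations data points can be sketched concretely. Below is a minimal, hypothetical example: computing the Pearson correlation between request volume and CPU utilization to decide whether capacity can be planned from traffic forecasts. The metric names and numbers are illustrative, not from any particular monitoring product.

```python
# Hypothetical sketch: correlating IT-operations metrics to inform
# capacity planning. Metric names and values below are illustrative.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hourly samples pulled from a monitoring system (illustrative data).
requests_per_hour = [1200, 1900, 3100, 4200, 5100, 6400]
cpu_utilization   = [22,   31,   48,   61,   72,   88]   # percent

r = pearson(requests_per_hour, cpu_utilization)
if r > 0.9:
    print(f"Strong correlation (r={r:.2f}): CPU is load-driven; "
          "capacity can track traffic forecasts")
else:
    print(f"Weak correlation (r={r:.2f}): investigate other capacity drivers")
```

In practice the same correlation pass would run across many metric pairs (storage growth vs. headcount, queue depth vs. spend, and so on) to surface which operational signals actually predict business needs.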

New topologies rise: While data center and server consolidation is a big endeavor of the federal government and many large organizations, there is ironically a parallel trend at work for data to follow the applications, and applications to follow the users of those applications. The reason is so that compute cycles can be consumed as logically close to the users as possible (akin to content delivery networks). This is driving a hub-and-spoke style topology of compute power, where some applications and databases will reside centrally, and more agile applications reside in a more dispersed environment. Understanding the topology needed will help organizations make more informed decisions about where to host their applications in 2012.
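The hub-and-spoke placement logic described above can be sketched in a few lines. This is a simplified illustration, assuming hypothetical site names and measured latencies: latency-tolerant workloads stay at the central hub, while latency-sensitive ones go to whichever spoke is closest to their users, CDN-style.

```python
# Illustrative sketch of hub-and-spoke workload placement. All site
# names and latency figures are hypothetical.

HUB = "central-dc"
SPOKE_LATENCY_MS = {  # measured RTT from each user region to each spoke
    "us-east": {"spoke-ny": 12, "spoke-chi": 28, "spoke-lon": 85},
    "eu-west": {"spoke-ny": 80, "spoke-chi": 95, "spoke-lon": 9},
}

def place(workload, user_region, latency_sensitive):
    """Return the site a workload should run from."""
    if not latency_sensitive:
        return HUB                      # databases, batch jobs stay central
    spokes = SPOKE_LATENCY_MS[user_region]
    return min(spokes, key=spokes.get)  # nearest spoke wins, like a CDN

print(place("reporting-db", "us-east", latency_sensitive=False))  # central-dc
print(place("web-frontend", "eu-west", latency_sensitive=True))   # spoke-lon
```

Real placement decisions would weigh cost, data-residency rules and replication lag alongside latency, but the core trade-off is the one shown: central for shared state, dispersed for user-facing agility.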

Darryl Brown, Telx

Darryl Brown is Director of Cloud, SaaS and Media at colocation and interconnection specialist Telx.

In 2012, we will see the accelerated deployment of dedicated, hosted clouds in colocation facilities. We’ll see more enterprises turn to off-site data centers for storage of heavily used data, such as business intelligence data, or enterprise data resources that are leveraged by both on-site and mobile employees. Within these colocation sites, enterprises will gain access to private cloud environments, reducing the costs associated with the continuous backhauling of that data. These enterprise data center customers will want the most choice in cloud providers and even more choices in how to connect with them.

We will also see these third-party colocation facilities leveraged more as WAN hubs and gateways. There is going to be an ongoing evolution from private hosting to private cloud, in order to deliver privately hosted services for better utilization of the existing hardware assets. I think medium to large enterprises will be early adopters, and then smaller enterprises will follow suit.

What will be key for the widespread adoption of this model will be the evolution of “shrink wrap,” turnkey, private clouds – essentially, making the enterprise hardware/software mere appliances. Think of this as the Apple approach to enterprise cloud computing – simple, easy to use, easy to manage – packaged hardware and software together to deliver a fully integrated cloud at a cost-effective price point. In 2012, cloud will become a commodity: an integrated “business in a box” that will accelerate adoption and shorten time to value for internal IT. Cloud computing will be a technology and methodology, not a product.

Jelle Frank van der Zwet, Interxion

Jelle Frank van der Zwet is the cloud marketing manager at Interxion where he is responsible for managing the Cloud Hubs, Interxion’s sizeable and fast-growing cloud communities.

2011 marked a significant year for cloud computing, and with half of Gartner’s top predictions for IT organizations and users for 2012 and beyond centered on the cloud, it’s certain that next year will be another critical year for the cloud industry. Although well-known IT providers offer public clouds and have moved some companies to the cloud, it is time for a more focused group to lead cloud deployment, giving companies a more individualized solution that works for them.

Because of their deep experience in infrastructure optimization and the trust they have already established with many organizations, system integrators (SIs) have the power to be the real cloud migration agents in 2012, enabling significant uptake in the cloud. SIs are known for their personalized customer service and tailored service-level agreements (SLAs), which many public cloud providers simply can’t offer. While some organizations may believe big-name IT vendors are the right companies to assist them in their transition to the cloud, those vendors may not be the answer, as their offerings tend to be one-size-fits-all.

SIs are likely already working with organizations that are targeting the cost and efficiency benefits of migrating to the cloud, so it’s a natural fit for them to drive next year’s movement. SIs that would like to offer cloud services but may not be ready to build and roll out their own underlying platforms can test the cloud in a data center lab environment, which allows them to develop services before fully transitioning their customers to the cloud.

2012 will be an important year for cloud computing, and with their experience, flexibility and long-standing relationships, SIs can lead the way.

Treb Ryan, OpSource

Treb Ryan is CEO of OpSource, Dimension Data’s wholly-owned enterprise cloud and managed hosting business. Prior to co-founding OpSource in 2002, Mr. Ryan was President of the Americas for Metromedia Fiber Network (MFN). He joined MFN from SiteSmith, a company that he co-founded in 1999 and ultimately sold to MFN in a deal valued at $1.4 billion.

Managed private cloud
2011 brought wide-scale growth in adoption of public cloud services, not to mention an explosion in providers. Enterprises found public cloud models especially beneficial in test and development environments, where the pay-as-you-go model realizes cost savings quickly. These initial deployments whetted IT departments’ appetites for larger, enterprise-wide adoption in 2012, with a managed private cloud at the core.

CIOs desire a turnkey, automated private cloud from which they can service the cloud computing needs of the entire organization, while maintaining data sovereignty on-premises. Moving forward, they’ll quickly find that the complexity of creating and managing such an infrastructure is more demanding than expected. To launch an internal private cloud, enterprises will look to private cloud service providers to assist them with maintaining the benefits of the cloud model by managing the infrastructure for them. Data will remain on-premises, reducing risk and addressing compliance concerns, while outsourced management keeps costs low.

Business Continuity: Moving Disaster Recovery to the Cloud
IT system downtime costs North American businesses $26.5 billion annually; however, most organizations have not developed a disaster recovery plan capable of handling catastrophic events. CIOs have found DR plans can be costly, time-consuming and built to address only specific situations. As more enterprise applications and development tools move to the cloud, so will the need to develop a secure mechanism for preventing disruption in business. As companies evaluate their cloud spend for 2012, disaster recovery will be at the top of the list for many.

By placing DR in the cloud, organizations can scale cloud computing resources on the fly, only paying for computing resources consumed. With cloud-based solutions, organizations could build a second site managed separately and with all the cost benefits of the cloud. In 2012, it will be critical for cloud offerings to include security, performance and operational components necessary to be true enterprise-class solutions. This enables IT to experience dramatic cost savings while maintaining stringent business continuity objectives.

Eric Webster, Doyenz

Eric Webster is Chief Revenue Officer of Doyenz, provider of cloud-based disaster recovery solutions. He has 15+ years of experience in the area of high availability and business continuity in management and executive roles at Oracle, Ricoh, NetBrowser and XOsoft.

As the cloud moves into the disaster recovery market, organizations will have greater expectations around application availability and intuitive web-based management. In 2012 we can expect to see the following shifts take place:

1. The Demise of Cloud Storage
Many businesses think of cloud storage as a hard disk in the sky, when in fact the only functionality cloud storage provides is data backup and retrieval. We believe companies will look beyond cloud-based storage solutions to a new generation of recovery-as-a-service technologies, focusing on replication and recovery of production environments in the cloud.

2. Companies Will Demand Availability of Applications, Not Just Data
As we all know, companies are increasingly relying on software applications to run their business. While this increased dependency on apps seems like a simple concept, it is making them more vulnerable than ever. With this shift in mind, it's important that organizations demand not only data recovery capabilities, but full application recovery.

3. Access to Recovery Environments from Any Device, Anytime, Anywhere Becomes Critical Requirement
In the cloud, it's all about instant and scalable access to your data. We believe that these consumerized cloud computing expectations will influence requirements of service providers. In 2012, organizations will require vendors to provide intuitive web management that enables instant access to production applications from any device.

Bruce Tolley, Solarflare

Bruce Tolley is the VP of solutions marketing at Solarflare, which specializes in application acceleration for the most demanding scale-out compute environments.

In 2011, the world was abuzz about Big Data. During 2012, we will see Big Data re-invent itself to become ready for prime time deployment in Fortune 500 enterprise IT shops. What’s the reason behind our thinking? Data center and storage professionals do not have the leisure to integrate 20 different open source technologies to deploy one data mining application. These IT managers will increasingly look to vendors to deliver appliances that integrate well with existing data infrastructure and whose systems are certified, pre-configured and optimized to scale from small two-to-four-node proofs of concept to racks of 20 to 1,000 servers.

On that note, with the growth of Big Data applications, there inevitably comes an increase in network traffic. Technologies like MapReduce attempt to co-locate the data to be processed close to the network nodes doing the processing, thereby conserving network bandwidth and making access faster because it is local. Compute nodes can become idle, though, when network I/O becomes a bottleneck. To overcome this, we’ll see an increase in paired solutions, such as 10GbE networking combined with data decentralization. Such integrated solutions enable decreased latency, increased application performance and faster message rates. This collaboration will also help customers more effectively install and run Big Data applications, moving them closer to the network, where the best performance happens.
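The locality idea behind MapReduce can be sketched simply: a task is assigned, when possible, to a worker that already holds a replica of its input block, so the read stays local and no bandwidth is spent shipping data. The node and block names below are hypothetical, and this greedy pass is a toy version of what a real scheduler like Hadoop’s does.

```python
# Minimal sketch of MapReduce-style locality-aware task assignment.
# Node and block names are hypothetical, illustrative only.

BLOCK_LOCATIONS = {  # which workers hold a replica of each input block
    "block-1": {"node-a", "node-b"},
    "block-2": {"node-b", "node-c"},
    "block-3": {"node-a", "node-c"},
}

def assign(tasks, idle_workers):
    """Greedily map each task (one per input block) to an idle worker,
    preferring a worker that already stores the block (a local read)."""
    schedule = {}
    free = set(idle_workers)
    for block in tasks:
        local = BLOCK_LOCATIONS[block] & free
        choice = min(local) if local else min(free)  # min() only for determinism
        schedule[block] = (choice, "local" if local else "remote")
        free.discard(choice)
    return schedule

sched = assign(["block-1", "block-2", "block-3"],
               ["node-a", "node-b", "node-c"])
for blk, (node, kind) in sched.items():
    print(blk, "->", node, f"({kind} read)")
```

When no data-local worker is free, the fallback "remote" assignment is exactly the case where network I/O becomes the bottleneck the paragraph describes, and where faster interconnects such as 10GbE pay off.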

We have only touched the tip of the iceberg in the growth of Big Data technology. The amount of data in our world is expanding daily, as seen in the continuous growth of social networking, Internet search, financial trading and similar, data-intensive applications. 2012 will not only bring an increase in Big Data in the enterprise, but in networking technology that helps alleviate the problems associated with the influx of data.