Public Cloud Adoption in 2014 Fueled by Hybrid Solutions and Private Patching Tech
December 27th, 2013 By: Industry Perspectives
Robert Jenkins is the CEO of CloudSigma.
Ford’s recent “And is Better” ad campaign ponders how much worse life would be if people were forced to choose between two features when the better option is clearly to have both (e.g., sweet or sour chicken, black or white photography). It’s a humorous ad with a message that applies perfectly to the cloud. Why settle for private or public alone when you can have the best of both worlds thanks to private patching-enabled hybrid solutions? In 2014, choices will no longer be limited to one or the other, showing that “and” truly is better.
2014 will be the year of the hybrid cloud solution. As these hybrid solutions gain momentum, they will fuel more widespread public cloud adoption in enterprise circles than we have ever seen before. When you think about it, this has actually been years in the making. We’ve just now reached that tipping point where hybrid solutions will be able to deliver the best of what cloud has to offer, while chipping away at the things that have prevented many enterprises around the world from fully embracing it.
The next 12 months will lay the foundation for the cloud’s sustained growth. In fact, IDC predicts a 25 percent surge in enterprise cloud spending in 2014, spanning software, services and cloud infrastructure. Critical to that future will be private patching technology, which will allow enterprises that deal in highly sensitive information, such as those in the finance and healthcare sectors, to connect their private cloud infrastructures directly to the VLANs of their public clouds.
To understand why hybrid solutions and private patching technology will be so important to the success of the public cloud over the next year, we first have to look at the longstanding roadblocks that have been in place.
Public Cloud Security – The first and most obvious challenge for public cloud has been security — specifically, data leaks and digital assaults by cyber criminals. These are two problems that organizations handling sensitive data don’t want to struggle with, and sending that data out across public IP networks certainly runs that risk. In fact, according to the InformationWeek 2013 Cloud Security and Risk Survey, of the respondents with no plans to use public cloud services, 58 percent cite security as the reason. Even among those respondents already using or considering the cloud, security is still the number one concern for 52 percent. So, as an alternative, many opt for private clouds instead.
However, by directly connecting private and public cloud infrastructures, enterprises can avoid those risks and, thanks to private patching technology, maintain more granular control over their own security measures. Now, their cloud environments will be as secure as companies choose to make them, rather than having to work within the limitations of traditional public cloud offerings.
Changing the Economics of Cloud – Hybrid cloud computing is about rationalizing the purchasing and planning of both private and public infrastructure. Currently, when companies consider purchasing public cloud resources, they often don’t simultaneously reconsider their existing and planned private infrastructure purchases. Hybrid cloud brings those two worlds together and asks the question: how can we release value by coordinating the two strategies? Implemented correctly, harmonizing private purchasing and location plans with public cloud procurement can unleash significant hidden value. Firstly, public IP costs can be reduced through a direct hybrid connection, as well as through network-as-a-service offerings being rolled out by public cloud operators, which change per-line costs into per-GB usage charges. Secondly, consolidated hosting arrangements can also bring down costs, not only for infrastructure through sensible public cloud usage, but also through cross-subsidization of private environments co-hosted with public cloud providers. Data center providers are also getting in on the action, assisting in this coordination by offering low-cost access to ecosystems of providers, such as Equinix’s Platform Equinix product.
Disaster Recovery – An extension of the data security issue is disaster recovery. Whether it stems from a manmade source like cybercrime or a natural disaster like a hurricane, the loss or corruption of data can have dire consequences in any industry. Private patching allows a level of data portability not previously possible, greatly simplifying the disaster recovery process and saving organizations untold amounts of time and money. The disruptions caused by system outages and unexpected downtime cannot be eliminated entirely, but private patching technology can keep the effects of such disruptions to a bare minimum by making data easily recoverable.
Deployment Restrictions – This brings us to the day-to-day functionality of private patching-enabled hybrid solutions. What people want most out of public cloud environments is accessibility, scalability and functionality. What they don’t want are operating system (OS) and application restrictions that limit what they can do in the cloud and how they can do it. The hybrid and public cloud solutions that free enterprises to do as they wish, rather than handcuff them to ways of doing things that they are neither comfortable nor familiar with, will help drive the widespread public cloud adoption that is to come in 2014.
What’s more, because these arbitrary OS and application restrictions can soon be relegated to nothing more than a footnote in cloud history, enterprises will be able to exactly mirror their on-premises and private cloud environments in the public cloud with greater data portability thanks to private patching. This eliminates the need to have two dedicated teams tasked with managing different cloud environments that use different configurations, reducing the overall cost of ownership for the enterprise and making public cloud environments that much more appealing.
All of this adds up to more options for enterprises, more control over their public cloud environments, fewer restrictions, better security and lower overall costs. Essentially, the cloud in 2014 will offer all of the strengths, hampered by none of the weaknesses.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
I agree that “by directly connecting private and public cloud infrastructures, enterprises can avoid those risks and, thanks to private patching technology, maintain more granular control over their own security measures,” but sensitive data may still not be secure enough to meet regulations. The trend is that more and more types of information are covered by different privacy laws, and enforcement activity is escalating. Current threats to data and escalating regulations are rapidly changing the security landscape.
It is also interesting to see how organizations are desperately looking for effective ways to comply with stringent new privacy regulations, including when offshoring data. I reviewed an interesting offshoring project in Europe that addressed the challenge of protecting sensitive information about individuals in a way that would satisfy European cross-border data security requirements. This included incoming source data from various European banking entities, and existing data within those systems, which would be consolidated in one European country. The project achieved targeted compliance with EU cross-border data security laws, the Datenschutzgesetz 2000 (DSG 2000) in Austria, and the Bundesdatenschutzgesetz in Germany by using a data tokenization approach, protecting the data before sending and storing it in the cloud.
The fact is that you are actually using somebody else’s computer when you use public cloud computing. In many popular public cloud environments, my data is NOT under my control, NOT on a computer within my organization and potentially NOT in a country or location that I know about. My data may not even be stored or processed in a compliant way in an accepted country, by a third party and/or cloud provider. I may not have information about who can access my data; it could be administrators or other tenants. I may be sharing disk, memory and other infrastructure components with parties I don’t know about, and they may be stealing my data.
Therefore, I think that all sensitive data should be encrypted or tokenized before it is sent to the cloud. Below are a few words of guidance from the payment card industry’s PCI SSC. The guidance is applicable to all sensitive data sent to the cloud.
If you outsource to a public-cloud provider, they often have multiple data storage systems located in multiple data centers, which may often be in multiple countries or regions. Consequently, the client may not know the location of their data, or the data may exist in one or more of several locations at any particular time.
Additionally, a client may have little or no visibility into the controls protecting their stored data. This can make validation of data security and access controls for a specific data set particularly challenging.
In a public-cloud environment, one client’s data is typically stored with data belonging to multiple other clients. This makes a public cloud an attractive target for attackers, as the potential gain may be greater than that to be attained from attacking a number of organizations individually.
I found some good news in an interesting report from the Aberdeen Group, which revealed that “Over the last 12 months, tokenization users had 50% fewer security-related incidents (e.g., unauthorized access, data loss or data exposure) than tokenization non-users.” The name of the study is “Tokenization Gets Traction.”
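The tokenization approach described above — replacing sensitive values with meaningless surrogate tokens and keeping the value-to-token mapping on the private side — can be sketched roughly as follows. This is a minimal illustration, not any vendor’s actual product: the `TokenVault` class and its in-memory mapping are assumptions for the example, and a production vault would be a hardened, access-controlled store kept on-premises.

```python
import secrets

class TokenVault:
    """Illustrative in-memory token vault. Only tokens leave the private
    environment; the mapping back to real values never does."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so equal values map to equal tokens,
        # which preserves joins and analytics on the tokenized data.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Random token: unlike encryption, there is no mathematical
        # relationship between the token and the original value.
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Authorized lookup on the private side recovers the real value.
        return self._token_to_value[token]

# Usage: tokenize a record before it is sent to the public cloud.
vault = TokenVault()
record = {"name": "Jane Doe", "iban": "DE89370400440532013000"}
outbound = {field: vault.tokenize(value) for field, value in record.items()}
# `outbound` now contains only tokens; the sensitive values stay behind.
restored = {field: vault.detokenize(token) for field, token in outbound.items()}
assert restored == record
```

Because a stolen token is useless without access to the private vault, a breach of the public cloud copy exposes no sensitive values, which is one way projects like the European banking consolidation above address cross-border data rules.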
Ulf Mattsson, CTO Protegrity
@Ulf Of course what you say regarding security makes total sense. One thing to point out is that the IaaS layer is less problematic regarding the data homing and location clarity issues you highlight. In fact, at CloudSigma we explicitly run every cloud location via a local company (our overall holding company is Swiss). All customer data is stored in the data center specified and never moved. Issues of clarity regarding data location and jurisdiction tend to affect the PaaS and SaaS layers more, for sure.