
Today’s companies are often faced with the complex decision of whether to use public cloud resources or build and deploy their own IT infrastructures. This decision is especially difficult in an age of mounting data requirements when so many people expect limitless access and ultra-flexibility. For these reasons, cloud computing has become an increasingly popular choice for many organizations – though not always the right choice.

According to RightScale’s 2017 State of the Cloud Survey, 85 percent of enterprises have a multi-cloud strategy.

Common reasons for using public cloud resources include scalability, ease of introductory use and reduced upfront costs. In many ways public cloud usage is considered the “easy button.”

However, cloud computing presents many drawbacks that often come back to haunt users, such as skyrocketing fees, poor performance and security concerns. The decision between using a public cloud and owning your own infrastructure is not so different from deciding between renting and buying a home. It is a choice between controlling your own environment and living within someone else’s domain. The homeowner (or public cloud provider) reaps the benefits of equity gains while the renter continues to pay someone else’s mortgage.

When Clouds Break, Not Everything Is So Sunny

Some enterprises choose cloud services for their scalability. Although the cloud pricing model is “pay for what you use,” with more computing power available on demand as the business grows, it may not be as cost-effective as it initially appears. Organizations that move significant amounts of data should think twice before moving to the cloud, because bandwidth prices vary widely between providers and can escalate quickly. Bandwidth charges can easily rack up unexpected costs, especially when dealing with performance and security setbacks.

In fact, trading algorithm developer Deep Value estimates that using Amazon EC2 is 380 percent more expensive than having an internal cluster. insideHPC.com’s Rich Brueckner described the claim that cloud services could lower the cost of high-performance computing as being “owed more to marketing hype than to reality.”

The big three providers (Amazon EC2, Microsoft Azure and Google Cloud Platform) have massive buying power, yet their average bandwidth price is 3 to 4 times higher than that of colocation facilities. Amazon does offer reserved instances to reduce some of this expense, but the large upfront commitment can cripple a company’s cash flow. With all these added expenses, on-premises hardware starts to make more financial sense, especially for businesses with predictable workloads.

Many IT departments choose cloud services to improve access for remote employees and to find an inexpensive way to run data center applications. Clouds have also been praised for handling non-sensitive needs and non-mission-critical data that sit unused for long periods. They can be ideal for start-ups or business-to-consumer companies that are not working with large amounts of sensitive data. However, public cloud services have always been large targets for breaches and attacks, especially well-known and frequently used services like Amazon EC2.

For Amazon Web Services users, security can be automatically affected by anything that happens to their cloud provider. Additionally, anyone using a public or shared cloud can suffer a data breach and information loss through no real fault of their own. This drawback makes public clouds a potential nightmare for enterprises running mission-critical applications with high-availability, compliance or regulatory requirements.

Cloud customers also encounter problems with performance and reliability. Using a public cloud means sharing a network with potentially “noisy neighbors” that hog resources. Given that cloud service providers have servers in several dispersed locations, users can also experience latency issues and are often forced to pay exorbitant fees for a solution to these issues. AWS customers, in particular, experience performance glitches with server virtualization. In contrast to bare-metal hardware users, public cloud customers have limitations on resource availability and may struggle with application performance and data transfer rates. The solution from service providers – pay more money for premium access – only adds to the headaches.

When Is It Time to Deploy Infrastructure?

Several factors determine when it is time to leave a public cloud and deploy on-premises infrastructure. At the top of this list is the monthly spending statement. For some companies, a $30,000 monthly Amazon bill is the threshold at which it is time to consider a move from the public cloud. For other companies, the decision to move away from a public cloud is tied to performance and security issues. In-house infrastructure allows companies to control their computing environments to best ensure application success, leading to performance that is much more predictable and consistent. As a result, deploying on-premises infrastructure is best for high-scale IT environments that process a lot of data, especially if that data is consistently growing. Organizations also have the option of building dedicated servers for workload-specific needs.
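The rent-versus-buy framing above can be made concrete with a simple break-even calculation: compare cumulative cloud spend against the upfront cost of owned hardware plus ongoing operating costs. A minimal sketch, where all figures are illustrative assumptions rather than real price quotes:

```python
# Hypothetical break-even sketch: cumulative public-cloud spend vs.
# owning infrastructure. All dollar figures are illustrative
# assumptions, not actual provider pricing.

def break_even_month(cloud_monthly, hw_upfront, hw_monthly_opex):
    """Return the first month at which cumulative cloud spend exceeds
    the cumulative cost of owned hardware, or None if it never does
    within a 10-year horizon."""
    for month in range(1, 121):
        cloud_total = cloud_monthly * month
        owned_total = hw_upfront + hw_monthly_opex * month
        if cloud_total > owned_total:
            return month
    return None

# e.g. a $30,000/month cloud bill vs. an assumed $250,000 hardware
# buy plus $8,000/month in colocation and staffing costs
print(break_even_month(30_000, 250_000, 8_000))  # -> 12
```

Under these assumed numbers, ownership pays for itself within a year, which is why a bill of that size is often the trigger to re-evaluate.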

In a Wired magazine article, Dropbox outlined its journey off Amazon. According to Dropbox’s Vice President of Engineering, Aditya Agarwal, “Nobody is running a cloud business as a charity. There is some margin somewhere.” Companies working at a large enough scale can save huge sums of money by cutting out the public cloud provider. Companies should also be aware that Amazon, with interests in a vast number of industries, may in the future offer online applications that compete with those of its own customers, a concern Dropbox cited as part of its decision. Giving a potential competitor insight into your business model, applications and customer base could be a risky endeavor.

The Solution

The conversation about where and how to store an organization’s data is a hotly debated topic in IT. In some cases, the solution is a combination of services. While some enterprises may be better suited to in-house servers, others could certainly benefit from using a public cloud. Just as frequently, an organization’s IT needs can be met by a hybrid of the two, such as using in-house servers to handle standard traffic while bursting to the cloud for additional capacity.
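The burst-to-cloud pattern described above amounts to a simple routing decision: keep traffic on owned hardware until it is saturated, then overflow. A minimal sketch, where the capacity figure and tier names are illustrative assumptions and not tied to any particular product:

```python
# Minimal sketch of a hybrid "burst to cloud" routing decision.
# ON_PREM_CAPACITY and the tier names are illustrative assumptions.

ON_PREM_CAPACITY = 100  # max concurrent requests the in-house pool handles


def route_request(active_on_prem: int) -> str:
    """Keep traffic in-house until capacity is reached, then burst
    overflow to public cloud capacity."""
    if active_on_prem < ON_PREM_CAPACITY:
        return "on-prem"
    return "cloud-burst"


print(route_request(42))   # -> on-prem
print(route_request(100))  # -> cloud-burst
```

In practice the same decision is usually made by a load balancer or autoscaler rather than application code, but the economics are identical: predictable baseline load runs on owned equity, and only the unpredictable peak is rented.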

However, organizations need to seriously consider their own IT needs outside the hype of the cloud, placing focus particularly on specific applications. This includes prioritizing requirements for processing, performance, storage, security, data transfers and, of course, determining how much they are willing to spend on all factors.

There is an unmatched value in building and owning your infrastructure instead of living in someone else’s environment. Companies should take a long-term approach of making their IT environment an asset rather than a cost center. Spending today on in-house infrastructure results in equity, and ultimately leads to long-term profits that will support the growth of not only the IT organization but the larger business of which it is a part.

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
