Building A Cloud-Savvy Model for TCO and ROI
April 1st, 2013 By: Industry Perspectives
Ravi Rajagopal, Vice President at CA Technologies, has led and managed organizations that delivered innovative and practical technical and business solutions for corporations and governments around the globe.
Economic benefits almost always lead the argument for moving to cloud computing. We’re told many things: cloud is cheaper; cloud frees up IT resources; cloud reduces capital expenditures; cloud allows organizations to scale with demand.
Maybe it does. Maybe it doesn’t. The only way to make an informed decision, backed by a solid return on investment (ROI), is to first understand the total cost of ownership (TCO) of both your current infrastructure and your planned cloud infrastructure, before any cloud adoption.
This is obvious, right? But you might be surprised to learn that many large organizations commit to cloud computing without really knowing their TCO and projected ROI. It’s not that they’re irresponsible and ignoring this requirement. It’s that the tools most IT teams use to evaluate TCO and ROI were never designed for the cloud.
An Improved TCO Model
That’s why I set out to create a better TCO model. In addition to my work at CA Technologies, I also teach at NYU. One of my classes is about managing the cloud. When I first taught the class three years ago, I heard plenty of assumptions: the cloud is not secure, the cloud is less expensive. These statements were nearly always based on opinion and word-of-mouth buzz.
I engaged the class in researching the topic, with an eye towards developing the tools IT leaders need to get objective insights about cloud computing. We worked to develop a complete view of the cloud, beyond just the technical pieces. The result is a new approach that takes a business view of cloud computing by considering the economics and measuring its business value.
Simply put, TCO changes for the cloud because the cloud changes IT’s business model. Cloud computing has taken the information technology silo and made it a business service. And from the standpoint of TCO analysis, this adds complexity, because the cloud can be both a function of, as well as an alternative to, in-house IT resources.
For example, in the pre-cloud era, IT was simply a functional department. You could calculate the IT department’s cost, break it down using whatever allocation algorithm you chose, and charge the cost back to the business units.
But today IT, and its cost, is a function of many business units (including IT itself). Those business units need visibility into their costs, plus a clear understanding of the value they’re getting for those expenditures.
Unless the organization understands its total IT costs across all domains in the organization, it’s hard to arrive at an apples-to-apples comparison between what you’re spending in-house versus what’s available in the cloud.
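As a rough sketch of what that apples-to-apples comparison involves, the figures below are entirely hypothetical; the point is that the in-house side of the ledger only becomes comparable to a per-user cloud price once every embedded cost category is rolled in:

```python
# Hypothetical figures for illustration only -- substitute your own numbers.
# Fully loaded annual in-house cost of one application, rolled up across
# the cost categories that are often left out of the comparison.

inhouse_costs = {
    "servers_amortized": 40_000,   # hardware spread over its useful life
    "os_and_licenses":   12_000,
    "network":            8_000,
    "power_and_cooling":  6_000,
    "real_estate":        9_000,
    "personnel_share":   55_000,   # allocated slice of admin/ops salaries
}

users = 250
cloud_price_per_user_year = 600    # hypothetical SaaS list price

inhouse_total = sum(inhouse_costs.values())
inhouse_per_user = inhouse_total / users

print(f"In-house: ${inhouse_per_user:,.0f}/user/yr "
      f"vs cloud: ${cloud_price_per_user_year:,.0f}/user/yr")
```

With these made-up numbers the in-house service comes out cheaper per user; change any one line item and the answer flips, which is exactly why the comparison has to be done with real, complete data rather than vendor list prices.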
A Wide-Ranging Perspective
To analyze cloud TCO, you need a comprehensive view of your entire infrastructure and every service provided by it, for it, or running on it, whether in the cloud, on-premises, or in legacy systems. Only then will you be able to make an informed decision based on an accurate understanding of your total IT costs.
Not long ago, McKinsey reported that moving to the cloud caused companies to spend around 25 percent more than they would otherwise for the same services. As you can imagine, this caused a controversy, as it ran contrary to what cloud service providers were saying.
Once the study’s methodology was explained, however, what was happening became clear. Organizations were moving to cloud while keeping their legacy infrastructure in place. That’s fine if you’re piloting cloud or want to keep your options open, but it’s not a strategy to reduce cost.
This is a key point about cloud TCO that many organizations miss. If you don’t make the right choices and changes when using cloud computing, you’ll end up adding services and cost to the infrastructure. Vendor promises of cost savings go right out the window.
What Are Your Embedded Costs?
It’s hard for many organizations to get a handle on the true cost of an application because there are so many embedded costs: servers, OSes, the network, electricity, the real estate, personnel, and more. Does moving an application to the cloud shave those costs? How do you remove the infrastructure cost from the total cost associated with the application?
Part of the cost of an application is a service cost, which is visible and obvious. You can go into Salesforce.com and measure it on a per-user basis. But what’s not obvious is the associated infrastructure cost that’s needed, and what’s being done in the legacy environment.
If you’re not diligent about removing that piece from your analysis, you’re going to run into cost issues: you’ll still be incurring the portion of legacy cost that isn’t eliminated, while paying the additional SaaS costs on top.
These are just a few examples of how a better model for cloud TCO can help managers get an objective, quantitative analysis of cloud costs. And as I mentioned earlier in this post, we’ve taken these insights and have started building a new model for determining the total cost of ownership of cloud services.
Much of our research is now embedded in a spreadsheet which I am planning to make available to customers. I’ll be blogging about these efforts as we refine the model over the next few months, sharing what we’ve learned as well as your feedback on the findings. Stay tuned.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.