Posted By Industry Perspectives On April 9, 2013 @ 8:59 am In Industry Perspectives | No Comments
Mike Goodenough is global director of cloud field engineering for savvisdirect at Savvis, a CenturyLink company and global provider of cloud infrastructure and hosted IT solutions.
Far back in the mists of time, at the pre-dawn of the digital age, scientists and engineers huddled in small rooms, devising ways to move electrons in such a manner as to represent the operations of logic upon the physical domain. No more would the slide rule be the sword of innovation and discovery. The future of the world rested firmly in the whirring circuits of that which would become: the computer.
Computers weren’t things that rested on one’s desk. No, they were mighty monoliths erected as harbingers of mathematical, scientific and engineering feats that recreated the world of humanity in their own towering image. To be master of the computer was to hold the destiny of the universe in your hand.
A scant few years later, though, things changed.
Where once a high degree of science was required to understand, navigate and create meaning from numbers, figures and formulas in a swarm of electro-mechanical interactions, now even high-school dropouts could construct powerful systems and arrange them strategically to illuminate the fabric of world commerce.
At some point along the way, the science was lost. When we abandoned the mainframes for personal computers and servers, math succumbed to convenience. Physics became a distant memory. And the culture that at once feared and admired the messengers of technology knelt down to worship tiny fruit-based entertainment.
In this fashion, the corporate environment shifted. Enormous tasks could be tackled with stacks of small machines, instead of acres of memory core. Simplicity became the watchword, and when computer science was replaced by information technology, so too was knowledge traded for the total cost of space and power.
But now cloud computing returns us full-circle to where we began, with enormous data centers housing collective systems so massive that a million businesses can fit inside. The science of “big computers” is here once again.
Bringing cloud into the mix changes the nature of modern business computing. Until recently, you would measure IT in terms of physical CPUs, disks, networks, blades and other manifestations of technology to be managed, replaced and amortized, each an entity of its own design. However, the budgets with which these eventualities were addressed became so compressed that they began to collapse in upon themselves, ultimately resulting in an explosion of outsourced infrastructure.
The equation thus mutated from purchasing power to operational effectiveness. Services now deliver what once was provided by legions of staff. And budgets for computers and software are instead shifted to create meaningful, lasting value for the enterprise. With the business itself rejoining this computing equation, the service model provides a solution to an evolutionary change in commercial mechanics.
Compared to the corporeal world in which we live, computer services are the energy in a universal system of hardware matter. Energy is mobile: it can transfer states, add or subtract properties, and alter its surroundings, yet it exists regardless of its bindings. Cloud services are likewise portable, divorced from the platform on which they operate. Moving cloud energy around to satisfy the demands of a continually changing business environment does not end in chaos, but merely reflects a subtle alteration in technical trajectory.
This is the promise of the future data center. Cloud meets the requirements of the operating budget. What was once implemented with CapEx servers costing $7 per day is delivered for $3 per day in the OpEx cloud. Your operational parameters shift from return on investment (ROI) on assets to value realized from capabilities.
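The per-day figures above lend themselves to a quick back-of-the-envelope comparison. The sketch below uses the article's $7/day CapEx and $3/day OpEx rates; the fleet size and one-year horizon are illustrative assumptions, not data from the article.

```python
# Hypothetical cost comparison using the per-day figures cited above.
# The $7 and $3 daily rates come from the article; the fleet size and
# 365-day horizon are illustrative assumptions.

CAPEX_PER_DAY = 7.0   # owned server, amortized cost per day
OPEX_PER_DAY = 3.0    # equivalent cloud capacity, cost per day

def annual_cost(per_day: float, servers: int, days: int = 365) -> float:
    """Total yearly spend for a fleet at a given per-server daily rate."""
    return per_day * servers * days

fleet = 100  # illustrative fleet size
capex = annual_cost(CAPEX_PER_DAY, fleet)
opex = annual_cost(OPEX_PER_DAY, fleet)
print(f"CapEx: ${capex:,.0f}  OpEx: ${opex:,.0f}  savings: ${capex - opex:,.0f}")
```

At these assumed rates, a 100-server fleet would cost $255,500 a year on owned hardware versus $109,500 in the cloud, which is the kind of shift from asset spend to operational spend the article describes.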
Indeed, cloud is more than just cost efficiency. It moves beyond hardware implementation and software licensing, and is instead quantified by the value of the services being provided. And while the basic variables in the algorithms of the cloud remain (compute, memory, storage and network), they become hidden by a universal architecture that focuses on the what, not the how.
IT is not about defining “what is the cloud.” Rather, it is about deriving value by conjoining business principles with technology innovation. How does the equation fit you?
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
Article printed from Data Center Knowledge: http://www.datacenterknowledge.com
URL to article: http://www.datacenterknowledge.com/archives/2013/04/09/quantifying-the-cloud-bringing-the-science-back/