The Cloud, As Seen in 2002


Is cloud computing a new way of computing? Or a new buzzword for an aging concept? Mashable takes the latter position, asserting that cloud computing is an old idea with a new coat of paint.

Here’s a relevant tidbit from the Wayback Machine: In March 2002 I covered the IMN Forum on Carrier Hotels and Internet Data Centers in New York. One of the panels focused on outsourcing and where it might be headed. Here’s an excerpt:

At the IMN show, IBM’s Bob Hinckley articulated the company’s vision for “utility computing,” envisioning a future in which packaged information technology services will be metered and delivered to customers much like electricity, gas and water. “One of the things that will drive demand will be the migration of IT services to this type of utility model,” said Hinckley. “Only recently have we had the solid, industrial-strength base technology that will make this work. It is the enterprise customers that we believe will validate this model.”

In touting a utility model, IBM seems to be projecting a future in which managed services can be commoditized – at least by IBM, which may possess the financial heft and operational scope to deliver such packaged services to customers.

Some observers have puzzled over IBM’s recent cloud computing initiatives, especially since the company isn’t operating a consumer-facing utility cloud similar to Amazon’s or Google’s. Big Blue has been thinking about this model for a very long time. Most recently, it’s been a model in which IBM provides the hardware and data center expertise for clouds, rather than serving as the toll collector on the Information Superhighway in the clouds.

IBM wasn’t the only tech titan represented on that 2002 IMN panel, either:

At least one large competitor is treading more cautiously. “I don’t think I see the market focused as fast on utility computing,” said Mark McKenna, the general manager of solutions at Hewlett Packard. “We really believe in (the future of utility computing) at HP. But to me, we’re still in the black and white days of TV,” McKenna added. “I think there’s a lot of maturity we’re yet to see in this market. The ASP industry is a step toward utility computing. It will eventually get to the point where you will connect and have IT delivered. The utility model is a megatrend. Will utility computing get us out of this recession? No. Will utility computing be a major trend 10 years from now? Absolutely.”

At the time, I felt as though I’d stumbled into an episode of The Jetsons. It looks like we’re somewhat ahead of McKenna’s timetable.

About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.


2 Comments

  1. "Cloud" aka "Utility" aka "Centralized" computing is definately an old concept. All it took was some web2.0 companies to realize that they don't know how to run a datacenter and that they should outsource it. The problem with that is they don't know how to pick companies to provide reliable service either. What passes for "reliable" these days is a joke.

  2. I feel it is a little of both. The packaging and running of hardware and services for a company is not new; the new twist with cloud computing, in my opinion, is the advances made in virtual server instances, unlimited storage capacity, the ability to scale in real time, etc. To be able to get away from all the administrative work of filesystems, file permissions, database administration, etc., and replace it with cloud services like what Amazon has to offer is the real key to cloud computing.