Kyung Mun is Principal Analyst at Mobile Experts.
Back in college, I took an engineering class where I had to write a FORTRAN program and carry a stack of punch cards (like the one shown below) to a computer room to have them processed and the application run. I recall thinking at the time how archaic the whole process was – "writing" a program via punch cards and then physically walking to a computer room several buildings away to run it. Mind you, this was during the PC era, when desktop computing was already common. That experience opened my eyes to an era when computing resources were scarce.
A computer punch card (source: Wikimedia Commons)
Fast-forward to today: The situation is much different, of course. All of us carry powerful smartphones (i.e., computers) in our hands, and we can run a seemingly infinite number of applications via app stores.
In today’s cloud computing era, the premise is that computing is cheap enough that enterprises can consume IT as a service – rather than procuring hardware and software separately and then assembling, operating, and managing them in-house, as has been the norm for decades. To make this commercially viable, the cloud computing model requires massive scale to bring down costs. Amazon Web Services touts 22 regions (with more in the pipeline). Assuming three or more giant data centers per region, that represents fewer than 100 data centers worldwide. Each of these mega data centers is huge – 30-40 MW of capacity. With power and cooling being significant cost factors, these mega data centers are often located in remote places where land and power are cheap.
Today, the cloud providers, network operators, and enterprises are looking to extend the cloud computing model to the edge (i.e., Edge Computing) for various reasons:
• Lower latency and bandwidth savings – with cloud environments closer to the users, application responses can be quicker and cost savings from reduced data transport can be realized;
• Enterprises want hybrid cloud – while the flexibility of cloud computing is undeniable, many enterprises want the flexibility of retaining control and security of running business-critical applications in-house; they want the flexibility of leveraging both private and public cloud environments depending on specific use cases;
• New revenue opportunity for network operators in the transition to network virtualization – network operators are virtualizing their networks to tackle the OPEX challenge, and they see multi-access edge computing (MEC) as a path to both new enterprise revenue and cost savings from network virtualization; and,
• Political and regulatory factors – with more frequent incidents of security and privacy breaches, many countries are regulating where and how data can be used; these privacy and data sovereignty aspects will lead to more distributed data centers.
Where the edge should be will be driven by differing use cases. Edge computing for mobile operators may differ from edge computing for enterprise use cases. Private LTE and 5G use cases will drive different locations for edge computing facilities – whether on-premise, at the near edge, or at the far edge of the network. While the initial choices for hyperscale cloud locations were driven by cheap power, land, and tax breaks, the locations of mobile edge clouds will primarily be driven by proximity to people and enterprises.
With edge computing, network operators have an opportunity not just to provide connectivity but to offer computing services. Enabling cloud services will require close partnerships with traditional IT software companies, major cloud providers, and many others. People aren’t going to walk to a computer room to run applications. Computing will be done where they are – in places where high-speed connectivity and edge computing facilities are readily available.
Today, punch cards are quaint and nostalgic. Someday, the typical mom-and-pop manufacturing company will run its industrial automation through the cloud via some kind of smartphone app. Then, perhaps, we will look back with the same nostalgia at today’s crude robotics and their dedicated compute platforms.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating.