Loren Zweig is Vice President of Operations for EdgeMicro.
Twist endings are great fun in movies, whether it’s hearing why Haley Joel Osment has been talking to Bruce Willis for the past two hours or spilling your popcorn when Luke Skywalker hears who his dad is. But twists aren’t just for the movies. Sometimes they can happen in the tech industry, and I think we’re about to see an unexpected twist in how large-scale edge implementations are going to unfold. Maybe not as big as Darth Vader saying, “I am your father,” but it’s still a heckuva curveball.
Before we get to that, though, let’s talk about what the prevailing narrative is for edge computing. In other words, the plot so far in the first 89 minutes of the movie.
If I were to ask you to describe what a typical edge implementation would look like and how it would be used, what would come to mind? My guess is you would picture a standalone micro data center, perhaps at the base of a cell tower or alongside a telecom building with ample underground fiber. And my guess is you would picture its primary duty as solving latency issues by moving popular content and services closer to consumers and businesses.
Is that a pretty good match for what you are envisioning? If yes, then you’ve been doing your homework on edge trends, because that has been the predominant vision for the edge discussed in hundreds of articles and dozens of panel sessions at conferences—including panels I may have been on and articles I may have been quoted in. After all, that is exactly how I envisioned most edge implementations unfolding. It also happens to be how most companies planning edge implementations began talking about the edge.
But here comes the plot twist. That vision for edge deployments may be how conversations started, but the conversation has shifted dramatically as engineers at influential companies discovered they can use edge data centers to solve needs they hadn’t anticipated during the first 89 minutes of this feature.
Twist #1: Unburdening and De-Risking Core Data Centers
Yes, edge data centers will play a vital role in lowering latency for delivery of things like Game of Thrones episodes to HBO Go subscribers and delivery of important web-based applications used by corporate employees. But the bigger story is that major tech companies are viewing edge data centers as a way to protect their entire IT infrastructure.
One way edge computing provides protection is by relieving the strain on core data centers and network architecture. Neither was designed for the scale of demand that mobile computing generates, which spans everything from streamed entertainment to IoT networks. To serve that need, edge data centers are being designed to be true workhorses rather than just outposts holding some Netflix shows and data caches for popular apps.
Edge computing also addresses an emerging concern created by centralized data centers: an outage or other incident at a single facility can threaten an entire company, or even the economy. Edge data centers will be used to de-risk companies' and countries' data center infrastructure by creating a decentralized web of assets that ensures the resilience of the cloud no matter what happens at a core data center.
Twist #2: Massive Throughput for a Massive Job
Fiber connectivity has also been discussed as a requirement for successful edge deployment, but there’s connectivity and then there’s connectivity. One of the biggest surprises I've noticed in the pilot projects of customers I am working with is the significant bandwidth they are seeking in order to support the volume of work they are envisioning for edge data centers.
As I discussed in Twist #1 above, this is a direct function of the more significant role that edge data centers will be playing. Yes, fiber connectivity is a must for the delivering-movies-and-gaming-and-services vision for the edge, but using edge data centers to unburden and de-risk core infrastructure requires connectivity an order of magnitude larger than what has typically been discussed for edge deployments. I am seeing companies request dual 100G fiber connections for their micro data centers to handle the throughput they are envisioning for one rack of equipment. That is serious connectivity—the kind one would assume is reserved for a larger data center—but it is being set as the uppermost default connectivity requirement for micro data centers that will be deployed in the dozens and eventually hundreds.
Twist #3: Clusters Rather than Lonesome Installations
In stark contrast to the traditional vision of edge data centers as standalone outposts—doing yeoman’s work of reducing latency for select services and applications—the edge strategies of many influential companies may actually be based on micro-campus models that cluster edge data centers in key locations. Yes, there will be a place for the truly standalone box at the edge, but micro-sized campuses may be the preferred model for many companies.
One reason micro-campuses may be advantageous is that the massive connectivity discussed above isn’t available on just any old street corner. Significant connectivity is available in very specific locations, and edge data centers will cluster around aggregation spots that already have access to the right bandwidth and power. These campuses will also offer economies of scale, allowing companies to quickly add more boxes as needs grow without starting a site selection process from scratch—a complex and time-consuming process for edge projects.
Given the important role edge data centers will play in decentralizing and unburdening core data centers, it will also be important to make sure companies can service them efficiently. A campus model where many assets are clustered together is ideal for everything from preventative maintenance to emergency response to outages.
Twist #4: Nearer Than You May Expect
This last twist is a doozy, but it makes sense in the context of the three above. When you envision edge deployments, you probably think first and foremost about secondary markets that are not blessed with Tier 1 infrastructure like Northern Virginia and Silicon Valley. Yes, strengthening service to populations in those markets will be a major focus of edge deployments, but I believe there will also be many deployments in the suburbs of Tier 1 markets, even near core data centers.
That wouldn’t make sense in the traditional vision for edge computing, but it makes tremendous sense when you are looking at deploying edge data centers in clusters near major fiber nexuses, with the important job of becoming workhorses that unburden and de-risk known core data centers. That means micro-campuses near, and maybe even tethered to, a core data center might make a whole lot of sense. It also means clusters in Tier 1 market suburbs that already have lots of traditional IT infrastructure.
These twists are dramatically different from the traditional narrative about what the edge will look like, but they all may make perfect sense in the context of the broader infrastructure needs that companies are looking to solve beyond just lowering latency. And notice that I use the word “may” in just about every thought. These might not be plot twists as shocking as learning that Sirius Black in Harry Potter isn’t a bad guy or finding out who Tyler Durden is at the end of Fight Club, but we’re still early in the story of edge computing. Even bigger surprises might be in store for all of us, especially once the MNOs write part of the future screenplay.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating.