Will Growth at the Edge Shrink the Core?

When a human population spreads out, it tends to spread thinner. Will computing capacity follow the same pattern?

Scott Fulton III, Contributor

August 12, 2020

Two people use the VR experience at Claude Monet – The Immersive Experience exhibition at Teatro Degli Arcimboldi in July 2020 in Milan. VR is often touted as one of the main use cases for edge computing. Francesco Prandoni/Getty Images

You have no doubt already read the vendors’ claims. The trend of placing computing capacity as close as possible to where the data it’s supposed to crunch is stored, or to the point where applications and content are distributed to end users, will soon transform the topology of data centers and the networks that connect them to one another. Edge computing is said to be the greatest decentralizing force in all of computing: the pull toward optimum proximity to maximize performance and efficiency.

So what’s the new order of things in colocation data centers? If this transformation is a relocation of assets from the core to the edge — from centralized to decentralized facilities, such as branch offices or micro data centers — shouldn’t we be seeing existing centralized data center deployments shrinking?  You’d think increasing footprint at the edge would decrease footprint in the core.

Data Center Knowledge posed this question to several well-recognized experts in wholesale and retail data center colocation. In as close to a unanimous response as this correspondent has ever seen, the answer we heard was no.

“That is a potential, likely outcome,” remarked Scott Mills, global VP of solutions engineering at Digital Realty. “It’s not always the case.”

Decentralization could lead to fewer centralized IT assets for an organization at some point down the road, Mills conceded. It’s not an unreasonable conclusion. But as he sees it, the math is not that direct.


“This notion that I used to have ten cabinets in one place, where really it’s now two cabinets in five locations… I don’t know that the straight-line math always works that way,” said Mills. Consolidation of workloads is also a force at play, he added, reducing the remote communication between applications and services by allowing them to keep sharing the same servers they’ve always shared.

“It could lead to reductions in space and power spend, absolutely,” he told us. “It doesn’t always correlate to that, however. I think what you are seeing with these distributed architectures [is] this notion of mixed deployment.” In such a setup, data-related activities are allowed to congregate centrally, while the distributed network is deployed to better manage costs associated with ingress and egress — activities that consume bandwidth.
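To make Mills’s “straight-line math” point concrete, here is a minimal sketch with purely hypothetical numbers. The cabinet counts echo his example; the per-site overhead figure is an assumption of ours, not a Digital Realty number. It shows how a distributed layout can end up consuming as much space as, or more than, the centralized footprint it replaces.

```python
# Illustrative footprint arithmetic for Mills's point that the
# "straight-line math" of decentralization rarely holds.
# All figures below are hypothetical assumptions, not vendor numbers.

CENTRAL_CABINETS = 10        # "ten cabinets in one place"
EDGE_SITES = 5               # "two cabinets in five locations"
CABINETS_PER_EDGE_SITE = 2
OVERHEAD_PER_SITE = 1        # assumed cabinet-equivalent of networking/redundancy per site

central_total = CENTRAL_CABINETS + OVERHEAD_PER_SITE
distributed_total = EDGE_SITES * (CABINETS_PER_EDGE_SITE + OVERHEAD_PER_SITE)

print(f"Centralized footprint:  {central_total} cabinet-equivalents")
print(f"Distributed footprint:  {distributed_total} cabinet-equivalents")
# Even modest per-site overhead can leave the distributed layout as large as,
# or larger than, the centralized footprint it was meant to replace.
```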

“I think it’s a little bit of a mixed bag,” responded Matt Senderhauf, who directs interconnection products for CoreSite — effectively agreeing with Mills’ characterization of the service mix.


“Customers making their first move out of their on-prem data center may very well be looking at just a single location,” Senderhauf continued. “But as we have conversations with them on what their ultimate goals are, what their strategy is, why they’re looking to make the move, that’s when we can start talking about potentially looking at multiple locations for either disaster recovery-type solutions or, depending on the types of applications they’re running, colocation closer to their consumers.”

On its website, CoreSite characterizes edge computing as “top-of-mind for business leaders” and a principal driver in designing effective colocation architectures for customers. That said, Senderhauf told us, performance at the top of that mind jostles for space with concerns about security, cost, and lack of control. A 2019 survey of the company’s own customers cited those three factors as the main things preventing them from choosing colocation for their edge computing deployments.

Maybe what’s top-of-mind doesn’t necessarily drive customer organizations’ decisions.  As Senderhauf acknowledged, “We kind of steer that conversation.”

Today's Edge Works Just Fine

Bill Long, Equinix’s senior VP of product management, explained the quandary colo customers may find themselves facing: “There are efficiencies of scale, both from a cost basis as well as an operational simplicity basis. If a network is going to have more points of presence, there needs to be enough value to warrant the inefficiencies that you get by having lower operating scale, as well as complexity, because you’re operating more nodes rather than less.”

Long sees customers’ networked applications as demanding 30 to 40 milliseconds of round-trip latency, what he calls “the sweet spot.” Right now, he asserted, as much as 80 percent of the US population can be served through Equinix data centers at less than 10 ms of round-trip latency, without having to build newer or closer facilities to drive that figure down even further.

“We think, given the latency budget that most of these applications actually have,” said Long, “the aggregated, interconnected edge that Equinix already has with our present locations actually serves a lot of the current need.” The classes of applications that may require less than 10 ms, he said, are certainly cool, but they’ll only come to fruition in the future. How far in the future, however, is unclear. “They’re not here today,” he said.
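As a rough illustration of Long’s latency-budget argument, the sketch below uses hypothetical round-trip figures. The 35 ms budget and 10 ms regional number echo his remarks; the 3 ms edge-site figure is our assumption for illustration only. The point is that when an existing regional site already fits comfortably within an application’s budget, a closer node adds little.

```python
# Hypothetical latency-budget check illustrating the "sweet spot" argument.
# The 35 ms budget and 10 ms regional figure echo Long's remarks; the
# 3 ms edge-site figure is an assumption for illustration only.

APP_BUDGET_MS = 35.0          # round-trip budget in the 30-40 ms "sweet spot"
SITES = {
    "regional colo": 10.0,    # sub-10 ms round trip cited for existing metro sites
    "new edge site": 3.0,     # assumed round trip from a purpose-built edge node
}

for label, rtt_ms in SITES.items():
    headroom = APP_BUDGET_MS - rtt_ms
    verdict = "within" if rtt_ms <= APP_BUDGET_MS else "over"
    print(f"{label}: {rtt_ms:.0f} ms RTT, {verdict} budget, {headroom:.0f} ms headroom")
# If the regional site already leaves ample headroom, the closer node adds
# operational complexity without a corresponding performance payoff.
```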

“Everybody understands the virtue of the edge,” remarked Todd Bateman, managing principal of IT-oriented real estate advisory firm Intelligence & Strategic Advisors. “But we’ve seen very few users — even in the hyperscale space, but especially outside the hyperscale space — with a delineated plan for what that’s going to look like.”

Bateman was formerly the long-time VP and North American agency practice leader at CBRE’s Data Centers Solutions Group.

Hyperscalers, he said, may be among the folks defining “the edge” for the rest of the world. Yet their view of this world, he believes, is different from that of the people hearing the various definitions of “the edge” for the first time. “For a lot of users,” he stated, “the reality of what the edge means outside of concept is a blank spot.”

Certainly, there’s plenty of “ethereal talk,” as he called it, about 5G, the Internet of Things, and autonomous vehicles. And it certainly seems that there will be network connections between servers, applications, and users. But if we’re being honest about things, Bateman said, the lines, circles, and arrows connecting these networks have yet to be drawn.

“If you like a baseball analogy, we’re still a couple of innings away from seeing those plays be run,” he said.

“Those companies that have a lot of data traffic, they have to think about where they’re putting it,” Chris Brown, CTO of the Uptime Institute, told us. “And that’s why I think we’re not seeing a huge increase in edge computing – because most companies don’t need it yet.”

Brown confirmed the trend Equinix’s Long noted. Customers are moving IT assets into regional colos, which are not at the edge but at least toward it. However, the drive to move data to a specific point on the map, he believes, has yet to be justified.

“We continue to expand to consume even more data, more bandwidth, and more capabilities,” Brown said, citing the abundance of Wi-Fi connectivity in everyday appliances. “And I think that we’ll continue to expand and consume those resources to the point where more and more companies will be driven to that edge.”

As the number of interconnected, ordinary devices continues to increase, the consumption of bandwidth is certain to follow suit. When that happens, said Brown, locating facilities closer to the customer edge will look more and more attractive to the organizations harvesting those devices’ data. “Just the amount of traffic is going to drive more and more companies to be at the edge,” he stated. “But right now, we’re just not there yet.”

It’s tempting to conclude that an industry isn’t going to lend credence or excitement to a trend that could work against it. From that angle, building facilities in remote locations to replace revenue-generating space already owned in a central location may not seem like a sound business proposition. But Bateman’s and Brown’s observations back up the colo leaders’ point of view: the edge, at a scale big enough to start meaningfully affecting the colocation business model, remains unpredictably far in the future.

About the Author

Scott Fulton III

Contributor

Scott M. Fulton, III is a 39-year veteran technology journalist, author, analyst, and content strategist, the latter of which means he thought almost too carefully about the order in which those roles should appear. Decisions like these, he’ll tell you, should be data-driven. His work has appeared in The New Stack since 2014, and in various receptacles and bins since the 1980s.
