
The Future of Edge Computing in the COVID Era

Business applications and data must move as close to the data ingestion point as possible, but it's easier said than done.


Julius Francis is Senior Director of Product Marketing at Juniper Networks

The rise of IoT, 5G, and AR/VR has long been driving the need to bring computing to the network edge. Now, amid the COVID-19 pandemic, demand for high-speed networks is accelerating at an unprecedented rate. Video conferencing and content streaming are at record highs, and both require higher bandwidth and near-zero-latency data transfer. Low latency is usually defined as less than five milliseconds, but in this world of hyperconnected remote work, even five milliseconds can be too slow.

Networks have never been more critical than they are right now. Between conference calls and streaming media, service providers can't afford lag, downgraded resolution, or slow caching. To address this, business applications and data must move as close to the data ingestion point as possible, reducing the overall round-trip time and ultimately allowing applications to access information in real time.
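To see why proximity matters, a rough propagation-only calculation (a sketch under simplified assumptions, not a measurement of any particular network) shows how quickly fiber distance eats into a five-millisecond budget:

```python
# Back-of-the-envelope sketch: only fiber propagation delay is counted here;
# real networks add routing, queuing, and processing time on top of this.
LIGHT_IN_FIBER_KM_PER_MS = 200.0  # roughly two-thirds of the speed of light in a vacuum

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time to a site distance_km away."""
    return 2 * distance_km / LIGHT_IN_FIBER_KM_PER_MS

for km in (50, 200, 500, 1500):
    print(f"{km:>5} km -> {round_trip_ms(km):.1f} ms round trip (propagation only)")
```

Propagation alone exhausts a five-millisecond budget at roughly 500 km of fiber, before any processing happens at all, which is why compute has to sit near the point of data ingestion.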

But that’s easier said than done.

Confronting the Challenges

For service providers in particular, edge computing comes with unique challenges. The proliferation of solutions at the edge means containers are being deployed faster than humans can manage them. Orchestration tools can automate deployment, but observability is key to troubleshooting and assuring service in an automated manner.

After all, any service disruption comes with an outpouring of customer complaints, so service providers put pressure on IT teams to address the issues as quickly as possible. While IT already has the information needed to identify the source of the problem and solve it, challenges arise when sifting through reams of telemetry data spread across server components. IT teams need the ability to process the data quickly and gain valuable insights based on visible trends.
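As an illustration of the kind of trend-based processing involved, here is a minimal sketch that flags anomalous telemetry samples with a rolling z-score. The metric name, window size, and threshold are hypothetical, not taken from any particular vendor's tooling:

```python
# Minimal sketch: flag anomalous telemetry samples with a rolling z-score.
from collections import deque
from statistics import mean, stdev

WINDOW = 60          # samples kept per metric (e.g., one per second)
Z_THRESHOLD = 3.0    # how many standard deviations counts as anomalous

history: dict[str, deque] = {}

def observe(metric: str, value: float) -> bool:
    """Record a sample and return True if it looks anomalous."""
    window = history.setdefault(metric, deque(maxlen=WINDOW))
    anomalous = False
    if len(window) >= 10:                      # need enough history to judge
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and abs(value - mu) / sigma > Z_THRESHOLD:
            anomalous = True
    window.append(value)
    return anomalous

# Example: a latency spike on one edge node stands out against its own history.
for v in [5.1, 5.3, 4.9, 5.0, 5.2, 5.1, 4.8, 5.0, 5.2, 5.1, 48.0]:
    if observe("edge-node-7.rtt_ms", v):
        print(f"anomaly: edge-node-7 round-trip time {v} ms")
```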

A Data-Driven Solution

The solution lies in AI capabilities, specifically machine learning, which powers the orchestration solutions that deliver predictive and scalable operations across workloads. Combining machine learning with real-time network monitoring can provide the insights necessary to power automated tools capable of provisioning, instantiating, and configuring physical and virtual network functions more quickly and accurately than a human could. This also means IT teams can spend their time on mission-critical, higher-value initiatives that contribute to the bottom line.
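A heavily simplified sketch of that closed loop might look like the following. The forecasting "model" here is a placeholder for a trained machine learning model, and the scale-out decision stands in for calls to a real orchestration API:

```python
# Minimal sketch of a monitor -> predict -> act loop for one edge site.
from dataclasses import dataclass

@dataclass
class SiteMetrics:
    site: str
    requests_per_sec: float
    cpu_utilization: float   # 0.0 - 1.0

def predict_cpu(history: list[SiteMetrics]) -> float:
    """Tiny stand-in 'model': linear extrapolation of the last two samples.
    A production system would use a trained ML model instead."""
    if len(history) < 2:
        return history[-1].cpu_utilization
    prev, last = history[-2], history[-1]
    return last.cpu_utilization + (last.cpu_utilization - prev.cpu_utilization)

def reconcile(history: list[SiteMetrics], scale_out_threshold: float = 0.8) -> str:
    """Decide an action for one edge site based on the predicted load."""
    forecast = predict_cpu(history)
    if forecast > scale_out_threshold:
        return "scale-out"       # e.g., instantiate another VNF replica
    if forecast < 0.2:
        return "scale-in"
    return "no-op"

samples = [SiteMetrics("edge-site-42", 900, 0.55),
           SiteMetrics("edge-site-42", 1400, 0.78)]
print(reconcile(samples))        # -> "scale-out" (forecast ~1.01)
```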

Bringing AI to the Cloud

Machine learning also has a critical role in application life cycle management at the edge. In an environment that consists of a few centralized data centers, operators can determine the optimal performance conditions of an application's virtual network functions (VNFs). As the environment disaggregates into thousands of small sites, VNFs have more sophisticated needs that must be catered to accordingly.

Because operators don't have the bandwidth to cope with these needs, machine learning algorithms can run all of the individual components through a pre-production cycle to evaluate how they will behave in a production site, giving operations staff the reassurance that the apps being tested will work at the edge.
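As a rough illustration, a pre-production gate could look something like the sketch below; the load-test function and the latency and error targets are hypothetical stand-ins for an operator's own staging pipeline:

```python
# Minimal sketch: exercise each component against a synthetic load profile and
# only promote it to the edge site if it stays within its targets.
from typing import Callable

def evaluate(component: str,
             run_load_test: Callable[[str], dict],
             latency_slo_ms: float = 5.0,
             error_budget: float = 0.001) -> bool:
    """Return True if the component meets its targets in pre-production."""
    result = run_load_test(component)   # e.g., {"p99_ms": 3.2, "error_rate": 0.0004}
    ok = result["p99_ms"] <= latency_slo_ms and result["error_rate"] <= error_budget
    print(f"{component}: p99={result['p99_ms']}ms "
          f"errors={result['error_rate']:.2%} -> {'promote' if ok else 'hold back'}")
    return ok

# Stand-in load test; a real pipeline would drive traffic at a staging replica.
def fake_load_test(component: str) -> dict:
    canned = {"vnf-firewall": {"p99_ms": 3.2, "error_rate": 0.0004},
              "vnf-router":   {"p99_ms": 7.8, "error_rate": 0.0002}}
    return canned[component]

for name in ("vnf-firewall", "vnf-router"):
    evaluate(name, fake_load_test)
```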

A Future Consumed by the Edge

As the edge takes off, it's fundamentally changing the way service providers think about their infrastructures. The edge is increasingly viewed as prime beachfront property, often provided and managed by service providers, to be optimized with AI and machine learning for almost limitless business purposes. And once this highly immersive edge computing power is unleashed, we will see applications and new workloads coming to the edge that were simply unimaginable just five years ago.

Looking ahead, it won’t just be service providers capitalizing on the edge. Soon, edge cloud environments will be open, secure, and cloud-native with predictive and scalable operations that cater to a broad range of enterprise, consumer, and telco workloads. The edge cloud will have integrated security to reduce the blast radius of any security breaches. And finally, AI-driven predictive operations will be leveraged to manage the complexity of operationalizing thousands of edge locations, creating an enhanced consumer and employee experience.

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating.
