
Busting Three Edge Computing Myths

As organizations begin eyeing a move to edge computing, various misperceptions are clouding their potential migrations.

Jason Collier is Co-founder of Scale Computing.

Every day, millions of machines and objects connect to the Internet for the first time, and companies are challenging legacy architecture by changing how they look at their cloud infrastructures through edge computing. In fact, Gartner anticipates that more than 40 percent of enterprise IT organizations will adopt an edge computing strategy, up from roughly one percent last year.

In today’s world, edge computing continues to lead industry discussions as more sensors, mobile devices and powerful applications generate data at the edge of our networks. More companies are placing computing resources at the edge of the network, in close proximity to the devices that generate data and insights.

As organizations begin eyeing a move to edge computing, various misperceptions are clouding their potential migrations. Here are three myths to consider as they relate to edge computing resources.

Myth #1 - Edge Computing Is Resource-Intensive

Although edge computing requires on-premises resources outside a typical data center, those resources can be minimal. A full data center, or even a small one, is not necessary to connect and process data at the edge of a network.

Edge computing is the processing of data at the edge of the network, where the information is generated and where the remote primary data center or cloud has limited reach. By putting compute resources next to the sources that collect data, we can dramatically improve our responses to events like cybersecurity breaches, or take advantage of real-time changes in the market and in consumer behavior.

An edge computing infrastructure can be as small as a single IoT device or as large as a micro data center built from multiple compute appliances. Think of it in the context of remote office or branch office computing; with edge computing, however, the resources can sit adjacent to manufacturing systems, medical equipment, point-of-sale systems and IoT devices.
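As a rough illustration of how little infrastructure this can take, the sketch below shows a small edge device that filters raw sensor readings locally and forwards only alerts and summaries upstream. It is a hedged example: the sensor source, threshold, and central endpoint are all assumptions for illustration, not part of any specific product.

```python
# Hypothetical sketch: lightweight edge processing on a small device.
# The sensor, threshold, and central endpoint are assumed placeholders.
import json
import random
import statistics
import time
import urllib.request

CENTRAL_ENDPOINT = "https://central.example.com/ingest"  # hypothetical endpoint
ALERT_THRESHOLD = 85.0  # e.g., degrees Celsius; assumed value


def read_sensor() -> float:
    """Stand-in for a real sensor driver (e.g., temperature on a production line)."""
    return random.uniform(20.0, 100.0)  # simulated reading for this sketch


def send_to_central(payload: dict) -> None:
    """Forward only small, pre-digested payloads to the primary data center or cloud."""
    req = urllib.request.Request(
        CENTRAL_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=5)


def run(window_size: int = 60) -> None:
    readings = []
    while True:
        value = read_sensor()
        readings.append(value)
        if value > ALERT_THRESHOLD:
            # React immediately at the edge; no round trip to the cloud required.
            send_to_central({"type": "alert", "value": value, "ts": time.time()})
        if len(readings) >= window_size:
            # Ship a one-line summary upstream instead of every raw reading.
            send_to_central({
                "type": "summary",
                "mean": statistics.mean(readings),
                "max": max(readings),
                "ts": time.time(),
            })
            readings.clear()
        time.sleep(1)
```

The point of the sketch is the shape of the workload, not the code itself: most of the data never leaves the edge device, which is why the compute footprint can stay small.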

Myth #2 - Edge Computing Doesn't Require Change

Edge computing might require multiple network providers and connectivity points to support full loads from the edge data center. That diversity and redundancy are critical: if a network provider fails or drops out, organizations can still deliver the same high-quality service. With edge computing, compute resources can run from cellular base stations or nearby metropolitan networks.
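As a hedged sketch of what that provider diversity can look like on an edge node, the snippet below probes one reachability target per uplink and picks the first one that responds. The provider names, probe addresses, and timeouts are assumptions for illustration; a real deployment would typically lean on routing or SD-WAN tooling rather than a hand-rolled check.

```python
# Hypothetical sketch: choose a working uplink among several network providers.
# Provider names, probe targets, and timeouts are assumed values.
import socket
from typing import Optional

# One reachability probe target per provider, in priority order (assumed addresses).
PROVIDER_PROBES = {
    "provider_a": ("198.51.100.1", 443),
    "provider_b": ("203.0.113.1", 443),
    "cellular_backup": ("192.0.2.1", 443),
}


def link_is_up(host: str, port: int, timeout: float = 2.0) -> bool:
    """Crude reachability check: can we open a TCP connection to the probe target?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def pick_uplink() -> Optional[str]:
    """Return the first provider whose probe target is reachable."""
    for name, (host, port) in PROVIDER_PROBES.items():
        if link_is_up(host, port):
            return name
    return None  # all uplinks down: buffer data locally until one recovers


if __name__ == "__main__":
    active = pick_uplink()
    print(f"active uplink: {active or 'none - buffering locally'}")
```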

Building out an edge network means changing the way you manage and run data centers. Organizations’ systems are no longer in large, easy-to-access buildings with on-site operations teams. With hardware deployed in modular housings on remote sites that take time to reach, organizations have to build something that’s more like a cellular network.

Network performance within or near a data center is often taken for granted because of standard high-availability connectivity and power systems. At the edge, that same high availability is an absolute necessity and has to be provided deliberately.

Myth #3 - Edge Computing Is One-Size-Fits-All

It’s hardly surprising that some vendors will tell you they can offer an easy path to this new form of networking, combining compute and storage at the edge. Edge data centers aren’t one-size-fits-all: an installation could be anything from a single server, to a self-contained rack, to 20 or 30 racks. Whatever the size, they need the right equipment. Rather than viewing an edge data center as a small, inexpensive infrastructure, think of each individual node as a data center that must be designed and tested to support business needs.

Edge computing environments are small enough to operate without dedicated IT staff. But to operate in a low-maintenance fashion, the infrastructure needs to be easy to implement and manage, and easy to connect to the primary data center or cloud as needed. Most data centers require onsite staff working in shifts to maintain equipment. That’s not possible with edge computing, because you’re managing multiple small data centers across a diverse set of locations on top of your primary data center assets.

This arrangement requires remote monitoring and a significant amount of automation. Redundant hardware might be needed to address access issues. Edge computing applications need to be self-healing or capable of failing over to nearby nodes or data centers to maintain service levels. So far, the industry has yet to establish a robust set of best practices in this space. We’re all still somewhat in trial-and-error mode, but once we crack the code and perfect this approach, the computing landscape will be an entirely different world.
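A minimal sketch of what that automation might look like is below: a watchdog on an unattended edge node checks a local service, tries a local restart first, and only then shifts traffic to a nearby node. The health URL, restart command, and peer name are hypothetical placeholders; in practice this logic would live in an orchestration or monitoring platform rather than a hand-rolled loop.

```python
# Hypothetical sketch: self-healing watchdog for an unattended edge node.
# The health URL, restart command, and peer node are assumed placeholders.
import subprocess
import time
import urllib.request

HEALTH_URL = "http://localhost:8080/healthz"        # assumed local service endpoint
RESTART_CMD = ["systemctl", "restart", "edge-app"]  # assumed service name
PEER_NODE = "edge-node-2.example.com"               # nearby node to fail over to
MAX_RESTARTS = 3


def healthy() -> bool:
    """Probe the local application's health endpoint."""
    try:
        with urllib.request.urlopen(HEALTH_URL, timeout=3) as resp:
            return resp.status == 200
    except OSError:
        return False


def fail_over_to(peer: str) -> None:
    """Placeholder: in practice, update DNS or load-balancer config, or notify the
    orchestrator, so traffic shifts to the peer node; then page the remote ops team."""
    print(f"redirecting traffic to {peer} and alerting remote operations")


def watchdog(poll_seconds: int = 30) -> None:
    restarts = 0
    while True:
        if healthy():
            restarts = 0
        elif restarts < MAX_RESTARTS:
            subprocess.run(RESTART_CMD, check=False)  # attempt local recovery first
            restarts += 1
        else:
            fail_over_to(PEER_NODE)  # local recovery failed; maintain service from a peer
            break
        time.sleep(poll_seconds)
```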

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.
