By 2025, more than 40% of enterprise storage will be deployed at the edge, up from 15% in 2022. Large enterprises will also triple the unstructured data capacity they store as file or object storage on-premises, at the edge, or in the public cloud by 2025 compared to 2022.
Infrastructure and operations leaders are expanding storage beyond the traditional data center towards the public cloud and the edge to deliver data services close to or at the location where the data is generated or consumed rather than in a centralized environment. In that way, companies will no longer have to wait for data transmission times across a wide area network to or from a central point, allowing for enhanced efficiency and productivity while also enabling network resilience and location flexibility.
The move towards edge storage systems is being driven by the explosion of data created by IoT devices and the movement toward distributed remote work. “Organizations that have embarked on a digital business journey have realized that a more decentralized approach is required to address digital business infrastructure requirements,” says Santosh Rao, senior researcher at Gartner. “As the volume and velocity of data increases, so too does the inefficiency of streaming this information to a cloud or data center for processing.” That’s where edge computing comes in.
With IoT devices, edge systems form part of a continuous data flow across an ecosystem, allowing vast amounts of data to be captured and ingested immediately before being streamed to the data center or cloud. Organizations may also process the raw data at the edge and forward only a subset of analyzed or enriched data to the central site to achieve better sustained connectivity and efficiency.
Moving the data center to the edge will also permit companies to leverage the benefits of content and data collaboration across an increasingly distributed workforce. Interconnected edge storage systems can increase employee productivity and enable more efficient information sharing among employees, clients, and third-party partners, driving real-time content collaboration and accelerating innovation. This is especially valuable for follow-the-sun development and design teams whose collaborators are spread around the globe.
However, while the technical rationale for deploying edge systems is compelling, some enterprises making the shift are unprepared. This article outlines best practices for preparing enterprise data center storage deployments to be edge ready.
Transitioning to Edge Computing – What Are the Concerns?
Many experts regard the distributed cloud/data center as the future for all enterprises, since it can help solve challenges like bandwidth limitations, network failure and congestion, compliance hurdles arising from regional data storage requirements, high application latency, and scalability issues resulting from heavy computational processing.
Industry experts also suggest that, based on AWS pricing, bandwidth costs approximately four times as much as storage per gigabyte. Stripping redundant data at the edge before sending it to your data center can therefore yield significant cost savings.
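To make the economics concrete, here is a back-of-the-envelope sketch of the savings from reducing redundant data at the edge before transfer. The per-gigabyte prices and the 4:1 deduplication ratio are illustrative placeholders, not quoted AWS rates:

```python
# Illustrative comparison of transfer-plus-storage cost before and after
# edge deduplication. Prices below are hypothetical placeholders chosen to
# reflect the rough 4:1 bandwidth-to-storage cost ratio cited above.

def monthly_cost(raw_gb, dedup_ratio, transfer_per_gb=0.09, storage_per_gb=0.023):
    """Cost of shipping edge data centrally, with and without deduplication."""
    before = raw_gb * (transfer_per_gb + storage_per_gb)
    unique_gb = raw_gb / dedup_ratio          # only unique data leaves the edge
    after = unique_gb * (transfer_per_gb + storage_per_gb)
    return before, after

before, after = monthly_cost(raw_gb=10_000, dedup_ratio=4)
print(f"without dedup: ${before:,.2f}  with 4:1 dedup: ${after:,.2f}")
```

With these assumed prices, a 4:1 reduction cuts the combined transfer and storage bill by the same factor, which is why deduplication at the edge pays off quickly as data volumes grow.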
But, as with all evolving technologies, evaluating, deploying, and operating edge computing comes with its fair share of challenges. The issues come in many forms, with the key ones relating to security, cross-platform integration, and data replication. Certainly, edge computing has diverse use cases and unique workloads. Additionally, this distributed architecture comes with unique infrastructure requirements, which introduces the potential for issues in IT management, cost, information security, and resilience factors.
To this end, Infrastructure and Operations (I&O) leaders must determine the actions needed to optimize their IT operating models to avoid challenges that may affect their transition to the edge. For instance, what storage architectures and hardware or software solutions best suit the new use cases and workloads? What technologies should you use for replication and deduplication? How can you enhance security and data protection in edge environments?
Best Practices to be Edge Ready
Based on the challenges I&O leaders may face while readying their data center storage deployments for the transition to the edge, best practices to avoid such pitfalls include:
1. Consolidate and Integrate Decentralized Operations
As your enterprise grows and changes, it’s natural for the tech stack to follow suit. However, if you lack a strategic approach for this IT growth, it can sprawl and get complex. For instance, shadow IT can sneak into your transition plan from data center deployment to edge computing. In its worst form, your infrastructure gets so dispersed that you are no longer sure you have full visibility of your IT environment, let alone the ability to identify and address problems as they arise.
Consolidating and integrating your stack prevents the snowballing complexity that weakens your infrastructure and makes it less manageable when moving to the edge. I&O leaders need to ensure centralized management, control, and coordination of decentralized operations even when cloud, data center, and edge storage systems differ. Consider your edge a combination of disconnected storage systems, applications, and software infrastructure. Consolidation and integration offer increased flexibility, letting you shift and reallocate resources efficiently to achieve desired business outcomes.
2. Use Necessary Technologies for Replication and Deduplication
One of the most significant data management challenges in edge computing is deploying robust replication capabilities that minimize replication cost and time across the distributed environment. Ideally, data is replicated in real time to achieve the maximum gains of distributed edge computing systems. The solution should also meet the desired levels of timeliness, loss tolerance, network bandwidth consumption, and latency. Ultimately, such a service should allow multi-site file access with automatic failover between replicated edge sites and data centers.
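The core of such a capability can be sketched in a few lines: detect changed files at one site and propagate them to peer replicas. This is a minimal, illustrative sketch only; the polling and mtime comparison stand in for what a production system would do with filesystem event APIs and a change journal, and the paths are assumptions:

```python
# Hedged sketch of near-real-time file replication from an edge site to peer
# replicas. Polls for modified files by mtime; real products use event-driven
# change detection rather than polling.
import shutil
from pathlib import Path

def replicate_changes(source: Path, replicas: list, state: dict) -> int:
    """Copy files whose mtime changed since the last pass to every replica.

    `state` maps each source file to the mtime seen on the previous pass,
    so unchanged files are skipped and only deltas cross the network.
    """
    copied = 0
    for f in source.rglob("*"):
        if not f.is_file():
            continue
        mtime = f.stat().st_mtime
        if state.get(f) == mtime:
            continue  # unchanged since last pass
        for replica in replicas:
            target = replica / f.relative_to(source)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
        state[f] = mtime
        copied += 1
    return copied
```

Running this on a schedule against each site approximates continuous replication; the `state` dictionary is what keeps bandwidth consumption proportional to change rate rather than total data volume.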
3. Automate at the Edge
Automation improves performance and enhances digital experiences at the edge by taking over manual, mundane, and complex tasks. For example, I&O leaders can address edge storage concerns by using an automation engine to manage the continuously increasing data created and consumed at the edge. Automation can discover data that is no longer required for business operations and shift it to the data center, or replicate it to cloud storage, to make room for new data. Ultimately, automation allows your IT team to focus only on what needs to be placed at the edge.
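An automated tiering pass of this kind can be sketched simply: find files that have not been accessed within a cutoff period and move them off the edge tier. The 90-day threshold and the archive destination here are illustrative assumptions, not recommendations:

```python
# Hedged sketch of automated cold-data tiering at the edge: files untouched
# for longer than a cutoff are moved to an archive tier (e.g. a mount backed
# by the data center or cloud storage).
import shutil
import time
from pathlib import Path

def tier_cold_files(edge_dir: Path, archive_dir: Path, max_idle_days: float = 90) -> list:
    """Move files not accessed within max_idle_days from edge to archive."""
    cutoff = time.time() - max_idle_days * 86_400  # seconds per day
    moved = []
    for f in edge_dir.rglob("*"):
        if f.is_file() and f.stat().st_atime < cutoff:
            target = archive_dir / f.relative_to(edge_dir)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(f), target)
            moved.append(target)
    return moved
```

In practice the access-time check would be one of several policy inputs (data classification, legal hold, replication status), but the shape of the job is the same: scan, classify, move, and leave the edge tier holding only active data.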
4. Put Security at the Forefront
Enterprises must put security at the core of the edge architecture implementation. It’s essential to build security from the ground up and extend it across all areas of the IT environment.
A Gartner market report recommends establishing appropriate data protection levels to limit attack vectors like ransomware, data exfiltration, and double extortion tactics.
Enterprises can enhance security through encryption techniques that make data inaccessible. Additionally, you can explore zero-trust solutions that ensure the authentication of all users, devices, and traffic flow on the network. It’s also vital to create role-based policies and implement them across all aspects of the edge network.
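The role-based policy idea reduces to a small, default-deny lookup. The role names and actions below are illustrative assumptions; a real deployment would source roles from a central identity provider rather than a hard-coded table:

```python
# Minimal default-deny sketch of role-based access checks for an edge
# storage service. Roles and actions are hypothetical examples.
ROLE_POLICIES = {
    "site-operator": {"read", "write"},
    "analyst":       {"read"},
    "auditor":       {"read", "list-logs"},
}

def is_allowed(role: str, action: str) -> bool:
    """Default-deny: unknown roles and unlisted actions are both rejected."""
    return action in ROLE_POLICIES.get(role, set())
```

The important property for zero-trust environments is the default-deny posture: a role or action missing from the policy table is refused, so every permission at the edge is an explicit grant rather than an implicit one.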
Technology solutions deployed at the edge should carry the same security measures as the rest of the environment, and the edge security plan must be integrated into the company’s overall security strategy.
5. Deploy Edge Storage Solutions That Address Unique Edge Workload Requirements
Key considerations include remote management; data security; file, block, and object protocols; data persistence; data protection; and public cloud integration. It is also vital to choose a storage topology that can scale up and down to handle massive data types and volumes and deliver critical capabilities without inhibiting performance.
From a hardware perspective, edge storage solutions need to be robust and secure devices. For instance, such storage hardware should be engineered to withstand deployment in volatile environments. Everything from external housing to internal components should be tested and validated to run efficiently at the edge.
6. Use Compatible Solutions
While edge storage solutions are uniquely designed for their specific workloads and requirements, I&O leaders must also ensure that edge systems work seamlessly with data center and cloud environments. In many instances, the vendors providing edge storage solutions may be different from data center and cloud vendors. When this is the case, data interoperability and compatibility are key concerns that must be thought through to ensure that data does not become siloed in any one environment. Fortunately, there exists a market of third-party vendors that can interconnect mixed on-premises and/or hybrid cloud environments into a cohesive distributed system that can be efficiently managed under a common control plane. This provides maximum flexibility to choose the optimal solution for each workload while maintaining centralized control of decentralized operations.
Undoubtedly, edge computing solutions are increasingly helping enterprises respond to growing data volumes in their traditional computing environments. This is because the technology pushes the processing and storage of data to the extremes of a network, close to the points of production or consumption. In turn, these capabilities reduce the cost and latency of transferring data to central data centers. In addition, processing data locally and sending only the significant portion to a central data center reduces bandwidth demands, which keeps network connectivity and infrastructure from being overwhelmed.
Edge systems, however, come with unique requirements and challenges. Some of these concerns include data security, replication, and cross-platform integrations. Fortunately, enterprises can use appropriate technologies for replication, automate network operations, deploy appropriate storage solutions, and enhance security to ensure their data centers are ready for migration to the edge.
About the Author
Jimmy Tam is the CEO of Peer Software, a global software company focused on simplifying file management and orchestration for enterprise organizations since 1993. Jimmy is a 25-year veteran of enterprise software solutions and works with customers and partners daily on architecture, planning and design of IT infrastructure solutions that meet the complex demands of data storage, access, protection, and sharing across distributed employees, partner firms, and customers.