Chad Dunn is Vice President of HCI Product Management for Dell EMC.
Last year, adoption of hyperconverged infrastructure (HCI) continued to rise at an unprecedented pace. Businesses of all types deployed HCI just about everywhere: at the edge, hosting core data center applications, and in clouds of all types.
The breakneck speed at which data is being generated and accumulated is one of the primary catalysts for such a broad evolution, and there are no signs of a slowdown in sight. Massive data quantities need to be collected, stored, managed, moved and, ideally, analyzed, all with the end goal of extracting value. This is causing many organizations to evaluate their existing infrastructure just to stay competitive, or even to keep pace.
HCI has become the IT delivery model of choice for most organizations taking a hard look at modernizing their data centers. In the past, most businesses considering HCI were implementing new workloads; today, that number is equaled by those planning a tech refresh. Beyond either scenario, businesses should be turning to HCI to prepare for impact as the data tsunami makes landfall.
From Overhead Expense to Business Enabler
By 2020, the digital universe—the data we create and copy annually—will reach 44 trillion gigabytes, and, without HCI, there’s a good chance that much of this data would be difficult to access or cost-prohibitive to store. As the amount of data organizations collect continues to increase exponentially each year, HCI can provide businesses with the ability to scale at their own pace, while reducing operating costs and providing flexibility as the foundation for multi-cloud approaches.
Organizations do not necessarily have to be large to reap big benefits from HCI. A prime example is Boys Town, a growing, mid-sized non-profit that previously struggled with aging legacy infrastructure. That infrastructure was not only underperforming under the weight of the organization's rapidly growing data, but was also expensive to service and cost-prohibitive to scale. When it ultimately became time to modernize, Boys Town opted for an HCI system, which reduced both maintenance and scaling costs. Most importantly, the new technology's efficiency created value by allowing the organization to do more in less time, which means it can help more people more quickly with less overhead.
Operational Hub for Multi-cloud Deployments
A multi-cloud world has been inevitable for years, and thanks to the burgeoning data inundation it has finally arrived. A recent IDC survey found that more than 80 percent of respondents are repatriating data from public clouds back on-premises. Based on conversations with our own customers, who frequently cite cost and performance as the primary driving factors, we expect this trend to continue.
As organizations build out their on-premises clouds to support and optimize IT infrastructure for multiple cloud types, they are looking for solutions that provide strong performance, flexibility and, ultimately, management consolidation and simplification, a combination HCI is uniquely positioned to deliver. During the next year, we'll see more customers turning to HCI as the operational hub for multi-cloud approaches, prompted by the need to ensure that data and workloads are stored and managed in the environments that best suit the changing needs of the business.
Accelerates Data Actionability
With 5G devices slated to hit the market this year, the data game will change markedly in terms of speed and accessibility. As data growth from IoT and AI accelerates, it will be expensive and cumbersome to bring the entirety of this data back home. Instead, organizations can take in data at edge locations, extract whatever insights can be gleaned from it and discard the excess. With a "hub and spoke" approach, organizations bring back only the insights to the core hub, where they can then act on them. HCI, which can start very small but scale quite large, is especially useful here because users can put the appropriate amount of compute and analysis capacity at those edge locations.
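The hub-and-spoke pattern described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not any vendor's actual implementation: an edge site reduces a batch of raw sensor readings to a compact summary, discards the raw data locally, and forwards only the summary to the core hub. The function name, field names and threshold are all assumptions made for the example.

```python
from statistics import mean

def summarize_edge_readings(readings, alert_threshold):
    """Reduce raw sensor readings to a compact summary at the edge.

    Only this small summary travels back to the core hub; the raw
    readings (the "excess") are discarded at the edge location.
    """
    return {
        "count": len(readings),                                 # volume seen locally
        "mean": mean(readings),                                 # local aggregate
        "max": max(readings),                                   # local peak
        "alerts": sum(1 for r in readings if r > alert_threshold),  # actionable events
    }

# Hypothetical edge site: many raw temperature readings come in,
# but only a four-field summary is forwarded to the hub.
raw = [21.0, 22.5, 19.8, 30.2, 21.7]
summary = summarize_edge_readings(raw, alert_threshold=28.0)
print(summary)
```

The point of the sketch is the ratio: the edge site may ingest millions of readings, but the payload sent to the hub stays constant in size, which is what makes keeping modest, right-sized compute at each spoke economical.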
Currently, retail is the most common HCI/edge use case due to the proliferation of connected IoT devices, like surveillance cameras, sensors and RFID tags, with an emphasis on local analysis and business insight. Compute and memory demand across the many locations where these devices operate is wildly dynamic.
Enables Cloud Native and Container Adoption
Another step organizations are taking to manage the data deluge is to look strategically at how their applications and workloads are developed. They can determine whether to transform those workloads into cloud-native ones, whether to operate in a more agile manner or in a DevOps model. A key enabler here is the emergence of Kubernetes (K8s) as the ubiquitous layer for container management and orchestration. HCI is an ideal deployment platform for containers and a cloud-native approach, able to support existing workloads while offering a Kubernetes dial tone anywhere: at the edge, in the core or in the cloud.
While we've seen newer companies adopt this model from the start, more mainstream customers will begin rewriting and refactoring workloads this way to leverage the scalability and reliability that HCI offers.
Brace Yourself, the Data Tsunami Is Coming
As the data tidal wave rolls in, it’s exciting to see the way it is driving human progress and transforming businesses—but preparation is key. Keeping an eye on the forecast and planning accordingly is not just a question of thriving, but also of surviving.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating.