Sachin Chheda is the Director of Product and Solution Marketing at Tintri. He has long worked in information technology, holding positions at HP, NetApp and Nimble Storage, developing and taking to market products that power some of the largest enterprises. This is part one of a two-part series about the mismatch between virtualization and storage.
The IT world is becoming increasingly virtualized. A recent Enterprise Strategy Group (ESG) survey revealed that one-third of respondents have already virtualized more than half of their x86 servers.[1] That percentage is expected to increase significantly over the next few years, and analysts and experts agree that almost all new IT workloads are being deployed in virtual environments.
Many enterprises started their virtualization journey by focusing on their tier-two and tier-three applications. Impressed with the results of their first initiatives, these organizations are actively extending virtualization to include key tier-one applications and end-user desktops, taking advantage of the unmatched flexibility, agility, scalability and availability virtualization can bring to business-critical systems.
This widespread adoption of virtualization is driving a software-defined approach to IT infrastructure: using the flexibility and configurability of software to decide how, when and where virtual machines (VMs) and applications run and where their data is stored. This software-centered design does not tie the data center to any particular configuration, enabling IT to flexibly configure and scale the virtual infrastructure to best serve applications and end users. But true software-defined infrastructure is not possible without a storage platform designed and optimized for the unique needs of virtualized environments.
Although virtualization has improved the performance and manageability of servers in the enterprise, it has created an extra workload for the storage platform. The majority of enterprises embarked on the path to virtualization using general-purpose storage based on LUNs and volumes. Most were satisfied with their early results while initial enthusiasm carried them through the many challenges general-purpose storage presents in virtualized environments. But now they are suffering from inadequate storage performance and the tremendous strain on their already overloaded IT staff.
Storage Capabilities vs. Demands of Virtualization
The storage management burden is due in large part to the significant mismatch between the capabilities of traditional storage and the demands of virtualized environments. General-purpose storage was designed to meet the needs of every system and application in the customer’s environment. But by trying to solve a wide range of problems with a single dated approach, general-purpose storage becomes a “jack of all trades, master of none.” Enterprises face four main challenges when using general-purpose storage in virtual environments: increased management complexity, inadequate storage performance, insufficient data protection, and disappointingly low ROI on virtualization initiatives:
1. Management complexity: Virtualization has simplified the management of compute infrastructure with VMs, but it has made storage management much more complex. IT administrators spend an excessive amount of time configuring and managing storage to meet the requirements of the virtual environment. These tasks are complex, error-prone and time-consuming for IT organizations using general-purpose storage. Analyst research reports and recent VMware surveys agree that in typical IT environments, two-thirds of all IT resources are spent on management, leaving only one-third for more strategic initiatives. The share left for strategic work may actually be even lower, considering the additional burden of storage administration in increasingly virtual environments.
2. Storage performance: Virtualization places additional demands on storage performance. IT administrators must ensure proper configurations so performance doesn’t suffer when multiple users’ applications need simultaneous access to shared storage, or when workloads spike during crunch times. Some IT organizations try to solve the problem by paying a premium for very fast, flash-only storage solutions. IT can throw almost any workload at these systems with decent results, but the price is significant and performance can still be unpredictable in rapidly changing virtual environments.
Other IT organizations try to improve performance by bolting flash options onto traditional storage systems or by adding disks to existing legacy solutions. With any of these disk-based approaches, enterprises end up significantly overprovisioning capacity, because the storage isn’t intelligent enough to automatically tune itself for the various virtualized applications it serves. Scaling storage to keep pace with the growth of virtualization can also be a challenge. Deploying additional storage systems to meet growing performance and capacity needs increases administrative overhead, and a traditional scale-out storage approach doesn’t solve the problem: it adds unnecessary complexity and still requires administrators to manually organize storage for virtualization.
3. Data protection: VM-level data protection and availability are critical in a virtualized environment. General-purpose storage solutions handle backup and recovery at the volume or LUN level rather than the VM level. This adds complexity, as IT administrators must keep close watch on VM-to-LUN mapping. Data protection can instead be handled at the application level, but that adds significantly more cost and complexity. Replication for business continuity and disaster recovery suffers from the same challenges, which then extend over the network. Management complexity and bandwidth costs are the top reasons IT organizations shy away from deploying disaster recovery.
4. TCO and ROI: Virtualization has significantly increased the costs of storage management and underlying storage infrastructure, negatively impacting the ROI of virtualization. Enterprises need storage solutions designed specifically for virtualization to improve TCO and ROI.
[1] ESG Research Report: 2013 IT Spending Intentions Survey (http://www.esg-global.com/research-reports/research-report-2013-it-spending-intentions-survey/)
In part two, we explore the evolution of storage to better serve virtualized environments.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.