Jarrett Potts is director of strategic marketing for STORServer, a provider of data backup solutions for the mid-market. Before joining the STORServer team, Potts spent 15 years working in various capacities for IBM, including Tivoli Storage Manager marketing and technical sales. He has been an evangelist for the TSM family of products since 2000.
In part one of our series about the major items to consider before making an investment in a data protection solution, we covered three areas: 1) ensuring the solution offers more than just backup and recovery, 2) finding a vendor that offers superior customer support as well as subscription and support (maintenance) contracts, and 3) using reliability as a key measure of a solution’s ROI.
In the second part of our series, I will discuss the importance of finding a solution that’s easy to use, why different data should be treated differently, how to eliminate the burden of virtual machine backups and why all the talk shouldn’t focus on deduplication.
Ease of Use
There are many data protection products on the market today, and all of them have features and functions that make them stand out. One of the major items to consider is how easy the solution is to use. When it comes down to brass tacks, ease of use is one of the most important considerations. After all, the person responsible for data protection may not have a deep skill set or the time to spend managing the solution on a day-to-day basis.
When choosing a data protection solution, look for the ability to manage the system from a single pane of glass. The user interface needs to be simple enough that within a few minutes all daily tasks can be completed. As a bonus, the solution should send alerts to your inbox and mobile devices in an easy-to-understand report that includes important information about the previous night’s activity and the status of those activities.
This ease-of-use requirement dovetails with historical reporting. If users know what the solution is doing on a day-to-day basis, they will also be able to tell what the system has been doing for the last few days, weeks, months or longer. This allows for planning the future with little to no hands-on work. For example, if there is a report that gives weekly growth for the last 26 weeks, users will be able to predict when they are going to run out of space in the solution or when to purchase more tapes. It is a very simple example, but it shows that the solution should help plan for the future as well as operate today. And, following this example, users will be able to budget six months or more into the future for growth—a great advantage when budgets are tight.
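The capacity-planning idea above can be sketched in a few lines of code. This is a hypothetical illustration only: the function name, the sample figures, and the 26-week report format are assumptions for the example, not features of any specific product.

```python
# Hypothetical sketch: forecast when backup storage will be exhausted,
# given weekly capacity-used samples like those in a 26-week growth report.
# All names and figures are illustrative, not taken from any product.

def weeks_until_full(weekly_used_tb, capacity_tb):
    """Estimate weeks remaining before capacity_tb is reached,
    using the average weekly growth across the samples."""
    if len(weekly_used_tb) < 2:
        raise ValueError("need at least two samples to estimate growth")
    growth_per_week = (weekly_used_tb[-1] - weekly_used_tb[0]) / (len(weekly_used_tb) - 1)
    if growth_per_week <= 0:
        return None  # usage flat or shrinking; no exhaustion forecast
    remaining_tb = capacity_tb - weekly_used_tb[-1]
    return remaining_tb / growth_per_week

# Example: usage grew steadily from 40 TB to 52.5 TB over 26 weekly samples,
# against a 60 TB capacity.
samples = [40 + 0.5 * i for i in range(26)]   # ~0.5 TB/week growth
print(weeks_until_full(samples, 60))          # -> 15.0 weeks of headroom
```

A simple linear estimate like this is exactly the kind of planning a good historical report makes possible without any hands-on analysis.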
A single-server footprint versus a master/media-server footprint can also make a solution easier or harder to manage, as can automatic client software updates that keep IT administrators from spending valuable time manually updating systems across the infrastructure.
With business-wide administration, monitoring and reporting, plus the flexibility enabled by automation, a new solution should create administrative time-savings that can measurably reduce the cost of operations.
Data Life-Cycle Management: From Birth to Death
IT organizations can drive up the cost of storage unnecessarily by treating all data the same and storing it all on the same media. All data is not created equal. When looking for a data protection solution, it is important to find one that manages those ones and zeros from the time they are created to the time they can be deleted.
Long-term archive and hierarchical storage management (HSM) allow organizations to store data on different tiers based on specific policies, enabling administrators to migrate and store data on the most appropriate tier. For example, older and less frequently accessed data can be moved to a slower, less-expensive storage platform, such as tape, leaving more expensive disk storage available for more high-value data. Automated data archiving also helps organizations ensure compliance with data retention policies and reduces the costs associated with compliance.
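The tiering policy described above can be sketched as a simple age-based rule set. The tier names, thresholds, and file records below are illustrative assumptions for the example, not settings from any particular HSM product.

```python
# Hypothetical sketch of policy-based tiering: files idle longer than a
# threshold are placed on a cheaper tier (e.g. tape). Tier names and
# thresholds are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class FileRecord:
    path: str
    days_since_access: int

POLICY = [  # (max idle days, tier), evaluated in order
    (30, "fast-disk"),
    (180, "capacity-disk"),
    (float("inf"), "tape-archive"),
]

def tier_for(record):
    """Pick the most appropriate tier for a file based on its idle time."""
    for max_idle, tier in POLICY:
        if record.days_since_access <= max_idle:
            return tier

files = [FileRecord("/db/orders.dat", 3),
         FileRecord("/reports/q1.pdf", 90),
         FileRecord("/logs/2019.tar", 400)]
for f in files:
    print(f.path, "->", tier_for(f))
# /db/orders.dat -> fast-disk
# /reports/q1.pdf -> capacity-disk
# /logs/2019.tar -> tape-archive
```

In a real solution these rules are defined once as policy and applied automatically, which is what keeps high-value disk free for high-value data.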
When making a decision about data protection, look only for solutions that reduce costs by providing automated, policy-based data life-cycle management, moving data to the most cost-effective tier of storage while still meeting service level requirements. This helps ensure recovery objectives are met while enabling transparent data access.
Support for Virtualized Environments is Key
Virtualization technology has helped IT organizations of all sizes reduce costs by improving server utilization and reducing application provisioning times. However, virtualization introduces two new problems that most people do not account for when choosing a data protection solution. First, the cost savings offered by virtualization can disappear quickly in the face of virtual machine sprawl. Second, the link between physical and logical devices becomes harder to map and track, making a virtual environment more complex than most can imagine.
Data protection can become a unique challenge in these environments. For example, backing up and restoring data for a dozen or more virtual machines residing on one physical server can bring all other operations on that server to a complete halt.
When searching for a data protection solution, investigate whether the product provides an effective solution to this challenge by eliminating the burden of running backups on a virtual machine, and instead, off-loading backup workloads from a VMware ESX or ESXi-based server to a centralized vStorage backup server. The solution must improve the frequency of protection and enable faster recovery of data, helping increase the business value of virtualization. The solution should also help reduce license management costs by removing agents from the individual virtual machines.
Data Reduction, Not Just Data Deduplication
Data protection is not just about data deduplication. Many data protection products and providers talk about data deduplication as if it will save the world. In fact, data deduplication is only a small part of the solution. What needs to be discussed is across-the-board data reduction.
Data reduction technologies are the first line of defense against rapidly expanding data volumes and costs. A solution that provides built-in data reduction technologies, such as progressive-incremental backup, data deduplication and data compression, can enable organizations to reduce backup storage capacity by as much as 95 percent. Advanced tape management and efficient tape utilization capabilities can further reduce data storage capacity requirements.
While some solutions create massive amounts of duplicate data through repetitive full backups, necessitating expensive data deduplication solutions, others provide progressive-incremental backup technology that avoids the duplicate data in the first place by creating only an initial full backup and then capturing only new and changed data. Built-in data compression and data deduplication operate at multiple storage layers to minimize the amount of data being retained for operations and disaster recovery.
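The difference between repetitive full backups and a progressive-incremental scheme is easy to show with back-of-the-envelope arithmetic. The backup size, change rate, and retention window below are assumed figures for illustration; actual reduction depends entirely on the data and the product.

```python
# Assumed figures: stored capacity over 12 weeks for weekly full backups
# versus an incremental-forever scheme keeping one full plus only the
# data that changed each subsequent week.
FULL_TB = 10.0        # size of one full backup (assumption)
CHANGE_RATE = 0.05    # 5% of data changes per week (assumption)
WEEKS = 12

weekly_fulls = FULL_TB * WEEKS
incremental_forever = FULL_TB + FULL_TB * CHANGE_RATE * (WEEKS - 1)

print(f"weekly fulls:        {weekly_fulls:.1f} TB")         # 120.0 TB
print(f"incremental forever: {incremental_forever:.1f} TB")  # 15.5 TB
print(f"reduction:           {1 - incremental_forever / weekly_fulls:.0%}")
```

Under these assumptions, avoiding the duplicate data in the first place cuts stored capacity by roughly 87 percent before deduplication or compression even enter the picture.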
Dramatically reducing backup storage requirements not only helps cut capital expenses, but can also decrease network bandwidth requirements and shrink backup windows. This results in reduced operational impact of backups and helps ensure high levels of application uptime.
In the next part of this three-part series, I will discuss how making the right licensing decision can save you money, how to scale data protection, setting different policies for different data and the role of unified recovery management.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.