Points to Consider Before Buying a Data Protection Solution

In the final part of this series on data protection, Jarrett Potts of STORServer discusses how making the right licensing decision can save you money, how to scale data protection, why to set different policies for different data and the role of unified recovery management.

Jarrett Potts is director of strategic marketing for STORServer, a provider of data backup solutions for the mid-market. Before joining the STORServer team, Potts spent 15 years working in various capacities for IBM, including Tivoli Storage Manager marketing and technical sales. He has been an evangelist for the TSM family of products since 2000.

JARRETT POTTS
STORServer

In the second part of our series, we discussed the importance of finding a solution that’s easy to use, treating data differently, eliminating the burden of virtual machine backups and using built-in data reduction technologies.

In part three of our series, we will discuss how making the right licensing decision can save you money, how to scale data protection, why to set different policies for different data and the role of unified recovery management.

License Correctly and Save Money

Can two of the same things have two different prices? Absolutely. Not only can identical products carry different prices, but the difference can be dramatic.

In the last few years, data protection solution providers have started to offer licensing models beyond "core-based" or "server-based." When buying software, consider all the options. One of the newest is capacity-based licensing.

Look for a company that offers pricing options that let users pay for solutions in the manner that makes the most financial sense. Traditional licensing models were based on the number and power of processor cores in the servers being protected. That model has cost advantages for organizations with relatively large amounts of data and a small number of servers, or for organizations with other software products licensed the same way.

Some vendors also offer a capacity-based licensing option that lets organizations pay for the software based on the amount of data being protected. This model has cost advantages for organizations with a relatively large number of servers, and it eliminates licensing cost surprises when servers are added or cores are upgraded. The software should also include tools to help the organization make accurate budget forecasts.

The capacity-based model has particular value in infrastructures with multiple applications that require data protection solutions. Under the capacity-based model, these solutions are included at no additional cost. Also, using advanced features, such as data deduplication, which reduces the amount of data being protected, can decrease the amount of data being measured against the license cost.

When looking for a backup and recovery or archive and retrieve solution, ask the vendor to show all the different ways it can be licensed. Ask for a price for all options, and then make an informed decision.
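To see why the same environment can carry two very different price tags, here is a back-of-the-envelope comparison of the two models. All prices and environment figures below are hypothetical, made up purely for illustration; substitute the actual quotes your vendor provides.

```python
# Hypothetical license-cost comparison: core-based vs. capacity-based.
# All numbers below are illustrative assumptions, not real vendor pricing.

PRICE_PER_CORE = 500   # assumed cost per processor core (core-based model)
PRICE_PER_TB = 800     # assumed cost per protected terabyte (capacity model)

def core_based_cost(servers: int, cores_per_server: int) -> int:
    """Cost under core-based licensing: pay for every core protected."""
    return servers * cores_per_server * PRICE_PER_CORE

def capacity_based_cost(protected_tb: float, dedup_ratio: float = 1.0) -> float:
    """Cost under capacity-based licensing.

    Deduplication shrinks the data measured against the license,
    so a 2:1 dedup ratio halves the billable capacity.
    """
    return (protected_tb / dedup_ratio) * PRICE_PER_TB

# Environment A: few powerful servers, lots of data -> core-based wins.
print(core_based_cost(servers=4, cores_per_server=16))       # 32000
print(capacity_based_cost(protected_tb=100, dedup_ratio=2))  # 40000.0

# Environment B: many small servers, modest data -> capacity-based wins.
print(core_based_cost(servers=60, cores_per_server=8))       # 240000
print(capacity_based_cost(protected_tb=80, dedup_ratio=2))   # 32000.0
```

The point of the exercise is not the exact figures but the crossover: which model wins depends entirely on the shape of your environment, which is why you should price out every option before deciding.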

Scalability: You Grow, It Grows

A major concern for IT organizations, in terms of both storage and data protection, is how the solution will keep pace with data growth.

If a business has grown capacity by 40 to 60 percent in each of the past three years and now supports billions of data objects, it needs a solution that will grow with it. That growth may be outpacing the current data protection solution, creating a need to scale the protection.

This growth can be handled, but the scaling must be done in a logical manner. There are three ways to do this:

  • Scale out: New hardware and software are added to handle the load from growth. This involves a significant investment in new resources.
  • Scale up: Users add new software on existing servers, such as a second copy of an application on existing hardware. While this cuts the cost of new hardware, it assumes the existing hardware can handle the load.
  • Scale in: If users can find a data protection solution that grows as they grow without additional resources, they have hit pay dirt. There is usually no additional investment involved; however, users must have 20/20 foresight.

A solution should run on hardware that can grow into the future, with software that has a proven ability to grow at the same pace as the company, or faster.

When choosing a product for data protection, decide up front if you want to scale up, out or in. That up-front decision will dramatically change the amount spent down the road in years three and five.
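A simple compound-growth projection makes the years-three-and-five question concrete. The starting capacity and growth rate below are hypothetical; plug in your own numbers to see what any candidate solution would actually have to protect.

```python
# Rough illustration of projecting data growth before choosing how to scale.
# The 50 TB starting point and 50% annual growth rate are assumptions only.

def projected_capacity(current_tb: float, annual_growth: float, years: int) -> float:
    """Capacity after `years` of compound growth at `annual_growth` per year."""
    return current_tb * (1 + annual_growth) ** years

start_tb = 50.0  # hypothetical capacity protected today
for year in (3, 5):
    print(f"Year {year}: {projected_capacity(start_tb, 0.50, year):.0f} TB")
# Year 3: 169 TB
# Year 5: 380 TB
```

At 50 percent annual growth, capacity more than septuples in five years, which is why the scale-out, scale-up or scale-in decision made up front dominates the total cost down the road.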

Not All Data is Created Equal. Stop Treating It That Way.

IT organizations can drive up the cost of storage unnecessarily by treating all data the same and storing it all on the same media. Let's face it: a resume is not as important as the payroll database or even the email database. So why do IT folks use the same storage policy for both?

Stop using one policy to rule all data. It might be simple, but it will kill the bottom line. Find a data protection solution that allows policies to treat different data differently.

Important data should be prioritized as “tier one” and get backed up the most often and most quickly. Perhaps that data can stay on disk for fast restore.

Everything else that is not business critical can be considered "tier two" and sent directly to tape for storage. Truly disposable "junk data," such as duplicate photos or temp files, can simply be deleted.

Tier two data is a great target for hierarchical storage management (HSM), which allows organizations to store data on different tiers based on specific policies and enables administrators to migrate and store data on the most appropriate tier. For example, older and less-frequently accessed data can be moved to a slower, less-expensive storage platform, such as tape, leaving more expensive disk storage available for more high-value data.

A data protection solution should help reduce costs by providing automated, policy-based data life-cycle management, moving data to the most cost-effective tier of storage while still meeting service level requirements. This helps ensure recovery objectives are met and transparent data access is achieved. Automated data archiving also helps organizations ensure compliance with data retention policies and reduces associated costs.
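The tiering idea above can be sketched as a small policy table. This is a minimal illustration only; real products express these policies through their own management consoles, and every data class, frequency and pool name below is an assumption.

```python
# Minimal sketch of policy-based data tiering (illustrative assumptions only).

from dataclasses import dataclass

@dataclass
class Policy:
    tier: str            # "tier1" = fast disk, "tier2" = tape
    backup_hours: int    # how often the data is backed up, in hours
    retention_days: int  # how long copies are kept
    target: str          # storage pool the copies land on

# Hypothetical data classes mapped to different policies,
# instead of one policy ruling all data.
POLICIES = {
    "payroll_db":  Policy("tier1", backup_hours=4,   retention_days=365, target="disk-pool"),
    "email_db":    Policy("tier1", backup_hours=12,  retention_days=365, target="disk-pool"),
    "file_shares": Policy("tier2", backup_hours=24,  retention_days=90,  target="tape-pool"),
    "temp_files":  Policy("tier2", backup_hours=168, retention_days=7,   target="tape-pool"),
}

def placement(data_class: str) -> str:
    """Describe where and how often a given class of data is protected."""
    p = POLICIES[data_class]
    return f"{data_class}: every {p.backup_hours}h to {p.target} (keep {p.retention_days}d)"

print(placement("payroll_db"))  # payroll_db: every 4h to disk-pool (keep 365d)
print(placement("temp_files"))  # temp_files: every 168h to tape-pool (keep 7d)
```

The design point is that the policy, not the administrator, decides where each class of data lives; an HSM engine then migrates data between the disk and tape pools automatically as it ages.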

Recovery: A Unified Approach

Is your organization using different products to protect different types of data or different systems? If so, start thinking about standardizing on a single product. Think of all the time, training and resources that will be saved.

Unified recovery management (URM) brings under one user interface the ability to manage data protection throughout the business, supporting different applications and types of data on multiple operating systems in various locations and with diverse policies and backup requirements. From a single point, administrators can manage multiple data protection and recovery tools, including diverse solutions that are dedicated to different tasks. It helps eliminate the costs and complexities associated with deploying and managing multiple point solutions.

When looking for a company that provides data protection, look for one that simplifies and streamlines storage management, helping organizations control both the risks and costs of data protection and recovery. With fewer "moving parts" to manage across the various solutions in operation, administrators can ensure faster, more reliable backup and recovery processes. Look also for built-in replication for highly available disaster recovery, which helps reduce downtime and the business costs that can result. These process improvements contribute to higher levels of service, making it easier for organizations to meet service level agreements.

The ability for a single person with limited specialist knowledge to manage an entire business's data protection solution is important. With a unified approach, IT staff gain the ability to be nimble and forward-thinking. No longer stuck in "reactive mode," they start to operate in "proactive mode." This is important because it saves time and money.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
