David Flynn is Primary Data CTO & Co-founder.
Today, storage systems offer a wide range of capabilities across performance, protection, and price. This diversity gives IT teams unprecedented choice in selecting the ideal storage to meet application needs. However, even with these options, the “right resource” often doesn’t stay that way for long: an estimated 80 percent of an enterprise’s data is cold, yet admins must still work hard to ensure applications don’t suffer from performance problems.
We have systems that are fast, systems that can scale, and systems that provide low-cost capacity, but we do not have a way to ensure all of those different attributes are used effectively to serve evolving application demands while reducing costs in the face of rapidly growing data volumes. Solving the challenge of aligning the right data to the right resource is the next big step taking shape in storage industry innovation. It requires an automated approach that manages data by objectives, which becomes possible when data is virtualized through an enterprise metadata engine, as I wrote about in a previous post on Data Center Knowledge.
A metadata engine separates the data path from the control path so that storage can be unified within a single global namespace. This makes it possible to manage data across all storage according to objectives set by IT or application owners. A metadata engine can then automatically and non-disruptively move data to the right storage to meet these objectives, ensuring desired service levels are always met, which is a crucial step toward the software-defined data center. Let’s examine how managing by objectives can ensure the right data is in the right place at the right time.
Modern Storage Offers Unique Capabilities
Flash, cloud, and shared storage systems each deliver different levels of performance, protection, price, and capacity, which in turn provide different benefits to the business. Server-side flash is very fast, providing low latency and high IOPS, but it is generally considered less reliable and more expensive than shared storage. Shared storage, such as NAS filers and SAN arrays, delivers lower performance and costs less than server-side flash, but offers higher levels of data reliability. Cloud and object storage options feature high capacity density and lower costs for storing cold data.
Set Your Target, Meet it Automatically
IT or application owners can automate data management by setting target objectives for data across the following attributes:
- Performance: IOPS, latency, and bandwidth objectives ensure ideal application performance.
- Protection: Availability, reliability, durability, and security are set to meet application protection requirements.
- Time: Objectives can be based on file activity or inactivity, often in combination with other objectives. For example, objectives can ensure that all active files in a share that have been accessed in the last day are placed on storage that can deliver 10,000 IOPS, 100MB/s of bandwidth, and 0.5ms latency, while all files that have not been accessed in the last 30 days are moved to a preservation tier, such as object or cloud storage.
- Pattern Matching: Objectives can be based on regular-expression pattern matching. For example, admins could set an objective that files matching “.tmp” be stored on the local storage tier, while all other files reside on the shared storage tier.
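To make the objective types above concrete, here is a minimal sketch of how a placement engine might evaluate performance, time, and pattern-matching objectives against a file's metadata. The `Objective` and `FileMeta` records, the tier names, and the `choose_tier` logic are hypothetical illustrations of the idea, not any vendor's actual implementation.

```python
import re
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Objective:
    """Hypothetical objective record; a real metadata engine exposes a richer schema."""
    min_iops: int = 0                     # performance objective
    max_latency_ms: float = float("inf")  # performance objective
    pattern: Optional[str] = None         # regex matched against the file name
    max_idle_days: Optional[int] = None   # time objective: archive if idle longer

@dataclass
class FileMeta:
    name: str
    last_access: datetime

def choose_tier(meta: FileMeta, obj: Objective, now: datetime) -> str:
    """Pick a storage tier for one file by evaluating its objectives in order."""
    # Pattern objective: e.g. keep scratch files on local (server-side) storage.
    if obj.pattern and re.search(obj.pattern, meta.name):
        return "local"
    # Time objective: demote files idle past the threshold to a preservation tier.
    if obj.max_idle_days is not None and \
            now - meta.last_access > timedelta(days=obj.max_idle_days):
        return "archive"
    # Performance objective: hot, latency-sensitive data belongs on flash.
    if obj.min_iops >= 10_000 or obj.max_latency_ms <= 0.5:
        return "flash"
    return "shared"  # default NAS/SAN tier
```

Evaluating objectives in a fixed order is just one simple policy; a production engine would weigh competing objectives, capacity, and cost before moving data.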
Tiering Data By Objectives
With objective-based management, enterprises can tier data across different storage resources according to the different storage capabilities that the data requires. For example, many companies have data that goes cold quickly as it ages, as is the case with cell phone billing data that typically goes cold after a 30- to 45-day billing cycle. At many telecommunications companies, that data is rarely accessed again.
In addition, many applications have cyclic demands. Payroll might need higher performance once a month, but because IT can’t easily move data from a capacity tier to one with higher performance, they typically leave this data on storage that meets its peak needs.
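Time objectives can also capture cyclic demand like the payroll example above. As a minimal sketch (the tier names, run day, and window are assumptions for illustration, not from a real product), a calendar-driven objective might look like:

```python
from datetime import date

def payroll_tier(today: date, run_day: int = 28, window_days: int = 2) -> str:
    """Return the tier payroll data should occupy on a given date.

    Promotes data to the performance tier in a window around the monthly
    payroll run and demotes it to low-cost capacity the rest of the month.
    (A real engine would also handle windows that wrap across month ends.)
    """
    if abs(today.day - run_day) <= window_days:
        return "flash"     # peak demand: month-end payroll run
    return "capacity"      # off-peak: low-cost capacity tier
```

With automated, non-disruptive movement, IT no longer has to park this data permanently on storage sized for its peak.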
With a system that can automate data placement, enterprises can “set and forget” data management objectives and be assured that data will meet its business requirements. If business needs change, a few clicks can realign data to the best resource for the evolving requirements. Storage administrators can even create a service catalog that application administrators can use to assign their own service levels, with set costs per unit of data that reflect actual consumed capacity rather than the high cost of overprovisioned storage.
Objective: Save with the Cloud
When it comes to cloud archival, the challenge is determining what data can be safely archived, and how to move that data once it is identified. Managing data by objectives allows IT to automatically identify data that meets the enterprise’s criteria for cloud archival and move it between the cloud and on-premises storage, as needed.
Many archiving solutions move data using simple rules based on attributes like file creation date. These solutions are error-prone, can impact productivity, and require IT intervention to fix mistakes. Objective-based management makes decisions based on actual client, application, and user access, and can retrieve files automatically if they are needed again.
Companies are beginning to use the cloud as a store for their backup data, but restores can be costly due to the bandwidth charges associated with retrieving data from the cloud provider. Objective-based management can retrieve data granularly, at the file level, making it possible to restore just the file that is needed, without IT intervention, minimizing cloud bandwidth charges.
Automate Agility and Response to Changing Business Needs
Managing data by objectives gives petabyte-scale enterprises the ability to automate the movement of data according to business objectives, from creation to archival, including the integration of public clouds as an active archive. It also automates core management tasks, making it easy for companies to maximize storage efficiency and cost savings while keeping performance and protection at required service levels.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.