How Does Data Fabric Fit With Software-Defined Storage, and What Can It Do For You?

Infrastructure impacts applications’ performance, cost and scalability, yet few people grasp the importance of selecting the right type of platform.

Marc Fleischmann is CEO and Founder of Datera.

Infrastructure matters. The power of big data hinges upon its accessibility, so when information is siloed, the value of analytics is curtailed. Unfortunately, this is an all-too-common occurrence for companies that rely on traditional monolithic storage solutions, which have fixed boundaries. Infrastructure impacts applications’ performance, cost and scalability, yet few people grasp the importance of selecting the right type of platform.

At one time, storage was straightforward: integrated hardware and software delivered a clearly defined service with pre-defined capacity, performance and cost. But with the advent of the cloud, data proliferated at unprecedented rates and created exciting new possibilities. Like software-defined networking before it, software-defined storage has taken off over the past few years. Where networking entails moving data from place to place, storage requires preserving data: its quality, reliability and endurance. Software-defined storage brings stored information to life, sorting it, organizing it and automating retrieval processes.

There are many implementations of software-defined storage. Most recently, however, hyper-converged solutions and scale-out distributed systems (or data fabrics) have driven most of the use cases. Hyper-converged solutions have the benefit of being simple and turnkey, focused on supporting virtual machines and targeted at small to medium-size deployments. Data fabrics, on the other hand, provide a wide spectrum of capabilities, add scale, support adaptive policies and morph as storage requirements evolve. The latter is more efficient, since it scales compute and storage independently, while the former is simpler, since it packages compute and storage scaling together.

Unlike traditional monolithic storage systems, a data fabric is agile, conforming to evolving application needs. As a result, companies can access data more readily, spend resources more sustainably and deploy their applications faster. According to Forrester, data fabric can help enterprise architects “accelerate their big data initiatives, monetize big data sources, and respond more quickly to business needs and competitive threats.”

Here are some of the major ways a data fabric builds on software-defined storage, how it differs from traditional data storage, and how it impacts IT.

  1. Rapid scalability. The average application can take months to deploy, and deployment is often more of a bottleneck than development. A data fabric accelerates the process by automatically molding storage around the application. Removing the manual steps saves IT personnel time and dramatically shortens time to value, so companies that use an elastic data fabric can bring their applications to more users, faster.
  2. Intent-defined. Unlike traditional storage, data fabrics adapt to applications’ specific requirements, learning the capabilities of the underlying infrastructure and intelligently matching them to application intent through targeted placement. This enables multi-tenant deployments with performance guarantees.
  3. Infrastructure-as-code. Because a data fabric is built from code, it gives architects the same flexibility that programming affords developers. Users don’t have to handcraft infrastructure for their applications; it is composed automatically and continuously, so developers, applications and tenants can access storage the instant they need it (see the sketch after this list).
  4. Minimized costs. Rather than making fixed capital commitments, enterprise data fabric users pay only for what their application actually needs, a number that changes in real time. Data fabrics run on commodity hardware, so they are far less expensive than the alternatives. Moore’s Law, articulated by Intel co-founder Gordon Moore, expresses the fundamental driving force of the IT industry: the number of transistors on an integrated circuit doubles roughly every two years. For businesses buying traditional monolithic storage systems, that means investing in expensive hardware that quickly becomes obsolete. Data fabrics let companies run software on easily replaceable commodity hardware, ensuring that they always get the best value at the lowest price. Many organizations opt for a data fabric precisely because of this flexibility and affordability.
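
To make intent-defined provisioning (point 2) and infrastructure-as-code (point 3) concrete, here is a minimal Python sketch of how a fabric might match a declared application intent to the nodes that can honor it. The StorageIntent and DataFabric classes, their fields and the placement heuristic are hypothetical illustrations of the concept, not Datera's actual API.

```python
# Hypothetical sketch of intent-defined, code-driven storage provisioning.
# The classes and placement heuristic are illustrative, not a product interface.

from dataclasses import dataclass


@dataclass
class StorageIntent:
    """Declares what the application needs, not how to build it."""
    app_name: str
    capacity_gb: int
    min_iops: int   # performance floor the fabric must guarantee
    replicas: int   # durability requirement
    tenant: str     # isolation boundary for multi-tenant deployments


class DataFabric:
    """Toy model of a fabric that matches intent to available infrastructure."""

    def __init__(self, nodes):
        # Each node advertises its current capabilities (free GB, free IOPS).
        self.nodes = nodes

    def provision(self, intent: StorageIntent):
        # Keep only the nodes that can honor the declared intent...
        candidates = [
            n for n in self.nodes
            if n["free_gb"] >= intent.capacity_gb
            and n["free_iops"] >= intent.min_iops
        ]
        if len(candidates) < intent.replicas:
            raise RuntimeError("fabric cannot satisfy intent; add capacity")
        # ...then place replicas on the best matches (here: most free IOPS).
        chosen = sorted(candidates, key=lambda n: n["free_iops"], reverse=True)
        return [n["name"] for n in chosen[: intent.replicas]]


# Usage: the application declares intent; the fabric composes the storage.
fabric = DataFabric(nodes=[
    {"name": "flash-01", "free_gb": 2000, "free_iops": 50000},
    {"name": "flash-02", "free_gb": 1500, "free_iops": 40000},
    {"name": "disk-01", "free_gb": 8000, "free_iops": 5000},
])
intent = StorageIntent(app_name="orders-db", capacity_gb=500,
                       min_iops=20000, replicas=2, tenant="team-a")
print(fabric.provision(intent))   # e.g. ['flash-01', 'flash-02']
```

The design point worth noting: the application states what it needs (capacity, a performance floor, durability) and the fabric decides how to compose it from whatever hardware is available. That inversion is what separates a data fabric from manually provisioned monolithic storage.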

Overall, a data fabric platform automates storage provisioning for applications, significantly simplifying consumption compared with legacy systems, where provisioning must be done manually. It is faster and more adaptive, which lets enterprise IT teams focus on building and improving the applications themselves. Data fabric technology is likely to become the data center solution of choice for the majority of enterprises within the next few years, a competitive advantage that will set IT-savvy companies miles ahead.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
