Software-Defined Storage: What Does It Really Mean?

We’ve covered the concept of software-defined technologies (SDx) and shown how these very real technologies can help your data center expand. If you haven’t seen our recent SDx Guide, make sure to take a look, because it covers powerful solutions that directly impact how your organization controls and distributes data.

With that in mind, software-defined storage has begun to make an interesting impact in the data center and cloud world. Already, SDx helps bring data centers closer together, but what can it really do for the storage component?

Let’s take a look at the big picture for a second. The traditional data center model, dating back just a few years, revolved heavily around physical infrastructure. We didn’t have virtualization or the concept of the cloud as we know it today, and as a result we ran into hardware sprawl issues around servers, racks, and other equipment. Virtualization helped sort that out.

Still, resource demands kept growing, and that growth spread to other pieces of the data center – specifically storage. Just how many more disks could you buy? How many more physical controllers would you really need to handle an influx of cloud, virtualization, and users? At some point, a logical layer would have to be introduced to help the storage component operate more efficiently.

And so software-defined storage began to emerge. The idea here isn’t to take away from the storage controller; rather, it’s to direct data traffic much more efficiently at the virtual layer. The power really kicks in because software-defined storage creates a much more hardware-agnostic platform to work with. So what does the technology look like?

[Figure: Software-Defined Storage]

Got the visual? Now let’s break it down.

Logical Storage Abstraction: Basically, you’re placing a powerful virtual layer between data requests and the physical storage components. This layer allows you to manipulate how and where data is distributed. The great part here is that you can keep a heterogeneous storage infrastructure while controlling the entire process from a virtual instance. You can present as many storage repositories as you like to the software-defined storage layer and allow that instance to control data flow.
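
To make the abstraction concrete, here is a minimal Python sketch of the idea, using purely illustrative class and method names (this is not any vendor’s API): heterogeneous backends are registered with one virtual layer, and every read and write goes through that layer, which decides where the data actually lands.

```python
# Minimal sketch of logical storage abstraction. All names are illustrative.
from abc import ABC, abstractmethod


class StorageBackend(ABC):
    """One physical repository: an array, a shelf, a filer, a cloud bucket."""

    def __init__(self, name: str):
        self.name = name
        self._objects = {}  # key -> bytes, standing in for real media

    @abstractmethod
    def describe(self) -> str:
        ...

    def write(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def read(self, key: str) -> bytes:
        return self._objects[key]


class SanArray(StorageBackend):
    def describe(self) -> str:
        return f"SAN array '{self.name}'"


class NasFiler(StorageBackend):
    def describe(self) -> str:
        return f"NAS filer '{self.name}'"


class SoftwareDefinedLayer:
    """The virtual layer: callers see one namespace, never a specific array."""

    def __init__(self):
        self._backends = []
        self._placement = {}  # key -> backend that holds it

    def register(self, backend: StorageBackend) -> None:
        self._backends.append(backend)

    def write(self, key: str, data: bytes) -> None:
        # Simple round-robin placement; a real SDS controller would apply
        # policy here (tier, locality, replication) instead.
        target = self._backends[len(self._placement) % len(self._backends)]
        target.write(key, data)
        self._placement[key] = target

    def read(self, key: str) -> bytes:
        return self._placement[key].read(key)

    def where(self, key: str) -> str:
        return self._placement[key].describe()


if __name__ == "__main__":
    layer = SoftwareDefinedLayer()
    layer.register(SanArray("array-a"))
    layer.register(NasFiler("filer-b"))

    layer.write("vm-disk-001", b"boot image")
    layer.write("vm-disk-002", b"data disk")

    # The caller never chose a backend; the abstraction layer did.
    print(layer.read("vm-disk-001"), "- served by", layer.where("vm-disk-001"))
```

The point of the sketch is the single write/read entry point: the consumer of storage never needs to know which array is behind it, which is what lets the heterogeneous infrastructure stay in place.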

Intelligent Storage Optimization: Just because you have a logical storage control layer doesn’t mean you can’t still utilize the efficiencies of your existing storage. The software-defined storage component helps you push information to a specific type of repository. You’re able to define performance and capacity pools and deliver information to the appropriate storage type. However, your actual controllers can still handle thin provisioning, deduplication, and more. The power is in the flexibility of this solution: you can present an entire array to the software-defined layer, or just a shelf or two.
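
A rough sketch of what that policy-driven placement could look like, again in illustrative Python: latency-sensitive data goes to a performance pool, everything else to a capacity pool. The pool names, tier labels, and the simple hot/cold rule are assumptions made for the example, not the behavior of any particular product.

```python
# Illustrative tiering policy: route writes to a performance or capacity pool.
from dataclasses import dataclass, field


@dataclass
class Pool:
    name: str
    tier: str                          # "performance" (e.g. SSD) or "capacity"
    objects: dict = field(default_factory=dict)


def place(pools: list, key: str, data: bytes, hot: bool) -> Pool:
    """Send hot, latency-sensitive data to the performance tier,
    everything else to the capacity tier."""
    wanted = "performance" if hot else "capacity"
    pool = next(p for p in pools if p.tier == wanted)
    pool.objects[key] = data
    return pool


if __name__ == "__main__":
    pools = [Pool("ssd-pool", "performance"), Pool("sata-pool", "capacity")]

    place(pools, "oltp-log", b"...", hot=True)       # database logs -> SSD pool
    place(pools, "backup-2014", b"...", hot=False)   # cold backups -> capacity pool

    for p in pools:
        print(p.name, list(p.objects))
```

Note that nothing in this layer replaces the array’s own features; thin provisioning and deduplication would still happen on the controllers underneath.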

Creating a More Powerful Storage Platform: This hybrid storage model allows you to leverage the power of your physical infrastructure as well as your virtual one. You’re able to create one logical control layer that manages all of the physical storage points in your data center. That supports storage diversification and helps prevent vendor lock-in. Logical storage abstraction also simplifies migrating and moving data between storage arrays and between various underlying resources.
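
Because the volume-to-array mapping lives in the logical layer, migration between arrays can amount to copying the bytes and flipping that mapping while consumers keep reading through the same handle. The sketch below shows that idea with hypothetical names (Volume, LogicalLayer, migrate); it is a simplification, not a real migration engine.

```python
# Illustrative non-disruptive migration through the logical layer.
class Volume:
    def __init__(self, name: str, data: bytes):
        self.name = name
        self.data = data


class LogicalLayer:
    def __init__(self):
        self.location = {}   # volume name -> pool name
        self.pools = {}      # pool name -> {volume name -> Volume}

    def add_pool(self, pool_name: str) -> None:
        self.pools[pool_name] = {}

    def create(self, pool_name: str, vol: Volume) -> None:
        self.pools[pool_name][vol.name] = vol
        self.location[vol.name] = pool_name

    def read(self, vol_name: str) -> bytes:
        return self.pools[self.location[vol_name]][vol_name].data

    def migrate(self, vol_name: str, target_pool: str) -> None:
        """Copy the volume to the target pool, then flip the mapping."""
        source_pool = self.location[vol_name]
        vol = self.pools[source_pool].pop(vol_name)
        self.pools[target_pool][vol.name] = vol
        self.location[vol_name] = target_pool


if __name__ == "__main__":
    layer = LogicalLayer()
    layer.add_pool("vendor-a-array")
    layer.add_pool("vendor-b-array")

    layer.create("vendor-a-array", Volume("app-vol", b"payload"))
    print(layer.read("app-vol"))          # served from vendor A

    layer.migrate("app-vol", "vendor-b-array")
    print(layer.read("app-vol"))          # same call, now served from vendor B
```

The application keeps calling read("app-vol") before and after the move, which is the property that makes mixing and replacing arrays from different vendors practical.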

About the Author

Bill Kleyman is a veteran, enthusiastic technologist with experience in data center design, management, and deployment. His architecture work includes virtualization and cloud deployments as well as business network design and implementation. Currently, Bill works as the National Director of Strategy and Innovation at MTM Technologies, a Stamford, CT-based consulting firm.

One Comment

  1. A single controller architecture is typically where scale and functionality get limited. What if you had a distributed, scale-out controller architecture that could front-end already-purchased storage and provision commodity hardware as well, all from a single interface? I'm glad startups are looking at the storage problem in new ways; eventually, the consumer will win in terms of cost, features, and simplicity.