How Intelligent Storage Controllers Have Revolutionized the Industry

The storage of data (and lots of it) is a continued business demand. The storage industry is evolving to keep pace.

The data center environment continues to evolve. Current market and business demands now revolve around cloud computing, more devices, and a focus on the end-user computing experience. Large or small, infrastructure is what keeps organizations operational. Within the data center, numerous technologies work together to deliver services to remote sites, branches and end users. A major part of this environment has always been the storage component.

Over the past few years, the storage controller has advanced far beyond a device that only handles storage needs. With more cloud adoption and IT consumerization, managing data, space and future storage requirements has become a greater challenge. So, as other technologies evolved, storage did as well. With modern storage appliances, organizations are able to do far more than ever before. In effect, storage has helped revolutionize how we work with and control data. Remember, resources are still expensive, so why not deploy intelligent technologies that not only optimize but also scale?

  • Logical storage segmentation/multi-tenancy. As organizations grow, many will develop regional departments or branch offices. In the past, administrators would have to deploy a new storage controller to each location, even if it needed only a small amount of non-replicated storage. Now, modern controllers can be logically split to deliver “virtual storage” slices to various departments. Unlike simple storage provisioning, the branch administrator receives a graphical user interface (GUI) and a “virtual controller.” To them, it looks like they have their own physical unit. In reality, there is a main storage cluster with multi-tenancy enabled. The primary admin can see all of these slices, but each branch administrator sees only the slice they are provided. Those private instances can be controlled, configured, and deployed without impacting the main unit.
  • Storage thin provisioning. Storage utilization and provisioning have always been a challenge for organizations. With virtualization placing many more workloads onto a shared storage environment, organizations needed a way to better control data. That need drove the technology around thin provisioning. Thin provisioning allocates blocks of data on demand rather than allocating all the blocks up front, as in traditional (thick) provisioning. Using this type of storage-optimized solution, administrators are able to eliminate almost all whitespace within the array. Not only does this help avoid poor utilization rates, sometimes as low as 10 to 15 percent, it also improves storage capacity efficiency. Effectively, organizations can acquire less storage capacity up front and then defer storage capacity upgrades in line with actual business usage. From an administrative perspective, this can reduce data center operating costs, such as the power and floor space normally associated with keeping large amounts of unused disks spinning and operational.
  • Connecting to the cloud. No core data center function can escape the demands of the cloud, and that includes storage technologies. With more systems connecting into the cloud, storage technologies have adapted around virtualization, cloud computing, and even big data. There isn’t any one major cloud-related storage advancement; rather, numerous new features and technologies have surfaced that directly optimize, secure and manage cloud-based workloads. For example, solid-state and flash storage arrays have been growing in number for high-IOPS workloads. Technologies like VDI require additional resources to allow hundreds or even thousands of desktops to operate optimally. Another example is geo-fencing of data and storage. In creating regulatory-compliant storage environments, organizations can now fully control where their data goes and where its borders lie. Not only does this help with file sharing, it helps companies control how their data lives in a public or private cloud scenario.
  • Controlling big data. It didn’t take long for storage vendors to jump on the “big data” bandwagon. The big picture here is that data, and the utilization of data, will continue to grow. Storage vendors like EMC and NetApp took proactive approaches in partnering and deploying intelligent systems capable of supporting big data initiatives. For example, NetApp’s Open Solution for Hadoop delivers a ready-to-deploy, enterprise-class infrastructure for Hadoop so businesses can control and gain insights from their data. Furthermore, by partnering with server makers, storage vendors are now able to deploy validated reference architectures that provide reliable Hadoop clusters, seamless integration of Hadoop with existing infrastructure, and analysis of any kind of structured or unstructured data. From EMC’s perspective, its Isilon scale-ready platform for Hadoop combines EMC Isilon scale-out network-attached storage (NAS) and EMC Greenplum HD. In working with these types of technologies, organizations can run a powerful data analytics engine on a flexible, efficient data storage platform.
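The multi-tenant “virtual controller” idea above can be sketched in a few lines of code. This is a minimal, illustrative model only (the class and method names are hypothetical, not any vendor’s API): a single cluster is logically split into slices, the primary admin sees everything, and each branch admin sees only their own slice.

```python
# Illustrative sketch of storage multi-tenancy (hypothetical names,
# not a real controller API): one physical cluster, many logical slices.

class StorageCluster:
    def __init__(self, total_gb):
        self.total_gb = total_gb
        self.slices = {}                      # tenant name -> capacity in GB

    def create_slice(self, tenant, gb):
        # Carve a logical slice out of the cluster's remaining capacity.
        if gb > self.free_gb():
            raise ValueError("insufficient capacity on the cluster")
        self.slices[tenant] = gb

    def free_gb(self):
        return self.total_gb - sum(self.slices.values())

    def view(self, admin):
        # The primary admin sees every slice; a branch admin sees only theirs.
        if admin == "primary":
            return dict(self.slices)
        return {admin: self.slices[admin]} if admin in self.slices else {}


cluster = StorageCluster(total_gb=10_000)
cluster.create_slice("branch-east", 2_000)
cluster.create_slice("branch-west", 1_500)

print(cluster.view("primary"))      # primary admin: both slices visible
print(cluster.view("branch-east"))  # branch admin: only its own slice
```

The key design point mirrors the article: isolation is logical, not physical, so slices can be created or reconfigured without touching the main unit.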

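Thin provisioning’s on-demand allocation, described in the list above, can also be modeled simply. This is a sketch under stated assumptions (a toy in-memory volume, not any array’s implementation): the volume advertises a large logical size but consumes physical space only for blocks that are actually written.

```python
# Minimal sketch of thin provisioning (illustrative only): physical blocks
# are allocated on first write, not when the volume is created.

class ThinVolume:
    def __init__(self, logical_blocks):
        self.logical_blocks = logical_blocks  # size advertised to the host
        self.allocated = {}                   # block number -> data, on demand

    def write(self, block, data):
        if not 0 <= block < self.logical_blocks:
            raise IndexError("block outside logical volume")
        self.allocated[block] = data          # physical space consumed only here

    def read(self, block):
        # Unwritten blocks read back as zeros, as on a real array.
        return self.allocated.get(block, b"\x00")

    def utilization(self):
        # Fraction of advertised capacity actually backed by physical storage.
        return len(self.allocated) / self.logical_blocks


vol = ThinVolume(logical_blocks=1_000_000)    # advertise ~1M blocks...
for b in range(100_000):                      # ...but the host writes only 10%
    vol.write(b, b"\xff")

print(f"physical utilization: {vol.utilization():.0%}")
```

This is why thin provisioning lets organizations buy less capacity up front: the 90 percent of blocks never written consume no physical storage until the business actually needs them.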
With so many vendors pushing hard to advance the storage market, this list could easily be much longer. Market trends clearly indicate growth in the consumer market as well as within business organizations. That means more endpoints, many more users and a lot more data. Furthermore, high-resource workloads demand smarter storage solutions that work to prevent bottlenecks.

In creating your data center, always plan around core components which are driving technological advancement. This means deploying scalable servers, solid networking components, and an intelligent storage system which can control growing data demands. As the market continues to push forward, administrators will need to work with storage solutions which meet business requirements both now and in the future.

For more on storage news and trends, bookmark our Storage Channel.

About the Author

Bill Kleyman is a veteran, enthusiastic technologist with experience in data center design, management and deployment. His architecture work includes virtualization and cloud deployments as well as business network design and implementation. Currently, Bill works as the National Director of Strategy and Innovation at MTM Technologies, a Stamford, CT based consulting firm.
