New logical technologies are helping create operational efficiencies at all layers of the data center model, and new terms and technological concepts continue to be born out of gaps in the cloud deployment model.
This has led to the introduction of software-defined technologies (SDx), which abstract a number of different services to improve cloud and data center performance. But this also causes a bit of confusion: Where does this technology fit in? Is it really complicated? What does it all really mean to me?
To help simplify the many facets of the software-defined revolution, here is your SDx dictionary, which provides explanations and examples of the many ways in which software is redefining how data center and cloud infrastructure is managed.
Software-Defined Networking (SDN)
Because we have so many new connection points, it became necessary to create a better system to control the flow of traffic. Traditional networking equipment focused too heavily on the physical layer, where connections were required to accomplish the job. When cloud became a more widely used platform, it became necessary to abstract that physical layer. Now, we’re capable of controlling traffic that traverses the WAN entirely at the software layer. This means network automation, optimization, and efficiency are no longer dependent on the physical infrastructure. VMware’s NSX, for example, creates a new model for how network traffic is controlled at the virtual layer. This introduces the capability to program, provision, and better manage both virtual and physical resources within the environment.
It’s important to note that SDN is happening at the physical layer as well. Cisco’s NX-OS creates a modular, building-block approach to the networking layer. Deployed across the entire switching stack, this network operating system controls resiliency, virtualization, efficiency, and even extensibility at the logical layer. This type of intelligence can help dynamically route traffic during peak times or even during outages. Not only is the physical layer being utilized to its fullest efficiency, administrators are also able to create network flow automation policies to ensure continuous availability for both critical and standard workloads.
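The core idea of steering traffic in software, independent of the hardware underneath, can be sketched as a small controller that picks a path based on link health and load. This is a hypothetical illustration; the path names, status map, and threshold are invented for the example and are not NSX or NX-OS code.

```python
# Minimal sketch of software-defined path selection (hypothetical,
# not vendor code): the controller holds a logical view of the links
# and picks a path in software, so rerouting around an outage is a
# policy decision rather than a hardware reconfiguration.

def choose_path(paths, link_status, link_load, peak_threshold=0.8):
    """Return the first candidate path whose links are all up and under load."""
    for path in paths:
        if all(link_status[link] == "up" and link_load[link] < peak_threshold
               for link in path):
            return path
    return None  # no healthy path; a real controller would raise an alert

paths = [["core-1", "edge-a"], ["core-2", "edge-b"]]
status = {"core-1": "down", "edge-a": "up", "core-2": "up", "edge-b": "up"}
load = {"core-1": 0.2, "edge-a": 0.3, "core-2": 0.5, "edge-b": 0.4}

# core-1 is down, so traffic shifts to the second path purely in software
best = choose_path(paths, status, load)
```

The same decision logic also covers the peak-traffic case: raising a link’s load past the threshold steers new flows away from it without touching the physical topology.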
Software-Defined Storage
This has become a very interesting approach to controlling the storage layer. Much like servers and desktops, storage has experienced a physical infrastructure boom, and there came a point where storage management had to become more efficient. With that came the concept of software-defined storage: a virtual layer that sits in front of all storage components to control and distribute incoming requests to the appropriate storage pool. Atlantis ILIO USX, for example, creates a virtual layer into which any storage controller can be inserted. With that, you can point DAS, flash, SSD, spinning disk, and even RAM to the USX appliance as storage pool repositories.
From there, the software-defined storage system will intelligently push appropriate traffic to the appropriate pool. For example, archive data might be sent to less expensive storage while VDI requests are sent to a flash array. Similarly, VMware’s Virtual SAN, aims to aggregate both compute and storage resources directly from VMware vSphere hosts to create a simpler and better managed infrastructure. VMware introduces Storage Policy Based Management (SPBM) where administrators can now create intelligent storage policies aimed at availability and the enhancement of other virtual services. In creating that virtual layer, storage provisioning, scaling, and performance become direct benefits for the entire virtual infrastructure.
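The tiering decision described above — archive data to inexpensive disk, latency-sensitive VDI traffic to flash — can be sketched as a simple placement rule in the virtual layer. The pool names and rules here are invented for illustration and do not reflect any particular product’s internals.

```python
# Hypothetical sketch of software-defined storage tiering: the virtual
# layer inspects each incoming request and directs it to a suitable
# pool (flash for VDI, spinning disk for archives, RAM for caching).

POOLS = {"flash": [], "spinning_disk": [], "ram": []}

TIER_RULES = {
    "vdi": "flash",          # latency-sensitive desktop workloads
    "archive": "spinning_disk",  # cold data on inexpensive storage
    "cache": "ram",          # hottest data in memory
}

def place_request(workload):
    """Route a storage request to the pool its workload type calls for."""
    tier = TIER_RULES.get(workload["type"], "spinning_disk")  # safe default
    POOLS[tier].append(workload["id"])
    return tier

place_request({"id": "vol-1", "type": "vdi"})      # lands on flash
place_request({"id": "vol-2", "type": "archive"})  # lands on spinning disk
```

Policy-driven placement like this is also the essence of VMware’s SPBM: the policy, not the administrator, decides which physical pool ultimately services each request.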
Software-Defined Security
The concept of software-defined security falls directly in line with next-generation security technologies. Traditional security is simply not enough for today’s diverse infrastructure. The logical layer in the security realm was created to address new challenges around data in the cloud and growing data within the data center itself. Check Point’s Virtual Appliance for Amazon Web Services helps create a direct software-defined security extension from a primary infrastructure into a cloud environment. This means utilizing advanced features spanning an entire WAN infrastructure, including IPS, access controls, DLP, and unified security management.
Similarly, Palo Alto Networks completely abstracted the security layer with their next-generation security operating system, PAN-OS. These virtual appliances can sit anywhere within the data center to process a variety of security requests. With an intelligent security operating system, administrators are able to utilize next-generation firewall capabilities, such as dynamic address groups, complete virtual machine monitoring, the creation of security policies that instantly sync with virtual workload creation, and a unified security management platform.
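The idea of policies that sync with workload creation can be sketched as a dynamic address group: firewall rules are keyed to tags rather than IP addresses, so a newly created VM inherits policy the moment it registers. The group names, tags, and rule strings below are invented for illustration, not PAN-OS configuration.

```python
# Illustrative sketch (not vendor code) of a dynamic address group:
# policy is bound to tags, so when a workload is created or re-tagged,
# the matching firewall rules apply without a manual rule edit.

address_groups = {"web-servers": set(), "db-servers": set()}
policies = {"web-servers": "allow tcp/443", "db-servers": "allow tcp/5432"}

def register_vm(name, tags):
    """Sync group membership from VM tags at creation time;
    return the policies the new workload immediately inherits."""
    applied = []
    for tag in tags:
        if tag in address_groups:
            address_groups[tag].add(name)
            applied.append(policies[tag])
    return applied

rules = register_vm("vm-web-01", ["web-servers"])
# the new VM is already covered by the web-server policy — no rule edits
```

The payoff is that security keeps pace with provisioning: in a cloud where workloads appear and disappear in minutes, addresses change constantly but tags, and therefore policy, stay stable.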
Software-Defined Data Center
This is a very important concept to understand, since many data center and infrastructure shops are adopting the technology. The data center has truly become the home of all modern technologies, with more data being pushed through the data center platform than ever before. To help control critical resources at all layers, data center controls needed to be abstracted. This happened at the virtual layer as well as at the infrastructure layer. VMware’s push around the software-defined data center describes an environment capable of robust performance while maintaining very high resiliency.
Effectively, they strive to completely unify the entire data center stack at the virtual layer to control network, storage, compute, and even management. Along the same lines as the software-defined data center comes the very powerful concept of a data center operating system. The IO.OS from IO Data Centers creates a completely logical layer to manage an entire global data center platform. This means complete visibility into a very distributed data center model, with controls that include integration with big data, various cloud environments, critical APIs, mobile resources, and much more. By creating a software layer to manage the entire data center model, you’re able to create a proactive, intelligent infrastructure capable of real-time visibility and dynamic extensibility.
Software-Defined Infrastructure
Workflow automation has become critical to proactively managing the very dynamic nature of the cloud. Software-defined infrastructure takes into account the concept of hardware and software profiles within a converged system. Cisco UCS, for example, allows an organization to create a “follow-the-sun” data center model where resources can be dynamically re-provisioned based on workload, user location, time of day, and much more. This entire process can span a couple of racks, an entire data center, or many globally distributed data center environments. The great part here is that it’s all designed around intelligent automation policies. Both physical and virtual resources can be allocated based on a variety of critical needs.
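A follow-the-sun policy can be sketched as a scheduler that weights capacity toward whichever region is in business hours. The region names, hour ranges, and 60/40 weighting below are invented assumptions for the example; real systems such as UCS express this through their own profile and policy engines.

```python
# Hypothetical sketch of a "follow-the-sun" allocation policy:
# capacity shifts toward the region whose business day is active.
from datetime import datetime, timezone

REGION_BUSINESS_HOURS = {   # illustrative UTC hour ranges per region
    "apac": range(0, 8),
    "emea": range(8, 16),
    "amer": range(16, 24),
}

def active_region(now=None):
    """Return the region currently inside its business-hours window."""
    hour = (now or datetime.now(timezone.utc)).hour
    for region, hours in REGION_BUSINESS_HOURS.items():
        if hour in hours:
            return region

def allocate(total_hosts, now=None, active_share=0.6):
    """Weight host allocation toward the active region,
    splitting the remainder evenly among the idle regions."""
    active = active_region(now)
    idle = [r for r in REGION_BUSINESS_HOURS if r != active]
    plan = {active: int(total_hosts * active_share)}
    remaining = total_hosts - plan[active]
    for i, region in enumerate(idle):
        plan[region] = remaining // len(idle) + (1 if i < remaining % len(idle) else 0)
    return plan
```

Running the policy on a clock tick (or on a workload trigger) is what turns a static rack layout into the dynamically re-provisioned model described above.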
Software-Defined Cloud
Controlling the cloud layer has become very important. Many organizations are actively looking into solutions that allow them to better manage a very heterogeneous cloud infrastructure. Technologies like Citrix’s CloudPlatform create a truly unified cloud management infrastructure capable of direct cloud elasticity, control, and optimized efficiency. By creating an application-centric platform, administrators are able to reliably orchestrate cloud workloads that span multiple data centers. This creates a turn-key solution that allows your software-defined cloud to span multiple cloud environments, resulting in a powerful hybrid cloud platform that can leverage both existing and new resources.
Similarly, OpenStack allows for a powerful private and public cloud control mechanism built on an open-source platform. The technology from OpenStack allows organizations of all sizes to create dynamic private and public cloud connections spanning the globe. In creating their open-source platform, OpenStack aims to simplify the implementation of cloud environments, as well as provide massively scalable solutions which are feature rich at that software-defined cloud layer.
Controlling resources, data flow, and the overall cloud infrastructure required the abstraction of the physical layer. This had to revolve around the entire process: network, storage, compute, and even the data center itself. Software-defined technologies directly interact with each other as well as with their physical counterparts to create an intelligent environment capable of automation and much greater resiliency.
As cloud and infrastructure multi-tenancy continue to increase, SDx will help proactively control the allocation of critical resources. In many cases, this will dynamically improve both data center performance and the overall user experience. Although there are marketing terms tied to these solutions, remember that there are very real and tangible technologies behind all the buzz. As you build out your next-generation data center model, make sure to look at software-defined technologies and how they can positively impact your overall infrastructure.