The set of factors companies must weigh when devising a data center strategy has changed drastically. On top of the traditional requirements of maintaining uptime and anticipating capacity growth, they now have to account for cloud services, a mobile workforce, and delivery of services at the network edge. This April, we focus on what it means to own an enterprise data center strategy in this day and age.
We’re using more bandwidth, transferring more data, and demanding even richer user experiences. In a distributed cloud environment, connectivity is king.
We must know what we are pushing down the link to ensure optimal performance for the end user. Are we delivering rich media content or just small files? Bandwidth and WAN planning must happen while the cloud environment is being designed, not after it is deployed.
As you look at the future cloud and WAN landscape, there's no slowdown in utilization. According to Cisco, annual global cloud IP traffic will reach 8.6 ZB by the end of 2019, up from 2.1 ZB per year in 2014. In other words, global cloud IP traffic will more than quadruple over that five-year span, and it will account for more than 83 percent of total data center traffic by 2019.
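As a quick sanity check on those figures, the implied annual growth rate can be derived directly from the two Cisco data points cited above:

```python
# Derive the growth factor and implied compound annual growth rate (CAGR)
# from the Cisco forecast figures above: 2.1 ZB/year (2014) to 8.6 ZB/year (2019).
start_zb, end_zb, years = 2.1, 8.6, 5

growth_factor = end_zb / start_zb
cagr = growth_factor ** (1 / years) - 1

print(f"Growth over {years} years: {growth_factor:.2f}x")  # > 4x, i.e. "more than quadruple"
print(f"Implied CAGR: {cagr:.1%}")
```

A sustained growth rate of roughly a third per year is what makes WAN capacity planning, rather than one-time provisioning, the central exercise.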
Now that WAN connectivity has greatly improved, cloud-based offerings have become more attractive. The emergence of the cloud has helped many organizations expand beyond their physical data centers, and new cloud-based technologies allow IT organizations to consolidate and grow their infrastructure quickly and affordably.
When it comes to WAN considerations, administrators must be aware of the type of cloud they are deploying and what they will be hosting. When working with a private, public, or hybrid cloud environment, planning will be the most important deployment step. During the planning phase, engineers and architects will examine how to build out their cloud environment and size it for future growth. By forecasting growth over a span of one, two, and three years, IT managers can be ready for spikes in usage and be prepared for the growth demands of the business. This level of preparedness is called cloud growth agility.
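The one-, two-, and three-year forecasting described above can be sketched as a simple compound-growth projection. The starting capacity and the 40 percent annual growth rate below are illustrative assumptions, not figures from this article:

```python
def project_capacity(current_tb: float, annual_growth: float, years: int) -> list[float]:
    """Project capacity needs assuming compound annual growth."""
    return [current_tb * (1 + annual_growth) ** y for y in range(1, years + 1)]

# Hypothetical environment: 100 TB today, an assumed 40% annual growth rate,
# projected over the one-, two-, and three-year horizons mentioned above.
forecast = project_capacity(100, 0.40, 3)
for year, capacity in enumerate(forecast, start=1):
    print(f"Year {year}: plan for ~{capacity:.0f} TB")
```

Running the projection ahead of time, rather than reacting to utilization alarms, is the essence of what the article calls cloud growth agility.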
Key WAN Considerations
The ability to quickly and efficiently deliver workloads over the WAN is crucial to the success of a distributed, cloud-ready deployment. Special considerations apply depending on the type of cloud or infrastructure; some organizations will run multiple links into their cloud environment for load balancing and high availability. Although each environment has its own needs, a solid set of best practices can be followed for each site type:
- Major cloud data center: This is a central cloud computing environment with major infrastructure components. Hundreds or even thousands of users will be connecting to this type of environment. It can host major workload operations where workers from all over the world will connect and receive their data. The requirements here involve very high bandwidth and very low latency. Recommendations: MPLS, optical circuits, or carrier Ethernet services.
- Branch cloud data center: This is usually a smaller but still sizable cloud environment. This infrastructure would be used to house secondary but vital cloud systems. Here, administrators may be working with a few cloud-delivered workloads which need to be distributed to a smaller number of users. In this type of data center, requirements call for moderate bandwidth availability with the possible need for low latency. Recommendations: MPLS or a carrier Ethernet service.
- Small cloud data center for DR or testing: This is a small cloud data center with only a few components. Small distributed data centers are often used for testing and development or for smaller DR purposes. Requirements in this environment call for low bandwidth but may still need low latency and the option for mobility. Recommendations: MPLS over T1/DSL, broadband wireless options, or internet VPNs.
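The best-practice list above can be captured as a simple lookup table. The profiles below just restate the recommendations from the list; the helper function itself is an illustrative sketch, not a product API:

```python
# Illustrative lookup restating the site-type best practices listed above.
SITE_PROFILES = {
    "major": {
        "bandwidth": "very high",
        "latency": "very low",
        "links": ["MPLS", "optical circuits", "carrier Ethernet"],
    },
    "branch": {
        "bandwidth": "moderate",
        "latency": "low (possible need)",
        "links": ["MPLS", "carrier Ethernet"],
    },
    "small_dr": {
        "bandwidth": "low",
        "latency": "low, with mobility options",
        "links": ["MPLS over T1/DSL", "broadband wireless", "internet VPN"],
    },
}

def recommend_links(site_type: str) -> list[str]:
    """Return the recommended WAN link options for a given site type."""
    return SITE_PROFILES[site_type]["links"]

print(recommend_links("branch"))  # ['MPLS', 'carrier Ethernet']
```

Encoding the site taxonomy this way makes it easy to keep link standards consistent as new sites are added.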
Remember, when working with a distributed environment, site-to-site replication must be a consideration. Administrators must determine what they are pushing across the WAN and what type of connection to use. Where possible, leverage existing replication tooling, which can be integrated into virtualization platforms or WANOP solutions.
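One concrete sizing question for site-to-site replication is whether the link can move the daily change set within the replication window. A rough back-of-the-envelope calculation, with all figures hypothetical:

```python
def required_mbps(daily_change_gb: float, window_hours: float,
                  efficiency: float = 0.7) -> float:
    """Minimum sustained link rate (Mbps) to replicate a change set in a window.

    efficiency discounts protocol overhead and link contention
    (70% is an illustrative assumption, not a standard figure).
    """
    megabits = daily_change_gb * 8 * 1000   # GB -> megabits (decimal units)
    seconds = window_hours * 3600
    return megabits / seconds / efficiency

# Hypothetical: 500 GB of changed data, replicated in an 8-hour overnight window.
rate = required_mbps(500, 8)
print(f"Need roughly {rate:.0f} Mbps of sustained throughput")
```

Numbers like this are what decide whether a branch site needs carrier Ethernet or can live on a T1/DSL-class link.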
WANOP Benefits and Advancements
Many organizations are now leveraging the benefits of network virtualization coupled with WANOP technologies. AT&T, for example, says it will open source the software behind its in-house network virtualization push. AT&T's software-defined architecture, ECOMP (enhanced control, orchestration, management and policy), aims to virtualize 75 percent of the AT&T network by the year 2020, incorporating functions around NFV, SDN, and WANOP. So what are the big benefits and advancements? Consider the following:
- WAN delivery based on context. This goes far beyond simply ensuring that content gets delivered quickly. WANOP technologies now look at resources, workloads, and data at a contextual level: policies around geolocation, user fingerprinting, and even data distribution are all functional layers within the WANOP architecture. With better WAN control options, organizations can build powerful content delivery networks, optimize user experiences, and support new business strategies.
- Integration with cloud and virtualization. This is a big one. Not only do we have virtual WANOP appliances, we have virtual services which integrate and live in the cloud. You can optimize traffic regardless of where it lives. Private, public, and hybrid cloud environments can all be configured with precise WANOP specifications.
- Mitigating latency. The power behind WANOP technologies is flexible control: you can dynamically manage WAN settings depending on the workload and the data being delivered. The ability to create granular WAN-based QoS controls lets you manage data sets and applications at a new level. User access controls, content distribution, and even resource bursting are all functions WANOP can help with.
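The granular, per-workload QoS control described above is typically built on rate-limiting primitives such as a token bucket. A minimal sketch of the idea, not any vendor's actual implementation:

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter, the primitive behind many QoS shapers."""

    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate = rate_bytes_per_s      # sustained rate for this traffic class
        self.capacity = burst_bytes       # maximum burst allowance
        self.tokens = burst_bytes         # bucket starts full
        self.last = time.monotonic()

    def allow(self, packet_bytes: int) -> bool:
        """Admit the packet if enough tokens have accrued; otherwise defer/drop it."""
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True
        return False

# Hypothetical per-class shaper: 1 MB/s sustained, 64 KB burst allowance.
bucket = TokenBucket(1_000_000, 64_000)
print(bucket.allow(1500))     # small packet fits within the initial burst -> True
print(bucket.allow(500_000))  # exceeds the remaining burst allowance -> False
```

A WANOP appliance applies the same logic per traffic class, which is what makes the per-workload policies described above possible.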
- Creating WAN security. In an approach sometimes referred to as WAN hardening, WANOP systems are integrating greater levels of security into your WAN ecosystem: for example, 256-bit AES encryption to secure all WAN traffic and control and encrypt all data in flight, with little to no performance degradation. Furthermore, you can integrate WANOP controls into existing security policies and even into the network layer.
There are other big advancements as well. These systems are being designed to couple closely with virtualization and cloud layers and, above all, to give you granular control over your data and how you deliver it. With the introduction of virtual WANOP appliances, organizations of all sizes can begin to leverage a better delivery architecture.