Linda Faust is Director of Product Marketing at Transmode and has 20 years' experience in the networking and storage networking industries.
New service and application requirements, including those of cloud computing, have made low latency in data networks a high priority. Cloud computing is becoming mainstream and is widely considered the next big change in how fixed telecom networks will operate. It requires operators connecting data centers to examine the latency of both the route and the systems used and, where necessary, to take corrective action. When establishing a data center interconnection, low latency must be offered as a basic feature so that the facility owner is well positioned to serve customers with low latency demands.
What Contributes to Latency?
Network latency, a synonym for delay, is a measure of how long it takes a packet of data to get from one point to another. Several factors contribute to latency: the propagation time of the signal through the fiber (light travels roughly a third slower in glass than in a vacuum), the length of the fiber route itself, and the equipment used to transmit and receive the packet. Each additional equipment layer increases network latency. The demand for low latency has been a driver for the adoption of optical networks based on Wavelength Division Multiplexing (WDM).
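The propagation component of latency can be estimated directly from the route length. A minimal sketch, assuming a typical silica-fiber group index of about 1.47 (an assumption, not a figure from the article):

```python
C_VACUUM_KM_PER_S = 299_792.458   # speed of light in a vacuum, km/s
FIBER_INDEX = 1.47                # assumed group index of silica fiber

def fiber_delay_ms(distance_km: float) -> float:
    """One-way propagation delay over `distance_km` of fiber, in milliseconds."""
    speed_in_fiber = C_VACUUM_KM_PER_S / FIBER_INDEX   # ~204,000 km/s
    return distance_km / speed_in_fiber * 1000.0

# Light in fiber covers about 5 microseconds per kilometer:
print(round(fiber_delay_ms(100), 2))   # 100 km link -> ~0.49 ms one way
```

This is why route selection matters: no equipment upgrade can recover delay that is baked into extra kilometers of fiber.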
How Optical Networks Work
In an optical network, information is converted to a series of light pulses, which are transported along optical fibers and retrieved at a remote location. WDM-based optical networks communicate using multiple wavelengths (i.e. colors of light) transmitted on a fiber. This technique enables multiplication of capacity and bidirectional communication over a single fiber or fiber pair, a fact of significant importance when fiber is scarce or expensive to lease. Depending on the wavelength spacing, WDM networks can support up to 80 channels, providing greatly increased capacity. This provides growth opportunities for existing applications and also allows new services to be deployed on the same network.
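The relationship between wavelength spacing and channel count is simple division over the usable spectrum. A back-of-envelope sketch, assuming the ITU C-band (roughly 4.4 THz of usable spectrum, an assumption for illustration) and the standard 100 GHz and 50 GHz grids:

```python
C_BAND_GHZ = 4_400   # assumed usable C-band spectrum, in GHz

def channel_count(spacing_ghz: float) -> int:
    """Number of DWDM channels that fit in the C-band at a given grid spacing."""
    return int(C_BAND_GHZ // spacing_ghz)

print(channel_count(100))  # 100 GHz grid -> 44 channels
print(channel_count(50))   # 50 GHz grid  -> 88 channels (systems commonly deploy 80)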
Case Study: Net2EZ
Service provider Net2EZ has achieved low latency to benefit its data center customers, who have high expectations for throughput, latency and reliability. Net2EZ delivers uninterrupted operation of mission-critical applications for content networks, enterprises and communications service providers. The company runs seven U.S.-based data centers and uses a WDM network with multiple 10G and 1G connections between them.
The WDM network delivers other advantages in addition to low latency. Uninterrupted operation is ensured by using a protected fiber ring. The fiber infrastructure allows for increased capacity and is also cost effective. Today's WDM networks can also offer simple-to-use, integrated management solutions, which significantly improve operational efficiency: staff are able to remotely identify and resolve issues far more quickly. Finally, redundancy built into the system allows the service provider to remain live while maintenance is performed on the fiber infrastructure.
Some Net2EZ customers have more latency-sensitive applications than others. Financial applications, including stock trading, are extremely time sensitive. Network gaming is another growing latency-sensitive application. And although video transmission is not especially sensitive to absolute latency, it is sensitive to changes in latency over time, known as jitter. As more applications have become sensitive to latency and jitter, network equipment providers have been required to solve latency issues and deliver better performance. Very low latency WDM equipment now adds only around 4 nanoseconds of latency to the fiber route, the equivalent of adding roughly 1 meter of fiber to the total route length. Optical transport equipment companies can now deliver ultra-low-latency WDM-based optical networks that enable service providers to differentiate themselves.
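The "4 nanoseconds is about a meter of fiber" equivalence can be checked with the same propagation arithmetic. A quick sketch, again assuming a fiber group index of about 1.47 (an assumption, not a figure from the article):

```python
C_M_PER_NS = 0.299792458   # speed of light in a vacuum, meters per nanosecond
FIBER_INDEX = 1.47         # assumed group index of silica fiber

def equivalent_fiber_m(delay_ns: float) -> float:
    """Length of fiber whose one-way propagation delay equals `delay_ns`."""
    return delay_ns * C_M_PER_NS / FIBER_INDEX

print(round(equivalent_fiber_m(4.0), 2))   # -> ~0.82 m, i.e. about a meter
```

Against a metro route tens of kilometers long, a meter's worth of added delay per node is effectively negligible.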
In some cases, cloud computing will bring additional complexity and unique latency-related challenges. For instance, multiple applications from independent users may share a physical cloud server, affecting the performance seen by latency-sensitive applications. Ultra-low-latency networks can help mitigate this issue, although careful user and application management may also be important.
Innovation in data center services is driving the industry to focus on latency and to provide low latency equipment and services. As services evolve, low latency becomes an increasingly important factor in a growing portion of applications and data center projects.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.