If you want to optimize network performance and the experience of your end users, you should host applications and data as close to them as possible. There are two main ways to do that — using a content delivery network (CDN) or hosting workloads in data centers that are physically proximate to your end users.
Which approach is best? The answer depends on factors such as your budget, the type of workloads you are managing, and how widely dispersed your users are. This article explores those and other factors in order to provide guidance on choosing between a CDN and data centers to improve workload performance.
How Do CDNs Impact Network Performance?
A content delivery network, or CDN, is a collection of servers that store cached copies of content. If the servers are spread out across a broad geographic area, requests for content from users based in different areas can be routed to the server closest to those users.
In this way, CDNs can improve the speed at which content reaches users. Although data moves across the internet at close to the speed of light, in practice, the longer the physical distance between a server and a user, the longer the data takes to arrive. The delay comes both from propagation time over the longer route and from the additional networking devices the data must pass through along the way.
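The routing idea can be sketched in a few lines of Python. This is a simplified illustration with hypothetical edge locations; real CDNs route requests using DNS or anycast rather than explicit distance lookups, but distance-based selection captures the core principle of sending each user to the nearest server.

```python
import math

# Hypothetical edge server locations (name -> latitude, longitude).
# These coordinates are illustrative, not any real CDN's footprint.
EDGE_SERVERS = {
    "frankfurt": (50.11, 8.68),
    "virginia": (38.95, -77.45),
    "singapore": (1.35, 103.99),
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_edge(user_location):
    """Pick the edge server geographically closest to the user."""
    return min(EDGE_SERVERS, key=lambda name: haversine_km(user_location, EDGE_SERVERS[name]))

# A user in Paris is routed to the Frankfurt edge rather than
# fetching content from a distant origin server.
print(nearest_edge((48.85, 2.35)))  # frankfurt
```

The shorter the resulting distance, the fewer network hops and the less propagation delay the user's requests incur.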
How Can Data Centers Improve Network Performance?
Data centers make it possible to achieve a similar network performance boost, albeit in a different way.
By being strategic about which data centers they choose, organizations can deploy workloads in facilities that are physically close to their users. There are hundreds of data centers spread across the world, including private data centers, colocation facilities, and data centers owned by cloud providers, making it feasible to choose the locations best suited for network performance optimization.
An important difference between CDNs and data centers is that CDNs only store cached content, whereas data centers can host complete applications. This difference can have important consequences for the types of workloads you would ideally place in a CDN or data center.
Which Is Better — CDNs or Data Centers?
Whether it makes more sense to host a workload in a CDN or a data center to improve network performance depends on the following considerations.
Cost
On the whole, using a CDN to place content close to end users typically costs less than hosting the content in multiple data centers. However, whether a CDN will perform better from a cost perspective depends on:
- How much data you're transferring. Most CDN providers charge primarily based on the total volume of data transferred, but with data centers, data transfer volumes are only one of many potential factors in pricing. In some cases, workloads with very high network traffic requirements may be cheaper to host in your own data center.
- Whether you're setting up your own data center or using someone else's. Creating your own data center could cost millions of dollars, whereas using a colocation facility or public cloud may cost only a few thousand dollars per year, depending on your workload and infrastructure requirements.
- How many locations you need to host content in. CDNs are likely to cost much less if you need to cache content across dozens of locations, but hosting workloads in just a few strategically located data centers may make more financial sense if you need to serve a handful of areas where your users are concentrated.
In short, CDNs are cheaper than data centers in most cases as a way of distributing content to multiple users, but that's not universally true.
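A back-of-envelope comparison makes the trade-off above concrete. The figures below are illustrative assumptions, not real provider pricing: a per-GB CDN transfer rate and a fixed monthly cost per colocation site.

```python
def monthly_cost_cdn(tb_transferred, price_per_gb=0.05):
    """CDN billing is typically dominated by data transfer volume.
    The per-GB price here is an illustrative assumption."""
    return tb_transferred * 1000 * price_per_gb

def monthly_cost_datacenters(num_sites, fixed_cost_per_site=4000.0):
    """Data center (e.g., colocation) costs are mostly fixed per site;
    the per-site figure is likewise an illustrative assumption."""
    return num_sites * fixed_cost_per_site

# At modest traffic, the CDN is far cheaper than three colocation sites...
print(monthly_cost_cdn(10))          # 500.0
print(monthly_cost_datacenters(3))   # 12000.0
# ...but at very high traffic volumes, the fixed-cost model can win.
print(monthly_cost_cdn(500))         # 25000.0
```

Under these assumptions, the crossover arrives around 240 TB per month for three sites, which is why workloads with very high transfer volumes can be cheaper to serve from a few well-placed data centers.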
User location and dispersal
If your end users are dispersed across the globe, it almost certainly makes more sense to use a CDN to improve their performance than to set up multiple data centers. But in cases where users are concentrated in certain geographic regions (which they might be if, for example, your users are employees who access internal business applications from corporate campuses, rather than customers located in places of their choosing), it could make more sense to host workloads in data centers close to those sites.
Workload type
Again, unlike data centers, CDNs don't host complete application instances. They just host caches of content. As a result, CDNs are good at improving the performance of workloads that depend on static content, but they're not as useful in situations where applications need to generate customized content in real time. In the latter case, a distributed set of data centers may work better than a CDN.
For example, if you have a web app that primarily serves generic content like images and video, caching the content in a CDN will typically improve the app's performance for users located in different areas. But if your app instead creates customized images for each individual user, a CDN won't improve performance as much because there is no way to generate and cache the custom content ahead of time.
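In practice, this static-versus-dynamic distinction is often expressed through HTTP `Cache-Control` headers, which tell a CDN what it may cache at the edge. The sketch below shows one way an origin server might set those headers; the specific lifetimes and path rules are illustrative assumptions, not a standard policy.

```python
def cache_headers(path, personalized=False):
    """Return HTTP caching headers for a response.

    Generic, user-independent assets get a long CDN-cacheable
    lifetime; personalized responses are marked uncacheable so the
    CDN passes them through to the origin on every request."""
    if personalized:
        # Content generated per user can't be cached ahead of time.
        return {"Cache-Control": "private, no-store"}
    if path.endswith((".jpg", ".png", ".mp4", ".css", ".js")):
        # Generic static assets: let the CDN cache them for a day.
        return {"Cache-Control": "public, max-age=86400"}
    # Everything else: a short shared-cache lifetime as a default.
    return {"Cache-Control": "public, max-age=60"}

print(cache_headers("/img/banner.jpg"))                     # cacheable at the edge
print(cache_headers("/profile/avatar", personalized=True))  # bypasses the CDN cache
```

The more of an app's responses fall into the first two branches, the more a CDN can help; an app whose responses are mostly personalized gains little from edge caching.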
Control and management
Because a CDN only stores cached content, it doesn't give you the control that a data center brings. You can't access the physical infrastructure that powers your workloads, and you typically have a limited ability to configure network and security policies. With a data center, you usually have full control, especially if you're running the workloads on your own hardware (as opposed to using a public cloud).
This difference is not likely to matter much if you're dealing with workloads that don't need to be monitored or managed in a complex way. But it may make data centers a better choice than CDNs for workloads with unique management requirements.
Conclusion
CDNs and data centers can both help speed the delivery of application content, but they do so in different ways. CDNs are the simpler and cheaper option in general, although there are cases — especially those involving complex workloads that require more than just generic content caching to boost performance — where it makes more sense to distribute workloads across strategically located data centers than to use a CDN.