How long does a server last?
That's the burning question that cost-conscious MSPs need to answer.
Here's how to determine the lifespan you should expect from servers.
Computers Are Not Cars
To determine server lifespan, it's important to recognize that computers are not like cars or other machines that simply wear out over time.
The difference is this: Most components in your car are mechanical.
Eventually, no matter how well designed those components are or how well you maintain them, they'll break down.
In contrast, most components inside servers are not mechanical.
CPUs, memory and motherboards don't have moving parts that wear out as a result of use.
As long as you cool these components properly, protect them from electrical surges and perform basic maintenance, the non-mechanical parts of a server will continue to function indefinitely.
There is one major part of a server that will wear out sooner or later: the hard disks.
The median lifespan of a hard drive is about six years.
Fortunately, hard disks are also among the least expensive and easiest components to replace in a server.
Just because your hard disks wear out doesn't mean your server has reached the end of its useful life.
Defining Server Lifespan
That brings us to the main question: How long is the expected lifetime of a production server?
This is a difficult question to answer because there are two distinct ways of thinking about how long a server remains usable:
- The first involves measuring how long it will keep functioning before critical components break down. As noted above, there is no easy answer to this question. Most of the components in a server are non-mechanical and can last indefinitely. Hard drives are the only big exception.
- The second way of measuring server lifetime is to think in terms of how long a server remains cost-efficient to maintain. At a certain point, continuing to manage servers that struggle to keep up with modern workloads is less effective than replacing them with new servers. It's easier to manage a single server that can handle a large modern workload, rather than managing three or four legacy servers to support an equal workload.
Legacy servers may also use energy less efficiently, which raises operating costs.
They may take up more space in the data center.
And they may eventually not be compatible with modern operating systems, although that is not a common occurrence.
Linux can run on virtually any server created in the last twenty years, and even Windows has a fair amount of legacy hardware compatibility.
How Long Will Your Server Last?
Most people will tell you that servers will last about five years and should then be replaced.
That's the rule of thumb that has developed in the industry.
As noted above, however, that's not the right way to think about server lifespan.
A server that receives routine maintenance, and whose hard disks are replaced as needed, could continue to run for decades – although it will likely not remain cost-effective for decades.
So, rather than assuming that there is a universal answer for how long a server can last, you should tailor the answer to your situation.
Calculate your maintenance and operating costs for your current servers and determine the point at which those costs become significantly greater than the cost of running more modern hardware.
Replacing your servers every five years just because that's what other people do is not necessarily the most cost-effective way to manage hardware.
It's also not exactly environmentally friendly – if that's important to you.
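That cost comparison can be sketched in a few lines of code. The figures below (power draw, admin hours, electricity price, replacement price) are purely illustrative assumptions, not industry benchmarks; plug in your own numbers.

```python
# Hypothetical server-replacement cost comparison.
# All numeric values are illustrative assumptions, not benchmarks.

def annual_cost(power_kw, admin_hours, admin_rate, kwh_price=0.12):
    """Yearly operating cost: electricity plus admin labor."""
    electricity = power_kw * 24 * 365 * kwh_price
    labor = admin_hours * admin_rate
    return electricity + labor

# Three legacy servers carrying a workload one modern server could handle.
legacy_total = 3 * annual_cost(power_kw=0.5, admin_hours=40, admin_rate=75)
modern = annual_cost(power_kw=0.3, admin_hours=15, admin_rate=75)

new_server_price = 6000  # assumed purchase price of the replacement
annual_savings = legacy_total - modern
payback_years = new_server_price / annual_savings

print(f"Legacy fleet: ${legacy_total:,.0f}/yr, modern server: ${modern:,.0f}/yr")
print(f"Payback on replacement: {payback_years:.1f} years")
```

If the payback period is short relative to how long you expect to keep the new hardware, replacement is the cost-effective choice; if it stretches out for many years, keeping the existing servers running may be the better call.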
Avoiding the Question: The Cloud
Of course, you could also migrate all of your workloads to the cloud and stop worrying about server lifetime.
But until we live in a world where everything runs in the cloud, server lifetime still matters.
This article originally appeared on MSPmentor