The Role of Data Centers in API Security

Are the APIs in your data center secure? We look at what data center teams can do to strengthen API security.

API security is fast becoming one of the most significant cybersecurity challenges for businesses to master. API attacks have surged in recent years to the point that they are now more frequent than ransomware attacks, which had long been the most common type of security incident.

For data center managers and support teams, API security may seem like someone else's problem. After all, designing secure APIs is a task that falls mostly to developers, and monitoring APIs for security risks is something that's usually handled by security teams, not staff working inside data centers.

Still, data center teams have an important role to play in helping to ensure that the workloads running inside their facilities are protected from API security risks. Keep reading for a look at what can be done at the data center level to secure APIs.

What is API security?

API security is the practice of protecting APIs from attacks that could lead to the exposure of sensitive information, the disruption of critical services or the compromise of applications.

APIs, which allow discrete applications or resources to connect to each other over a network, have existed for decades, and they have always been subject to potential security risks. But given the explosion in API usage over the past decade – a trend that has resulted partly from the shift toward SaaS architectures and partly from the adoption of cloud-native, distributed architectures that rely on APIs to connect their loosely coupled components – API security has become a much more serious consideration for businesses today than it was just four or five years ago.

Securing APIs inside the data center

Again, the teams responsible for managing data centers aren't the primary defenders of APIs. APIs are a software resource, and they aren't tied to data centers in any unique way.

Nonetheless, many of the workloads hosted in modern data centers rely on APIs in various ways. A microservices application running in a data center might use internal APIs to let its microservices communicate with each other, for instance. Likewise, an app delivered under a SaaS model might expose a public API that customers use to issue requests or transmit data to the application.

Defending workloads like these against attack starts with ensuring that the APIs are as secure as possible, and that they are properly monitored to detect incidents like malicious requests. But data center staff can do their part to strengthen API security further, in several ways.

Network throttling

One way to help mitigate the risk of API abuse is to throttle network requests, which means restricting how many requests an API will serve in a given time window – per minute, for example. Throttling limits the impact of attacks in which malicious actors compromise an API and use it to download large volumes of sensitive information.

Throttling rules can be enforced via networking equipment inside data centers. Data center management staff should coordinate with API developers to determine which level of throttling is appropriate, then implement the necessary throttling rules within their equipment.
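
As a rough illustration, here is a minimal sketch of the kind of fixed-window rate limiting that throttling rules enforce. The 60-requests-per-minute limit and the client address are placeholder assumptions for the example, not values drawn from any particular environment.

import time
from collections import defaultdict


class FixedWindowRateLimiter:
    """Allow at most max_requests per client in each window_seconds window."""

    def __init__(self, max_requests: int = 60, window_seconds: int = 60):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self._counts = defaultdict(int)  # (client_id, window index) -> request count

    def allow(self, client_id: str) -> bool:
        """Return True if this request is within the client's quota for the current window."""
        window = int(time.time()) // self.window_seconds
        self._counts[(client_id, window)] += 1
        return self._counts[(client_id, window)] <= self.max_requests


# Usage: the 61st request from the same client within one minute is rejected.
limiter = FixedWindowRateLimiter(max_requests=60, window_seconds=60)
results = [limiter.allow("10.0.0.15") for _ in range(61)]
print(results[-1])  # False

In practice, these rules would be enforced by load balancers, API gateways, or other networking equipment rather than by application code, but the underlying logic is the same.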

API gateways

Data centers can also deploy API gateways to help secure APIs.

An API gateway is a management layer that accepts incoming API requests from clients, routes them to the appropriate backend resources and then returns the results to the client. Because gateways serve as a buffer between clients and APIs, they can help to mitigate API abuse. Gateways can also collect monitoring data about API requests and traffic patterns to help detect threats.
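
To make the idea concrete, below is a minimal sketch of gateway-style behavior: each incoming request is logged for monitoring, matched against a route table and forwarded to an internal backend. The route table and backend addresses are hypothetical, and a production deployment would use an off-the-shelf gateway rather than hand-rolled code.

import logging
import urllib.request

logging.basicConfig(level=logging.INFO)

# Hypothetical route table mapping public API paths to internal backend services.
ROUTES = {
    "/orders": "http://10.0.1.20:8080/orders",
    "/inventory": "http://10.0.1.21:8080/inventory",
}


def handle_request(client_ip: str, path: str) -> bytes:
    """Log the request for monitoring, look up its route and forward it upstream."""
    logging.info("api_request client=%s path=%s", client_ip, path)
    upstream = ROUTES.get(path)
    if upstream is None:
        raise ValueError(f"unknown API path: {path}")
    with urllib.request.urlopen(upstream, timeout=5) as response:
        return response.read()  # pass the backend's result back to the client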

Protecting internal APIs

One of the key benefits of data centers is that they provide more control over networking configurations than businesses can obtain in most public cloud environments.

Data center staff should leverage this control for API security purposes by designing networks that minimize the exposure of APIs to security risks. Internal APIs that don't need to be accessible from the Internet should be isolated from public networks, for example, and access to public APIs can in some cases be restricted to specific hosts or IP address ranges.
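
As an illustration of the second point, the sketch below checks a client address against an allowlist of internal network ranges. The CIDR blocks shown are placeholders; in a real data center this enforcement would typically live in firewall rules or network ACLs rather than in application code.

import ipaddress

# Hypothetical allowlist: only clients in these internal ranges may reach the API.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("192.168.20.0/24"),
]


def is_allowed(client_ip: str) -> bool:
    """Return True if the client address falls inside an allowed network range."""
    address = ipaddress.ip_address(client_ip)
    return any(address in network for network in ALLOWED_NETWORKS)


print(is_allowed("10.4.2.17"))    # True: internal address
print(is_allowed("203.0.113.9"))  # False: public address, so the request is rejected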

Air gapping

Data centers also support air gapping, which means keeping copies of data disconnected from the network entirely so that attackers cannot reach them.

Air gapping doesn't directly protect APIs, since APIs by definition need to be reachable over some type of network. But it can mitigate the impact of API attacks that compromise data, because an air-gapped copy ensures that a clean backup is available in the event that production data is compromised.

Data centers may not be the first line of defense against API security risks, but they have an important role to play in protecting APIs. As API attacks grow more and more prevalent, businesses should be thinking about how they can leverage resources inside data centers to build an additional layer of API security.
