Kubernetes and AI: Marriage Made in IT Heaven

Kubernetes is ideal for the job because AI algorithms must be able to scale to be optimally effective.

Carmine Rimi is AI Product Manager at Canonical.

For something that’s supposed to make life easier, artificial intelligence (AI)-enabled applications are really hard to build. Large AI programs are difficult to design and write, they involve many types of data, and porting them from one platform to another tends to be troublesome. Several steps are also involved, each requiring different skills. Feature extraction, data collection, verification and analysis, and machine resource management make up the majority of the codebase needed to support a comparatively small core of actual machine learning code.

There’s a lot of work to do to get to the starting line, and a large amount of ongoing effort to keep the applications current. Fortunately, Kubernetes -- the open-source platform that automates the deployment and management of containerized applications, including complicated workloads like AI and machine learning -- can be a facilitator.

Containers provide a compact environment for processes to run. They’re easy to scale, they’re portable across a range of environments -- from development to test to production -- and they enable large, monolithic applications to be broken into targeted, easier-to-maintain microservices.

Kubernetes has had a meteoric rise as a container orchestration platform. A vast majority of respondents to a Cloud Native Computing Foundation survey say they are increasing their use of Kubernetes across a variety of development stages. Forrester stated recently that “Kubernetes has won the war for container orchestration dominance and should be at the heart of your microservices plans.”

Meanwhile, AI has become one of the most talked-about sensations in tech, with a 14-fold increase in the number of AI startups since 2000, according to a Stanford University study, and a market size projected to reach $60 billion by 2025. AI’s potential to change business seems limitless.

Kubernetes and AI represent converging trends. Most companies are running (or plan to move to) Kubernetes as a platform for their workloads, and AI is an increasingly important workload.

Kubernetes is ideal for the job because AI algorithms must be able to scale to be optimally effective. Some deep learning algorithms and data sets require a large amount of compute, and Kubernetes helps because it is all about scaling based on demand. It also provides a way to deploy AI-enabled workloads over multiple commodity servers across the software pipeline while abstracting away the management overhead. Once the models are trained, serving them in various deployment scenarios, from edge compute to central data centers, poses challenges for non-containerized applications. Again, Kubernetes can provide the necessary flexibility for a distributed rollout of inference agents on a variety of substrates.
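To make that concrete, here is a minimal sketch of demand-based scaling using the official Kubernetes Python client. It attaches a HorizontalPodAutoscaler to a model-serving deployment; the deployment name ("inference-server"), namespace and CPU threshold are hypothetical placeholders, not details from any product mentioned in this article.

    from kubernetes import client, config

    # Authenticate against the cluster (inside a pod, use
    # config.load_incluster_config() instead).
    config.load_kube_config()

    # Hypothetical example: scale a model-serving deployment named
    # "inference-server" between 1 and 10 replicas, targeting 70% CPU use.
    hpa = client.V1HorizontalPodAutoscaler(
        metadata=client.V1ObjectMeta(name="inference-server-hpa"),
        spec=client.V1HorizontalPodAutoscalerSpec(
            scale_target_ref=client.V1CrossVersionObjectReference(
                api_version="apps/v1",
                kind="Deployment",
                name="inference-server",
            ),
            min_replicas=1,
            max_replicas=10,
            target_cpu_utilization_percentage=70,
        ),
    )

    client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
        namespace="default", body=hpa
    )

Once an autoscaler like this is in place, Kubernetes adds or removes serving replicas as demand changes, with no manual capacity planning.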

As organizations increasingly shift their attention to AI to decrease operating costs, improve decision-making and serve customers in new ways, Kubernetes-based containers are becoming the go-to technology to help enterprises adopt AI and machine learning. That momentum has been reflected in a raft of announcements across the industry in recent months:

In March, IBM launched Cloud Private for Data, which it described as an integrated data science and app development platform -- built on Kubernetes -- to help companies “accelerate their AI journeys.”

Also in March, Nvidia announced GPU acceleration for Kubernetes to aid AI and data science workloads in multi-cloud clusters. (A sketch of how such a GPU request looks in practice follows this run of announcements.)

In December of last year, the Kubernetes project introduced Kubeflow, “dedicated to making deployments of machine learning workflows on Kubernetes simple, portable and scalable.” While Kubernetes started with just stateless services, the project said, “customers have begun to move complex workloads to the platform, taking advantage of rich APIs, reliability and performance provided by Kubernetes. One of the fastest growing use cases is to use Kubernetes as the deployment platform of choice for machine learning.”

At the beginning of 2017, among the three major public cloud vendors, only the Google Cloud Platform supported Kubernetes, with its Google Kubernetes Engine. By the end of the year, they were all on board after Microsoft added Kubernetes support to the Azure Container Service and Amazon debuted the Amazon Elastic Container Service for Kubernetes.
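To illustrate the Nvidia item above: on a cluster where Nvidia’s device plugin is installed, a workload claims a GPU by listing "nvidia.com/gpu" in its resource limits, and the scheduler places it on a GPU-equipped node. The sketch below, again using the Kubernetes Python client, submits a hypothetical training pod; the pod name, image and command are placeholders, not part of Nvidia’s announcement.

    from kubernetes import client, config

    config.load_kube_config()

    # Hypothetical training pod; the "nvidia.com/gpu" limit tells the
    # scheduler to place it on a node that exposes GPUs through Nvidia's
    # device plugin.
    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(name="gpu-training-job"),
        spec=client.V1PodSpec(
            restart_policy="Never",
            containers=[
                client.V1Container(
                    name="trainer",
                    image="tensorflow/tensorflow:latest-gpu",  # placeholder image
                    command=["python", "train.py"],  # placeholder entry point
                    resources=client.V1ResourceRequirements(
                        limits={"nvidia.com/gpu": "1"}  # request one GPU
                    ),
                )
            ],
        ),
    )

    client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)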

That’s a lot of activity in a relatively short span, and it shows the extent to which tech vendors and their customers are rallying around the notion that containers offer huge benefits in developing and managing the AI components of applications.

The rise of AI is helping fuel the growing interest in containers as an excellent way to bring repeatability and fault tolerance to these complex workloads. And Kubernetes is becoming a de facto standard for managing these containerized AI applications.

It’s a wonderful match that should dramatically benefit enterprises for some time to come.

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating.