There has been a 14-fold increase in the number of Artificial Intelligence (AI) start-ups launched since the turn of the century, according to a study by Stanford University. In the UK alone, says Carmine Rimi, AI product manager at Canonical – the company behind Ubuntu – AI developers saw a 200% spike in venture capital funding in the past year, as the transformative potential of AI reaches into every industry.
The creation of AI applications to enhance ways of doing business and, indeed, people’s lives is a huge task. These applications are complicated to develop and build, as they involve widely varying types of data, which makes porting them to different platforms troublesome.
On top of these challenges, several steps are needed at each stage to construct even the most basic AI application. A spectrum of skills is necessary – feature extraction, data collection, verification and analysis, and machine resource management – to underpin a comparatively tiny subset of actual machine learning (ML) code. A lot of work has to happen before a team even reaches the start line, alongside a large amount of ongoing effort to keep the applications up to date. Developers everywhere are searching for ways to overcome these challenges.
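To make that imbalance concrete, here is a minimal, purely illustrative Python sketch. The raw data, the validate and extract_features helpers and the scikit-learn model are hypothetical stand-ins; the point is that the actual ML code is two lines buried inside a much larger pipeline of supporting work.

# Illustrative sketch: the ML core is a small part of a larger pipeline.
# All names and data here are hypothetical examples.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

def validate(rows):
    """Data verification: drop records with missing values."""
    return [r for r in rows if all(v is not None for v in r)]

def extract_features(rows):
    """Feature extraction: turn raw records into a numeric matrix and labels."""
    return np.array([r[:-1] for r in rows]), np.array([r[-1] for r in rows])

raw = [(5.1, 3.5, 0), (4.9, 3.0, 0), (6.2, 3.4, 1), (5.9, 3.0, 1)]
X, y = extract_features(validate(raw))

# The "tiny subset of actual ML code": two lines to define and train a model.
model = Pipeline([("scale", StandardScaler()), ("clf", LogisticRegression())])
model.fit(X, y)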
Contain yourself
This search for a way to keep apps up to date and balance workloads in app development often arrives at the same answer – Kubernetes. This open source platform can be a facilitator, as it automates the deployment and management of containerised applications, including complicated workloads such as AI and machine learning. These capabilities explain why Kubernetes has enjoyed such a spectacular rise as a container orchestration platform.
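As a rough sketch of what that automation looks like in practice, the official Kubernetes Python client can declare a deployment and leave the platform to keep it running. The image name, labels and namespace below are hypothetical examples, not part of any real application.

# Minimal sketch: deploying a containerised workload with the official
# Kubernetes Python client. Image and labels are hypothetical.
from kubernetes import client, config

config.load_kube_config()  # read cluster credentials from ~/.kube/config

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="inference-service"),
    spec=client.V1DeploymentSpec(
        replicas=3,  # Kubernetes keeps three replicas running, replacing failures
        selector=client.V1LabelSelector(match_labels={"app": "inference"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "inference"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="model",
                    image="example.com/ml/model-server:1.0",  # hypothetical image
                    ports=[client.V1ContainerPort(container_port=8080)],
                )
            ]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)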
Forrester recently stated that “Kubernetes has won the war for container orchestration dominance and should be at the heart of your microservices plans”. Containers deliver a compact environment for processes to operate in. They are straightforward to scale and portable across a range of environments, which enables large, monolithic applications to be split into targeted, easier-to-maintain microservices. The majority of developers say they are leveraging Kubernetes across a variety of development stages, according to a Cloud Native Computing Foundation survey.
Most companies are running, or plan to start using, Kubernetes as a platform for their workloads. AI, of course, is a workload that is rapidly growing in importance. Kubernetes is well suited to it, because AI algorithms must be able to scale to be effective, and certain deep learning algorithms and data sets need a large amount of compute. Kubernetes can help here, because it is designed to scale capacity up and down with demand.
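A minimal sketch of that demand-driven scaling, again with the Kubernetes Python client: a HorizontalPodAutoscaler that grows and shrinks the hypothetical inference-service deployment from the previous example based on CPU load. The thresholds are illustrative assumptions.

# Sketch: autoscaling the hypothetical "inference-service" deployment
# between 2 and 20 replicas based on average CPU utilisation.
from kubernetes import client, config

config.load_kube_config()

hpa = client.V1HorizontalPodAutoscaler(
    metadata=client.V1ObjectMeta(name="inference-autoscaler"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="inference-service"),
        min_replicas=2,
        max_replicas=20,
        target_cpu_utilization_percentage=70,  # add pods when average CPU > 70%
    ),
)

client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa)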
Kubernetes also provides a path to deploying AI-enabled workloads over multiple commodity servers, spanning the software pipeline, while abstracting away the management overhead. Once models are trained, serving them in differing deployment scenarios, from edge compute to central datacentres, is challenging for non-containerised applications. Once again, Kubernetes can unlock the flexibility needed for a distributed deployment of inference agents on a variety of substrates.
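By way of illustration, an inference agent can be as small as the following hedged Python sketch; packaged into a container image, the same service can run unchanged at the edge or in a central datacentre. The model path, input format and port are assumptions made for the example.

# Sketch of a containerisable inference agent. The model artefact at
# /models/model.pkl and the request format are hypothetical.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

with open("/models/model.pkl", "rb") as f:  # model baked into, or mounted in, the container
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON like {"features": [5.1, 3.5]} and returns the prediction.
    features = request.get_json()["features"]
    return jsonify({"prediction": model.predict([features]).tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)  # the port the container exposes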
Changing focus
As businesses turn their attention to AI to slash operating costs, improve decision-making and cater for customers in new ways, Kubernetes-based containers are rapidly becoming the number one technology to support companies in adopting AI and machine learning. Last December the Kubernetes project unveiled Kubeflow, which is focused on making deployments of machine learning workflows on Kubernetes simple, portable and scalable.
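As a rough sketch of what Kubeflow enables, a distributed TensorFlow training run can be described as a TFJob custom resource and submitted through the standard Kubernetes API. The image name, namespace and API version below are assumptions that depend on the Kubeflow release in use.

# Sketch: submitting a Kubeflow TFJob (a custom resource for TensorFlow
# training) via the Kubernetes custom-objects API. Details are hypothetical.
from kubernetes import client, config

config.load_kube_config()

tfjob = {
    "apiVersion": "kubeflow.org/v1",  # assumption: varies by Kubeflow release
    "kind": "TFJob",
    "metadata": {"name": "mnist-train"},
    "spec": {
        "tfReplicaSpecs": {
            "Worker": {
                "replicas": 2,  # two training workers scheduled by Kubernetes
                "template": {
                    "spec": {
                        "containers": [{
                            "name": "tensorflow",  # TFJob expects this container name
                            "image": "example.com/ml/mnist-train:1.0",  # hypothetical
                        }]
                    }
                },
            }
        }
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="kubeflow.org", version="v1", namespace="default",
    plural="tfjobs", body=tfjob)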
While Kubernetes began life supporting just stateless services, the project stated that customers had started to move complex workloads to the platform, leveraging Kubernetes’ ‘rich APIs, reliability and performance’. One of the most rapidly growing use cases for Kubernetes is as the deployment platform of choice for machine learning.
At the start of 2017, only the Google Cloud Platform supported Kubernetes, with its Google Kubernetes Engine. By the end of the year, every major public cloud vendor was on board, notably after Microsoft added Kubernetes support to the Azure Container Service and Amazon debuted the Amazon Elastic Container Service for Kubernetes.
The ways in which Kubernetes is being rolled out and leveraged by businesses are seemingly boundless. In a relatively short life span, Kubernetes has achieved a lot. This underlines the extent to which tech vendors, and their clients, are embracing the notion that containers provide huge benefits in developing and managing the AI parts of applications. The emergence of AI is triggering huge interest in containers as a way to introduce repeatability and fault tolerance to these complicated workloads.
Kubernetes is becoming the de facto standard for managing containerised AI applications, and a natural fit for them. It has proved itself and should go on to deliver dramatic benefits to businesses for a long time to come.
The author is Carmine Rimi, AI product manager at Canonical.