
Andreea Munteanu
on 4 October 2022


Kubeflow is an open-source MLOps platform that runs on top of Kubernetes. Kubeflow 1.6 was released on 7 September 2022, with Canonical’s official distribution, Charmed Kubeflow, following shortly after. It came with support for Kubernetes 1.22.

However, the MLOps landscape evolves quickly, and so does Charmed Kubeflow. As of today, Canonical supports the deployment of Charmed Kubeflow 1.6 on Charmed Kubernetes 1.23 and 1.24. This is essential, as Kubernetes 1.22 is no longer maintained following the release of Kubernetes 1.25.

Kubeflow 1.6 for optimised advanced training

Kubeflow 1.6 came with new enhancements focused on optimised, complex model training. In particular, this release advances the stable version of Kubeflow Pipelines (KFP v2), which offers a better user experience. Metadata is securely captured and recorded, and the pipeline execution cache avoids repeating work whose results are already known.
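
To make this concrete, here is a minimal sketch of what a pipeline definition looks like with the KFP v2 Python SDK (kfp version 2 or later). The component and pipeline names are hypothetical placeholders, not taken from the release itself:

```python
from kfp import compiler, dsl


@dsl.component
def preprocess(message: str) -> str:
    """Toy step standing in for real data preparation."""
    return message.upper()


@dsl.component
def train(features: str) -> str:
    """Toy step standing in for real model training."""
    return f"model trained on: {features}"


@dsl.pipeline(name="toy-pipeline")
def toy_pipeline(message: str = "hello kubeflow"):
    prep_task = preprocess(message=message)
    # If an identical step has already run, KFP can serve its result
    # from the pipeline execution cache instead of re-executing it.
    train(features=prep_task.output)


if __name__ == "__main__":
    # Compile the pipeline into a package that can be uploaded to Kubeflow.
    compiler.Compiler().compile(toy_pipeline, "toy_pipeline.yaml")
```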

Hyperparameter tuning is also improved in the latest version of Kubeflow. Training operators are the champions here, combining population-based training (PBT) with AI frameworks such as TensorFlow or PyTorch.
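
As an illustration, a training script only needs to expose its hyperparameters as arguments and report its metrics on stdout for Kubeflow’s tuning components (such as Katib) to drive it. The sketch below is hypothetical and framework-agnostic; the "accuracy=value" line is one convention a stdout metrics collector can parse:

```python
import argparse


def train(lr: float, epochs: int) -> float:
    """Stand-in for a real TensorFlow or PyTorch training loop."""
    return min(0.99, 0.5 + lr * epochs * 10)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--lr", type=float, default=0.01)
    parser.add_argument("--epochs", type=int, default=5)
    args = parser.parse_args()

    accuracy = train(args.lr, args.epochs)
    # Reported in "name=value" form so a metrics collector can pick it up.
    print(f"accuracy={accuracy}")
```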

Join our upcoming webinar to learn more about hyperparameter tuning on Kubeflow

Register now

The latest version of Kubeflow also makes data processing more seamless by providing better tracking capabilities. Trial logs are recorded efficiently and ML models are measured more accurately, which makes both evolution and debugging simpler. Detecting data source failures is now possible, helping to prevent data drift.


Learn more about what’s new in Kubeflow 1.6 or watch one of our live streams: beta release and technical deep dive.

Kubeflow and the Kubernetes lifecycle

According to the official guidelines, the Kubernetes lifecycle supports the latest three minor releases. Canonical’s official distribution, Charmed Kubernetes, follows the same baseline and, as an extra step, offers expanded security maintenance for the two older versions. Each Kubernetes version reaches its end of life after approximately 10 months, and end-of-life dates are announced when a new version is released.

Kubeflow 1.6 on Kubernetes 1.23 and beyond

Canonical has just finished testing Charmed Kubeflow 1.6 on two of the maintained versions of Charmed Kubernetes. This enables users to save time and continue using their Kubernetes version of choice when deploying the MLOps platform. Kubeflow offers the same functionalities and features on all announced versions, while benefiting from the new enhancements of Kubernetes.

From an enterprise perspective, this announcement is even more important. It allows the MLOps platform and the orchestration tool to run in tandem while avoiding security issues, and it enables data scientists and machine learning engineers to focus on ML models rather than infrastructure maintenance.

If you would like to benefit from these improvements, make sure you run Charmed Kubeflow. You can either deploy it using the quickstart guide or upgrade to the latest version.

What next?

Canonical is currently working on supporting Charmed Kubeflow on the latest version of Kubernetes. Support will be announced once the testing phase is completed and the application runs smoothly and at maximum performance.

Learn more about Charmed Kubeflow

