
Kris Sharma
on 7 February 2023

Secure open source MLOps for AI/ML applications in financial services


The adoption of AI/ML in financial services is increasing as companies seek to drive more robust, data-driven decision processes as part of their digital transformation journey. For global banking, McKinsey estimates that AI technologies could potentially deliver up to $1 trillion of additional value each year. But productionising machine learning at scale is challenging. The machine learning lifecycle consists of many complex components, such as data ingestion and prep, model training, tuning, deployment, monitoring and much more. It also requires collaboration and hand-offs across teams, from Data Engineering to Data Science to ML Engineering. Naturally, there is a need for operational rigour to keep all these processes synchronous and working seamlessly. These goals are hard to accomplish without a solid framework to follow.

Machine Learning Operations (MLOps) provides enterprises with a framework to successfully deploy AI/ML capabilities into production at any scale. This approach, combined with conscious efforts to secure the open-source supply chain, contributes to the long-term stability of AI/ML-powered applications and helps deliver tangible business benefits from AI/ML investments.

MLOps at financial institutions

MLOps is a set of practices for collaboration and communication between data scientists and IT operations teams. Applying these practices increases model quality, simplifies the management process, and automates the deployment of machine learning and deep learning models in large-scale production environments.

MLOps provides the automation capabilities to improve the deployment and maintenance of AI/ML applications. Like DevOps, MLOps relies on a collaborative and streamlined approach to the machine learning development lifecycle where the intersection of people, process, and technology optimises the end-to-end activities required to develop, build, and operate machine learning workloads. MLOps also makes it easier to align models with business needs, as well as the regulatory requirements of the financial services sector. 

Adopting MLOps allows financial institutions to increase productivity through self-service environments with access to curated data sets, while improving reliability, data and model quality, auditability and repeatability.

Open-source MLOps platform

For financial institutions to reap the rewards of their machine learning efforts, models must be developed within a repeatable process using an MLOps platform that empowers data scientists to manage the end-to-end ML process efficiently.

An MLOps platform provides data scientists and software engineers with a collaborative environment that facilitates:

  • iterative data exploration
  • real-time co-working capabilities for experiment tracking
  • feature engineering
  • model management
  • controlled model transitioning, deployment and monitoring

An MLOps platform automates the operational and synchronisation aspects of the machine learning lifecycle. 

Kubeflow is a free, open-source platform for developing and deploying machine learning (ML) systems, and one of the most popular open source MLOps toolkits. Data scientists at financial institutions use Kubeflow to build and experiment with ML pipelines, while ML engineers and operational teams use it to deploy ML systems to various environments for development, testing and production-level serving. Kubeflow provides components for each stage in the ML lifecycle, from exploration to training and deployment.
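To make the pipeline idea concrete, here is a minimal sketch of how a data scientist might define and compile a two-step training pipeline with the Kubeflow Pipelines (kfp) Python SDK. The component logic, pipeline name and paths are illustrative assumptions, not taken from this article.

from kfp import dsl, compiler

@dsl.component
def preprocess(raw_path: str) -> str:
    # Placeholder data preparation step: in practice this would clean the raw
    # data, engineer features and return the location of the prepared dataset.
    return raw_path + "/prepared"

@dsl.component
def train(data_path: str) -> str:
    # Placeholder training step returning the location of a model artifact.
    return data_path + "/model"

@dsl.pipeline(name="credit-risk-training")
def training_pipeline(raw_path: str = "s3://example-bucket/raw"):
    prepared = preprocess(raw_path=raw_path)
    train(data_path=prepared.output)

if __name__ == "__main__":
    # Compile to a pipeline spec that can be uploaded to a Kubeflow instance.
    compiler.Compiler().compile(training_pipeline, "credit_risk_pipeline.yaml")

Once uploaded, the same pipeline definition can be run repeatedly across development, testing and production environments, which is what makes the workflow repeatable and auditable.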

Securing an open-source MLOps platform

The Kubeflow platform is composed of a wide range of applications and scaffolding, including Jupyter Notebooks, Pipelines, KFServing, Katib, PyTorch Serving, TensorFlow Serving, Seldon Core, Istio, Argo, Prometheus, various operators and many more.

Managing open-source software and all of its dependencies securely is an imperative for financial institutions. That holds true for an open-source MLOps platform as well. Financial institutions need secure open source software for building and maintaining AI/ML powered intelligent applications without compromising on their compliance, security, or support requirements.

As more and more financial institutions leverage open source technologies, it’s crucial that open-source libraries and AI/ML toolchains are also from a trusted source with an assurance of long-term security maintenance and platform stability.

For finserv organisations looking to adopt AI/ML at scale with a secure open-source MLOps platform, Canonical offers Charmed Kubeflow together with Ubuntu Pro, which is now generally available.

Open source MLOps with Charmed Kubeflow

Despite the clear benefits of Kubeflow for ML operations, deploying, configuring and maintaining Kubeflow is still hard. The number of applications and potential scenarios makes it difficult for the Kubeflow community to provide a one-size-fits-all solution for data scientists at financial institutions to consume. 

Canonical addresses this issue by packaging each of the applications that make up Kubeflow and providing a fully supported MLOps platform for any cloud: Charmed Kubeflow.

Charmed Kubeflow packages the more than 20 applications and services that make up the latest version of Kubeflow, making deployment and operations faster and simpler anywhere – on workstations, on-premises, and on public, private and edge clouds.

Charmed Kubeflow is driven by Juju – an enterprise Operator Lifecycle Manager (OLM) that provides model-driven application management and next-generation infrastructure-as-code. In Juju, operators and applications are bundled as Charms – packages that include an operator together with metadata that supports the integration of many operators in a coherent aggregated system.
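As a rough illustration of that model-driven approach, the snippet below uses the python-libjuju client to deploy the Charmed Kubeflow bundle into an existing Juju model. It is a sketch only: it assumes a bootstrapped Juju controller on a Kubernetes cloud, and the exact bundle name and deploy arguments should be checked against the Charmed Kubeflow documentation.

import asyncio
from juju.model import Model

async def deploy_charmed_kubeflow():
    model = Model()
    await model.connect()  # connect to the currently active Juju model
    try:
        # Deploy the Charmed Kubeflow bundle; Juju then installs and relates
        # the individual charms (operators) that make up the platform.
        await model.deploy("kubeflow", trust=True)
        # Wait for the applications in the bundle to settle.
        await model.wait_for_idle(timeout=3600)
    finally:
        await model.disconnect()

if __name__ == "__main__":
    asyncio.run(deploy_charmed_kubeflow())

The same result can be achieved from the command line with a single deploy command; the point is that the whole platform is described and operated as one model rather than as dozens of hand-managed manifests.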

Charmed Kubeflow is an enterprise-ready and fully supported end-to-end MLOps platform for any cloud, and one of the official distributions of the Kubeflow upstream project. Data scientists and machine learning engineers at financial institutions benefit from ML deployments that are simple, portable, secure and scalable with Charmed Kubeflow. Charmed Kubeflow also benefits from Canonical’s Ubuntu Pro security maintenance for software packages in the Ubuntu Universe and Main repositories. Canonical offers the most comprehensive subscription for open-source software security, delivered on every cloud, data centre and desktop.

Securing your open-source AI/ML toolchains

An Ubuntu Pro subscription includes CVE patches for the images specific to the Kubeflow application, developed in collaboration with the upstream community, so that the entire workflow is secure.

Ubuntu Pro expands security coverage for critical, high and medium Common Vulnerabilities and Exposures (CVEs) to thousands of applications and toolchains, including open source MLOps applications, Ansible, Apache Tomcat, Apache Zookeeper, Docker, Drupal, Nagios, Node.js, phpMyAdmin, Puppet, PowerDNS, Python 2, Redis, Rust, WordPress, and more. It is available for every Ubuntu LTS starting with 16.04 LTS.


“Transformative innovations such as AI and deep learning have vastly expanded the volume of data that enterprises must secure against anomalies and threats. NVIDIA’s collaboration with Canonical equips enterprises with the reliability and long-term security assurance needed to achieve breakthroughs that benefit society.”
Justin Boitano
Vice President of Enterprise Computing, NVIDIA

With an aim to grow the MLOps ecosystem, Charmed Kubeflow integrates with various other AI- and data-specific platforms, including Kafka, Spark and MLflow. Ubuntu Pro covers the full stack, from the infrastructure and the operating system up to the application layer.
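As an example of that kind of integration, the short sketch below logs a training run to an MLflow tracking server from Python. The tracking URI, experiment name, parameter and metric values are hypothetical placeholders.

import mlflow

# Hypothetical MLflow tracking endpoint exposed alongside the MLOps platform.
mlflow.set_tracking_uri("http://mlflow.internal:5000")
mlflow.set_experiment("credit-risk-scoring")

with mlflow.start_run():
    # Record the configuration and evaluation results of a training run so
    # that experiments remain auditable and repeatable.
    mlflow.log_param("model_type", "gradient_boosting")
    mlflow.log_metric("auc", 0.87)

In a real pipeline, the trained model artifact would also be logged and registered, giving operational teams a single place to trace which model version is serving in production.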

Ubuntu Pro is ideal for financial institutions that want to focus on innovation while being confident of ongoing security maintenance and dependency tracking. Canonical backports security fixes from newer versions of applications, giving data scientists, ML engineers and operational teams at financial institutions a path to long-term security with no forced upgrades. The result is a decade of open source platform stability.

Canonical has a track record of almost two decades of providing timely security updates for the main Ubuntu OS, with critical CVEs patched in less than 24 hours on average. Patches are applied for critical, high, and selected medium CVEs, with many zero-day vulnerabilities fixed under embargo for release the moment the CVE is public.


“Tenable and Canonical collaborate to provide timely, accurate and actionable vulnerability alerts. Ubuntu Pro offers security patch assurance for a broad spectrum of open-source software. Together, we give customers a foundation for trustworthy open source.”
Robert Huber
Chief Security Officer, Tenable

Want to learn more about secure open source for financial services? Read our white paper!

Want to know more about how Canonical is helping financial institutions leverage open source to drive innovation at lower cost? Check out our financial services webpage.

Webinar – A secure open-source MLOps ecosystem for the financial services sector

Canonical is building an open-source MLOps ecosystem that enables developers and data scientists at financial institutions to perform optimised model training in a single tool with wider MLOps integrations.

Watch our on-demand webinar, which took place on March 7, 2023, to find out how financial institutions can use secure open source MLOps at scale to achieve enduring business value.

Register now

Deliver transformative innovations in financial services using secure open source

Get in touch


