
Develop AI models and run them at the edge – all on the same open source stack

Edge computing and AI go hand-in-hand to drive some of today's most valuable enterprise use cases. Canonical delivers a single stack with everything you need to easily and securely deploy and manage AI at the edge. Unlock the power of real-time, AI-driven insight in the field.


Contact Canonical about AI solutions for edge computing

Why Canonical for enterprise AI edge computing?

  • Single solution with all the tools you need, engineered for edge AI
  • Simplify compliance with end-to-end security for your models and edge devices
  • Edge-optimised OS, Kubernetes, compute and data processing

Solve edge AI complexity with an integrated and optimised stack

From GPU optimisation to distributed data processing, Canonical delivers an integrated and optimised stack purpose-built to address the specific needs of edge AI.

By centralising your edge AI journey on a Canonical stack, you can mitigate the complexity of AI projects that rely on a wide array of open source tools. Keep your projects on track and ensure they deliver ROI in production.

Canonical's portfolio encompasses the complete edge AI toolset, including the OS, Kubernetes, MLOps, compute, data processing and certified hardware – all optimised for the edge.


Watch our AI at the edge webinar ›


Security and compliance for any edge AI use case

Protect your devices and stay compliant with current and emerging industry regulations – such as the EU Cyber Resilience Act – by securing your devices and all of your open source software.

Run your models securely on any architecture, and continuously protect your devices against emerging threats with reliable over-the-air updates and auto-rollback.

Enjoy up to 12 years of security maintenance for Ubuntu and thousands of open source packages.


Download the Ubuntu Pro for Devices datasheet ›



Move faster with certified hardware

Accelerate your edge AI projects with certified hardware. We partner directly with silicon vendors to optimise and certify our solutions with dedicated edge AI hardware, so you can get the best experience straight out of the box.


Learn more about certified hardware ›


Ubuntu Core: the edge-optimised OS

Minimal, secure and strictly confined, Ubuntu Core is an operating system designed for embedded devices. The Ubuntu Core architecture is based on snaps – containerised software packages – and optimised for edge performance, security and reliable updates.

Ubuntu Core uses the same kernel, libraries and system software as classic Ubuntu, resulting in a smooth transition from development to production.

For your edge AI devices with a long lifespan, we provide up to 12 years of support for Ubuntu Core – the longest support window in the industry.
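Because Ubuntu Core and classic Ubuntu share the same snap tooling, the commands you use in development carry over unchanged to production devices. A minimal sketch (the `hello` snap is just a placeholder for your own application):

```shell
# Install an application as a strictly confined snap
# ("hello" is a placeholder; substitute your own snap).
sudo snap install hello

# List installed snaps with their versions, revisions and channels.
snap list

# Refreshes are transactional: the previous revision stays on disk,
# so a failed update can be reverted.
sudo snap refresh hello
sudo snap revert hello
```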


Explore Ubuntu Core ›


Continuous security and long-term value with OTA updates

Keep your machine learning models updated and secure post-deployment with seamless over-the-air (OTA) updates to your edge devices.

Our management tools enable you to deliver OTA updates in strict accordance with your IT policy, and automate an array of other tasks, in both connected and air-gapped environments.
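Landscape orchestrates updates at the fleet level; on each individual device, the underlying update and rollback primitives come from snapd. A minimal sketch of what an OTA cycle looks like on a single device:

```shell
# Check which installed snaps have pending updates.
snap refresh --list

# Apply updates; each refresh is atomic, and the prior revision
# is kept so a failed update can roll back automatically.
sudo snap refresh

# Inspect recent snapd changes (installs, refreshes, reverts).
snap changes
```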


Discover Landscape ›

With Ubuntu Core, you can create your own Dedicated Snap Store: a private, centralised platform for publishing and distributing software to your devices in a secure and validated way. Control exactly what snaps are available, which devices can access them, and when.


OTA updates with Ubuntu Core ›


Low footprint Kubernetes with AI integrations

Lightweight containers are ideal for edge environments, and those containers need orchestration. Canonical's MicroK8s is a minimal-footprint, zero-ops, pure upstream Kubernetes perfectly suited to the edge. MicroK8s is packaged as a secure snap that comes with all the necessary features built in – including key integrations for AI.

The NVIDIA GPU operator is integrated into Canonical Kubernetes, automating the deployment of all the software components required to provision NVIDIA GPUs.
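As a sketch, standing up MicroK8s with the GPU operator takes only a few commands (the addon is named `gpu` on older MicroK8s releases and `nvidia` on newer ones; check `microk8s status` for your version):

```shell
# Install MicroK8s as a snap (classic confinement is required).
sudo snap install microk8s --classic

# Block until the cluster reports ready.
microk8s status --wait-ready

# Enable the NVIDIA GPU operator addon
# ("gpu" on older releases, "nvidia" on newer ones).
microk8s enable gpu

# Once the operator settles, the node should advertise GPU resources.
microk8s kubectl describe node | grep -i nvidia.com/gpu
```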


Learn more about MicroK8s ›


Bring the power of the cloud to the edge

Process large data volumes and run complex ML models in the field by serving compute directly to the edge.

MicroClouds are small-scale, open source clouds consisting of low-footprint clusters of compute nodes with distributed storage and secure networking. They are lightweight, low-touch, and designed for repeatable and reliable remote deployments.
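Bootstrapping a MicroCloud is itself snap-based; a rough sketch of the documented flow (exact snap channels vary by release, so treat this as illustrative and consult Canonical's MicroCloud documentation):

```shell
# On each node that will join the MicroCloud cluster,
# install the component snaps.
sudo snap install lxd microceph microovn microcloud

# On one node, start the interactive bootstrap; it discovers the
# other nodes and configures distributed storage and networking.
sudo microcloud init
```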


CTO's guide to MicroCloud for the edge ›


Learn more about edge and AI

Run AI at scale

Read our whitepaper to learn how to build a performant ML Stack with NVIDIA DGX and Kubeflow.


An overview of machine learning security risks

Discover the most prominent security threats facing AI projects and how to mitigate them.


Edge computing in automotive

Watch the webinar on how edge computing is impacting the automotive industry, including the rise of autonomous vehicles.


Send us a message to discuss your AI needs

Contact us ›