
darkolarczyk
on 25 October 2021


Canonical is once again proud to be a sponsor of NVIDIA GTC! Happening virtually on November 8-11, the conference will feature a wide variety of sessions on AI, computer graphics, data science, and more.

Register for the event

During this GTC, Canonical will be hosting two sessions. Join us for a session co-hosted with NVIDIA on implementing a low-latency streaming solution for Android cloud gaming. And if you’re interested in data science, tune in to our talk on using open-source tools to build your MLOps infrastructure. Take a look at the details below and reserve your free spot for the talks today!

Canonical’s Sessions at GTC

Implementing a Low-latency Streaming Solution to Power Android Cloud Gaming [A31704]

Wednesday, November 10th, 10:00 AM – 10:50 AM CET

Implementing an Android-in-the-cloud solution involves various challenges, including the development of a low-latency streaming solution. In this talk, Canonical and NVIDIA present their collaboration on building a platform that runs Android in containers at high scale and density, with low-latency video streaming powered by NVIDIA technology. This includes a zero-copy render-to-encode pipeline that keeps server-side latency as low as possible. Combined with the video encoders on NVIDIA GPUs, high-quality, low-latency solutions are now possible. We will jointly walk through the different aspects and challenges of building such a solution and provide insights into how NVIDIA technology makes it possible.
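The platform discussed in the session is specific to Anbox and NVIDIA, but the general render-to-encode idea can be sketched with off-the-shelf components. The snippet below is a minimal illustration, not the implementation presented in the talk: a GStreamer pipeline, driven from Python, that hardware-encodes frames with NVENC (the nvh264enc element from the GStreamer nvcodec plugin) and streams them over RTP. Here videotestsrc stands in for rendered Android frames and the host and port are placeholders; a real zero-copy deployment would keep frames on the GPU rather than pushing raw buffers through the pipeline like this.

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)

    # videotestsrc is a stand-in for rendered Android frames; nvh264enc
    # requires an NVIDIA GPU and the GStreamer nvcodec plugin.
    pipeline = Gst.parse_launch(
        "videotestsrc is-live=true "
        "! video/x-raw,width=1280,height=720,framerate=60/1 "
        "! videoconvert "
        "! nvh264enc preset=low-latency-hq "
        "! rtph264pay ! udpsink host=127.0.0.1 port=5000"
    )
    pipeline.set_state(Gst.State.PLAYING)

    # Run until an error or end-of-stream message arrives, then clean up.
    bus = pipeline.get_bus()
    bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                           Gst.MessageType.ERROR | Gst.MessageType.EOS)
    pipeline.set_state(Gst.State.NULL)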

RSVP for this session

Speakers:

Chad Cooper, Product Manager for Cloud Gaming, NVIDIA

Simon Fels, Engineering Manager, Anbox, Canonical

Tailored MLOps infrastructure using Open Source tools [A31634]

MLOps setups vary widely from one organisation to another, and no single platform can address all requirements. To get a cost-effective, secure, and fast MLOps platform, you need to tailor-build it. We’ll go over five real-world cases and show how you can architect a solution from open-source components, covering MLOps setups for automatic customs declaration filling, processing 4,000 4K video streams in parallel, multiple production lines in a car factory, deploying models straight to NVIDIA Jetson devices from an MLOps pipeline, and more.
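As a rough illustration of assembling such a pipeline from open-source parts (this is not code from the talk), the sketch below defines a minimal Kubeflow Pipelines (kfp v2) workflow with two hypothetical steps, train and export_for_edge, and compiles it to a YAML package that a Kubeflow deployment such as Charmed Kubeflow could run. The component bodies and the model URI are placeholders.

    from kfp import dsl, compiler

    @dsl.component(base_image="python:3.10")
    def train(epochs: int) -> str:
        # Placeholder training step: a real component would train a model,
        # push the artifact to object storage, and return its URI.
        print(f"training for {epochs} epochs")
        return "s3://models/demo/latest"

    @dsl.component(base_image="python:3.10")
    def export_for_edge(model_uri: str):
        # Placeholder export step: e.g. convert the model for an edge
        # runtime so devices such as NVIDIA Jetson can pull and serve it.
        print(f"exporting {model_uri} for edge deployment")

    @dsl.pipeline(name="edge-mlops-demo")
    def edge_mlops_demo(epochs: int = 5):
        trained = train(epochs=epochs)
        export_for_edge(model_uri=trained.output)

    # Compile to a package that can be uploaded to a Kubeflow Pipelines instance.
    compiler.Compiler().compile(edge_mlops_demo, "edge_mlops_demo.yaml")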

RSVP for this session

Speaker:

Maciej Mazur, Product Manager, Canonical

See you there!
