NVIDIA’s Ariel Kit Explains How NVIDIA BlueField DPUs Are Redefining Data Center Services
Tags: Datacentre , dpu , gtc , networking , nvidia , smartnic
NVIDIA is redefining the data center around the concept of data processing units (DPUs): powerful network cards running Ubuntu out of the box that combine hardware and software to deliver new classes of cloud architectures – in the data center and at the edge.
Whether for private clouds, edge computing or data center technologies, we are exploring these new possibilities. We spoke with Ariel Kit, Director of Product Management for NVIDIA BlueField DPUs, to learn more.
Ariel, why is NVIDIA redefining the data center around DPUs? How do DPUs relate to SmartNICs and GPUs?
Looking back at data center architecture, everything was designed around a single compute entity, the CPU, which was served by dedicated interface cards handling specific tasks as extensions of the CPU. However, CPUs were designed for general-purpose computing and cannot efficiently handle the workloads of modern data centers, which run an increasing amount of real-time AI processing and data analytics, both of which are extremely compute-intensive.
In addition, with the evolution to ‘software defined everything’, all the data center infrastructure became software defined services that run on the CPU. This is creating significant networking and security overhead on the CPU cores, especially now that massive quantities of data need to be processed to support the digital transformation of the AI and IoT era.
A popular solution in the previous decade was to add more CPU cores; yet, with the end of Moore's Law, scaling up CPUs is no longer cost-effective, and single-node performance is reaching its limits.
GPUs are now necessary to enable accelerated computing and to support the growth in AI/ML and data analytics-based workloads.
In a world of overutilised CPUs that handle traditional application compute alongside I/O management and security tasks, DPU-based SmartNICs accelerate data-intensive tasks and offload them from the CPU. These tasks are essential to disaggregate resources and make data centers composable: infrastructure services such as networking, storage and security shift to run accelerated on the DPU, in an isolated domain separate from the application domain.
So now the DPU joins the GPU and CPU as the third pillar of modern data centers.
What are the typical applications of DPUs? Are DPUs aimed at implementing new data center architectures, or accelerating workloads? Where are these devices getting deployed already?
At GTC, we’ve announced the NVIDIA BlueField-2 DPU, the most advanced accelerated data center networking solution, which incorporates a powerful Arm-based compute engine with all of the networking, storage, and security accelerators built into the latest ConnectX-6 Dx adapter. The NVIDIA DPU accelerates advanced networking functions like virtual switching and routing, load balancing, 5G networking for telecommunications and streaming, in addition to advanced storage capabilities such as virtualised elastic block storage exposing remote NVMe drives as if they were local, along with accelerated compression, encryption, and de-duplication.
A major focus for us is to help organisations deploy a better-secured data center. With the massive impact of COVID-19, remote workforces extend beyond the organisation's perimeter, significantly increasing the number and risk of cyber attacks. Today, in a multi-tenant environment, most traffic is east-west, and modern workloads are dynamic and move around the data center. In this environment, the DPU is the right tool to offer distributed security for the compute nodes by enabling agentless micro-segmentation, next-generation firewalls and transparent encryption for every cloud and edge compute node, meeting the strictest privacy regulations. We understand that some organisations compromise on application performance to achieve better security. DPUs eliminate this tradeoff by delivering secure networking and storage services at wire speed without sacrificing application performance.
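Conceptually, micro-segmentation replaces a single perimeter with per-workload allow rules enforced in the data path, with everything unmatched denied by default. Here is a toy sketch of that default-deny evaluation in Python; the subnets, ports and rule shape are made up for illustration, and real DPU enforcement runs in the card's hardware pipeline, not in host code like this:

```python
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

@dataclass(frozen=True)
class Rule:
    src: str   # source network allowed to initiate the flow
    dst: str   # destination network
    port: int  # destination port

# Hypothetical east-west policy: only the web tier may reach the
# database and cache tiers, and only on their service ports.
RULES = [
    Rule("10.0.1.0/24", "10.0.2.0/24", 5432),  # web -> database
    Rule("10.0.1.0/24", "10.0.3.0/24", 6379),  # web -> cache
]

def allowed(src_ip: str, dst_ip: str, dst_port: int) -> bool:
    """Default-deny: permit a flow only if some rule matches it."""
    return any(
        ip_address(src_ip) in ip_network(r.src)
        and ip_address(dst_ip) in ip_network(r.dst)
        and dst_port == r.port
        for r in RULES
    )
```

For example, a web server at 10.0.1.5 reaching the database on port 5432 is allowed, while a compromised database host probing the cache tier is dropped because no rule covers that path. Running this per node, rather than only at a perimeter firewall, is the essence of micro-segmentation.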
We see great adoption of DPU solutions for bare metal and multi-tenant environments, where the only way to deploy networking policies and security enforcement is by running these services accelerated on DPUs.
Why do you think it’s important to offer Ubuntu on NVIDIA BlueField DPUs? Is it pre-installed out of the box?
Ubuntu is available today as an out-of-the-box option with NVIDIA BlueField-2 DPUs. This improves the developer experience when building new DPU infrastructure services. The IT experience for deploying DPUs in the data center must also be simple and intuitive with a strong OS security foundation. Additionally, we offer developers great value from the growing app ecosystem of Ubuntu and look forward to continuing to deliver value as the DPU-accelerated data center evolution progresses.
And are there additional, perhaps complementary, benefits of Ubuntu on DPUs?
That’s a great question. Ubuntu already integrates the inbox drivers to deliver key functionality on the DPU. Ubuntu is built with a front-and-center focus on security, and includes features like secure boot and continuous security updates, while the vibrant Ubuntu Linux ecosystem allows developers to stand on the shoulders of the community, which provides an array of common applications that can be reused. This allows enterprise developers to focus on the core application closest to their product’s heart while readily accessing common services. Developers used to developing on Ubuntu in the cloud can now deploy easily using Ubuntu Core, which has an extremely small footprint. Ubuntu Core is perfect for the edge, where there is a great need for reliability in addition to security and bandwidth limitations.
This is all so exciting! The DPU has come a long way and already adds so much value. How do you see the DPU evolving in the future? Are there key markets which should be looking at this technology and questioning why it is not part of their plans yet?
It is very exciting.
NVIDIA DPUs, together with NVIDIA GPUs, are going to accelerate computing everywhere, from the heart of the data center to the edge. These are environments in which acceleration and security are critical, especially with the tremendous growth of 5G and IoT. Eventually in every use case and architecture in which infrastructure flexibility and scale is needed, you will find a DPU and likely a GPU. In the coming years I expect enterprises to embrace the DPU and ask themselves why they didn’t start using this technology sooner.
I want to get started! How do I get my hands on hardware and SDKs?
Start your journey with the NVIDIA Data Center Infrastructure on a Chip Architecture (DOCA) SDK here. You can learn more about NVIDIA DPU products here.
Ariel, thanks for joining us! It’s always such a pleasure talking with you.
You can read more about NVIDIA’s new family of BlueField DPUs, which bring breakthrough networking, storage and security performance to every data center. You can learn more about Ubuntu’s security features here. Ubuntu Core is Ubuntu for embedded environments, optimised for security and reliable updates. Here is a link to Canonical’s talk about our support for the DPU in the enterprise at NVIDIA GTC October 2020.