NVIDIA DRA Driver: Open Source AI Infrastructure for Kubernetes | KubeCon 2026
NVIDIA Donates AI Infrastructure Software to Open Source Community
Amsterdam, Netherlands – NVIDIA today announced the donation of its Dynamic Resource Allocation (DRA) Driver for GPUs to the Cloud Native Computing Foundation (CNCF) at KubeCon Europe, a major conference for cloud-native technologies. The move aims to streamline the management of high-performance AI infrastructure and foster greater collaboration within the open-source community.
The NVIDIA DRA Driver, previously under vendor governance, will now be fully owned and maintained by the Kubernetes project. This transition is expected to encourage broader participation from developers and experts, accelerating innovation and ensuring the technology remains aligned with the evolving cloud-native landscape. “NVIDIA’s deep collaboration with the Kubernetes and CNCF community to upstream the NVIDIA DRA Driver for GPUs marks a major milestone for open source Kubernetes and AI infrastructure,” said Chris Aniszczyk, chief technology officer of CNCF.
The donation addresses a key challenge in modern computing: efficiently managing the powerful GPUs that are increasingly essential for artificial intelligence workloads. The vast majority of these AI workloads currently run on Kubernetes, an open-source platform designed to automate the deployment, scaling, and management of containerized applications.
The NVIDIA DRA Driver offers several key benefits for developers: improved efficiency through smarter sharing of GPU resources, support for NVIDIA Multi-Process Service (MPS) and Multi-Instance GPU (MIG) technologies, and the ability to connect systems using NVIDIA Multi-Node NVLink interconnect technology. It also gives developers the flexibility to dynamically reconfigure hardware through fine-grained resource requests.
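To illustrate what a fine-grained resource request looks like in practice, the Kubernetes DRA API lets a pod ask for a GPU through a ResourceClaim rather than a fixed device-plugin count. The following is a minimal sketch, assuming the NVIDIA driver publishes a `gpu.nvidia.com` device class and the cluster serves the `resource.k8s.io` API group; exact API versions, class names, and image choices vary by cluster and driver release:

```yaml
# Sketch only: API version and device class name depend on the
# Kubernetes release and the installed NVIDIA DRA driver.
apiVersion: resource.k8s.io/v1beta1
kind: ResourceClaim
metadata:
  name: single-gpu
spec:
  devices:
    requests:
    - name: gpu
      deviceClassName: gpu.nvidia.com   # device class assumed to be published by the NVIDIA DRA driver
---
apiVersion: v1
kind: Pod
metadata:
  name: gpu-pod
spec:
  resourceClaims:
  - name: gpu
    resourceClaimName: single-gpu       # bind the claim above to this pod
  containers:
  - name: ctr
    image: ubuntu:24.04                 # placeholder image
    command: ["nvidia-smi"]
    resources:
      claims:
      - name: gpu                       # expose the claimed GPU to this container
```

Because the claim is a first-class API object, schedulers and drivers can negotiate device selection and sharing at allocation time instead of relying on opaque integer counts.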
NVIDIA is collaborating with a number of industry leaders on this initiative, including Amazon Web Services, Broadcom, Canonical, Google Cloud, Microsoft, Nutanix, Red Hat, and SUSE. “Open source will be at the core of every successful enterprise AI strategy, bringing standardization to the high-performance infrastructure components that fuel production AI workloads,” said Chris Wright, chief technology officer and senior vice president of global engineering at Red Hat.
The company also announced expanded GPU support for Kata Containers, lightweight virtual machines that enhance workload isolation and security, enabling confidential computing for AI applications. This allows organizations to safeguard sensitive data while leveraging the power of hardware acceleration.
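On Kubernetes, running a workload inside a Kata Containers lightweight VM is typically a matter of selecting a runtime class. A hypothetical sketch follows; the handler name (`kata` here) and the GPU resource name depend on how the cluster's container runtime and GPU drivers are installed:

```yaml
# Illustrative only: handler and resource names are assumptions
# that depend on the cluster's Kata and GPU driver installation.
apiVersion: node.k8s.io/v1
kind: RuntimeClass
metadata:
  name: kata
handler: kata                     # maps to the Kata runtime configured on the node
---
apiVersion: v1
kind: Pod
metadata:
  name: isolated-gpu-workload
spec:
  runtimeClassName: kata          # run this pod inside a lightweight VM
  containers:
  - name: inference
    image: ubuntu:24.04           # placeholder image
    resources:
      limits:
        nvidia.com/gpu: 1         # request one GPU from the node
```

The VM boundary is what enables the confidential-computing scenario described above: the guest's memory can be isolated from the host while the workload still uses hardware acceleration.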
Beyond the DRA Driver donation, NVIDIA highlighted several other open-source projects, including NVSentinel, a GPU fault-remediation system, and AI Cluster Runtime, an agentic AI framework, both initially announced at the recent NVIDIA GTC conference. The company also announced that its KAI Scheduler, a high-performance AI workload scheduler, has been onboarded as a CNCF Sandbox project, opening it up for wider community contribution.
NVIDIA also introduced Grove, an open-source Kubernetes application programming interface for orchestrating AI workloads on GPU clusters, integrating it with the llm-d inference stack. Developers and organizations can begin using and contributing to the NVIDIA DRA Driver today, with live demonstrations available at the NVIDIA booth at KubeCon Europe.
