• NVIDIA Dynamo 1.0 provides a production-grade, open source foundation for inference at scale.
  • Dynamo and NVIDIA TensorRT-LLM optimizations integrate natively into open source frameworks such as LangChain, llm-d, LMCache, SGLang and vLLM to boost inference performance.
  • Dynamo boosts inference performance of NVIDIA Blackwell GPUs by up to 7x, lowering token cost and increasing revenue opportunity for millions of GPUs with free, open source software.
  • The NVIDIA inference platform is integrated by cloud service providers Amazon Web Services (AWS), Microsoft Azure, Google Cloud and Oracle Cloud Infrastructure (OCI), along with NVIDIA cloud partners Alibaba Cloud, CoreWeave, Together AI and Nebius. It is adopted by AI-native companies Cursor and Perplexity; inference endpoint providers Baseten, Deep Infra and Fireworks; and global enterprises ByteDance, Meituan, PayPal and Pinterest.

SAN JOSE, Calif., March 16, 2026 (GLOBE NEWSWIRE) -- GTC -- NVIDIA today announced NVIDIA Dynamo 1.0, open source software for generative and agentic inference at scale, with widespread global adoption. Together with the NVIDIA Blackwell platform, Dynamo 1.0 enables cloud providers, AI innovators and global enterprises to deliver high-performance AI inference with unmatched scale, efficiency and speed.

As agentic AI systems move into production across industries, scaling inference within a data center has become a complex resource-orchestration challenge, with requests of varying sizes, modalities and performance objectives arriving in unpredictable bursts.

Just as a computer's operating system coordinates hardware and applications, Dynamo 1.0 functions as the distributed "operating system" of AI factories, seamlessly orchestrating GPU and memory resources across the cluster to power complex AI workloads. In recent industry benchmarks, Dynamo boosted the inference performance of NVIDIA Blackwell GPUs by up to 7x, lowering token cost and increasing revenue opportunity for millions of GPUs with free, open source software.
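To make the orchestration idea concrete, here is a minimal, entirely hypothetical sketch of one piece of the problem: routing heterogeneous inference requests to the GPU worker with the most free KV-cache memory. The class and function names are illustrative assumptions for this example only and do not reflect the actual Dynamo API.

```python
# Hypothetical sketch of KV-cache-aware request routing, in the spirit of the
# GPU/memory orchestration described above. NOT the Dynamo API.
from dataclasses import dataclass, field


@dataclass
class Worker:
    """A GPU worker with a fixed KV-cache budget, measured in tokens."""
    name: str
    kv_cache_total: int
    kv_cache_used: int = 0
    assigned: list = field(default_factory=list)

    @property
    def free(self) -> int:
        return self.kv_cache_total - self.kv_cache_used


def route(request_id: str, tokens: int, workers: list[Worker]) -> Worker:
    """Send the request to the worker with the most free KV-cache that fits it."""
    candidates = [w for w in workers if w.free >= tokens]
    if not candidates:
        raise RuntimeError("no worker has capacity; queue the request or scale out")
    best = max(candidates, key=lambda w: w.free)
    best.kv_cache_used += tokens
    best.assigned.append(request_id)
    return best


workers = [Worker("gpu-0", 8192), Worker("gpu-1", 8192)]
route("req-a", 5000, workers)  # both workers empty; lands on gpu-0
route("req-b", 5000, workers)  # gpu-0 now has less headroom; lands on gpu-1
```

A real scheduler must also weigh prefix-cache reuse, prefill/decode disaggregation and per-request latency targets, which is what makes cluster-scale orchestration hard; this sketch shows only the simplest load-balancing ingredient.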