Repositories list (583 repositories)
- Open-source deep-learning framework for exploring, building and deploying AI weather/climate workflows.
- CUDA Core Compute Libraries
- C++ and Python support for the CUDA Quantum programming model for heterogeneous quantum-classical workflows
- Generative AI reference workflows optimized for accelerated infrastructure and microservice architecture.
- A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal AI, and Speech AI (Automatic Speech Recognition and Text-to-Speech)
- A Python framework for accelerated simulation, data generation and spatial computing.
- A library for accelerating Transformer models on NVIDIA GPUs, including support for 8-bit floating point (FP8) precision on Hopper, Ada, and Blackwell GPUs, providing better performance with lower memory utilization in both training and inference. (A minimal FP8 usage sketch follows this list.)
- CUDA Kernel Benchmarking Library
- NeMo text processing for ASR and TTS
- TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and supports state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT-LLM also contains components to create Python and C++ runtimes that orchestrate inference execution in a performant way. (A minimal sketch of the Python API follows this list.)
- NVIDIA Federated Learning Application Runtime Environment
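The Transformer-acceleration library described above exposes a PyTorch API with drop-in layer modules and an FP8 autocast context. The sketch below shows one way a single FP8 linear layer might be run; the layer sizes, recipe settings, and input shapes are illustrative assumptions, and FP8 execution requires a supported GPU (Hopper, Ada, or Blackwell).

```python
# Minimal sketch of FP8 usage with Transformer Engine's PyTorch API.
# Layer sizes and recipe settings are illustrative, not prescribed by the listing.
import torch
import transformer_engine.pytorch as te
from transformer_engine.common import recipe

# Delayed-scaling FP8 recipe; HYBRID uses E4M3 for forward and E5M2 for backward.
fp8_recipe = recipe.DelayedScaling(margin=0, fp8_format=recipe.Format.HYBRID)

# Drop-in replacement for torch.nn.Linear with FP8 support.
layer = te.Linear(1024, 1024, bias=True).cuda()
x = torch.randn(16, 1024, device="cuda", requires_grad=True)

# Forward pass runs in FP8 inside the autocast context.
with te.fp8_autocast(enabled=True, fp8_recipe=fp8_recipe):
    y = layer(x)

# Backward pass outside the context, as in the library's quickstart pattern.
y.sum().backward()
```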
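For the TensorRT-LLM entry, the high-level Python API lets you instantiate a model and run generation in a few lines. The sketch below is an assumed minimal usage, not an official example: the model name, prompt, and sampling parameters are placeholders chosen for illustration.

```python
# Minimal sketch of TensorRT-LLM's high-level Python LLM API.
# Model name, prompt, and sampling values are illustrative assumptions.
from tensorrt_llm import LLM, SamplingParams

# Loading a Hugging Face checkpoint; the engine is built/optimized under the hood.
llm = LLM(model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

params = SamplingParams(temperature=0.8, max_tokens=64)

# Batch generation over a list of prompts.
outputs = llm.generate(["What does an inference engine do?"], params)
for out in outputs:
    print(out.outputs[0].text)
```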