Speech recognition implemented with PaddlePaddle, for Mandarin Chinese. A mature project with good recognition accuracy. Supports training and inference on Windows and Linux, and inference on Nvidia Jetson boards.
This is a sandbox manager built with Django that uses Docker to give each user on a shared machine an isolated development environment preloaded with a common set of base tools and packages.
This project demonstrates the power and simplicity of NVIDIA NIM (NVIDIA Inference Microservices), a suite of optimized cloud-native inference microservices, by setting up and running a Retrieval-Augmented Generation (RAG) pipeline.
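A minimal sketch of the retrieval step at the heart of a RAG pipeline: rank candidate documents by similarity to the query, then build an augmented prompt. In the actual project the embedding and generation calls would go to NIM endpoints; here a toy bag-of-words embedding keeps the example self-contained, and all function names are illustrative.

```python
# Sketch of RAG retrieval: rank documents by cosine similarity to the
# query, then assemble a context-augmented prompt for the LLM.
import math
from collections import Counter

def embed(text):
    # Toy embedding: a word-count vector. A real pipeline would call an
    # embedding model (e.g. a NIM microservice) for dense vectors.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Return the k documents most similar to the query.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    # Prepend the retrieved context so the generator can ground its answer.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using this context:\n{context}\nQuestion: {query}"
```

The same structure holds when the toy pieces are swapped for a vector database and a hosted model: only `embed` and the final generation call change.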
🔎 Super-scale your images and run experiments with Residual Dense and Adversarial Networks.
A workflow that shows how to train neural networks on EC2 instances with GPU support and compares training times against CPUs.
Speech synthesis (TTS) for low-resource languages, training FastPitch from scratch and fine-tuning HiFi-GAN.
A GPU-ready Dockerfile to run Stability AI's Stable Diffusion v2 model with a simple web interface. Includes multi-GPU support.
ROS2 Docker tutorial with VSCode
Take OpenGL/GLX rendering in containers to the next level by running it in Azure Batch.
This repository is one-stop documentation for NVIDIA's TensorRT framework, covering every detail from installing TensorRT to deploying models with it.
You Only Look Once: Unified, Real-Time Object Detection
Scripts for training, inference, image visualization, and database connectivity.
Detect, track and count different classes of objects.
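One way such detect-track-count projects typically avoid double counting is to count unique track IDs rather than raw detections. A minimal sketch, assuming the tracker emits `(track_id, class_name)` pairs per frame (names here are illustrative, not this repository's API):

```python
# Count objects per class from tracker output. Each unique track id is
# counted once, so re-detections of the same object across frames do
# not inflate the totals.
from collections import defaultdict

def count_by_class(tracks):
    """tracks: iterable of (track_id, class_name) detections."""
    seen = defaultdict(set)          # class name -> set of unique track ids
    for track_id, cls in tracks:
        seen[cls].add(track_id)
    return {cls: len(ids) for cls, ids in seen.items()}
```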
A tool for running deep learning algorithms for semantic segmentation with satellite imagery
Tensorflow in Docker on Mesos #tfmesos #tensorflow #mesos
Jetson Nano Home Automator with Homebridge
NGC Container Replicator
A tutorial on how to deploy a scalable autoregressive causal language model (transformer) using NVIDIA Triton Inference Server.
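Whatever the serving stack, the per-request loop an autoregressive causal LM runs is the same: feed the tokens so far, produce one next token, append, repeat. A toy sketch, where `next_token` stands in for an inference call (illustrative, not the tutorial's actual code):

```python
# Greedy autoregressive decoding loop: one model forward pass per
# generated token, stopping at an end-of-sequence marker.
def generate(prompt, next_token, max_new_tokens=5, eos="<eos>"):
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        tok = next_token(tokens)     # stand-in for a server inference call
        if tok == eos:
            break
        tokens.append(tok)
    return tokens
```

Because each step depends on the previous one, serving this loop at scale is what motivates batching and scheduling features in inference servers such as Triton.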
🔥 Docker / NVIDIA Docker 2 deployment of YOLOv5, YOLOX, and YOLO + DeepSORT with TensorRT, ROS, and DeepStream for high-performance deployment on Jetson Nano, TX2, and NX.