A simple library to speed up CLIP inference by up to 3x (on a K80 GPU)
A modular Rust library for super fast Stable Diffusion inference - 45% faster than PyTorch 🔮
Python scripts for performing 6D pose estimation and shape reconstruction using the CenterSnap model in ONNX
A demo that uses "PyTorch Implementation of AnimeGANv2" to overlay generated face images onto the original image
Tools for simple inference testing using the TensorRT, CUDA, OpenVINO (CPU/GPU), and CPU execution providers. A simple inference test for ONNX.
Tennis match analysis via computer vision techniques.
Python scripts performing semantic segmentation using the TopFormer model in ONNX.
Python scripts for performing Image Inpainting using the MST model in ONNX
Text Detection and Recognition using ONNX
Jetson Nano setup without a monitor for a JetBot build. Includes installation of JupyterLab, ROS2 Dashing, Torch, Torch2trt, ONNX, ONNX Runtime GPU, and TensorFlow. JupyterLab doesn't require a Docker container.
YOLOv8 inference using ONNX Runtime
EfficientViTSAM inference using ONNXRuntime
This is a deepfake tool that implements the swapping and restoration of faces in images and videos by InsightFace and GFPGAN solutions.
Optimized ONNX inference script in Python for RobustVideoMatting.
This project makes it easier to run detections using ONNX models with CUDA acceleration.
State of the art image upscaling, directly in your browser.
Remove video segments containing NSFW content.