Open source real-time translation app for Android that runs locally
🛠 A lite C++ toolkit of awesome AI models, supporting ONNXRuntime, MNN, TNN, NCNN, and TensorRT.
⚡️An easy-to-use, fast deep-learning model deployment toolkit for ☁️cloud, 📱mobile, and 📹edge. Covers 20+ mainstream image, video, text, and audio scenarios and 150+ SOTA models, with end-to-end optimization and multi-platform, multi-framework support.
An OBS plugin for removing background in portrait images (video), making it easy to replace the background when recording or streaming.
🍅🍅🍅YOLOv5-Lite: Evolved from yolov5; the model is only 900+ KB (int8) or 1.7 MB (fp16), and reaches 15 FPS on the Raspberry Pi 4B.
Fast, lightweight neural text-to-speech in Dart that works without an internet connection and can run on the CPU.
ONNX Dart library.
small c++ library to quickly deploy models using onnxruntime
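Several of the C++ entries above wrap the same underlying ONNX Runtime C++ API pattern: create an environment and session, wrap input data in an `Ort::Value` tensor, and call `Session::Run`. The sketch below illustrates that pattern under stated assumptions — the model file `model.onnx`, the tensor names `input`/`output`, and the 1x3x224x224 shape are all placeholders, not taken from any repository listed here. Building it requires the ONNX Runtime SDK.

```cpp
// Minimal sketch of single-tensor inference with the ONNX Runtime C++ API.
// "model.onnx", the I/O names, and the shape are illustrative assumptions.
#include <onnxruntime_cxx_api.h>
#include <array>
#include <vector>

int main() {
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "demo");
    Ort::SessionOptions opts;
    Ort::Session session(env, "model.onnx", opts);

    // A 1x3x224x224 float tensor (e.g. one RGB image), zero-filled here.
    std::array<int64_t, 4> shape{1, 3, 224, 224};
    std::vector<float> data(1 * 3 * 224 * 224, 0.0f);
    Ort::MemoryInfo mem =
        Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
    Ort::Value input = Ort::Value::CreateTensor<float>(
        mem, data.data(), data.size(), shape.data(), shape.size());

    // Names must match the model's graph; "input"/"output" are assumed.
    const char* in_names[]  = {"input"};
    const char* out_names[] = {"output"};
    auto outputs = session.Run(Ort::RunOptions{nullptr},
                               in_names, &input, 1, out_names, 1);

    // Raw pointer into the first output tensor; post-process as needed.
    float* result = outputs.front().GetTensorMutableData<float>();
    (void)result;
    return 0;
}
```

The same session object can be reused across calls to `Run`, which is why most of these toolkits create it once at startup rather than per frame.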
A simple Windows / Xbox app for generating AI images with Stable Diffusion.
Efficient CPU/GPU/Vulkan ML Runtimes for VapourSynth (with built-in support for waifu2x, DPIR, RealESRGANv2/v3, Real-CUGAN, RIFE, SCUNet and more!)
YOLOX + ROS2 object detection package (C++ only).
yolov5-v6.1 object detection deployed with both OpenCV and ONNXRuntime, in C++ and Python versions. Supports all ten yolov5-v6.1 variants: yolov5s, yolov5m, yolov5l, yolov5n, yolov5x, yolov5s6, yolov5m6, yolov5l6, yolov5n6, and yolov5x6.
YOLOv5 ONNX Runtime C++ inference code.
YOLOX + ByteTrack object tracking deployed with both OpenCV and ONNXRuntime, in C++ and Python versions.
YOLOv5 segmentation with ONNX Runtime and OpenCV.
🔥Robust Video Matting C++ inference toolkit with ONNXRuntime, MNN, NCNN, and TNN, via lite.ai.toolkit.
ONNX Runtime Server: a server that provides TCP and HTTP/HTTPS REST APIs for ONNX inference.
chineseocr lite onnx: an ultra-lightweight Chinese OCR demo supporting ONNX inference (dbnet + crnn + anglenet).
NanoDet-Plus deployed with both OpenCV and ONNXRuntime, in C++ and Python versions.