Part of cpp-ml-server to store model configuration files.

haritsahm/triton-ml-server


Triton Machine Learning Model Collections

This is part of the cpp-ml-server project and holds the model configurations for the Triton Inference Server.

Build Model

Please read the Triton Inference Server guide to learn how to use different model sources and model configurations.
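As a quick orientation on what that guide covers: a minimal Triton model repository for a single ONNX model conventionally looks like the tree below, where the numbered directory is the model version and config.pbtxt describes the model's inputs and outputs.

```
model_repository/
└── imagenet_classification_static/
    ├── 1/
    │   └── model.onnx
    └── config.pbtxt
```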

  1. Load and export a timm classification model to ONNX
# Modify the model name in the script, then run
python3 export_onnx.py

Model Repository

  1. ImageNet classification (static)
    • Name: imagenet_classification_static
    • Max batch size: 1
    • Model origin: timm/efficientnetv2_rw_s
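For illustration, a config.pbtxt for the model above might look like the sketch below. The tensor names, data types, and dimensions are assumptions; they must match the actual exported ONNX model, not this example.

```
# Hypothetical config.pbtxt sketch; verify names and dims against the ONNX model.
name: "imagenet_classification_static"
platform: "onnxruntime_onnx"
max_batch_size: 1
input [
  {
    name: "input"
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]
  }
]
output [
  {
    name: "output"
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]
```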
