Deployment

We provide support for some popular deployment tools. This part builds on the YOLOX deployment implementation and its adaptation by ByteTrack.

ONNX support

  1. Convert the PyTorch model to an ONNX checkpoint; we provide an example here.

    # In practice you may want a smaller model for faster inference.
    python deploy/scripts/export_onnx.py --output-name ocsort.onnx -f exps/example/mot/yolox_x_mix_det.py -c pretrained/bytetrack_x_mot17.pth.tar
  2. Run inference on the provided demo video (a minimal ONNX Runtime sketch follows this list):

    cd $OCSORT_HOME/deploy/ONNXRuntime
    python onnx_inference.py
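
If you want to use the exported checkpoint in your own pipeline rather than through onnx_inference.py, the snippet below is a minimal ONNX Runtime sketch. The input tensor name "images" and the 800x1440 input resolution are assumptions based on common YOLOX-X MOT settings; verify both against your exported graph.

    # Minimal ONNX Runtime sketch (input name and size are assumptions; verify with your model)
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession("ocsort.onnx", providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name                   # usually "images" for YOLOX exports
    dummy = np.random.rand(1, 3, 800, 1440).astype(np.float32)  # assumed test size
    outputs = session.run(None, {input_name: dummy})             # raw YOLOX predictions, before postprocessing
    print([o.shape for o in outputs])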

TensorRT support (Python)

  1. Follow the TensorRT Installation Guide and the torch2trt documentation to install TensorRT (version 7 recommended) and torch2trt.

  2. Convert the model.

    # You need to download the checkpoint bytetrack_s_mot17.pth.tar from the ByteTrack model zoo first.
    python3 deploy/scripts/trt.py -f exps/example/mot/yolox_s_mix_det.py -c pretrained/bytetrack_s_mot17.pth.tar
  3. Run on a demo video (a loading sketch for the converted engine follows this list).

    python3 tools/demo_track.py video -f exps/example/mot/yolox_s_mix_det.py --trt --save_result
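
If you want to call the converted engine directly instead of through demo_track.py, torch2trt's TRTModule can load the serialized state dict, as in the minimal sketch below. The checkpoint path and the 608x1088 input size are assumptions based on the default yolox_s_mix_det experiment; adjust them to wherever trt.py wrote its output.

    # Minimal torch2trt loading sketch (path and input size are assumptions)
    import torch
    from torch2trt import TRTModule

    trt_file = "YOLOX_outputs/yolox_s_mix_det/model_trt.pth"  # assumed default output location
    model_trt = TRTModule()
    model_trt.load_state_dict(torch.load(trt_file))

    dummy = torch.ones(1, 3, 608, 1088).cuda()                # assumed test size of yolox_s_mix_det
    with torch.no_grad():
        outputs = model_trt(dummy)                            # same output format as the PyTorch model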

Note: We haven't validated C++ support for TensorRT yet; please refer to the ByteTrack guidance for adaptation for now.

ncnn support

Please follow the guidelines from ByteTrack to deploy with ncnn.