Visual Detection Model Examples

These examples can be used on the fly with minimal or no changes to test and deploy visual detection models to the Clarifai platform. See the required files for each model below and the deployment instructions.

YOLOF

Requirements to run tests locally:

Download the checkpoint and save it in yolof/config/:

$ wget -P yolof/config https://download.openmmlab.com/mmdetection/v2.0/yolof/yolof_r50_c5_8x8_1x_coco/yolof_r50_c5_8x8_1x_coco_20210425_024427-8e864411.pth

Install dependencies to test locally:

$ pip install -r yolof/requirements.txt
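
Optionally, you can sanity-check that the downloaded checkpoint loads with mmdetection before running the tests. This is a minimal sketch, not part of the example itself; the config file name and test image path are assumptions and should be replaced with whatever actually ships in yolof/config/:

from mmdet.apis import inference_detector, init_detector

# Assumed paths: adjust to the config file shipped with this example.
config_file = "yolof/config/yolof_r50_c5_8x8_1x_coco.py"
checkpoint = "yolof/config/yolof_r50_c5_8x8_1x_coco_20210425_024427-8e864411.pth"

# Build the detector on CPU and run it on a local test image.
model = init_detector(config_file, checkpoint, device="cpu")
result = inference_detector(model, "path/to/test_image.jpg")  # per-class bbox arrays
print(len(result))  # number of COCO classes (80)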

faster-rcnn_torchserve (Torch Serve model format)

A Torch Serve model (.mar file) created by running torch-model-archiver is essentially a zip archive containing the model checkpoint, Python code, and other components. To use it within this module, follow these steps:

  1. Unzip the .mar file to obtain your checkpoint.
  2. Implement your postprocess method in inference.py (a sketch is shown after the example below).

For example, suppose you already have a Faster-RCNN .mar file built by following the Torch Serve example.

Unzip it to ./faster-rcnn_torchserve/model_store/hub/checkpoints/, since the Torch cache is configured to use this folder in the Torch Serve inference.py:

$ unzip faster_rcnn.mar -d ./faster-rcnn_torchserve/model_store/hub/checkpoints/
# in model_store/hub/checkpoints you will have
model_store/hub/checkpoints/
├── MAR-INF
│   └── MANIFEST.json
├── model.py
└── fasterrcnn_resnet50_fpn_coco-258fb6c6.pth
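
As a rough guide for step 2 above, the sketch below assumes the raw model output is the standard torchvision Faster-RCNN format (a list of dicts with 'boxes', 'labels', and 'scores' tensors). The method name, signature, and threshold are illustrative; the real entry point is defined by this example's inference.py and may differ.

def postprocess(raw_output, score_threshold=0.5):
    """Drop low-confidence detections and convert tensors to plain lists."""
    results = []
    for detection in raw_output:                      # one dict per input image
        keep = detection["scores"] >= score_threshold
        results.append({
            "boxes": detection["boxes"][keep].tolist(),    # [x1, y1, x2, y2]
            "scores": detection["scores"][keep].tolist(),
            "classes": detection["labels"][keep].tolist(),
        })
    return results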

Install dependencies to test locally:

$ pip install -r faster-rcnn_torchserve/requirements.txt

YOLOX

Requirements to run tests locally:

Download a checkpoint and save it in yolox/configs/yolox/, e.g. to download the x variant of the model:

$ wget -P yolox/configs/yolox/ https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_x_8x8_300e_coco/yolox_x_8x8_300e_coco_20211126_140254-1ef88d67.pth

Note: If you want to use a different model type or checkpoint, remember to update the checkpoint and config_path in the inference.py file accordingly.
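
For illustration only (the exact paths and layout inside yolox/inference.py may differ), switching to another YOLOX variant means pointing both values at the matching files:

# Hypothetical snippet from inference.py; adjust names and paths to match the actual file.
config_path = "configs/yolox/yolox_x_8x8_300e_coco.py"
checkpoint = "configs/yolox/yolox_x_8x8_300e_coco_20211126_140254-1ef88d67.pth"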

Install dependencies to test locally:

$ pip install -r yolox/requirements.txt

Deploy the model to Clarifai

Steps to deploy one of the above examples to the Clarifai platform after downloading the weights and testing locally.

Note: add the --no-test flag to the build and upload commands to disable testing.

  1. Build
$ clarifai build model <path/to/folder> # either `faster-rcnn_torchserve`, `yolof`, or `yolox`

Then upload the generated *.clarifai file to a storage service to obtain a direct download URL.

  2. Upload
$ clarifai upload model <path/to/folder> --url <your_url>