README_CR.md

Crossroad scenario.

Data preparation

As an example, you can download a small dataset from here. Before running training, you first need to create LMDB files. The annotations should be stored in the <DATA_DIR>/annotation_train_cvt.json and <DATA_DIR>/annotation_val_cvt.json files.
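Before going further it can help to sanity-check that the converted annotation files are in place and parse as valid JSON. A minimal sketch (nothing beyond "valid JSON at the expected paths" is assumed about the file contents):

```python
import json
from pathlib import Path

def check_annotations(data_dir):
    """Verify that the converted annotation files exist and parse as JSON."""
    paths = [Path(data_dir) / name
             for name in ("annotation_train_cvt.json", "annotation_val_cvt.json")]
    for p in paths:
        if not p.is_file():
            raise FileNotFoundError(f"Missing annotation file: {p}")
        with p.open() as f:
            json.load(f)  # raises JSONDecodeError if the file is malformed
    return paths
```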

Create LMDB files

To create the LMDB files, go to the '$CAFFE_ROOT/python/lmdb_utils/' directory and run the following steps:

  1. Run docker in an interactive session with the WIDER dataset directory mounted:
     nvidia-docker run --rm -it --user=$(id -u) -v <DATA_DIR>:/data ttcf bash
  2. Convert the original annotation to Pascal VOC format for the training subset. This makes the annotation compatible with the Caffe SSD tools required for LMDB generation:
     python3 $CAFFE_ROOT/python/lmdb_utils/convert_to_voc_format.py /data/annotation_train_cvt.json /data/train.txt
  3. Run the bash script to create the LMDB:
     bash $CAFFE_ROOT/python/lmdb_utils/create_cr_lmdb.sh
  4. Close the docker session with Ctrl+D and check that the LMDB files are in <DATA_DIR>/lmdb.
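The final check in the steps above can be scripted. A small sketch for confirming the output (that each LMDB is a subdirectory of <DATA_DIR>/lmdb containing a data.mdb file is an assumption based on the standard LMDB on-disk layout):

```python
from pathlib import Path

def lmdb_outputs(data_dir):
    """Return LMDB directories found under <data_dir>/lmdb, if any."""
    lmdb_root = Path(data_dir) / "lmdb"
    if not lmdb_root.is_dir():
        return []
    # Each LMDB database is a directory holding data.mdb (and lock.mdb) files.
    return sorted(p for p in lmdb_root.iterdir()
                  if p.is_dir() and (p / "data.mdb").is_file())
```

An empty result means the creation script did not produce the expected databases.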

Person-vehicle-bike crossroad detection training

At the next stage, train the Person-vehicle-bike crossroad (four-class) detection model. To do this, follow these steps:

cd ./models
python3 train.py --model crossroad \
                --weights person-vehicle-bike-detection-crossroad-0078.caffemodel \
                --data_dir <DATA_DIR> \
                --work_dir <WORK_DIR> \
                
                --gpu <GPU_ID>
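When sweeping GPUs or experiments it may be convenient to assemble this invocation programmatically. A hedged sketch that only builds the command shown above (the wrapper function itself is not part of the toolkit):

```python
import subprocess

def train_crossroad_cmd(data_dir, work_dir, gpu_id,
                        weights="person-vehicle-bike-detection-crossroad-0078.caffemodel"):
    """Build the train.py command line for the crossroad model."""
    return ["python3", "train.py",
            "--model", "crossroad",
            "--weights", weights,
            "--data_dir", str(data_dir),
            "--work_dir", str(work_dir),
            "--gpu", str(gpu_id)]

# To actually launch training, run the command from the models directory:
# subprocess.run(train_crossroad_cmd("<DATA_DIR>", "<WORK_DIR>", 0),
#                cwd="./models", check=True)
```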

Person-vehicle-bike crossroad detection model evaluation

To evaluate the quality of the trained Person-vehicle-bike crossroad detection model on your test data, you can use the provided scripts:

python3 evaluate.py --type cr \
    --dir <WORK_DIR>/crossroad/<EXPERIMENT_NUM> \
    --data_dir <DATA_DIR> \
    --annotation annotation_val_cvt.json \
    --iter <ITERATION_NUM>

Export to IR format

python3 mo_convert.py --type cr \
    --name crossroad \
    --dir <WORK_DIR>/crossroad/<EXPERIMENT_NUM> \
    --iter <ITERATION_NUM> \
    --data_type FP32
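Conversion to IR format produces a pair of files: an .xml topology description and a .bin weights file. A sketch for locating the generated pair in the experiment directory (the exact output subdirectory layout is an assumption):

```python
from pathlib import Path

def find_ir_files(experiment_dir):
    """Return (.xml, .bin) IR file pairs found under experiment_dir."""
    root = Path(experiment_dir)
    pairs = []
    for xml in sorted(root.rglob("*.xml")):
        bin_path = xml.with_suffix(".bin")  # weights live next to the topology
        if bin_path.is_file():
            pairs.append((xml, bin_path))
    return pairs
```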

Demo

You can use this demo to see how the resulting model performs.