Introduction

Official code for the paper "Fine-Grained Prototypes Distillation for Few-Shot Object Detection" (AAAI 2024).

(Figure: overview of the FPD architecture.)

This repo is based on MMFewShot.

Quick Start

# create a conda environment
conda create -n fpd python=3.8
conda activate fpd
conda install pytorch==1.12.1 torchvision==0.13.1 torchaudio cudatoolkit=11.3 -c pytorch -c conda-forge

# dependencies
pip install openmim
mim install mmcv-full==1.6.0
mim install mmcls==0.25.0
mim install mmdet==2.24.0
pip install -r requirements.txt

# install mmfewshot
pip install git+https://github.com/open-mmlab/mmfewshot.git
# or manually download the code, then
# cd mmfewshot
# pip install .

# install FPD
python setup.py develop
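
A quick sanity check that the environment is set up correctly (an optional step, assuming the installs above completed without errors):

# confirm that the core packages import and print their versions
python -c "import torch, mmcv, mmdet, mmfewshot; print(torch.__version__, mmcv.__version__, mmdet.__version__, mmfewshot.__version__)"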

Prepare Datasets

Please refer to mmfewshot/data for the data preparation steps.
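
For reference, the configs in this repo expect the datasets under data/ in the project root, roughly as sketched below (an illustrative layout following the MMFewShot convention; the linked mmfewshot/data instructions are authoritative):

data/
├── VOCdevkit/
│   ├── VOC2007/
│   └── VOC2012/
├── coco/
│   ├── annotations/
│   ├── train2014/
│   └── val2014/
└── few_shot_ann/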

Results on VOC Dataset

  • Base Training

| Config | Split | Base AP50 | ckpt |
| ------ | ----- | --------- | ---- |
| config | 1     | 79.8      | ckpt |
| config | 2     | 80.3      | ckpt |
| config | 3     | 80.2      | ckpt |
  • Few Shot Fine-tuning

| Config | Split | Shot | Novel AP50 | ckpt | log |
| ------ | ----- | ---- | ---------- | ---- | --- |
| config | 1     | 10   | 68.4       | ckpt | log |
| config | 2     | 10   | 53.9       | ckpt | log |
| config | 3     | 10   | 62.9       | ckpt | log |

Results on COCO Dataset

  • Base Training

| Config | Base mAP | ckpt |
| ------ | -------- | ---- |
| config | 36.0     | ckpt |
  • Few Shot Fine-tuning

| Config | Shot | Novel mAP (nAP) | ckpt | log |
| ------ | ---- | --------------- | ---- | --- |
| config | 30   | 20.1            | ckpt | log |

Evaluation

# single-GPU test (use --eval mAP for VOC, --eval bbox for COCO)
python test.py ${CONFIG} ${CHECKPOINT} --eval mAP|bbox

# multi-GPU test
bash dist_test.sh ${CONFIG} ${CHECKPOINT} ${NUM_GPU} --eval mAP|bbox
  • For example, test the pretrained weights on VOC Split 1 (10-shot) with 2 GPUs:
bash dist_test.sh \
    configs/fpd/voc/split1/fpd_r101_c4_2xb4_voc-split1_10shot-fine-tuning.py \
    ./work_dirs/fpd_r101_c4_2xb4_voc-split1_10shot-fine-tuning/fpd_r101_c4_2xb4_voc-split1_10shot-fine-tuning_iter_2000.pth 2 --eval mAP
  • Test the pretrained weights on COCO (30-shot) with 2 GPUs:
bash dist_test.sh \
    configs/fpd/coco/fpd_r101_c4_2xb4_coco_30shot-fine-tuning.py \
    ./work_dirs/fpd_r101_c4_2xb4_coco_30shot-fine-tuning/fpd_r101_c4_2xb4_coco_30shot-fine-tuning_iter_18000.pth 2 --eval bbox
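  • To keep the raw predictions for later analysis, the MMDetection-style --out flag can be appended to the test command (an optional extra with a hypothetical output path; run python test.py --help to confirm the options this repo exposes):
bash dist_test.sh \
    configs/fpd/coco/fpd_r101_c4_2xb4_coco_30shot-fine-tuning.py \
    ${CHECKPOINT} 2 --eval bbox --out ./work_dirs/coco_30shot_results.pkl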

Training

# single-GPU training
python train.py ${CONFIG}

# multi-GPU training
bash dist_train.sh ${CONFIG} ${NUM_GPU}
  • Training FPD on the VOC dataset with 2 GPUs:
# base training
bash dist_train.sh \
    configs/fpd/voc/split1/fpd_r101_c4_2xb4_voc-split1_base-training.py 2
    
# few-shot fine-tuning
bash dist_train.sh \
    configs/fpd/voc/split1/fpd_r101_c4_2xb4_voc-split1_10shot-fine-tuning.py 2
  • Training FPD on the COCO dataset with 2 GPUs:
# base training
bash dist_train.sh \
    configs/fpd/coco/fpd_r101_c4_2xb4_coco_base-training.py 2
    
# few-shot fine-tuning
bash dist_train.sh \
    configs/fpd/coco/fpd_r101_c4_2xb4_coco_30shot-fine-tuning.py 2 
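  • By default, checkpoints and logs are written to ./work_dirs/<config name> (as in the evaluation paths above). To redirect them, the MMDetection-style --work-dir option can be appended (a hypothetical example directory; check train.py --help for the options this repo actually exposes):
bash dist_train.sh \
    configs/fpd/voc/split1/fpd_r101_c4_2xb4_voc-split1_base-training.py 2 \
    --work-dir ./work_dirs/fpd_voc_split1_base_custom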

Citation

If you would like to cite this paper, please use the following BibTeX entry:

@InProceedings{wang2024fpd,
  title={Fine-Grained Prototypes Distillation for Few-Shot Object Detection},
  author={Wang, Zichen and Yang, Bo and Yue, Haonan and Ma, Zhenghao},
  booktitle={Proceedings of the 38th AAAI Conference on Artificial Intelligence (AAAI-24)},
  year={2024}
}
