H-Deformable-DETR

This is the official implementation of the paper "DETRs with Hybrid Matching".

Authors: Ding Jia, Yuhui Yuan, Haodi He, Xiaopei Wu, Haojun Yu, Weihong Lin, Lei Sun, Chao Zhang, Han Hu

News

2022.09.14: H-Deformable-DETR w/ ViT-L (MAE) achieves 56.6 AP on COCO val with 4-scale feature maps, without using the LSJ (large-scale jittering) augmentation adopted by the original ViTDet.

Model ZOO

We provide a set of baseline results and trained models for download:

Models with ViT (MAE) backbone

| Name                       | Backbone | query | LSJ | encoder | epochs | AP   | download |
| -------------------------- | -------- | ----- | --- | ------- | ------ | ---- | -------- |
| H-Deformable-DETR + tricks | ViT-B    | 300   |     | 6       | 12     | 50.6 | model    |
| H-Deformable-DETR + tricks | ViT-B    | 300   |     | 2       | 12     | 49.8 | model    |
| H-Deformable-DETR + tricks | ViT-B    | 300   |     | 0       | 12     | 47.1 | model    |
| H-Deformable-DETR + tricks | ViT-L    | 300   |     | 6       | 12     | 51.1 | model    |
| H-Deformable-DETR + tricks | ViT-L    | 300   |     | 6       | 36     | 55.5 | model    |
| H-Deformable-DETR + tricks | ViT-L    | 300   |     | 6       | 75     | 56.5 | model    |
| H-Deformable-DETR + tricks | ViT-L    | 300   |     | 6       | 100    | 56.6 | model    |
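
The checkpoints above are regular PyTorch files and can be inspected before use. Below is a minimal sketch, assuming the usual Deformable-DETR checkpoint format with a "model" key holding the state dict; the local file name is only a placeholder.

# Minimal checkpoint inspection sketch (assumes the common Deformable-DETR
# format where the weights live under the "model" key; the file name is a placeholder).
import torch

ckpt = torch.load("h_deformable_detr_vit_l.pth", map_location="cpu")
state_dict = ckpt.get("model", ckpt)
print("number of parameter tensors:", len(state_dict))
for name, tensor in list(state_dict.items())[:5]:
    print(name, tuple(tensor.shape))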

Installation

We test our models under python=3.7.10, pytorch=1.10.1, cuda=10.2. Other versions may also work.

  1. Clone this repo
git clone https://github.com/HDETR/H-Deformable-DETR.git
cd H-Deformable-DETR
  2. Install PyTorch and torchvision

Follow the instructions at https://pytorch.org/get-started/locally/.

# an example:
conda install -c pytorch pytorch torchvision
  3. Install other required packages
pip install -r requirements.txt
pip install openmim
mim install mmcv-full
pip install mmdet
  4. Compile the CUDA operators
cd models/ops
python setup.py build install
# unit test (you should see all checks print True)
python test.py
cd ../..
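
Beyond the bundled unit test, a quick import check can confirm that the compiled operator is visible to Python. The sketch below assumes it is run from the repository root after the build step; MSDeformAttn is the multi-scale deformable attention module under models/ops/modules.

# Sanity check that the compiled multi-scale deformable attention op imports
# (a sketch; run from the repository root after building models/ops).
import torch
from models.ops.modules import MSDeformAttn

attn = MSDeformAttn(d_model=256, n_levels=4, n_heads=8, n_points=4)
print("MSDeformAttn instantiated with",
      sum(p.numel() for p in attn.parameters()), "parameters;",
      "CUDA available:", torch.cuda.is_available())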

Data

Please download the COCO 2017 dataset and organize it as follows:

coco_path/
  ├── train2017/
  ├── val2017/
  └── annotations/
  	├── instances_train2017.json
  	└── instances_val2017.json
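
If you want to double-check the layout before launching a job, a small sketch like the following can do it (the coco_path value is a placeholder for your dataset root):

# Quick check that the expected COCO 2017 layout is in place
# (a sketch; coco_path below is a placeholder).
import os

coco_path = "/path/to/coco"
expected = [
    "train2017",
    "val2017",
    "annotations/instances_train2017.json",
    "annotations/instances_val2017.json",
]
for rel in expected:
    status = "ok" if os.path.exists(os.path.join(coco_path, rel)) else "missing"
    print(f"{status:8s}{rel}")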

Run

To train a model using 8 GPUs:

GPUS_PER_NODE=8 ./tools/run_dist_launch.sh 8 <config path> \
    --coco_path <coco path>

To train/eval a model with a Swin Transformer backbone, you need to download the backbone checkpoint from the official repo first and specify the --pretrained_backbone_path argument as in our configs.

To evaluate a model using 8 GPUs:

GPUS_PER_NODE=8 ./tools/run_dist_launch.sh 8 <config path> \
    --coco_path <coco path> --eval --resume <checkpoint path>

Distributed Run

You can refer to Deformable-DETR to enable training on multiple nodes.

Modified files compared to vanilla Deformable DETR

To support swin backbones

  • models/backbone.py
  • models/swin_transformer.py
  • mmcv_custom

To support evaluation on the training set

  • datasets/coco.py
  • datasets/__init__.py

To support Hybrid-branch, tricks and checkpoint

  • main.py
  • engine.py
  • models/deformable_detr.py
  • models/deformable_transformer.py

To support fp16

  • models/ops/modules/ms_deform_attn.py
  • models/ops/functions/ms_deform_attn_func.py

To fix a PyTorch version bug

  • util/misc.py

Additional packages needed

  • wandb: for logging
  • mmdet: for swin backbones
  • mmcv: for swin backbones
  • timm: for swin backbones

Citing H-Deformable-DETR

If you find H-Deformable-DETR useful in your research, please consider citing:

@article{jia2022detrs,
  title={DETRs with Hybrid Matching},
  author={Jia, Ding and Yuan, Yuhui and He, Haodi and Wu, Xiaopei and Yu, Haojun and Lin, Weihong and Sun, Lei and Zhang, Chao and Hu, Han},
  journal={arXiv preprint arXiv:2207.13080},
  year={2022}
}

@article{zhu2020deformable,
  title={Deformable detr: Deformable transformers for end-to-end object detection},
  author={Zhu, Xizhou and Su, Weijie and Lu, Lewei and Li, Bin and Wang, Xiaogang and Dai, Jifeng},
  journal={arXiv preprint arXiv:2010.04159},
  year={2020}
}
