
Let's create a simple, efficient objectness branch to address the imbalance between foregrounds and backgrounds!


Object Detectors with Objectness for Addressing Imbalance

Let's incorporate a simple objectness branch into the detector to address the foreground-background imbalance!

(Figure: retinanet_with_objectness)

This repository, objnessdet, provides the implementation. Without undersampling or Focal Loss, RetinaNet (one-stage), Faster R-CNN (two-stage), and FCOS (anchor-free) achieve similar or even better COCO AP results.

Introduction

  • We report comparable COCO AP results for object detectors trained with and without sampling/reweighting schemes. Such schemes, e.g. undersampling, Focal Loss, and GHM, have long been considered an essential component for training detectors, intended to alleviate the extreme imbalance between foregrounds and backgrounds.
  • Nevertheless, our report reveals that these schemes are not necessary for addressing the imbalance. Specifically, with three simple training/inference strategies --- decoupling objectness from classification, biased initialization, and threshold movement --- we abandon sampling/reweighting schemes in representative one-stage (RetinaNet), two-stage (Faster R-CNN), and anchor-free (FCOS) detectors, with performance no worse than the vanilla models.
  • Since sampling/reweighting schemes usually introduce laborious hyper-parameter tuning, we expect our discovery to simplify the training procedure of object detectors.
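Two of the three strategies above reduce to simple arithmetic on the objectness logits. The sketch below is illustrative only, not the repository's code: the function names and the prior value 0.01 are assumptions. Biased initialization sets the objectness bias so that an untrained branch predicts foreground with a small prior probability (as in RetinaNet's initialization), and one plausible reading of threshold movement is shifting the inference score threshold by the same offset in logit space.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def biased_init(prior=0.01):
    """Bias b with sigmoid(b) = prior, so a freshly initialized
    objectness branch scores almost everything as background and the
    huge number of background samples does not swamp early training."""
    return -math.log((1.0 - prior) / prior)

def moved_threshold(base_threshold=0.5, prior=0.01):
    """Hypothetical 'threshold movement': shift the usual decision
    threshold by the prior-induced offset in logit space, rather than
    keeping the conventional 0.5 cut-off at inference time."""
    logit = math.log(base_threshold / (1.0 - base_threshold))
    return sigmoid(logit + biased_init(prior))

print(round(sigmoid(biased_init(0.01)), 4))   # 0.01
print(round(moved_threshold(0.5, 0.01), 4))   # 0.01
```

With a prior of 0.01, the untrained branch outputs foreground probability 0.01 everywhere, and the matching inference threshold moves from 0.5 down to 0.01, so no per-dataset reweighting hyper-parameters are needed.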

Highlights

  • Implementation: based on maskrcnn-benchmark and FCOS, covering the Faster/Mask R-CNN, RetinaNet, and FCOS detectors.
  • Easy training/evaluation: bash scripts are provided to simplify training, evaluation, and testing.
  • Imbalance scheme: the added objectness scheme requires no laborious hyper-parameter tuning.

Installation

Check INSTALL.md for installation instructions.

Model Zoo and Baselines

Pre-trained models (with objectness) and baselines can be found in MODEL_ZOO.md.

| Model | AP | AP50 | AP75 | APs | APm | APl |
| --- | --- | --- | --- | --- | --- | --- |
| RetinaNet_R_50_FPN_1x (baseline) | 36.4 | 55.0 | 39.0 | 19.9 | 40.3 | 48.9 |
| RetinaNet-Obj_R_50_FPN_1x (-focalloss, +objness) | 36.5 | 55.7 | 38.7 | 19.8 | 40.2 | 49.0 |
| FasterRCNN_R_50_FPN_1x (baseline) | 36.8 | 58.4 | 40.0 | 20.7 | 39.7 | 47.9 |
| FasterRCNN-Obj_R_50_FPN_1x (-focalloss, +objness) | 37.2 | 58.7 | 40.2 | 21.6 | 40.6 | 48.3 |
| FCOS-Obj_R_50_FPN_1x | 37.0 | 55.8 | 39.5 | 21.4 | 40.9 | 47.8 |
| FCOS-Obj_R_50_FPN_1x (-focalloss, +objness) | 36.7 | 55.6 | 39.2 | 20.8 | 41.2 | 47.2 |

Training on COCO

After installation, modify the dataset path prefix in maskrcnn_benchmark/config/paths_catalog.py, then run the script train.sh to train the model on COCO train2017.
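Assuming paths_catalog.py follows the upstream maskrcnn-benchmark layout, where `DatasetCatalog.DATA_DIR` holds the dataset prefix, the modification is a one-line edit (the path below is a placeholder):

```python
# maskrcnn_benchmark/config/paths_catalog.py (excerpt)
class DatasetCatalog:
    # Point this prefix at the directory containing coco/train2017,
    # coco/val2017, and the annotation files.
    DATA_DIR = "/path/to/your/datasets"
```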

Evaluation on COCO

After installation, you can easily evaluate the model on COCO val2017 with the script eval.sh.

Other Details

See the original benchmark maskrcnn-benchmark for more details.

Citations

Please consider citing this project in your publications if it helps your research. The BibTeX entry below requires the url LaTeX package.

@misc{joya2019objness,
  author = {Joya Chen and Dong Liu and Tong Xu and Enhong Chen},
  title = {{ObjnessDet: Object Detectors with Objectness for Addressing Imbalance}},
  year = {2019},
  howpublished = {\url{https://github.com/ChenJoya/objnessdet}},
  note = {Accessed: [Insert date here]}
}

License

objnessdet is released under the MIT license. See LICENSE for additional details.
