
naver-ai/dual-teacher


Switching Temporary Teachers for Semi-Supervised Semantic Segmentation

Jaemin Na, Jung-Woo Ha, Hyung Jin Chang, Dongyoon Han*, and Wonjun Hwang*.
In NeurIPS 2023.


Abstract: The teacher-student framework, prevalent in semi-supervised semantic segmentation, mainly employs the exponential moving average (EMA) to update a single teacher's weights based on those of the student. However, EMA updates raise a problem: the weights of the teacher and student become coupled, causing a potential performance bottleneck. Furthermore, this problem may become more severe when training with more complicated labels, such as segmentation masks, but with few annotated data. This paper introduces Dual Teacher, a simple yet effective approach that employs dual temporary teachers to alleviate the coupling problem for the student. The temporary teachers work in shifts and are progressively improved, consistently keeping the teacher and student from becoming excessively close. Specifically, the temporary teachers periodically take turns generating pseudo-labels to train the student model, and each maintains distinct characteristics of the student model from epoch to epoch. Consequently, Dual Teacher achieves competitive performance on the PASCAL VOC, Cityscapes, and ADE20K benchmarks with remarkably shorter training times than state-of-the-art methods. Moreover, we demonstrate that our approach is model-agnostic and compatible with both CNN- and Transformer-based models.
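The switching idea above can be sketched in a few lines of plain Python, with lists of floats standing in for model weights. The function names (`ema_update`, `train_with_dual_teachers`) and the `switch_period` parameter are illustrative, not the paper's actual API:

```python
# Minimal sketch of the dual-teacher mechanism: two temporary teachers take
# turns, and only the active one tracks the student via an EMA update.
# Weights are plain Python floats here; a real implementation would operate
# on model parameter tensors.

def ema_update(teacher, student, momentum=0.99):
    """EMA update: teacher <- momentum * teacher + (1 - momentum) * student."""
    return [momentum * t + (1.0 - momentum) * s for t, s in zip(teacher, student)]

def train_with_dual_teachers(student, teachers, epochs, switch_period=1):
    """Alternate the active teacher every `switch_period` epochs.

    The active teacher would generate pseudo-labels for the student's
    training step (omitted here); afterwards, only the active teacher's
    weights are EMA-updated toward the student.
    """
    for epoch in range(epochs):
        active = (epoch // switch_period) % len(teachers)
        # ... pseudo-labeling by teachers[active] and student update go here ...
        teachers[active] = ema_update(teachers[active], student)
    return teachers
```

Because the inactive teacher's weights are frozen while the other works its shift, the two teachers retain snapshots of the student from different epochs, which is what keeps teacher and student from collapsing together.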

Dataset

Download the ADE20K dataset and modify the data path in the configuration file.
For semi-supervised learning scenarios, split the images according to the partitions given by the text files in ADEChallengeData2016.

├── ./data
    ├── ADEChallengeData2016
        ├── images
            ├── training631_l
            ├── training631_u
        ├── annotations
            ├── training631_l
            ├── training631_u
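A small helper along these lines can build the labeled/unlabeled split directories from the partition text files. This is a hypothetical sketch, assuming each partition file lists one image filename per line; the helper name and arguments are illustrative:

```python
# Hypothetical split helper: copy every file named in a partition text file
# (one filename per line) from a source directory into a split directory,
# e.g. images/ -> images/training631_l.
import shutil
from pathlib import Path

def split_by_partition(list_file, src_dir, dst_dir):
    """Copy the files listed in `list_file` from `src_dir` into `dst_dir`."""
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    names = [ln.strip() for ln in Path(list_file).read_text().splitlines() if ln.strip()]
    for name in names:
        shutil.copy(Path(src_dir) / name, dst / name)
    return names
```

Run it once per partition (labeled/unlabeled, images/annotations) to reproduce the tree above.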

Installation

For installation, please refer to the guidelines in MMSegmentation v0.13.0.

Other requirements: pip install timm==0.3.2

A known working example: CUDA 10.1 and PyTorch 1.7.1

pip install torchvision==0.8.2
pip install timm==0.3.2
pip install mmcv-full==1.2.7
pip install opencv-python==4.5.1.48
cd Dual-Teacher && pip install -e . --user

Training

Download the ImageNet-1K pretrained initial weights ( google drive ) and place them in the pretrained/ folder.

Modify img_dir and ann_dir in the configuration file according to the partitions.
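An MMSegmentation-style dataset config for this step might look like the fragment below. This is a sketch under assumptions: the config filename, the dataset `type`, and the `train_u` key for the unlabeled split are hypothetical; only the directory names come from the layout above:

```python
# Hypothetical excerpt of an MMSegmentation-style dataset config pointing
# the labeled and unlabeled splits at the partition directories.
data = dict(
    train=dict(
        type='ADE20KDataset',
        data_root='data/ADEChallengeData2016',
        img_dir='images/training631_l',       # labeled images
        ann_dir='annotations/training631_l',  # labeled annotations
    ),
    train_u=dict(  # hypothetical key for the unlabeled split
        type='ADE20KDataset',
        data_root='data/ADEChallengeData2016',
        img_dir='images/training631_u',
        ann_dir='annotations/training631_u',
    ),
)
```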

bash dist_train.sh # Multi-gpu training

License

See the LICENSE file. This code is built on the SegFormer codebase and adheres to the same license.

Citation

@inproceedings{na2023switching,
  title={Switching Temporary Teachers for Semi-Supervised Semantic Segmentation},
  author={Jaemin Na and Jungwoo Ha and Hyungjin Chang and Dongyoon Han and Wonjun Hwang},
  booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
  year={2023}
}

Contact

For questions, please contact: osial46@ajou.ac.kr

About

Official code for the NeurIPS 2023 paper "Switching Temporary Teachers for Semi-Supervised Semantic Segmentation"
