jingyang2017/SRD_ossl

The implementation is based on the open-source benchmark RepDistiller.

This repo

(1) introduces a more realistic open-set semi-supervised learning (OS-SSL) setting: CIFAR-100 as labeled data, with Tiny-ImageNet or Places365 as unlabeled data (a data-loading sketch follows this list);

(2) covers the extension of SRD to OS-SSL;

(3) benchmarks 12 state-of-the-art knowledge distillation methods from RepDistiller in the OS-SSL setting;

(4) benchmarks 7 state-of-the-art semi-supervised methods in the proposed OS-SSL setting, based on their open-source code (PseudoLabel, MeanTeacher, MixMatch, FixMatch, MTCR, T2T, OpenMatch).
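
As a rough sketch of the OS-SSL data setting in (1), assuming standard torchvision loaders (the repo's own data pipeline may differ): CIFAR-100 provides the labeled pool, while Tiny-ImageNet provides an unlabeled pool containing classes outside CIFAR-100's label space, which is what makes the setting open-set.

    # Hypothetical sketch of the OS-SSL data setting; the paths and loaders
    # here are assumptions, not the repo's actual pipeline.
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    transform = transforms.Compose([
        transforms.Resize(32),  # Tiny-ImageNet is 64x64; match CIFAR's 32x32
        transforms.ToTensor(),
    ])

    # Labeled pool: CIFAR-100 train split.
    labeled = datasets.CIFAR100('data/cifar100', train=True,
                                download=True, transform=transform)

    # Unlabeled pool: Tiny-ImageNet read as a plain image folder;
    # its labels are ignored, and its classes fall outside CIFAR-100's.
    unlabeled = datasets.ImageFolder('tinyImageNet200/tiny-imagenet-200/train',
                                     transform=transform)

    labeled_loader = DataLoader(labeled, batch_size=64, shuffle=True)
    unlabeled_loader = DataLoader(unlabeled, batch_size=64, shuffle=True)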

Method

Requirements

  • Python >= 3.6
  • PyTorch >= 1.0.1
  • tensorboard
  • tensorboardX
  • tqdm
  • progress
  • matplotlib
  • numpy
  • scikit-learn
  • scikit-image
  • opencv-python

Preparation

  • Download Tiny-ImageNet: wget http://cs231n.stanford.edu/tiny-imagenet-200.zip, then put it in the folder 'tinyImageNet200'.
  • Download the Places365 dataset: torchvision.datasets.Places365(folder, download=True, small=True), then put it in the folder 'places365'. (A download sketch for both datasets follows this list.)
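
A minimal download sketch for both datasets, assuming the folder names above (the repo's loaders may expect a slightly different layout):

    # Fetch Tiny-ImageNet and Places365 into the folders named above.
    import urllib.request
    import zipfile
    import torchvision

    # Tiny-ImageNet: download the official zip and extract into 'tinyImageNet200'.
    url = 'http://cs231n.stanford.edu/tiny-imagenet-200.zip'
    urllib.request.urlretrieve(url, 'tiny-imagenet-200.zip')
    with zipfile.ZipFile('tiny-imagenet-200.zip') as zf:
        zf.extractall('tinyImageNet200')

    # Places365: the small (256x256) variant via torchvision, saved under 'places365'.
    torchvision.datasets.Places365('places365', small=True, download=True)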

Running

  1. Fetch the pretrained teacher models by:

    sh scripts/fetch_pretrained_teachers.sh
    

    which will download and save the models to save/models

  2. Run distillation following the commands in runs/run_kd_distill.sh (a sketch of the vanilla KD objective appears after these steps).

  3. Run the semi-supervised baselines following the commands in runs/run_ssl.sh.
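
For orientation, here is a minimal sketch of the vanilla KD objective (Hinton et al.), one of the 12 benchmarked distillation methods; the SRD objective itself lives in the repo's training code, and on unlabeled open-set samples only the teacher-matching term can apply, since no labels exist.

    # Vanilla KD loss: a sketch of one benchmarked baseline, not the SRD method.
    import torch.nn.functional as F

    def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
        """Cross-entropy on labels plus temperature-scaled KL to the teacher."""
        ce = F.cross_entropy(student_logits, targets)
        kl = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                      F.softmax(teacher_logits / T, dim=1),
                      reduction='batchmean') * (T * T)
        return (1.0 - alpha) * ce + alpha * kl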

Citation

@article{yang2022srd,
    title={Knowledge Distillation Meets Open-Set Semi-Supervised Learning},
    author={Yang, Jing and Zhu, Xiatian and Bulat, Adrian and Martinez, Brais and Tzimiropoulos, Georgios},
    journal={arXiv preprint arXiv:2205.06701},
    year={2022}
}

License

This project is licensed under the MIT License.
