Semi-Online Knowledge Distillation

A PyTorch implementation of Semi-Online Knowledge Distillation (SOKD), presented at BMVC 2021.

Requirements

This repo was tested with Python 3.8, PyTorch 1.5.1, torchvision 0.6.1, and CUDA 10.1.
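
A quick way to confirm your environment matches the tested versions (assuming torch and torchvision are already installed):

    import sys
    import torch
    import torchvision

    # Versions this repo was tested against: Python 3.8 / PyTorch 1.5.1 /
    # torchvision 0.6.1 / CUDA 10.1.
    print(sys.version.split()[0])    # expect 3.8.x
    print(torch.__version__)         # expect 1.5.1
    print(torchvision.__version__)   # expect 0.6.1
    print(torch.version.cuda)        # expect 10.1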

Training

  1. Train the vanilla (baseline) model:

     python main.py -c ./configs/vanilla.yaml --gpu 0 --name experimental_name

  2. Train with SOKD:

     python main.py -c ./configs/sokd.yaml --gpu 0 --name experimental_name
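
The SOKD objective itself lives in this repo's training code; for orientation only, the sketch below shows a generic Hinton-style soft-target distillation loss in PyTorch. The function name kd_loss and the hyperparameters T (temperature) and alpha (soft/hard weighting) are illustrative assumptions, not values or APIs taken from this repository.

    import torch.nn.functional as F

    def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
        # Illustrative generic distillation loss, not this repo's exact
        # SOKD objective. T and alpha are assumed hyperparameters.
        #
        # KL divergence between temperature-softened distributions; the
        # T**2 factor keeps gradient magnitudes comparable across temperatures.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T ** 2)
        # Standard hard-label cross-entropy on the ground truth.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard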

Compared methods can be found at the following repos:
