auxSKD

PyTorch implementation of "Auxiliary Learning for Self-Supervised Video Representation via Similarity-based Knowledge Distillation", published as a CVPR 2022 workshop paper.

Pretraining

Data preparation

Follow the data preparation instructions in VideoPace.

Auxiliary Pretraining

cd auxSKD
python train.py --gpu 0,1 --bs 30 --lr 0.001 --height 128 --width 171 --crop_sz 112 --clip_len 16
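The auxiliary stage distills similarity knowledge from a teacher network into the student. As a rough illustration only (not the exact objective implemented in train.py), a similarity-based distillation loss in the spirit of ISD can be sketched as below; the anchor bank and the temperature values are assumptions made for the sketch.

# Illustrative sketch of a similarity-based distillation loss (ISD-style);
# not the repo's actual implementation. Anchor bank size and temperatures
# are placeholder choices.
import torch
import torch.nn.functional as F

def similarity_kd_loss(student_feat, teacher_feat, anchors,
                       t_student=0.1, t_teacher=0.05):
    # student_feat, teacher_feat: (B, D) clip embeddings from the two networks
    # anchors: (K, D) anchor embeddings, e.g. a memory bank of past teacher features
    student_feat = F.normalize(student_feat, dim=1)
    teacher_feat = F.normalize(teacher_feat, dim=1)
    anchors = F.normalize(anchors, dim=1)

    # Cosine similarities of each clip to every anchor: (B, K)
    sim_s = student_feat @ anchors.t() / t_student
    sim_t = teacher_feat @ anchors.t() / t_teacher

    # The teacher's (detached) similarity distribution is the soft target
    p_t = F.softmax(sim_t.detach(), dim=1)
    log_p_s = F.log_softmax(sim_s, dim=1)
    return F.kl_div(log_p_s, p_t, reduction='batchmean')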

Primary Pretraining - VSPP

cd ..
cd VSPP
python train.py --gpu 0,1 --ckpt ./auxSKD_pretrained_weights.pth --bs 30 --lr 0.001 --height 128 --width 171 --crop_sz 112 --clip_len 16
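The --ckpt flag points the VSPP stage at the weights produced by auxiliary pretraining. As a minimal sketch of how such a checkpoint could be loaded in PyTorch (the 'state_dict' key and the build_backbone helper are assumptions for illustration, not part of this repo):

# Hypothetical example of restoring the auxiliary-stage weights into a backbone;
# key names in the checkpoint may differ in practice.
import torch

model = build_backbone()  # hypothetical helper returning the video backbone
ckpt = torch.load('./auxSKD_pretrained_weights.pth', map_location='cpu')
state_dict = ckpt.get('state_dict', ckpt)
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print('missing keys:', missing)
print('unexpected keys:', unexpected)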

Acknowledgement

Part of our code is adapted from ISD and VideoPace; we thank the authors for their contributions.

Contact

For any questions, please file an issue or contact
Amirhossein Dadashzadeh: amir.dzd@gmail.com
