[CVPR2023] Few-Shot Class-Incremental Learning via Class-Aware Bilateral Distillation

Official implementation of the CVPR 2023 paper "Few-Shot Class-Incremental Learning via Class-Aware Bilateral Distillation".

Figure: overall framework.

Requirements

Datasets

We follow the standard FSCIL setting and use the same data index_list for training. Please follow the guidelines in CEC to prepare the datasets; a rough sketch of the expected layout is given below. Scripts for experiments on mini-ImageNet follow, and the full code will be released upon acceptance.
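The layout itself comes from CEC's data preparation rather than this repository, so the paths and session counts below are assumptions based on the CEC index_list convention for mini-ImageNet (one base session plus eight 5-way 5-shot incremental sessions), not something verified against this repo:

```
data/
  index_list/
    mini_imagenet/
      session_1.txt   # index of base-session training images (60 classes)
      session_2.txt   # +5 novel classes, 5 shots each
      ...
      session_9.txt   # final incremental session
  miniimagenet/
    images/           # raw images referenced by the session_*.txt files
```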

Pretrain scripts

mini-imagenet (We also provide our pre-trained model, so this step is optional.)

$ python train.py --dataset mini-imagenet --exp_dir experiment --epoch 200 --batch_size 256 --init_lr 0.1 --milestones 120 160 --val_start 100 --change_val_interval 160
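For orientation, --init_lr 0.1 together with --milestones 120 160 describes a step learning-rate schedule over the 200 pretraining epochs. A minimal PyTorch sketch of such a schedule; the model, optimizer settings, and decay factor gamma=0.1 are our assumptions for illustration, not the repo's actual code:

```python
import torch

# Illustrative stand-in for the repo's backbone and optimizer.
model = torch.nn.Linear(512, 60)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)  # --init_lr 0.1
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[120, 160], gamma=0.1)  # --milestones 120 160 (gamma assumed)

for epoch in range(200):  # --epoch 200
    # ... one training epoch over the base-session data ...
    optimizer.step()   # placeholder for the actual parameter update
    scheduler.step()   # lr: 0.1 until epoch 120, then 0.01, then 0.001 from epoch 160
```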

Testing scripts

mini-imagenet

$ python test.py --dataset mini-imagenet --exp_dir experiment --needs_finetune --ft_iters 100 --ft_lr 0.001 --ft_factor 1.0 --ft_T 16 --w_d 100 --part_frozen --ft_KD_all --ft_teacher fixed --bilateral --BC_hidden_dim 64 --BC_lr 0.01 --w_BC_binary 50 --EMA_logits --w_l 1 --EMA_FC_lr 0.01
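Among these flags, --ft_T 16 sets a distillation temperature and --ft_teacher fixed distills from a frozen teacher during finetuning. For illustration only, here is the standard Hinton-style temperature-scaled KD loss that such flags usually parameterize; the function and variable names are ours, and the paper's class-aware bilateral distillation additionally combines two teachers (base and novel), which is not reproduced here:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits: torch.Tensor,
            teacher_logits: torch.Tensor,
            T: float = 16.0) -> torch.Tensor:
    """Generic temperature-scaled knowledge distillation loss (illustrative sketch)."""
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

# Example: distill a student from a fixed teacher at T=16 (cf. --ft_T 16).
student_logits = torch.randn(8, 65, requires_grad=True)  # 60 base + 5 novel classes
teacher_logits = torch.randn(8, 65)
loss = kd_loss(student_logits, teacher_logits, T=16.0)
loss.backward()
```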

Acknowledgment

Our project references the code in the following repositories.
