
UaD-ClE

This repository contains the PyTorch implementation of the TNNLS paper "Uncertainty-Aware Distillation for Semi-Supervised Few-Shot Class-Incremental Learning".

The code is based on CVPR19_Incremental_Learning.

Running environment

PyTorch 1.10 with CUDA 10.0 and cuDNN 7.4 on Ubuntu 18.04

Dataset

We follow the FSCIL setting and use the same data index_list for training.

For CUB200, you can download the dataset from this link. Please put the downloaded file under the data/cub folder and unzip it.
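If you prefer to prepare the folder from the command line, the steps look roughly like this (the archive name `CUB_200_2011.tgz` is an assumption; substitute whatever file the download link actually provides):

```shell
# Create the dataset folder the README expects.
mkdir -p data/cub

# Extract the downloaded archive into it. The filename below is a guess;
# replace it with the actual file obtained from the download link.
if [ -f CUB_200_2011.tgz ]; then
    tar -xzf CUB_200_2011.tgz -C data/cub
fi
```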

Checkpoint

We provide a trained model of the first session. Please download from this link and put it in ./checkpoint.

Training script for CUB200

# train; the checkpoints will be saved in ./checkpoint
CUDA_VISIBLE_DEVICES=0 python train_cub.py --resume --uncertainty_distillation --frozen_backbone_part --flip_on_means --adapt_lamda

Citation

If you find our project useful in your research, please cite:

@article{cui2023uncertainty,
  title={Uncertainty-Aware Distillation for Semi-Supervised Few-Shot Class-Incremental Learning},
  author={Cui, Yawen and Deng, Wanxia and Chen, Haoyu and Liu, Li},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  year={2023},
  publisher={IEEE}
}
