czjghost/AFS

AFS (Adaptive Focus Shifting algorithm)

Official implementation of the paper "New Insights on Relieving Task-Recency Bias for Online Class Incremental Learning" (TCSVT 2023).

The backbone of this project is largely based on online-continual-learning.

Requirements

Create a virtual environment

virtualenv online-cl

Activate the virtual environment

source online-cl/bin/activate

Install the required packages

pip install -r requirements.txt

Datasets

Online Class Incremental

  • Split CIFAR10
  • Split CIFAR100
  • Split Mini-ImageNet

Data preparation

Compared methods

Besides our implementation, you can easily find implementations and results for the compared methods in the official repositories of SCR, DVC, ER-ACE, and OCM.

  • AGEM: Averaged Gradient Episodic Memory (ICLR, 2019) [Paper]
  • ER: Experience Replay (ICML Workshop, 2019) [Paper]
  • MIR: Maximally Interfered Retrieval (NeurIPS, 2019) [Paper]
  • GSS: Gradient-Based Sample Selection (NeurIPS, 2019) [Paper]
  • GDumb: Greedy Sampler and Dumb Learner (ECCV, 2020) [Paper]
  • ASER: Adversarial Shapley Value Experience Replay (AAAI, 2021) [Paper]
  • SCR: Supervised Contrastive Replay (CVPR Workshop, 2021) [Paper]
  • DVC: Dual View Consistency (CVPR, 2022) [Paper]
  • OCM: Online Continual Learning based on Mutual Information Maximization (ICML, 2022) [Paper]
  • ER-ACE: Cross-Entropy based Alternative (ICLR, 2022) [Paper]

Tricks

  • The main trick used in our paper (RV, the review trick) can also be found in BER [Paper] and the online CL survey [Paper]; an implementation is available from RaptorMai.
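For readers unfamiliar with it, the review trick is, roughly, one extra fine-tuning pass over the memory buffer at a reduced learning rate after streaming training ends. The sketch below illustrates the idea on a toy scalar model fitted by SGD; the names (`review_pass`, `sgd_step`) and the learning-rate shrink factor are illustrative, not this repository's actual code.

```python
# Hedged sketch of the review trick (RV): after online training, take one
# extra pass over the replay memory with a much smaller learning rate.
# The "model" is a single scalar w trained with squared error.

def sgd_step(w, batch, lr):
    # One SGD step on mean squared error for a scalar parameter w.
    grad = sum(2.0 * (w - x) for x in batch) / len(batch)
    return w - lr * grad

def review_pass(w, memory, base_lr, shrink=10.0, batch_size=10):
    # Fine-tune on memory-buffer batches at a reduced learning rate
    # (base_lr / shrink); the shrink factor is an assumption here.
    lr = base_lr / shrink
    for i in range(0, len(memory), batch_size):
        w = sgd_step(w, memory[i:i + batch_size], lr)
    return w

memory = [float(i % 5) for i in range(100)]  # stand-in replay buffer
w = 10.0                                     # parameter after online training
w = review_pass(w, memory, base_lr=0.1)      # drifts toward the memory data
```

The point of the reduced learning rate is to rebalance the classifier on the (class-balanced) memory without destroying what was learned online.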

Run commands

Detailed descriptions of options can be found in general_main.py.
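Among those options, --loss rfocal selects the paper's revised focal loss. For orientation only, here is a pure-Python sketch of the standard focal loss it builds on (Lin et al.'s formulation; the revised variant in this repository differs, see the paper and code):

```python
import math

def focal_loss(probs, target, gamma=2.0):
    # Standard focal loss: FL(p_t) = -(1 - p_t)^gamma * log(p_t),
    # where p_t is the predicted probability of the true class.
    # With gamma = 0 this reduces to ordinary cross-entropy.
    p_t = probs[target]
    return -((1.0 - p_t) ** gamma) * math.log(p_t)
```

The (1 - p_t)^gamma factor down-weights well-classified samples, so training focuses on hard examples; for instance, a confident correct prediction (p_t = 0.9) contributes far less than under plain cross-entropy.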

Commands for reproducing results

You can run "run_cifar10.py", "run_cifar100.py", or "run_mini.py" to reproduce our paper's results, for example:

  python run_mini.py

Detailed commands are as follows:

CIFAR-10

Memory = 0.2k

  python general_main.py --agent er --loss rfocal --classify max --data cifar10 --eps_mem_batch 100 --mem_size 200 --review_trick True --kd_trick True --kd_lamda 0.05 --cor_prob 0.99 --T 20.0 --fix_order True

Memory = 0.5k

  python general_main.py --agent er --loss rfocal --classify max --data cifar10 --eps_mem_batch 100 --mem_size 500 --review_trick True --kd_trick True --kd_lamda 0.05 --cor_prob 0.99 --T 20.0 --fix_order True

Memory = 1k

  python general_main.py --agent er --loss rfocal --classify max --data cifar10 --eps_mem_batch 100 --mem_size 1000 --review_trick True --kd_trick True --kd_lamda 0.1 --cor_prob 0.99 --T 20.0 --fix_order True

CIFAR-100

Memory = 1k

  python general_main.py --agent er --loss rfocal --classify max --data cifar100 --eps_mem_batch 100 --mem_size 1000 --review_trick True --kd_trick True --kd_lamda 0.15 --cor_prob 0.99 --T 20.0 --fix_order True

Memory = 2k

  python general_main.py --agent er --loss rfocal --classify max --data cifar100 --eps_mem_batch 100 --mem_size 2000 --review_trick True --kd_trick True --kd_lamda 0.05 --cor_prob 0.99 --T 20.0 --fix_order True

Memory = 5k

  python general_main.py --agent er --loss rfocal --classify max --data cifar100 --eps_mem_batch 100 --mem_size 5000 --review_trick True --kd_trick True --kd_lamda 0.1 --cor_prob 0.99 --T 20.0 --fix_order True

Mini-ImageNet

Memory = 1k

  python general_main.py --agent er --loss rfocal --classify max --data mini_imagenet --eps_mem_batch 100 --mem_size 1000 --review_trick True --kd_trick True --kd_lamda 0.05 --cor_prob 0.99 --T 20.0 --fix_order True

Memory = 2k

  python general_main.py --agent er --loss rfocal --classify max --data mini_imagenet --eps_mem_batch 100 --mem_size 2000 --review_trick True --kd_trick True --kd_lamda 0.1 --cor_prob 0.99 --T 20.0 --fix_order True

Memory = 5k

  python general_main.py --agent er --loss rfocal --classify max --data mini_imagenet --eps_mem_batch 100 --mem_size 5000 --review_trick True --kd_trick True --kd_lamda 0.05 --cor_prob 0.99 --T 20.0 --fix_order True
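The --kd_trick, --kd_lamda, and --T flags in the commands above suggest a temperature-scaled knowledge-distillation term added to the classification loss. As a hedged sketch, the textbook (Hinton-style) formulation looks like the following; how AFS actually combines and weights the terms may differ, so treat the combination rule here as an assumption:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a larger T gives a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(student_logits, teacher_logits, T=20.0):
    # Hinton-style distillation: KL(teacher_T || student_T) scaled by T^2.
    p = softmax(teacher_logits, T)   # soft targets from the old model
    q = softmax(student_logits, T)   # current model's softened prediction
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

def total_loss(ce, student_logits, teacher_logits, kd_lamda=0.05, T=20.0):
    # Classification loss plus a weighted distillation term, mirroring
    # the --kd_lamda and --T flags (the additive combination is assumed).
    return ce + kd_lamda * kd_loss(student_logits, teacher_logits, T)
```

Matching the commands above, T = 20.0 softens both distributions heavily, and kd_lamda (0.05 to 0.15 depending on dataset and memory size) controls how strongly the old model's predictions constrain the new one.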

Citation

If you use this paper/code in your research, please consider citing us:

New Insights on Relieving Task-Recency Bias for Online Class Incremental Learning

Accepted at TCSVT 2023.

@ARTICLE{10287323,
  author={Liang, Guoqiang and Chen, Zhaojie and Chen, Zhaoqiang and Ji, Shiyu and Zhang, Yanning},
  journal={IEEE Transactions on Circuits and Systems for Video Technology}, 
  title={New Insights on Relieving Task-Recency Bias for Online Class Incremental Learning}, 
  year={2023},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/TCSVT.2023.3325651}}

Citations for the other baseline methods can be found in RaptorMai's repository.

Reference

Thanks to RaptorMai for generously sharing implementations of recent state-of-the-art methods.
