
Cross-task class-incremental learning

Implementation of "On the importance of cross-task features for class-incremental learning" [Suppl.][arXiv][Poster]

Accepted at the International Conference on Machine Learning Workshop (ICML-W) on Theory and Foundation of Continual Learning, 2021.

This implementation extends FACIL (Framework for Analysis of Class-Incremental Learning) with two new approaches: BAL_FT and BAL_JOINT. Both approaches are located in ./src/approach/ and can be used with the FACIL framework by copying them directly into its approach folder.


Clone this GitHub repository:

git clone
cd cross_task_cil
Optionally, create an environment to run the code (click to expand).

Using a conda environment

The development environment is based on the Conda distribution. All dependencies are listed in the environment.yml file.

Create env

To create a new environment, check out the repository and type:

conda env create --file environment.yml --name crosstask

Note: set the cudatoolkit version in environment.yml to match your installed CUDA driver.
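For example, the relevant dependency entry might look like the excerpt below. The version numbers are illustrative, not taken from this repository; pick the one supported by your driver:

```yaml
# Illustrative excerpt of environment.yml -- the cudatoolkit version shown
# here is an assumption; set it to whatever your CUDA driver supports.
dependencies:
  - pytorch
  - cudatoolkit=11.3   # e.g. use 10.2 for older drivers
```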

Environment activation/deactivation

conda activate crosstask
conda deactivate

Set up your data path by modifying _BASE_DATA_PATH in ./src/datasets/

To run the basic code:

python3 -u src/

More options are explained in the FACIL framework. Specific options for approaches, loggers, datasets, and networks are documented in the corresponding FACIL subfolders.

Reproduction of the results

We provide scripts to reproduce the specific scenarios proposed in On the importance of cross-task features for class-incremental learning:

  • CIFAR-100 (10 and 20 tasks) with ResNet-32 with fixed and growing memory
bash {ctf/noctf}_{grow/fixd} <gpu> <results_dir> <num_tasks>
  • ImageNet-Subset (25 tasks) with ResNet-18 and growing memory
bash {ctf/noctf}_{grow/joint} <gpu> <results_dir>

Here "ctf" stands for learning cross-task features (+CTF), while "noctf" uses the multi-task loss and therefore does not learn cross-task features (-CTF). Each of these can be combined with either growing memory (e.g. ctf_grow) or fixed memory (e.g. ctf_fixd). For the upper bounds, use the joint variants (e.g. ctf_joint) instead.

Our reported results are averaged over 10 runs. Check out all available scripts in the scripts folder.
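As a quick sanity check, the {ctf/noctf}_{grow/fixd} naming pattern above expands to four CIFAR-100 script variants. The loop below only prints those names; it makes no assumption about the scripts' contents or locations:

```shell
# Enumerate the script-name variants implied by the {ctf/noctf}_{grow/fixd}
# pattern for the CIFAR-100 experiments. This only prints the names.
for loss in ctf noctf; do
  for mem in grow fixd; do
    echo "${loss}_${mem}"
  done
done
```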


Please check the MIT license included in this repository.


Cite this work as (the citation key is a placeholder; the year is taken from the ICML-W 2021 acceptance noted above):

@inproceedings{soutif2021importance,
  title={On the importance of cross-task features for class-incremental learning},
  author={Soutif--Cormerais, Albin and Masana, Marc and Van de Weijer, Joost and Twardowski, Bart{\l}omiej},
  booktitle={International Conference on Machine Learning Workshop},
  year={2021}
}

