This directory contains the implementation of our paper "Few-Shot Continual Learning via Flat-to-Wide Approaches" (FLOWER). The project is built on the F2M project, with modifications and additions implementing our methods.
Please see the "code" directory for the implementation of the algorithms.
Please see the "logs" directory for the raw results of our experiments.
This project runs in the following environment:
- NVIDIA NGC container 21.03-py3: the container includes Python 3.8, PyTorch 1.9.0, torchvision, NumPy, and other libraries. Please see the following link for the complete documentation: https://docs.nvidia.com/deeplearning/frameworks/pytorch-release-notes/rel_21-03.html#rel_21-03
- wandb (optional): you can install wandb with the command "pip install wandb" inside the container.
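As a rough sketch, the environment above can be set up as follows (the image tag comes from the NGC release notes linked above; adjust GPU and mount options for your machine):

```shell
# Pull and start the NGC PyTorch container (21.03 release).
docker pull nvcr.io/nvidia/pytorch:21.03-py3
docker run --gpus all -it --rm nvcr.io/nvidia/pytorch:21.03-py3

# Inside the container, optionally install wandb for experiment tracking.
pip install wandb
```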
We evaluate our method on three benchmark datasets: CUB-200-2011, CIFAR100, and miniImageNet.
Please download CUB-200-2011, CIFAR100, and miniImageNet before running the experiments.
Please see the file example_running_script.txt in the code directory for example run commands.
The main contributions of our work are as follows:
- Base task learning (see the *_model.py files in the methods directory): F2MJ (flat minima + projection), F2MMAS (flat minima + MAS weight importance), F2MMASJ (FLOWER base task learning), F2MMASJNOPL (FLOWER base task learning without prototype loss), and F2MSI (flat minima + SI weight importance). These files are developed from F2MModel with the necessary modifications.
- Continual tasks learning (see the *_model.py files in the methods directory): flower1 (FLOWER continual tasks learning), flower_no_psi (FLOWER without projection), flower_no_fm (FLOWER without flat minima), flower_no_ball (FLOWER without ball augmentation), flower_no_mas (FLOWER without MAS weight importance), and flower_no_pl (FLOWER without prototype loss).
- Incremental learning procedure without memory (incremental_procedure_nomem.py): the file is developed from F2MModel with the necessary modifications.
- Minor modification: we modified the incremental learning procedure script (incremental_procedure.py) for debugging purposes.
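For readers unfamiliar with the MAS (Memory Aware Synapses) weight importance used by the F2MMAS* and flower* variants above, here is a minimal NumPy sketch, independent of the repository code. MAS accumulates the magnitude of the gradient of the squared L2 norm of the model output with respect to each parameter; the function names below are hypothetical illustrations, not the repo's API.

```python
import numpy as np

def mas_importance(W, X):
    """Estimate MAS importance for a linear map f(x) = W @ x.

    For f(x) = W @ x, the gradient of ||W x||^2 w.r.t. W is 2 (W x) x^T.
    MAS averages the absolute value of this gradient over the data.
    (Hypothetical helper for illustration only.)
    """
    omega = np.zeros_like(W)
    for x in X:
        out = W @ x
        grad = 2.0 * np.outer(out, x)  # d/dW of ||W x||^2
        omega += np.abs(grad)
    return omega / len(X)

def mas_penalty(W, W_old, omega, lam=1.0):
    """Regularizer penalizing changes to weights deemed important."""
    return lam * np.sum(omega * (W - W_old) ** 2)
```

During continual learning, omega is computed after each task and the penalty is added to the loss of the next task, so that important weights drift less.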
This FLOWER project is released under the Apache 2.0 license. Please see the LICENSE file in the code directory.