
BT-2

Research Code for "BT^2: Backward-compatible Training with Basis Transformation" (https://arxiv.org/abs/2211.03989).

Code adapted from https://github.com/apple/ml-fct.


Requirements

We suggest using a Conda virtual environment. Please run:

```bash
conda env create -f environment.yml
conda activate sm86
```

Dataset Preparation

Make the dataset and checkpoint directories:

```bash
mkdir data_store
mkdir checkpoints
```

CIFAR-100

Please refer to https://www.cs.toronto.edu/~kriz/cifar.html for downloading CIFAR-100.
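
If the data loaders build on torchvision (an assumption about this repo; its loaders may expect a different layout, so check the paths in the config files), CIFAR-100 can also be fetched directly into data_store. A minimal sketch:

```python
# Hedged sketch: download CIFAR-100 into ./data_store via torchvision.
# Whether this layout matches the repo's expected one is an assumption;
# check the paths referenced in configs/cifar100_*.yaml.
from torchvision.datasets import CIFAR100

CIFAR100(root="data_store", train=True, download=True)   # training split
CIFAR100(root="data_store", train=False, download=True)  # test split
```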

ImageNet-1k

Please refer to https://www.image-net.org/challenges/LSVRC/2012/index.php for downloading ImageNet-1k.
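
ImageNet-1k requires registration and a manual download. Assuming the usual ImageFolder layout with one subdirectory per class (an assumption; the path below is hypothetical, so check the config files for the expected location), a quick sanity check:

```python
# Hedged sanity check for an ImageFolder-style ImageNet layout; the path
# "data_store/imagenet/train" is an assumption, not documented by the repo.
from torchvision.datasets import ImageFolder

train_set = ImageFolder("data_store/imagenet/train")
print(f"{len(train_set)} images across {len(train_set.classes)} classes")
```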

Example Experiments on CIFAR-100

We provide training and evaluation configurations for CIFAR-100 in ./configs. The following commands run backward-compatible training experiments from an old ResNet-50 model to a new ResNet-50 model, with the training data growing from 50 classes to 100 classes.

Train Old Backbone Model

```bash
python train_backbone.py --config configs/cifar100_backbone_old.yaml
```

Train Independent New Backbone Model

```bash
python train_backbone.py --config configs/cifar100_backbone_new.yaml
```

Train Backward-compatible New Model with Basis Transformation

```bash
python train_feature_transfer.py --config configs/cifar100_transfer.yaml
```
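
At a high level, BT^2 learns a basis transformation that rotates the new model's feature space so that a fixed block of dimensions aligns with the old embedding space while the remaining dimensions absorb new information. The following is an illustrative sketch of that idea only; the module name, shapes, and training details are assumptions, and train_feature_transfer.py may differ substantially:

```python
# Illustrative sketch of a learned basis transformation for backward
# compatibility (not the repo's actual module).
import torch
import torch.nn as nn

class BasisTransform(nn.Module):
    def __init__(self, d_new: int, d_old: int):
        super().__init__()
        weight = torch.empty(d_new, d_new)
        nn.init.orthogonal_(weight)        # start from an orthonormal basis
        self.basis = nn.Parameter(weight)
        self.d_old = d_old

    def forward(self, feat_new: torch.Tensor):
        z = feat_new @ self.basis          # change of basis
        compat = z[:, : self.d_old]        # block trained to match the old space
        return compat, z                   # (backward-compatible part, full feature)
```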

Train Backward-compatible New Model with BCT (https://arxiv.org/abs/2003.11942)

```bash
python train_BCT.py --config configs/cifar100_BCT.yaml
```
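
For reference, BCT enforces compatibility through an "influence loss": the new model's features are also classified by the frozen old classifier head. A hedged sketch of that objective (the helper names are illustrative, and train_BCT.py may differ in detail):

```python
# Sketch of BCT's influence loss: classify new features with both the new
# head and the frozen old head; names (new_head, old_head) are illustrative.
import torch.nn.functional as F

def bct_loss(feat_new, labels, new_head, old_head, influence_weight=1.0):
    loss_new = F.cross_entropy(new_head(feat_new), labels)        # usual CE
    loss_influence = F.cross_entropy(old_head(feat_new), labels)  # compatibility
    return loss_new + influence_weight * loss_influence
```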

Evaluation of Old/Old (query feature / gallery feature)

```bash
python eval.py --config configs/cifar100_eval_old_old.yaml
```

Evaluation of New/New (query feature / gallery feature)

```bash
python eval.py --config configs/cifar100_eval_new_new.yaml
```

Evaluation of New/Old (query feature / gallery feature)

```bash
python eval.py --config configs/cifar100_eval_old_new.yaml
```
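
In each setting the first model produces query features and the second produces gallery features; compatibility means new-model queries still retrieve correctly against an old-model gallery. A hedged sketch of such a cross-model retrieval metric (eval.py's actual implementation may differ):

```python
# Illustrative cross-model top-1 retrieval accuracy: cosine similarity
# between query and gallery embeddings from (possibly) different models.
import torch.nn.functional as F

def retrieval_top1(query, gallery, query_labels, gallery_labels):
    q = F.normalize(query, dim=1)
    g = F.normalize(gallery, dim=1)
    nearest = (q @ g.T).argmax(dim=1)  # nearest gallery item per query
    return (gallery_labels[nearest] == query_labels).float().mean().item()
```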

Example Experiments on ImageNet-1k

We provide training and evaluation configurations for ImageNet-1k in ./configs. The commands mirror those used for the CIFAR-100 experiments.

Checkpoints and Results

We provide checkpoints trained with the example configuration files and used for the results in the paper here.
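
Assuming the checkpoints are standard PyTorch state dicts (an assumption, and the filename below is hypothetical), loading one might look like:

```python
# Hedged loading sketch; the checkpoint filename and state-dict layout are
# assumptions rather than documented behavior of this repo.
import torch
from torchvision.models import resnet50

model = resnet50(num_classes=100)  # 100-class CIFAR backbone, per the example configs
state = torch.load("checkpoints/cifar100_backbone_old.pt", map_location="cpu")
model.load_state_dict(state, strict=False)  # strict=False: keys/heads may differ
```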

Example experiment results on CIFAR-100:

| Method | Setting | TOP1 | TOP5 | meanAP |
| --- | --- | --- | --- | --- |
| Independent | $\phi_{old}/\phi_{old}$ | 33.6 | 55.4 | 24.4 |
| Independent | $\phi_{new}/\phi_{old}$ | 0.8 | 4.9 | 1.5 |
| Independent | $\phi_{new}/\phi_{new}$ | 62.7 | 74.6 | 49.9 |
| BCT | $\phi_{new}/\phi_{old}$ | 25.0 | 62.1 | 24.7 |
| BCT | $\phi_{new}/\phi_{new}$ | 60.0 | 71.9 | 47.3 |
| $BT^2$ (ours) | $\phi_{new}/\phi_{old}$ | 38.7 | 64.6 | 27.7 |
| $BT^2$ (ours) | $\phi_{new}/\phi_{new}$ | 62.4 | 75.1 | 50.5 |

Example experiment results on ImageNet-1k:

| Method | Setting | TOP1 | TOP5 | meanAP |
| --- | --- | --- | --- | --- |
| Independent | $\phi_{old}/\phi_{old}$ | 40.9 | 55.8 | 33.6 |
| Independent | $\phi_{new}/\phi_{old}$ | 0.1 | 0.5 | 0.2 |
| Independent | $\phi_{new}/\phi_{new}$ | 67.9 | 81.4 | 52.3 |
| BCT | $\phi_{new}/\phi_{old}$ | 44.3 | 66.4 | 34.6 |
| BCT | $\phi_{new}/\phi_{new}$ | 65.3 | 80.0 | 54.0 |
| $BT^2$ (ours) | $\phi_{new}/\phi_{old}$ | 44.4 | 65.7 | 35.0 |
| $BT^2$ (ours) | $\phi_{new}/\phi_{new}$ | 66.6 | 81.1 | 54.6 |

* Some results differ from those in the paper due to sensitivity to hyperparameters and random seeds.

Experiment results for a sequence of model updates (AlexNet → VGG → ResNet → ViT) on ImageNet-1k:

| Method | Setting | TOP1 | TOP5 | meanAP |
| --- | --- | --- | --- | --- |
| Independent | $\phi_{alex}/\phi_{alex}$ | 46.6 | 66.3 | 29.1 |
| Independent | $\phi_{vgg}/\phi_{vgg}$ | 63.2 | 79.0 | 49.6 |
| Independent | $\phi_{res}/\phi_{res}$ | 67.9 | 81.4 | 52.3 |
| Independent | $\phi_{vit}/\phi_{vit}$ | 78.0 | 87.5 | 72.4 |
| BCT | $\phi_{vgg}/\phi_{alex}$ | 54.4 | 74.1 | 36.2 |
| BCT | $\phi_{vgg}/\phi_{vgg}$ | 58.4 | 75.4 | 47.0 |
| BCT | $\phi_{res}/\phi_{alex}$ | 46.0 | 71.9 | 30.6 |
| BCT | $\phi_{res}/\phi_{vgg}$ | 48.9 | 75.2 | 44.4 |
| BCT | $\phi_{res}/\phi_{res}$ | 64.3 | 79.1 | 52.7 |
| BCT | $\phi_{vit}/\phi_{alex}$ | 54.9 | 82.0 | 36.3 |
| BCT | $\phi_{vit}/\phi_{vgg}$ | 57.5 | 84.1 | 50.5 |
| BCT | $\phi_{vit}/\phi_{res}$ | 70.3 | 85.1 | 57.0 |
| BCT | $\phi_{vit}/\phi_{vit}$ | 73.9 | 86.0 | 65.8 |
| $BT^2$ (ours) | $\phi_{vgg}/\phi_{alex}$ | 56.5 | 75.6 | 37.1 |
| $BT^2$ (ours) | $\phi_{vgg}/\phi_{vgg}$ | 61.0 | 77.2 | 47.5 |
| $BT^2$ (ours) | $\phi_{res}/\phi_{alex}$ | 56.7 | 78.5 | 37.2 |
| $BT^2$ (ours) | $\phi_{res}/\phi_{vgg}$ | 61.5 | 80.8 | 50.6 |
| $BT^2$ (ours) | $\phi_{res}/\phi_{res}$ | 66.6 | 80.8 | 56.8 |
| $BT^2$ (ours) | $\phi_{vit}/\phi_{alex}$ | 57.9 | 83.5 | 37.6 |
| $BT^2$ (ours) | $\phi_{vit}/\phi_{vgg}$ | 62.5 | 86.5 | 52.7 |
| $BT^2$ (ours) | $\phi_{vit}/\phi_{res}$ | 72.0 | 87.0 | 60.6 |
| $BT^2$ (ours) | $\phi_{vit}/\phi_{vit}$ | 75.6 | 87.4 | 68.0 |

Cite our paper

```bibtex
@misc{zhou2023bt2,
      title={$BT^2$: Backward-compatible Training with Basis Transformation},
      author={Yifei Zhou and Zilu Li and Abhinav Shrivastava and Hengshuang Zhao and Antonio Torralba and Taipeng Tian and Ser-Nam Lim},
      year={2023},
      eprint={2211.03989},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```
