XB-MAML: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage [Paper]
Jae-Jun Lee, Sung Whan Yoon
AISTATS 2024
- Clone this repository:
git clone https://github.com/johnjaejunlee95/XB-MAML.git
- Install torchmeta:
conda create -y -n xbmaml python=3.9
conda activate xbmaml
pip install torchmeta
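A quick way to confirm the environment works (a minimal check; torchmeta pins an upper bound on the PyTorch version, so an import failure here usually means a torch/torchmeta version mismatch):

```python
# Sanity check: both packages import in the new environment
import torch
import torchmeta  # noqa: F401

print("torch", torch.__version__, "- torchmeta imported OK")
```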
Download all datasets via this link.
Alternatively, if you prefer to download them individually, please use the following links:
- CIFAR-FS: link
- mini-ImageNet: link
- tiered-ImageNet: link
- Omniglot: link
- Aircraft, Birds, Fungi, Texture (ABF, BTAF): link
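The data pipeline builds on torchmeta. Purely as an illustration (this is not the repository's own loading code, and `/your/own/path` is a placeholder), few-shot episodes can be sampled like so:

```python
from torchmeta.datasets.helpers import miniimagenet
from torchmeta.utils.data import BatchMetaDataLoader

# 5-way 5-shot mini-ImageNet episodes with 15 queries per class,
# matching the defaults listed below (download=True fetches the data)
dataset = miniimagenet("/your/own/path", ways=5, shots=5, test_shots=15,
                       meta_train=True, download=True)
loader = BatchMetaDataLoader(dataset, batch_size=2, num_workers=4)

for batch in loader:
    support_x, support_y = batch["train"]  # [tasks, ways*shots, C, H, W]
    query_x, query_y = batch["test"]       # [tasks, ways*test_shots, C, H, W]
    print(support_x.shape, query_x.shape)
    break
```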
Training on mini-ImageNet:
python train_meta.py \
--multi \
--temp_scaling 5 \
--batch_size 2 \
--update_step 3 \
--update_step_test 7 \
--update_lr 0.03 \
--regularizer 5e-4 \
--datasets miniimagenet \
--epoch 60000 \
--max_test_task 1000 \
--gpu_id 0
Training on Meta-ABF:
python train_meta.py \
--multi \
--temp_scaling 8 \
--batch_size 2 \
--update_step 3 \
--update_step_test 7 \
--update_lr 0.05 \
--regularizer 1e-3 \
--datasets_path /your/own/path \
--datasets MetaABF \
--epoch 80000 \
--max_test_task 600 \
--gpu_id 0
Evaluation:
python test_meta.py \
--datasets_path /your/own/path \
--checkpoint_path ./save/ckpt/ \
--datasets MetaABF \
--num_test 1
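For intuition on what `--multi` enables: XB-MAML keeps a set of expandable basis initializations and initializes each task with a linear combination of them, growing the basis when incoming tasks are not covered well. Below is a minimal, illustrative sketch of the combination step only; `combine_bases`, the parameter names, and the toy shapes are hypothetical and not the repository's API:

```python
import torch

def combine_bases(bases, coeffs):
    # Mix M basis parameter sets into one task initialization.
    # bases: list of {name: tensor} dicts; coeffs: tensor of shape [M]
    weights = torch.softmax(coeffs, dim=0)  # normalized mixing weights
    return {name: sum(w * b[name] for w, b in zip(weights, bases))
            for name in bases[0]}

# Toy example: two bases for a single linear layer
bases = [{"w": torch.randn(5, 3), "b": torch.zeros(5)} for _ in range(2)]
coeffs = torch.zeros(2, requires_grad=True)  # mixing coefficients are learned
init = combine_bases(bases, coeffs)
print(init["w"].shape)  # torch.Size([5, 3])
```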
Optional arguments:
--epoch: number of training epochs (default: 60000)
--num_ways: number of classes per task, N-way (default: 5)
--num_shots: number of support examples per class, k-shot (default: 5)
--num_shots_test: number of query examples per class (default: 15)
--imgc: number of image channels, e.g. 3 for RGB (default: 3)
--filter_size: number of convolution filters (default: 64)
--batch_size: meta-batch size (default: 2)
--max_test_task: number of tasks for evaluation (default: 600)
--meta_lr: outer-loop learning rate (default: 1e-3)
--update_lr: inner-loop learning rate (default: 1e-2)
--update_step: number of inner-loop update steps during training (default: 5)
--update_step_test: number of inner-loop update steps during evaluation (default: 10)
--dropout: dropout probability (default: 0.)
--gpu_id: gpu device number (default: 0)
--model: model architecture, Conv-4 or ResNet-12 (default: conv4)
--datasets: dataset to use: miniimagenet, tieredimagenet, cifar-fs, MetaABF, MetaBTAF, MetaCIO (default: MetaABF)
--multi: enable XB-MAML (flag, store_true)
--datasets_path: path to the datasets directory
--checkpoint_path: path to the checkpoint directory (default: ./save/ckpt/)
--version: file version (default: 0)
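For reference, a minimal argparse sketch that mirrors the list above (a reconstruction from the descriptions, not the repository's actual parser, which may define these options differently; `--temp_scaling` and `--regularizer` appear in the training examples and their types are assumed):

```python
import argparse

# Sketch of a parser matching the documented options and defaults
parser = argparse.ArgumentParser(description="XB-MAML (sketch)")
parser.add_argument("--epoch", type=int, default=60000)
parser.add_argument("--num_ways", type=int, default=5)
parser.add_argument("--num_shots", type=int, default=5)
parser.add_argument("--num_shots_test", type=int, default=15)
parser.add_argument("--imgc", type=int, default=3)
parser.add_argument("--filter_size", type=int, default=64)
parser.add_argument("--batch_size", type=int, default=2)
parser.add_argument("--max_test_task", type=int, default=600)
parser.add_argument("--meta_lr", type=float, default=1e-3)
parser.add_argument("--update_lr", type=float, default=1e-2)
parser.add_argument("--update_step", type=int, default=5)
parser.add_argument("--update_step_test", type=int, default=10)
parser.add_argument("--dropout", type=float, default=0.0)
parser.add_argument("--gpu_id", type=int, default=0)
parser.add_argument("--model", type=str, default="conv4")
parser.add_argument("--datasets", type=str, default="MetaABF")
parser.add_argument("--multi", action="store_true")
parser.add_argument("--datasets_path", type=str, default=None)
parser.add_argument("--checkpoint_path", type=str, default="./save/ckpt/")
parser.add_argument("--version", type=int, default=0)
parser.add_argument("--temp_scaling", type=float, default=5)  # type assumed
parser.add_argument("--regularizer", type=float, default=5e-4)  # type assumed
args = parser.parse_args()
```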
If you find this work useful, please cite:
@InProceedings{pmlr-v238-lee24b,
title = { {XB-MAML}: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage },
author = {Lee, Jae-Jun and Yoon, Sung Whan},
booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
pages = {3196--3204},
year = {2024},
editor = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
volume = {238},
series = {Proceedings of Machine Learning Research},
month = {02--04 May},
publisher = {PMLR},
pdf = {https://proceedings.mlr.press/v238/lee24b/lee24b.pdf},
url = {https://proceedings.mlr.press/v238/lee24b.html}
}