


A Hard-to-Beat Baseline for Training-free CLIP-based Adaptation

Zhengbo Wang1,2, Jian Liang2,3†, Lijun Sheng1,2, Ran He2,3, Zilei Wang1, Tieniu Tan4
1 University of Science and Technology of China    2 CRIPAC & MAIS, Institute of Automation, Chinese Academy of Sciences    3 School of Artificial Intelligence, University of Chinese Academy of Sciences    4 Nanjing University

ICLR, 2024

[Paper]      [Project Page]      [Code]

Requirements

Installation

Create a conda environment and install dependencies:

conda create -n h2b python=3.9
conda activate h2b

pip install -r requirements.txt

# Install compatible versions of torch and torchvision
conda install pytorch torchvision cudatoolkit
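
To verify the environment after installation, a quick check like the following can be run (this snippet is only a convenience sketch, not part of the repository):

# Optional sanity check: confirm torch/torchvision import and that a CUDA device is visible.
import torch
import torchvision

print("torch:", torch.__version__)
print("torchvision:", torchvision.__version__)
print("CUDA available:", torch.cuda.is_available())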

Dataset

Follow DATASET.md to set up ImageNet and the other datasets, following the data preparation of CoOp.

Get Started

Configs

The running configurations can be modified in configs/[setting]/[dataset].yaml, including the evaluation setting, shot numbers, visual encoders, and hyperparameters.
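
For example, a config can be loaded and inspected with PyYAML before running. The path and keys below are only illustrative; check the actual YAML files under configs/ for the fields this repo uses:

# Minimal sketch (assumed path and keys, not the official API): inspect a config file.
import yaml

with open("configs/few_shots/imagenet.yaml") as f:  # example path
    cfg = yaml.safe_load(f)

print(cfg)  # dict with the evaluation setting, shot number, visual encoder, and hyperparameters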

Numerical Results

We provide the numerical results of the few-shot classification experiments (Figure 1 in the paper) in exp.log.

Running

For few-shot classification:

CUDA_VISIBLE_DEVICES=0 python main_few_shots.py --config configs/few_shots/dataset.yaml
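
To run the few-shot script over several datasets in one go, a small driver such as the one below can help; the dataset names are hypothetical and must match the YAML files actually present under configs/few_shots/:

# Hypothetical convenience driver, not part of the repo: loop the few-shot
# script over a list of dataset configs.
import subprocess

datasets = ["imagenet", "caltech101", "oxford_pets"]  # illustrative names
for name in datasets:
    subprocess.run(
        ["python", "main_few_shots.py", "--config", f"configs/few_shots/{name}.yaml"],
        check=True,
    )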

For base-to-new generalization:

CUDA_VISIBLE_DEVICES=0 python main_base2new.py --config configs/base2new/dataset.yaml

For out-of-distribution generalization:

CUDA_VISIBLE_DEVICES=0 python main_robustness.py --config configs/robustness/imagenet_rn50.yaml

Acknowledgement

This repo benefits from CLIP, CoOp, and SHIP. Thanks for their wonderful work.

Citation

@inproceedings{wang2024baseline,
  title={A Hard-to-Beat Baseline for Training-free CLIP-based Adaptation},
  author={Wang, Zhengbo and Liang, Jian and Sheng, Lijun and He, Ran and Wang, Zilei and Tan, Tieniu},
  booktitle={The Twelfth International Conference on Learning Representations (ICLR)},
  year={2024}
}

Contact

If you have any questions, feel free to contact zhengbowang@mail.ustc.edu.cn.
