
SuperCLIP: CLIP with Simple Classification Supervision

Weiheng Zhao<sup>1</sup> · Zilong Huang<sup>2</sup> · Xinggang Wang<sup>1</sup> · Jiashi Feng<sup>2</sup>

<sup>1</sup>HUST Vision Lab · <sup>2</sup>ByteDance

Paper PDF

Why you should try SuperCLIP: it delivers significant gains with only a 0.077% increase in FLOPs and no extra annotated data. It substantially alleviates the performance drop of CLIP-style models under small-batch training, is fully compatible with modern CLIP variants (e.g., SigLIP, FLIP), and also brings clear improvements when integrated into multi-modal LLM frameworks such as LLaVA.

teaser

Table of Contents

  • News
  • Getting Started
  • Acknowledgments

News

  • 2025-09-19: Accepted by NeurIPS 2025. [✔]
  • 2025-11-06: Code release. [✔]

Getting Started

Installation

# Clone the repository
git clone https://github.com/hustvl/SuperCLIP.git
cd SuperCLIP

# Install dependencies
pip install -r requirements.txt

Datasets

Training uses Datacomp-1B; evaluation uses the ImageNet-1K validation set.

Configuration

Update the paths in the training script to point to your local datasets:

  • Set DATA_PATH to the Datacomp-1B root.
  • Set VAL_DATA_PATH to the ImageNet-1K validation set.

File to edit: train.sh
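
For example, the two path variables in train.sh might be set as follows (the paths below are hypothetical placeholders; point them at your own local copies):

```shell
# Hypothetical local paths -- replace with your own dataset locations.
DATA_PATH=/data/datacomp-1b          # Datacomp-1B root (training)
VAL_DATA_PATH=/data/imagenet-1k/val  # ImageNet-1K validation set (evaluation)
```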

Training

Start SuperCLIP training with:

bash train.sh <config_path> superclip

Acknowledgments

Our codebase is built upon OpenCLIP and SuperClass.

We thank the OpenCLIP and SuperClass teams for contributing such impressive code and models to the community.
