
MIM-Refiner

[Project Page] [Paper] [Models] [Codebase Demo Video] [BibTeX]

PyTorch implementation and pre-trained models of MIM-Refiner.

Figure: schematic overview of MIM-Refiner.

Pre-trained Models

Pre-trained models can be found here.

They can also be loaded via torch.hub:

import torch

# refined MAE models
model = torch.hub.load("ml-jku/MIM-Refiner", "mae_refined_l16")  # ViT-L/16
model = torch.hub.load("ml-jku/MIM-Refiner", "mae_refined_h14")  # ViT-H/14
model = torch.hub.load("ml-jku/MIM-Refiner", "mae_refined_twob14")  # ViT-2B/14
# refined data2vec 2.0 models
model = torch.hub.load("ml-jku/MIM-Refiner", "d2v2_refined_l16")  # ViT-L/16
model = torch.hub.load("ml-jku/MIM-Refiner", "d2v2_refined_h14")  # ViT-H/14
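
For reference, a minimal usage sketch is shown below. The 224x224 input resolution and the shape of the returned features are assumptions for illustration; check the model pages for the exact preprocessing.

import torch

# load a refined MAE ViT-L/16 and run a dummy forward pass
# (input size and feature-vector output are assumptions in this sketch)
model = torch.hub.load("ml-jku/MIM-Refiner", "mae_refined_l16")
model.eval()
with torch.no_grad():
    features = model(torch.randn(1, 3, 224, 224))
print(features.shape)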

An example of how to use the torch.hub models for a k-NN classifier can be found here.

python eval_knn_torchhub.py --model mae_refined_l16 --data_train /imagenet/train/ --data_test /imagenet/val

Note that the results of this script can differ slightly from the paper results, as the paper results remove the last LayerNorm of pre-norm ViTs and use bfloat16 precision.
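
To make the evaluation concrete, below is a minimal majority-vote k-NN sketch over precomputed features. It is an illustrative assumption, not the exact scheme of eval_knn_torchhub.py (which may, for example, weight neighbors by similarity).

import torch
import torch.nn.functional as F

def knn_classify(train_feats, train_labels, test_feats, k=10):
    # cosine similarity between L2-normalized feature vectors
    train_feats = F.normalize(train_feats, dim=1)
    test_feats = F.normalize(test_feats, dim=1)
    sim = test_feats @ train_feats.T  # (num_test, num_train)
    _, idx = sim.topk(k, dim=1)  # indices of the k nearest training samples
    neighbor_labels = train_labels[idx]  # (num_test, k)
    return neighbor_labels.mode(dim=1).values  # majority vote per test sample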

Train your own models

Instructions to set up the codebase in your own environment are provided in SETUP_CODE, SETUP_DATA and SETUP_MODELS.

A video motivating the design choices of the codebase and giving an overview of it can be found here.

Configurations to train, evaluate or analyze models can be found here.

Citation

If you like our work, please consider giving it a star ⭐ and citing us:

@article{alkin2024mimrefiner,
    title={MIM-Refiner: A Contrastive Learning Boost from Intermediate Pre-Trained Representations},
    author={Benedikt Alkin and Lukas Miklautz and Sepp Hochreiter and Johannes Brandstetter},
    journal={arXiv preprint arXiv:2402.10093},
    year={2024}
}
