
# CropMAE

PyTorch implementation of CropMAE [arXiv]. Our code is based on the official PyTorch implementation of MAE.

*Figure: CropMAE illustration*

## Checkpoints

| Dataset  | $J\&F_m$ | mIoU | PCK@0.1 | Download |
|----------|----------|------|---------|----------|
| ImageNet | 60.4     | 33.3 | 43.6    | link     |
| K400     | 58.6     | 33.7 | 42.9    | link     |
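As a quick sanity check after downloading, a checkpoint can be loaded into an encoder along the following lines. This is a minimal sketch, not this repository's loading code: it assumes the weights live under a `"model"` key (as in the MAE codebase this repository builds on) and a ViT-S/16 encoder via `timm`; the filename is hypothetical, so adapt everything to the actual code.

```python
# Hedged sketch: load a downloaded CropMAE checkpoint into a ViT encoder.
# Assumptions (verify against this repository): the weights live under a
# "model" key, as in the MAE codebase, and the encoder matches timm's
# ViT-S/16. The filename below is hypothetical.
import timm
import torch

ckpt = torch.load("cropmae_imagenet.pth", map_location="cpu")
state_dict = ckpt.get("model", ckpt)

encoder = timm.create_model("vit_small_patch16_224", pretrained=False, num_classes=0)
missing, unexpected = encoder.load_state_dict(state_dict, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)
```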

## Training

### Environment

Create a virtual environment (e.g., using conda or venv) with Python 3.11 and install the dependencies:

```bash
conda create --name CropMAE python=3.11
conda activate CropMAE
python -m pip install -r requirements.txt
```
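Alternatively, if you prefer venv over conda, an equivalent setup (assuming a system-wide Python 3.11 is available; the environment name is arbitrary) would be:

```bash
python3.11 -m venv cropmae-env
source cropmae-env/bin/activate
python -m pip install -r requirements.txt
```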

### Starting the training

This section assumes that you want to run CropMAE with the default parameters. Run `python3 train_cropmae_in.py -h` for a complete list of the parameters you can change.

#### Single GPU

To start training on a single GPU, you only need to provide the path to your dataset (typically ImageNet):

```bash
python train_cropmae_in.py --data_path=path/to/imagenet/folder
```
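Since the code builds on the official MAE implementation, the loader presumably expects an ImageNet root with a class-per-folder `train/` split, as read by torchvision's `ImageFolder`. The layout below is an assumption to verify against this repository's data loader:

```
path/to/imagenet/folder/
└── train/
    ├── n01440764/
    │   ├── n01440764_10026.JPEG
    │   └── ...
    └── n01443537/
        └── ...
```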

#### Multi-GPU

We provide a script to launch training on a GPU cluster using SLURM. Edit `scripts/train_cropmae_in.sh` with the parameters you want to use, then submit it with:

```bash
cd scripts && sbatch train_cropmae_in.sh
```
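For orientation, a submission script of this kind typically looks like the sketch below. The resource requests and the `torchrun` launch line are illustrative assumptions, not the contents of the provided script; treat `scripts/train_cropmae_in.sh` as the reference.

```bash
#!/bin/bash
#SBATCH --job-name=cropmae
#SBATCH --nodes=1
#SBATCH --gres=gpu:4            # illustrative: 4 GPUs on a single node
#SBATCH --cpus-per-task=16
#SBATCH --time=48:00:00

# Launch one training process per GPU (flags are illustrative).
torchrun --nproc_per_node=4 train_cropmae_in.py \
    --data_path=path/to/imagenet/folder
```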

## Evaluation

### Prerequisites

Download the DAVIS, JHMDB, and VIP datasets.

### Perform evaluation

Adapt `downstreams/propagation/start.py` to include the paths to the datasets you have previously downloaded. You may also adjust other evaluation parameters, though the defaults match the settings we used. After making these adjustments, start the evaluation with:

```bash
python3 -m downstreams.propagation.start {name} {epoch} {checkpoint}
```
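For example, with an illustrative run name, epoch, and checkpoint path (all three values here are hypothetical):

```bash
python3 -m downstreams.propagation.start cropmae_in 400 checkpoints/checkpoint-400.pth
```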

This will create the folder `downstreams/propagation/{name}_{epoch}` and evaluate the checkpoint `{checkpoint}` on the three downstream tasks. The results are saved in this folder, printed to standard output, and reported to Weights & Biases if enabled.
