This repository provides the PyTorch implementation for the paper "A Probabilistic Approach to Self-Supervised Learning using Cyclical Stochastic Gradient MCMC".
Accepted at:
Frontiers in Probabilistic Inference: Sampling Meets Learning Workshop, ICLR 2025
Paper: Camera-ready (ICLR OpenReview)
Clone the repo and install using:
pip install .
To obtain a distribution over representations in the pretraining phase, use bayesianbyol.py in the core module (based on Bayesian BYOL).
Alternatively, you can draw samples from the posterior using Bayesian SimCLR (bayesiansimclr.py).
To train the model, simply run:
python bayesianbyol.py
or
python bayesiansimclr.py
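Both scripts draw posterior samples with cyclical stochastic gradient MCMC. As a rough illustration of that sampler (not the repository's actual code; the function names and signatures here are hypothetical), a cyclical cosine step-size schedule combined with an SGLD update might look like:

```python
import math
import torch

def cyclical_lr(step, total_steps, num_cycles, lr_max):
    """Cosine cyclical step size: restarts at lr_max at the start of each
    cycle and decays toward zero, as in cyclical SG-MCMC."""
    steps_per_cycle = math.ceil(total_steps / num_cycles)
    pos = (step % steps_per_cycle) / steps_per_cycle  # position in cycle, in [0, 1)
    return lr_max / 2 * (math.cos(math.pi * pos) + 1)

def sgld_step(params, loss, lr):
    """One SGLD update: a gradient step plus Gaussian noise with
    standard deviation sqrt(2 * lr), so iterates explore the posterior."""
    grads = torch.autograd.grad(loss, params)
    with torch.no_grad():
        for p, g in zip(params, grads):
            p.add_(-lr * g + math.sqrt(2 * lr) * torch.randn_like(p))
```

In the exploration phase of each cycle (large step size) only the gradient step is typically applied; near the end of a cycle (small step size) the noise term is added and a posterior sample is collected.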
For downstream tasks, create different data splits using split_datasets.py in the core module:
python split_datasets.py
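As a minimal sketch of what such split creation can look like (this is an illustration with a hypothetical helper, not the repository's split_datasets.py), one can build stratified subsets at several label fractions with scikit-learn:

```python
import numpy as np
from sklearn.model_selection import train_test_split

def make_label_splits(labels, fractions, seed=0):
    """Return, for each requested fraction, a stratified subset of indices
    so that class proportions match the full dataset."""
    idx = np.arange(len(labels))
    splits = {}
    for frac in fractions:
        subset, _ = train_test_split(
            idx, train_size=frac, stratify=labels, random_state=seed
        )
        splits[frac] = subset
    return splits
```

Each index subset can then be used to fine-tune on a different fraction of labeled data.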
To evaluate the probabilistic representations on an image classification task, use finetune.py in the core module.
The script fine-tunes the pretrained models using samples from the posterior across the various data splits.
Performance is reported on the test set (or validation set, e.g., for ImageNet-10) by marginalizing over the learned representations.
To run:
python finetune.py
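Marginalizing over the learned representations amounts to Bayesian model averaging: the predictive distributions of the models fine-tuned from each posterior sample are averaged at test time. A minimal sketch (function name is illustrative, not the repo's API):

```python
import torch

@torch.no_grad()
def bma_predict(models, x):
    """Average the predictive distributions of fine-tuned models, one per
    posterior sample, to approximate the marginal predictive distribution."""
    probs = torch.stack([torch.softmax(m(x), dim=-1) for m in models])
    return probs.mean(dim=0)
```

The averaged probabilities are then used for the reported test-set accuracy (and typically give better-calibrated predictions than any single sample).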
- PyTorch
- NumPy
- Matplotlib
- scikit-learn
- Seaborn
This project is licensed under the MIT License.
