
# PSelf-Supervised

Probabilistic Self-Supervised Learning using Cyclical Stochastic Gradient MCMC

This repository provides a PyTorch implementation of the paper "A Probabilistic Approach to Self-Supervised Learning using Cyclical Stochastic Gradient MCMC".


Accepted at: Frontiers in Probabilistic Inference: Sampling Meets Learning Workshop, ICLR 2025

Paper: Camera-ready version (ICLR OpenReview)


## Installation

Clone the repository and install it with `pip install .` from the repository root.

## Usage

### Pretraining

To obtain a distribution over representations in the pretraining phase, use `bayesianbyol.py` in the `core` module (based on Bayesian BYOL).

Alternatively, you can draw samples from the posterior with Bayesian SimCLR via `bayesiansimclr.py`.

To train the model, simply run:

```bash
python bayesianbyol.py
```

or

```bash
python bayesiansimclr.py
```
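
For intuition, below is a minimal sketch of the cyclical step-size schedule and SGLD-style update that cyclical SG-MCMC builds on; the function names and hyperparameters are purely illustrative and are not the actual interface of `bayesianbyol.py` or `bayesiansimclr.py`.

```python
import math
import torch

def cyclical_stepsize(step, total_steps, num_cycles, lr_max):
    """Cosine step-size schedule for cyclical SG-MCMC: each cycle starts
    large (exploration) and decays towards zero (sampling)."""
    steps_per_cycle = math.ceil(total_steps / num_cycles)
    pos = (step % steps_per_cycle) / steps_per_cycle  # position in [0, 1) within the cycle
    return 0.5 * lr_max * (math.cos(math.pi * pos) + 1.0)

def sgld_update(params, lr, temperature=1.0):
    """One SGLD step: a gradient step plus Gaussian noise scaled by the step size."""
    with torch.no_grad():
        for p in params:
            if p.grad is None:
                continue
            noise = torch.randn_like(p) * math.sqrt(2.0 * lr * temperature)
            p.add_(-lr * p.grad + noise)
```

Posterior samples of the encoder are typically collected near the end of each cycle, when the step size is small; these checkpoints then serve as samples from the distribution over representations.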

### Dataset Splits

For downstream tasks, create the different data splits using `split_datasets.py` in the `core` module:

```bash
python split_datasets.py
```
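
The actual splitting logic lives in `split_datasets.py`; the snippet below is only an illustrative sketch of one common way to build stratified label-fraction splits for downstream evaluation, with the fractions and function name chosen as examples.

```python
import numpy as np
from sklearn.model_selection import train_test_split

def make_label_fraction_splits(labels, fractions=(0.01, 0.1, 1.0), seed=42):
    """Return stratified index subsets containing the given fractions of the
    labelled training data, preserving the class balance in every split."""
    labels = np.asarray(labels)
    all_idx = np.arange(len(labels))
    splits = {}
    for frac in fractions:
        if frac >= 1.0:
            splits[frac] = all_idx  # the full training set
        else:
            subset_idx, _ = train_test_split(
                all_idx, train_size=frac, stratify=labels, random_state=seed
            )
            splits[frac] = subset_idx
    return splits
```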

### Evaluation

To evaluate the probabilistic representations on an image classification task, use `finetune.py` in the `core` module. The script fine-tunes the pretrained models using samples from the posterior across the various data splits. Performance is reported on the test set (or validation set, e.g., for ImageNet-10) by marginalizing over the learned representations. To run:

```bash
python finetune.py
```
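
Marginalizing over the learned representations amounts to Bayesian model averaging: class probabilities from the models fine-tuned on different posterior samples are averaged before taking the arg max. The sketch below illustrates only that averaging step; the model list, data loader, and device handling are placeholders rather than the actual interface of `finetune.py`.

```python
import torch

@torch.no_grad()
def marginalized_accuracy(models, loader, device="cuda"):
    """Average class probabilities over an ensemble of fine-tuned posterior
    samples, then score the averaged prediction against the targets."""
    correct, total = 0, 0
    for images, targets in loader:
        images, targets = images.to(device), targets.to(device)
        probs = None
        for model in models:
            model.eval()
            p = torch.softmax(model(images), dim=1)
            probs = p if probs is None else probs + p
        probs = probs / len(models)
        correct += (probs.argmax(dim=1) == targets).sum().item()
        total += targets.size(0)
    return correct / total
```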


## Dependencies

- PyTorch
- NumPy
- Matplotlib
- scikit-learn
- Seaborn

## License

This project is licensed under the MIT License.
