latent-factorized-flow

NeurIPS 2023: "Flow Factorized Representation Learning"

Yue Song (1,2), T. Anderson Keller (1), Nicu Sebe (2), Max Welling (1)
(1) University of Amsterdam, the Netherlands
(2) University of Trento, Italy

Example latent transformations on MNIST, Shapes3D, Falcor3D, and Isaac3D.
Overview


Illustration of our flow factorized representation learning: at each point in the latent space we have a distinct set of tangent directions ∇u^k, which define the different transformations we would like to model in image space. For each path, the latent sample evolves to the target on the potential landscape following dynamic optimal transport.


Depiction of our model in plate notation. (Left) Supervised; (right) weakly supervised. White nodes denote latent variables, shaded nodes denote observed variables, solid lines denote the generative model, and dashed lines denote the approximate posterior. As in a standard VAE framework, our model approximates the initial one-step posterior p(z0|x0), but it additionally approximates the conditional transition distribution p(zt|zt−1, k) through dynamic optimal transport over a potential landscape.
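Put differently (using the same notation as the caption, with q denoting the approximate posterior; this is only a restatement of the plate diagram, not an additional assumption), the approximate posterior over the whole latent sequence factorizes into the one-step posterior and a chain of per-step transitions:

q(z0, …, zT | x0, k) = q(z0 | x0) ∏_{t=1}^{T} q(zt | zt−1, k)

where q(z0|x0) approximates p(z0|x0) and each q(zt|zt−1, k) approximates p(zt|zt−1, k).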

Setup

First, clone the repository and navigate into it:

git clone https://github.com/KingJamesSong/latent-flow.git
cd latent-flow

We recommend setting up a new conda environment for this project. You can do this using the following command:

conda create --name latent-flow-env python=3.11
conda activate latent-flow-env

Next, install the necessary dependencies. This project requires PyTorch. You can find the installation instructions on the PyTorch setup page.
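For example, a typical CPU-only install looks like the line below; take the exact command (and the matching CUDA variant, if you have a GPU) from the PyTorch setup page, since the correct wheel depends on your platform:

pip install torch torchvision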

After installing PyTorch, install the remaining dependencies from the requirements.txt file:

pip install -r requirements.txt

For development purposes, you may also want to install the dependencies listed in requirements_dev.txt:

pip install -r requirements_dev.txt

It is recommended to set your IDE's autoformatter to black and to enable "format on save".
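If you prefer to run the formatter manually instead, black can be invoked from the repository root (assuming it was installed via requirements_dev.txt):

black .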

Finally, install the package itself. If you plan on modifying the code, install it in editable mode using the -e option:

pip install -e .

This will allow your changes to be immediately reflected in the installed package.

The code assumes that all datasets are placed in the ./data folder. This folder will be created automatically if necessary. However, if you would like to keep your datasets in a different folder, you can create a symbolic link to that folder using the following commands:

For Unix-based systems (Linux, MacOS), use the ln command:

ln -s /path/to/your/dataset/folder ./data

This command creates a symbolic link named ./data that points to /path/to/your/dataset/folder.

For Windows systems, use the mklink command:

mklink /D .\data C:\path\to\your\dataset\folder

This command creates a symbolic link named .\data that points to C:\path\to\your\dataset\folder.

Please replace /path/to/your/dataset/folder and C:\path\to\your\dataset\folder with the actual path to your dataset folder.

Usage

Please check the scripts folder for the training and evaluation scripts.
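For example, a training run is usually launched by executing one of the shell scripts directly; the script name below is purely illustrative, so check the scripts folder for the actual filenames and their arguments:

bash scripts/train_mnist.sh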

Citation

If you find the code helpful for your research, please consider citing our paper:

@inproceedings{song2023flow,
  title={Flow Factorized Representation Learning},
  author={Song, Yue and Keller, Andy and Sebe, Nicu and Welling, Max},
  booktitle={NeurIPS},
  year={2023}
}

If you have any questions or suggestions, please feel free to contact me via yue.song@unitn.it.
