TensorFlow and PyTorch implementation of Deep generative second order ODEs with Bayesian neural networks by
Çağatay Yıldız, Markus Heinonen and Harri Lähdesmäki.
We tackle the problem of learning low-rank latent representations of possibly high-dimensional sequential data trajectories. Our model extends Variational Auto-Encoders (VAEs) for sequential data with a latent space governed by a continuous-time probabilistic ordinary differential equation (ODE). We propose
- a powerful second order ODE that models the latent dynamics with an ODE state decomposed into position and momentum
- a deep Bayesian neural network to infer the latent dynamics.
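To give an intuition for the second order dynamics above, here is a minimal NumPy sketch (not the repository's TensorFlow or PyTorch code): the ODE state is split into position s and momentum v, with ds/dt = v and dv/dt = f(s, v), where f is a hypothetical stand-in MLP for the deep Bayesian acceleration network (whose weights, in the paper, would be sampled from a learned posterior).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer MLP standing in for the Bayesian acceleration
# network f(s, v); latent dim 4, so the concatenated ODE state has dim 8.
W1 = rng.normal(scale=0.1, size=(16, 8))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(4, 16))
b2 = np.zeros(4)

def acceleration(s, v):
    """dv/dt = f(s, v): a small MLP on the concatenated ODE state."""
    h = np.tanh(W1 @ np.concatenate([s, v]) + b1)
    return W2 @ h + b2

def integrate(s0, v0, dt=0.1, steps=50):
    """Euler integration of the second order ODE:
       ds/dt = v,   dv/dt = f(s, v)."""
    s, v = s0.copy(), v0.copy()
    traj = [s.copy()]
    for _ in range(steps):
        a = acceleration(s, v)
        s = s + dt * v   # position moves with momentum
        v = v + dt * a   # momentum moves with acceleration
        traj.append(s.copy())
    return np.stack(traj)

traj = integrate(np.zeros(4), np.ones(4))
print(traj.shape)  # (51, 4): a latent position trajectory
```

The actual model uses an adaptive ODE solver rather than fixed-step Euler; this sketch only illustrates the position/momentum decomposition.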
Here is our video summarizing the paper:
Minimal PyTorch Implementation
In addition to the TensorFlow implementation described below, we provide a minimal, easy-to-follow PyTorch implementation for clarity. Check ode2vae_mnist_minimal.py for more details. The dataset needed to run the script is here. Make sure to update the path or put both files into the same folder.
Replicating the Experiments
The code was developed and tested on
The hickle library is also needed to load the datasets.
Training and test scripts are placed in the
scripts directory. To reproduce an experiment, run the following command from the project root folder:
Once the optimization is completed, you can evaluate the performance on the test set by running
The datasets can be downloaded from here (1.9 GB). The folders contain
- preprocessed walking sequences from CMU mocap library
- rotating mnist dataset generated using this implementation
- bouncing ball dataset generated using the code provided with the original paper.
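For intuition only, a rotating-digit sequence in the style of the rotating MNIST dataset can be generated roughly as below. This is a sketch, not the referenced implementation; `make_rotation_sequence` and the toy frame are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import rotate

def make_rotation_sequence(frame, n_angles=16):
    """Rotate a single 2-D image through n_angles evenly spaced angles,
       yielding an (n_angles, H, W) sequence."""
    angles = np.linspace(0, 360, n_angles, endpoint=False)
    return np.stack([rotate(frame, a, reshape=False, order=1)
                     for a in angles])

frame = np.zeros((28, 28))
frame[10:18, 12:16] = 1.0           # toy stand-in for an MNIST digit
seq = make_rotation_sequence(frame)
print(seq.shape)  # (16, 28, 28)
```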
Do not forget to update the dataset paths in the bash scripts with the local path to the downloaded folder.
Figures from Trained Models
This folder (20 MB) contains TensorFlow graphs of already optimized models. After downloading, run
to reproduce the results. Similarly, the path argument in the test bash files needs to be overridden with the path to the downloaded checkpoint folder.