ODE2VAE: Deep generative second order ODEs with Bayesian neural networks

TensorFlow and PyTorch implementations of Deep generative second order ODEs with Bayesian neural networks (NeurIPS 2019) by
Çağatay Yıldız, Markus Heinonen and Harri Lähdesmäki.

model architecture

We tackle the problem of learning low-dimensional latent representations of possibly high-dimensional sequential data trajectories. Our model extends Variational Auto-Encoders (VAEs) for sequential data with a latent space governed by a continuous-time probabilistic ordinary differential equation (ODE). We propose

  1. a powerful second-order ODE that models the latent dynamics with an ODE state decomposed into position and momentum, and
  2. a deep Bayesian neural network to infer the latent dynamics.
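A second-order ODE can always be rewritten as a coupled first-order system over a position s and a velocity (momentum) v: ds/dt = v, dv/dt = f(s, v). As a rough illustration of this decomposition (plain NumPy with explicit Euler steps; the acceleration field `f_acc` below is a toy damped oscillator standing in for the model's Bayesian neural network, not the paper's actual dynamics):

```python
import numpy as np

def f_acc(s, v):
    # Toy acceleration field standing in for the learned BNN f(s, v):
    # a damped harmonic oscillator in latent space.
    return -s - 0.1 * v

def integrate_second_order(s0, v0, dt=0.01, steps=1000):
    """Integrate ds/dt = v, dv/dt = f(s, v) with explicit Euler."""
    s, v = s0.copy(), v0.copy()
    traj = [s.copy()]
    for _ in range(steps):
        # Update position and velocity jointly from the previous state.
        s, v = s + dt * v, v + dt * f_acc(s, v)
        traj.append(s.copy())
    return np.stack(traj)

traj = integrate_second_order(np.array([1.0, 0.0]), np.array([0.0, 1.0]))
print(traj.shape)  # (1001, 2)
```

In the actual model, f is a Bayesian neural network with weights drawn from an approximate posterior, and the released code integrates the system with a proper ODE solver rather than fixed Euler steps.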


Here is our video summarizing the paper:

ODE2VAE video

Minimal PyTorch Implementation

In addition to the TensorFlow implementation described below, we provide a minimal, easy-to-follow PyTorch implementation for clarity. See the script for more details. The dataset needed to run the script is here. Make sure to update the path or put both files in the same folder.

Replicating the Experiments

The code was developed and tested with Python 3.7 and TensorFlow 1.13. The hickle library is also needed to load the datasets.

Training and test scripts are placed in the scripts directory. To reproduce an experiment, run the following command from the project root folder:
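The exact command is the one given by the bash scripts shipped with the repository; as a purely illustrative pattern (the `<experiment>` placeholder is hypothetical and must be replaced by an actual script name from the scripts directory):

```
# Hypothetical invocation pattern; substitute a real script from scripts/.
sh scripts/<experiment>.sh
```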


Once the optimization has completed, you can evaluate the performance on the test set by running
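Again as an illustrative pattern only (the real test script names are in the scripts directory; the name below is hypothetical):

```
# Hypothetical; substitute the matching test script from scripts/.
sh scripts/<experiment>_test.sh
```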


All Datasets

The datasets can be downloaded from here (1.9 GB). The folders contain

  1. preprocessed walking sequences from the CMU mocap library,
  2. the rotating MNIST dataset generated using this implementation, and
  3. the bouncing balls dataset generated with the code provided with the original paper.

Do not forget to update the dataset paths in the bash scripts to the local path of the downloaded folder.
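One way to update the paths is a quick sed edit. This is an illustrative sketch: the variable name `DATA_PATH` and the script name are hypothetical, so check the actual bash scripts for the real variable or argument. The example works on a throwaway file so nothing in the repo is touched:

```shell
# Create a stand-in for a training script (hypothetical contents).
cat > /tmp/example_train.sh <<'EOF'
DATA_PATH=/original/placeholder/path
EOF
# Point DATA_PATH at the downloaded dataset folder (path is illustrative).
sed -i 's|^DATA_PATH=.*|DATA_PATH=/home/me/ode2vae_data|' /tmp/example_train.sh
cat /tmp/example_train.sh  # prints: DATA_PATH=/home/me/ode2vae_data
```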

Figures from Trained Models

This folder (20 MB) contains TensorFlow graphs of already optimized models. After downloading, run


to reproduce the results. Similarly, the path argument in the test bash files needs to be overridden with the downloaded checkpoint folder path.

Example Walking Sequences

test + synthesized sequences

Rotating Threes

Long Term Bouncing Balls Predictions

bouncing ball data + reconstructions
