A reproduction of the paper by Tomczak and Welling that introduced the Variational Mixture of Posteriors prior (VampPrior): a new, better-performing prior for Variational Autoencoders.

DD2434_Large_VAE

Overview

This is a re-implementation of the VAE with VampPrior proposed by Tomczak & Welling [1].
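The VampPrior replaces the usual standard-normal prior p(z) with a uniform mixture of the variational posteriors q(z | u_k) evaluated at K learned pseudo-inputs u_k. A minimal NumPy sketch of the prior's log-density is below; the function name and the diagonal-Gaussian parameterization of the encoder outputs are illustrative assumptions, not code from this repository:

```python
import numpy as np

def vampprior_log_prob(z, means, logvars):
    """Log-density of z under a VampPrior-style mixture.

    `means` and `logvars` (shape (K, D)) stand in for the encoder's
    diagonal-Gaussian outputs at the K pseudo-inputs; `z` has shape
    (N, D). Returns an array of shape (N,).
    """
    z = z[:, None, :]                                   # (N, 1, D)
    # Per-dimension log-density of z under each of the K components.
    log_comp = -0.5 * (np.log(2 * np.pi) + logvars
                       + (z - means) ** 2 / np.exp(logvars))
    log_comp = log_comp.sum(axis=-1)                    # (N, K)
    # log p(z) = logsumexp_k log q(z | u_k) - log K  (uniform weights),
    # computed stably by subtracting the per-row maximum.
    m = log_comp.max(axis=1, keepdims=True)
    log_mix = m[:, 0] + np.log(np.exp(log_comp - m).sum(axis=1))
    return log_mix - np.log(means.shape[0])
```

With K = 1 and zero mean/log-variance this reduces to the standard-normal log-density, which is a quick sanity check on the mixture arithmetic.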

Installation

Install the necessary packages by running:

pip install -r requirements.txt

Testing

To test the VAE model, navigate to the repository root in the terminal and run:

python -m large_vae.main

References

Tomczak, J., & Welling, M. (2018, March). VAE with a VampPrior. In International Conference on Artificial Intelligence and Statistics (pp. 1214-1223). PMLR.
