ssnio/SteinVarGradDescPML

Project in ML and AI (Prof. Opper) - SVGD

Term Project on Stein Variational Gradient Descent - Description

Term project conducted in the winter semester 2020/21 in the course "Projects in Machine Learning and Artificial Intelligence" at Technical University Berlin, Institute of Software Engineering and Theoretical Computer Science, Research Group Methods in Artificial Intelligence (led by Prof. Dr. Manfred Opper).

The method we analysed was Stein Variational Gradient Descent (SVGD), a non-parametric variational inference method that iteratively applies smooth transformations to a set of initial particles. These transformations compose into a transport map which moves the initial particles such that they closely approximate (in terms of KL divergence) an otherwise intractable target distribution.
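Concretely, each iteration of SVGD (Liu & Wang, 2016) perturbs every particle along the empirical steepest-descent direction

$$
\phi^*(x) \;=\; \frac{1}{n}\sum_{j=1}^{n}\Big[\, k(x_j, x)\,\nabla_{x_j}\log p(x_j) \;+\; \nabla_{x_j} k(x_j, x) \,\Big],
$$

where the first (driving) term pulls particles toward high-density regions of the target $p$, and the second (repulsive) term pushes particles apart so they do not collapse onto a single mode.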

We reimplemented the codebase, reran the experiments the authors documented in their original repository, and performed additional experiments that showcase characteristics and limitations of SVGD.


Demo

Pseudo-Julia implementation

```julia
using Distributions
using ForwardDiff
using KernelFunctions

function SVGD(x, scorefun, kernel, grad_kernel, n_epochs, ϵ)
    n_particles = size(x, 1)  # number of particles
    for i in 1:n_epochs
        scorefun_X = scorefun(x)      # ∇ log p(x), evaluated at every particle
        kxy = kernel(x)               # kernel matrix k(⋅, x)
        dxkxy = grad_kernel(x)        # kernel gradients ∇k(⋅, x)
        ϕ = ((kxy * scorefun_X) .+ dxkxy) ./ n_particles  # steepest-descent direction
        x += ϵ .* ϕ                   # update the particles
    end
    return x
end
```
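For reference, here is a runnable NumPy sketch of the same update with an RBF kernel k(y, x) = exp(-‖y − x‖² / h). The names and the fixed bandwidth h are our choices for illustration (reference SVGD implementations typically set h via a median heuristic); the target is a standard 1-D Gaussian, whose score is ∇ log p(x) = −x.

```python
import numpy as np

def rbf_kernel(x, h):
    """Kernel matrix k and, per particle i, the sum over j of ∇_{x_j} k(x_j, x_i)."""
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)  # (n, n)
    k = np.exp(-sq_dists / h)
    # ∇_{x_j} k(x_j, x_i) = (2/h)(x_i - x_j) k_ij, summed over j:
    dxk = (2.0 / h) * (x * k.sum(axis=1, keepdims=True) - k @ x)
    return k, dxk

def svgd(x, score, n_epochs=500, eps=0.1, h=1.0):
    n = x.shape[0]
    for _ in range(n_epochs):
        k, dxk = rbf_kernel(x, h)
        phi = (k @ score(x) + dxk) / n  # steepest-descent direction
        x = x + eps * phi               # update the particles
    return x

# Target: standard 1-D Gaussian, score(x) = -x.
rng = np.random.default_rng(0)
x0 = rng.normal(loc=5.0, scale=0.5, size=(50, 1))  # particles start far from target
xT = svgd(x0, lambda x: -x)
print(xT.mean(), xT.std())  # particles should roughly match mean 0, std 1
```

Note the two terms in `phi`: `k @ score(x)` drags the particles toward the target's high-density region, while `dxk` spreads them out so they cover the target's variance rather than collapsing onto its mode.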

SVGD for Neal's Funnel

More Particles


Installation

We used two languages (Python 3.6; Julia 1.5) and the following libraries throughout our experiments. Please install them to rerun our experiments:

Python:

  • PyTorch
  • Numpy
  • Scikit-Learn
  • Matplotlib
  • Scipy

Julia:

  • Statistics
  • Distances
  • Random
  • ForwardDiff
  • LinearAlgebra
  • Distributions
  • Plots
  • KernelFunctions
  • EvalMetrics
  • MLDataUtils

Code structure

  • presentation_slides (mid-term and final presentation pdfs)
  • src (code of all experiments)
    • data (experiment datasets from original repo)
    • python
      • bayesian_neural_network (PyTorch implementation of SVGD, applied to the Boston housing dataset)
      • multi_variate_normal (Jupyter notebook that applies the NumPy implementation to the 2D Gaussian example)
      • SVGD.py (NumPy implementation of SVGD)
    • julia
      • bayesian_regression (Bayesian regression experiments, applying the Julia implementation to the covertype dataset)
      • gaussian_mixture_anealing (Gaussian mixture model experiments, applying the Julia implementation to mixtures of 2D Gaussians, partially using annealed SVGD)
      • multi_variate_normal (Jupyter notebook that applies the Julia implementation to the 2D Gaussian example)
      • SVGD.jl (Julia implementation of SVGD)
  • statics (training artifacts and generated graphics)

Team Members


References

Main references

Further Learning Resources
