deep-Bayesian-nonparametrics-papers

A collection of papers on combining deep learning with Bayesian nonparametric approaches

We coined the concise name "deep Bayesian nonparametrics" (DBNP) for the line of work bringing the fields of deep learning and Bayesian nonparametrics together. Broadly, DBNP means not only combining neural networks with stochastic processes in Bayesian modelling, but also leveraging common and effective structures from deep learning, such as convolution, recurrence and deep hierarchies, in the Bayesian nonparametric setting; introducing nonparametric methods into the structural design of neural networks; and reinterpreting neural networks as Bayesian nonparametric models from various perspectives. The corresponding training methods designed for these models, especially approximate inference, are also among our concerns.

Deep Gaussian Processes, Inference Algorithms and Applications

  1. Deep Gaussian Processes
  2. Nested Variational Compression in Deep Gaussian Processes
  3. Training Deep Gaussian Processes using Stochastic Expectation Propagation and Probabilistic Backpropagation
  4. Variational Auto-encoded Deep Gaussian Processes
  5. Deep Gaussian Processes for Regression using Approximate Expectation Propagation
  6. Random Feature Expansions for Deep Gaussian Processes
  7. Doubly Stochastic Variational Inference for Deep Gaussian Processes
  8. Deep Gaussian Processes with Decoupled Inducing Inputs
  9. Deep Gaussian Processes with Convolutional Kernels
  10. Inference in Deep Gaussian Processes using Stochastic Gradient Hamiltonian Monte Carlo
  11. Efficient Global Optimization using Deep Gaussian Processes
  12. Deep Convolutional Gaussian Processes
  13. Deep Gaussian Processes for Multi-fidelity Modeling
  14. Deep Gaussian Processes with Importance-Weighted Variational Inference
  15. Compositional Uncertainty in Deep Gaussian Processes
  16. Implicit Posterior Variational Inference for Deep Gaussian Processes
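
The papers above build on one core construction: a deep Gaussian process feeds the output of one GP-distributed function into the next as its input. A minimal sketch of sampling from a two-layer deep GP prior (the kernel choice, lengthscales and layer count here are illustrative, not taken from any specific paper above):

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D points."""
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

def sample_gp_layer(X, rng, jitter=1e-8):
    """Draw one function sample f ~ GP(0, k) evaluated at the inputs X."""
    K = rbf_kernel(X, X) + jitter * np.eye(len(X))
    return rng.multivariate_normal(np.zeros(len(X)), K)

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)

# A two-layer deep GP prior sample: the output of layer 1 becomes
# the input of layer 2, composing two random smooth functions.
h = sample_gp_layer(x, rng)   # hidden layer f1(x)
y = sample_gp_layer(h, rng)   # output layer f2(f1(x))
print(y.shape)  # (50,)
```

The composition is what makes exact inference intractable and motivates the variational, EP and MCMC schemes in the list above.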

Reinterpretations of Neural Networks as Bayesian Nonparametric Models

  1. Deep Bayesian Neural Nets as Deep Matrix Gaussian Processes
  2. Deep Neural Networks as Gaussian Processes
  3. Gaussian Process Behaviour in Wide Deep Neural Networks
  4. Deep Convolutional Networks as Shallow Gaussian Processes
  5. Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes
  6. On the Connection between Neural Processes and Gaussian Processes with Deep Kernels
  7. Approximate Inference Turns Deep Networks into Gaussian Processes
  8. Non-Gaussian Processes and Neural Networks at Finite Widths
  9. Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes
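
The NN-as-GP correspondence above rests on a layer-wise kernel recursion; for ReLU activations it has a closed arc-cosine form. A minimal sketch (the variances sigma_w^2, sigma_b^2 and the three-layer depth are illustrative choices, not taken from any single paper):

```python
import numpy as np

def nngp_relu_step(Kxx, Kxy, Kyy, sigma_w2=1.0, sigma_b2=0.1):
    """One layer of the NNGP kernel recursion for ReLU activations.

    Given the previous layer's covariances K(x,x), K(x,y), K(y,y),
    return the next layer's values via the arc-cosine formula.
    """
    # Correlation between the two pre-activations at the previous layer.
    cos_t = np.clip(Kxy / np.sqrt(Kxx * Kyy), -1.0, 1.0)
    theta = np.arccos(cos_t)
    j = np.sin(theta) + (np.pi - theta) * cos_t
    new_Kxy = sigma_b2 + sigma_w2 * np.sqrt(Kxx * Kyy) * j / (2 * np.pi)
    # Diagonal terms: theta = 0, so the bracket equals pi.
    new_Kxx = sigma_b2 + sigma_w2 * Kxx / 2
    new_Kyy = sigma_b2 + sigma_w2 * Kyy / 2
    return new_Kxx, new_Kxy, new_Kyy

# Base case for inputs x, y in R^d: K0 = sigma_b2 + sigma_w2 * <x, y> / d
x, y = np.array([1.0, 0.0]), np.array([0.0, 1.0])
d = len(x)
Kxx = 0.1 + x @ x / d
Kyy = 0.1 + y @ y / d
Kxy = 0.1 + x @ y / d
for _ in range(3):  # three hidden layers
    Kxx, Kxy, Kyy = nngp_relu_step(Kxx, Kxy, Kyy)
print(round(Kxy, 4))
```

Iterating this recursion over depth yields the covariance function of the GP that an infinitely wide network converges to.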

Gaussian Processes with Neural-network-inspired Structures and Inference Algorithms

  1. Recurrent Gaussian Processes
  2. Deep Recurrent Gaussian Process with Variational Sparse Spectrum Approximation
  3. Convolutional Gaussian Processes
  4. Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models

Gaussian Process Inputs Transformed by Deep Architectures

  1. Deep Kernel Learning
  2. Learning Scalable Deep Kernels with Recurrent Structure
  3. Stochastic Variational Deep Kernel Learning
  4. Semi-supervised Deep Kernel Learning: Regression with Unlabeled Data by Minimizing Predictive Variance
  5. Calibrating Deep Convolutional Gaussian Processes
  6. Differentiable Compositional Kernel Learning for Gaussian Processes
  7. Deep Learning with Differential Gaussian Process Flows
  8. Finite Rank Deep Kernel Learning
  9. Adaptive Deep Kernel Learning
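
The papers above share one recipe, deep kernel learning: warp the inputs through a neural network g before applying a standard base kernel, k(x, x') = k_base(g(x; w), g(x'; w)). A minimal numpy sketch, with a small fixed random tanh layer standing in for the learned network (all names and sizes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# A tiny fixed "network" g: one tanh layer with random weights.
# In deep kernel learning these weights would be trained jointly with
# the GP hyperparameters by maximizing the marginal likelihood.
W = rng.normal(size=(2, 8))
b = rng.normal(size=8)

def g(X):
    """Feature extractor mapping inputs in R^2 to R^8."""
    return np.tanh(X @ W + b)

def deep_rbf_kernel(X1, X2, lengthscale=1.0):
    """RBF kernel applied to network features instead of raw inputs."""
    Z1, Z2 = g(X1), g(X2)
    sqdist = ((Z1[:, None, :] - Z2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sqdist / lengthscale ** 2)

X = rng.normal(size=(5, 2))
K = deep_rbf_kernel(X, X)
print(K.shape)  # (5, 5)
```

Because g enters only through the kernel, the result is still a valid GP covariance, which is what lets the methods above reuse standard GP inference.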

Bayesian Nonparametric Neural Latent Variable Models / Amortised Inference with Nonparametric Priors

  1. Stick-breaking Variational Autoencoders
  2. Indian Buffet Process Deep Generative Models
  3. Nonparametric Variational Autoencoders for Hierarchical Representation Learning
  4. Nonparametric Bayesian Deep Networks with Local Competition
  5. A Bayesian Nonparametric Topic Model with Variational Auto-encoders
  6. Deep Bayesian Nonparametric Tracking
  7. Gaussian Process Prior Variational Autoencoders
  8. Deep Generative Model with Beta Bernoulli Process for Modeling and Learning Confounding Factors
  9. Stick-breaking Neural Latent Variable Models
  10. Deep Bayesian Nonparametric Factor Analysis
  11. Deep Factors with Gaussian Processes for Forecasting
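
Several models above (the stick-breaking VAEs, the nonparametric topic models) place a stick-breaking prior on latent mixture weights: pi_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha). A sketch of the truncated construction (the truncation level and alpha are illustrative):

```python
import numpy as np

def stick_breaking_weights(alpha, truncation, rng):
    """Sample truncated Dirichlet-process mixture weights.

    Each weight is the piece broken off the remaining stick:
    pi_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha).
    """
    v = rng.beta(1.0, alpha, size=truncation)
    # Length of stick remaining before each break (1 before the first).
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]])
    return v * remaining

rng = np.random.default_rng(42)
pi = stick_breaking_weights(alpha=2.0, truncation=20, rng=rng)
print(pi[:3], pi.sum())
```

The weights sum to slightly less than 1; the remainder is the mass beyond the truncation level, which the variational schemes in these papers either truncate away or model explicitly.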

Bayesian Nonparametric Neural Networks / Approximate Inference with Implicit Stochastic Processes as Priors

  1. Variational Implicit Processes
  2. Functional Variational Bayesian Neural Networks
  3. Functional Bayesian Neural Networks for Model Uncertainty Quantification
  4. Functional Space Particle Optimization for Bayesian Neural Networks
  5. Characterizing and Warping the Function space of Bayesian Neural Networks

Neural Network Meta-learning / Hyperparameter Tuning via Bayesian Nonparametric Approaches

  1. Mapping Gaussian Process Priors to Bayesian Neural Networks
  2. Nonparametric Bayesian Deep Networks with Local Competition
  3. Neural Architecture Search with Bayesian Optimisation and Optimal Transport
  4. Gaussian Process Neurons
  5. Characterizing and Warping the Function space of Bayesian Neural Networks
