---
abstract: |
  Density modelling in high dimensions is a very difficult problem. Traditional approaches, such as mixtures of Gaussians, typically fail to capture the structure of data sets in high dimensional spaces. In this talk we will argue that for many data sets of interest, the data can be represented as a lower dimensional manifold immersed in the higher dimensional space. We will then present the Gaussian Process Latent Variable Model (GP-LVM), a non-linear probabilistic variant of principal component analysis (PCA) which implicitly assumes that the data lies on a lower dimensional space. Having introduced the GP-LVM we will review extensions to the algorithm, including dynamics, learning of large data sets and back constraints. We will demonstrate the application of the model and its extensions to a range of data sets, including human motion data, a vowel data set and a robot mapping problem.
author:
- family: Lawrence
  given: Neil D.
  gscholar: r3SJcvoAAAAJ
  institute: University of Sheffield
  twitter: lawrennd
  url:
categories:
- Lawrence-ncrg07
day: 9
errata:
extras:
- label: Demos Software
  link:
- label: Main Software
  link:
group: gplvm
key: Lawrence-ncrg07
layout: talk
linkpdf: ftp://ftp.dcs.shef.ac.uk/home/neil/gplvm_07_02.pdf
month: 3
published: 2007-03-09
section: pre
title: Probabilistic Dimensional Reduction with the <span>G</span>aussian Process Latent Variable Model
venue: Neural Computing Research Group, Aston University, U.K.
year: 2007
---
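The GP-LVM in the abstract maximizes the Gaussian process marginal likelihood of the observed data with respect to the latent coordinates, initialized from PCA. Below is a minimal illustrative sketch of that idea in NumPy/SciPy; it is not the software linked above, and the names (`gplvm_fit`, `neg_log_likelihood`) and the fixed kernel hyperparameters are assumptions made for this example.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel over the latent points X (N x q).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def neg_log_likelihood(x_flat, Y, q, noise=1e-2):
    # Negative GP marginal log likelihood of Y (N x D), up to a constant:
    #   (D/2) log|K| + (1/2) tr(K^{-1} Y Y^T),  K = k(X, X) + noise * I.
    N, D = Y.shape
    X = x_flat.reshape(N, q)
    K = rbf_kernel(X) + noise * np.eye(N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L, Y)            # L^{-1} Y
    log_det = 2.0 * np.sum(np.log(np.diag(L)))
    return 0.5 * D * log_det + 0.5 * np.sum(alpha**2)

def gplvm_fit(Y, q=2, iters=200):
    # Initialize the latent coordinates with PCA, then optimize them
    # by gradient-based minimization of the negative log likelihood.
    Yc = Y - Y.mean(axis=0)
    _, _, Vt = np.linalg.svd(Yc, full_matrices=False)
    X0 = Yc @ Vt[:q].T
    res = minimize(neg_log_likelihood, X0.ravel(), args=(Y, q),
                   method="L-BFGS-B", options={"maxiter": iters})
    return res.x.reshape(Y.shape[0], q)

# Toy demo: 3-D observations lying near a 1-D curve.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 30)
Y = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), t], axis=1)
Y += 0.01 * rng.standard_normal((30, 3))
X = gplvm_fit(Y, q=1)
print(X.shape)
```

A full treatment would also optimize the kernel hyperparameters and noise variance jointly with the latent positions; the talk's extensions (dynamics, sparse approximations for large data sets, back constraints) modify this objective further.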