| field | value |
|---|---|
| abstract | Density modelling in high dimensions is a very difficult problem. Traditional approaches, such as mixtures of Gaussians, typically fail to capture the structure of data sets in high-dimensional spaces. In this talk we will argue that, for many data sets of interest, the data can be represented as a lower-dimensional manifold immersed in the higher-dimensional space. We will then present the Gaussian Process Latent Variable Model (GP-LVM), a non-linear probabilistic variant of principal component analysis (PCA) which implicitly assumes that the data lie in a lower-dimensional space. Having introduced the GP-LVM, we will review extensions to the algorithm, including dynamics, learning from large data sets and back constraints. We will demonstrate the application of the model and its extensions to a range of data sets, including human motion data, a vowel data set and a robot mapping problem. |
| author | |
| categories | |
| day | 9 |
| errata | |
| extras | |
| group | gplvm |
| key | Lawrence-ncrg07 |
| layout | talk |
| linkpdf | ftp://ftp.dcs.shef.ac.uk/home/neil/gplvm_07_02.pdf |
| month | 3 |
| published | 2007-03-09 |
| section | pre |
| title | Probabilistic Dimensional Reduction with the Gaussian Process Latent Variable Model |
| venue | Neural Computing Research Group, Aston University, U.K. |
| year | 2007 |
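The abstract describes the GP-LVM as a non-linear probabilistic variant of PCA in which the data are assumed to lie near a lower-dimensional manifold. A minimal sketch of that idea (not the talk's implementation): maximize the GP marginal likelihood over the latent positions, with PCA providing the initialization. All function names, kernel settings and the toy data below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, lengthscale=1.0, variance=1.0, noise=1e-2):
    # RBF (squared-exponential) kernel over latent points, with jitter/noise.
    sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    return variance * np.exp(-0.5 * sq / lengthscale**2) + noise * np.eye(len(X))

def neg_log_marginal(x_flat, Y, Q):
    # Negative GP-LVM log marginal likelihood (up to an additive constant):
    # 0.5 * D * log|K| + 0.5 * tr(K^{-1} Y Y^T), with K built from latent X.
    N, D = Y.shape
    K = rbf_kernel(x_flat.reshape(N, Q))
    _, logdet = np.linalg.slogdet(K)
    return 0.5 * D * logdet + 0.5 * np.sum(Y * np.linalg.solve(K, Y))

def fit_gplvm(Y, Q=2, iters=50):
    # Centre the data; initialize latents with PCA (the linear special case),
    # then optimize the latent positions directly (illustrative sketch only).
    N, _ = Y.shape
    Yc = Y - Y.mean(0)
    _, _, Vt = np.linalg.svd(Yc, full_matrices=False)
    X0 = Yc @ Vt[:Q].T
    res = minimize(neg_log_marginal, X0.ravel(), args=(Yc, Q),
                   method="L-BFGS-B", options={"maxiter": iters})
    return res.x.reshape(N, Q), res.fun

# Toy data: a 1-D manifold (a circle) embedded in 5 dimensions with noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 40)
Y = np.stack([np.sin(t), np.cos(t)]
             + [0.05 * rng.standard_normal(40) for _ in range(3)], axis=1)
X, nll = fit_gplvm(Y, Q=1)
print(X.shape)  # → (40, 1)
```

In the full model the kernel hyperparameters would be optimized jointly with the latent positions, and the extensions mentioned in the abstract (dynamics, sparse approximations for large data sets, back constraints) modify this same objective.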