ManifoldLearning

A Julia package for manifold learning and nonlinear dimensionality reduction.


Methods

  • Isomap
  • Diffusion maps
  • Locally Linear Embedding (LLE)
  • Hessian Eigenmaps (HLLE)
  • Laplacian Eigenmaps (LEM)
  • Local tangent space alignment (LTSA)
  • t-Distributed Stochastic Neighbor Embedding (t-SNE)
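
All of the listed methods share a common interface: construct a model with fit and obtain the embedding with transform, as in the Isomap example below. The following is a minimal sketch of fitting a few of the other methods on the same data; the type names DiffMap and LEM are assumed to be the exported ones and may differ between package versions.

using ManifoldLearning

X, _ = ManifoldLearning.swiss_roll()

# Same fit/transform pattern for every method; keyword defaults apply.
Y_dm  = transform(fit(DiffMap, X))   # Diffusion maps
Y_lem = transform(fit(LEM, X))       # Laplacian Eigenmaps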

Installation

The package can be installed with the Julia package manager. From the Julia REPL, type ] to enter the Pkg REPL mode and run:

pkg> add ManifoldLearning
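
Equivalently, the package can be added non-interactively through the Pkg API:

using Pkg
Pkg.add("ManifoldLearning")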

Examples

A simple example of using the Isomap dimensionality reduction method:

julia> X, _ = ManifoldLearning.swiss_roll();

julia> X
3×1000 Array{Float64,2}:
  -3.19512  3.51939   -0.0390153  …  -9.46166   3.44159
  29.1222   9.99283    2.25296       25.1417   28.8007
 -10.1861   6.59074  -11.037         -1.04484  13.4034

julia> M = fit(Isomap, X)
Isomap(outdim = 2, neighbors = 12)

julia> Y = transform(M)
2×1000 Array{Float64,2}:
 11.0033  -13.069   16.7116  …  -3.26095   25.7771
 18.4133   -6.2693  10.6698     20.0646   -24.8973
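
The rows of Y hold the embedding coordinates, so the unrolled manifold can be inspected with any plotting package. A minimal sketch using Plots.jl, which is not a dependency of ManifoldLearning and is assumed to be installed separately:

using Plots

# Scatter the two embedding coordinates produced by the Isomap example above.
scatter(Y[1, :], Y[2, :];
        markersize = 2, legend = false,
        xlabel = "component 1", ylabel = "component 2")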

Performance

Most of the methods use the k-nearest neighbors algorithm to construct a local subspace representation. By default, neighbors are computed from a full pairwise distance matrix of the dataset, which is inefficient, especially for large datasets.

Consider using a custom k-nearest neighbors function, e.g. from NearestNeighbors.jl or FLANN.jl.

See an example of a custom kNN function in the package repository.
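
For illustration, the sketch below builds a KD-tree with NearestNeighbors.jl and queries the k nearest neighbors of every point; the resulting indices and distances are the quantities a custom kNN function has to supply to the embedding methods. The exact wrapper interface expected by ManifoldLearning is the one shown in the example mentioned above.

using ManifoldLearning, NearestNeighbors

X, _ = ManifoldLearning.swiss_roll()
k = 12

tree = KDTree(X)                         # build the tree once
idxs, dists = knn(tree, X, k + 1, true)  # `true` sorts neighbors by distance

# Each point is its own nearest neighbor (distance 0), so drop the first entry.
idxs  = [i[2:end] for i in idxs]
dists = [d[2:end] for d in dists]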