Laplacian Eigenmaps

The Laplacian Eigenmaps (LEM) method uses spectral techniques to perform dimensionality reduction. The technique relies on the basic assumption that the data lies on a low-dimensional manifold embedded in a high-dimensional space. The algorithm provides a computationally efficient approach to non-linear dimensionality reduction with locality-preserving properties [1].
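In the standard formulation of [1], a k-nearest-neighbor graph is built over the observations, its edges are weighted with a heat kernel governed by the scale parameter ɛ, and the embedding is obtained from the bottom nontrivial eigenvectors of the resulting graph Laplacian:

W_{ij} = \exp\!\left(-\frac{\lVert x_i - x_j \rVert^2}{\varepsilon}\right), \qquad D_{ii} = \sum_j W_{ij}, \qquad L = D - W

L y = \lambda D y

The embedding coordinates are given by the eigenvectors y associated with the smallest nonzero eigenvalues.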

This package defines an LEM type to represent Laplacian Eigenmaps results, and provides a set of methods to access its properties.

ManifoldLearning.LEM — Type
LEM{NN <: AbstractNearestNeighbors, T <: Real} <: NonlinearDimensionalityReduction

The LEM type represents a Laplacian Eigenmaps model constructed for data of type T with the help of the NN nearest neighbor algorithm.
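A minimal sketch of constructing and inspecting a model; the size and eigvals accessors are assumed to be among the property-access methods mentioned above, not confirmed by this page:

using ManifoldLearning, LinearAlgebra

M = fit(LEM, rand(3, 100))   # fit a model to 100 random 3-dimensional observations
size(M)                      # input/output dimensions of the model (assumed accessor)
eigvals(M)                   # eigenvalue spectrum used for the embedding (assumed accessor)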

StatsAPI.fit — Method
fit(LEM, data; k=12, maxoutdim=2, ɛ=1.0, laplacian=:unnorm, nntype=BruteForce)

Fit a Laplacian eigenmaps model to data.

Arguments

  • data: a matrix of observations. Each column of data is an observation.

Keyword arguments

  • k: the number of nearest neighbors used to construct the local subspace representation
  • maxoutdim: the dimension of the reduced space
  • nntype: the nearest neighbor construction type (derived from AbstractNearestNeighbors)
  • ɛ: the Gaussian kernel variance (the scale parameter)
  • laplacian: the form of the Laplacian matrix used for the spectral decomposition
    • :unnorm: the unnormalized Laplacian
    • :sym: the symmetrically normalized Laplacian
    • :rw: the random walk normalized Laplacian

Examples

M = fit(LEM, rand(3,100)) # construct Laplacian eigenmaps model
R = predict(M)          # perform dimensionality reduction
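The keyword arguments described above can be combined in the same call; the following sketch uses purely illustrative values and selects the symmetrically normalized Laplacian:

X = rand(3, 100)                                            # toy data, one observation per column
M = fit(LEM, X; k=10, maxoutdim=2, ɛ=2.0, laplacian=:sym)   # fit with a symmetrically normalized Laplacian
R = predict(M)                                              # reduced representation of X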
StatsAPI.predict — Method
predict(R::LEM)

Transform the data fitted to the Laplacian Eigenmaps model R into its reduced-space representation.
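For example, continuing the fit example above, the result is expected to be a matrix with one column per original observation (an assumption following the column-observation convention used by fit):

M = fit(LEM, rand(3, 100); maxoutdim=2)
R = predict(M)
size(R)   # expected (2, 100): maxoutdim rows, one column per observation (assumption)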


References

  • [1] Belkin, M. and Niyogi, P. "Laplacian Eigenmaps for Dimensionality Reduction and Data Representation". Neural Computation, June 2003; 15(6):1373–1396. DOI: 10.1162/089976603321780317