Nonlinear Dimensionality Reduction Approaches

Dimensionality Reduction
The goal: recover the meaningful low-dimensional structures hidden in high-dimensional observations.
Classical linear techniques:
- Principal Component Analysis (PCA): preserves the variance
- Multidimensional Scaling (MDS): preserves inter-point distances
Nonlinear techniques covered here:
- Isomap
- Locally Linear Embedding (LLE)

Common Framework
Algorithm:
1. Given data {x_1, ..., x_n}, construct an n×n affinity matrix M.
2. Normalize M, yielding M̃.
3. Compute the m largest eigenvalues λ_j and eigenvectors v_j of M̃. Only positive eigenvalues should be considered.
4. The embedding of each example x_i is the vector y_i whose j-th coordinate y_ij is the i-th element of the j-th principal eigenvector v_j of M̃. Alternatively (MDS and Isomap), the embedding is e_i, with e_ij = √λ_j · y_ij. If the first m eigenvalues are positive, then e_i · e_j is the best approximation of M̃_ij using only m coordinates, in the sense of squared error.
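A minimal sketch of this framework (not from the original slides); affinity and normalize are hypothetical placeholders for the algorithm-specific choices, e.g. geodesic distances plus double-centering for Isomap:

```python
import numpy as np

def spectral_embed(X, affinity, normalize, m=2):
    M = affinity(X)                          # step 1: n x n affinity matrix
    M_tilde = normalize(M)                   # step 2: normalized matrix M~
    vals, vecs = np.linalg.eigh(M_tilde)     # step 3: eigendecomposition (ascending)
    order = np.argsort(vals)[::-1][:m]       # keep the m largest eigenvalues
    vals, vecs = vals[order], vecs[:, order]
    # step 4 (MDS/Isomap variant): e_ij = sqrt(lambda_j) * y_ij;
    # clamping at zero drops any non-positive eigenvalues.
    return vecs * np.sqrt(np.maximum(vals, 0))
```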

Linear Dimensionality Reduction
PCA: finds a low-dimensional embedding of the data points that best preserves their variance as measured in the high-dimensional input space.
MDS: finds an embedding that preserves the inter-point distances; equivalent to PCA when the distances are Euclidean.
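For concreteness, a minimal PCA sketch (an illustration, not the lecture's code), projecting centered data onto its top-d principal components via the SVD:

```python
import numpy as np

def pca_embed(X, d=2):
    Xc = X - X.mean(axis=0)                  # center in the input space
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:d].T                     # coordinates along the top-d components
```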

Multi-Dimensional Scaling
MDS starts from a notion of distance or affinity that is computed between each pair of training examples. The normalizing step converts the squared distances S_ij = d²(x_i, x_j) to equivalent dot products using the "double-centering" formula:

M̃_ij = −(1/2) (S_ij − (1/n) Σ_k S_ik − (1/n) Σ_k S_kj + (1/n²) Σ_{k,l} S_kl)

The embedding of example x_i is given by e_ik = √λ_k · v_ki, where v_k is the k-th eigenvector of M̃. Note that if S_ij = ‖x_i − x_j‖², then M̃_ij = (x_i − x̄)·(x_j − x̄), where x̄ is the average value of the x_i.
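A small sketch of the double-centering step under the same notation, assuming S is the matrix of squared pairwise distances:

```python
import numpy as np

def double_center(S):
    row = S.mean(axis=1, keepdims=True)      # (1/n) sum_k S_ik
    col = S.mean(axis=0, keepdims=True)      # (1/n) sum_k S_kj
    return -0.5 * (S - row - col + S.mean()) # M~_ij, a dot-product (Gram) matrix
```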

Nonlinear Dimensionality Reduction
Many data sets contain essential nonlinear structures that are invisible to PCA and MDS. This motivates nonlinear dimensionality reduction approaches.

A Global Geometric Framework for Nonlinear Dimensionality Reduction (Isomap) Joshua B. Tenenbaum, Vin de Silva, John C. Langford

Example
64×64 input images form 4096-dimensional vectors. Intrinsically, three dimensions are enough to describe them: two pose parameters and the azimuthal lighting angle.

Isomap Advantages
Combines the major algorithmic features of PCA and MDS:
- computational efficiency
- global optimality
- asymptotic convergence guarantees
with the flexibility to learn a broad class of nonlinear manifolds.

Example of Nonlinear Structure
The Swiss roll: only the geodesic distances reflect the true low-dimensional geometry of the manifold.

Intuition
Isomap is built on top of MDS. It captures the geodesic path between any two points on the manifold by concatenating shortest paths between neighboring points, and approximates these shortest paths given only input-space distances.

Algorithm Description
Step 1: Determine which points are neighbors (those within a fixed radius) based on the input-space distances. These neighborhood relations are represented as a weighted graph G over the data points.
Step 2: Estimate the geodesic distances between all pairs of points on the manifold M by computing their shortest-path distances d_G(i, j) in the graph G.
Step 3: Construct an embedding of the data in d-dimensional Euclidean space Y that best preserves the manifold's geometry.
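A minimal sketch of Steps 1-2 (assuming a data matrix X of shape (n, D); a k-nearest-neighbor rule stands in for the fixed radius):

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import shortest_path

def geodesic_distances(X, n_neighbors=10):
    # Step 1: weighted neighborhood graph G, edge weights = input-space distances.
    G = kneighbors_graph(X, n_neighbors, mode="distance")
    # Step 2: all-pairs shortest paths d_G approximate the geodesic distances d_M.
    return shortest_path(G, directed=False)
```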

Construct Embeddings
The coordinate vectors y_i for points in Y are chosen to minimize the cost function

E = ‖τ(D_G) − τ(D_Y)‖,

where D_Y denotes the matrix of Euclidean distances d_Y(i, j) = ‖y_i − y_j‖ and ‖·‖ the L² matrix norm. The operator τ converts distances to inner products: τ(D) = −HSH/2, with S_ij = D_ij² and H the centering matrix H_ij = δ_ij − 1/N.
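A sketch of Step 3 under the definitions above, assuming D is the matrix of graph distances from the previous snippet:

```python
import numpy as np

def isomap_embed(D, d=2):
    n = D.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix H
    K = -0.5 * H @ (D ** 2) @ H              # tau(D): distances -> inner products
    vals, vecs = np.linalg.eigh(K)           # ascending eigenvalues
    order = np.argsort(vals)[::-1][:d]       # top-d eigenpairs
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))
```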

Dimension
The true dimensionality of the data can be estimated from the decrease in the residual error as the dimensionality of Y is increased.
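One hedged way to compute that error curve, following the paper's residual-variance measure 1 − R² between geodesic and embedding distances (an elbow in the curve suggests the intrinsic dimensionality):

```python
import numpy as np
from scipy.spatial.distance import pdist

def residual_variance(D, Y):
    d_geo = D[np.triu_indices_from(D, k=1)]  # upper-triangle geodesic distances
    d_emb = pdist(Y)                         # matching embedding distances
    r = np.corrcoef(d_geo, d_emb)[0, 1]
    return 1.0 - r ** 2
```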

Manifold Recovery Guarantee
Isomap is guaranteed asymptotically to recover the true dimensionality and geometric structure of nonlinear manifolds: as the number of sample points increases, the graph distances d_G(i, j) provide increasingly better approximations to the intrinsic geodesic distances d_M(i, j).

Examples
Interpolations between distant points in the low-dimensional coordinate space.

Summary
Isomap handles nonlinear manifolds and keeps the advantages of PCA and MDS:
- a non-iterative procedure
- polynomial-time complexity
- guaranteed convergence
Isomap represents the global structure of a data set within a single coordinate system.

Nonlinear Dimensionality Reduction by Locally Linear Embedding Sam T. Roweis and Lawrence K. Saul

LLE
- Neighborhood-preserving embeddings
- Mapping to a global coordinate system of low dimensionality
- No need to estimate pairwise distances between widely separated points
- Recovers global nonlinear structure from locally linear fits

Algorithm Description
We expect each data point and its neighbors to lie on or close to a locally linear patch of the manifold. We reconstruct each point from its neighbors by minimizing the cost

E(W) = Σ_i ‖x_i − Σ_j W_ij x_j‖²,

where the weight W_ij summarizes the contribution of the j-th data point to the reconstruction of the i-th, and is estimated by optimizing this error subject to two constraints:
- each point is reconstructed only from its neighbors (W_ij = 0 unless x_j is a neighbor of x_i)
- the weights for each point sum to 1 (Σ_j W_ij = 1)
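A minimal sketch of this weight-fitting step (assuming X of shape (n, D), k neighbors, and a small regularizer to stabilize the local Gram matrix):

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def lle_weights(X, k=10, reg=1e-3):
    n = X.shape[0]
    nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nbrs.kneighbors(X)              # idx[:, 0] is the point itself
    W = np.zeros((n, n))
    for i in range(n):
        neighbors = idx[i, 1:]
        Z = X[neighbors] - X[i]              # center the neighborhood on x_i
        C = Z @ Z.T                          # local Gram matrix
        C = C + reg * np.trace(C) * np.eye(k)  # regularize for stability
        w = np.linalg.solve(C, np.ones(k))   # least-squares reconstruction weights
        W[i, neighbors] = w / w.sum()        # enforce the sum-to-one constraint
    return W
```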

Algorithm Description
A linear mapping then transforms the high-dimensional coordinates of each point to global internal coordinates on the manifold: choose low-dimensional vectors y_i minimizing the embedding cost

Φ(Y) = Σ_i ‖y_i − Σ_j W_ij y_j‖²

with the weights W_ij held fixed. Note that the cost defines a quadratic form Φ(Y) = Σ_{ij} M_ij (y_i · y_j), where M = (I − W)ᵀ(I − W). The optimal embedding is found by computing the bottom d eigenvectors of M (discarding the constant eigenvector with eigenvalue zero), where d is the dimension of the embedding.
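A matching sketch of the embedding step, assuming the weight matrix W from the previous snippet:

```python
import numpy as np

def lle_embed(W, d=2):
    n = W.shape[0]
    I = np.eye(n)
    M = (I - W).T @ (I - W)                  # quadratic-form matrix
    vals, vecs = np.linalg.eigh(M)           # eigenvalues in ascending order
    return vecs[:, 1:d + 1]                  # skip the constant bottom eigenvector
```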

Illustration

Examples
Two-dimensional embeddings of faces.

Examples

Thank you