A Global Geometric Framework for Nonlinear Dimensionality Reduction
Joshua B. Tenenbaum (Stanford), Vin de Silva (Stanford), John C. Langford (CMU), SCIENCE 2000
Presented by: Mingyuan Zhou, Duke University, ECE, April 3, 2009

Outline
- Motivations
- PCA, Principal Component Analysis
- MDS, Multidimensional Scaling
- Isomap, Isometric Feature Mapping
- Examples
- Summary

Motivations
- Finding meaningful low-dimensional structures hidden in high-dimensional observations
- The human brain reduces high-dimensional sensory input to a small number of perceptually relevant features
- Machine learning seeks relevant low-dimensional features in high-dimensional data
- Discovering the nonlinear degrees of freedom underlying complex natural observations: human handwriting, face images, the Swiss roll, ...

[Figures: 3 degrees of freedom in face images (up-down pose, left-right pose, lighting direction); 2 degrees of freedom on the manifold; 2 degrees of freedom in handwritten digits (top arch articulation, bottom loop articulation).]

PCA, Principal Component Analysis
PCA finds a low-dimensional embedding of the data points that best preserves their variance as measured in the high-dimensional input space. Each embedding coordinate is $y_i = p_i^T x$, where $p_i$ is the eigenvector corresponding to the $i$-th largest eigenvalue of the covariance matrix of X (Wikipedia).
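A minimal numpy sketch of this procedure (illustrative, not from the slides): center the data, eigendecompose the covariance matrix, and project onto the top-d eigenvectors.

```python
import numpy as np

def pca(X, d):
    """Project rows of X (n_samples x n_features) onto the top-d
    principal components: the eigenvectors of the covariance matrix
    with the d largest eigenvalues."""
    Xc = X - X.mean(axis=0)               # center the data
    C = np.cov(Xc, rowvar=False)          # covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)  # eigh returns ascending eigenvalues
    P = eigvecs[:, ::-1][:, :d]           # top-d eigenvectors
    return Xc @ P                         # low-dimensional embedding
```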

MDS, Multidimensional Scaling
MDS finds an embedding that preserves the pairwise distances (or generalized disparities) between data points; it is equivalent to PCA when those distances are Euclidean. "An MDS algorithm starts with a matrix of item–item similarities, then assigns a location to each item in N-dimensional space, where N is specified a priori. For sufficiently small N, the resulting locations may be displayed in a graph or 3D visualization." (Wikipedia)
Objective: choose the map coordinates to minimize the stress
$$\text{Stress} = \sqrt{\sum_{i<j} \left( d_{ij} - \delta_{ij} \right)^2},$$
where $d_{ij}$ is the Euclidean distance between points $i$ and $j$ on the map, and $\delta_{ij}$ is the dissimilarity between them.
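For concreteness, here is a sketch of classical (Torgerson) MDS in numpy; the double-centering route through the Gram matrix is the standard construction, though the slide does not spell it out.

```python
import numpy as np

def classical_mds(D, d):
    """Classical MDS: embed n points in R^d from an n x n matrix of
    pairwise distances D, preserving those distances as well as possible."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:d]        # d largest eigenvalues
    L = np.sqrt(np.maximum(eigvals[idx], 0))   # guard against tiny negatives
    return eigvecs[:, idx] * L                 # coordinates = V * sqrt(Lambda)
```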

Advantages and Limitations of PCA and MDS
Advantages:
- Computational efficiency
- Global optimality
- Asymptotic convergence guarantees
Limitations:
- Many data sets contain essential nonlinear structures that are invisible to PCA and MDS (the Euclidean structure of the input space fails to capture them).

Isomap, Isometric Feature Mapping
Basic idea: Isomap builds on classical MDS but seeks to preserve the intrinsic geometry of the data, as captured in the geodesic manifold distances between all pairs of data points. The key point is estimating the geodesic distance between faraway points, given only input-space distances.

Isomap: Algorithm
Step 1: Determine which points are neighbors on the manifold M, based on their distances in the input space (points within a fixed radius, or the K nearest neighbors). Represent these neighborhood relations as a weighted graph G over the data points. A sketch of this step follows.
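A minimal sketch of step 1 using the K-nearest-neighbors rule (the function name and symmetrization choice are my own; the original does not prescribe them):

```python
import numpy as np

def neighbor_graph(X, k):
    """Step 1: weighted K-nearest-neighbor graph. Entry (i, j) holds the
    Euclidean input-space distance if j is among i's k nearest neighbors
    (or vice versa), and infinity otherwise."""
    diff = X[:, None, :] - X[None, :, :]
    D = np.sqrt((diff ** 2).sum(-1))       # all pairwise input-space distances
    n = len(X)
    G = np.full((n, n), np.inf)
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]   # skip self at position 0
        G[i, nbrs] = D[i, nbrs]
    G = np.minimum(G, G.T)                 # symmetrize: neighbor in either direction
    np.fill_diagonal(G, 0.0)
    return G
```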

Isomap: Algorithm
Step 2: Estimate the geodesic distances between all pairs of points on the manifold M by computing their shortest-path distances in the graph G. A simple algorithm (Floyd's): initialize $d_G(i,j) = d_X(i,j)$ if $i$ and $j$ are neighbors and $\infty$ otherwise; then for each $k = 1, \dots, N$ in turn, replace every entry $d_G(i,j)$ by $\min\{d_G(i,j),\; d_G(i,k) + d_G(k,j)\}$.
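The same update, vectorized in numpy (an $O(N^3)$ Floyd-Warshall pass, fine for moderate data set sizes):

```python
import numpy as np

def geodesic_distances(G):
    """Step 2 (Floyd's algorithm): replace each graph weight with the
    shortest-path distance in G, approximating geodesic distance on
    the manifold."""
    D = G.copy()
    n = D.shape[0]
    for k in range(n):
        # Relax all paths through intermediate node k at once.
        D = np.minimum(D, D[:, k:k + 1] + D[k:k + 1, :])
    return D
```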

Isomap: Algorithm
Step 3: Apply classical MDS to the matrix of graph distances $D_G = \{d_G(i,j)\}$, constructing an embedding of the data in a d-dimensional Euclidean space Y that best preserves the manifold's estimated intrinsic geometry. The coordinates are chosen by minimizing the cost $E = \|\tau(D_G) - \tau(D_Y)\|_{L^2}$, where $D_Y$ is the matrix of Euclidean distances in the embedding and $\tau$ converts distances to inner products, as in classical MDS.
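Tying the three steps together with the helper functions sketched above (hypothetical names; this assumes the neighborhood graph is connected, so no geodesic distance remains infinite):

```python
def isomap(X, k, d):
    """Full Isomap sketch: neighborhood graph -> shortest paths ->
    classical MDS on the geodesic distance matrix."""
    G = neighbor_graph(X, k)       # step 1: K-nearest-neighbor graph
    D_G = geodesic_distances(G)    # step 2: approximate geodesics
    return classical_mds(D_G, d)   # step 3: MDS on graph distances
```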

Isomap: Examples

Performance comparison
[Figure: PCA/MDS versus Isomap on (A) face images varying in pose and illumination and (B) Swiss roll data.]

Summary
Isomap is capable of discovering the nonlinear degrees of freedom that underlie complex natural observations. It efficiently (noniteratively, in polynomial time) computes a globally optimal solution, and it is guaranteed to converge asymptotically to the true structure for intrinsically Euclidean manifolds.