Nonlinear Dimensionality Reduction Approach (ISOMAP)


Nonlinear Dimensionality Reduction Approach (ISOMAP)
2006. 2. 28
Young Ki Baik, Computer Vision Lab., Seoul National University

References
- A Global Geometric Framework for Nonlinear Dimensionality Reduction, J. B. Tenenbaum, V. de Silva, J. C. Langford (Science, 2000)
- LLE and Isomap Analysis of Spectra and Colour Images, Dejan Kulpinski (Thesis, 1999)
- Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering, Yoshua Bengio et al. (Technical Report, 2003)

Contents
- Introduction
- PCA and MDS
- ISOMAP
- Conclusion

Dimensionality Reduction
- The goal: recover the meaningful low-dimensional structures hidden in high-dimensional observations.
- Classical techniques:
  - PCA (Principal Component Analysis) - preserves the variance
  - MDS (Multidimensional Scaling) - preserves the inter-point distances
- Nonlinear techniques:
  - ISOMAP
  - LLE (Locally Linear Embedding)

Linear Dimensionality Reduction
- PCA: finds a low-dimensional embedding of the data points that best preserves their variance as measured in the high-dimensional input space.
- MDS: finds an embedding that preserves the inter-point distances; equivalent to PCA when those distances are Euclidean.

Linear Dimensionality Reduction (MDS)
[Figure/equation: the relation between the pairwise input-space distances and the embedding]
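The relation shown on this slide did not survive the transcript; the sketch below reconstructs the standard classical-MDS relation it presumably illustrated, in which squared pairwise distances are converted to a centered Gram matrix by double centering (the symbols D, H, B and the notation are ours, not from the slide).

```latex
% Classical MDS: from pairwise distances D_{ij} = \|x_i - x_j\| to an embedding.
% N is the number of data points, \mathbf{1} the all-ones vector.
\[
B = -\tfrac{1}{2}\, H\, D^{(2)}\, H,
\qquad D^{(2)}_{ij} = D_{ij}^{2},
\qquad H = I - \tfrac{1}{N}\,\mathbf{1}\mathbf{1}^{\top}.
\]
% B is the centered Gram matrix. With the eigendecomposition B = V \Lambda V^{\top},
% the d-dimensional embedding is Y = V_d \Lambda_d^{1/2} (top-d eigenpairs),
% which coincides with PCA when the D_{ij} are Euclidean distances.
```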

Nonlinear Dimensionality Reduction
- Many data sets contain essential nonlinear structures that are invisible to PCA and MDS.
- We therefore resort to nonlinear dimensionality reduction approaches.

ISOMAP
- Example of a nonlinear structure: the Swiss roll. Only the geodesic distances reflect the true low-dimensional geometry of the manifold.
- ISOMAP (Isometric feature Mapping):
  - preserves the intrinsic geometry of the data;
  - uses the geodesic manifold distances between all pairs of points.
[Figure: the Swiss roll, an example of a nonlinear structure]
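As a side note, a Swiss-roll point cloud like the one in the figure is easy to generate for experimentation; a minimal sketch assuming scikit-learn is available (the parameter values are illustrative, not from the slides):

```python
# Minimal sketch: generate a Swiss-roll data set like the one in the figure.
from sklearn.datasets import make_swiss_roll

X, t = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)
# X has shape (1000, 3): points on a 2D sheet rolled up in 3D.
# t is the position along the roll, i.e. one of the true intrinsic coordinates.
print(X.shape, t.shape)
```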

ISOMAP (Algorithm Description)
- Step 1: Determine the neighboring points of each point (e.g., within a fixed radius), based on the input-space distances. These neighborhood relations are represented as a weighted graph G over the data points.
- Step 2: Estimate the geodesic distances between all pairs of points on the manifold by computing their shortest-path distances in the graph G.
- Step 3: Construct an embedding of the data in d-dimensional Euclidean space Y that best preserves the manifold's geometry.
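For reference, the whole three-step pipeline is available off the shelf; a minimal usage sketch with scikit-learn's Isomap on the Swiss roll (parameter values are illustrative):

```python
# Minimal sketch: run the full ISOMAP pipeline with an off-the-shelf implementation.
# scikit-learn's Isomap carries out the three steps internally:
# neighborhood graph, shortest-path (geodesic) distances, classical MDS.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)
isomap = Isomap(n_neighbors=10, n_components=2)  # K-nearest-neighbor graph, d = 2
Y = isomap.fit_transform(X)                      # (1000, 2) embedding coordinates
print(Y.shape)
```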

ISOMAP (Algorithm Description) - Step 1
- Determine the neighboring points of each point based on the input-space distance d_X(i, j), using either an ε-radius neighborhood or the K nearest neighbors (e.g., K = 4, as in the figure).
- These neighborhood relations are represented as a weighted graph G over the data points.
[Figure: ε-ball and K-nearest-neighbor (K = 4) neighborhoods of a point i, with neighbors j and k]
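A minimal sketch of Step 1, building a weighted K-nearest-neighbor graph from the input-space distances (the function and variable names are ours; an ε-radius variant would simply threshold the distance matrix instead):

```python
import numpy as np
from scipy.spatial.distance import cdist

def neighborhood_graph(X, k=4):
    """Step 1: weighted K-nearest-neighbor graph G over the data points.

    Returns an (N, N) matrix W with W[i, j] = d_X(i, j) if j is among the k
    nearest neighbors of i (or i among those of j), and np.inf otherwise.
    """
    D = cdist(X, X)                      # input-space distances d_X(i, j)
    N = D.shape[0]
    W = np.full((N, N), np.inf)
    np.fill_diagonal(W, 0.0)
    for i in range(N):
        nn = np.argsort(D[i])[1:k + 1]   # k nearest neighbors (index 0 is i itself)
        W[i, nn] = D[i, nn]
    return np.minimum(W, W.T)            # symmetrize: keep an edge if either endpoint has it
```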

ISOMAP (Algorithm Description) - Step 2
- Estimate the geodesic distances between all pairs of points on the manifold by computing their shortest-path distances in the graph G.
- This can be done using Floyd's algorithm or Dijkstra's algorithm.
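A minimal sketch of Step 2, assuming the weight matrix W produced by the Step 1 sketch above; SciPy's shortest_path routine provides both of the algorithms mentioned on the slide:

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def geodesic_distances(W):
    """Step 2: estimate geodesic distances as shortest-path distances in the graph G.

    W is the (N, N) weight matrix from Step 1; np.inf entries mark missing edges,
    which scipy's csgraph routines treat as non-edges in a dense matrix.
    """
    # method='D' runs Dijkstra from every source; method='FW' would run Floyd-Warshall.
    return shortest_path(W, method='D', directed=False)
```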

ISOMAP (Algorithm Description) - Step 3
- Construct an embedding of the data in d-dimensional Euclidean space Y that best preserves the manifold's geometry.
- Minimize the cost function E = ||τ(D_G) − τ(D_Y)||_{L²}, where D_G contains the graph (geodesic) distances, D_Y the Euclidean distances in the embedding Y, and τ(D) = −HSH/2 converts distances to inner products (S_ij = D_ij², H = I − (1/N)11ᵀ is the centering matrix).
- Solution: take the top d eigenvectors of the matrix τ(D_G); scaled by the square roots of their eigenvalues, these give the embedding coordinates.
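A minimal sketch of Step 3, applying the τ operator (double centering of the squared geodesic distances) and keeping the top d eigenvectors; it assumes the D_G matrix returned by the Step 2 sketch:

```python
import numpy as np

def isomap_embedding(D_G, d=2):
    """Step 3: embed the data in d-dimensional Euclidean space.

    Applies tau(D) = -H S H / 2 with S_ij = D_ij^2 and H the centering matrix,
    then uses the top-d eigenvectors of tau(D_G), scaled by sqrt(eigenvalue),
    as the embedding coordinates Y.
    """
    N = D_G.shape[0]
    H = np.eye(N) - np.ones((N, N)) / N      # centering matrix H = I - (1/N) 1 1^T
    tau = -0.5 * H @ (D_G ** 2) @ H          # convert distances to inner products
    eigvals, eigvecs = np.linalg.eigh(tau)   # eigenvalues in ascending order
    idx = np.argsort(eigvals)[::-1][:d]      # indices of the top-d eigenvalues
    return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0.0))
```

Chaining the three sketches, isomap_embedding(geodesic_distances(neighborhood_graph(X, k=10)), d=2) produces the same kind of embedding as the off-the-shelf call shown earlier.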

Experimental Results
- Face images: the recovered dimensions correspond to face pose and illumination.
- Handwritten "2"s: the recovered dimensions correspond to the bottom loop and the top arch.
- In the accompanying plots, MDS is shown as open triangles and Isomap as filled circles.

Discussion
- Isomap handles nonlinear manifolds.
- Isomap keeps the advantages of PCA and MDS:
  - non-iterative procedure;
  - polynomial-time procedure;
  - guaranteed convergence.
- Isomap represents the global structure of a data set within a single coordinate system.