Manifold Learning: Dimensionality Reduction

Outline
- Introduction
  - Dimensionality reduction
  - Manifolds
- Isomap
  - Overall procedure
  - Approximating geodesic distances
  - Dijkstra's algorithm
- Reference

Introduction (dim. reduction) Dimensionality reduction methods
- Linear: PCA, MDS
- Non-linear: Isomap (2000), LLE (2000), SDE (2005)

Introduction (dim. reduction) Principal Component Analysis: center the data $x_i$, form the sample covariance $\Sigma = \frac{1}{n}\sum_{i=1}^{n}(x_i-\bar{x})(x_i-\bar{x})^{\top}$, and project onto the eigenvectors of $\Sigma$ with the largest eigenvalues.
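As a concrete illustration, here is a minimal PCA sketch in Python (NumPy assumed; the function name pca and the n_components parameter are illustrative, not from the slides):

```python
import numpy as np

def pca(X, n_components=2):
    """Minimal PCA sketch: project centered data onto the top covariance eigenvectors."""
    Xc = X - X.mean(axis=0)                      # center the data
    cov = Xc.T @ Xc / len(X)                     # sample covariance matrix (Sigma)
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return Xc @ top                              # coordinates along the principal axes
```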

Introduction (dim. reduction) Multidimensional Scaling: the input is only a matrix of pairwise distances, e.g. flight distances among Chicago, Raleigh, Boston, Seattle, S.F., Austin, and Orlando (zeros on the diagonal, Chicago to Raleigh = 641). MDS finds low-dimensional coordinates for the cities whose mutual distances reproduce the given matrix.
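A minimal sketch of classical MDS, assuming NumPy; the helper name classical_mds is illustrative, and the usage example uses placeholder random points rather than the slide's mileage table, which did not survive:

```python
import numpy as np

def classical_mds(D, n_components=2):
    """Recover coordinates from a matrix of pairwise distances (classical MDS)."""
    n = len(D)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * H @ (D ** 2) @ H                  # double-centered squared distances
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:n_components]
    return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0))

# Usage with placeholder data: random planar "cities" stand in for the slide's
# mileage table; the recovered layout matches the originals up to rotation/reflection.
rng = np.random.default_rng(0)
cities = rng.uniform(size=(7, 2))
D = np.linalg.norm(cities[:, None] - cities[None, :], axis=-1)
coords = classical_mds(D)
```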

Introduction (dim. reduction)

Introduction (manifold) Linear methods do nothing more than globally transform (rotate, translate, rescale) the data. Sometimes the data must first be "unwrapped" from a curved manifold, which PCA cannot do.

Introduction (dim. reduction) The task of dimensionality reduction is to find a small number of features that capture the structure of data observed in a large number of dimensions.

Introduction (manifold)

Isomap (overall procedure)
1. For each point, find its neighbors (the k nearest points, or all points within a radius).
2. Weight each neighborhood edge by the Euclidean distance between its endpoints.
3. Use Dijkstra's algorithm to compute shortest-path distances in this graph between every pair of points, including non-neighboring ones.
4. Run classical MDS on the resulting distance matrix.
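The four steps map almost line-for-line onto code. Below is a minimal, self-contained sketch assuming NumPy and SciPy; the function name isomap and the parameters n_neighbors and n_components are illustrative rather than from the slides:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import shortest_path

def isomap(X, n_neighbors=7, n_components=2):
    """Minimal Isomap sketch: k-NN graph -> shortest paths -> classical MDS."""
    n = X.shape[0]
    D = squareform(pdist(X))                     # pairwise Euclidean distances

    # Steps 1-2: keep only edges between each point and its k nearest neighbors.
    G = np.full((n, n), np.inf)
    for i in range(n):
        nbrs = np.argsort(D[i])[1:n_neighbors + 1]
        G[i, nbrs] = D[i, nbrs]
        G[nbrs, i] = D[nbrs, i]                  # keep the graph symmetric

    # Step 3: approximate geodesic distances by shortest paths in the graph
    # (assumes the neighborhood graph is connected).
    D_geo = shortest_path(G, method="D", directed=False)   # "D" = Dijkstra

    # Step 4: classical MDS on the geodesic distance matrix.
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * H @ (D_geo ** 2) @ H              # double-centered squared distances
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:n_components]
    return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0))
```

Swapping the k-nearest-neighbor rule for an ε-ball rule would change only the graph-construction loop.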

Isomap (Approximating geodesic dist.) For nearby points, the Euclidean distance is a good approximation of the geodesic distance along the manifold. For faraway points, the geodesic distance is approximated by chaining short hops between neighboring points, i.e. by the shortest path through the neighborhood graph. With sufficiently dense sampling, the resulting graph distance $d_G(i,j)$ is not much bigger than the true geodesic distance $d_M(i,j)$ (roughly, $d_G(i,j) \le (1+\epsilon)\, d_M(i,j)$).
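A small numeric illustration of this claim, assuming SciPy is available (the half-circle data is a hypothetical example, not from the slides): points sampled along a half circle have an endpoint-to-endpoint Euclidean distance of 2 (the chord), while the shortest path through the 2-nearest-neighbor graph approximates the true geodesic length π.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

# 100 points sampled along a half circle of radius 1 (hypothetical data).
t = np.linspace(0, np.pi, 100)
X = np.c_[np.cos(t), np.sin(t)]

# Straight-line (Euclidean) distance between the two endpoints: the chord, length 2.
d_euclid = np.linalg.norm(X[0] - X[-1])

# Graph distance: connect each point to its 2 nearest neighbors and sum short hops.
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
threshold = np.sort(D, axis=1)[:, [2]]           # distance to the 2nd-nearest neighbor
G = np.where(D <= threshold, D, np.inf)          # drop all longer edges
d_graph = shortest_path(G, directed=False)[0, -1]

print(d_euclid, d_graph)   # ~2.0 (chord) vs. ~3.14 (close to the geodesic length pi)
```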

Isomap (Dijkstra's Algorithm) Dijkstra's algorithm is a greedy algorithm (a weighted generalization of breadth-first search) that computes the shortest paths from one source point to all other points in the graph, repeatedly settling the closest unvisited point and relaxing its outgoing edges.
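A minimal sketch of Dijkstra's algorithm in Python using a binary heap; the adjacency-list format and the toy graph below are illustrative assumptions, not from the slides:

```python
import heapq

def dijkstra(adj, source):
    """Shortest-path distances from `source` to every reachable node.

    `adj` maps each node to a list of (neighbor, edge_weight) pairs.
    """
    dist = {source: 0.0}
    heap = [(0.0, source)]                       # (distance found so far, node)
    while heap:
        d, u = heapq.heappop(heap)               # greedily settle the closest node
        if d > dist.get(u, float("inf")):
            continue                             # stale heap entry, already settled
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd                     # relax the edge u -> v
                heapq.heappush(heap, (nd, v))
    return dist

# Toy weighted graph (illustrative only):
adj = {"a": [("b", 1.0), ("c", 4.0)], "b": [("c", 2.0)], "c": []}
print(dijkstra(adj, "a"))                        # {'a': 0.0, 'b': 1.0, 'c': 3.0}
```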

Isomap

Reference Tenenbaum, J. B., de Silva, V., & Langford, J. C. (2000). A Global Geometric Framework for Nonlinear Dimensionality Reduction. Science, 290(5500), 2319-2323.