Nonlinear Dimension Reduction


Outline
- Nonlinear dimension reduction: brief introduction
- Isomap
- Locally Linear Embedding (LLE)
- Laplacian eigenmaps

Motivations
- In computer vision, one can easily create large image datasets.
- Such datasets cannot be described effectively with a linear model.

Limitations of Linear Models (figure slide)

Problem Statement
- Assume we have a smooth low dimensional manifold in a high dimensional space.
- We have samples on the manifold in the high dimensional space.
- We want to discover the low dimensional structure intrinsic to the manifold.

Approaches
- Isomap: Tenenbaum, de Silva, Langford (2000)
- Locally Linear Embedding (LLE): Roweis, Saul (2000)
- Laplacian Eigenmaps: Belkin, Niyogi (2002)
- Hessian Eigenmaps (HLLE): Grimes, Donoho (2003)
- Local Tangent Space Alignment (LTSA): Zhang, Zha (2003)
- Semidefinite Embedding (SDE): Weinberger, Saul (2004)

Neighborhoods
Two ways to select neighboring objects:
- k nearest neighbors (k-NN): every point gets exactly k neighbors, but the neighborhood radius can be non-uniform across the dataset.
- ε-ball: prior knowledge of the data is needed to choose a reasonable ε, and the number of neighbors in a ball can vary.
These two rules parallel k_n-nearest-neighbor and Parzen-window density estimation, respectively. A sketch of both rules follows below.
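A minimal NumPy sketch of the two selection rules (a sketch only; the function names and the points-as-rows data layout are assumptions, not from the slides):

```python
import numpy as np

def knn_neighbors(X, k):
    """Indices of the k nearest neighbors of each row of X (n x p)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # n x n pairwise distances
    np.fill_diagonal(D, np.inf)                                # a point is not its own neighbor
    return np.argsort(D, axis=1)[:, :k]                        # n x k index array

def eps_ball_neighbors(X, eps):
    """Indices of all points within distance eps of each row of X; sizes may vary."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)
    return [np.where(row <= eps)[0] for row in D]
```

Note that knn_neighbors always returns exactly k neighbors per point, while the lists from eps_ball_neighbors can have very different lengths; that is exactly the trade-off described above.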

Isomap
- Only geodesic distances reflect the true low dimensional geometry of the manifold.
- Geodesic distances are hard to compute, even if you know the manifold.
- In a small neighborhood, Euclidean distance is a good approximation of the geodesic distance.
- For faraway points, the geodesic distance is approximated by adding up a sequence of "short hops" between neighboring points.

Geodesic Distance
- Euclidean distance need not be a good measure between two points on a manifold.
- The length of the geodesic is more appropriate.
- Example: the Swiss roll (figure from the LLE paper).

Isomap Algorithm
1. Find the neighborhood of each object by computing distances between all pairs of points and selecting the closest (k-NN or ε-ball).
2. Build a graph with a node for each object and an edge between neighboring points, using the Euclidean distance between the two objects as the edge weight.
3. Run a shortest-path graph algorithm (e.g., Dijkstra's) to fill in the distances between all non-neighboring points.
4. Apply classical MDS to this distance matrix: the squared distances are double centered, and the top eigenvectors give the embedding. A sketch follows below.
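A NumPy/SciPy sketch of these four steps (assumptions: points stored as rows, a k-NN neighborhood rule, and a connected neighborhood graph):

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def isomap(X, k=10, q=2):
    """Minimal Isomap sketch: k-NN graph -> geodesic distances -> classical MDS."""
    n = X.shape[0]
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    # Steps 1-2: keep only edges to each point's k nearest neighbors.
    G = np.full((n, n), np.inf)
    nbrs = np.argsort(D, axis=1)[:, 1:k + 1]
    for i in range(n):
        G[i, nbrs[i]] = D[i, nbrs[i]]
    G = np.minimum(G, G.T)                        # symmetrize the graph
    # Step 3: geodesic distances as shortest paths through the graph (Dijkstra).
    DG = shortest_path(G, method="D", directed=False)
    # Step 4: classical MDS, i.e. double-center the squared distances and
    # take the top q eigenvectors scaled by square roots of the eigenvalues.
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (DG ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    top = np.argsort(vals)[::-1][:q]
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0))
```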

Isomap Examples
Figure slides: Isomap applied to face images, hand images, and handwritten "2"s.

Locally Linear Embedding (LLE)
- Isomap attempts to preserve geometry on all scales, mapping nearby points close to each other and distant points far from each other.
- LLE attempts to preserve only the local geometry of the data, mapping nearby points on the manifold to nearby points in the low dimensional space.

LLE: General Idea
- Locally, on a fine enough scale, everything looks linear.
- Represent each object as a linear combination of its neighbors.
- With the weights constrained to sum to one, the representation is invariant to rotations, rescalings, and translations of the neighborhood.
- Assumption: the same linear representation will hold in the low dimensional space.

LLE: Matrix Representation
Let X be the p x n matrix of original data (one point per column) and W the n x n weight matrix, with W_ij = 0 if x_j is not a neighbor of x_i and each row of W summing to one. The reconstruction constraint is x_i ≈ Σ_j W_ij x_j for every i, and W is fit by minimizing the reconstruction error
  ε(W) = Σ_i || x_i − Σ_j W_ij x_j ||².
We then solve the analogous system y_i ≈ Σ_j W_ij y_j, where Y is the q x n matrix of the underlying low dimensional coordinates. A sketch of the weight solve follows below.
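A NumPy sketch of the weight solve, one point at a time (points as rows here; the regularization term is a standard practical addition, not from the slides):

```python
import numpy as np

def reconstruction_weights(X, neighbors, reg=1e-3):
    """Solve for W: each point as an affine combination of its k neighbors.
    X is n x p (points as rows); neighbors is the n x k index array from k-NN."""
    n, k = X.shape[0], neighbors.shape[1]
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[neighbors[i]] - X[i]              # shift the neighborhood to the origin
        C = Z @ Z.T                             # local k x k Gram matrix
        C += reg * np.trace(C) * np.eye(k)      # regularize in case C is singular (k > p)
        w = np.linalg.solve(C, np.ones(k))      # solve C w = 1
        W[i, neighbors[i]] = w / w.sum()        # enforce the sum-to-one constraint
    return W
```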

LLE: Algorithm
1. Find the k nearest neighbors of each point in X space.
2. Solve for the reconstruction weights W.
3. Compute the embedding coordinates Y from W: form the sparse matrix M = (I − W)ᵀ(I − W), compute the bottom q + 1 eigenvectors of M, discard the bottom (constant) eigenvector, and set the i-th row of Y to the (i + 1)-th smallest eigenvector. See the sketch below.
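The eigenvector step, continuing the sketch above (dense matrices for clarity; a real implementation would use a sparse eigensolver for M):

```python
import numpy as np

def lle_embedding(W, q=2):
    """Rows of the returned n x q matrix are the embedding coordinates."""
    n = W.shape[0]
    I = np.eye(n)
    M = (I - W).T @ (I - W)
    vals, vecs = np.linalg.eigh(M)   # eigenvalues in ascending order
    # Discard the bottom (constant) eigenvector, keep the next q.
    return vecs[:, 1:q + 1]
```

Usage, assuming knn_neighbors from the neighborhood sketch earlier: Y = lle_embedding(reconstruction_weights(X, knn_neighbors(X, k=10)), q=2).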

LLE Examples
Figure slides: LLE embeddings, the effect of neighborhood size, and LLE on face images.

Extended Isomap for Classification
- The idea is to use the geodesic distances to the training instances as a feature vector.
- Then reduce the dimension of these feature vectors with Fisher linear discriminant analysis. A sketch follows below.
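A sketch of the idea only: DG is assumed to be the n x n geodesic distance matrix among training points from the Isomap sketch (steps 1-3), and scikit-learn's LinearDiscriminantAnalysis stands in for the Fisher linear discriminant step. Classifying a new point would additionally require its geodesic distances to the training set, which this sketch omits.

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def extended_isomap_fit(DG, labels):
    """Row i of DG is the geodesic-distance feature vector of training instance i."""
    return LinearDiscriminantAnalysis().fit(DG, labels)
```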


Laplacian Eigenmaps
- First construct an adjacency graph, as in LLE and Isomap.
- Assign edge weights with the heat kernel: W_ij = exp(−||x_i − x_j||² / t) if i and j are connected, and W_ij = 0 otherwise.

Laplacian Eigenmaps
- Compute eigenvalues and eigenvectors of the generalized eigenproblem L f = λ D f, where D_ii = Σ_j W_ij and L = D − W is the graph Laplacian.
- The bottom nonzero eigenvectors minimize Σ_ij W_ij ||y_i − y_j||², keeping strongly connected points close together in the embedding. A sketch follows below.
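A sketch combining the two slides above (heat-kernel weights on a k-NN graph, then the generalized eigenproblem; the parameters k and t are illustrative assumptions):

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmap(X, k=10, t=1.0, q=2):
    """Minimal Laplacian eigenmap sketch; X is n x p with points as rows."""
    n = X.shape[0]
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    nbrs = np.argsort(dist, axis=1)[:, 1:k + 1]
    W = np.zeros((n, n))
    for i in range(n):
        W[i, nbrs[i]] = np.exp(-dist[i, nbrs[i]] ** 2 / t)  # heat-kernel weights
    W = np.maximum(W, W.T)                                  # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                                               # graph Laplacian
    vals, vecs = eigh(L, D)                                 # solve L f = lambda D f
    return vecs[:, 1:q + 1]                                 # skip the constant eigenvector
```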

Laplacian Eigenmaps (figure slides)