Spectral Methods Tutorial 6 © Maks Ovsjanikov


Numerical geometry of non-rigid shapes: Spectral Methods Tutorial. © Maks Ovsjanikov, Stanford University, Winter 2009. tosca.cs.technion.ac.il/book

Outline
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001: classic MDS and PCA review; metric MDS; kernel PCA, the kernel trick, and relation to metric MDS; summary.
Articulated Shape Matching by Robust Alignment of Embedded Representations, Mateus D. et al., Workshop on 3DRR, 2007.
Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation, Rustamov R., SGP, 2007.

Classic MDS (classical scaling) recap. (Williams C., On a Connection between Kernel PCA and Metric Multidimensional Scaling, Advances in Neural Information Proc. Sys., 2001.) Given a dissimilarity matrix D with entries d_ij = ||x_i - x_j|| arising from a normed vector space, we want to find coordinates x_1, ..., x_n that would give rise to D. E.g., given the pairwise distances between cities on a map, find their locations. We can only hope to recover the points up to rotation, translation (and reflection).

Classic MDS (classical scaling). Centering matrix H = I - (1/n) 1 1^T. Define B = -1/2 H D H, where D_ij = d_ij^2. Attention: this only works for normed vector spaces!

Classic MDS (classical scaling). Define B = -1/2 H D H. Expressing B = (H X)(H X)^T shows that B is the Gram matrix of the centered points. Note that if B = Y Y^T, then also B = (Y R)(Y R)^T for any orthonormal R, so the points can only be recovered up to an orthogonal transform. Since B is symmetric, we can find its eigendecomposition B = V Lambda V^T and take Y = V Lambda^(1/2).
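
To make the recipe above concrete, here is a minimal classical-MDS sketch in Python/NumPy; it is my own illustration, not code from the paper, and the function name, the variable `dist`, and the toy three-point example are all assumptions.

```python
import numpy as np

def classical_mds(dist, k=2):
    n = dist.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n            # centering matrix H = I - (1/n) 1 1^T
    B = -0.5 * H @ (dist ** 2) @ H                  # B = -1/2 H D H with D_ij = d_ij^2
    evals, evecs = np.linalg.eigh(B)                # B is symmetric, so eigh applies
    order = np.argsort(evals)[::-1][:k]             # keep the k largest eigenvalues
    evals, evecs = evals[order], evecs[:, order]
    return evecs * np.sqrt(np.maximum(evals, 0.0))  # coordinates V_k Lambda_k^(1/2)

# Toy example: pairwise distances between three collinear points at 0, 3, and 5.
dist = np.array([[0.0, 3.0, 5.0],
                 [3.0, 0.0, 2.0],
                 [5.0, 2.0, 0.0]])
print(classical_mds(dist, k=1))                     # recovers the line up to shift/flip
```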

Classic MDS (classical scaling). (Mardia K.V. et al., Multivariate Analysis, Academic Press, 1979.) Although B is an n x n matrix, it has only d non-zero eigenvalues if X was sampled from R^d. We can project onto the first k eigenvectors by taking X_k = V_k Lambda_k^(1/2).

Optimality condition of classic MDS. Theorem: Let X be a set of points in R^d with Euclidean distances d_ij = ||x_i - x_j||. Among all k-dimensional orthonormal projections, the distortion between the original distances and the distances of the projected points is minimized when X is projected onto its first k principal directions.

Classic MDS – Relation to PCA. In standard Principal Component Analysis, one performs an eigendecomposition of the covariance matrix of the data, trying to find a more natural basis in which to express the points.

Classic MDS – Relation to PCA. Using the centering matrix, we can express the (unnormalized) covariance matrix as (H X)^T (H X), while the MDS matrix is B = (H X)(H X)^T. For any eigenvector v of (H X)^T (H X) with eigenvalue lambda, we have (H X)(H X)^T (H X) v = lambda (H X) v, which implies that (H X) v is an eigenvector of B with the same eigenvalue. The non-zero eigenvalues of the two matrices are therefore the same, and the eigenvectors are related by multiplication with H X.

Classic MDS – Relation to PCA. The covariance matrix has the advantage that its size is d x d rather than n x n, and it is (generically) positive definite rather than positive semi-definite, so its eigendecomposition is more stable. However, if we are only given pairwise distances, we cannot construct the covariance matrix directly. PCA and classical MDS solve different problems!
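
As a sanity check of this relationship, the following small NumPy experiment (my own sketch, not from the paper) compares PCA of centered points with classical MDS on their Gram matrix; the random data, dimensions, and tolerance are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                          # n = 50 points in R^3
Xc = X - X.mean(axis=0)                               # center the data

# PCA: eigendecomposition of the small d x d covariance matrix (here 3 x 3).
cov = Xc.T @ Xc / len(X)
w, V = np.linalg.eigh(cov)
pca_coords = Xc @ V[:, ::-1][:, :2]                   # top-2 principal components

# Classical MDS: eigendecomposition of the n x n Gram matrix B = Xc Xc^T.
B = Xc @ Xc.T
lam, U = np.linalg.eigh(B)
mds_coords = U[:, ::-1][:, :2] * np.sqrt(lam[::-1][:2])

# Same embedding up to a per-axis sign flip.
print(np.allclose(np.abs(pca_coords), np.abs(mds_coords), atol=1e-8))
```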

Metric MDS. Suppose that instead of minimizing the usual distortion (stress), we want to minimize a derived stress: given pairwise distances d_ij and a function f, find points y_1, ..., y_n minimizing sum_{i<j} ( f(d_ij) - ||y_i - y_j|| )^2. Even if the d_ij come from a Euclidean space, this problem is much more difficult, and one must resort to numerical optimization: differentiate the stress with respect to the y_i to get the gradient. Alternative: perform classical MDS on the derived distances f(d_ij), which again reduces to an eigensystem. Problem: the resulting matrix B is no longer guaranteed to be positive semi-definite. (Critchley F., Multidimensional Scaling: a short critique and a new algorithm, COMPSTAT, 1978.)
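
A rough gradient-descent sketch of this numerical optimization is shown below, assuming the matrix `delta` already contains the derived target distances f(d_ij); the function name, learning rate, and iteration count are illustrative assumptions, and practical solvers (e.g. SMACOF) converge much faster.

```python
import numpy as np

def metric_mds(delta, dim=2, iters=500, lr=0.01, seed=0):
    n = delta.shape[0]
    Y = np.random.default_rng(seed).normal(size=(n, dim))   # random initial layout
    for _ in range(iters):
        diff = Y[:, None, :] - Y[None, :, :]                 # pairwise displacement vectors
        d = np.linalg.norm(diff, axis=-1)                    # current pairwise distances
        np.fill_diagonal(d, 1.0)                             # avoid division by zero
        coef = (d - delta) / d                               # residual per pair, scaled by 1/d
        np.fill_diagonal(coef, 0.0)
        grad = 2 * (coef[:, :, None] * diff).sum(axis=1)     # (proportional to) the stress gradient
        Y -= lr * grad
    return Y
```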

Kernel PCA. Basic idea: represent a point x by its image phi(x) in a feature space F. The domains can be completely different! Kernel trick: in many applications we do not need to know phi explicitly; we only need inner products, which can be computed through the kernel k(x_i, x_j) = <phi(x_i), phi(x_j)> whenever the kernel can be evaluated efficiently (the feature space F can even be infinite dimensional).

Kernel PCA. We could do PCA in the feature space: compute the covariance matrix of the feature vectors phi(x_i) and perform its eigendecomposition. However, instead of that covariance matrix, we can use the n x n kernel matrix K with K_ij = k(x_i, x_j); if the dimension of the feature vectors is larger than n, this is more efficient! To center the data, so that the feature vectors have zero mean, we can use the centering matrix and find the eigenvalues of H K H. (Schölkopf B. et al., Nonlinear component analysis as a kernel eigenvalue problem, Neural Computation, 1998.)
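
The following is a minimal kernel-PCA sketch along these lines, assuming an RBF kernel and NumPy; the kernel choice, the parameter `gamma`, and the function name are my own illustrative assumptions rather than the paper's setup.

```python
import numpy as np

def kernel_pca(X, k=2, gamma=1.0):
    n = X.shape[0]
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)    # squared pairwise distances
    K = np.exp(-gamma * sq)                                  # RBF kernel matrix K_ij = k(x_i, x_j)
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H                                           # center the (implicit) feature vectors
    evals, evecs = np.linalg.eigh(Kc)
    order = np.argsort(evals)[::-1][:k]                      # top-k components
    evals, evecs = evals[order], evecs[:, order]
    return evecs * np.sqrt(np.maximum(evals, 0.0))           # projections onto the top-k components
```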

Kernel PCA and Metric MDS. Spherical (isotropic) kernel: the kernel depends only on the distance between points, k(x_i, x_j) = r(d_ij) with d_ij = ||x_i - x_j||. If we assume that r(0) = 1, i.e. k(x, x) = 1, then the squared distance in the feature space is ||phi(x_i) - phi(x_j)||^2 = 2 (1 - r(d_ij)).

Kernel PCA and Metric MDS. Suppose we are given the matrix of pairwise feature-space distances. If we set A_ij = -1/2 ||phi(x_i) - phi(x_j)||^2 = r(d_ij) - 1, then in matrix form A = K - 1 1^T, and moreover H A H = H K H. Thus, performing classical MDS on the feature-space distances is equivalent to performing it on K, i.e. to kernel PCA. Classical MDS on these distances attempts to approximate ||phi(x_i) - phi(x_j)|| = sqrt(2 (1 - r(d_ij))), which is a nonlinear function of the original distance. So classical MDS in the feature space is metric MDS on the original distances.
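
A quick numerical check of this equivalence (my own sketch, not code from the paper): with an isotropic kernel satisfying k(x, x) = 1, double-centering the feature-space squared distances gives exactly the centered kernel matrix. The random data and the RBF kernel are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 3))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq)                        # RBF kernel, so K_ii = 1

D2_feat = 2.0 * (1.0 - K)                    # squared distances in the feature space
n = K.shape[0]
H = np.eye(n) - np.ones((n, n)) / n
print(np.allclose(H @ (-0.5 * D2_feat) @ H, H @ K @ H))   # True: same centered matrix
```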

Kernel PCA and Metric MDS. Since H A H = H K H, the centered matrix is positive semi-definite whenever the kernel is chosen appropriately; this is not the case for arbitrary metric MDS functions. An advantage of kernel PCA is that a new point can be quickly projected onto the pre-computed basis, which is difficult with numerical optimization.

Summary: If the distance matrix comes from points in a normed vector space, MDS reduces to an eigenvalue problem (classical scaling). Classical MDS is also closely related to PCA, which computes the optimal basis when the point positions are known. Kernel PCA transforms the points into a feature space and uses the kernel trick to compute PCA in that space. Metric MDS approximates derived distances f(d_ij) for some given function f. If the kernel is spherical, then kernel PCA is a special case of metric MDS, for the function f(d) = sqrt(2 (1 - r(d))).

Outline
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001: classic MDS and PCA review; metric MDS; kernel PCA, the kernel trick, and relation to metric MDS; summary.
Articulated Shape Matching by Robust Alignment of Embedded Representations, Mateus D. et al., Workshop on 3DRR, 2007.
Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation, Rustamov R., SGP, 2007.

Articulated Shape Matching by Robust Alignment of Embedded Representations, Mateus D. et al., Workshop on 3DRR, 2007. Problem: given two articulated shapes in different poses, find point correspondences. There are many degrees of freedom, so rigid alignment cannot be applied directly. (Images by Q.-X. Huang et al. 08.)

Approach: Embed each shape into a feature space defined by the Laplacian. The embedding is isometry invariant: it is unchanged under any isometric deformation of the shape. However, the embedding is only defined up to a rigid transform in the feature space, so the method searches for the optimal rigid transform in the feature space in order to find the correspondences.

Approach: The shape is given as a point cloud. Approximate the Laplacian with a weighted graph: W_ij measures the affinity between nearby points, D = diag(sum_j W_ij), L = D - W. Solve the generalized eigenvalue problem L v = lambda D v and keep the most significant eigenvalues/eigenvectors. For each data point x_j, let its embedding be Phi(x_j) = (v_1(j), ..., v_k(j)), where v_i is the i-th eigenvector of the generalized problem.
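
A compact sketch of this embedding step, assuming a dense Gaussian-weighted graph built with NumPy/SciPy; the paper's actual neighborhood and weighting choices may differ, so treat `sigma` and the dense affinity matrix as illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_embedding(P, k=5, sigma=0.1):
    sq = ((P[:, None, :] - P[None, :, :]) ** 2).sum(-1)    # squared pairwise distances
    W = np.exp(-sq / (2 * sigma ** 2))                      # Gaussian affinities
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))                              # degree matrix
    L = D - W                                               # graph Laplacian
    # Generalized eigenproblem L v = lambda D v; the first eigenvector is constant.
    evals, evecs = eigh(L, D)
    return evecs[:, 1:k + 1]                                # row j = embedding of point j
```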

Approach: We would like Phi(x) = Phi'(y) for corresponding points x and y on the two shapes. However, each eigenvector is only defined up to a sign (a reflection in the feature space). Moreover, if two eigenvectors correspond to the same eigenvalue, then any linear combination of them is also an eigenvector (a rotation in the feature space). Points from the two embedded point sets can therefore be aligned using Phi'(y) ~ R Phi(x), where R is an orthogonal matrix.

Approach: Given point correspondences, it is easy to obtain the optimal orthogonal matrix with the SVD approach from optimal rigid alignment. Let M = sum_i Phi(x_i) Phi'(y_i)^T, the cross-covariance of the corresponding embedded points, and compute its singular value decomposition M = U S V^T; the optimal solution is R = V U^T. With this step, one can perform ICP in the feature space to find the optimal correspondences.
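
Below is a small sketch of this SVD step (standard orthogonal Procrustes, as used in rigid alignment), assuming matrices `A` and `B` hold the embeddings of already-matched points as rows; it is the textbook closed form, not the paper's exact code. Because the result is allowed to be a general orthogonal matrix, it also absorbs the eigenvector sign flips.

```python
import numpy as np

def optimal_orthogonal(A, B):
    M = A.T @ B                          # cross-covariance of corresponding embeddings
    U, _, Vt = np.linalg.svd(M)
    return U @ Vt                        # orthogonal R minimizing ||A R - B||_F

# An ICP-style loop in the feature space would alternate two steps:
# (1) match each row of A @ R to its nearest neighbor in B,
# (2) re-estimate R with optimal_orthogonal on the matched rows.
```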

Results.

Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation, Rustamov R., SGP, 2007. Main goal: find a good, isometry-invariant shape descriptor. Here, good means efficient, easily computable, and insensitive to local topology changes (unlike MDS-based embeddings).

Main idea: For every point p on the surface, define a Global Point Signature GPS(p) = ( phi_1(p)/sqrt(lambda_1), phi_2(p)/sqrt(lambda_2), phi_3(p)/sqrt(lambda_3), ... ), where phi_i is the i-th eigenfunction of the Laplace-Beltrami operator with eigenvalue lambda_i. GPS is a mapping of the surface into an infinite-dimensional space; each point gets a signature.
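
A short sketch of the GPS computation, assuming the eigenvalues `lam` (ascending, with lam[0] = 0 for the constant eigenfunction) and eigenvectors `phi` of a discrete Laplace-Beltrami operator have already been computed by other means; the truncation to k components and the function name are illustrative assumptions.

```python
import numpy as np

def gps_signature(lam, phi, k=15):
    # Skip the zero eigenvalue and scale each eigenfunction by 1 / sqrt(lambda_i),
    # so gps[j] = (phi_1(j)/sqrt(lambda_1), ..., phi_k(j)/sqrt(lambda_k)).
    return phi[:, 1:k + 1] / np.sqrt(lam[1:k + 1])
```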

Properties of GPS: If GPS(p) = GPS(q), then p = q, so the embedding has no self-intersections. GPS is isometry invariant (since the Laplace-Beltrami operator is). Given all eigenfunctions and eigenvalues, one can recover the shape up to isometry (not true if only the eigenvalues are known). Euclidean distances in the GPS embedding are meaningful: k-means clustering on the embedding provides a segmentation.

Comparing GPS: Given a shape, determine its GPS embedding. Construct a histogram of pairwise GPS distances (note that although GPS is defined only up to sign flips of the eigenfunctions, these distances are preserved). For any two shapes, compute the norm of the difference between their histograms. For refined comparisons, use more than one histogram.
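
A possible implementation sketch of this comparison, assuming GPS embeddings `gps_a` and `gps_b` are available; the bin count, the fixed distance range, and the use of the L1 norm are my own illustrative choices, and the paper's exact settings may differ.

```python
import numpy as np
from scipy.spatial.distance import pdist

def gps_histogram(gps, bins=64, r=2.0):
    d = pdist(gps)                                           # all pairwise GPS distances
    # A shared, fixed range keeps histograms of different shapes comparable;
    # r should be large enough to cover the typical GPS distances.
    h, _ = np.histogram(d, bins=bins, range=(0.0, r), density=True)
    return h

def shape_distance(gps_a, gps_b):
    return np.abs(gps_histogram(gps_a) - gps_histogram(gps_b)).sum()   # L1 difference
```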

Results.

Conclusions. Kernel methods attempt to embed the shape into a feature space that can be manipulated more easily. The Laplacian embedding is useful because of its isometry invariance and can be used for comparing non-rigid shapes under isometric deformations. Sign flips and repeated eigenvalues can cause difficulties (there is no canonical way to choose the eigenvectors). Limitations: the embeddings are not necessarily stable or mesh independent; they are difficult to compute for large meshes (millions of points); and neither their topological nor their geometric stability is well understood.