Numerical Geometry of Non-Rigid Shapes
Spectral Methods (Tutorial 6)
© Maks Ovsjanikov
tosca.cs.technion.ac.il/book
Stanford University, Winter 2009

Outline

On a Connection between Kernel PCA and Metric Multidimensional Scaling
Williams C., Advances in Neural Information Processing Systems, 2001
1. Classic MDS and PCA review.
2. Metric MDS.
3. Kernel PCA, the kernel trick, and the relation to metric MDS.
4. Summary.

Articulated Shape Matching by Robust Alignment of Embedded Representations
Mateus D. et al., Workshop on 3DRR, 2007

Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation
Rustamov R., SGP, 2007

On a Connection between Kernel PCA and Metric Multidimensional Scaling
Williams C., Advances in Neural Information Processing Systems, 2001

Classic MDS (classical scaling) recap

1. We are given a dissimilarity matrix arising from points in a normed vector space: d_ij = ||x_i - x_j||.
2. We want to find coordinates of points that would give rise to these dissimilarities. E.g., given pairwise distances between cities on a map, find the locations of the cities.
We can only hope to recover the points up to rotation and translation.

Classic MDS (classical scaling)

1. Centering matrix H = I - (1/n) 1 1^T.
2. Define B = H A H, where A_ij = -(1/2) d_ij^2.
Attention: this only works for normed vector spaces!

Classic MDS (classical scaling)

2. Define B = H A H, where A_ij = -(1/2) d_ij^2.
3. Expressing B in terms of the centered coordinates gives B = (H X)(H X)^T, i.e. B_ij = (x_i - x_bar)^T (x_j - x_bar). Note that if B = Y Y^T, then for any orthonormal R we also have B = (Y R)(Y R)^T, so the solution is only defined up to rotation.
4. Since B is symmetric (and positive semi-definite), we can find its eigendecomposition B = V Lambda V^T and take Y = V Lambda^{1/2}.
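As an illustration, here is a minimal numpy sketch of the classical-scaling recipe above. The matrix names H, A and B follow the slides; the function name and the random planar points at the end are purely illustrative.

    import numpy as np

    def classical_mds(D, k):
        """Classical MDS: recover k-dimensional coordinates from a distance matrix D."""
        n = D.shape[0]
        H = np.eye(n) - np.ones((n, n)) / n         # centering matrix H
        A = -0.5 * D**2                             # A_ij = -1/2 d_ij^2
        B = H @ A @ H                               # double-centered Gram matrix
        evals, evecs = np.linalg.eigh(B)            # eigenvalues in ascending order
        idx = np.argsort(evals)[::-1][:k]           # keep the k largest
        scale = np.sqrt(np.maximum(evals[idx], 0))  # clamp tiny negatives from round-off
        return evecs[:, idx] * scale                # Y = V_k Lambda_k^{1/2}

    # The distances of random planar points are reproduced up to a rigid motion.
    X = np.random.randn(10, 2)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Y = classical_mds(D, 2)
    D2 = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
    assert np.allclose(D, D2, atol=1e-6)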

Multivariate Analysis
Mardia K.V. et al., Academic Press, 1979

Classic MDS (classical scaling)

1. Although B is an n x n matrix, it has only p non-zero eigenvalues if the data was sampled from R^p.
2. We can project onto the first k eigenvectors by taking Y_k = V_k Lambda_k^{1/2}.

Optimality condition of classic MDS
Theorem: if X is a set of points in R^p with distances d_ij = ||x_i - x_j||, then among all k-dimensional orthonormal projections of X, the distortion sum_{i,j} (d_ij^2 - d'_ij^2), where d'_ij are the distances after projection, is minimized when X is projected onto its first k principal directions.

Classic MDS – Relation to PCA

1. In standard principal component analysis, one performs an eigendecomposition of the covariance matrix S = (1/n) sum_i (x_i - x_bar)(x_i - x_bar)^T = (1/n) (H X)^T (H X).
2. The goal is to find a more natural basis in which to express the points.

Classic MDS – Relation to PCA

1. In standard PCA, one performs an eigendecomposition of the covariance matrix S = (1/n) (H X)^T (H X).
2. Using the centering matrix, we can express B = (H X)(H X)^T.
3. For any eigenvalue lambda of B with eigenvector v we have (H X)(H X)^T v = lambda v, which implies (H X)^T (H X) [(H X)^T v] = lambda [(H X)^T v], i.e. S u = (lambda/n) u with u = (H X)^T v.
4. The non-zero eigenvalues of B and of n S are therefore the same, and the corresponding eigenvectors are related by u = (H X)^T v.
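This eigenvalue correspondence is easy to verify numerically. A small illustrative check (random data; the factor n appears because the covariance here is normalized by n):

    import numpy as np

    n, p = 8, 3
    X = np.random.randn(n, p)
    H = np.eye(n) - np.ones((n, n)) / n
    B = (H @ X) @ (H @ X).T                 # n x n Gram matrix of centered data
    S = (H @ X).T @ (H @ X) / n             # p x p covariance matrix

    eig_B = np.sort(np.linalg.eigvalsh(B))[::-1][:p]
    eig_S = np.sort(np.linalg.eigvalsh(S))[::-1]
    assert np.allclose(eig_B, n * eig_S)    # non-zero eigenvalues agree up to the factor n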

Classic MDS – Relation to PCA

1. The non-zero eigenvalues of B and n S are the same, and the eigenvectors are related by u = (H X)^T v.
2. S has the advantage that its size is p x p rather than n x n, and it is positive definite rather than positive semi-definite, so its eigendecomposition is more stable.
3. However, if we are only given pairwise distances, we cannot construct S directly. The two settings solve different problems!

Metric MDS

1. Suppose that instead of minimizing the distortion (stress), we want to minimize a derived stress: given pairwise distances d_ij, find a set of points y_i minimizing sum_{i<j} (f(d_ij) - ||y_i - y_j||)^2 for some given function f.
2. Even if the d_ij come from a Euclidean space, this problem is much more difficult.
3. One option is to resort to numerical optimization: differentiate with respect to the y_i to obtain the gradient (a sketch follows below).
4. Alternative: perform classical MDS on the derived distances f(d_ij), i.e. solve an eigensystem. Problem: the resulting matrix is no longer guaranteed to be positive semi-definite.

Critchley F., Multidimensional Scaling: a short critique and a new algorithm, COMPSTAT, 1978
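A bare-bones sketch of the gradient-descent option in item 3. The learning rate, iteration count and the default choice f(d) = d are illustrative assumptions; practical implementations use more robust schemes (e.g. SMACOF).

    import numpy as np

    def metric_mds(D, k, f=lambda d: d, iters=500, lr=0.01, seed=0):
        """Minimize the derived stress sum_{i<j} (f(d_ij) - ||y_i - y_j||)^2 by gradient descent."""
        rng = np.random.default_rng(seed)
        n = D.shape[0]
        T = f(D)                                    # target (derived) distances
        Y = rng.standard_normal((n, k))             # random initial configuration
        for _ in range(iters):
            diff = Y[:, None, :] - Y[None, :, :]    # pairwise differences y_i - y_j
            dist = np.linalg.norm(diff, axis=-1)
            np.fill_diagonal(dist, 1.0)             # avoid division by zero on the diagonal
            coef = (dist - T) / dist
            np.fill_diagonal(coef, 0.0)
            grad = 2.0 * (coef[:, :, None] * diff).sum(axis=1)
            Y -= lr * grad
        return Y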

Kernel PCA

1. Basic idea: represent a point x by its image phi(x) in a feature space F.
2. The two domains can be completely different!
3. Kernel trick: in many applications we do not need to know phi explicitly; we only need to operate with inner products k(x, y) = <phi(x), phi(y)>, provided the kernel can be computed efficiently (e.g. F can be infinite dimensional).

Kernel PCA
Schölkopf B. et al., Nonlinear component analysis as a kernel eigenvalue problem, Neural Computation, 1998

1. We could do PCA in the feature space: compute the covariance matrix of the feature vectors and perform its eigendecomposition.
2. However, instead of the covariance matrix, we can use the kernel (Gram) matrix K with K_ij = <phi(x_i), phi(x_j)> = k(x_i, x_j). If the dimension of the feature vectors is larger than n, this is more efficient (see the sketch below).
3. To center the data so that sum_i phi(x_i) = 0, we can use the centering matrix and find the eigenvalues of H K H.
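A compact sketch of this procedure. The Gaussian kernel and the parameter gamma are illustrative choices; the double centering H K H mirrors item 3.

    import numpy as np

    def kernel_pca(X, k, gamma=1.0):
        """Project the data onto the top-k kernel principal components (Gaussian kernel)."""
        n = X.shape[0]
        D2 = ((X[:, None, :] - X[None, :, :])**2).sum(-1)   # squared pairwise distances
        K = np.exp(-gamma * D2)                              # kernel (Gram) matrix
        H = np.eye(n) - np.ones((n, n)) / n
        Kc = H @ K @ H                                       # centered kernel matrix
        evals, evecs = np.linalg.eigh(Kc)
        idx = np.argsort(evals)[::-1][:k]
        # coordinates of the training points along the top-k kernel principal directions
        return evecs[:, idx] * np.sqrt(np.maximum(evals[idx], 0))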

Kernel PCA and Metric MDS

1. Spherical (isotropic) kernel: the kernel depends only on the distance between the points, k(x_i, x_j) = r(||x_i - x_j||) = r(d_ij).
2. If we assume that r(0) = 1, then the feature-space distances satisfy
||phi(x_i) - phi(x_j)||^2 = k(x_i, x_i) + k(x_j, x_j) - 2 k(x_i, x_j) = 2 (1 - r(d_ij)).

Kernel PCA and Metric MDS

1. Suppose we are given a matrix of pairwise distances d_ij.
2. If we set K_ij = r(d_ij), then A_ij = -(1/2) ||phi(x_i) - phi(x_j)||^2 = K_ij - 1. In matrix form A = K - 1 1^T, and moreover H A H = H K H.
3. Thus, performing classical MDS on the feature-space distances is equivalent to performing kernel PCA on K.
4. Classical MDS in the feature space attempts to approximate ||phi(x_i) - phi(x_j)|| = sqrt(2 (1 - r(d_ij))), which is a nonlinear function of the original distance. So classical MDS in the feature space is metric MDS on the d_ij with f(d) = sqrt(2 (1 - r(d))).
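A quick numerical check of the identity H A H = H K H from item 2. The Gaussian profile below is just one possible isotropic kernel with r(0) = 1, and the points are random.

    import numpy as np

    n = 6
    P = np.random.randn(n, 2)
    D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)   # pairwise distances d_ij
    r = lambda d: np.exp(-d**2)                                  # isotropic profile with r(0) = 1
    K = r(D)                                                     # K_ij = r(d_ij)
    A = K - np.ones((n, n))                                      # A_ij = K_ij - 1
    H = np.eye(n) - np.ones((n, n)) / n
    assert np.allclose(H @ A @ H, H @ K @ H)                     # centering removes the rank-one offset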

Kernel PCA and Metric MDS

1. Thus, performing classical MDS on the feature-space distances is equivalent to performing kernel PCA on K.
2. Classical MDS in the feature space attempts to approximate sqrt(2 (1 - r(d_ij))), a nonlinear function of distance; classical MDS in the feature space is therefore metric MDS on the original distances.
3. Since H A H = H K H, the matrix is positive semi-definite whenever the kernel is chosen appropriately. This is not the case for arbitrary metric MDS functions f.
4. An advantage of kernel PCA is that a new point can be quickly projected onto a pre-computed basis; this is difficult with numerical optimization.

Summary

1. If the distance matrix comes from points in a normed vector space, MDS reduces to an eigenvalue problem: classical scaling.
2. Classical MDS is also closely related to PCA, which computes the optimal basis when the positions themselves are known.
3. Kernel PCA maps the points to a feature space and uses the kernel trick to compute PCA in that space.
4. Metric MDS approximates derived distances f(d_ij) for some given function f.
5. If the kernel is spherical, then kernel PCA is a special case of metric MDS with the function f(d) = sqrt(2 (1 - r(d))).

Articulated Shape Matching by Robust Alignment of Embedded Representations
Mateus D. et al., Workshop on 3DRR, 2007

Problem

1. Given two articulated shapes in different poses, find point correspondences between them.
2. There are many degrees of freedom, so rigid alignment cannot be applied directly.

(Images by Q.-X. Huang et al., 2008.)

Approach

1. Embed each shape into a feature space defined by the Laplacian.
2. The embedding is isometry invariant: it is unchanged under any isometric deformation of the shape.
3. The embedding is only defined up to a rigid transform in the feature space.
4. Find the optimal rigid transform in the feature space to recover the correspondences.

Approach

1. The shape is given as a point cloud. Approximate the Laplacian by a graph Laplacian built from pairwise affinity weights between the points.
2. Solve the generalized eigenvalue problem L v = lambda D v (with D the diagonal degree matrix).
3. Keep the most significant eigenvalues and eigenvectors.
4. For each data point x_j, let Phi(x_j) = (v_1(j), ..., v_k(j)), where v_i is the i-th eigenvector of the Laplacian. (A sketch follows below.)
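A minimal sketch of this embedding for a point cloud. The dense Gaussian affinities and the parameter sigma are illustrative assumptions, not necessarily the exact construction used in the paper.

    import numpy as np
    from scipy.linalg import eigh

    def laplacian_embedding(X, k, sigma=1.0):
        """Embed a point cloud X (n x d) into R^k via Laplacian eigenvectors."""
        D2 = ((X[:, None, :] - X[None, :, :])**2).sum(-1)   # squared pairwise distances
        W = np.exp(-D2 / (2.0 * sigma**2))                  # Gaussian affinity weights
        np.fill_diagonal(W, 0.0)
        Deg = np.diag(W.sum(axis=1))                        # degree matrix
        L = Deg - W                                         # graph Laplacian
        evals, evecs = eigh(L, Deg)                         # generalized problem L v = lambda Deg v
        return evecs[:, 1:k + 1]                            # drop the trivial constant eigenvector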

Approach

1. For each data point x_j, let Phi(x_j) = (v_1(j), ..., v_k(j)), where v_i is the i-th eigenvector of the Laplacian.
2. We would like Phi(x) = Phi'(y) for corresponding points x and y. However, each eigenvector is only defined up to a sign. Reflection: -v_i is also an eigenvector.
3. If two eigenvectors correspond to the same eigenvalue, then any rotation within their span also yields eigenvectors. Rotation ambiguity.
4. Points from the two point sets can therefore be aligned via Phi'(y) = R Phi(x), where R is an orthogonal matrix.

Approach

1. Given point correspondences, it is easy to obtain the optimal orthogonal matrix: the SVD approach from optimal rigid alignment (see the sketch below).
2. Let M = sum_j Phi'(y_j) Phi(x_j)^T and compute its singular value decomposition M = U S V^T.
3. The optimal solution is given by R = U V^T.
4. With this step, one can perform ICP in the feature space to find the optimal correspondences.
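A sketch of the closed-form alignment step in items 2 and 3 (the classical orthogonal Procrustes solution; the variable names are illustrative).

    import numpy as np

    def optimal_orthogonal(Phi_x, Phi_y):
        """Given corresponding embedded points (rows of Phi_x and Phi_y),
        return the orthogonal R minimizing sum_j ||Phi_y[j] - R @ Phi_x[j]||^2."""
        M = Phi_y.T @ Phi_x            # M = sum_j Phi'(y_j) Phi(x_j)^T
        U, S, Vt = np.linalg.svd(M)
        return U @ Vt                  # optimal orthogonal matrix

In practice this step is alternated with nearest-neighbor matching in the embedding space, i.e. ICP, as item 4 suggests.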

Results
Articulated Shape Matching by Robust Alignment of Embedded Representations, Mateus D. et al., Workshop on 3DRR, 2007

Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation
Rustamov R., SGP, 2007

Main goal: find a good, isometry-invariant shape descriptor.
Good means: efficient, easily computable, and insensitive to local topology changes (unlike MDS).

Main idea: for every point p on the surface, define a Global Point Signature
GPS(p) = ( phi_1(p)/sqrt(lambda_1), phi_2(p)/sqrt(lambda_2), phi_3(p)/sqrt(lambda_3), ... ),
where phi_i is an eigenfunction of the Laplace-Beltrami operator with eigenvalue lambda_i.
GPS is a mapping of the surface into an infinite-dimensional space; each point gets its own signature.
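A sketch of the signature computation, assuming the eigenvalues and eigenfunctions of a discretized Laplace-Beltrami operator are already available (the discretization itself is outside the scope of this snippet, and the truncation length is an illustrative choice).

    import numpy as np

    def gps(evals, evecs, num=15):
        """Global Point Signature per vertex:
        row i is (phi_1(i)/sqrt(lambda_1), ..., phi_num(i)/sqrt(lambda_num)).
        evals: (m,) ascending eigenvalues; evecs: (n_vertices, m) eigenfunctions at the vertices."""
        lam = evals[1:num + 1]              # skip the zero eigenvalue / constant eigenfunction
        phi = evecs[:, 1:num + 1]
        return phi / np.sqrt(lam)           # broadcasts over vertices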

Properties of GPS

1. If GPS(p) = GPS(q), then p = q.
2. GPS is isometry invariant (since the Laplace-Beltrami operator is).
3. Given all eigenfunctions and eigenvalues, the shape can be recovered up to isometry (this is not true if only the eigenvalues are known).
4. Euclidean distances in the GPS embedding are meaningful: K-means clustering on the embedding provides a segmentation.

Comparing GPS embeddings

1. Given a shape, determine its GPS embedding.
2. Construct a histogram of pairwise distances between GPS signatures (note that although GPS is defined only up to sign flips, the pairwise distances are preserved).
3. For any two shapes, compute the norm of the difference between their histograms (a sketch follows below).
4. For refined comparisons, use more than one histogram.
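A sketch of the histogram-based comparison. The bin count, the shared histogram range and the Euclidean norm on histograms are illustrative choices.

    import numpy as np

    def gps_histogram(G, bins=64, d_max=None):
        """Histogram of pairwise Euclidean distances between GPS signatures (rows of G)."""
        D = np.linalg.norm(G[:, None, :] - G[None, :, :], axis=-1)
        d = D[np.triu_indices_from(D, k=1)]              # each pair counted once
        h, _ = np.histogram(d, bins=bins, range=(0, d_max if d_max else d.max()), density=True)
        return h

    def shape_distance(G1, G2, bins=64):
        """Compare two shapes by the norm of the difference of their GPS-distance histograms."""
        d_max = max(                                      # shared range so histograms are comparable
            np.linalg.norm(G1[:, None, :] - G1[None, :, :], axis=-1).max(),
            np.linalg.norm(G2[:, None, :] - G2[None, :, :], axis=-1).max())
        return np.linalg.norm(gps_histogram(G1, bins, d_max) - gps_histogram(G2, bins, d_max))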

Results
Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation, Rustamov R., SGP, 2007

Conclusions

1. Kernel methods embed the shape into a feature space that can be manipulated more easily.
2. The Laplacian embedding is useful because of its isometry invariance; it can be used to compare non-rigid shapes under isometric deformations.
3. Sign flips and repeated eigenvalues can cause difficulties (there is no canonical way to choose the eigenvectors).

Limitations:
1. The embeddings are not necessarily stable or mesh-independent.
2. They are difficult to compute for large meshes (millions of points).
3. Neither the topological nor the geometric stability is well understood.