Shape Analysis and Retrieval Statistical Shape Descriptors Notes courtesy of Funk et al., SIGGRAPH 2004.


Outline: Shape Descriptors Statistical Shape Descriptors Singular Value Decomposition (SVD)

Shape Matching General approach: Define a distance function D(M, M′) that takes in two models and returns a measure of their proximity. Example: model M1 is closer to M2 than it is to M3.

Shape Descriptors Shape Descriptor: A structured abstraction of a 3D model that is well suited to the challenges of shape matching.

Matching with Descriptors Preprocessing: compute the database descriptors. Run-Time: compute the query descriptor, compare the query descriptor to the database descriptors, and return the best match(es).

Shape Matching Challenge Need a shape descriptor that is:
– Concise to store
– Quick to compute
– Efficient to match
– Discriminating
– Invariant to transformations (translation, scale, rotation, mirror)
– Invariant to deformations (different articulated poses)
– Insensitive to noise (e.g., scanned surfaces; image courtesy of Ramamoorthi et al.)
– Insensitive to topology (different tessellations, different genus; images courtesy of Viewpoint & Stanford)
– Robust to degeneracies (e.g., a model with no bottom; images courtesy of Utah & De Espona)

Outline: Shape Descriptors Statistical Shape Descriptors Singular Value Decomposition (SVD)

Statistical Shape Descriptors Challenge: Want a simple shape descriptor that is easy to compare and gives a continuous measure of the similarity between two models. Solution: Represent each model by a vector and define the distance between models as the distance between corresponding vectors.

Statistical Shape Descriptors Properties: –Structured representation –Easy to compare –Generalizes the matching problem. Models are represented as points in a fixed-dimensional vector space.

Statistical Shape Descriptors General Approaches: –Retrieval –Clustering –Compression –Hierarchical representation. Models are represented as points in a fixed-dimensional vector space.

Outline: Shape Descriptors Statistical Shape Descriptors Singular Value Decomposition (SVD)

Complexity of Retrieval Given a query Q: compute the distance D(Q, Mi) to each database model Mi, sort the database models by proximity, and return the closest matches.

Complexity of Retrieval If there are k models in the database and each model is represented by an n-dimensional vector: computing the distance to each database model takes O(kn) time, and sorting the database models by proximity takes O(k log k) time. If n is large, retrieval will be prohibitively slow.
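A sketch of this brute-force pipeline and its cost, assuming NumPy (the `retrieve` helper and the toy data are illustrative, not part of the slides):

```python
import numpy as np

def retrieve(query, database, num_matches=3):
    """Brute-force retrieval: O(k*n) distance computations plus an
    O(k log k) sort, for k database vectors of dimension n."""
    # Euclidean distance from the query to every database descriptor.
    dists = np.linalg.norm(database - query, axis=1)   # O(k*n)
    order = np.argsort(dists)                          # O(k log k)
    return order[:num_matches], dists[order[:num_matches]]

# Toy example: k = 1000 descriptors of dimension n = 512.
rng = np.random.default_rng(0)
db = rng.standard_normal((1000, 512))
q = db[42] + 0.01 * rng.standard_normal(512)  # a query near model 42
idx, d = retrieve(q, db)                      # idx[0] should be 42
```

With n = 512 this full-dimensional comparison is exactly the cost the SVD compression below is designed to reduce.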

Algebra Definition: Given a vector space V and a subspace W ⊂ V, the projection onto W, written π_W, is the map that sends v ∈ V to the nearest vector in W. If {w_1,…,w_m} is an orthonormal basis for W, then: π_W(v) = ⟨v,w_1⟩w_1 + … + ⟨v,w_m⟩w_m.

Tensor Algebra Definition: The inner product of two n-dimensional vectors v=(v_1,…,v_n) and w=(w_1,…,w_n), written ⟨v,w⟩, is the scalar value defined by: ⟨v,w⟩ = v_1w_1 + … + v_nw_n.

Tensor Algebra Definition: The outer product of two n-dimensional vectors v=(v_1,…,v_n) and w=(w_1,…,w_n), written v ⊗ w, is the n×n matrix defined by: (v ⊗ w)_ij = v_i w_j.

Tensor Algebra Definition: The transpose of an m×n matrix M, written M^t, is the n×m matrix with: (M^t)_ij = M_ji. Property: For any two vectors v and w, the transpose has the property: ⟨Mv, w⟩ = ⟨v, M^t w⟩.
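These three definitions can be checked numerically; a small sketch assuming NumPy (the particular vectors and matrix are arbitrary examples):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

# Inner product <v, w> = sum_i v_i * w_i  (a scalar).
inner = np.dot(v, w)            # 1*4 + 2*5 + 3*6 = 32

# Outer product (v ⊗ w)_ij = v_i * w_j  (an n x n matrix).
outer = np.outer(v, w)

# Transpose: (M^t)_ij = M_ji, and the property <Mv, w> = <v, M^t w>.
M = np.arange(9.0).reshape(3, 3)
lhs = np.dot(M @ v, w)
rhs = np.dot(v, M.T @ w)
assert np.isclose(lhs, rhs)
```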

SVD Compression Key Idea: Given a collection of vectors in n-dimensional space, find a good m-dimensional subspace (m<<n) in which to represent the vectors.

SVD Compression Specifically: If P={p_1,…,p_k} is the initial n-dimensional point set, and {w_1,…,w_m} is an orthonormal basis for the m-dimensional subspace, we will compress the point set by sending: p_j ↦ (⟨p_j,w_1⟩, …, ⟨p_j,w_m⟩).
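A minimal sketch of this compression step, assuming NumPy (the `compress`/`decompress` helpers and the toy 4-dimensional data are illustrative, not part of the slides):

```python
import numpy as np

def compress(points, basis):
    """Project n-dimensional points onto the m-dimensional subspace
    spanned by the orthonormal rows of `basis` (m x n), keeping only
    the m coefficients <p, w_1>, ..., <p, w_m> per point."""
    return points @ basis.T            # (k, n) @ (n, m) -> (k, m)

def decompress(coeffs, basis):
    """Reconstruct the projection pi_W(p) = sum_i <p, w_i> w_i."""
    return coeffs @ basis

# A point that lies (almost) in the plane spanned by two axes of R^4.
basis = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0]])   # orthonormal rows
p = np.array([[3.0, 4.0, 0.001, 0.0]])
c = compress(p, basis)                     # -> [[3., 4.]]
p_hat = decompress(c, basis)               # projection back into R^4
```

If the points nearly lie in the subspace, as here, almost nothing is lost; finding the subspace for which that holds is exactly the challenge stated next.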

SVD Compression Challenge: Find the m-dimensional subspace that best captures the information in the initial point set.

Variance of the Point Set Given a collection of points P={p_1,…,p_k} in an n-dimensional vector space, determine how the vectors are distributed across different directions.

Variance of the Point Set Define Var_P as the function giving the variance of the point set P in direction v (assume |v|=1): Var_P(v) = Σ_i ⟨p_i, v⟩².

Variance of the Point Set More generally, for a subspace W ⊂ V, define the variance of P in the subspace W as: Var_P(W) = Σ_i |π_W(p_i)|². If {w_1,…,w_m} is an orthonormal basis for W, then: Var_P(W) = Var_P(w_1) + … + Var_P(w_m).

Variance of the Point Set Example: The variance in the direction v_1 is large, while the variance in the direction v_2 is small. If we want to compress down to one dimension, we should project the points onto v_1.

Covariance Matrix Definition: The covariance matrix M_P of a point set P={p_1,…,p_k} is the symmetric matrix which is the sum of the outer products of the p_i: M_P = Σ_i p_i ⊗ p_i.

Covariance Matrix Theorem: The variance of the point set P in a direction v is equal to: Var_P(v) = ⟨M_P v, v⟩. Proof: Var_P(v) = Σ_i ⟨p_i, v⟩² = ⟨Σ_i ⟨p_i, v⟩ p_i, v⟩ = ⟨(Σ_i p_i ⊗ p_i) v, v⟩ = ⟨M_P v, v⟩.

Singular Value Decomposition Theorem: Every symmetric matrix M can be written as the product: M = O D O^t, where O is a rotation/reflection matrix (O O^t = Id) and D is a diagonal matrix with the property: D_11 ≥ D_22 ≥ … ≥ D_nn.

Singular Value Decomposition Implication: Given a point set P, we can compute the covariance matrix of the point set, M_P, and express the matrix in terms of its SVD factorization: M_P = O D O^t, where the columns {v_1,…,v_n} of O form an orthonormal basis and λ_i = D_ii is the variance of the point set in direction v_i.
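A minimal sketch of this factorization with NumPy (the toy data and the use of `numpy.linalg.eigh` for the symmetric eigendecomposition are illustrative assumptions; note the slides' M_P omits mean-centering, and the sketch follows that convention):

```python
import numpy as np

rng = np.random.default_rng(1)
# k = 500 points in R^3, stretched along x so that direction
# carries the most variance.
P = rng.standard_normal((500, 3)) * np.array([10.0, 2.0, 0.5])

# Covariance matrix M_P = sum_i p_i p_i^t (no mean-centering).
M = P.T @ P

# Symmetric eigendecomposition M = O D O^t; eigh returns the
# eigenvalues in ascending order, so flip to descending.
lam, O = np.linalg.eigh(M)
lam, O = lam[::-1], O[:, ::-1]

# lam[i] is the variance of P in the direction O[:, i]:
v = O[:, 0]
assert np.isclose(lam[0], np.sum((P @ v) ** 2))
```

Here the top eigenvector recovers (approximately) the x-axis, the direction of greatest variance.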

Singular Value Decomposition Compression: The vector subspace spanned by {v_1,…,v_m} is the m-dimensional subspace that maximizes the retained variance of the initial point set P. If m is too small, then too much information is discarded and there will be a loss in retrieval accuracy.

Singular Value Decomposition Hierarchical Matching: First coarsely compare the query to the database vectors. If the query is coarsely similar to the target, refine the comparison; otherwise, do not refine. O(kn) matching becomes O(km), with m << n and no loss of retrieval accuracy.

Singular Value Decomposition Hierarchical Matching: SVD expresses the initial vectors in terms of the eigenbasis: p = ⟨p,v_1⟩v_1 + … + ⟨p,v_n⟩v_n. Because there is more variance in v_1 than in v_2, more variance in v_2 than in v_3, etc., this gives a hierarchical representation of the data, so that coarse comparisons can be performed by comparing only the first m coefficients.
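The coarse-to-fine pruning described above can be sketched as follows (NumPy assumed; the helper name `coarse_to_fine_nn`, the dimension 64, and the prefix length m=8 are illustrative choices, not from the slides). The key fact is that a distance computed over only the first m coordinates is a lower bound on the full distance, so candidates pruned by that bound cannot be the true nearest neighbor:

```python
import numpy as np

def coarse_to_fine_nn(query, database, m=8):
    """Nearest neighbor using a coarse pass over the first m
    eigenbasis coefficients. The coarse distance is a lower bound
    on the full distance, so candidates whose coarse distance
    exceeds the best exact distance found so far can be skipped
    without any loss of accuracy."""
    coarse = np.linalg.norm(database[:, :m] - query[:m], axis=1)
    best_idx, best_dist = -1, np.inf
    # Visit candidates in coarse order; stop once the coarse lower
    # bound already exceeds the best exact distance.
    for i in np.argsort(coarse):
        if coarse[i] >= best_dist:
            break
        d = np.linalg.norm(database[i] - query)
        if d < best_dist:
            best_idx, best_dist = i, d
    return best_idx, best_dist

rng = np.random.default_rng(2)
db = rng.standard_normal((200, 64))
q = db[7] + 0.01 * rng.standard_normal(64)
idx, dist = coarse_to_fine_nn(q, db)
```

The pruning is only effective when the first m coordinates carry most of the variance, which is exactly what ordering the eigenbasis by eigenvalue guarantees.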

Efficient to match? Preprocessing: 1. Compute the SVD factorization. 2. Transform the database descriptors. Run-Time: 3. Transform the query. 4. Low-resolution sort of the database by distance to the query. 5. Update the closest matches. 6. Resort. Steps 5 and 6 repeat, refining the distance estimates (e.g., a coarse distance of 0.289 is refined to 0.301), until the best matches are confirmed.

Singular Value Decomposition Theorem (restated): Every symmetric matrix M can be written as the product: M = O D O^t, where O is a rotation/reflection matrix (O O^t = Id) and D is a diagonal matrix with the property: D_11 ≥ D_22 ≥ … ≥ D_nn.

Singular Value Decomposition Proof: 1. Every symmetric matrix has at least one real eigenvector v. 2. If v is an eigenvector and w is perpendicular to v, then Mw is also perpendicular to v. Since M maps the subspace of vectors perpendicular to v back into itself, we can restrict M to that subspace and iterate to get the next eigenvector.

Singular Value Decomposition Proof (Step 1): Let F(v) be the function on the unit sphere (|v|=1) defined by: F(v) = ⟨Mv, v⟩. Then F must have a maximum at some point v_0, and at that maximum ∇F(v_0) = λv_0.

Singular Value Decomposition If F has a maximum at some point v_0, then ∇F(v_0) = λv_0. If w_0 is on the sphere, next to v_0, then w_0 − v_0 is nearly perpendicular to v_0. And for any small vector w_1 perpendicular to v_0, v_0 + w_1 is nearly on the sphere.

Singular Value Decomposition If F has a maximum at some point v_0, then ∇F(v_0) = λv_0. For points w_0 on the sphere close to v_0, we have: F(w_0) ≈ F(v_0) + ⟨∇F(v_0), w_0 − v_0⟩. For v_0 to be a maximum, we must have: ⟨∇F(v_0), w_0 − v_0⟩ ≤ 0 for all w_0 near v_0. Thus, ∇F(v_0) must be perpendicular to all vectors that are perpendicular to v_0, and hence must itself be a multiple of v_0.

Singular Value Decomposition Proof (Step 1, concluded): F has a maximum at some point v_0, with ∇F(v_0) = λv_0. But ∇F(v) = 2Mv, so Mv_0 = (λ/2)v_0; that is, v_0 is an eigenvector of M.

Singular Value Decomposition Proof: 1. Every symmetric matrix has at least one real eigenvector v. 2. If v is an eigenvector and w is perpendicular to v, then Mw is also perpendicular to v.

Singular Value Decomposition Proof (Step 2): If w is perpendicular to v, then ⟨v,w⟩ = 0. Since M is symmetric and Mv = λv: ⟨v, Mw⟩ = ⟨Mv, w⟩ = λ⟨v, w⟩ = 0, so Mw is also perpendicular to v.
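A quick numerical check of this step, assuming NumPy (the particular matrix and vectors are arbitrary examples):

```python
import numpy as np

# A symmetric 3x3 matrix and one of its eigenvectors.
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
lam, vecs = np.linalg.eigh(M)
v = vecs[:, 0]

# Any w perpendicular to v stays perpendicular after applying M:
# <v, Mw> = <Mv, w> = lam <v, w> = 0.
w = np.array([0.3, -0.7, 1.1])
w = w - np.dot(w, v) * v          # make w exactly perpendicular to v
assert np.isclose(np.dot(v, w), 0.0)
assert np.isclose(np.dot(v, M @ w), 0.0)
```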