# Normal Estimation in Point Clouds
2D/3D Shape Manipulation, 3D Printing. March 13, 2013. Slides from Olga Sorkine.

# Implicit Surface Reconstruction
● Implicit function from point clouds
● Need consistently oriented normals: the implicit function is negative (< 0) inside the surface, positive (> 0) outside, and zero (= 0) on the surface itself

# Normal Estimation
● Assign a normal vector n at each point cloud point x
  ○ Estimate the direction by fitting a local plane
  ○ Find a consistent global orientation by propagation (spanning tree)


# Local Plane Fitting
● For each point x in the cloud, pick its k nearest neighbors, or all points in an r-ball around x
● Find a plane Π that minimizes the sum of squared distances: min_Π Σ_i dist(x_i, Π)²

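Concretely, the neighborhood-selection step might look like this in Python with numpy and scipy (a minimal sketch; the file name `cloud.xyz`, the query point, and the values of k and r are illustrative assumptions, not from the slides):

```python
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical input: an n x 3 array of point positions
points = np.loadtxt("cloud.xyz")              # "cloud.xyz" is an illustrative file name
tree = cKDTree(points)

# Option 1: the k nearest neighbors of a query point
k = 10
_, knn_idx = tree.query(points[0], k=k + 1)   # k+1: the point is its own nearest neighbor
neighborhood = points[knn_idx[1:]]

# Option 2: all points within an r-ball
r = 0.05
ball_idx = tree.query_ball_point(points[0], r)
neighborhood = points[ball_idx]
```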

# Linear Least Squares?
● Find a line y = ax + b that minimizes the sum of squared vertical distances: min_{a,b} Σ_i (a x_i + b − y_i)²
● But we would like true orthogonal distances to the line, not vertical ones
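For reference, the vertical-distance problem has the standard closed-form normal-equations solution (not spelled out on the slide; included here as a known result):

$$
\min_{a,b}\ \sum_{i=1}^{n} (a x_i + b - y_i)^2
\quad\Longrightarrow\quad
\begin{bmatrix} a \\ b \end{bmatrix}
= (A^\top A)^{-1} A^\top \mathbf{y},
\qquad
A = \begin{bmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{bmatrix}
$$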

# Best Fit with SSD
(figure: the same point set fit two ways, minimizing vertical distances vs. true orthogonal distances)

# Principal Component Analysis (PCA)
● PCA finds an orthogonal basis that best represents a given data set
● PCA finds the best approximating line/plane/orientation… (in terms of Σ distances²)

# Notation
● Input points: x_1, x_2, …, x_n ∈ R^d
● Looking for a (hyper)plane through a point c with unit normal n such that the sum of squared point-to-plane distances Σ_i ((x_i − c)^T n)² is minimized

# Notation
● Input points: x_1, x_2, …, x_n ∈ R^d
● Centroid: m = (1/n) Σ_i x_i
● Vectors from the centroid: y_i = x_i − m

# Centroid: 0-dim Approximation
● It can be shown that the centroid m minimizes the sum of squared distances Σ_i ‖x_i − c‖² over all choices of c
● So m will be the origin of the (hyper)plane
● Our problem becomes: minimize Σ_i (y_i^T n)² subject to ‖n‖ = 1
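The "it can be shown" step is a one-line calculation: setting the gradient of the sum of squared distances to zero yields the centroid.

$$
\frac{\partial}{\partial \mathbf{c}} \sum_{i=1}^{n} \|\mathbf{x}_i - \mathbf{c}\|^2
= -2 \sum_{i=1}^{n} (\mathbf{x}_i - \mathbf{c}) = \mathbf{0}
\quad\Longrightarrow\quad
\mathbf{c} = \frac{1}{n} \sum_{i=1}^{n} \mathbf{x}_i = \mathbf{m}
$$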

# Hyperplane Normal
● Minimize Σ_i (y_i^T n)² subject to ‖n‖ = 1
● Rewrite the objective: Σ_i (y_i^T n)² = Σ_i n^T y_i y_i^T n = n^T S n, where S = Σ_i y_i y_i^T is the scatter matrix
● Constrained minimization via Lagrange multipliers leads to the eigenvalue problem S n = λ n (see the derivation below)
● What can be said about n? At an eigenvector the objective value is n^T S n = λ, so n is the eigenvector of S with the smallest eigenvalue
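Spelled out, the Lagrange-multiplier derivation referenced above is:

$$
L(\mathbf{n}, \lambda) = \mathbf{n}^\top S \mathbf{n} - \lambda (\mathbf{n}^\top \mathbf{n} - 1),
\qquad
\nabla_{\mathbf{n}} L = 2 S \mathbf{n} - 2 \lambda \mathbf{n} = \mathbf{0}
\quad\Longrightarrow\quad
S \mathbf{n} = \lambda \mathbf{n}
$$

At any such eigenvector the objective equals n^T S n = λ n^T n = λ, so the minimum over unit vectors is attained at the eigenvector with the smallest eigenvalue.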

# Best Fitting Plane Recipe -- PCA
● Solution with Principal Component Analysis (PCA): just use PCA
● Input: points x_1, …, x_n ∈ R^d
● Compute the centroid m = (1/n) Σ_i x_i; it is the plane origin
● Form the d × n matrix Y = [y_1 … y_n] of centered points y_i = x_i − m and compute its Singular Value Decomposition Y = UΣV*
● The plane normal n is the last column vector of U, i.e. the column corresponding to the smallest singular value
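A minimal Python/numpy sketch of this recipe (the function name `fit_plane_svd` is ours; numpy sorts singular values in decreasing order, so the last column of U is the one we want):

```python
import numpy as np

def fit_plane_svd(points):
    """Least-squares plane through a point set via PCA/SVD.

    points: (n, d) array. Returns (m, normal): the centroid m is the
    plane origin, and the normal is the left singular vector of the
    centered data matrix with the smallest singular value.
    """
    m = points.mean(axis=0)                   # centroid = plane origin
    Y = (points - m).T                        # d x n matrix of centered points y_i
    U, sigma, Vt = np.linalg.svd(Y, full_matrices=False)
    return m, U[:, -1]                        # singular values sorted descending
```

For example, for points scattered around the plane z = 0, this returns a normal close to (0, 0, ±1); the sign is arbitrary, which is exactly why the orientation step later in the slides is needed.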

# Relation between SVD / Eigenvectors
● If we have a Singular Value Decomposition of a matrix Y: Y = UΣV*
● Then the column vectors of U are the eigenvectors of YY*, with eigenvalues σ_i²
● Here YY* = Σ_i y_i y_i^T = S, the scatter matrix, so both recipes find the same normal
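The claim follows directly from the orthogonality of V (V*V = I):

$$
Y Y^* = (U \Sigma V^*)(U \Sigma V^*)^* = U \Sigma V^* V \Sigma^\top U^* = U \left( \Sigma \Sigma^\top \right) U^*
$$

so the columns of U diagonalize YY*, with eigenvalues σ_i².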

# Best Fitting Plane Recipe -- Eigenvectors
● Solution using eigenvectors
● Input: points x_1, …, x_n ∈ R^d
● Compute the centroid m = (1/n) Σ_i x_i; it is the plane origin
● Compute the scatter matrix S = Σ_i y_i y_i^T, where y_i = x_i − m
● The plane normal n is the eigenvector of S with the smallest eigenvalue
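The same fit via the scatter matrix and an eigendecomposition (a sketch under the same assumptions as above; `np.linalg.eigh` returns eigenvalues of a symmetric matrix in ascending order, so the first eigenvector is the normal):

```python
import numpy as np

def fit_plane_eig(points):
    """Same plane fit via the scatter matrix S = sum_i y_i y_i^T."""
    m = points.mean(axis=0)                   # centroid = plane origin
    Y = points - m                            # (n, d) centered points
    S = Y.T @ Y                               # (d, d) scatter matrix
    eigvals, eigvecs = np.linalg.eigh(S)      # eigh: ascending eigenvalues, symmetric S
    return m, eigvecs[:, 0]                   # eigenvector of the smallest eigenvalue
```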

# What Does the Scatter Matrix Do?
● Take a line l through the center of mass m with unit direction vector v, and project the points x_i onto it; the projected coordinate of x_i is v^T y_i, where y_i = x_i − m
● The scatter matrix measures the variance of the data points along the direction v (see below)
(figure: the original point set, a direction with small variance, and a direction with large variance)
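Writing the variance of the projected coordinates in matrix form shows where S enters (up to the constant 1/n factor):

$$
\mathrm{var}(l) = \frac{1}{n} \sum_{i=1}^{n} \left( \mathbf{v}^\top \mathbf{y}_i \right)^2
= \frac{1}{n} \sum_{i=1}^{n} \mathbf{v}^\top \mathbf{y}_i \mathbf{y}_i^\top \mathbf{v}
= \frac{1}{n}\, \mathbf{v}^\top S\, \mathbf{v}
$$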

# Principal Components
● Eigenvectors of S that correspond to large eigenvalues are the directions in which the data has strong components (= large variance)
● If the eigenvalues are all roughly equal, there is no preferred direction

# Principal Components
● If there is no preferred direction, S looks like a scaled identity, S ≈ λI, and any vector is an eigenvector
● If there is a clear preferred direction, S in its eigenbasis looks like diag(λ₁, λ₂), where λ₂ is close to zero, much smaller than λ₁

# Normal Orientation
● PCA may return arbitrarily oriented eigenvectors
● We wish to orient the normals consistently
● Neighboring points should have similar normals

# Normal Orientation
● Build a graph connecting neighboring points
  ○ Edge (i, j) exists if x_i ∈ kNN(x_j) or x_j ∈ kNN(x_i)
● Propagate normal orientation through the graph
  ○ For neighbors x_i, x_j: flip n_j if n_i^T n_j < 0
  ○ Fails at sharp edges/corners
● Propagate along "safe" paths of parallel normals, as in the sketch below
  ○ Minimum spanning tree with angle-based edge weights w_ij = 1 − |n_i^T n_j|
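A sketch of the whole orientation pass using scipy's sparse-graph routines (the choice of point 0 as the root and the small epsilon added to the weights are our assumptions; the slides do not specify how the root is chosen):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import minimum_spanning_tree, breadth_first_order

def orient_normals(points, normals, k=10):
    """Propagate a consistent normal orientation along a minimum spanning
    tree of the kNN graph with edge weights w_ij = 1 - |n_i . n_j|."""
    n_pts = len(points)
    _, idx = cKDTree(points).query(points, k=k + 1)   # idx[:, 0] is the point itself

    # Build the weighted kNN graph; small weights connect near-parallel normals.
    rows, cols, weights = [], [], []
    for i in range(n_pts):
        for j in idx[i, 1:]:
            rows.append(i)
            cols.append(j)
            # epsilon keeps perfectly parallel normals from producing a zero
            # weight, which the sparse matrix would treat as a missing edge
            weights.append(1.0 - abs(normals[i] @ normals[j]) + 1e-8)
    graph = coo_matrix((weights, (rows, cols)), shape=(n_pts, n_pts))

    # "Safe" paths: edges between near-parallel normals dominate the MST.
    mst = minimum_spanning_tree(graph)
    order, parents = breadth_first_order(mst, i_start=0, directed=False)

    oriented = normals.copy()
    for i in order[1:]:                       # the root keeps its orientation
        if oriented[parents[i]] @ oriented[i] < 0:
            oriented[i] = -oriented[i]        # flip to agree with the parent
    return oriented
```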
