Algorithm Evaluation and Error Analysis class 7 Multiple View Geometry Comp 290-089 Marc Pollefeys.

Content
Background: Projective geometry (2D, 3D), Parameter estimation, Algorithm evaluation.
Single View: Camera model, Calibration, Single View Geometry.
Two Views: Epipolar Geometry, 3D reconstruction, Computing F, Computing structure, Planes and homographies.
Three Views: Trifocal Tensor, Computing T.
More Views: N-Linearities, Multiple view reconstruction, Bundle adjustment, Auto-calibration, Dynamic SfM, Cheirality, Duality.

Multiple View Geometry course schedule (subject to change)
Jan. 7, 9: Intro & motivation / Projective 2D Geometry
Jan. 14, 16: (no class) / Projective 2D Geometry
Jan. 21, 23: Projective 3D Geometry / (no class)
Jan. 28, 30: Parameter Estimation
Feb. 4, 6: Algorithm Evaluation / Camera Models
Feb. 11, 13: Camera Calibration / Single View Geometry
Feb. 18, 20: Epipolar Geometry / 3D Reconstruction
Feb. 25, 27: Fund. Matrix Comp. / Structure Comp.
Mar. 4, 6: Planes & Homographies / Trifocal Tensor
Mar. 18, 20: Three View Reconstruction / Multiple View Geometry
Mar. 25, 27: Multiple View Reconstruction / Bundle adjustment
Apr. 1, 3: Auto-Calibration / Papers
Apr. 8, 10: Dynamic SfM / Papers
Apr. 15, 17: Cheirality / Papers
Apr. 22, 24: Duality / Project Demos

Maximum Likelihood Estimation
- DLT is not invariant ⇒ normalization
- Geometric minimization is invariant
- Iterative minimization: cost function, parameterization, initialization, minimization algorithm

Automatic computation of H
Objective: compute the homography between two images.
Algorithm:
(i) Interest points: compute interest points in each image
(ii) Putative correspondences: compute a set of interest point matches based on some similarity measure
(iii) RANSAC robust estimation: repeat for N samples
  (a) select 4 correspondences and compute H
  (b) calculate the distance d⊥ for each putative match
  (c) compute the number of inliers consistent with H (d⊥ < t)
  then choose the H with the most inliers
(iv) Optimal estimation: re-estimate H from all inliers by minimizing the ML cost function with Levenberg-Marquardt
(v) Guided matching: determine more matches using prediction by the computed H
Optionally iterate the last two steps until convergence.
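The robust-sampling core of this algorithm fits in a few dozen lines. The sketch below is illustrative rather than the lecture's code: the 4-point DLT solver, the one-sided transfer distance, and the threshold t are assumptions, point normalization is skipped, and a plain DLT re-estimate from the inliers stands in for the Levenberg-Marquardt ML step (iv); guided matching (v) is omitted.

```python
import numpy as np

def dlt_homography(x, xp):
    """Direct Linear Transform: H from >= 4 correspondences (x_i' = H x_i)."""
    A = []
    for (u, v), (up, vp) in zip(x, xp):
        A.append([-u, -v, -1, 0, 0, 0, u * up, v * up, up])
        A.append([0, 0, 0, -u, -v, -1, u * vp, v * vp, vp])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 3)              # null vector of A, reshaped

def transfer_dist(H, x, xp):
    """One-sided transfer distance d(x', H x)."""
    Xh = np.c_[x, np.ones(len(x))] @ H.T
    proj = Xh[:, :2] / Xh[:, 2:3]
    return np.linalg.norm(proj - xp, axis=1)

def ransac_homography(x, xp, n_samples=500, t=2.0, rng=None):
    rng = np.random.default_rng(rng)
    best_H, best_inliers = None, None
    for _ in range(n_samples):
        idx = rng.choice(len(x), 4, replace=False)   # (a) minimal sample
        H = dlt_homography(x[idx], xp[idx])
        inliers = transfer_dist(H, x, xp) < t        # (b), (c) score H
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_H, best_inliers = H, inliers
    # stand-in for step (iv): linear re-estimate from all inliers
    return dlt_homography(x[best_inliers], xp[best_inliers]), best_inliers
```

A full implementation would normalize the points before each DLT and refine the final estimate by minimizing the ML cost with Levenberg-Marquardt, as the algorithm above specifies.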

Algorithm Evaluation and Error Analysis
- Bounds on performance: how small can the residual error be?
- Covariance estimation: how large is the uncertainty of the estimate?

Algorithm evaluation
Notation: measured coordinates x, x′; estimated quantities x̂, x̂′, Ĥ; true coordinates x̄, x̄′, H̄.
Test on real data, or test on synthetic data:
(i) Generate synthetic correspondences x̄_i ↔ x̄_i′
(ii) Add Gaussian noise, yielding x_i ↔ x_i′
(iii) Estimate Ĥ from x_i ↔ x_i′ (maybe also x̂_i, x̂_i′)
(iv) Verify how well x_i′ = Ĥ x_i holds, or compare Ĥ with H̄
Repeat many times (different noise, same σ)

Error in one image (noise in the second image only): estimate Ĥ, then x̂_i′ = Ĥ x̄_i and
  ε = ( (1/n) Σ_i d(x_i′, x̂_i′)² )^(1/2)
Error in two images: estimate Ĥ and x̂_i so that x̂_i′ = Ĥ x̂_i, then
  ε = ( (1/2n) Σ_i d(x_i, x̂_i)² + d(x_i′, x̂_i′)² )^(1/2)
Note: the residual error is not an absolute measure of the quality of Ĥ; e.g. estimation from 4 points yields ε_res = 0, and more points give better results even though ε_res will increase.

Optimal estimators (MLE)
f : ℝ^M → ℝ^N maps parameter space ℝ^M to measurement space ℝ^N; its image is a submanifold S_M ⊂ ℝ^N whose dimension equals the number of essential parameters.
Estimate the expected residual error of the MLE; other algorithms can then be judged against this standard.

Assume S_M is locally planar around the projection X̂ of the measurement X. The projection of an isotropic Gaussian distribution on ℝ^N with total variance Nσ² onto a subspace of dimension s is an isotropic Gaussian distribution with total variance sσ².

N measurements (independent Gaussian noise σ), model with d essential parameters (apply the previous result with s = d and s = N − d):
(i) RMS residual error for the ML estimator: ε_res = σ (1 − d/N)^(1/2)
(ii) RMS estimation error for the ML estimator: ε_est = σ (d/N)^(1/2)
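These two expected errors are easy to check by simulation with any linear ML estimator. The sketch below uses least-squares line fitting (d = 2 essential parameters) as a stand-in; the model, noise level, and trial count are illustrative assumptions.

```python
import numpy as np

# Check e_res = sigma*sqrt(1 - d/N) and e_est = sigma*sqrt(d/N) using
# least-squares fitting of y = a*x + b (d = 2) to N noisy measurements.
rng = np.random.default_rng(0)
N, d, sigma = 10, 2, 0.5
x = np.linspace(0.0, 1.0, N)
y_true = 2.0 * x + 1.0                       # true model (a = 2, b = 1)
A = np.c_[x, np.ones(N)]

res_sq, est_sq = [], []
for _ in range(5000):
    y = y_true + rng.normal(0.0, sigma, N)   # add Gaussian noise
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    y_hat = A @ coef
    res_sq.append(np.mean((y - y_hat) ** 2))       # residual error
    est_sq.append(np.mean((y_hat - y_true) ** 2))  # estimation error

e_res = np.sqrt(np.mean(res_sq))
e_est = np.sqrt(np.mean(est_sq))
print(e_res, sigma * np.sqrt(1 - d / N))   # both ~ 0.447
print(e_est, sigma * np.sqrt(d / N))       # both ~ 0.224
```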

For ML estimation of a homography from n point correspondences:
Error in one image: ε_res = σ (1 − 4/n)^(1/2)
Error in two images: ε_res = σ ( (n − 4)/(2n) )^(1/2)

Covariance of the estimated model
- Previous question: how close is the error to the smallest possible error? (independent of the point configuration)
- Real question: how close is the estimated model to the real model? (dependent on the point configuration, e.g. 4 points close to a line)

Forward propagation of covariance
Let v be a random vector in ℝ^M with mean v̄ and covariance matrix Σ, and suppose that f : ℝ^M → ℝ^N is an affine mapping defined by f(v) = f(v̄) + A(v − v̄). Then f(v) is a random variable with mean f(v̄) and covariance matrix A Σ Aᵀ.
Note: this does not assume that A is a square matrix.
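A minimal numerical illustration of this rule, with an assumed (non-square) A and Σ, cross-checked against sampling:

```python
import numpy as np

# Forward propagation through an affine map: the covariance of f(v) is
# A @ Sigma @ A.T exactly (no approximation). A and Sigma are assumptions.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, -1.0]])        # note: 3x2, not square
Sigma = np.array([[0.5, 0.1],
                  [0.1, 0.2]])

Sigma_f = A @ Sigma @ A.T          # propagated covariance (3x3)

# Monte Carlo check of the same quantity
rng = np.random.default_rng(0)
v = rng.multivariate_normal([0.0, 0.0], Sigma, size=200000)
Sigma_mc = np.cov((v @ A.T).T)
print(np.max(np.abs(Sigma_f - Sigma_mc)))   # small (sampling error only)
```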


Non-linear propagation of covariance
Let v be a random vector in ℝ^M with mean v̄ and covariance matrix Σ, and suppose that f : ℝ^M → ℝ^N is differentiable in a neighborhood of v̄. Then, up to a first-order approximation, f(v) is a random variable with mean f(v̄) and covariance matrix J Σ Jᵀ, where J is the Jacobian matrix of f evaluated at v̄.
Note: this is a good approximation if f is close to linear within the variability of v.
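As an illustration, the first-order rule can be applied to transferring a point through a homography, where the dehomogenization makes f non-linear; the H, point, and unit point covariance below are assumptions, and the Jacobian is computed numerically.

```python
import numpy as np

# First-order (Jacobian) propagation of a point covariance through the
# non-linear map x' = dehomogenize(H x). H and Sigma_x are illustrative.
H = np.array([[1.0, 0.1, 5.0],
              [0.0, 1.2, -2.0],
              [1e-3, 0.0, 1.0]])
x0 = np.array([10.0, 20.0])
Sigma_x = np.diag([1.0, 1.0])      # isotropic, sigma = 1 pixel

def f(x):
    X = H @ np.array([x[0], x[1], 1.0])
    return X[:2] / X[2]            # dehomogenization: the non-linear step

# Numerical Jacobian of f at x0 (central differences)
eps = 1e-5
J = np.empty((2, 2))
for j in range(2):
    dx = np.zeros(2)
    dx[j] = eps
    J[:, j] = (f(x0 + dx) - f(x0 - dx)) / (2 * eps)

Sigma_xp = J @ Sigma_x @ J.T       # covariance of the transferred point
print(Sigma_xp)
```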


Backward propagation of covariance
f : ℝ^M → ℝ^N; given the covariance of the measurement X ∈ ℝ^N, find the covariance of the estimated parameters P = f⁻¹(X) ∈ ℝ^M.

Backward propagation of covariance (assume f is affine)
In general X does not lie in the range of f, so f⁻¹(X) is not directly defined.
Solution: minimize the Mahalanobis distance ‖X − f(v)‖_Σ² = (X − f(v))ᵀ Σ⁻¹ (X − f(v)).


Backward propagation of covariance
If f is affine, f(v) = f(v̄) + A(v − v̄), then Σ_v = (Aᵀ Σ⁻¹ A)⁻¹.
In the non-linear case, obtain a first-order approximation by using the Jacobian: Σ_v = (Jᵀ Σ⁻¹ J)⁻¹.

Over-parameterization
In this case f is not one-to-one and rank J < M, so Σ_v = (Jᵀ Σ⁻¹ J)⁻¹ cannot hold (the matrix is singular); e.g. a scale ambiguity would mean infinite variance! However, if constraints are imposed, the covariance is again well defined.
Alternative: use a minimal parameterization and invert a d × d matrix instead of an M × M one.

Over-parameterization
When the constraint surface is locally orthogonal to the null space of J (e.g. the usual constraint ‖P‖ = 1, whose sphere is orthogonal to the scale direction):
Σ_v = (Jᵀ Σ⁻¹ J)⁺ (pseudo-inverse)
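A small numerical sketch of this situation, with an assumed scale-invariant measurement map (all matrices here are illustrative): the Jacobian acquires a null space along the parameter vector, the normal matrix becomes singular, and the pseudo-inverse yields the covariance consistent with the ‖v‖ = 1 constraint.

```python
import numpy as np

# Over-parameterized backward propagation: f is invariant to the scale of
# v, so J has a null space along v and J.T @ Sinv @ J is singular; with
# the constraint ||v|| = 1 the covariance is the pseudo-inverse.
rng = np.random.default_rng(0)
v_bar = np.array([0.6, 0.8, 0.0])          # true unit-norm parameters
A = rng.normal(size=(6, 3))                # assumed measurement matrix

def f(v):
    return A @ (v / np.linalg.norm(v))     # scale-invariant measurement map

print(np.allclose(f(v_bar), f(2 * v_bar)))  # True: f is scale-invariant

# Jacobian at v_bar: A (I - v v^T); its null space is span{v_bar}
J = A @ (np.eye(3) - np.outer(v_bar, v_bar))
Sigma_X = 0.01 * np.eye(6)                 # measurement covariance

JtSJ = J.T @ np.linalg.inv(Sigma_X) @ J
print(np.linalg.matrix_rank(JtSJ))         # 2 < 3: ordinary inverse fails
Sigma_v = np.linalg.pinv(JtSJ)             # backward-propagated covariance
```

Note that Σ_v maps the radial direction v̄ to zero, exactly as the constraint ‖v‖ = 1 demands: there is no uncertainty along the gauge direction.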

Example: error in one image
(i) Estimate the transformation Ĥ from the data
(ii) Compute the Jacobian J = ∂f/∂h, evaluated at ĥ
(iii) The covariance matrix of the estimated ĥ is given by Σ_h = (Jᵀ Σ⁻¹ J)⁺

Example: error in both images
Separate the Jacobian into homography and point parameters: J = [A | B], with A = ∂f/∂h and B = ∂f/∂x.

Using the covariance matrix in point transfer
Error in one image: Σ_x′ = J_h Σ_h J_hᵀ
Error in two images: Σ_x′ = J_h Σ_h J_hᵀ + J_x Σ_x J_xᵀ (if h and x are independent, i.e. new points)

Examples (Criminisi ’97): transfer uncertainty for σ = 1 pixel and σ = 0.5 cm.

Monte Carlo estimation of covariance
To be used when the previous assumptions do not hold (e.g. the surface is not flat within the variability) or when the analytic computation is too complicated. Simple and general, but computationally expensive.
Generate samples according to the assumed noise distribution, carry out the computation on each sample, and observe the distribution of the results.
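The procedure is a short loop around any algorithm. The sketch below treats an assumed non-linear map as the "algorithm" and estimates the covariance of its output by sampling; the map, point, and noise level are illustrative, not from the lecture.

```python
import numpy as np

# Monte Carlo covariance estimation: repeatedly perturb the input with the
# assumed noise, run the algorithm, and take the sample covariance of the
# results. The map x -> x / ||x||^2 stands in for "the algorithm".
rng = np.random.default_rng(0)

def algorithm(x):
    """Some computation whose output uncertainty we want."""
    return x / np.dot(x, x)

x0 = np.array([2.0, 1.0])
sigma = 0.05
runs = np.array([algorithm(x0 + rng.normal(0.0, sigma, 2))
                 for _ in range(50000)])
Sigma_mc = np.cov(runs.T)          # sample covariance of the results
print(Sigma_mc)
```

For a map this mildly non-linear, the Monte Carlo estimate agrees closely with first-order Jacobian propagation; the Monte Carlo route only becomes indispensable when that linearization breaks down.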

Next class: Camera models