Globally Optimal Estimates for Geometric Reconstruction Problems
Tom Gilat, Adi Lakritz
Advanced Topics in Computer Vision Seminar
Faculty of Mathematics and Computer Science, Weizmann Institute
3 June 2007

outline
 Motivation and Introduction
 Background
   Positive SemiDefinite matrices (PSD)
   Linear Matrix Inequalities (LMI)
   SemiDefinite Programming (SDP)
 Relaxations
   Sum Of Squares (SOS) relaxation
   Linear Matrix Inequalities (LMI) relaxation
 Application in vision
   Finding optimal structure
   Partial relaxation and Schur's complement

Motivation
Geometric reconstruction problems → polynomial optimization problems (POPs)

Triangulation problem in the L2 norm
2 views – exact solution; multiple views – optimization

Triangulation problem in the L2 norm
[Figure: perspective camera i, scene axes x, y, z, and reprojection error err]

Triangulation problem in the L2 norm
Minimize the reprojection error over all cameras: a polynomial minimization problem, and a non-convex one.
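In symbols, a sketch of the cost (the notation is assumed here, not preserved in the transcript: Pᵢ are the camera matrices with rows p_{i,1}ᵀ, p_{i,2}ᵀ, p_{i,3}ᵀ, uᵢ the measured image points, U the unknown 3D point):

```latex
\min_{\mathbf{U}} \sum_{i} d\big(\mathbf{u}_i,\, P_i\mathbf{U}\big)^2,
\qquad
d\big(\mathbf{u}_i, P_i\mathbf{U}\big)^2
= \Big(u_{i,x} - \frac{\mathbf{p}_{i,1}^{\top}\mathbf{U}}{\mathbf{p}_{i,3}^{\top}\mathbf{U}}\Big)^{2}
+ \Big(u_{i,y} - \frac{\mathbf{p}_{i,2}^{\top}\mathbf{U}}{\mathbf{p}_{i,3}^{\top}\mathbf{U}}\Big)^{2} .
```

Each term is a ratio of polynomials in U, which is why the problem is polynomial but non-convex.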

More computer vision problems
Reconstruction problem: known cameras and known corresponding points; find the 3D points that minimize the projection error of the given image points (similar to triangulation, for many points and cameras).
Calculating a homography: given 3D points on a plane and corresponding image points, calculate the homography.
Many more problems.

Optimization problems

Introduction to optimization problems

Optimization problems: NP-hard in general.

Optimization problems
optimization → convex / non-convex
convex: Linear Programming (LP), SemiDefinite Programming (SDP); solutions exist – interior point methods
non-convex: problems – local optima or high computational cost

Non-convex optimization
[Figure: level curves of f over a non-convex feasible set; many algorithms get stuck in local minima, depending on the initialization]

Optimization problems
convex (LP, SDP): solutions exist – interior point methods
non-convex: local optima or high computational cost
global optimization – algorithms that converge to the optimal solution via relaxation of the problem

outline
 Motivation and Introduction
 Background
   Positive SemiDefinite matrices (PSD)
   Linear Matrix Inequalities (LMI)
   SemiDefinite Programming (SDP)
 Relaxations
   Sum Of Squares (SOS) relaxation
   Linear Matrix Inequalities (LMI) relaxation
 Application in vision
   Finding optimal structure
   Partial relaxation and Schur's complement

Positive semidefinite (PSD) matrices
Definition: a matrix M in R^{n×n} is PSD if
1. M is symmetric: M = Mᵀ
2. xᵀMx ≥ 0 for all x in Rⁿ
M can be decomposed as AAᵀ (Cholesky).
Proof that any AAᵀ is PSD: xᵀ(AAᵀ)x = (Aᵀx)ᵀ(Aᵀx) = ‖Aᵀx‖² ≥ 0.
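A small numerical sketch of the two characterizations above (illustrative NumPy code; the matrix A and the tolerance are made-up choices):

```python
import numpy as np

def is_psd(M, tol=1e-10):
    """PSD test: symmetry plus nonnegative eigenvalues."""
    return np.allclose(M, M.T) and np.linalg.eigvalsh(M).min() >= -tol

# Any A @ A.T is PSD; Cholesky recovers a factor when M is positive definite.
A = np.array([[1.0, 0.0],
              [2.0, 3.0]])
M = A @ A.T
print(is_psd(M))                      # True
L = np.linalg.cholesky(M)             # lower-triangular factor, M = L @ L.T
print(np.allclose(L @ L.T, M))        # True
```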

Principal minors
The kth-order principal minors of an n×n symmetric matrix M are the determinants of the k×k matrices obtained by deleting n − k rows and the corresponding n − k columns of M.
First order: the elements on the diagonal.
Second order: the determinants m_ii·m_jj − m_ij·m_ji of the 2×2 principal submatrices, i < j.
Third order (for a 3×3 matrix): det(M).
Fact used below: M is PSD iff all of its principal minors are nonnegative.
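A brute-force sketch of the principal-minor criterion (illustrative code, not from the slides; fine for small n, exponential in general):

```python
import numpy as np
from itertools import combinations

def psd_by_principal_minors(M, tol=1e-10):
    """A symmetric M is PSD iff every principal minor is nonnegative."""
    n = M.shape[0]
    for k in range(1, n + 1):
        for idx in combinations(range(n), k):
            # principal submatrix: keep rows and columns in idx
            if np.linalg.det(M[np.ix_(idx, idx)]) < -tol:
                return False
    return True

M = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
print(psd_by_principal_minors(M))     # True: this matrix is PSD
```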

Set of PSD matrices in 2D
The 2×2 PSD matrices [x y; y z]: x ≥ 0, z ≥ 0, xz − y² ≥ 0 – a convex cone in (x, y, z).

Set of PSD matrices
This set is convex.
Proof: if A and B are PSD and 0 ≤ λ ≤ 1, then xᵀ(λA + (1 − λ)B)x = λxᵀAx + (1 − λ)xᵀBx ≥ 0 for all x.

LMI – linear matrix inequality
F(x) = F0 + x1F1 + … + xnFn ⪰ 0, where F0, …, Fn are given symmetric matrices and x in Rⁿ is the variable; the solution set {x : F(x) ⪰ 0} is convex.

LMI example: find the feasible set of the 2D LMI
Reminder: a symmetric matrix is PSD iff all of its principal minors are nonnegative.
1st-order principal minors: the diagonal entries of F(x) must be nonnegative.
2nd-order principal minors: the determinants of the 2×2 principal submatrices of F(x) must be nonnegative.
3rd-order principal minor: det F(x) ≥ 0.
The feasible set is the intersection of all of these inequalities.

Semidefinite Programming (SDP)
minimize cᵀx subject to the LMI F(x) = F0 + Σᵢ xᵢFᵢ ⪰ 0 – an extension of LP (diagonal Fᵢ recover LP).
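A minimal sketch of solving an SDP, assuming the cvxpy package (the data below is a made-up standard-form instance, not from the slides):

```python
import cvxpy as cp
import numpy as np

# Illustrative SDP: minimize trace(C X) subject to trace(X) = 1, X PSD.
C = np.array([[1.0, 0.0],
              [0.0, 2.0]])

X = cp.Variable((2, 2), PSD=True)     # PSD=True makes X symmetric PSD
prob = cp.Problem(cp.Minimize(cp.trace(C @ X)), [cp.trace(X) == 1])
prob.solve()
print(prob.value)   # 1.0: mass concentrates on C's smallest eigenvalue
```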

outline
 Motivation and Introduction
 Background
   Positive SemiDefinite matrices (PSD)
   Linear Matrix Inequalities (LMI)
   SemiDefinite Programming (SDP)
 Relaxations
   Sum Of Squares (SOS) relaxation
   Linear Matrix Inequalities (LMI) relaxation
 Application in vision
   Finding optimal structure
   Partial relaxation and Schur's complement

Sum Of Squares relaxation (SOS)
Unconstrained polynomial optimization problem (POP): the feasible set is all of Rⁿ.
[H. Waki, S. Kim, M. Kojima, and M. Muramatsu. SOS and SDP relaxations for POPs with structured sparsity. SIAM J. Optimization, 2006.]

Sum Of Squares relaxation (SOS)
A polynomial p is a sum of squares (SOS) if p(x) = Σⱼ qⱼ(x)² for some polynomials qⱼ. Every SOS polynomial is nonnegative; the converse fails in general.

SOS relaxation for unconstrained polynomials
The minimum of p over Rⁿ is the largest γ such that p(x) − γ ≥ 0 for all x. Relax nonnegativity to being SOS: maximize γ subject to p − γ is SOS. This yields a lower bound on the global minimum.

Monomial basis example
A polynomial p of degree 2d is SOS if and only if p(x) = z(x)ᵀQz(x) for some PSD matrix Q, where z(x) is the vector of monomials of degree at most d; for example, z(x) = (1, x1, x2, x1², x1x2, x2²)ᵀ for n = 2, d = 2.

SOS relaxation to SDP
Matching the coefficients of p(x) − γ = z(x)ᵀQz(x) gives linear equality constraints on the entries of Q and on γ; together with Q ⪰ 0, maximizing γ is an SDP.

SOS relaxation to SDP – example
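The example itself did not survive the transcript; here is a small illustrative instance instead. Minimize p(x) = x⁴ + 2x² + 1 over R with basis z(x) = (1, x, x²)ᵀ:

```latex
p(x) - \gamma = z(x)^{\top} Q\, z(x)
= q_{11} + 2q_{12}\,x + (q_{22} + 2q_{13})\,x^{2} + 2q_{23}\,x^{3} + q_{33}\,x^{4},
\qquad Q \succeq 0 .
```

Matching coefficients gives q11 = 1 − γ, q12 = q23 = 0, q22 + 2q13 = 2, q33 = 1. Maximizing γ subject to Q ⪰ 0 forces q11 ≥ 0, so γ* = 1, attained by Q = diag(0, 2, 1), i.e. p(x) − 1 = (√2·x)² + (x²)²; this matches the true minimum p(0) = 1.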

SOS for constrained POPs
It is possible to extend this method to constrained POPs by use of the generalized Lagrange dual, with SOS polynomials playing the role of nonnegative multipliers.

SOS relaxation summary
POP → (SOS relaxation) → SOS problem → SDP → global estimate
So we know how to solve a POP whose shifted objective is an SOS, and we have a bound on a POP that is not an SOS.
[H. Waki, S. Kim, M. Kojima, and M. Muramatsu. SOS and SDP relaxations for POPs with structured sparsity. SIAM J. Optimization, 2006.]

Relaxations
SOS: POP → (SOS relaxation) → SOS problem → SDP → global estimate
LMI: POP → (LMI relaxation) → linear & LMI problem → SDP → global estimate + convergence

outline
 Motivation and Introduction
 Background
   Positive SemiDefinite matrices (PSD)
   Linear Matrix Inequalities (LMI)
   SemiDefinite Programming (SDP)
 Relaxations
   Sum Of Squares (SOS) relaxation
   Linear Matrix Inequalities (LMI) relaxation
 Application in vision
   Finding optimal structure
   Partial relaxation and Schur's complement

LMI relaxations
Constraints are handled; convergence to the optimum is guaranteed; the method applies to all polynomials, not only SOS ones.

A maximization problem
Note that: (a) the feasible set is non-convex; (b) the constraints are quadratic.
[Figure: the feasible set]

LMI – linear matrix inequality, a reminder: F(x) = F0 + x1F1 + … + xnFn ⪰ 0 with symmetric Fi.

Goal
An SDP whose solution is close to the global optimum of the original polynomial optimization problem.
What is it good for? SDP problems can be solved much more efficiently than general optimization problems.

LMI relaxation is an iterative process
POP → linear + LMI + rank constraints → SDP
Step 1: introduce new variables. Step 2: relax constraints. Then apply higher-order relaxations.

LMI relaxation – step 1
Replace monomials by “lifting variables”.
Rule: the monomial x1^{α1}…xn^{αn} is replaced by the variable y_α, α = (α1, …, αn).
Example (the R² case): x1 ↦ y10, x2 ↦ y01, x1² ↦ y20, x1x2 ↦ y11, x2² ↦ y02.
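For instance (an illustrative polynomial, not the slide's), a quadratic objective becomes linear in the lifting variables:

```latex
3x_1^2 - 2x_1x_2 + x_2 \;\longmapsto\; 3y_{20} - 2y_{11} + y_{01} .
```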

Introducing lifting variables (lifting): the objective and the constraints are rewritten in the y variables.

Not equivalent to the original problem: in the original problem the lifting variables are not independent (e.g. y20 = y10², y11 = y10·y01). The new problem is linear, in particular convex.

Goal, more specifically
Linear problem (obtained by lifting) + “relations constraints” on the lifting variables → relaxation → SDP

Question: how do we guarantee that the relations between lifting variables hold?

LMI relaxation – step 2
Apply lifting and get a linear problem in the y variables. Take m(x) = (1, x1, x2)ᵀ, the basis of the degree-1 polynomials. Note that M = m(x)m(x)ᵀ ⪰ 0 and rank M = 1, because zᵀMz = (m(x)ᵀz)² ≥ 0 for every z.

Lifting the entries of M = m(x)m(x)ᵀ gives a matrix M(y) in the lifting variables. If the relations constraints hold, then M(y) ⪰ 0 and rank M(y) = 1, because we can decompose M(y) as m(x)m(x)ᵀ.

We have seen: relations constraints hold ⇒ M(y) ⪰ 0 and rank M(y) = 1.
What about the opposite direction: M(y) ⪰ 0 and rank M(y) = 1 ⇒ relations constraints hold? This is true as well.

LMI relaxation – step 2, continued
By the following: if M(y) ⪰ 0 and rank M(y) = 1, then M(y) = vvᵀ with v = (1, v1, v2)ᵀ, and all the relations equalities are contained in the set of equalities M(y) = vvᵀ, so the relations constraints hold.

Conclusion of the analysis
The “y feasible set” cut out by M(y) ⪰ 0 and rank M(y) = 1 is exactly the lifted image of the original feasible set: on it, the relations constraints hold.

Relaxation, at last
The original problem is equivalent to the lifted linear problem together with the additional constraints M1(y) ⪰ 0 and rank M1(y) = 1, where M1(y) denotes the moment matrix of order 1. We relax by dropping the non-convex rank constraint.
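Concretely, in the R² case the order-1 moment matrix built from the lifting variables is:

```latex
M_1(y) =
\begin{pmatrix}
1      & y_{10} & y_{01} \\
y_{10} & y_{20} & y_{11} \\
y_{01} & y_{11} & y_{02}
\end{pmatrix} \succeq 0 .
```

The dropped rank-1 constraint says exactly that M1(y) = m(x)m(x)ᵀ for some x, i.e. that the y variables are genuine monomials.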

LMI relaxation of order 1
[Figure: the feasible set of the order-1 relaxation]

Rank constrained LMI vs. unconstrained

LMI relaxations of higher order
It turns out that we can do better: applying LMI relaxations of higher order gives a tighter SDP. Relaxations of higher order incorporate the inequality constraints as LMIs. We show the relaxation of order 2; it is possible to continue and apply higher-order relaxations, and the theory guarantees convergence to the global optimum.

LMI relaxations of second order
Let m2(x) = (1, x1, x2, x1², x1x2, x2²)ᵀ be a basis of the polynomials of degree ≤ 2. Again, lifting M = m2(x)m2(x)ᵀ entrywise gives the moment matrix M2(y) of order 2, with M2(y) ⪰ 0 and rank M2(y) = 1. Again, we will relax by dropping the rank constraint.

Inequality constraints to LMI
Replace the scalar constraints by LMIs and get a tighter relaxation. A linear (scalar) constraint g(x) ≥ 0 is lifted to an LMI constraint on the localizing matrix M1(g y), obtained by multiplying g(x) by the entries of m(x)m(x)ᵀ and lifting every entry.
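For example (an illustrative constraint, since the slide's original one is not preserved): for g(x) = 1 − x1² − x2² ≥ 0, lifting g(x)·m(x)m(x)ᵀ entrywise gives the localizing-matrix LMI

```latex
M_1(g\,y) =
\begin{pmatrix}
1 - y_{20} - y_{02}      & y_{10} - y_{30} - y_{12} & y_{01} - y_{21} - y_{03} \\
y_{10} - y_{30} - y_{12} & y_{20} - y_{40} - y_{22} & y_{11} - y_{31} - y_{13} \\
y_{01} - y_{21} - y_{03} & y_{11} - y_{31} - y_{13} & y_{02} - y_{22} - y_{04}
\end{pmatrix} \succeq 0 .
```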

LMI relaxations of order 2
This procedure gives a new SDP. The second SDP's feasible set is included in the first SDP's feasible set. Similarly, we can continue and apply higher-order relaxations.

Theoretical basis for the LMI relaxations
If the feasible set defined by the constraints is compact, then under mild additional assumptions, Lasserre proved in 2001 an asymptotic convergence guarantee: f_k → f* as k → ∞, where f_k is the solution of the k'th relaxation and f* is the solution of the original problem (finding a maximum). Moreover, convergence is fast: f_k is very close to f* already for small k.
[Lasserre J.B. (2001). “Global optimization with polynomials and the problem of moments.” SIAM J. Optimization 11, pp. 796–817.]

Checking global optimality
The method provides a certificate of global optimality: a rank-one (low-rank) moment matrix at the minimizer implies that the SDP solution is the global optimum. An important experimental observation: the moment matrix at the SDP optimum is usually of low rank.

To steer the solver toward a low-rank moment matrix, we add to the objective function the trace of the moment matrix, weighted by a sufficiently small positive scalar.

Application: LMI relaxations in vision

outline
 Motivation and Introduction
 Background
   Positive SemiDefinite matrices (PSD)
   Linear Matrix Inequalities (LMI)
   SemiDefinite Programming (SDP)
 Relaxations
   Sum Of Squares (SOS) relaxation
   Linear Matrix Inequalities (LMI) relaxation
 Application in vision
   Finding optimal structure
   Partial relaxation and Schur's complement

Finding optimal structure
[Figure: a perspective camera, with camera center and image plane]
A perspective camera: P is the camera matrix and λ is the depth. The relation between a point U in 3D space and its image u in the image plane is given by λu = PU. Measured image points are corrupted by independent Gaussian noise, and we want to minimize the least-squares error between measured and projected points.

We therefore have the following optimization problem: minimize the sum of squared reprojection errors Σᵢ d(uᵢ, PᵢU)², where d(·,·) is the Euclidean distance and the set of unknowns consists of the 3D points. Each term in the cost function can be written as a ratio pᵢ(x)/qᵢ(x), where pᵢ and qᵢ are polynomials. Our objective is therefore to minimize a sum of rational functions.

How can we turn this into a polynomial optimization problem? Suppose that each term pᵢ(x)/qᵢ(x) in the cost has an upper bound λᵢ. Then our optimization problem is equivalent to: minimize Σᵢ λᵢ subject to pᵢ(x) ≤ λᵢqᵢ(x). This is a polynomial optimization problem, to which we apply LMI relaxations. Note that we introduced many new variables – one for each term.
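In symbols (the λᵢ are the new upper-bound variables; x collects the structure unknowns):

```latex
\min_{x,\;\lambda} \;\sum_{i=1}^{N} \lambda_i
\qquad \text{subject to} \qquad
p_i(x) \le \lambda_i\, q_i(x), \quad i = 1, \dots, N ,
```

which is polynomial in (x, λ), since each qᵢ is a polynomial that is positive on the domain of interest.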

Partial relaxations
Problem: an SDP with a large number of variables can be computationally demanding. A large number of variables can arise from: LMI relaxations of high order; the introduction of new variables, as we have seen. This is where partial relaxations come in. For that we introduce the Schur complement.

Schur complement
Set M = [A B; Bᵀ C], with A symmetric and invertible and C symmetric. The Schur complement of A in M is C − BᵀA⁻¹B.

Schur complement – applying
Lemma: if A ≻ 0, then M = [A B; Bᵀ C] ⪰ 0 if and only if the Schur complement satisfies C − BᵀA⁻¹B ⪰ 0. Derivation of the right side: the congruence [I 0; −BᵀA⁻¹ I] M [I −A⁻¹B; 0 I] = diag(A, C − BᵀA⁻¹B) preserves semidefiniteness.
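A quick numerical sanity check of the lemma (illustrative NumPy code with random data):

```python
import numpy as np

rng = np.random.default_rng(0)

def min_eig(M):
    return np.linalg.eigvalsh(M).min()

A = rng.standard_normal((3, 3)); A = A @ A.T + 3 * np.eye(3)  # A > 0
B = rng.standard_normal((3, 2))
C = rng.standard_normal((2, 2)); C = C @ C.T                  # C >= 0

M = np.block([[A, B], [B.T, C]])
S = C - B.T @ np.linalg.solve(A, B)   # Schur complement of A in M

# By the lemma, M is PSD exactly when S is PSD: the two flags agree.
print(min_eig(M) >= -1e-10, min_eig(S) >= -1e-10)
```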

Partial relaxations
The Schur complement allows us to state the optimization problem so that the only non-linearity is due to the structure variables. We can apply LMI relaxations only to those variables and leave the λᵢ as they are. If we were to apply full relaxations to all variables, the problem would become intractable already for small N.

Partial relaxations
Disadvantage of partial relaxations: we are not able to ensure asymptotic convergence to the global optimum. However, we have a numerical certificate of global optimality just as in the case of full relaxations: if the moment matrix of the relaxed variables is of rank one, the solution of the partially relaxed problem is the global optimum.

Full relaxation vs. partial relaxation
Application: triangulation with 3 cameras. Goal: find the optimal 3D point. The camera matrices are known; the measured point is assumed to lie at the origin of each view.

Summary
Geometric vision problems as POPs: the triangulation and reconstruction problems.
Relaxations of POPs:
– Sum Of Squares (SOS) relaxation: guarantees a bound on the optimal solution; usually the solution is optimal.
– Linear Matrix Inequalities (LMI) relaxation: first-order relaxation via lifting and dropping the rank constraint; higher-order relaxations turn linear constraints into LMIs; guarantee of convergence (Lasserre); certificate of global optimality.
Application in vision: finding optimal structure; partial relaxation and the Schur complement; the triangulation problem and the benefit of partial relaxations.

References
 F. Kahl and D. Henrion. Globally Optimal Estimates for Geometric Reconstruction Problems. Accepted, IJCV.
 H. Waki, S. Kim, M. Kojima, and M. Muramatsu. Sums of squares and semidefinite programming relaxations for polynomial optimization problems with structured sparsity. SIAM J. Optimization, 17(1):218–242, 2006.
 J. B. Lasserre. Global optimization with polynomials and the problem of moments. SIAM J. Optimization, 11:796–817, 2001.
 S. Boyd and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
 R. I. Hartley and A. Zisserman. Multiple View Geometry in Computer Vision. Cambridge University Press, second edition, 2004.