1 Non-Linear Least Squares and Sparse Matrix Techniques: Fundamentals
Richard Szeliski, Microsoft Research
UW-MSR Course on Vision Algorithms, CSE/EE 577, 590CV, Spring 2004

2 Readings
- Press et al., Numerical Recipes, Chapter 15 (Modeling of Data)
- Nocedal and Wright, Numerical Optimization, Chapter 10 (Nonlinear Least-Squares Problems, pp. 250-273)
- Shewchuk, J. R., An Introduction to the Conjugate Gradient Method Without the Agonizing Pain
- Bathe and Wilson, Numerical Methods in Finite Element Analysis, pp. 695-717 (sec. 8.1-8.2) and pp. 979-987 (sec. 12.2)
- Golub and Van Loan, Matrix Computations, Chapters 4, 5, 10
- Nocedal and Wright, Numerical Optimization, Chapters 4 and 5
- Triggs et al., Bundle Adjustment – A Modern Synthesis, Workshop on Vision Algorithms, 1999

3 Outline
Nonlinear least squares:
- simple application (motivation)
- linear (approximate) solution and least squares
- normal equations and the pseudoinverse
- LDL^T, QR, and SVD decompositions
- correct linearization and Jacobians
- iterative solution, Levenberg-Marquardt
- robust measurements

4 Outline
Sparse matrix techniques:
- simple application (structure from motion)
- sparse matrix storage (skyline)
- direct solution: LDL^T with minimal fill-in
- larger application (surface/image fitting)
- iterative solution: gradient descent, conjugate gradient, preconditioning

5 Non-linear Least Squares

6 Triangulation – a simple example
Problem: given some image points {(u_j, v_j)} in correspondence across two or more images (taken from calibrated cameras c_j), compute the 3D location X.
[Figure: cameras c_j observing 3D point X at image locations (u_j, v_j)]

7 Image formation equations
Perspective projection of a camera-space point (X_c, Y_c, Z_c) onto the image plane with focal length f: u_c = f X_c / Z_c.
[Figure: side view of the projection geometry]

8 Simplified model
Let R = I (known rotation), f = 1, Y = v_j = 0 (flatland).
Each camera at (x_j, z_j) then measures u_j = (X - x_j) / (Z - z_j).
How do we solve this set of equations (constraints) to find the best (X, Z)?
[Figure: flatland cameras at (x_j, z_j) viewing point X at image coordinates u_j]

9 "Linearized" model
Bring the denominator over to the LHS: u_j (Z - z_j) = X - x_j, i.e. X - u_j Z = x_j - u_j z_j.
(This measures the horizontal distance to each line equation.)
How do we solve this set of equations (constraints)?

10 Linear regression
Overconstrained set of linear equations J x = r, where J_{j0} = 1, J_{j1} = -u_j is the Jacobian and r_j = x_j - u_j z_j is the residual.

11 Normal equations
How do we solve J x = r? Least squares: argmin_x ||J x - r||^2.
E = ||J x - r||^2 = (J x - r)^T (J x - r) = x^T J^T J x - 2 x^T J^T r + r^T r
dE/dx = 2 (J^T J) x - 2 J^T r = 0
(J^T J) x = J^T r: the normal equations A x = b (A plays the role of the Hessian)
x = [(J^T J)^-1 J^T] r: the pseudoinverse solution
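
To make the recipe concrete, here is a minimal numpy sketch; the camera positions, true point, and noise-free measurements below are invented for illustration:

```python
import numpy as np

# Hypothetical flatland setup: cameras at (x_j, z_j) observing (X, Z) = (2, 10).
x_j = np.array([0.0, 1.0, -1.0])
z_j = np.array([0.0, 0.5, 1.0])
X_true, Z_true = 2.0, 10.0
u = (X_true - x_j) / (Z_true - z_j)      # noise-free image measurements

# Linearized system J [X, Z]^T = r, rows (1, -u_j), residuals r_j = x_j - u_j z_j.
J = np.column_stack([np.ones_like(u), -u])
r = x_j - u * z_j

# Normal equations (J^T J) x = J^T r, i.e. the pseudoinverse solution.
x = np.linalg.solve(J.T @ J, J.T @ r)
print(x)                                 # -> approximately [2., 10.]
```

With noise-free data the normal equations recover (X, Z) exactly; with real data, J^T J must be well conditioned for this route to be safe (see the QR and SVD alternatives below).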

12 LDL^T factorization
Factor A = L D L^T, where L is lower triangular with 1s on the diagonal and D is diagonal.
How? L is formed from the columns of Gaussian elimination.
Then perform (similar) forward and backward elimination/substitution:
L D L^T x = b, D L^T x = L^-1 b, L^T x = D^-1 L^-1 b, x = L^-T D^-1 L^-1 b

13-23 LDL^T factorization – details
(A sequence of equation slides stepping through the factorization; the derivations were not captured in this transcript.)
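
The recurrences behind those detail slides can be written out directly. This is a compact sketch for clarity, not performance; generic solves stand in for the triangular substitutions:

```python
import numpy as np

def ldlt(A):
    """Factor symmetric positive-definite A into L D L^T, with L unit
    lower triangular and D diagonal (stored as a vector d)."""
    n = A.shape[0]
    L = np.eye(n)
    d = np.zeros(n)
    for j in range(n):
        d[j] = A[j, j] - L[j, :j] ** 2 @ d[:j]
        for i in range(j + 1, n):
            L[i, j] = (A[i, j] - L[i, :j] * L[j, :j] @ d[:j]) / d[j]
    return L, d

def ldlt_solve(L, d, b):
    """Solve L D L^T x = b: forward substitution, diagonal scaling,
    then back substitution."""
    y = np.linalg.solve(L, b)        # L y = b
    z = y / d                        # D z = y
    return np.linalg.solve(L.T, z)   # L^T x = z
```

As a quick check, A = J.T @ J and b = J.T @ r from the triangulation sketch give the same x as np.linalg.solve(A, b).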

24 LDL^T and Cholesky
Variant: Cholesky, A = G G^T, where G = L D^{1/2} (involves scalar square roots).
Advantage: more stable than Gaussian elimination.
Disadvantage: less stable than QR, since forming J^T J squares the condition number.
Complexity: (m + n/3) n^2 flops.
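
In practice one calls a library routine; here is a sketch using scipy's Cholesky helpers on an illustrative system (the numbers are arbitrary):

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

# Illustrative over-determined system.
J = np.array([[1.0, -0.2], [1.0, -0.105], [1.0, -0.333]])
r = np.array([0.0, -0.05, 0.33])
A, b = J.T @ J, J.T @ r

c, low = cho_factor(A)         # A = G G^T (factor stored in c)
x = cho_solve((c, low), b)     # forward + back substitution
```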

25 QR decomposition
Alternative solution for J x = r: find an orthogonal matrix Q s.t. J = Q R, where R is upper triangular.
Q R x = r, so R x = Q^T r: solve for x using back substitution.
Q is usually computed using Householder matrices, Q = Q_1 ... Q_m, Q_j = I - β v_j v_j^T.
Advantage: better sensitivity / condition number (no squaring).
Complexity: 2 n^2 (m - n/3) flops.
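
A sketch using numpy's Householder-based QR on an illustrative system (random J and r, assumed well conditioned):

```python
import numpy as np
from scipy.linalg import solve_triangular

rng = np.random.default_rng(0)
J = rng.standard_normal((6, 3))     # over-determined Jacobian (illustrative)
r = rng.standard_normal(6)

Q, R = np.linalg.qr(J)              # reduced QR: Q is 6x3, R is 3x3 upper triangular
x = solve_triangular(R, Q.T @ r)    # back substitution for R x = Q^T r

# Same minimizer as the normal equations, without squaring the condition number.
assert np.allclose(x, np.linalg.lstsq(J, r, rcond=None)[0])
```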

26 SVD
Most stable way to solve the system J x = r.
J = U Σ V^T, where U and V are orthogonal and Σ is diagonal (the singular values).
Advantage: most stable (handles very ill-conditioned problems).
Disadvantage: slowest (iterative algorithm).
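
A sketch of the SVD solve, including the optional truncation of tiny singular values (the tolerance choice is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((6, 3))
r = rng.standard_normal(6)

U, s, Vt = np.linalg.svd(J, full_matrices=False)   # J = U diag(s) V^T

# Pseudoinverse solution; truncating tiny singular values regularizes
# very ill-conditioned systems.
tol = 1e-10 * s[0]
s_inv = np.where(s > tol, 1.0 / s, 0.0)
x = Vt.T @ (s_inv * (U.T @ r))
```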

27 "Linearized" model – revisited
Does the "linearized" model, which measures the horizontal distance to each line, give the optimal estimate? No!

28 Properly weighted model
We want to minimize errors in the measured quantities.
Closer cameras (smaller denominators) have more weight / influence.
Weight each "linearized" equation by the current denominator?
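
One reading of that suggestion, sketched on the earlier synthetic data: each linearized residual equals (Z - z_j) times the corresponding image-plane error, so dividing row j by the current denominator (Z - z_j) re-expresses the problem in image-plane errors, and the re-solve can be iterated:

```python
import numpy as np

# Same invented flatland data as the earlier sketch.
x_j = np.array([0.0, 1.0, -1.0]); z_j = np.array([0.0, 0.5, 1.0])
u = (2.0 - x_j) / (10.0 - z_j)
J = np.column_stack([np.ones_like(u), -u])
r = x_j - u * z_j

X, Z = np.linalg.solve(J.T @ J, J.T @ r)     # unweighted linearized estimate
for _ in range(5):                           # reweight by current denominators
    w = 1.0 / (Z - z_j)                      # converts residuals to image-plane errors
    Jw, rw = J * w[:, None], r * w
    X, Z = np.linalg.solve(Jw.T @ Jw, Jw.T @ rw)
```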

29 Optimal estimation
Feature measurement equations: u_j = û_j + n_j, with û_j = (X - x_j)/(Z - z_j) and Gaussian noise n_j.
Likelihood of (X, Z) given {u_j, x_j, z_j}: L = Π_j exp(-(u_j - û_j)^2 / (2σ^2))

30 Non-linear least squares
Negative log likelihood of (X, Z) given {u_j, x_j, z_j}: E = Σ_j (u_j - û_j)^2
How do we minimize E? Non-linear regression (least squares), because the û_j are non-linear functions of the unknowns (X, Z).

31 Levenberg-Marquardt
Iterative non-linear least squares:
- Linearize the measurement equations: û_j(X + ΔX, Z + ΔZ) ≈ û_j + (∂û_j/∂X) ΔX + (∂û_j/∂Z) ΔZ
- Substitute into the log-likelihood equation: a quadratic cost function in (ΔX, ΔZ)

32 Levenberg-Marquardt
Linear regression (sub-)problem in (ΔX, ΔZ), with residuals u_j - û_j evaluated at the current estimate.
Similar to weighted regression, but not quite.

33 Levenberg-Marquardt – what if it doesn't converge?
- Multiply the diagonal by (1 + λ) and increase λ until it does
- Halve the step size (my favorite)
- Use line search
- Other trust-region methods [Nocedal & Wright]
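
Putting the pieces together, a minimal LM sketch for the flatland triangulation, using the multiply-the-diagonal-by-(1 + λ) rule from the first bullet; the initial damping and the factor-of-10 updates are conventional choices, not prescribed by the slides:

```python
import numpy as np

def levenberg_marquardt(u, x_j, z_j, X, Z, lam=1e-3, n_iter=20):
    """Sketch of LM for minimizing E = sum_j (u_j - (X - x_j)/(Z - z_j))^2."""
    for _ in range(n_iter):
        denom = Z - z_j
        res = u - (X - x_j) / denom                    # current residuals
        J = np.column_stack([1.0 / denom,              # d u_hat / dX
                             -(X - x_j) / denom**2])   # d u_hat / dZ
        A, b = J.T @ J, J.T @ res
        while lam < 1e10:
            # Augmented normal equations: diagonal multiplied by (1 + lambda).
            dX, dZ = np.linalg.solve(A + lam * np.diag(np.diag(A)), b)
            new_res = u - (X + dX - x_j) / (Z + dZ - z_j)
            if new_res @ new_res < res @ res:          # error decreased: accept
                X, Z, lam = X + dX, Z + dZ, lam * 0.1
                break
            lam *= 10.0                                # reject: damp more, retry
    return X, Z
```

With the synthetic data from the earlier sketches, a call like levenberg_marquardt(u, x_j, z_j, X=0.0, Z=5.0) is reasonable; the initial Z must keep the point in front of all cameras so the denominators stay positive.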

34 Levenberg-Marquardt – other issues
- Uncertainty analysis: covariance Σ = A^-1
- Is maximum likelihood the best idea?
- How do we start in the vicinity of the global minimum?
- What about outliers?

35 Robust regression
Data often have outliers (bad measurements).
Use a robust penalty applied to each set of joint measurements [Black & Rangarajan, IJCV'96].
For extremely bad data, use random sampling [RANSAC, Fischler & Bolles, CACM'81].
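
As one concrete robustification (the Huber penalty is a common choice, picked here for illustration; it is not the specific penalty from the slide), iteratively reweighted least squares down-weights large residuals:

```python
import numpy as np

def huber_weights(res, k=1.345):
    """Huber weights: 1 for small residuals, k/|r| in the tails, so
    outliers are down-weighted rather than dominating the quadratic fit."""
    a = np.abs(res)
    return np.where(a <= k, 1.0, k / a)

def irls(J, r, n_iter=10):
    """Iteratively reweighted least squares with a Huber penalty."""
    x = np.linalg.solve(J.T @ J, J.T @ r)        # ordinary least-squares start
    for _ in range(n_iter):
        sw = np.sqrt(huber_weights(r - J @ x))   # sqrt weights scale the rows
        Jw, rw = J * sw[:, None], r * sw
        x = np.linalg.solve(Jw.T @ Jw, Jw.T @ rw)
    return x
```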

36 Sparse Matrix Techniques: Direct methods

37 Structure from motion
Given many points in correspondence across several images, {(u_ij, v_ij)}, simultaneously compute the 3D locations X_i and the camera (or motion) parameters (K, R_j, t_j).
Two main variants: calibrated and uncalibrated (sometimes associated with Euclidean and projective reconstructions).

38 Bundle Adjustment
Simultaneous adjustment of bundles of rays (photogrammetry).
What makes this non-linear minimization hard?
- many more parameters: potentially slow
- poorer conditioning (high correlation)
- potentially lots of outliers
- gauge (coordinate) freedom

39 Simplified model
Again, R = I (known rotation), f = 1, Y = v_j = 0 (flatland).
This time, we have to solve for all of the parameters {(X_i, Z_i), (x_j, z_j)}: each camera j at (x_j, z_j) measures u_ij = (X_i - x_j)/(Z_i - z_j).
[Figure: flatland cameras at (x_j, z_j) observing points X_i at image coordinates u_ij]

40 Lots of parameters: sparsity
Only a few entries in the Jacobian are non-zero: each measurement u_ij involves only point i and camera j.

41 Sparse LDL^T / Cholesky
First used in finite element analysis [Bathe…].
Applied to SfM by [Szeliski & Kang 1994].
[Figure: block structure of the Hessian, with structure and motion blocks and the fill-in produced during elimination]
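
A toy sketch of the idea with scipy's sparse LU, which applies a COLAMD fill-reducing ordering by default; the arrowhead system below is invented to mimic the structure-plus-motion block pattern:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

# Toy SfM-like sparse normal equations: many structure parameters
# coupled to a few shared motion parameters (arrowhead shape).
n_pts, n_cam = 100, 3
D = sp.diags(np.full(n_pts, 4.0))                         # structure block (diagonal)
C = sp.random(n_pts, n_cam, density=0.2, random_state=0)  # structure-motion coupling
M = sp.eye(n_cam) * (10.0 + n_pts)                        # motion block
A = sp.bmat([[D, C], [C.T, M]], format="csc")
b = np.ones(n_pts + n_cam)

# splu permutes rows/columns to limit fill-in, the sparse analogue
# of the minimal-fill-in LDL^T ordering discussed above.
x = splu(A).solve(b)
```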

42 Skyline storage [Bathe & Wilson]

43 Sparse matrices – common shapes
Banded (tridiagonal), arrowhead, multi-banded: fill-in.
Computational complexity: O(n b^2) for bandwidth b.
Applications to computer vision:
- snakes (tridiagonal)
- surface interpolation (multi-banded)
- deformable models (sparse)

44 Sparse matrices – variable reordering
Triggs et al., Bundle Adjustment – A Modern Synthesis

45 Sparse Matrix Techniques: Iterative methods

46 Two-dimensional problems
Surface interpolation and Poisson blending

47 Poisson blending

48 Poisson blending → multi-banded (sparse) system

49 One-dimensional example
Simplified 1-D height/slope interpolation → tridiagonal system (generalized snakes)
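
A sketch of such a system, assuming a first-order smoothness penalty and a few data constraints (the particular energy and values are illustrative): E(f) = Σ_k w_k (f_k - d_k)^2 + s Σ_k (f_{k+1} - f_k)^2 has a tridiagonal Hessian, so the direct solve is O(n):

```python
import numpy as np
from scipy.linalg import solve_banded

n, s = 100, 10.0                     # grid size, smoothness weight
w = np.zeros(n); d = np.zeros(n)
w[[0, 40, 99]] = 1.0                 # three data constraints...
d[[0, 40, 99]] = [0.0, 2.0, 1.0]     # ...at these heights

diag = w + s * np.r_[1.0, 2.0 * np.ones(n - 2), 1.0]   # main diagonal of Hessian
off = -s * np.ones(n - 1)                              # sub/super diagonals

ab = np.zeros((3, n))                # banded storage expected by solve_banded
ab[0, 1:] = off; ab[1] = diag; ab[2, :-1] = off
f = solve_banded((1, 1), ab, w * d)  # heights minimizing E, in O(n) flops
```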

50 Direct solution of 2D problems
Multi-banded Hessian: fill-in.
Computational complexity for an n × m image: O(nm · m^2) … too slow!

51 Iterative techniques
- Gauss-Seidel and Jacobi
- Gradient descent
- Conjugate gradient
- Non-linear conjugate gradient
- Preconditioning
… see Shewchuk's TR

52 Conjugate gradient
… see Shewchuk's TR for the rest of the notes …
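
A plain CG sketch following Shewchuk's notes; A only needs to support matrix-vector products, so sparse matrices or LinearOperators work unchanged (no factorization, hence no fill-in):

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-8, max_iter=1000):
    """Conjugate gradient for symmetric positive-definite A."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x                          # residual
    d = r.copy()                           # first search direction
    delta = r @ r
    for _ in range(max_iter):
        if np.sqrt(delta) < tol:
            break
        q = A @ d
        alpha = delta / (d @ q)            # exact line search along d
        x += alpha * d
        r -= alpha * q
        delta_new = r @ r
        d = r + (delta_new / delta) * d    # A-conjugate to previous directions
        delta = delta_new
    return x
```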

53 Iterative vs. direct
Direct is better for:
- 1D problems and relatively sparse general structures
- SfM where #points >> #frames
Iterative is better for 2D problems and is more amenable to parallel (GPU?) implementation.
Preconditioning helps a lot (next lecture).

54 Monday's lecture (Applications)
- Preconditioning
- Hierarchical basis functions (wavelets)
- 2D applications: interpolation, shape-from-shading, HDR, Poisson blending, others (rotoscoping?)

55 Monday's lecture (Applications)
- Structure from motion
- Alternative parameterizations (object-centered)
- Conditioning and linearization problems
- Ambiguities and uncertainties
- New research: map correlation

