
1
Computer Vision: Multiple View Geometry
Marc Pollefeys, COMP 256

2
Last class: Gaussian pyramid, Laplacian pyramid, Gabor filters, Fourier transform

3
Histograms: the co-occurrence matrix (not covered last class…)

4
Texture synthesis [Zalesny & Van Gool 2000]: results after 2, 6 and 9 analysis iterations

5
View-dependent texture synthesis [Zalesny & Van Gool 2000]

6
Efros & Leung '99: synthesizing a pixel by non-parametric sampling from the input image. Assuming the Markov property, compute P(p | N(p)):
- Building explicit probability tables is infeasible.
- Instead, search the input image for all neighborhoods similar to N(p); that set of matches is our histogram for p.
- To synthesize p, just pick one match at random.
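The per-pixel sampling above can be written as a short sketch (a minimal illustration with assumed names such as `synthesize_pixel`, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)

def synthesize_pixel(input_img, neighborhood, known, eps=0.1):
    """Sample a value for pixel p given its partially known neighborhood N(p).

    neighborhood: (k, k) window centered on p; unknown entries are arbitrary.
    known: (k, k) boolean mask, True where the neighborhood is already filled.
    """
    k = neighborhood.shape[0]
    H, W = input_img.shape
    centers, dists = [], []
    for i in range(H - k + 1):
        for j in range(W - k + 1):
            win = input_img[i:i + k, j:j + k]
            # SSD against the known part of N(p) only
            dists.append(np.sum(((win - neighborhood) ** 2)[known]))
            centers.append(win[k // 2, k // 2])
    dists = np.asarray(dists)
    # all close-enough matches form the empirical histogram for P(p | N(p));
    # synthesizing p means drawing one of them at random
    good = dists <= dists.min() * (1 + eps) + 1e-12
    return rng.choice(np.asarray(centers)[good])
```

In the full algorithm this step is repeated pixel by pixel, growing the synthesized region outward from a seed.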

7
Efros & Leung '99 extended: synthesizing a block. Observation: neighboring pixels are highly correlated. Idea: make the unit of synthesis a block B. The scheme is exactly the same, but now we want P(B | N(B)). Much faster: synthesize all pixels in a block at once. Not the same as multi-scale!

8
Placing blocks from the input texture: random placement of blocks; neighboring blocks B1, B2 constrained by their overlap; minimal error boundary cut.

9
Minimal error boundary: take the squared difference of the two overlapping blocks over the overlap region to get the overlap error, then cut along the vertical boundary of minimal cumulative error.
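The minimal error boundary can be found with dynamic programming over the overlap error (an illustrative sketch; the function name is an assumption):

```python
import numpy as np

def min_error_boundary(overlap_err):
    """overlap_err: (H, W) per-pixel squared error in the overlap region.
    Returns, for each row, the column index of the minimal-cost vertical cut."""
    H, W = overlap_err.shape
    cost = overlap_err.astype(float).copy()
    # cumulative cost: each pixel may connect to the 3 pixels above it
    for i in range(1, H):
        for j in range(W):
            lo, hi = max(j - 1, 0), min(j + 2, W)
            cost[i, j] += cost[i - 1, lo:hi].min()
    # backtrack from the cheapest cell in the bottom row
    seam = np.empty(H, dtype=int)
    seam[-1] = int(np.argmin(cost[-1]))
    for i in range(H - 2, -1, -1):
        j = seam[i + 1]
        lo, hi = max(j - 1, 0), min(j + 2, W)
        seam[i] = lo + int(np.argmin(cost[i, lo:hi]))
    return seam
```

Pixels left of the seam are then taken from one block and pixels right of it from the other, hiding the block boundary.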

10-12
(image-only slides)

13
Perpendicular textures [Leung & Malik, CVPR '97]: why do we see more flowers in the distance?

14
Shape-from-texture

15
Tentative class schedule:
Jan 16/18: Introduction
Jan 23/25: Cameras / Radiometry
Jan 30/Feb 1: Sources & Shadows / Color
Feb 6/8: Linear filters & edges / Texture
Feb 13/15: Multi-View Geometry / Stereo
Feb 20/22: Optical flow / Project proposals
Feb 27/Mar 1: Affine SfM / Projective SfM
Mar 6/8: Camera Calibration / Silhouettes and Photoconsistency
Mar 13/15: Spring break
Mar 20/22: Segmentation / Fitting
Mar 27/29: Prob. Segmentation / Project Update
Apr 3/5: Tracking
Apr 10/12: Object Recognition
Apr 17/19: Range data
Apr 24/26: Final project

16
THE GEOMETRY OF MULTIPLE VIEWS (Reading: Chapter 10)
Epipolar geometry; the essential matrix; the fundamental matrix; the trifocal tensor; the quadrifocal tensor

17
Epipolar geometry: epipolar plane, epipoles, epipolar lines, baseline

18
Epipolar constraint: potential matches for p have to lie on the corresponding epipolar line l'; potential matches for p' have to lie on the corresponding epipolar line l.

19
Epipolar constraint, calibrated case: for normalized image coordinates, p'^T E p = 0, where E = [t]_x R is the essential matrix (Longuet-Higgins, 1981).
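The calibrated constraint is easy to check numerically (the pose and the 3D point below are assumed example values, not from the slides):

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v]_x with [v]_x @ w == np.cross(v, w)."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

# assumed example pose: small rotation about the y axis plus a translation
th = 0.1
R = np.array([[np.cos(th), 0, np.sin(th)],
              [0, 1, 0],
              [-np.sin(th), 0, np.cos(th)]])
t = np.array([1.0, 0.2, 0.0])
E = skew(t) @ R                      # essential matrix E = [t]_x R

X = np.array([0.3, -0.4, 5.0])       # a 3D point in the first camera frame
p1 = X / X[2]                        # normalized coordinates in view 1
X2 = R @ X + t                       # the same point in the second camera frame
p2 = X2 / X2[2]                      # normalized coordinates in view 2
residual = p2 @ E @ p1               # epipolar constraint: vanishes up to round-off

sv = np.linalg.svd(E, compute_uv=False)  # two equal non-zero values, one zero
```

The singular-value check previews the property on the next slide: E built this way always has two equal non-zero singular values.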

20
Properties of the essential matrix:
- E p is the epipolar line associated with p, and E^T p' is the epipolar line associated with p'.
- E e = 0 and E^T e' = 0.
- E is singular.
- E has two equal non-zero singular values (Huang and Faugeras, 1989).

21
Epipolar constraint for small motions, to first order. For pure translation, the epipolar lines radiate from the focus of expansion.

22
Epipolar constraint, uncalibrated case: in pixel coordinates, p'^T F p = 0, where F = K'^{-T} E K^{-1} is the fundamental matrix (Faugeras and Luong, 1992).

23
Properties of the fundamental matrix:
- F p is the epipolar line associated with p, and F^T p' is the epipolar line associated with p'.
- F e = 0 and F^T e' = 0.
- F is singular.
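These properties can be verified numerically for a fundamental matrix built from assumed example intrinsics and pose (shared K for both views; all values are illustrative):

```python
import numpy as np

def skew(v):
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

# assumed example intrinsics and pose
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
th = 0.1
R = np.array([[np.cos(th), 0, np.sin(th)],
              [0, 1, 0],
              [-np.sin(th), 0, np.cos(th)]])
t = np.array([1.0, 0.2, 0.0])

E = skew(t) @ R
Kinv = np.linalg.inv(K)
F = Kinv.T @ E @ Kinv                # fundamental matrix for pixel coordinates

X = np.array([0.3, -0.4, 5.0])
p1 = K @ (X / X[2])                  # pixel coordinates in image 1
X2 = R @ X + t
p2 = K @ (X2 / X2[2])                # pixel coordinates in image 2
l2 = F @ p1                          # epipolar line of p1 in image 2
residual = p2 @ l2                   # p2 lies on that line

# the epipole e spans the null space of F (F e = 0)
_, _, Vt = np.linalg.svd(F)
e = Vt[-1]
```

Since F is rank 2, its null vector is well defined and gives the epipole directly.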

24
The eight-point algorithm (Longuet-Higgins, 1981): minimize the sum of squared algebraic residuals (p'_i^T F p_i)^2 under the constraint ||F||^2 = 1.
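The constrained minimization above is a linear least-squares problem solved by the SVD; a sketch on synthetic noiseless data (the function name `eight_point` is illustrative):

```python
import numpy as np

def eight_point(p1s, p2s):
    """Estimate F from N >= 8 correspondences.

    p1s, p2s: (N, 3) homogeneous points with p2^T F p1 = 0 for each pair.
    """
    # each correspondence gives one linear equation in the 9 entries of F:
    # (p2 kron p1) . vec(F) = 0, with vec taken row-major
    A = np.stack([np.kron(p2, p1) for p1, p2 in zip(p1s, p2s)])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)         # unit-norm minimizer of ||A vec(F)||
    # enforce the rank-2 constraint by zeroing the smallest singular value
    U, s, Vt = np.linalg.svd(F)
    s[2] = 0.0
    return U @ np.diag(s) @ Vt
```

The last singular vector of A is exactly the unit-norm F with the smallest residual, which is what the slide's constraint ||F||^2 = 1 picks out.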

25
Non-linear least-squares approach (Luong et al., 1993): minimize the sum of squared distances to the epipolar lines with respect to the coefficients of F, using an appropriate rank-2 parameterization.

26
Problem with the eight-point algorithm: linear least squares returns the unit-norm vector F yielding the smallest residual. What happens when there is noise?

27
The normalized eight-point algorithm (Hartley, 1995):
- Center the image data at the origin, and scale it so the mean squared distance between the origin and the data points is 2 pixels: q_i = T p_i, q'_i = T' p'_i.
- Use the eight-point algorithm to compute F from the points q_i and q'_i.
- Enforce the rank-2 constraint.
- Output T'^T F T.
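The centering-and-scaling similarity T can be sketched as follows (an illustrative implementation of the first step above):

```python
import numpy as np

def normalizing_transform(pts):
    """Similarity T that centers pts at the origin and scales them so the
    mean squared distance to the origin is 2 (pts: (N, 2) pixel coords)."""
    c = pts.mean(axis=0)
    msd = ((pts - c) ** 2).sum(axis=1).mean()   # mean squared distance to centroid
    s = np.sqrt(2.0 / msd)
    return np.array([[s, 0, -s * c[0]],
                     [0, s, -s * c[1]],
                     [0, 0, 1]])
```

The same construction is applied independently to each image; after estimating F from the normalized points and enforcing rank 2, the result is de-normalized as T'^T F T.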

28
Epipolar geometry example

29
Example: converging cameras (courtesy of Andrew Zisserman)

30
Example: motion parallel with the image plane (the simple case for stereo rectification; courtesy of Andrew Zisserman)

31
Example: forward motion; the epipole e appears at the same location in both images (courtesy of Andrew Zisserman)

32
Fundamental matrix for pure translation: the configuration is auto-epipolar (courtesy of Andrew Zisserman)

33
Fundamental matrix for pure translation (continued; courtesy of Andrew Zisserman)
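For pure translation (R = I, shared intrinsics K) the fundamental matrix is, up to scale, the skew matrix of the epipole, F = [e]_x; a small numerical check with assumed example values:

```python
import numpy as np

def skew(v):
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
t = np.array([0.5, 0.0, 0.1])        # pure translation, R = I
e = K @ t                            # epipole, identical in both images
F = skew(e)                          # fundamental matrix, up to scale

X = np.array([0.2, -0.1, 4.0])
p1 = K @ (X / X[2])
X2 = X + t
p2 = K @ (X2 / X2[2])
res = p2 @ F @ p1                    # epipolar constraint holds
through_e = e @ (F @ p1)             # every epipolar line passes through e
```

This is the auto-epipolar configuration of the preceding slides: corresponding epipolar lines coincide in the two images.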

34
Trinocular epipolar constraints: these constraints are not independent!

35
Trinocular epipolar constraints, transfer: given p_1 and p_2, p_3 can be computed as the solution of linear equations.

36
Trinocular epipolar constraints, transfer: epipolar transfer breaks down in the trifocal plane (image from Hartley and Zisserman). There must be more to trifocal geometry…

37
Trifocal constraints

38
Trifocal constraints, calibrated case: all 3x3 minors must be zero! This yields the trifocal tensor.

39
Trifocal constraints, uncalibrated case: the trifocal tensor.

40
Trifocal constraints, 3 points: pick any two lines l_2 and l_3 through p_2 and p_3, so that T(p_1, p_2, p_3) = 0; then do it again.

41
Properties of the trifocal tensor:
- For any matching epipolar lines, l_2^T G_1^i l_3 = 0.
- The matrices G_1^i are singular.
- They satisfy 8 independent constraints in the uncalibrated case (Faugeras and Mourrain, 1995).
Estimating the trifocal tensor: ignore the non-linear constraints and use linear least squares, then impose the constraints a posteriori.

42
For any matching epipolar lines, l_2^T G_1^i l_3 = 0: the backprojections of the two lines do not define a line!

43
Trifocal tensor example (courtesy of Andrew Zisserman): putative matches (0.43); 18 outliers; 88 inliers (0.23); 95 final inliers after 26 samples (0.19).

44
Trifocal tensor example: additional line matches (images courtesy of Andrew Zisserman)

45
Transfer: trifocal transfer (using tensor notation) does not work if l' is an epipolar line (image courtesy of Hartley and Zisserman).

46
Image warping using T(1, 2, N) (Avidan and Shashua '97)

47
Multiple views (Faugeras and Mourrain, 1995)

48
Two views: the epipolar constraint

49
Three views: the trifocal constraint

50
Four views: the quadrifocal constraint (Triggs, 1995)

51
Geometrically, the four rays must intersect in P.

52
Quadrifocal tensor and lines

53
Quadrifocal tensor: the determinant is multilinear, and thus linear in the coefficients of the lines! There must exist a tensor with 81 coefficients containing all possible combinations of the x, y, w coefficients for all 4 images: the quadrifocal tensor.

54
Scale-restraint condition from photogrammetry

55
From perspective to omnidirectional cameras:
- perspective camera: 2 constraints per feature;
- radial camera (uncalibrated): 1 constraint per feature, the radial line l = (y, -x) through the image point (x, y) and the center (0, 0).
3 constraints allow us to reconstruct a 3D point; more constraints also tell us something about the cameras. These multilinear constraints are known as the epipolar, trifocal and quadrifocal constraints.

56
Radial quadrifocal tensor: linearly compute the radial quadrifocal tensor Q_ijkl (a 2x2x2x2 tensor) from 15 points in 4 views, then reconstruct the 3D scene and use it for calibration. This is not easy for real data, and it is hard to avoid degenerate cases (e.g. 3 optical axes intersecting in a single point). However, the degenerate case leads to a simpler 3-view algorithm for pure rotation: compute the radial trifocal tensor T_ijk (a 2x2x2 tensor) from 7 points in 3 views, then reconstruct a 2D panorama and use it for calibration.

57
Non-parametric distortion calibration (Thirthala and Pollefeys, ICCV '05): angle as a function of normalized radius. Models fish-eye lenses, catadioptric systems, etc.

58
Non-parametric distortion calibration (Thirthala and Pollefeys, ICCV '05): angle as a function of normalized radius, here up to 90°. Models fish-eye lenses, catadioptric systems, etc.

59
Next class: stereo (F&P Chapter 11). Given images I(x, y) and I'(x', y'), compute the disparity map D(x, y) with (x', y') = (x + D(x, y), y).
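As a preview, the disparity D(x, y) at one pixel can be estimated by block matching along the same scanline (a toy sketch with assumed names, not F&P's algorithm):

```python
import numpy as np

def block_match(I, Ip, x, y, half=2, max_d=8):
    """Find the d minimizing the SSD between the (2*half+1)^2 window around
    (x, y) in image I and the window around (x + d, y) in image Ip."""
    win = I[y - half:y + half + 1, x - half:x + half + 1]
    best_d, best_ssd = 0, np.inf
    for d in range(max_d + 1):
        cand = Ip[y - half:y + half + 1, x + d - half:x + d + half + 1]
        ssd = ((win - cand) ** 2).sum()
        if ssd < best_ssd:
            best_d, best_ssd = d, ssd
    return best_d
```

Searching only along the scanline is exactly what the epipolar geometry of a rectified stereo pair permits: the match of (x, y) must lie on the line y' = y.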
