Objective: study the mathematical relations between corresponding image points u and u′ in two views of a 3-D scene. "Corresponding" means that both points originate from the same 3-D point.

Two-view geometry: Outline
Background: camera, projection models
Necessary tools: a taste of projective geometry
Two-view geometry:
  Planar scene (homography).
  Non-planar scene (epipolar geometry).
3D reconstruction (stereo).

Perspective Projection
The origin (0,0,0) is the focal center.
The X, Y axes (and the image x, y axes) are aligned with the image axes (width / height).
Z is depth: the distance along the optical axis.
f is the focal length.
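The pinhole projection this slide describes, written out (a standard formula, not spelled out in the transcript): a scene point (X, Y, Z) maps to the image point

\[
x = f\,\frac{X}{Z}, \qquad y = f\,\frac{Y}{Z},
\qquad\text{i.e.}\qquad
\lambda \begin{pmatrix} x \\ y \\ 1 \end{pmatrix} =
\begin{pmatrix} f & 0 & 0 \\ 0 & f & 0 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} X \\ Y \\ Z \end{pmatrix}.
\]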

Coordinates in the Projective Plane P²
Take R³ − {(0,0,0)} and form equivalence classes under scale (rays/lines through the origin). Examples: k(0,0,1), k(1,0,1), k(0,1,1), k(1,1,1); a point of the form k(x,y,0) is an "ideal point" (point at infinity).

2D Projective Geometry: Basics
A point p and a line l are both denoted by 3-vectors; line coordinates are homogeneous.
Points and lines are dual: p lies on l iff lᵀp = 0.
The intersection of two lines is their cross product l × l′; dually, the line through two points is p × p′.
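A minimal numeric sketch of these relations (NumPy only; the particular line values are invented for illustration):

    import numpy as np

    # Two lines in homogeneous coordinates: a*x + b*y + c = 0  <->  (a, b, c)
    l1 = np.array([1.0, 0.0, -2.0])   # vertical line x = 2
    l2 = np.array([0.0, 1.0, -3.0])   # horizontal line y = 3

    # Intersection of two lines is their cross product (a homogeneous point)
    p = np.cross(l1, l2)
    p = p / p[2]                      # normalize so the last coordinate is 1
    print(p)                          # -> [2. 3. 1.]

    # Incidence test: p lies on l iff l . p == 0 (up to numerical precision)
    print(np.isclose(l1 @ p, 0.0), np.isclose(l2 @ p, 0.0))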

Cross Product in Matrix Notation
The cross product a × b can be written as a matrix-vector product [a]ₓ b, where [a]ₓ is a 3×3 skew-symmetric matrix. Hartley & Zisserman, p. 581.
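The explicit form (standard, not shown in the transcript):

\[
[a]_\times =
\begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{bmatrix},
\qquad a \times b = [a]_\times\, b .
\]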

2D Projective Transformation
Projectivity: an invertible mapping h: P² → P² such that three points lie on a line iff their images do.
Homography: a 3×3 non-singular (invertible) matrix acting on homogeneous 3-vectors.
Collineation: a transformation that maps lines to lines.
Hartley & Zisserman give these three (equivalent) definitions.

2D Projective Transformation
H is defined up to scale: 9 parameters, 8 degrees of freedom.
H is determined by 4 corresponding points.
How does H operate on lines? (If points map as p′ ≃ Hp, lines map as l′ ≃ H⁻ᵀl; see the sketch below for recovering H from points.)
Hartley & Zisserman, p. 32.
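A minimal sketch of recovering H from four (or more) point correspondences via the direct linear transform (DLT). The function name and the point values are invented for illustration; a careful implementation would also normalize the points first:

    import numpy as np

    def homography_dlt(src, dst):
        """Estimate H (up to scale) from >= 4 point correspondences.
        src, dst: (N, 2) arrays of corresponding image points."""
        A = []
        for (x, y), (u, v) in zip(src, dst):
            # Each correspondence gives 2 linear equations in the 9 entries of H.
            A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
            A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
        A = np.asarray(A)
        # The solution is the right singular vector with the smallest singular value.
        _, _, Vt = np.linalg.svd(A)
        H = Vt[-1].reshape(3, 3)
        return H / H[2, 2]

    # Tiny usage example with made-up points (a pure translation by (1, 2)):
    src = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
    dst = src + np.array([1.0, 2.0])
    print(np.round(homography_dlt(src, dst), 3))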

Next: Homography
Objectives: understand when and how to use a homography.
End of review; students should be back in business.
Time line: 15 min (60 sec/slide).

Two-view geometry: Outline
Background: camera, projection
Necessary tools: a taste of projective geometry
Two-view geometry:
  Homography
  Epipolar geometry, the essential matrix
  Camera calibration, the fundamental matrix
3D reconstruction from two views (stereo algorithms)

Two View Geometry
When a camera changes position and orientation, the scene moves rigidly relative to the camera (rotation + translation); the same 3-D scene point projects to u in the first image and u′ in the second.

Two View Geometry (simple cases)
In two cases this results in a homography:
1. The camera rotates around its focal point.
2. The scene is planar.
Then:
  Point correspondence forms a 1:1 mapping.
  Depth cannot be recovered.

Camera Rotation
A pure rotation about the focal point induces a homography between the two images (R is a 3×3 non-singular rotation matrix).
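The relation this slide alludes to (a standard result; in normalized coordinates H = R, and with the intrinsic calibration matrix K introduced later in the talk):

\[
u' \simeq H\,u, \qquad H = K\,R\,K^{-1}.
\]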

Planar Scenes
Intuitively: the map between the two images is a sequence of two perspectivities (scene plane → camera 1, scene plane → camera 2).
Algebraically: we need to show that the composition is itself a homography.

Summary: Two Views Related by a Homography
The two images are related by a homography: a one-to-one mapping from p to p′.
H contains 8 degrees of freedom.
Given correspondences, each point determines 2 equations.
4 points are required to recover H.
Depth cannot be recovered.

Next: Epipolar geometry
Objectives: the essential & fundamental matrices.
End of homographies. Students should understand why in both cases we end up with a homography.
Time line: 45 min (180 sec/slide).

The General Case: Epipolar Lines (figure: an epipolar line in each image)

Epipolar Plane (figure: scene point P, camera centers O and O′, the baseline, the epipolar plane, and an epipolar line)

Epipole
Every plane through the baseline is an epipolar plane.
It determines a pair of epipolar lines (one in each image).
Two systems of epipolar lines are obtained.
Each system intersects in a point, the epipole.
The epipole is the projection of the center of the other camera.

Example

Epipolar Lines
To define an epipolar plane, we take the plane through the two camera centers O and O′ and some scene point P. This can be written algebraically (in some world coordinate frame) as a coplanarity constraint on the vectors OP, O′P and the baseline OO′.

Essential Matrix (an algebraic constraint between corresponding image points)
Set the world coordinates around the first camera.
What to do with O′P? Rotation changes the observed coordinates in the second image, so we de-rotate to make the second image plane parallel to the first, and then replace the scene vectors by image points.
Other derivations: Hartley & Zisserman, p. 241.

Essential Matrix (cont.)
Writing the de-rotated coplanarity constraint in terms of the rotation R and translation t between the cameras, and defining E = [t]ₓR, the constraint on corresponding (normalized) image points becomes u′ᵀE u = 0. E is called the "essential matrix".
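In full, under one standard convention (the transcript omits the symbols):

\[
u'^{\top}\big(t \times R\,u\big) = 0
\;\Longrightarrow\;
u'^{\top} E\, u = 0, \qquad E = [t]_\times R .
\]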

Properties of the Essential Matrix
E is homogeneous (defined up to scale); its right and left null spaces are the two epipoles.
E has 9 parameters. The constraint is linear in E, so E can be recovered up to scale from 8 point correspondences.
E has rank 2; the constraint det E = 0 means 7 points suffice.
In fact, E has only 5 degrees of freedom: 3 for rotation and 2 for translation (up to scale), determined by the epipole.
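A minimal sketch of the linear (8-point) recovery mentioned above. The function name is invented for illustration; u1, u2 are assumed to be (N, 3) arrays of homogeneous, calibrated (normalized) points, and a careful implementation would also rescale the data before the SVD:

    import numpy as np

    def essential_8point(u1, u2):
        """Linear estimate of E (up to scale) from N >= 8 correspondences.
        u1, u2: (N, 3) homogeneous, calibrated image points."""
        # Each correspondence u2^T E u1 = 0 is one linear equation in the 9 entries of E.
        A = np.stack([np.kron(u2[i], u1[i]) for i in range(len(u1))])
        _, _, Vt = np.linalg.svd(A)
        E = Vt[-1].reshape(3, 3)
        # Enforce the rank-2 (equal nonzero singular values) structure of an essential matrix.
        U, s, Vt = np.linalg.svd(E)
        return U @ np.diag([1.0, 1.0, 0.0]) @ Vt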

Background: Camera Internal Parameters
The lens optical axis does not coincide with the sensor center. We model the deviation of the real sensor from the ideal one using a 3×3 matrix: the calibration matrix (the camera's internal parameters).

Camera Calibration Matrix
The difference between the ideal sensor and the real one is modeled by a 3×3 matrix K, with (cₓ, c_y) the camera center (principal point), (aₓ, a_y) the pixel dimensions, and b the skew. We end with measured pixel coordinates x ≃ K u.
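The usual explicit form, using the slide's symbols:

\[
K = \begin{bmatrix} a_x & b & c_x \\ 0 & a_y & c_y \\ 0 & 0 & 1 \end{bmatrix},
\qquad x \simeq K\,u .
\]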

Fundamental Matrix
Expressed in measured pixel coordinates, the epipolar constraint involves F, the fundamental matrix.
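The standard relation, consistent with the preceding slides (the transcript omits the formula): substituting x ≃ Ku and x′ ≃ K′u′ into the essential-matrix constraint gives

\[
x'^{\top} F\, x = 0, \qquad F = K'^{-\top} E\, K^{-1} = K'^{-\top} [t]_\times R\, K^{-1}.
\]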

Properties of the Fundamental Matrix
F is homogeneous (defined up to scale); its right and left null spaces are the two epipoles.
F has 9 parameters. The constraint is linear in F, so F can be recovered up to scale from 8 point correspondences.
F has rank 2; the constraint det F = 0 means 7 points suffice.

Epipolar Plane (figure: scene point P, camera centers O and O′, the baseline, image points x and x′, epipolar lines l and l′, epipoles e and e′)
Other derivations: Hartley & Zisserman, p. 223.

Two-view geometry summary:

                          Homography                   Epipolar (F/E)
  Shape                   One-to-one map               Concentric epipolar lines
  D.o.f.                  8                            8 / 5 (F / E)
  Eqs per point           2                            1
  Minimal configuration   4                            5+ (8, linear)
  Depth                   No                           Yes, up to scale
  Scene                   Planar (or no translation)   3D scene

Next: Stereo
Objectives: basic terminology & triangulation.
End of epipolar geometry.
Time line: 70 min (120 sec/slide).

Stereo Vision
Objective: 3D reconstruction.
Input: 2 (or more) images taken with calibrated cameras.
Output: 3D structure of the scene.
Steps:
  Rectification
  Matching
  Depth estimation

Rectification
Image reprojection: reproject the image planes onto a common plane parallel to the baseline.
Notice that only the focal point of each camera really matters (Seitz).

Rectification
Any stereo pair can be rectified by rotating and scaling the two image planes (i.e., applying a homography to each).
We will assume the images have been rectified, so:
  The image planes of the cameras are parallel.
  The focal points are at the same height.
  The focal lengths are the same.
Then the epipolar lines fall along the horizontal scan lines of the images.
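For completeness, a hedged sketch of how this is commonly done in practice with OpenCV (not part of the original slides; it assumes the intrinsics K1, K2, distortion coefficients, and relative pose R, t are already known, and the numeric values below are placeholders):

    import cv2
    import numpy as np

    # Assumed inputs (placeholders): intrinsics, distortion, relative pose, image size.
    K1 = K2 = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])
    d1 = d2 = np.zeros(5)
    R = np.eye(3)                           # relative rotation between the cameras
    t = np.array([[0.1], [0.0], [0.0]])     # relative translation (baseline along x)
    size = (640, 480)

    # Compute rectifying rotations R1, R2 and new projection matrices P1, P2.
    R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(K1, d1, K2, d2, size, R, t)

    # Build per-pixel remapping tables; applying them warps each image by a homography
    # so that corresponding epipolar lines become the same horizontal scan line.
    map1x, map1y = cv2.initUndistortRectifyMap(K1, d1, R1, P1, size, cv2.CV_32FC1)
    map2x, map2y = cv2.initUndistortRectifyMap(K2, d2, R2, P2, size, cv2.CV_32FC1)
    # rect_left = cv2.remap(left_img, map1x, map1y, cv2.INTER_LINEAR)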

Cyclopean Coordinates Origin at midpoint between camera centers Axes parallel to those of the two (rectified) cameras

Disparity
The difference between the x-coordinates of corresponding points in the rectified images, d = x − x′, is called the "disparity".
d is inversely related to the depth Z: there is greater sensitivity to nearby points.
d is directly related to the baseline b: a small baseline gives small disparities and hence low depth sensitivity.
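For rectified cameras with focal length f and baseline b, the standard relation (not written out in the transcript) is

\[
d = x - x' = \frac{f\,b}{Z}
\qquad\Longleftrightarrow\qquad
Z = \frac{f\,b}{d}.
\]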

Next: a taste of correspondence
Objectives: the problem of correspondence (solutions in the next class).
End of stereo.
Time line: 80 min (60 sec/slide).

Main Step: Correspondence Search
What to match?
  Objects? More identifiable, but difficult to compute.
  Pixels? Easier to handle, but maybe ambiguous.
  Edges?
  Collections of pixels (regions)?

Random Dot Stereogram
Using random-dot pairs, Julesz showed that recognition is not needed for stereo.

Random Dot in Motion

Finding Matches

1D Search (figure: SSD error as a function of disparity)
More efficient.
Fewer false matches.
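A minimal sketch of the window-based SSD search along a scan line of a rectified pair (illustrative only; the function name is invented, left and right are assumed to be grayscale NumPy arrays, and real implementations add aggregation and sub-pixel refinement):

    import numpy as np

    def ssd_disparity_at(left, right, row, col, max_disp, half=3):
        """Return the disparity in [0, max_disp] minimizing the SSD of a
        (2*half+1)^2 window around (row, col), searching along the scan line."""
        patch_l = left[row - half:row + half + 1, col - half:col + half + 1].astype(float)
        best_d, best_ssd = 0, np.inf
        for d in range(max_disp + 1):
            c = col - d                    # candidate column in the right image
            if c - half < 0:
                break
            patch_r = right[row - half:row + half + 1, c - half:c + half + 1].astype(float)
            ssd = np.sum((patch_l - patch_r) ** 2)
            if ssd < best_ssd:
                best_ssd, best_d = ssd, d
        return best_d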

Ordering

Comparison of Stereo Algorithms
D. Scharstein and R. Szeliski, "A Taxonomy and Evaluation of Dense Two-Frame Stereo Correspondence Algorithms," International Journal of Computer Vision, 47 (2002).
(Figure: the scene and its ground-truth disparity map.)

Results with Window Correlation
(Figure: window-based matching with the best window size vs. ground truth.)

Scharstein and Szeliski

Graph Cuts (next class).
(Figure: ground truth vs. graph-cuts result.)

Next class: stereo algorithms.
End of this class. Time line: 100 min.