Camera Calibration CS485/685 Computer Vision Prof. Bebis.


Camera Calibration - Goal Estimate the extrinsic and intrinsic camera parameters (including the effective focal lengths f/s_x and f/s_y).

Camera Calibration - How Using a set of known correspondences between point features in the world (X_w, Y_w, Z_w) and their projections on the image (x_im, y_im).

Calibration Object Calibration relies on one or more images of a calibration object: (1) A 3D object of known geometry. (2) Located in a known position in space. (3) Yielding image features that can be located accurately.

Calibration object: example Two orthogonal grids of equally spaced black squares. Assume that the world reference frame is centered at the lower left corner of the right grid, with axes parallel to the three directions identified by the calibration pattern.

Calibration pattern: example (cont’d) Obtain 3D coordinates (X_w, Y_w, Z_w) –Given the size of the planes, the number of squares, etc. (i.e., all known by construction), the coordinates of each vertex can be computed in the world reference frame using trigonometry.

Calibration pattern: example (cont’d) Obtain 2D coordinates (x_im, y_im) –The projection of the vertices on the image can be found by intersecting the edge lines of the corresponding square sides (or through corner detection).

Problem Statement Compute the extrinsic and intrinsic camera parameters from N corresponding pairs of points (X_w_i, Y_w_i, Z_w_i) and (x_im_i, y_im_i), i = 1, ..., N. This is a very well studied problem; many different methods for camera calibration exist.

Methods (1) Indirect camera calibration (1.1) Estimate the elements of the projection matrix. (1.2) If needed, compute the intrinsic/extrinsic camera parameters from the entries of the projection matrix.

Methods (cont’d) (2) Direct camera calibration Direct recovery of the intrinsic and extrinsic camera parameters.

Method 1: Indirect Camera Calibration Review of basic equations Note: we replace (x_im, y_im) with (x, y) for simplicity.

(Method 1) Step 1: solve for m_ij’s M has 11 independent entries (it is defined only up to a scale factor; e.g., divide every entry by m_11). Need at least 11 equations for computing M, hence at least 6 world-image point correspondences (each yields two equations).

(Method 1) Step 1: solve for m_ij’s Each 3D-2D correspondence gives rise to two equations:
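Written out in the notation of these slides (m_ij the entries of M; a reconstruction of the slide’s figure, following the standard linear-calibration derivation), the two equations for correspondence i are:

```latex
\begin{aligned}
x_i\,(m_{31}X_w^i + m_{32}Y_w^i + m_{33}Z_w^i + m_{34}) &= m_{11}X_w^i + m_{12}Y_w^i + m_{13}Z_w^i + m_{14} \\
y_i\,(m_{31}X_w^i + m_{32}Y_w^i + m_{33}Z_w^i + m_{34}) &= m_{21}X_w^i + m_{22}Y_w^i + m_{23}Z_w^i + m_{24}
\end{aligned}
```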

(Method 1) Step 1: solve for m_ij’s This leads to a homogeneous system of equations with a 2N x 12 coefficient matrix (two equations per correspondence).

(Method 1) Step 1: solve for m_ij’s
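A minimal sketch of this step in code (NumPy assumed; the function name and point layout are illustrative, not from the slides). The least-squares solution of the homogeneous system, subject to unit norm, is the right singular vector associated with the smallest singular value:

```python
import numpy as np

def estimate_projection_matrix(world_pts, image_pts):
    """Linear estimate of the 3x4 projection matrix M (up to scale)
    from N >= 6 world-image correspondences (Method 1, Step 1)."""
    A = []
    for (X, Y, Z), (x, y) in zip(world_pts, image_pts):
        # Two homogeneous equations per correspondence.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z, -x])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z, -y])
    # Least-squares solution of A m = 0 with ||m|| = 1: the right
    # singular vector corresponding to the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)
```

The result matches the true M only up to an overall scale (and sign), which is exactly why M has 11 independent entries.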

(Method 1) Step 2: find intrinsic/extrinsic parameters

Let’s define the following vectors:

(Method 1) Step 2: find intrinsic/extrinsic parameters The solutions are as follows (see book chapter for details). The remaining parameters are easily computed.
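For reference, a sketch of the standard decomposition (following the book chapter’s derivation; notation assumed here: q_i = (m_{i1}, m_{i2}, m_{i3}) collects the first three entries of row i of M, with M scaled so that ||q_3|| = 1):

```latex
o_x = q_1 \cdot q_3, \qquad
o_y = q_2 \cdot q_3, \qquad
f_x = \sqrt{q_1 \cdot q_1 - o_x^2}, \qquad
f_y = \sqrt{q_2 \cdot q_2 - o_y^2}
```

The rotation R and translation T then follow from the remaining entries of M (with the sign conventions of the book).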

Method 2: Direct Camera Calibration Review of basic equations –From world coordinates to camera coordinates: P_c = R P_w + T –For simplicity, we will replace -T’ with T. –Warning: this is NOT the same T as before!

Method 2: Direct Camera Calibration (cont’d) Review of basic equations –From camera coordinates to pixel coordinates: –Relating world coordinates to pixel coordinates:

Method 2: Direct Parameter Calibration Intrinsic parameters –Intrinsic parameters f, s_x, s_y, o_x, and o_y are not independent. –Define the following four independent parameters: f_x = f/s_x, f_y = f/s_y, o_x, and o_y.

Method 2: Main Steps (1) Assuming that o_x and o_y are known, estimate all other parameters. (2) Estimate o_x and o_y.

(Method 2) Step 1: estimate f_x, α, R, and T To simplify notation, set (x_im - o_x, y_im - o_y) = (x, y). Combining the equations above (i.e., same denominator), we have:
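A reconstruction of the slide’s equations (using the sign convention that reappears later on these slides, where the projection carries a minus sign):

```latex
x = -f_x\,\frac{r_{11}X_w + r_{12}Y_w + r_{13}Z_w + T_x}{r_{31}X_w + r_{32}Y_w + r_{33}Z_w + T_z},
\qquad
y = -f_y\,\frac{r_{21}X_w + r_{22}Y_w + r_{23}Z_w + T_y}{r_{31}X_w + r_{32}Y_w + r_{33}Z_w + T_z}
```

and equating the common denominator:

```latex
x\,f_y\,(r_{21}X_w + r_{22}Y_w + r_{23}Z_w + T_y) = y\,f_x\,(r_{11}X_w + r_{12}Y_w + r_{13}Z_w + T_x)
```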

(Method 2) Step 1: estimate f_x, α, R, and T (cont’d) Each pair of corresponding points must satisfy the previous equation: divide by f_y and re-arrange terms:

(Method 2) Step 1: estimate f_x, α, R, and T (cont’d) Re-arranging, we obtain the following equation:
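With α = f_x/f_y, each correspondence contributes one linear equation in eight unknowns. A sketch of the unknown vector and of the row contributed by point i (a reconstruction consistent with the combined equation above):

```latex
v = (r_{21},\ r_{22},\ r_{23},\ T_y,\ \alpha r_{11},\ \alpha r_{12},\ \alpha r_{13},\ \alpha T_x)^{T}
```

```latex
a_i = (x_i X_w^i,\ x_i Y_w^i,\ x_i Z_w^i,\ x_i,\ -y_i X_w^i,\ -y_i Y_w^i,\ -y_i Z_w^i,\ -y_i),
\qquad a_i \cdot v = 0
```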

(Method 2) Step 1: estimate f_x, α, R, and T (cont’d) Assuming N correspondences leads to a homogeneous system with an N x 8 coefficient matrix.

(Method 2) Step 1: estimate f_x, α, R, and T (cont’d)

Determine α and |γ|
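A sketch of the standard argument: the computed unit-norm solution v̄ equals γv for an unknown scale γ. Since (r_21, r_22, r_23) and (r_11, r_12, r_13) are rows of a rotation matrix and hence have unit norm:

```latex
|\gamma| = \sqrt{\bar v_1^2 + \bar v_2^2 + \bar v_3^2},
\qquad
\alpha = \frac{\sqrt{\bar v_5^2 + \bar v_6^2 + \bar v_7^2}}{|\gamma|}
```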

(Method 2) Step 1: estimate f_x, α, R, and T (cont’d) Determine r_21, r_22, r_23, r_11, r_12, r_13, T_y, T_x (up to an unknown common sign)

(Method 2) Step 1: estimate f_x, α, R, and T (cont’d) Determine r_31, r_32, r_33 –R_3 can be estimated as the cross product of R_1 and R_2. –The sign of R_3 is already fixed (the entries of R_3 remain unchanged if the signs of all the entries of R_1 and R_2 are reversed). We have now estimated all of R; call this estimate R̂.

(Method 2) Step 1: estimate f_x, α, R, and T (cont’d) Ensure the orthogonality of R –The computation of R does not explicitly take the orthogonality constraints into account, so the estimate R̂ cannot be expected to be orthogonal. –Enforce orthogonality on R̂ using SVD: R̂ = U D V^T. –Replace D with I: R = U V^T.
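A sketch of this orthogonalization step (NumPy assumed; `nearest_rotation` is an illustrative name, not from the slides):

```python
import numpy as np

def nearest_rotation(R_hat):
    """Enforce orthogonality on an estimated rotation matrix:
    decompose R_hat = U D V^T and replace D with I, giving R = U V^T."""
    U, _, Vt = np.linalg.svd(R_hat)
    return U @ Vt
```

This gives the closest orthogonal matrix to R̂ in the Frobenius norm.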

(Method 2) Step 1: estimate f_x, α, R, and T (cont’d) Determine the sign of γ –Consider the following equations again:

(Method 2) Step 1: estimate f_x, α, R, and T (cont’d)

Determine T_z and f_x –Consider the equation: –Let’s rewrite it in the form: x T_z + f_x (r_11 X_w + r_12 Y_w + r_13 Z_w + T_x) = -x (r_31 X_w + r_32 Y_w + r_33 Z_w)

(Method 2) Step 1: estimate f_x, α, R, and T (cont’d) –We can obtain T_z and f_x by solving a system of equations like the above, written for N points. Using SVD, the (least-squares) solution is:
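A sketch of this step (NumPy assumed; the name `solve_tz_fx`, indexing the rows of R as `R[0]` and `R[2]`, and using `lstsq` in place of an explicit SVD are illustrative choices):

```python
import numpy as np

def solve_tz_fx(x, world_pts, R, Tx):
    """Least-squares solution of x*Tz + fx*(R1.P_w + Tx) = -x*(R3.P_w)
    for the two unknowns (Tz, fx), stacking one row per point."""
    A = np.column_stack([x, world_pts @ R[0] + Tx])  # columns: Tz, fx
    b = -x * (world_pts @ R[2])
    (Tz, fx), *_ = np.linalg.lstsq(A, b, rcond=None)
    return Tz, fx
```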

(Method 2) Step 1: estimate f_x, α, R, and T (cont’d) Determine f_y:

(Method 2) Step 2: estimate o_x and o_y The computation of o_x and o_y is based on the following theorem: Orthocenter Theorem: Let T be the triangle on the image plane defined by the three vanishing points of three mutually orthogonal sets of parallel lines in space. Then, (o_x, o_y) is the orthocenter of T.

(Method 2) Step 2: estimate o_x and o_y (cont’d) We can use the same calibration pattern to compute three vanishing points (use three pairs of parallel lines defined by the sides of the planes). None of the three mutually orthogonal directions should be near parallel to the image plane!
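Given the three vanishing points, the orthocenter can be found by intersecting two altitudes; a small sketch (plain 2-D geometry, NumPy assumed, function name illustrative):

```python
import numpy as np

def orthocenter(a, b, c):
    """Orthocenter of the triangle with vertices a, b, c: the point p
    with (p - a).(c - b) = 0 and (p - b).(c - a) = 0 (two altitudes)."""
    a, b, c = (np.asarray(v, dtype=float) for v in (a, b, c))
    A = np.array([c - b, c - a])
    rhs = np.array([np.dot(a, c - b), np.dot(b, c - a)])
    return np.linalg.solve(A, rhs)
```

(o_x, o_y) is then read off as this point's coordinates.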

Comments To improve the accuracy of camera calibration, it is a good idea to estimate the parameters several times (i.e., using different images) and average the results. Localization errors –The precision of calibration depends on how accurately the world and image points are located. –Studying how localization errors "propagate" to the estimates of the camera parameters is very important.

Comments (cont’d) In theory, direct and indirect camera calibration should produce the same results. In practice, we obtain different solutions due to different error propagations. Indirect camera calibration is simpler and should be preferred when we do not need to compute the intrinsic/extrinsic camera parameters explicitly.

How should we estimate the accuracy of a calibration algorithm? –Project known 3D points on the image. –Compare their projections with the corresponding pixel coordinates of the points. –Repeat for many points and estimate the “re-projection” error!
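This check can be sketched as follows (NumPy assumed; RMS error is one common choice of summary statistic):

```python
import numpy as np

def reprojection_error(M, world_pts, image_pts):
    """RMS distance between the projections of known 3D points under
    the estimated 3x4 matrix M and their measured pixel coordinates."""
    P = np.hstack([world_pts, np.ones((len(world_pts), 1))])
    proj = P @ M.T
    proj = proj[:, :2] / proj[:, 2:3]  # perspective divide
    return np.sqrt(np.mean(np.sum((proj - image_pts) ** 2, axis=1)))
```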