Course 12 Calibration

1. Introduction

In the theoretical discussions so far, we have assumed:
- The camera is located at the origin of the scene coordinate system.
- The optic axis of the camera points in the z-direction of the scene coordinates.
- The image plane is perpendicular to the z-axis, with its origin at (0, 0, f).
- The X and Y axes of the image coordinates are parallel to the x and y axes of the scene coordinates, respectively.
- The camera has no distortions, i.e., it is an ideal pinhole camera.

Calibration is the process of determining how a real imaging setup deviates from these idealizations.

1) What we need to calibrate:
- Absolute orientation: between two coordinate systems, e.g., robot coordinates and model coordinates.
- Relative orientation: between two camera systems.
- Exterior orientation: between the camera and scene systems.
- Interior orientation: within a camera, such as the camera constant, principal point, lens distortion, etc.

2) Coordinate systems:
- Scene coordinates (also called global, world, or absolute coordinates)
- Camera coordinates
- Image coordinates
- Pixel coordinates

For an image of size m × n pixels, the image center is at (m/2, n/2).

2. Absolute Orientation

The goal is to find the relationship between two coordinate systems from 3 or more 3D points whose coordinates are known in both systems. Let a 3D point p be expressed as $p_m$ in the model coordinate system and as $p_a$ in the absolute coordinate system. Then

$$p_a = R\, p_m + p_0,$$

where $R$ is the rotation of the model coordinate system with respect to the absolute coordinates, and $p_0$ is the origin of the model coordinate system expressed in the absolute coordinate system.

Given the point pairs $(p_{m,i}, p_{a,i})$, $i = 1, \dots, n$, we want to find $R$ and $p_0$. This is the same expression as in 3D motion estimation from 3D points; therefore, all the algorithms of motion estimation (using 3D points) can be used to solve for absolute orientation.

One should remember the constraint that the rotation matrix is orthonormal, i.e.

$$R^\top R = R R^\top = I,$$

which adds 6 additional equations when solving for the rotation.

1) Solving rotation with quaternions:

The orthonormality constraint of the rotation matrix is absorbed into the quaternion expression. In the absolute coordinate system, a set of points on a 3D object is

$$\{p_{a,i}\}, \quad i = 1, \dots, n.$$

In the model coordinate system, the same set of points is correspondingly measured as

$$\{p_{m,i}\}, \quad i = 1, \dots, n,$$

and they satisfy $p_{a,i} = R\, p_{m,i} + p_0$.

In the absolute coordinate system, the centroid of the point set is

$$\bar{p}_a = \frac{1}{n} \sum_{i=1}^{n} p_{a,i},$$

and the ray from the centroid to point $p_{a,i}$ is

$$r_{a,i} = p_{a,i} - \bar{p}_a.$$

In the same way, in the model coordinate system, $\bar{p}_m = \frac{1}{n}\sum_i p_{m,i}$ and $r_{m,i} = p_{m,i} - \bar{p}_m$. Since $r_{a,i}$ and $R\, r_{m,i}$ are parallel (the translation cancels in the centroid differences), we can solve for the rotation by least squares.

Using a unit quaternion $q$ to express the rotation, and recalling that rotating a vector $r$ corresponds to the quaternion product $\dot{r}' = q\, \dot{r}\, q^*$, where $\dot{r} = (0, r)$ is a pure quaternion, we have

$$\dot{r}_{a,i} = q\, \dot{r}_{m,i}\, q^*, \quad \text{i.e.} \quad \dot{r}_{a,i}\, q = q\, \dot{r}_{m,i}.$$

This can be written as a homogeneous linear system

$$A_i\, q = 0, \quad i = 1, \dots, n,$$

where $A_i = L(\dot{r}_{a,i}) - \mathcal{R}(\dot{r}_{m,i})$ is a $4 \times 4$ matrix built from the left- and right-multiplication matrices of the two pure quaternions. Stacking all $A_i$, the equation can be solved by linear least squares, e.g., by SVD. After $q$ (and hence $R$) is found, the camera position can be easily calculated by

$$p_0 = \bar{p}_a - R\, \bar{p}_m.$$
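To make this concrete, here is a minimal numpy sketch of the quaternion/SVD solution, assuming at least 3 non-collinear point pairs; the function names (`absolute_orientation`, `quat_to_rot`) and the (w, x, y, z) quaternion convention are choices made for this example, not part of the original notes.

```python
import numpy as np

def quat_to_rot(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def absolute_orientation(p_m, p_a):
    """Estimate R, p0 with p_a ~ R @ p_m + p0 from n >= 3 paired 3-D points."""
    c_m, c_a = p_m.mean(axis=0), p_a.mean(axis=0)
    r_m, r_a = p_m - c_m, p_a - c_a          # rays from the centroids
    rows = []
    for a, m in zip(r_a, r_m):
        pa = np.array([0.0, *a])             # pure quaternions (0, r)
        pm = np.array([0.0, *m])
        # constraint (pa ⊗ q - q ⊗ pm) = (L(pa) - Rm(pm)) q = 0
        L = np.array([[pa[0], -pa[1], -pa[2], -pa[3]],
                      [pa[1],  pa[0], -pa[3],  pa[2]],
                      [pa[2],  pa[3],  pa[0], -pa[1]],
                      [pa[3], -pa[2],  pa[1],  pa[0]]])
        Rm = np.array([[pm[0], -pm[1], -pm[2], -pm[3]],
                       [pm[1],  pm[0],  pm[3], -pm[2]],
                       [pm[2], -pm[3],  pm[0],  pm[1]],
                       [pm[3],  pm[2], -pm[1],  pm[0]]])
        rows.append(L - Rm)
    A = np.vstack(rows)                      # (4n) x 4 stacked system
    _, _, Vt = np.linalg.svd(A)
    q = Vt[-1]                               # null vector = unit quaternion
    R = quat_to_rot(q / np.linalg.norm(q))
    p0 = c_a - R @ c_m                       # translation from the centroids
    return R, p0
```

The SVD picks the right singular vector of the smallest singular value, which is exactly the least-squares null vector of the stacked constraint matrix; the sign ambiguity in $q$ is harmless since $q$ and $-q$ encode the same rotation.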

2) The scale problem

If the absolute coordinate system and the model coordinate system use different units of measurement, a scale factor $s$ is introduced: $p_a = s\, R\, p_m + p_0$. Noticing that the distances between points of the 3D scene are not affected by the choice of coordinate system, we can easily solve for the scale factor, e.g.

$$s = \frac{\| p_{a,i} - p_{a,j} \|}{\| p_{m,i} - p_{m,j} \|},$$

averaged over point pairs $(i, j)$.
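A small numpy sketch of this distance-ratio estimate (the averaging over all pairs is one reasonable choice, not prescribed by the notes):

```python
import numpy as np

def scale_factor(p_m, p_a):
    """Inter-point distances are invariant to R and p0; estimate s as the
    mean ratio of corresponding pairwise distances."""
    d_m = np.linalg.norm(p_m[:, None] - p_m[None, :], axis=-1)
    d_a = np.linalg.norm(p_a[:, None] - p_a[None, :], axis=-1)
    mask = d_m > 0                     # skip the zero diagonal
    return np.mean(d_a[mask] / d_m[mask])
```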

Once the scale factor is found, the problem reduces to the ordinary absolute orientation problem.

3. Relative Orientation

The goal is to determine the relationship between two camera coordinate systems from the projections of corresponding points in the two cameras (e.g., in the stereo case). That is to say, given pairs of image point correspondences, find the rotation and translation of the camera from one position to the other.

1) Solving the relative orientation problem by motion estimation

In motion estimation, the camera is stationary and the object is moving: image point correspondences are used to find $(R, T)$. In the stereo case, we can imagine that a camera first takes an image of the scene at position $O_1$ and then moves to $O_2$ to take the second image of the same scene. The scene is stationary; the camera is moving.

Using the correspondences $(X_i, Y_i) \leftrightarrow (X'_i, Y'_i)$ to find $(R, T)$ is therefore the same as the stationary-camera case, with the roles of motion reversed. Using the 8-point method, one can solve for the rotation and translation.
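A minimal sketch of the linear 8-point step, assuming calibrated (camera-normalized) homogeneous coordinates so that the recovered matrix is the essential matrix; decomposing E into the four candidate $(R, T)$ pairs and picking the one that puts points in front of both cameras is omitted here.

```python
import numpy as np

def essential_eight_point(x1, x2):
    """Linear 8-point estimate of E from x2_i^T E x1_i = 0.
    x1, x2: (n, 3) camera-normalized homogeneous points, n >= 8."""
    # each constraint row holds the coefficients of E flattened row-major
    A = np.stack([np.kron(b, a) for a, b in zip(x1, x2)])
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)
    # enforce the essential-matrix property: singular values (s, s, 0)
    U, S, Vt = np.linalg.svd(E)
    s = (S[0] + S[1]) / 2.0
    return U @ np.diag([s, s, 0.0]) @ Vt
```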

2) Iterative method

Let $r_l$ and $r_r$ be the directions from the left and right camera centers to a scene point, respectively. Since the baseline $b$ of the stereo system is perpendicular to the normal of the epipolar plane, we have the coplanarity condition

$$b \cdot \big( r_l \times R\, r_r \big) = 0.$$

It can be seen that the baseline can only be solved up to a scale factor, so we impose the constraint

$$\| b \| = 1.$$

By least squares, five or more stereo point correspondences are needed to solve the relative orientation problem. In updating, we use an increment $\delta b$ for the baseline and an infinitesimal rotation $\delta\omega$ for the rotation, which keeps the rotation matrix orthonormal; i.e.

$$R \leftarrow (I + \delta\Omega)\, R, \qquad \text{where} \qquad \delta\Omega = \begin{pmatrix} 0 & -\delta\omega_3 & \delta\omega_2 \\ \delta\omega_3 & 0 & -\delta\omega_1 \\ -\delta\omega_2 & \delta\omega_1 & 0 \end{pmatrix}.$$

For the baseline, the iterative formula is

$$b \leftarrow b + \delta b, \qquad \text{subject to} \qquad \|b\| = 1 \ \ (\text{i.e. } b \cdot \delta b = 0).$$

4. Exterior Orientation

The goal is to use image points $(X, Y)$ in image coordinates and the corresponding 3D scene points $(x, y, z)$ in the absolute coordinate system to determine the relationship between the camera system and the absolute system. We assume that the image plane and camera are internally well calibrated. Let a scene point be expressed as $p = (x, y, z)^\top$ in the absolute coordinate system and as $p_c = (x_c, y_c, z_c)^\top$ in the camera coordinate system.

Expressing the point in the camera coordinate system, $p_c = R\, p + t$:

$$\begin{aligned} x_c &= r_{11}x + r_{12}y + r_{13}z + t_x & (1) \\ y_c &= r_{21}x + r_{22}y + r_{23}z + t_y & (2) \\ z_c &= r_{31}x + r_{32}y + r_{33}z + t_z & (3) \end{aligned}$$

Dividing (1)/(3) and (2)/(3), and considering the perspective projection $X = f\, x_c / z_c$, $Y = f\, y_c / z_c$, we have

$$\frac{X}{f} = \frac{r_{11}x + r_{12}y + r_{13}z + t_x}{r_{31}x + r_{32}y + r_{33}z + t_z}, \qquad \frac{Y}{f} = \frac{r_{21}x + r_{22}y + r_{23}z + t_y}{r_{31}x + r_{32}y + r_{33}z + t_z}.$$

Or, after cross-multiplying, each point yields two equations that are linear in the unknowns. As $(X, Y)$, $(x, y, z)$ and $f$ are known, there are 12 unknowns, 9 for the rotation and 3 for the translation, so at least 6 points are needed to provide 12 equations. However, if the 6 orthonormality constraints on the rotation are taken into account, the minimum number of points required to calibrate the exterior orientation is 3.
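As an illustration of the 6-point linear route, here is a numpy sketch (the function name and the final projection onto the nearest orthonormal matrix are choices for this example; the notes themselves do not specify how the orthonormality constraints are enforced):

```python
import numpy as np

def exterior_orientation_linear(pts3d, pts2d, f):
    """Linear estimate of R, t from n >= 6 scene points (x, y, z) and
    distortion-free image points (X, Y), with known camera constant f."""
    rows = []
    for (x, y, z), (X, Y) in zip(pts3d, pts2d):
        P = np.array([x, y, z, 1.0])
        # f*(r1.p + tx) - X*(r3.p + tz) = 0
        rows.append(np.hstack([f * P, np.zeros(4), -X * P]))
        # f*(r2.p + ty) - Y*(r3.p + tz) = 0
        rows.append(np.hstack([np.zeros(4), f * P, -Y * P]))
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    M = Vt[-1].reshape(3, 4)                 # [r1 tx; r2 ty; r3 tz], up to scale
    M /= np.cbrt(np.linalg.det(M[:, :3]))    # fix scale and sign: det(R) = 1
    U, _, Vt2 = np.linalg.svd(M[:, :3])
    R = U @ Vt2                              # nearest orthonormal matrix
    return R, M[:, 3]
```

After the scale fix, det of the 3×3 block is +1, so the SVD projection yields a proper rotation without an extra sign check.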

5. Interior Orientation

The goal is to determine the internal geometry of the camera, such as:
- Camera constant: the distance from the image plane to the projection center.
- Principal point: the location of the origin of the image-plane coordinate system.
- Lens distortion coefficients: optical properties of the camera.
- Scale factors: the spacing between rows and between columns.

1) Calibration model of the camera

Let $(\tilde{X}, \tilde{Y})$ be the uncorrected image coordinates and $(X, Y)$ the true image coordinates, with the principal point $(X_p, Y_p)$ expressed in the uncorrected system. Then

$$X = \tilde{X} - X_p + \delta X, \qquad Y = \tilde{Y} - Y_p + \delta Y,$$

where the radial lens distortion is

$$\delta X_r = x\,(k_1 r^2 + k_2 r^4 + \cdots), \qquad \delta Y_r = y\,(k_1 r^2 + k_2 r^4 + \cdots),$$

with $x = \tilde{X} - X_p$, $y = \tilde{Y} - Y_p$ and $r^2 = x^2 + y^2$.

Lens decentering:

$$\delta X_d = p_1 (r^2 + 2x^2) + 2 p_2\, x y, \qquad \delta Y_d = 2 p_1\, x y + p_2 (r^2 + 2y^2),$$

where $p_1, p_2$ are the decentering distortion coefficients.
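Putting the two distortion terms together, a small sketch of applying this correction model to one image point (the function name and the two-term truncation of the radial series are choices for this example):

```python
import numpy as np

def correct_point(Xu, Yu, Xp, Yp, k, p):
    """Shift an uncorrected image point (Xu, Yu) by the principal point
    (Xp, Yp) and apply radial (k1, k2) and decentering (p1, p2) corrections."""
    x, y = Xu - Xp, Yu - Yp                # coordinates about the principal point
    r2 = x*x + y*y
    radial = k[0]*r2 + k[1]*r2*r2          # k1*r^2 + k2*r^4
    dx = x*radial + p[0]*(r2 + 2*x*x) + 2*p[1]*x*y
    dy = y*radial + 2*p[0]*x*y + p[1]*(r2 + 2*y*y)
    return x + dx, y + dy                  # true image coordinates (X, Y)
```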

2) Calibration method

Straight lines in the scene should project to straight lines in the image:
- Place straight lines in the scene, e.g., a sheet of paper printed with lines.
- Take an image of the straight lines in the scene.
- From the image, use the Hough transform to group edge points into lines.

Substitute the calibration model into the equation of a straight line,

$$X_{ij} \cos\theta_i + Y_{ij} \sin\theta_i = \rho_i,$$

where $(X_{ij}, Y_{ij})$ denotes the $j$-th point on the $i$-th line. Use least squares to find the unknown parameters.
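A sketch of this "plumb-line" fit, reusing the correct_point function from the sketch above. It assumes the Hough grouping has already been done (`lines` holds the edge points of each scene line), and uses scipy's least_squares as a stand-in for whichever solver the notes intended; penalizing each corrected point's distance to its own best-fit line is one common way to realize the collinearity condition.

```python
import numpy as np
from scipy.optimize import least_squares

def line_residuals(pts):
    """Perpendicular distances of 2-D points to their own best-fit line."""
    c = pts - pts.mean(axis=0)
    n = np.linalg.svd(c)[2][-1]        # line normal = least-variance direction
    return c @ n

def plumb_line_calibrate(lines, x0):
    """Fit (Xp, Yp, k1, k2, p1, p2) so corrected edge points are collinear.
    `lines` is a list of (n_i, 2) arrays of uncorrected points, one per line;
    `x0` is the 6-vector initial guess (e.g., image center and zeros)."""
    def residuals(params):
        Xp, Yp, k1, k2, p1, p2 = params
        res = []
        for pts in lines:
            corr = np.array([correct_point(X, Y, Xp, Yp, (k1, k2), (p1, p2))
                             for X, Y in pts])
            res.append(line_residuals(corr))
        return np.concatenate(res)
    return least_squares(residuals, x0).x
```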

3) Example: affine method for camera calibration
- Combines interior and exterior calibration.
- Corrects scale error, translation error, rotation, skew error, and differential scaling.
- Cannot correct lens distortion.
- 2D-3D point correspondences of calibration points are given.

Correction model (affine transformation):

$$\begin{pmatrix} X' \\ Y' \end{pmatrix} = A \begin{pmatrix} X \\ Y \\ 1 \end{pmatrix}, \qquad A = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \end{pmatrix}.$$

Since $A$ is an arbitrary $2 \times 3$ matrix, the transformation involves rotation, translation, scaling, and skew. Using the projection model

$$X = f\,\frac{x_c}{z_c}, \qquad Y = f\,\frac{y_c}{z_c},$$

the affine transformation becomes

$$X' = \frac{a_{11} f\, x_c + a_{12} f\, y_c + a_{13}\, z_c}{z_c}, \qquad Y' = \frac{a_{21} f\, x_c + a_{22} f\, y_c + a_{23}\, z_c}{z_c}.$$

From the exterior orientation of the camera, $p_c = R\, p + t$. Substituting the exterior orientation relation into the affine transformation model gives

$$X' = \frac{c_{11} x + c_{12} y + c_{13} z + c_{14}}{r_{31} x + r_{32} y + r_{33} z + t_z}, \qquad Y' = \frac{c_{21} x + c_{22} y + c_{23} z + c_{24}}{r_{31} x + r_{32} y + r_{33} z + t_z},$$

where the coefficients $a_{ij}$, $f$, $r_{ij}$ and $t$ are absorbed into the $c_{ij}$.

With known 3D-2D point correspondences of the calibration points, the coefficients $c_{ij}$ can be computed by SVD (each correspondence yields two linear equations after cross-multiplying). After the $c_{ij}$ are obtained, the affine transformation matrix $A$ can be formed.

The virtual (ideal) image projections can then be calculated from the scene points, and the calibrated image points are obtained by applying the affine transformation $A$ to the uncorrected image points.
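A minimal sketch of the final fit-and-apply step, assuming the ideal projections have already been computed from the recovered exterior orientation; `fit_affine_correction` and `apply_affine` are illustrative names, not from the notes.

```python
import numpy as np

def fit_affine_correction(obs, ideal):
    """Least-squares 2x3 affine A with ideal ~ A @ [X, Y, 1]^T, mapping
    observed image points (n, 2) to ideal projections (n, 2), n >= 3."""
    H = np.hstack([obs, np.ones((len(obs), 1))])    # (n, 3) homogeneous obs
    A, *_ = np.linalg.lstsq(H, ideal, rcond=None)   # (3, 2) = A transposed
    return A.T

def apply_affine(A, pts):
    """Apply the 2x3 affine correction to uncorrected image points (n, 2)."""
    return (A @ np.hstack([pts, np.ones((len(pts), 1))]).T).T
```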

4) Example: nonlinear method of calibration
- Combines interior and exterior calibration.
- Can correct lens distortions.
- 2D-3D point correspondences of calibration points are given.

Since, from the exterior orientation and projection model,

$$X = f\,\frac{r_{11}x + r_{12}y + r_{13}z + t_x}{r_{31}x + r_{32}y + r_{33}z + t_z}, \qquad Y = f\,\frac{r_{21}x + r_{22}y + r_{23}z + t_y}{r_{31}x + r_{32}y + r_{33}z + t_z},$$

and the interior calibration model is

$$X = \tilde{X} - X_p + \delta X, \qquad Y = \tilde{Y} - Y_p + \delta Y,$$

we obtain a system of nonlinear equations in the exterior and interior parameters. The orthonormal property of the rotation matrix is enforced by expressing the rotation with 3 Euler angles.

With a set of 2D-3D point correspondences, we can solve for the unknowns iteratively, with initial guesses:
- $p_0^{(0)}$ = the (approximate) camera location in the absolute coordinate system;
- $f^{(0)}$ = the nominal value of the focal length of the camera.
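A sketch of this nonlinear fit, assuming a Z-Y-X Euler convention, a single radial distortion term $k_1$, and scipy's least_squares as the iterative solver; the parameter layout and names are choices for this example.

```python
import numpy as np
from scipy.optimize import least_squares

def euler_to_rot(a, b, c):
    """Rotation matrix from three Euler angles (Z-Y-X convention)."""
    ca, sa = np.cos(a), np.sin(a)
    cb, sb = np.cos(b), np.sin(b)
    cc, sc = np.cos(c), np.sin(c)
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, cc, -sc], [0, sc, cc]])
    return Rz @ Ry @ Rx

def reprojection_residuals(params, pts3d, pts2d):
    """Residuals between corrected image points and ideal projections.
    params = (3 Euler angles, 3 translations, f, Xp, Yp, k1)."""
    a, b, c, tx, ty, tz, f, Xp, Yp, k1 = params
    R, t = euler_to_rot(a, b, c), np.array([tx, ty, tz])
    cam = pts3d @ R.T + t                      # scene -> camera coordinates
    X = f * cam[:, 0] / cam[:, 2]              # ideal perspective projection
    Y = f * cam[:, 1] / cam[:, 2]
    x, y = pts2d[:, 0] - Xp, pts2d[:, 1] - Yp  # interior model, radial term k1
    r2 = x*x + y*y
    return np.concatenate([x*(1 + k1*r2) - X, y*(1 + k1*r2) - Y])

# initial guess: zero angles, rough camera location, nominal f, no distortion
# x0 = [0, 0, 0, *p0_guess, f_nominal, 0, 0, 0]
# sol = least_squares(reprojection_residuals, x0, args=(pts3d, pts2d))
```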