1
Course 12 Calibration
2
1. Introduction
In the theoretical discussion we have assumed:
----- The camera is located at the origin of the scene coordinate system.
----- The optic axis of the camera points in the z-direction of the scene coordinates.
----- The image plane is perpendicular to the z-axis, with its origin at (0, 0, f).
----- The X and Y axes of the image coordinates are parallel to the x and y axes of the scene coordinates, respectively.
----- The camera has no distortions, i.e. it is an ideal pinhole camera.
3
1) What we need to calibrate:
----- Absolute orientation: between two coordinate systems, e.g., robot coordinates and model coordinates.
----- Relative orientation: between two camera systems.
----- Exterior orientation: between the camera and scene systems.
----- Interior orientation: within a camera, such as the camera constant, principal point, lens distortion, etc.
4
2) Coordinate systems:
----- Scene coordinates (global coordinates, world coordinates, absolute coordinates).
----- Camera coordinates.
----- Image coordinates.
----- Pixel coordinates: for an image of size m×n, the image center is at pixel (m/2, n/2).
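The conversion from pixel indices to image-plane coordinates implied by this list can be sketched as follows; placing the principal point at (m/2, n/2) and using a single pixel pitch are illustrative assumptions, not statements from the slides.

import numpy as np

def pixel_to_image(col, row, m, n, pixel_pitch=1.0):
    """Convert pixel coordinates (col, row) of an m x n image to
    image-plane coordinates, assuming the principal point lies at the
    image center (m/2, n/2) and pixels are square with spacing pixel_pitch."""
    x = (col - m / 2.0) * pixel_pitch
    y = (row - n / 2.0) * pixel_pitch
    return np.array([x, y])

# Example: the center pixel of a 640 x 480 image maps to (0, 0)
print(pixel_to_image(320, 240, 640, 480))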
5
2. Absolute orientation
To find the relationship between two coordinate systems from 3 or more 3D points that have been expressed in both coordinate systems. Let a 3D point p be expressed as p_m in the model coordinate system and as p_a in the absolute coordinate system. Then

p_a = R p_m + p_0

where R is the rotation of the model coordinates with respect to the absolute coordinates, and p_0 is the origin of the model coordinate system expressed in the absolute coordinate system.
6
Given the point pairs (p_m,i, p_a,i), i = 1, ..., n, we want to find R and p_0. This is the same expression as 3D motion estimation from 3D points. Therefore, all the algorithms for motion estimation (using 3D points) can be used to solve for the absolute orientation.
7
One should remember that the rotation matrix is constrained to be orthonormal, i.e. R^T R = I. This adds 6 additional equations when solving for the rotation.
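The orthonormality condition packs six independent scalar constraints: three unit-length conditions on the columns and three pairwise orthogonality conditions. A quick numerical check, purely for illustration:

import numpy as np

def orthonormality_residuals(R):
    """The six independent residuals of R^T R = I: three unit-length
    conditions on the columns and three orthogonality conditions."""
    c0, c1, c2 = R[:, 0], R[:, 1], R[:, 2]
    return np.array([c0 @ c0 - 1.0, c1 @ c1 - 1.0, c2 @ c2 - 1.0,
                     c0 @ c1, c0 @ c2, c1 @ c2])

# A true rotation (90 degrees about z) satisfies all six constraints
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
print(orthonormality_residuals(Rz))   # all entries are (close to) zero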
8
1) Solving for the rotation with a quaternion
The orthonormality constraint of the rotation matrix is absorbed in the quaternion expression. In the absolute coordinate system, a set of points on a 3D object is {p_a,i}; in the model coordinate system the same points are correspondingly measured as {p_m,i}. They satisfy

p_a,i = R p_m,i + p_0.
9
In the absolute coordinate system, the centroid of the point set is the mean of the p_a,i, and the ray from the centroid to point i is

r_a,i = p_a,i - centroid_a.

In the same way, in the model coordinate system, r_m,i = p_m,i - centroid_m. Since r_a,i and the rotated ray R r_m,i are parallel (the translation cancels out), we can solve for the rotation by least squares.
10
Using a quaternion q to express the rotation, and recalling that rotating a vector r by q can be written as the quaternion product q r q* (with r treated as a purely imaginary quaternion), the parallelism condition becomes

q r_m,i = r_a,i q.
11
Each quaternion product is linear in q, so every point pair contributes a homogeneous 4x4 system A_i q = 0; stacking all pairs gives

A q = 0, with ||q|| = 1,

where A collects the coefficient matrices of all point pairs. This equation can be solved by linear least squares, e.g. by SVD (q is the right singular vector of A associated with the smallest singular value). After R is found, the position p_0 can be easily calculated from the centroids:

p_0 = centroid_a - R centroid_m.
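A compact sketch of this quaternion solution, assuming the p_a = R p_m + p_0 model above; function and variable names are illustrative.

import numpy as np

def quat_to_rot(q):
    """Rotation matrix from a unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def absolute_orientation(p_model, p_abs):
    """Solve p_abs = R p_model + p0 from >= 3 non-collinear point pairs.
    Rows of p_model / p_abs are corresponding 3D points."""
    cm, ca = p_model.mean(axis=0), p_abs.mean(axis=0)
    blocks = []
    for (bx, by, bz), (ax, ay, az) in zip(p_model - cm, p_abs - ca):
        # q * r_m - r_a * q = 0, written as a 4x4 matrix acting on q
        Rm = np.array([[0, -bx, -by, -bz],
                       [bx,  0,  bz, -by],
                       [by, -bz,  0,  bx],
                       [bz,  by, -bx,  0]])
        La = np.array([[0, -ax, -ay, -az],
                       [ax,  0, -az,  ay],
                       [ay,  az,  0, -ax],
                       [az, -ay,  ax,  0]])
        blocks.append(Rm - La)
    A = np.vstack(blocks)
    _, _, Vt = np.linalg.svd(A)
    q = Vt[-1] / np.linalg.norm(Vt[-1])    # unit quaternion, smallest singular value
    R = quat_to_rot(q)
    p0 = ca - R @ cm                       # origin of the model system in absolute coords
    return R, p0

With exact data the smallest singular value is zero; with noisy measurements this gives the least-squares rotation.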
12
2) Scale problem
If the absolute coordinate system and the model coordinate system use different units of measurement, a scale problem is introduced. Noticing that the distances between points of the 3D scene are not affected by the choice of coordinate system, we can easily solve for the scale factor as the ratio of corresponding inter-point distances, e.g.

s = ||p_a,i - p_a,j|| / ||p_m,i - p_m,j||,

or, more robustly, the ratio of the sums of all such distances.
13
Once the scale factor is found, the model points can be rescaled and the problem becomes an ordinary absolute orientation problem.
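A short sketch of that two-step procedure, reusing the absolute_orientation function from the sketch above; averaging over all point pairs is an illustrative choice.

import numpy as np
from itertools import combinations

def scale_factor(p_model, p_abs):
    """Ratio of inter-point distances; distances do not depend on the
    choice of coordinate system, so their ratio gives the scale."""
    pairs = list(combinations(range(len(p_model)), 2))
    d_abs = sum(np.linalg.norm(p_abs[i] - p_abs[j]) for i, j in pairs)
    d_model = sum(np.linalg.norm(p_model[i] - p_model[j]) for i, j in pairs)
    return d_abs / d_model

# s = scale_factor(p_model, p_abs)
# R, p0 = absolute_orientation(s * p_model, p_abs)   # ordinary absolute orientation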
14
3. Relative orientation
To determine the relationship between two camera coordinate systems from the projections of corresponding points in the two cameras (e.g., in the stereo case). That is to say, given pairs of image point correspondences, find the rotation and translation of the camera from one position to the other.
15
1) Solving the relative orientation problem by motion estimation
In motion estimation, the camera is stationary and the object is moving; point correspondences are used to find the object's rotation and translation. In the stereo case, we can imagine that a camera first takes an image of the scene at the left position and then moves to the right position to take the second image of the same scene. The scene is stationary and the camera is moving.
16
Using the image correspondences to find the rotation and translation is then the same as the stationary-camera case, with the recovered motion interpreted as the camera motion instead of the object motion (the two are inverses of each other). Using the 8-point method, one can solve for the rotation and the translation, the latter only up to scale.
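A minimal sketch of the 8-point method for this step: estimate the essential matrix from normalized image correspondences and decompose it. The cheirality test that selects the physically valid (R, t) pair is omitted here, and the helper names are illustrative.

import numpy as np

def essential_8pt(x1, x2):
    """Linear 8-point estimate of the essential matrix E from >= 8
    normalized correspondences (rows are [x, y], already divided by the
    focal length), satisfying x2_h^T E x1_h = 0."""
    A = []
    for (u1, v1), (u2, v2) in zip(x1, x2):
        A.append([u2*u1, u2*v1, u2, v2*u1, v2*v1, v2, u1, v1, 1])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    E = Vt[-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(E)              # enforce two equal singular values
    s = (S[0] + S[1]) / 2.0                  # and one zero singular value
    return U @ np.diag([s, s, 0.0]) @ Vt

def decompose_essential(E):
    """Two candidate rotations and the baseline direction (up to sign)."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    return U @ W @ Vt, U @ W.T @ Vt, U[:, 2]   # R1, R2, t (scale unknown)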
17
2) Iterative method
Let r_l and r_r be the directions from the left and right camera centers to a scene point, respectively. Since the baseline b of the stereo system lies in the epipolar plane, it is perpendicular to the normal of that plane, so we have the coplanarity condition

b . (r_l x R r_r) = 0.

It can be seen that the baseline can only be solved up to a scale factor, so we impose the constraint ||b|| = 1.
18
By least squares, five or more stereo point pairs are needed to solve the relative orientation problem. In updating, we use an increment delta_b for the baseline and a small rotation increment delta_omega for the rotation; parameterizing the update this way keeps the rotation matrix orthonormal, i.e. the rotation is updated as

R <- R(delta_omega) R

where R(delta_omega) is the rotation built from the small increment delta_omega.
19
For the baseline, the iterative formula is

b <- b + delta_b, subject to ||b|| = 1

(the baseline is renormalized after each update).
20
4. Exterior orientation
Using image points (X, Y) in image coordinates and the corresponding 3D points (x, y, z) of the scene in the absolute coordinate system, determine the relationship between the camera system and the absolute system. We assume that the image plane and the camera are well calibrated. Let a scene point be expressed as p = (x, y, z) in the absolute coordinate system and as p_c = (x_c, y_c, z_c) in the camera coordinate system.
21
Expressing the scene point in the camera coordinate system, p_c = R p + T, gives three component equations (1), (2), (3). Dividing (1)/(3) and (2)/(3), and considering the perspective projection X = f x_c / z_c, Y = f y_c / z_c, we have:

X / f = (r11 x + r12 y + r13 z + t_x) / (r31 x + r32 y + r33 z + t_z)
Y / f = (r21 x + r22 y + r23 z + t_y) / (r31 x + r32 y + r33 z + t_z)
22
Or, after cross-multiplying, each point gives two equations that are linear and homogeneous in the unknowns. Since X, Y, f and (x, y, z) are known, there are 12 unknowns, 9 for the rotation and 3 for the translation, so at least 6 points are needed to provide 12 equations. However, if the 6 orthonormality constraints on the rotation are taken into account, the minimum number of points required to calibrate the exterior orientation is 3.
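A sketch of the linear solution of these homogeneous equations, assuming a known camera constant f; the final orthonormalization and the front-of-camera check are simplified, and the function name is illustrative.

import numpy as np

def exterior_orientation_linear(pts3d, pts2d, f):
    """Estimate R, T from >= 6 correspondences between scene points
    (x, y, z) and image points (X, Y), with known camera constant f.
    The 12 unknowns [r11..r33, tx, ty, tz] appear homogeneously, so the
    SVD null vector is taken, then rescaled and orthonormalized."""
    A = []
    for (x, y, z), (X, Y) in zip(pts3d, pts2d):
        # X (r3.p + tz) = f (r1.p + tx)   and   Y (r3.p + tz) = f (r2.p + ty)
        A.append([-f*x, -f*y, -f*z, 0, 0, 0, X*x, X*y, X*z, -f, 0, X])
        A.append([0, 0, 0, -f*x, -f*y, -f*z, Y*x, Y*y, Y*z, 0, -f, Y])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    v = Vt[-1]
    R_raw, T = v[:9].reshape(3, 3), v[9:]
    scale = np.linalg.norm(R_raw[2])      # the third row of a rotation has unit norm
    R_raw, T = R_raw / scale, T / scale
    if np.linalg.det(R_raw) < 0:          # resolve the overall sign of the null vector
        R_raw, T = -R_raw, -T
    U, _, Vt2 = np.linalg.svd(R_raw)      # project onto the nearest rotation matrix
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt2)]) @ Vt2
    return R, T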
23
5. Interior orientation
To determine the internal geometry of the camera, such as:
----- Camera constant: the distance from the image plane to the projection center.
----- Principal point: the origin of the image-plane coordinate system.
----- Lens distortion coefficients: optical properties of the lens.
----- Scale factors: the spacing between rows and columns.
24
1) Calibration model of the camera:
----- Uncorrected image coordinates (X', Y'), as measured in the image.
----- True image coordinates (X, Y).
----- Principal point (X_p, Y_p), expressed in the uncorrected system.
The radial distortion correction has the form

X = X' + (X' - X_p)(k1 r^2 + k2 r^4 + ...),  Y = Y' + (Y' - Y_p)(k1 r^2 + k2 r^4 + ...),

where r^2 = (X' - X_p)^2 + (Y' - Y_p)^2.
25
Lens decentering adds the terms

dX = p1 (r^2 + 2(X' - X_p)^2) + 2 p2 (X' - X_p)(Y' - Y_p)
dY = p2 (r^2 + 2(Y' - Y_p)^2) + 2 p1 (X' - X_p)(Y' - Y_p)

where p1 and p2 are the decentering coefficients and r is the radial distance defined above.
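A small sketch of this correction in code, using the standard radial plus decentering form; the coefficient names k1, k2, p1, p2 are conventional choices, not taken from the slides.

import numpy as np

def correct_distortion(Xu, Yu, Xp, Yp, k=(0.0, 0.0), p=(0.0, 0.0)):
    """Map uncorrected image coordinates (Xu, Yu) to corrected ones,
    applying radial terms (k1, k2) and decentering terms (p1, p2)
    about the principal point (Xp, Yp)."""
    dx, dy = Xu - Xp, Yu - Yp
    r2 = dx*dx + dy*dy
    radial = k[0]*r2 + k[1]*r2*r2
    X = Xu + dx*radial + p[0]*(r2 + 2*dx*dx) + 2*p[1]*dx*dy
    Y = Yu + dy*radial + p[1]*(r2 + 2*dy*dy) + 2*p[0]*dx*dy
    return X, Y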
26
2) Calibration method
----- Straight lines in the scene should appear as straight lines in the image.
----- Place straight lines in the scene, e.g. a sheet of paper printed with lines.
----- Take an image of the straight lines in the scene.
----- In the image, use the Hough transform to group edge points into lines.
27
Substitute the calibration model into the equation of a straight line,

a_j X_ij + b_j Y_ij + c_j = 0,

where (X_ij, Y_ij) denotes the i-th corrected point on the j-th line. Use least squares to find the unknown parameters.
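A sketch of that least-squares step, reusing correct_distortion from above: for a candidate set of interior parameters, every corrected edge point should lie on the best-fit line of its group, so the perpendicular distances serve as residuals. The parameter vector and the use of SciPy are illustrative choices.

import numpy as np
from scipy.optimize import least_squares

def line_residuals(params, lines_pts):
    """params = [Xp, Yp, k1, k2, p1, p2]; lines_pts is a list of (n_i, 2)
    arrays of uncorrected edge points, one array per grouped line."""
    Xp, Yp, k1, k2, p1, p2 = params
    res = []
    for pts in lines_pts:
        X, Y = correct_distortion(pts[:, 0], pts[:, 1], Xp, Yp, (k1, k2), (p1, p2))
        xy = np.column_stack([X, Y])
        centered = xy - xy.mean(axis=0)
        _, _, Vt = np.linalg.svd(centered)    # best-fit line through the points
        normal = Vt[-1]                       # its unit normal
        res.extend(centered @ normal)         # signed distances to the fitted line
    return np.asarray(res)

# fit = least_squares(line_residuals, x0=[320.0, 240.0, 0, 0, 0, 0], args=(lines_pts,))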
28
3) Example: affine method for camera calibration
----- Combines interior and exterior calibration.
----- Models scale error, translation error, rotation, skew error and differential scaling.
----- Cannot correct lens distortion.
----- 2D-3D correspondences of calibration points are given.
Correction model: an affine transformation with matrix A.
29
Since A is an arbitrary matrix, the transformation can represent rotation, translation, scaling and skew. Using the projection model, the affine transformation becomes a relation between the measured image coordinates and the scene coordinates.
30
From the exterior orientation of the camera, the scene point is first expressed in camera coordinates. Substituting this exterior orientation relation into the affine transformation model, the exterior orientation parameters are absorbed into a single set of combined coefficients.
31
With known 3D-2D point correspondences, the combined coefficients can be computed by SVD. After the coefficients are obtained, the affine transformation matrix can be formed.
32
Virtual scene points can then be calculated from the uncorrected image points, and the calibrated image points are obtained by projecting these virtual scene points through the ideal pinhole model.
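One plausible realization of the coefficient-solving step is a DLT-style linear estimate of a combined 3x4 matrix from the 2D-3D correspondences by SVD; this is offered as an illustrative stand-in, since the slides' exact affine formulation is not shown.

import numpy as np

def combined_coefficients(pts3d, pts2d):
    """Estimate a 3x4 matrix C of combined (interior + exterior)
    coefficients from >= 6 correspondences; each pair contributes two
    homogeneous linear equations in the 12 entries of C."""
    A = []
    for (x, y, z), (X, Y) in zip(pts3d, pts2d):
        A.append([x, y, z, 1, 0, 0, 0, 0, -X*x, -X*y, -X*z, -X])
        A.append([0, 0, 0, 0, x, y, z, 1, -Y*x, -Y*y, -Y*z, -Y])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def reproject(C, p):
    """Image point predicted by the estimated coefficient matrix."""
    Xh = C @ np.append(p, 1.0)
    return Xh[:2] / Xh[2]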
33
4) Example: nonlinear method of calibration
----- Combines interior and exterior calibration.
----- Can correct lens distortions.
----- 2D-3D correspondences of calibration points are given.
Combining the projection equations of the exterior orientation (p_c = R p + T, X = f x_c / z_c, Y = f y_c / z_c) with the interior calibration model (principal point and lens distortion corrections):
34
We have a set of nonlinear equations relating the measured image points to the scene points through all interior and exterior parameters. The orthonormality of the rotation matrix is enforced by expressing the rotation with 3 Euler angles.
35
With a set of 2D-3D point correspondences, we can solve for the unknowns iteratively, with initial guesses:
----- translation: the approximate camera location in the absolute coordinate system;
----- camera constant: the nominal focal length of the camera.
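A minimal sketch of such an iterative solution, assuming an Euler-angle parametrization, a single radial distortion coefficient, and SciPy's least_squares as the solver; all parameter names here are illustrative, not taken from the slides.

import numpy as np
from scipy.optimize import least_squares

def euler_to_rot(a, b, c):
    """Rotation from three Euler angles (z-y-x convention, an assumed choice)."""
    Rz = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
    Ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
    Rx = np.array([[1, 0, 0], [0, np.cos(c), -np.sin(c)], [0, np.sin(c), np.cos(c)]])
    return Rz @ Ry @ Rx

def calibration_residuals(params, pts3d, pts2d):
    """Reprojection residuals: exterior orientation + pinhole projection
    + principal point + one radial distortion term vs. measured points."""
    a, b, c, tx, ty, tz, f, Xp, Yp, k1 = params
    R, T = euler_to_rot(a, b, c), np.array([tx, ty, tz])
    res = []
    for p, (Xm, Ym) in zip(pts3d, pts2d):
        xc, yc, zc = R @ np.asarray(p, dtype=float) + T
        X, Y = f * xc / zc, f * yc / zc              # ideal projection
        r2 = X*X + Y*Y
        res.extend([Xp + X*(1 + k1*r2) - Xm,         # interior model applied
                    Yp + Y*(1 + k1*r2) - Ym])
    return np.asarray(res)

# x0: rough angles and camera position, nominal focal length, zeros for the rest
# fit = least_squares(calibration_residuals, x0, args=(pts3d, pts2d))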