Active Calibration of Cameras: Theory and Implementation. Anup Basu. Presented by Sung Huh, CPSC 643 Individual Presentation II, March 4th, 2009.

Outline Introduction Theoretical Derivation Strategies for Active Calibration Theoretical Error Analysis Experimental Result Conclusion and Future Work
Introduction Calibration is an important step in relating a 2D image to the 3D world It involves relating the optical features of a lens to the sensing device ◦ Applications: pose estimation, 3D motion estimation, automated assembly Parameters: image center and focal length ◦ Expressed in terms of image pixels Methods divide into linear vs. nonlinear, and with vs. without lens-distortion modeling
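The role of these parameters can be made concrete with the standard pinhole projection model; a minimal numpy sketch (the function name is ours, and fx, fy, cx, cy follow the talk's notation for the focal lengths and image center in pixels):

```python
import numpy as np

def project(points_3d, fx, fy, cx, cy):
    """Project camera-frame 3D points with a pinhole model:
    x = fx * X / Z + cx,  y = fy * Y / Z + cy  (all in pixel units)."""
    P = np.asarray(points_3d, dtype=float)
    x = fx * P[:, 0] / P[:, 2] + cx
    y = fy * P[:, 1] / P[:, 2] + cy
    return np.stack([x, y], axis=1)
```

Calibration is the inverse problem: recovering fx, fy, cx, cy from images alone.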

Technique: Linearity Linear: simpler to implement, but most linear methods cannot model camera distortions Nonlinear: can handle a complicated imaging model with many parameters, but requires a computationally expensive search procedure and a reasonably good initial guess for the solutions to converge

Major Drawback of Existing Algorithms Calibration requires a predefined pattern ◦ relating image projections to the camera parameters Recent algorithms suffer from the same limitation The proposed alternative: Active Calibration

Active Calibration A camera capable of panning and tilting can calibrate itself automatically ◦ modeled on eye movement Active machines can keep track of objects of interest Facilitates region-of-interest processing

Active Calibration – How is it different? Does not need a starting estimate for the focal length or image center Does not need prior information about the focal length Does not need to match points or features between images Needs only reasonably accurate localization of contours and an estimate of the center that is not too far from the true value

Methods of Calibration Using perspective distortion to measure the calibration parameters Without using perspective distortion

Outline Introduction Theoretical Derivation Strategies for Active Calibration Theoretical Error Analysis Experimental Result Conclusion and Future Work

Theoretical Derivation

Theoretical Derivation Lemma 1 The camera rotates by R and translates by T, giving the new image contours

Theoretical Derivation Lemma 1 – Proof Use the two sets of projection equations

Theoretical Derivation Proposition 1 The depth Z is much larger than ΔX, ΔY, and ΔZ The camera moves by a small tilt angle

Theoretical Derivation Proposition 1 – Proof Higher-order terms of the rotation matrix R are negligible at a small tilt angle Apply Lemma 1

Theoretical Derivation Proposition 1 – Proof Expand the right side of the equation as a Taylor series, exploiting the small θt With the same assumptions, an analogous result holds if the camera moves by a small pan angle θp
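The small-angle expansion can be checked numerically. The first-order relation below, x' ≈ x + θp(fx + x²/fx) for a small pan (image coordinates taken relative to the center), is the standard result this kind of Taylor expansion produces; the specific numbers are made up for the check:

```python
import numpy as np

fx, X, Z, theta = 800.0, 0.3, 2.0, 0.01   # focal length (px), 3D point, pan angle (rad)
x = fx * X / Z                            # projection before the pan

# exact projection after rotating the camera by theta about its y-axis
Xr = X * np.cos(theta) + Z * np.sin(theta)
Zr = -X * np.sin(theta) + Z * np.cos(theta)
x_exact = fx * Xr / Zr

# first-order (small-angle) approximation
x_approx = x + theta * (fx + x * x / fx)

assert abs(x_exact - x_approx) < 0.1      # agree to a fraction of a pixel
```

For a 0.01 rad pan the approximation agrees with the exact projection to well under a tenth of a pixel, which is why the higher-order terms can be dropped in the proof.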

Outline Introduction Theoretical Derivation Strategies for Active Calibration Theoretical Error Analysis Experimental Result Conclusion and Future Work

Strategy for Active Calibration Goal: a relation between the lens parameters and image information, given image contours before and after camera motion Relate the focal length to the other camera parameters and the pan/tilt angles

Strategy for Active Calibration Proposition 2 Same assumptions as Proposition 1 The center of the lens is estimated with a small error (δx, δy)

Strategy for Active Calibration Proposition 2 – Proof Start from Proposition 1 Substitute the image-center estimate with error (δx, δy) Ignore the δxδy term

Strategy for Active Calibration Proposition 3 (Plan A) Using a tilt (or pan) movement and three independent static contours, two linear equations in δx and δy can be obtained if negligible terms are ignored

Strategy for Active Calibration Proposition 3 – Proof Take two different contours, C1 and C2 Points lying on C1 and C2: (x(1), y(1)) and (x(2), y(2)) Apply Proposition 2

Strategy for Active Calibration Proposition 3 – Proof Equate the right-hand sides of the equations and simplify

Strategy for Active Calibration Proposition 3 – Proof Introduce a third contour, C3, with a point (x(3), y(3)) on it, and repeat the step

Strategy for Active Calibration Proposition 3 – Proof Find fx and fy using the estimated center; a superscript “e” denotes the estimate of a parameter

Strategy for Active Calibration Procedure Summary for Plan A Estimate δx and δy using (3) and (4) with three distinct image contours Obtain estimates for fx and fy by substituting the results into (5) and (6) Certain small terms can make (5) and (6) unstable
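The transcript does not reproduce the explicit coefficients of (3) and (4), but the first step of Plan A reduces to solving two linear equations in the center error, one per contour pair: a_i·δx + b_i·δy = c_i. A generic sketch with hypothetical coefficient rows (a_i, b_i, c_i):

```python
import numpy as np

def solve_center_error(coeffs):
    """Solve the two linear equations in (dx, dy) produced by pairing
    three static contours, as in equations (3) and (4) of the talk.
    `coeffs` is a 2x3 array of rows (a_i, b_i, c_i), meaning
    a_i*dx + b_i*dy = c_i; the coefficient forms themselves are not
    given in the transcript, so they are assumed inputs here."""
    A = np.asarray(coeffs, dtype=float)
    return np.linalg.solve(A[:, :2], A[:, 2])
```

The resulting (δx, δy) is then substituted into (5) and (6) to get the focal lengths.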

Strategy for Active Calibration Procedure Summary for Plan A Under tilt, the variation in the x-coordinate of any point is due only to the change in perspective distortion; under pan, there is little change in the image y-coordinate of a given 3D point Both changes are small (a few pixels), so the relative error can be large ◦ in the presence of noise and inaccuracies in the localization of a contour The estimates from (5) and (6) are therefore often unreliable

Strategy for Active Calibration Proposition 4 (Plan B) Using a single contour and pan/tilt camera movements, fx and fy can be obtained if negligible terms are ignored

Strategy for Active Calibration Proposition 4 – Proof Allow δx and δy to be non-zero in the second equation of Proposition 1, then simplify The last three terms are negligible even if δx and δy are large

Strategy for Active Calibration Proposition 4 – Proof Simplify equation (7) fx can be obtained in a similar way

Strategy for Active Calibration Proposition 4 – Corollary Given two independent contours, pan/tilt camera movements, and estimates of fx and fy, δx and δy can be obtained by solving the equations of Proposition 2 for the two independent contours

Strategy for Active Calibration Proposition 4 – Proof Consider (8) For most practical systems, the image coordinates are bounded (|y| ≤ 500) (8) is a quadratic with A = 1, B < 0, and C small compared to B

Strategy for Active Calibration Procedure Summary for Plan B Estimate fx and fy from (12) and (13) using a single image contour Solve for δx and δy by substituting the resulting estimates into (10) and (11) and using another independent contour
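A sketch of Plan B's focal-length step, assuming the standard small-pan shift x' − x ≈ θp(fx + x²/fx) for center-relative coordinates: averaging the shift over the contour points yields a quadratic in fx of exactly the A = 1, B < 0 form described on the Proposition 4 proof slide. Function and variable names are ours:

```python
import numpy as np

def estimate_fx_plan_b(x_before, x_after, theta_p):
    """Estimate fx from a single contour's center-relative x-coordinates
    before and after a small pan theta_p. Averaging the per-point shift
    x' - x ~= theta_p * (fx + x^2/fx) gives the quadratic
    f^2 - (mean_shift/theta_p) * f + mean(x^2) = 0; the focal length is
    the larger root when fx exceeds the RMS image coordinate."""
    b = -np.mean(np.asarray(x_after) - np.asarray(x_before)) / theta_p
    c = np.mean(np.asarray(x_before) ** 2)
    return (-b + np.sqrt(b * b - 4.0 * c)) / 2.0

# synthetic check: contour points at depth Z, panned by a small angle
fx_true, Z, theta = 800.0, 2.0, 0.02
X = np.linspace(-0.5, 0.5, 101)
x0 = fx_true * X / Z
x1 = fx_true * (X * np.cos(theta) + Z * np.sin(theta)) / (
    -X * np.sin(theta) + Z * np.cos(theta))
```

On this noise-free synthetic contour the estimate recovers the true focal length to a fraction of a pixel.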

Strategy for Active Calibration Proposition 5 When there is error in contour localization after the pan/tilt movements, the ratio of the error in Plan A compared to Plan B for estimating fx (fy) is approximately

Strategy for Active Calibration Proposition 5 – Proof 1. Introduce similar error terms into (13) and (5), respectively 2. Simplify the expressions and consider the approximate magnitude of the error in each 3. Take the ratio of these two terms

Strategy for Active Calibration Proposition 5 – Implication The error in Plan A can be as large as 30 times that of Plan B when estimating the focal lengths Plan A is theoretically more precise, but not reliable for noisy real scenes

Outline Introduction Theoretical Derivation Strategies for Active Calibration Theoretical Error Analysis Experimental Result Conclusion and Future Work

Theoretical Error Analysis Effect of errors from various sources on the estimation of the different parameters ◦ errors in the measurement of the pan/tilt angles ◦ noise in the extraction of image contours

Theoretical Error Analysis Remark 1 An error in the measurement of the pan (tilt) angle generates a proportional error in the estimate of fx (fy) Proof Consider (5): fx is proportional to the pan angle ◦ any error in the measurement translates into a corresponding error in fx Similarly, any error in the tilt angle generates a proportional error in fy

Theoretical Error Analysis Remark 2 Errors in the measurement of the pan and tilt angles do not affect the estimate of the lens center ◦ provided the independent contours are taken from the same image Proof Linear equations in δx and δy are obtained by equating the right-hand sides of two equations

Theoretical Error Analysis Consider (1) and (2) Let ε1 denote the error in the tilt angle If the contours are extracted from the same image, the factor (θt + ε1) in (3) cancels from both sides ◦ errors in the pan/tilt angles do not affect the estimate of the lens center

Theoretical Error Analysis Now consider two independent images generating the contours in (1) and (2) K1 in (3) is modified to a quantity that is not equal to 1 in general Errors in the angles can therefore change the estimate of the lens center when contours from independent images are considered

Theoretical Error Analysis Remark 3 The coefficients of the linear equations (3)–(6) are unbiased in the presence of uncorrelated zero-mean noise Proof The coefficients involve a linear combination of terms that are themselves unbiased under uncorrelated zero-mean noise

Theoretical Error Analysis Remark 4 The variance of the coefficients of (3)–(6) is inversely proportional to the number of points on a contour ◦ assuming uncorrelated zero-mean noise The variances are inversely proportional to the number of points over which the averages are computed
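Remark 4 can be verified with a quick simulation: under zero-mean uncorrelated noise, the variance of an averaged quantity falls off as 1/N in the number of points averaged. The noise level and point counts below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, trials = 2.0, 2000  # illustrative noise std-dev and trial count

def var_of_mean(n_points):
    """Sample variance of the average of n_points noisy contour
    coordinates, estimated over many independent trials."""
    noise = rng.normal(0.0, sigma, size=(trials, n_points))
    return np.var(noise.mean(axis=1))

v10, v1000 = var_of_mean(10), var_of_mean(1000)
# variance of the averaged coefficient scales like sigma^2 / n
assert v10 > 50 * v1000
```

With 100x more points on the contour, the variance of the averaged coefficient drops by roughly 100x, which is why long, well-localized contours give stabler estimates.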

Outline Introduction Theoretical Derivation Strategies for Active Calibration Theoretical Error Analysis Experimental Result Conclusion and Future Work

Experimental Result Simulation – Validity of the Algorithms Synthetic data used Three independent contours represented by three sets of 3D points Points projected onto the image plane Values quantized to the nearest integer Without noise, Plan A produced the more accurate estimates ◦ less than 1% relative error in the focal length estimates

Experimental Result Simulation – Variation of the Error in the Focal Length Estimate Vary fx and fy from 100 to 1000 in steps of 100 ◦ keeping the other parameters fixed Discretization error influenced Plan A more when the focal length was small ◦ Plan A is not very robust to noise The larger the focal length, the smaller the error relative to the focal length ◦ Plan A then produced the better estimates

Experimental Result Simulation – Variation of the Error in the Focal Length Estimate The error of the estimates from Plan B does not drop off as rapidly ◦ Plan B is theoretically less accurate than Plan A

Experimental Result Simulation – Gaussian Noise Added Poor performance with Plan A ◦ 20%, 28%, and 40% error at noise standard deviations of 3, 4, and 5 High robustness with Plan B

Experimental Result Tracking Contours Match the contours of interest during pan/tilt ◦ for automatic calibration Edges in the original image were thickened using the morphological operation of “dilation” Edges after the pan/tilt were AND-ed with the dilated image to extract the corresponding contours after the camera rotation
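The dilate-and-AND matching step described above can be sketched without any image-processing library; a minimal numpy illustration (the dilation radius and function names are ours):

```python
import numpy as np

def dilate(mask, r=1):
    """Binary dilation of a boolean edge mask with a (2r+1)x(2r+1)
    square structuring element, implemented as shifted logical ORs."""
    out = mask.copy()
    H, W = mask.shape
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            src = mask[max(-dy, 0):H + min(-dy, 0),
                       max(-dx, 0):W + min(-dx, 0)]
            out[max(dy, 0):H + min(dy, 0),
                max(dx, 0):W + min(dx, 0)] |= src
    return out

def track_contours(edges_before, edges_after, r=2):
    """Keep only the post-rotation edges that fall inside the dilated
    (thickened) edges of the original image, i.e. the slightly shifted
    versions of the original contours."""
    return edges_after & dilate(edges_before, r)
```

The radius r bounds how far a contour may shift between frames; edges elsewhere in the new image are discarded.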

A sequence of images for small pan movements of a camera

Corresponding edge images

Contours tracked over the sequence of images

Experimental Result Calibration with Real Images – initial image and its edge image

Experimental Result Calibration with Real Images – panned image and its edge image

Experimental Result Calibration with Real Images – matching contours

Experimental Result Calibration with Real Images Plan A estimates of fx and fy: 693 and 981 ◦ reference values from a known pattern, refined by trial and error: 890 and 1109 Plan B estimates of fx and fy: 917 and 1142 ◦ Plan B produces estimates fairly close to the true values

Experimental Result Other Environments

Experimental Result Other Environments Estimates produced using Plan B: 902 and 1123, and 905 and 1099 Average relative error < 1.5% Plan A produced very inaccurate estimates; Plan B produced stable estimates The lens-center estimates are not very accurate for either plan

Outline Introduction Theoretical Derivation Strategies for Active Calibration Theoretical Error Analysis Experimental Result Conclusion and Future Work

Conclusion The algorithms do not require a unique calibration pattern ◦ they only need scenes with strong and stable edges Plan A gives almost perfect estimates in an ideal environment Plan B is suitable for noisy synthetic images or real scenes

Future Work Simplify the algorithm further by considering roll movements of the camera Design a simple method to obtain the optical center of the lens

Questions?