
Intelligent Systems Lab. Extrinsic Self Calibration of a Camera and a 3D Laser Range Finder from Natural Scenes. Davide Scaramuzza, Ahad Harati, and Roland Siegwart. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2007. Presenter: Van-Dung Hoang, October 12, 2013.

Intelligent Systems Lab. 2 Content
• Introduction
• Camera and LRF model
• Bearing angle images
• Extrinsic laser-camera calibration
• Experiments
• Conclusions

Intelligent Systems Lab. 3 Introduction
• Proposes a new method for determining the position and orientation of a 3D LRF with respect to a camera.
• The approach does not require a calibration object (e.g. a chessboard).
• The laser range data are visualized as a 3D range image in which object edges are highlighted.
• Corresponding features between the camera image and the laser image are selected manually.
• The extrinsic parameters are then estimated using the PnP method.

Intelligent Systems Lab. 4 Camera model
• The camera system consists of a perspective camera and a catadioptric mirror, and has a single center of projection.
• Image point to 3D point: (u, v) is a point in the image, and the corresponding viewing ray satisfies [x, y, z]^T = λ · F(u, v), where [x, y, z] is the ray from the camera center to the point in the world, λ is an unknown scale factor, and F is the projection function, which depends on the camera used.
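As a sketch of how such a projection function can be used in practice, the following back-projects a pixel to a unit viewing ray under a polynomial single-viewpoint model. The function name, polynomial coefficients, and image center below are hypothetical; real values come from intrinsic calibration of the specific mirror/lens.

```python
import numpy as np

def pixel_to_ray(u, v, poly, center):
    """Back-project an image point (u, v) to a unit viewing ray.

    Single-viewpoint omnidirectional model: the ray direction is
    [u', v', f(rho)], where (u', v') is the pixel relative to the image
    center, rho its radial distance, and f a calibrated polynomial
    (coefficients given lowest order first; values here are assumptions).
    """
    up, vp = u - center[0], v - center[1]
    rho = np.hypot(up, vp)
    # np.polyval expects highest-order coefficient first, hence the reversal
    ray = np.array([up, vp, np.polyval(poly[::-1], rho)])
    return ray / np.linalg.norm(ray)
```

A pixel at the image center maps to the mirror axis direction, and every returned ray has unit norm, so rays can be compared directly by angle.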

Intelligent Systems Lab. 5 LRF model
• The 3D laser system is constructed from a 2D SICK LMS200 laser scanner.
• It combines the rotation of the mirror inside the 2D scanner with an external rotation of the scanner itself.
• It is impossible to align the two centers of rotation exactly at the same point.
• These offset values have to be estimated by calibrating the 3D sensor with its observation model (covered in other work).
• This paper focuses on the extrinsic calibration of a 3D laser scanner with a camera in general; it does not depend on the sensor model.

Intelligent Systems Lab. 6 LRF model
• The sensor model maps each measurement (ρ_ij, θ_j, φ_i) to a 3D point, where ρ_ij is the j-th measured distance with orientation θ_j in the i-th scan line, and φ_i is the angle of the external rotation with respect to the horizontal plane.
• (d_x, d_z) is the offset of the external rotation axis from the center of the laser mirror.
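The mapping above can be sketched in code. This is one plausible parameterization only: the choice of rotation axis and where the (d_x, d_z) offset enters are assumptions, not the paper's exact conventions.

```python
import numpy as np

def lrf_point(rho, theta, phi, dx, dz):
    """Map one rotating-2D-scanner measurement to a 3D point (a sketch).

    The 2D scanner measures (rho, theta) in its own scan plane; that
    plane is then rotated by the external angle phi about an axis offset
    by (dx, dz) from the center of the laser mirror.
    """
    # point in the 2D scanner plane, shifted by the mounting offset
    p = np.array([rho * np.cos(theta) + dx, rho * np.sin(theta), dz])
    # external rotation of the whole scanner, here taken about the y-axis
    c, s = np.cos(phi), np.sin(phi)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])
    return R @ p
```

With zero offsets and phi = 0 this reduces to ordinary polar-to-Cartesian conversion in the scan plane, which is a quick sanity check on any chosen convention.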

Intelligent Systems Lab. 7 Bearing angle images
• Goal: highlight depth discontinuities and direction changes in the range image so that the user can easily find the points corresponding to camera image points.
• Create a depth image from the laser range data (a).
• Edge detection (b) uses the Sobel operator.
• Plain depth edges only consider points where the depth between two adjacent points changes significantly; they do not consider the surface direction (normal vectors).
• BA images instead highlight details of the surface along specific directions (vertical, horizontal, ...).

Intelligent Systems Lab. 8 Bearing angle images
• The Bearing Angle (BA) is the angle between the laser beam and the segment joining two consecutive measurement points (a). By the law of cosines in the triangle formed by the sensor origin and the points P_{i-1}, P_i: BA_i = arccos( (ρ_i − ρ_{i−1} cos dφ) / sqrt(ρ_i² + ρ_{i−1}² − 2 ρ_i ρ_{i−1} cos dφ) ), where ρ_i is the i-th depth value in the selected trace of the depth matrix and dφ is the corresponding angle increment.
• The BA image is constructed by computing this angle for every point of the depth matrix.
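The BA computation for one trace of the depth matrix can be sketched as follows (vectorized over a scan line; the function and variable names are illustrative):

```python
import numpy as np

def bearing_angles(rho, dphi):
    """Bearing angle at each point of one depth trace.

    For consecutive range readings rho[i-1], rho[i] separated by the
    angular increment dphi, the law of cosines gives the length of the
    segment P_{i-1}P_i (the chord), and the angle between that segment
    and the laser beam to P_i is the bearing angle.
    """
    r0, r1 = np.asarray(rho[:-1]), np.asarray(rho[1:])
    chord = np.sqrt(r0**2 + r1**2 - 2.0 * r0 * r1 * np.cos(dphi))
    return np.arccos((r1 - r0 * np.cos(dphi)) / chord)
```

On a constant-range trace (a cylinder centered on the sensor) the segment is almost perpendicular to each beam, so the BA approaches 90 degrees; depth discontinuities and surface orientation changes show up as sharp BA variations, which is what makes the BA image readable for manual point selection.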

Intelligent Systems Lab. 9 Bearing angle images [Figure: geometry of the bearing angle between consecutive measurement points P_{i-1} and P_i and their image projections p_{i-1}, p_i]

Intelligent Systems Lab. 10 Bearing angle images

Intelligent Systems Lab. 11 Extrinsic laser-camera calibration
• Data processing:
• Collect laser data and camera images.
• Compute the BA images.
• Manually select corresponding points between the BA image and the intensity image.
• Store the correspondences, where θ_C and θ_L are unit-norm orientation vectors of the camera and laser points and d_L is the point distance in the laser frame.
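Storing a selected laser point in the (θ_L, d_L) form described above amounts to splitting it into a unit bearing vector and a range (a trivial sketch; the function name is illustrative):

```python
import numpy as np

def laser_correspondence(p_laser):
    """Represent a selected 3D laser point as (theta_L, d_L):
    a unit-norm orientation vector and the point's distance
    in the laser frame."""
    d = np.linalg.norm(p_laser)
    return p_laser / d, d
```

The camera-side vector θ_C comes directly from back-projecting the selected pixel, which is why no camera-frame distance is available until the P3P step recovers it.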

Intelligent Systems Lab. 12 Extrinsic laser-camera calibration

Intelligent Systems Lab. 13 Extrinsic laser-camera calibration
• Algorithm for estimating the rotation and translation parameters.
• The extrinsic parameters of the transformation between the camera and the LRF are determined from known corresponding 3D points; the problem is solved using the P3P method.

Intelligent Systems Lab. 14 Extrinsic calibration extraction

Intelligent Systems Lab. 15 Extrinsic calibration extraction
• The rotation matrix is R = X if det(X) = 1; det(X) = −1 indicates a reflection, i.e. a failure solution.
• Step 4: Translation: t = c_C − R c_L, where c_C and c_L are the centroids of the corresponding camera-frame and laser-frame points.
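Steps of this kind (SVD-based rotation with a determinant check, then a centroid-difference translation) correspond to the standard least-squares rigid alignment (Arun/Procrustes). A minimal sketch, assuming the P3P step has already produced the 3D points in both frames:

```python
import numpy as np

def rigid_fit(P_laser, P_cam):
    """Least-squares rigid transform (R, t) with P_cam ~ R @ p + t.

    P_laser, P_cam: (N, 3) arrays of corresponding 3D points.
    The det(X) check mirrors the rejection of reflection solutions.
    """
    cl, cc = P_laser.mean(axis=0), P_cam.mean(axis=0)
    # cross-covariance of the centered point sets
    H = (P_laser - cl).T @ (P_cam - cc)
    U, _, Vt = np.linalg.svd(H)
    X = Vt.T @ U.T
    if np.linalg.det(X) < 0:      # reflection: flip the last singular axis
        Vt[-1] *= -1
        X = Vt.T @ U.T
    t = cc - X @ cl               # Step 4: centroid-difference translation
    return X, t
```

Because the fit is least-squares over all correspondences, using more than the minimal number of selected points directly improves the estimate, which matches the convergence-versus-point-count plots in the experiments.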

Intelligent Systems Lab. 16 Experimental results
• System setup:
• Camera: SONY XCD-SX910-CR.
• Mirror: KAIDAN 360 One VR hyperbolic.
• Laser: SICK LMS 200; FOV 180°, resolution 0.5°.
• Rotating scanner: FOV 360°, resolution 1°.

Intelligent Systems Lab. 17 Experimental results Estimation of the translation (meters) versus the number of selected points. Estimation of the rotation (roll, pitch, and yaw angles) versus the number of selected points (the x-axis ranges from 4 to 10).

Intelligent Systems Lab. 18 Experimental results
• Re-projection of the laser points onto the intensity image.
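One convenient way to quantify re-projection quality for an omnidirectional camera is the angle between the measured camera ray and the transformed laser point (a sketch; projecting all the way back to pixels would additionally require the inverse lens polynomial):

```python
import numpy as np

def reprojection_ray_error(R, t, p_laser, ray_cam):
    """Angular re-projection error for one correspondence.

    Transform the laser point into the camera frame with the estimated
    extrinsics (R, t) and measure the angle between its direction and
    the camera ray obtained from the selected pixel.
    """
    p_cam = R @ p_laser + t
    cosang = np.dot(p_cam, ray_cam) / (np.linalg.norm(p_cam) * np.linalg.norm(ray_cam))
    # clip guards against arccos domain errors from rounding
    return np.arccos(np.clip(cosang, -1.0, 1.0))
```

A well-calibrated pair should give errors of a fraction of a degree on the manually selected correspondences; large per-point errors flag mis-clicked correspondences.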

Intelligent Systems Lab. 19 Experimental results
• Construction of a 3D point cloud from the laser points and vision points.

Intelligent Systems Lab. 20 Conclusions
• The method uses only a few correspondence points, manually selected by the user from a natural scene.
• No calibration patterns are required, nor is more than a single laser-camera acquisition necessary.
• The proposed method relies on a novel technique to visualize the range information obtained from a 3D laser scanner.
• The BA images and the application of the method to an omnidirectional camera are the two main contributions of the paper.
• The proposed approach requires no special equipment and allows the user to calibrate the system quickly.

Intelligent Systems Lab. THANK YOU FOR YOUR ATTENTION!

Intelligent Systems Lab. 22 Projection function
• In this paper, the camera coordinate system coincides with the single effective viewpoint.
• The x-y plane is orthogonal to the mirror axis.
• The projection function depends on the mirror geometry: the distance d between the focal points of the conic and the latus rectum l.