3D SLAM for Omni-directional Camera


3D SLAM for Omni-directional Camera Yuttana Suttasupa Advisor: Asst. Prof. Attawith Sudsang

Introduction Localization: the robot can estimate its location with respect to landmarks in an environment. Mapping: the robot can reconstruct the positions of the landmarks it encounters in an environment. SLAM: the robot builds a map and localizes itself simultaneously while traversing an unknown environment.

The Problem We propose a SLAM method for a hand-held omni-directional camera. The camera moves freely in an unknown indoor environment without a known camera motion model. The method uses only bearing data from omni-images, needs no initialization information, and reconstructs the 3D camera path and a 3D (landmark-based) environment map.

The Problem Input: a captured image sequence from an omni-directional camera.

The Problem Output: a camera state (3D position and direction) and an environment map (3D landmark positions).

Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation

Omni-directional Camera Our omni-directional camera: two parabolic mirrors; a CCD camera with 640×480 pixels at 29.97 Hz; 360° horizontal field of view; −5° to 65° vertical field of view.

Omni-directional Camera Normal camera vs. omni-directional camera. Motivation for using an omni camera: others have not done this.

Omni camera Calibration Find a mapping function from the 2D image to 3D space, using the Omnidirectional Camera Calibration Toolbox (Scaramuzza et al., 2006).

Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation

EKF SLAM Using an extended Kalman filter to solve the SLAM problem. Assume the robot position and map probability distributions are Gaussian. Predict the robot position and landmark distributions using a robot motion model, then correct the distributions using an observation model.

EKF SLAM The distribution is represented by the state and the state covariance, which together define the robot probability distribution. At the initial state, the robot position distribution is assumed with some initial value.

EKF SLAM Prediction step: use the robot motion model to predict the robot position, giving the predicted state and the predicted estimate covariance. (Presenter note: details are in the thesis; do we know the model?)

EKF SLAM Correction step: use the observation model to update the robot position and the landmark positions (robot, landmark, measurement). The quantities involved are the observation model, the innovation residual, the innovation covariance, the optimal Kalman gain, the updated state estimate, and the updated estimate covariance.
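For reference, the standard EKF predict and correct equations named on this slide can be sketched in a few lines. This is a minimal sketch, not the thesis implementation: the motion model f, observation model h, their Jacobians F and H, and the noise covariances Q and R are all supplied by the caller.

```python
import numpy as np

def ekf_predict(x, P, f, F, Q):
    """Prediction: propagate the state through the motion model f,
    whose Jacobian is F; Q is the process noise covariance."""
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q              # predicted estimate covariance
    return x_pred, P_pred

def ekf_update(x, P, z, h, H, R):
    """Correction: fuse measurement z via the observation model h,
    whose Jacobian is H; R is the measurement noise covariance."""
    y = z - h(x)                          # innovation residual
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # optimal Kalman gain
    x_new = x + K @ y                     # updated state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P  # updated estimate covariance
    return x_new, P_new
```

For a scalar state measured directly with equal prior and measurement variance, the update lands halfway between prediction and measurement, as expected.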

Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation

Introduction to Problem Feature detection problem: how does a computer recognize objects in an image? Feature association problem: how can we find feature correspondences between two images?

Introduction to Problem Observability problem: the camera provides only bearing data, so how can we estimate a high-dimensional state from low-dimensional measurements? (Figure: landmark, camera; how far away is the landmark?)

Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation

Solution to Problem The proposed algorithm includes 3 steps. Image processing: detect features, find feature associations, and calculate feature measurements. SLAM: apply the measurement data to SLAM. Features and reference frames management: add and remove features and reference frames from the SLAM state.

Solution to Problem System coordinates: the world frame, the camera frame, and the reference frames. (Figure: landmark, camera frame, world frame, reference frame.)

Solution to Problem SLAM State Camera state – represent camera frame Reference frame states – represent reference frames Landmark states – represent landmark positions

Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation

Image Processing Input: an image from the omni-directional camera and the old SLAM state. Output: feature measurements and feature associations.

Image Processing Feature detection (for new features): using point features; find corners in the image with the Harris corner detector.
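The Harris response used here can be sketched with plain NumPy. This is a toy, unoptimized version for illustration only; a real system would use an optimized detector.

```python
import numpy as np

def box_sum(a, r=1):
    """Sum of a over a (2r+1)x(2r+1) window centered at each pixel."""
    out = np.zeros_like(a)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
    return out

def harris_response(img, k=0.04, r=1):
    """Harris corner response R = det(M) - k * trace(M)^2, where M is
    the windowed second-moment matrix of the image gradients."""
    Iy, Ix = np.gradient(img.astype(float))
    Sxx = box_sum(Ix * Ix, r)
    Syy = box_sum(Iy * Iy, r)
    Sxy = box_sum(Ix * Iy, r)
    return Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2
```

On a white square, the response is positive at the corners and negative along the edges, which is what makes thresholding it a corner detector.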

Image Processing Feature associations describe which landmark the feature in the current image is associated with, i.e., the relation between the current image and old features in an old image. Optical flow is used to track features, and template matching refines each feature position.

Image Processing Feature associations, feature tracking: track features from the previous image to get the current feature positions, using pyramidal Lucas-Kanade optical flow.
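One Lucas-Kanade step for a single window can be sketched as a small least-squares problem; this single-scale, single-iteration version is illustrative only, while the pyramidal variant named on the slide iterates the same step across image scales to handle larger motions.

```python
import numpy as np

def lk_flow(prev, curr, y, x, r=4):
    """One Lucas-Kanade step: least-squares flow (dx, dy) for the
    window of radius r centered at pixel (y, x), from the brightness
    constancy equation Ix*dx + Iy*dy = -It."""
    Iy, Ix = np.gradient(prev)
    It = curr - prev
    win = np.s_[y - r:y + r + 1, x - r:x + r + 1]
    A = np.stack([Ix[win].ravel(), Iy[win].ravel()], axis=1)
    b = -It[win].ravel()
    (dx, dy), *_ = np.linalg.lstsq(A, b, rcond=None)
    return dx, dy
```

On a Gaussian blob shifted by half a pixel, one step recovers the shift to within the linearization error.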

Image Processing Feature associations, feature position refinement: tracking features with optical flow may cause feature drift, so pyramidal template matching is used to correct the feature position. (Figure: search region and feature patch; current image with a drifted feature; result after refinement using template matching.)
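The refinement idea can be sketched as a normalized cross-correlation search over a small region around the drifted position. This is a minimal single-scale stand-in for the pyramidal matching described here, with hypothetical window sizes.

```python
import numpy as np

def refine_by_ncc(image, patch, cy, cx, search=5):
    """Slide the patch over a (2*search+1)^2 region around the drifted
    guess (cy, cx) and return the (dy, dx) offset with the highest
    normalized cross-correlation score."""
    ph, pw = patch.shape
    p = (patch - patch.mean()) / (patch.std() + 1e-9)
    best, best_off = -2.0, (0, 0)     # NCC scores lie in [-1, 1]
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y0, x0 = cy + dy - ph // 2, cx + dx - pw // 2
            win = image[y0:y0 + ph, x0:x0 + pw]
            w = (win - win.mean()) / (win.std() + 1e-9)
            score = (p * w).mean()
            if score > best:
                best, best_off = score, (dy, dx)
    return best_off
```

Given a patch cut from the image and a guess drifted by a couple of pixels, the search returns the offset that moves the guess back onto the true patch center.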

Image Processing Feature associations, feature position refinement: the patch is selected from a reference image, but its rotation and scale may not match, so a transform function may need to be applied to the patch. (Figure: no match in the reference image; match in the current image.)

Image Processing Feature associations, feature position refinement: find the transform function by projecting a 3D patch created from the current image onto the reference image. (Figure: 3D patch, image sphere, current image, reference image.)

Image Processing Finding the transform function: projecting every patch pixel would be computationally costly, so a perspective transform is used as the transform function instead; only 4 projected points are needed to calculate it. (Figure: real distortion vs. perspective distortion.)
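Solving for the perspective transform (homography) from the 4 projected point pairs reduces to an 8×8 linear system, since fixing the bottom-right entry of H to 1 leaves 8 unknowns. A sketch:

```python
import numpy as np

def perspective_from_points(src, dst):
    """Solve for the 3x3 homography H mapping the 4 src points to the
    4 dst points, with H[2,2] fixed to 1. Each correspondence
    (x, y) -> (u, v) contributes two linear equations in h1..h8."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)
```

Applying H to a homogeneous point and dividing by the third coordinate warps any pixel of the patch, so only the 4 corner projections need to go through the full mirror model.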

Image Processing Feature associations – example

Image Processing Feature measurements: feature points in the omni-image are used as the measurement data. Each feature point must be converted into a bearing-only measurement in the form of yaw and pitch angles. (Figure: landmark, ray r, and the x, y, z axes. Presenter note: also include the figure with rx and ry.)
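Converting a calibrated viewing ray into the yaw/pitch bearing measurement is a two-line computation. The axis convention below (yaw in the x-y plane, pitch toward z) is an assumption for illustration; the thesis may order the axes differently.

```python
import numpy as np

def bearing(ray):
    """Convert a 3D viewing ray (rx, ry, rz) into the (yaw, pitch)
    bearing angles used as the EKF measurement."""
    rx, ry, rz = ray
    yaw = np.arctan2(ry, rx)                    # azimuth in the x-y plane
    pitch = np.arctan2(rz, np.hypot(rx, ry))    # elevation above that plane
    return yaw, pitch
```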

Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation

Simultaneous localization and mapping (SLAM) Using EKF SLAM to estimate the camera state, the reference frame states, and the landmark states. Prediction: determine how the camera moves, i.e., find the state transition model (camera motion model). Correction: determine how a landmark is measured, i.e., find the observation model.

Simultaneous localization and mapping (SLAM) Input: measurement data from the omni-image. Output: the estimated SLAM state, consisting of the camera state, reference frame states, and landmark states.

Simultaneous localization and mapping (SLAM) Prediction: determine how the camera moves. But the camera motion is unpredictable, so assume the camera can move freely in any direction with some limited velocity. (Figure: before predict / after predict.)
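Under this assumption the prediction step reduces to a random walk: the state estimate is left unchanged and the camera-pose covariance is inflated according to the assumed velocity bound. A sketch, where the 6-entry camera-pose layout (3 position + 3 orientation entries at the front of the state) and the bounds max_v, max_w are illustrative assumptions, not the thesis parameterization:

```python
import numpy as np

def predict_free_motion(x, P, dt, max_v=1.0, max_w=1.0):
    """Random-walk prediction for a camera with no motion model: keep
    the state estimate and grow the camera-pose covariance by process
    noise derived from assumed bounds on linear velocity (max_v, m/s)
    and angular velocity (max_w, rad/s) over the time step dt."""
    P = P.copy()
    q = np.r_[np.full(3, (max_v * dt) ** 2),   # position uncertainty growth
              np.full(3, (max_w * dt) ** 2)]   # orientation uncertainty growth
    P[:6, :6] += np.diag(q)                    # landmark blocks are untouched
    return x, P
```

Only the camera-pose block of the covariance grows; landmark uncertainty is left for the correction step to shrink.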

Simultaneous localization and mapping (SLAM) Correction: use a batch of measurements (the current measurement data plus old measurement data at the reference frames) to update the SLAM state. (Figure: landmark, reference frames, current camera.)

Simultaneous localization and mapping (SLAM) Correction: the measurement data for landmark i and the observation model for each measurement, where y' is the landmark position in the X coordinate frame.

Simultaneous localization and mapping (SLAM) The correction step can be separated into 2 parts: camera and reference frames correction, and landmarks correction.

Simultaneous localization and mapping (SLAM) Camera and reference frames correction: assume the measurement data measures the landmark positions accurately, so the correction affects only the camera state and the reference frame states. (Figure: before correction / after correction.)

Simultaneous localization and mapping (SLAM) Landmarks correction: assume the camera state is accurate, so the correction affects only the landmark states. (Figure: before correction / after correction.)

Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation

Features and reference frames management Remove a feature when its feature point is out of the image bounds, or when the landmark position is not accurate enough. (Figure: the feature of this landmark is out of bounds.)

Features and reference frames management Add features: add new features (detected with the Harris corner detector) when there is a new reference frame, and add old features, since an old landmark may appear in the omni-image again. (Presenter note: when do we add a reference frame?)

Features and reference frames management Add new features: add the new landmarks to the SLAM state, estimating each landmark position by assuming a large variance for the range data. (Presenter note: when do we add a reference frame?)
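Initializing a landmark from a single bearing can be sketched by placing it at a nominal depth along the measured ray with a large variance in the range direction, reflecting that depth is unobservable from one bearing. The nominal depth r0 and deviation sigma_r below are illustrative values, not taken from the thesis.

```python
import numpy as np

def init_landmark(cam_pos, ray, r0=3.0, sigma_r=10.0):
    """Place a new landmark at depth r0 along the unit bearing ray from
    cam_pos, with covariance that is large along the ray (sigma_r^2)
    and small perpendicular to it."""
    d = ray / np.linalg.norm(ray)
    y = cam_pos + r0 * d
    cov = sigma_r ** 2 * np.outer(d, d) + 0.01 * np.eye(3)
    return y, cov
```

Subsequent bearing measurements from other camera poses then collapse the elongated covariance onto the true depth.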

Features and reference frames management Add old features: project an old landmark into the current image and check whether the feature is visible using template matching. (Figure: landmark, feature, image sphere.)

Features and reference frames management Add a reference frame when no reference frame is suitable for feature tracking, or when the landmark count drops below some threshold; the current camera state is selected as the new reference frame.

Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation

Experimental Results

Experimental Results

Experimental Results

Experimental Results

Experimental Results

Experimental Results

Outline Introduction Omni-directional Camera EKF-SLAM Introduction to Problem Solution to Problem Image Processing SLAM Features and reference frames management Experimental Results Result Evaluation

Result Evaluation Localization evaluation: 2D localization evaluation and 3D localization evaluation. Mapping evaluation.

Result Evaluation 2D Localization Evaluation Using a Wiimote as a bird's-eye-view camera to detect an IR point on the omni camera while a mobile robot traverses a 2D plane. (Figure: IR point.)

Result Evaluation 2D Localization Evaluation

Result Evaluation 2D Localization Evaluation

Result Evaluation 2D Localization Evaluation

Result Evaluation 3D Localization Evaluation Using a Wiimote attached to the omni-directional camera to localize the 3D camera position relative to a reference IR board.

Result Evaluation 3D Localization Evaluation

Result Evaluation 3D Localization Evaluation

Result Evaluation 3D Localization Evaluation

Result Evaluation Mapping Evaluation Compare the mapping result with an environment of known structure.

Result Evaluation Mapping Evaluation

Conclusion Summary: our algorithm can localize the camera position and build a map in 3D using only omni-camera images; the omni-directional camera moves freely in an unknown indoor environment without a known camera motion model. Evaluation: the results show that the localization and mapping outcomes correspond with the ground truth.

Thank you

Statistics

Location                  Conf. room   Corridor   Stairway
Max feature count         38           36         28
Min feature count         14           13         10
Avg. feature count        23.9279      20.4866    13.7657
Max landmark count        43           65         —
Min landmark count        25           —          —
Avg. landmark count       33.1654      45.0207    19.7684
Max time per frame (ms)   1019.73      1424.87    703.188
Min time per frame (ms)   34.9818      19.657     15.1698
Avg. time per frame (ms)  160.925      479.858    172.845
Frame count               804          1157       747

Visual SLAM for 3D Large-Scale Seabed Acquisition Employing Underwater Vehicles

Featureless Vehicle-Based Visual SLAM with a Consumer Camera

Scan-SLAM: Combining EKF-SLAM and Scan Correlation

The Problem Input: a captured image sequence from an omni-directional camera. Output: a camera state (3D position and direction) and an environment map (3D landmark positions).

Omni camera Calibration

Omni camera Calibration

Image Processing Feature detection (for new features): using point features; find corners in the image with the Harris corner detector.

Simultaneous localization and mapping (SLAM) Prediction: determine how the camera moves. But the camera motion is unpredictable, so assume the camera can move freely in any direction with some limited velocity. Predicted state; predicted estimate covariance.

Simultaneous localization and mapping (SLAM) Correction: the measurement data for landmark i and the observation model for landmark i, where y' is given by the transform function that maps the landmark position (y) from world coordinates into the reference coordinate frame (x).

Simultaneous localization and mapping (SLAM) Camera and reference frames correction: assume the measurement data measures the landmark positions accurately, so the correction affects only the camera state and the reference frame states. The update uses the innovation (measurement residual), the innovation (residual) covariance, the optimal Kalman gain, the updated state estimate, and the updated estimate covariance, where the Jacobian matrices of the observation function h are evaluated at the predicted state.

Simultaneous localization and mapping (SLAM) Landmarks correction: assume the camera state is accurate, so the correction affects only the landmark states. The update uses the innovation (measurement residual), the innovation (residual) covariance, the optimal Kalman gain, the updated state estimate, and the updated estimate covariance, where the Jacobian matrices of the observation function h are evaluated at the predicted state.

Solution to Problem SLAM State Camera state – represent camera frame Reference frame states – represent reference frames Landmark states – represent landmark positions

Features and reference frames management Select the reference frame for feature tracking. (Figure: reference frame and camera frame, each with x, y, z axes.)

Features and reference frames management Select the reference frame for updating the SLAM state. (Figure: reference frame and camera frame, each with x, y, z axes.)

Feature ray. (Figure: the ray from the camera to a landmark, with its yaw and pitch angles in the camera's x, y, z frame.)