Weighted Range Sensor Matching Algorithms for Mobile Robot Displacement Estimation
Sam Pfister, Kristo Kriechbaum, Stergios Roumeliotis, Joel Burdick
Mechanical Engineering, California Institute of Technology


Weighted Range Sensor Matching Algorithms for Mobile Robot Displacement Estimation
Sam Pfister, Kristo Kriechbaum, Stergios Roumeliotis, Joel Burdick
Mechanical Engineering, California Institute of Technology

Overview:
- Motivation
- Problem Formulation
- Experimental Results
- Conclusion, Future Work

Mobile Robot Localization
- Proprioceptive sensors (encoders, IMU): odometry, dead reckoning
- Exteroceptive sensors (laser, camera): global and local correlation

Scan Matching
- Correlate range measurements between Scan 1 and Scan 2 to estimate displacement
- Iterate: initial guess → point correspondence → displacement estimate
- Can improve (or even replace) odometry – Roumeliotis TAI-14
- Previous work: vision community, and Lu & Milios [97]
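The iterate loop on this slide (initial guess → point correspondence → displacement estimate) can be sketched as a minimal unweighted, ICP-style routine. This is an illustrative sketch, not the authors' weighted algorithm: `match_scans`, its nearest-neighbour correspondence rule, and the SVD alignment step are assumptions for the example.

```python
import numpy as np

def match_scans(scan1, scan2, p0=np.zeros(2), theta0=0.0, iters=20):
    """Estimate displacement (p, theta) such that scan1 ~ R(theta) @ scan2 + p.

    Unweighted ICP-style sketch: alternate nearest-neighbour point
    correspondence with a closed-form rigid alignment (Kabsch/SVD).
    """
    p, th = np.asarray(p0, float), float(theta0)
    for _ in range(iters):
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s], [s, c]])
        moved = scan2 @ R.T + p                       # current guess applied to scan 2
        # point correspondence: closest scan-1 point for each scan-2 point
        d = np.linalg.norm(scan1[:, None] - moved[None], axis=2)
        pairs = scan1[d.argmin(axis=0)]
        # displacement estimate: closed-form rigid alignment of the pairs
        c1, c2 = pairs.mean(axis=0), scan2.mean(axis=0)
        H = (scan2 - c2).T @ (pairs - c1)
        U, _, Vt = np.linalg.svd(H)
        det = np.sign(np.linalg.det(Vt.T @ U.T))
        R = Vt.T @ np.diag([1.0, det]) @ U.T
        p, th = c1 - R @ c2, np.arctan2(R[1, 0], R[0, 0])
    return p, th
```

With exact, fully overlapping scans and a small true displacement this recovers (p, θ) to numerical precision; the weighted formulation on the following slides replaces this uniform per-point weighting with explicit noise and correspondence covariances.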

Weighted Approach
Explicit models of uncertainty and noise sources for each scan point:
- Sensor noise and errors: range noise, angular uncertainty, bias
- Point correspondence uncertainty: correspondence errors

Improvements vs. the unweighted method:
- More accurate displacement estimate
- More realistic covariance estimate
- Increased robustness to initial conditions
- Improved convergence

(Figure: combined uncertainties; 1 m scale, ×500.)

Weighted Formulation
Goal: estimate the displacement (p_ij, θ_ij) between poses i and j.
Error between the kth scan point pair, from range data measured at poses i and j:
  ε_ij^k = u_i^k − p_ij − R(θ_ij) u_j^k
where R(θ_ij) is the rotation by θ_ij and each measured scan point is the true range corrupted by sensor noise and bias. The error decomposes into a correspondence error, a noise error, and a bias error.

Covariance of the Error Estimate
1) Sensor noise: the covariance of the error between the kth scan point pair is
  P_ij^k = P_i^k + R(θ_ij) P_j^k R(θ_ij)^T
i.e. the pose-i noise covariance plus the pose-j noise covariance rotated into frame i.
2) Sensor bias: neglected for now — see the paper for details.

3) Correspondence error c_ij^k:
- Estimate bounds on c_ij^k from the geometry of the boundary and the robot poses.
- Assume a uniform distribution within those bounds; the maximum-error bound depends on the local incidence angles (a uniform distribution on [−c_max, c_max] has variance c_max²/3).

Finding the incidence angles α_i^k and α_j^k
Hough transform:
- Fits lines to the range data
- Local incidence angle estimated from the line tangent and the scan angle
- Common technique in the vision community (Duda & Hart [72])
- Can be extended to fit simple curves

(Figure: scan points with fitted lines and incidence angle α_i^k.)
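As an illustration of this Hough step, the sketch below votes 2-D scan points into a discretised (α, ρ) line space (x·cos α + y·sin α = ρ), takes the dominant cell as the fitted line, and derives a beam's local incidence angle from the line normal and the scan (bearing) angle. The bin counts and the `incidence_angle` helper are assumptions for the example, not the authors' implementation.

```python
import numpy as np

def hough_line(points, n_alpha=180, rho_max=10.0, n_rho=200):
    """Fit the dominant line x*cos(a) + y*sin(a) = rho to 2-D scan points."""
    alphas = np.linspace(0.0, np.pi, n_alpha, endpoint=False)
    acc = np.zeros((n_alpha, n_rho), dtype=int)
    for x, y in points:
        rho = x * np.cos(alphas) + y * np.sin(alphas)  # signed distance per angle
        idx = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[np.flatnonzero(ok), idx[ok]] += 1          # one vote per (alpha, rho) cell
    ia, ir = np.unravel_index(acc.argmax(), acc.shape)
    return alphas[ia], -rho_max + ir * 2 * rho_max / (n_rho - 1)

def incidence_angle(bearing, alpha):
    """Angle between a scan beam (at `bearing`) and the fitted line's normal."""
    return abs((bearing - alpha + np.pi / 2) % np.pi - np.pi / 2)
```

Because every point on a boundary line votes for the same (α, ρ) cell, the peak is robust to the outliers and clutter typical of range scans, which is why voting is preferred here over a direct least-squares line fit.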

Maximum Likelihood Estimation
A non-linear optimization problem: maximize the likelihood of obtaining the errors {ε_ij^k} given the displacement.
- The position displacement estimate is obtained in closed form.
- The orientation estimate is found using 1-D numerical optimization, or series-expansion approximation methods.
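The estimation structure described on this slide — a closed-form weighted position estimate for each fixed orientation, plus a 1-D search over orientation — can be sketched as follows. The pairing, the covariance inputs, and the grid search are assumptions for the example; the paper's series-expansion alternative is not shown.

```python
import numpy as np

def rot(th):
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, -s], [s, c]])

def weighted_displacement(a, b, covs, thetas):
    """Weighted ML displacement for matched pairs a_k ~ R(theta) b_k + p.

    a, b:   (N, 2) matched scan points from poses i and j.
    covs:   (N, 2, 2) per-pair error covariances.
    thetas: candidate orientations for the 1-D search.
    """
    Winv = np.linalg.inv(covs)                       # per-pair information matrices
    best_cost, best_p, best_th = np.inf, None, None
    for th in thetas:
        r = a - b @ rot(th).T                        # residuals before translation
        # closed-form weighted translation for this orientation:
        # (sum_k W_k^-1) p = sum_k W_k^-1 r_k
        A = Winv.sum(axis=0)
        g = np.einsum('nij,nj->i', Winv, r)
        p = np.linalg.solve(A, g)
        e = r - p
        cost = np.einsum('ni,nij,nj->', e, Winv, e)  # negative log-likelihood, up to constants
        if cost < best_cost:
            best_cost, best_p, best_th = cost, p, th
    return best_p, best_th
```

Pairs with large combined covariance (long range, grazing incidence, uncertain correspondence) automatically receive small weight, which is the mechanism behind the accuracy and robustness gains reported on the next slides.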

Experimental Results
Weighted vs. unweighted matching of two poses: 512 trials with different initial displacements within
- ±15 degrees of the actual angular displacement
- ±150 mm of the actual spatial displacement

Results:
- Increased robustness to inaccurate initial displacement guesses
- Fewer iterations to convergence

(Figures: initial displacements, unweighted estimates, weighted estimates.)

(Figure: unweighted vs. weighted estimates.)

Eight-step, 22 meter path. Displacement estimate errors at the end of the path:
- Odometry: 950 mm
- Unweighted: 490 mm
- Weighted: 120 mm

More accurate covariance estimate:
- Improved knowledge of measurement uncertainty
- Better fusion with other sensors
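The "better fusion with other sensors" point can be made concrete with a standard information-form (inverse-covariance) fusion of two independent estimates of the same displacement, e.g. odometry and scan matching. This sketch is generic textbook material, not taken from the paper.

```python
import numpy as np

def fuse(x1, P1, x2, P2):
    """Covariance-weighted fusion of two independent estimates of the same
    quantity: the estimate with the smaller covariance gets the larger weight."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)        # fused covariance
    x = P @ (I1 @ x1 + I2 @ x2)       # fused estimate
    return x, P
```

An overconfident (too small) scan-match covariance would pull the fused estimate toward a possibly poor match, which is why the more realistic covariance estimates reported above matter downstream.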

Conclusions and Future Work
- Developed a general approach for incorporating uncertainty into scan-match displacement estimates:
  - range sensor error models
  - novel correspondence error modeling
- The method can likely be extended to other range sensors (stereo cameras, radar, ultrasound, etc.), though it requires sensor-specific error models.
- Showed that accurate error modelling can significantly improve displacement and covariance estimates, as well as robustness.
- Future work: weighted correspondence for 3D feature matching.

Uncertainty From Sensor Noise and Correspondence Error
(Figure: 1 m scale, ×500.)