Visual SLAM SPL Seminar 2008. 8. 1 (Fri) Young Ki Baik Computer Vision Lab.


Outline  What is SLAM?  What is Visual SLAM?  Overall process and problems  Advances and challenges  Conclusion

What is SLAM?   SLAM : Simultaneous Localization and Mapping is a technique used by robots and autonomous vehicles to build a map of an unknown environment while at the same time keeping track of their current position within it. (Figure: "Where am I?", map building, observation.)

What is SLAM?   SLAM : Simultaneous Localization and Mapping basically relies on statistical techniques based on recursive Bayesian estimation, such as Kalman filters and particle filters (a.k.a. Monte Carlo methods).
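As a concrete illustration of recursive Bayesian estimation (a minimal sketch, not from the slides; the noise values are illustrative assumptions), a one-dimensional Kalman filter alternates a prediction step with a measurement update:

```python
# Minimal 1-D Kalman filter: the belief is a Gaussian with mean x
# and variance p; q and r are assumed process/measurement noises.

def kf_predict(x, p, u=0.0, q=0.1):
    """Predict: move the state by control u, inflate uncertainty by q."""
    return x + u, p + q

def kf_update(x, p, z, r=0.5):
    """Update: fuse measurement z (noise variance r) into the belief."""
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)      # corrected mean
    p = (1.0 - k) * p        # corrected variance
    return x, p

# One filter cycle: predict, then incorporate an observation.
x, p = 0.0, 1.0
x, p = kf_predict(x, p, u=1.0)
x, p = kf_update(x, p, z=1.2)
```

Each cycle the prediction inflates the variance and the update shrinks it, which is exactly the predict/update loop the slides describe.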

What is Visual SLAM?   SLAM : Simultaneous Localization and Mapping can use many different types of sensors to acquire the observation data used in building the map, such as laser rangefinders, sonar sensors, and cameras. Visual SLAM is SLAM that uses cameras as the sensor.

Why Visual SLAM?  Vision data gives us more meaningful information (such as color, texture, and shape) than other sensors.

Overall process of Visual SLAM   Initialization  Prediction  Measurement  Update  Map management (repeated as a loop)
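The loop above can be sketched as code (a structural sketch only; all step functions are hypothetical placeholders standing in for the real modules):

```python
# Skeleton of the Visual SLAM filtering loop described on this slide.

def predict(state):
    # motion model: propagate the pose estimate forward in time
    return state

def measure(state, frame):
    # data association: match map features against the new image
    return []

def update(state, matches):
    # filter correction step using the matched observations
    return state

def manage_map(state):
    # add newly observed features, prune unreliable ones
    return state

def slam_step(state, frame):
    """One cycle: Prediction -> Measurement -> Update -> Map management."""
    state = predict(state)
    matches = measure(state, frame)
    state = update(state, matches)
    return manage_map(state)
```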

Visual SLAM DEMO Mono-slam

Problems   Proposal   Data association   Filter   Map management   Real-time

Proposal   Odometry: in most localization systems, odometry information is provided as prior knowledge to predict poses. Wheel encoders give distance and angle estimates (e.g., forward movement of a vehicle), and the distance and angle changes are continuously accumulated from the left and right encoder distances.
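The encoder-based update described above can be sketched with the standard differential-drive model (a textbook sketch; `wheel_base` is an assumed parameter, not from the slides):

```python
import math

# Differential-drive odometry: accumulate one pair of left/right
# wheel encoder distances into the pose (x, y, theta).

def odometry_step(x, y, theta, d_left, d_right, wheel_base):
    d = (d_left + d_right) / 2.0               # distance change
    d_theta = (d_right - d_left) / wheel_base  # angle change
    # Advance along the average heading during the motion.
    x += d * math.cos(theta + d_theta / 2.0)
    y += d * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```

With equal encoder readings the robot drives straight; unequal readings turn it.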

Proposal   But… a hand-held (monocular) camera system does not provide odometry between frames t and t+1.

Tricks   Fixed-position or constant-velocity assumption: P t+1 = P t + N (fixed position), P t+1 = P t + Δt (V t + N) (constant velocity), where N is process noise.   Localization can fail when abrupt motion and/or sudden changes occur.   A robust solution still has to be found!!!
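The two assumptions translate directly into prediction models (a sketch; the noise term N is drawn from an assumed zero-mean Gaussian):

```python
import random

# Prediction without odometry, matching the slide's two models:
#   P_{t+1} = P_t + N              (fixed-position assumption)
#   P_{t+1} = P_t + dt * (V_t + N) (constant-velocity assumption)

def predict_fixed_position(p, sigma=0.05):
    return p + random.gauss(0.0, sigma)

def predict_constant_velocity(p, v, dt, sigma=0.05):
    return p + dt * (v + random.gauss(0.0, sigma))
```

If the camera accelerates faster than the noise model allows, the predicted pose falls outside the search region and tracking fails, which is exactly the failure mode the slide warns about.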

Data association   What is data association? Finding matches between observed data and the map.

Data association   For data association, at the beginning: small (e.g. 11x11) image patches around salient points represent features, and Normalized Cross-Correlation (NCC) matches them. Small patches plus accurate search regions lead to fast pose estimation.
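For reference, the NCC score between two patches can be computed as follows (a minimal sketch over flattened patches; real systems would use an image library):

```python
import math

# Normalized cross-correlation between two equal-size patches
# (flattened to 1-D lists). The score lies in [-1, 1]; 1 means a
# perfect match up to brightness/contrast changes.

def ncc(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db)
```

Because NCC normalizes out mean and contrast, it tolerates lighting changes, but, as the next slide notes, it does not tolerate large viewpoint or scale changes.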

Data association   However, simple patches are insufficient under large viewpoint or scale variations. Small patches help speed but are prone to mismatches, and search regions cannot always be trusted (camera occlusion, motion blur).

Data association   A solution: use a richer feature descriptor, such as SIFT-like descriptors. (Figure: SIFT descriptor.)

Data association DEMO SLAM - SIFT vs NCC

Data association DEMO SLAM - SIFT

Data association   A solution: use other information, such as edges, lines, contours, etc.

Data association   A good solution that satisfies both richer information and fast processing speed has not yet been found: richer descriptors slow processing down.

Which filter is appropriate?   Extended Kalman Filter (EKF): optimal Minimum Mean Square Error (MMSE) estimator of the state via a 1st-order Taylor approximation. Unscented Kalman Filter (UKF): 2nd-order Taylor approximation.   Particle Filter (PF): based on sequential Monte Carlo.
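The particle filter alternative can be sketched for a 1-D state (a minimal sketch; the noise values are illustrative assumptions, and a real SLAM filter would track a full pose and map):

```python
import math
import random

# Minimal particle filter (sequential Monte Carlo): predict each
# particle with the motion model, weight it by the measurement
# likelihood, then resample proportionally to the weights.

def pf_step(particles, z, motion_sigma=0.1, meas_sigma=0.5):
    # 1. Predict: diffuse particles with the motion model.
    particles = [p + random.gauss(0.0, motion_sigma) for p in particles]
    # 2. Weight: Gaussian measurement likelihood around observation z.
    weights = [math.exp(-0.5 * ((p - z) / meas_sigma) ** 2) for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # 3. Resample: draw a new particle set according to the weights.
    return random.choices(particles, weights=weights, k=len(particles))
```

Because no linearization is involved, the filter handles non-Gaussian, multi-modal beliefs that would break the EKF's 1st-order Taylor approximation.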

EKF vs PF   EKF: 1st-order Taylor approximation; the scale of the map grows with the square of the number of features (N features x N features).   Particle Filter: does not limit the order of the Taylor approximation; the scale of the map increases linearly (N particles x N features).   The PF-based approach is regarded as more robust and faster than the EKF-based approach.

Which filter is appropriate? DEMO PF-based mono-SLAM

Map management   Scale of the map: it increases linearly with the number of features. A partitioned map (or grid map) has been proposed as a solution.
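The partitioned-map idea can be sketched as a grid of feature buckets (a sketch under assumed parameters; `cell` size is hypothetical), so that data association only touches features near the current pose instead of scanning the whole map:

```python
# Partitioned (grid) map: features are bucketed by cell so lookups
# near the current pose stay cheap as the global map grows.

def cell_of(x, y, cell=10.0):
    return (int(x // cell), int(y // cell))

def add_feature(grid, x, y, feature):
    grid.setdefault(cell_of(x, y), []).append(feature)

def nearby_features(grid, x, y):
    """Collect features from the 3x3 block of cells around (x, y)."""
    cx, cy = cell_of(x, y)
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            out.extend(grid.get((cx + dx, cy + dy), []))
    return out
```

A lookup then costs time proportional to the local feature density, not to the total map size.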

Map management   A solution: a method whose cost grows only logarithmically with the number of features or particles is desired, but has not yet been found.

Is the process able to work on-line?   The SLAM problem has to be solved in real time. Speed is what is needed right now: fast algorithms and parallel processing (SIMD, GPGPU, etc.).

Some Challenges   Deal with large maps.   Use different kinds of features in an informed way.   Benefit from other approaches such as SFM while keeping efficiency.   Incorporate semantics and go beyond geometric scene understanding.

Conclusion   The SLAM problem is mathematically solved.   Computer vision implementations are still being improved: feature description, kinds of features, parameterization and robustness.

Q&A