Real-Time Simultaneous Localization and Mapping with a Single Camera (Mono SLAM) 2005. 9. 26 Young Ki Baik Computer Vision Lab. Seoul National University.


Contents
- References
- Kalman filter and SLAM
- Mono-SLAM
- Simulation Demo
- Conclusion

References
- Andrew J. Davison, "Real-Time Simultaneous Localisation and Mapping with a Single Camera" (ICCV 2003)
- Gamini Dissanayake et al., "A Solution to the Simultaneous Localization and Map Building (SLAM) Problem" (IEEE Trans. Robotics and Automation 2001)
- G. Welch and G. Bishop, "An Introduction to the Kalman Filter" (SIGGRAPH 2001)
- Quaternion reference: http://www.euclideanspace.com/maths/geometry/rotations

Kalman filter
What is a Kalman filter?
- A mathematical power tool: an optimal recursive data-processing algorithm that minimizes the effect of noise.
Applications
- Tracking (head, hands, etc.)
- Lip motion from video sequences of speakers
- Spline fitting
- Navigation
- Many other computer vision problems

Kalman filter
Example: how can we obtain the optimal pose of the robot and the landmarks simultaneously?
[Figure: a robot moving among landmarks; labels include real location, location with error, refined location, sensor noise / measurement error, movement noise, and localization (processing) error.]

Kalman filter
Example (simple Gaussian form)
Assumption: all errors are Gaussian noise. We fuse an estimated value with a measured value.

Kalman filter
Example (simple Gaussian form). With a prior estimate x_hat (variance s_hat^2) and a measurement z (variance s_z^2), the slide's quantities are (reconstructed in standard scalar Kalman notation):
- Innovation: y = z - x_hat
- Kalman gain: K = s_hat^2 / (s_hat^2 + s_z^2)
- Optimal value: x = x_hat + K * y
- Optimal variance: s^2 = (1 - K) * s_hat^2
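The scalar Gaussian fusion above can be sketched in a few lines; the function and variable names below are illustrative, not part of the original slides:

```python
# Scalar Kalman update: fuse a prior estimate (x_prior, var_prior)
# with a measurement (z, var_meas). All errors assumed Gaussian.
def kalman_update_1d(x_prior, var_prior, z, var_meas):
    innovation = z - x_prior                   # measurement residual
    gain = var_prior / (var_prior + var_meas)  # Kalman gain
    x_post = x_prior + gain * innovation       # optimal (fused) value
    var_post = (1.0 - gain) * var_prior        # optimal (reduced) variance
    return x_post, var_post

# Equal variances: fused estimate is the midpoint, variance halves.
x, var = kalman_update_1d(0.0, 4.0, 1.0, 4.0)  # -> (0.5, 2.0)
```

Note that the posterior variance is always smaller than the prior variance: fusing two Gaussian estimates can only reduce uncertainty.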

SLAM
- A simultaneous localization and map-building system, here built on an EKF (Extended Kalman Filter) framework.
If we have a solution to the SLAM problem…
- Robots can operate in an environment without a priori knowledge of a map.
- This opens up a vast range of potential applications for autonomous vehicles and robots.
- Research over the last decade has shown that SLAM is indeed possible.

SLAM
Kalman filter and the SLAM problem: the extended Kalman filter form for SLAM (the slide's equations, reconstructed in standard EKF notation, where x_{k-1} is the previous value, u_k and z_k are the input and measurement, f and h are the motion and observation functions, and x_k is the computed value):
- Prediction: x_k^- = f(x_{k-1}, u_k), P_k^- = F P_{k-1} F^T + Q
- Observation: the measurement z_k is predicted as h(x_k^-)
- Update: K = P_k^- H^T (H P_k^- H^T + R)^{-1}; x_k = x_k^- + K (z_k - h(x_k^-)); P_k = (I - K H) P_k^-
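The prediction/observation/update cycle can be sketched as two generic functions; this is a minimal illustration in standard EKF notation, not the slides' actual implementation:

```python
import numpy as np

# Minimal EKF sketch. f: state transition, h: measurement function;
# F, H: their Jacobians; Q, R: process and measurement noise covariances.
def ekf_predict(x, P, f, F, Q):
    x_pred = f(x)                 # predicted state
    P_pred = F @ P @ F.T + Q      # predicted covariance
    return x_pred, P_pred

def ekf_update(x_pred, P_pred, z, h, H, R):
    y = z - h(x_pred)                    # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_post = x_pred + K @ y
    P_post = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_post, P_post
```

With identity f, h, F, H and zero Q, this reduces exactly to the scalar Gaussian fusion shown earlier.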

Mono SLAM
What is Mono SLAM?
- An EKF-SLAM framework (EKF: Extended Kalman Filter) using a single camera, with unknown user input.
- Conventional systems assume a known control input, e.g. encoder information from a robot or vehicle (odometry).
- In most localization systems, odometry is used as the initial motion estimate; Mono SLAM does not use odometry, which is its distinguishing feature.

Mono SLAM
World frame model
- W: world coordinate frame
- R: local (camera) coordinate frame
- r: camera position vector in the W frame
- y: landmark position vector in the W frame

Mono SLAM
Motion model
The state vector holds the camera's 3D position and orientation; these are the parameters used in conventional SLAM.

Mono SLAM
Motion model: key difference between Mono SLAM and conventional SLAM
- In the robot case, we are in possession of the control inputs driving the motion, such as "move forward 1 m with a steering angle of 5 degrees".
- In the hand-held camera case, we have no such prior information about the person's movement.
Assumption (Mono SLAM)
- For a camera attached to a person, the person's intentions are unknown, but they can still be modeled statistically.
- A constant-velocity, constant-angular-velocity model is chosen, with undetermined accelerations assumed to occur with a Gaussian profile.

Mono SLAM
Motion model (Mono SLAM)
The state comprises the camera's 3D position and orientation plus its linear and angular velocities; the total dimension of the state vector is 13 (3 for position, 4 for the orientation quaternion, 3 for velocity, 3 for angular velocity).

Mono SLAM
Motion model (Mono SLAM): unknown user input (noise vector)
At each time step, unknown accelerations and angular accelerations are drawn from zero-mean Gaussian distributions.

Mono SLAM
Motion model (Mono SLAM): state update function
The new position is the previous position advanced by the (noisy) velocity over one time step, and the orientation is updated by multiplying the previous quaternion by the quaternion trivially defined by the angle-axis rotation vector. (The slide's legend distinguishes the previous values from the unknown user input.)
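A minimal sketch of this constant-velocity update, assuming a 13-D state of position r, orientation quaternion q, linear velocity v, and angular velocity w, with per-step velocity impulses (dv, dw) as the unknown user input; the helper names are illustrative, not from the paper:

```python
import numpy as np

def quat_from_rotvec(theta):
    """Quaternion (w, x, y, z) for the angle-axis rotation vector theta."""
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])  # identity rotation
    axis = theta / angle
    return np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))

def quat_mul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def motion_update(r, q, v, w, dv, dw, dt):
    """One step of the constant-velocity motion model."""
    r_new = r + (v + dv) * dt                          # advance position
    q_new = quat_mul(q, quat_from_rotvec((w + dw) * dt))  # rotate orientation
    return r_new, q_new, v + dv, w + dw                # velocities persist
```

With zero impulses the camera simply coasts: velocity and angular velocity carry over unchanged from step to step.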

Mono SLAM
Motion model (Mono SLAM): covariance update
In the EKF, the new state estimate must be accompanied by an increase in state uncertainty (process noise covariance Qv) for the camera after this motion. Qv is found via a Jacobian calculation, propagating the covariance of the noise vector through the state update function.

Mono SLAM
Motion model (Mono SLAM): covariance of the noise vector
The rate of growth of uncertainty in this motion model is determined by the size of the noise covariance, and setting these parameters to small or large values defines the smoothness of the motion we expect.
- Small: we expect very smooth motion with small accelerations; the filter is well placed to track such motion but cannot cope with sudden rapid movements.
- Large: the uncertainty in the system increases significantly at each time step; the filter can cope with rapid accelerations.

Simulation Demo
Conditions: simple circular motion, five 3D landmarks; observations are 2D, made through a projective camera model.
[Figures: a 3D view and a 2D view, showing estimated vs. real landmarks, the estimated position with its covariance, and the projected landmarks.]

Conclusion
Simulation results
- Localization is possible without control input.
- 3D position can be estimated by SLAM from the projected landmark information.
- The simulation still needs more debugging.

Mono SLAM
Quaternion form for orientation (rotation)
Euler angles
- Any sequence of rotations is equivalent to a single rotation (by a scalar angle) around some axis (Euler's rotation theorem).
- 3 degrees of freedom in 3D space.
- Suffers from the gimbal lock problem.

Mono SLAM
Quaternion form for orientation (rotation)
Axis-angle
- An arbitrary 3D rotation is composed of a 3D unit vector (the axis) and a scalar angle: 4 parameters in 3D space.
- Suffers from a singularity problem (at zero rotation the axis is undefined).

Mono SLAM
Quaternion form for orientation (rotation)
Quaternion
- Encodes an arbitrary 3D rotation via the 3D unit axis n and scalar angle θ as q = (cos(θ/2), sin(θ/2)·n): 4 parameters with a unit-norm constraint.
Why quaternions?
- Simpler algebra.
- Numerical error is easy to fix.
- No singularity and no gimbal lock problem.
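The "easy to fix numerical error" point is worth making concrete: after repeated EKF updates and multiplications, a quaternion drifts off unit norm, and a single renormalization restores a valid rotation (a sketch; the function name is illustrative):

```python
import numpy as np

def normalize_quat(q):
    """Project a drifted quaternion back onto the unit sphere."""
    return q / np.linalg.norm(q)

q = np.array([0.9998, 0.0123, -0.0045, 0.0031])  # slightly off unit length
q = normalize_quat(q)
# norm(q) is now exactly 1 (up to floating point), a valid rotation again
```

No comparably cheap repair exists for a drifted 3x3 rotation matrix, which must be re-orthonormalized (e.g. via SVD); this is one practical reason MonoSLAM keeps orientation as a quaternion in the state vector.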