
LECTURE 6 Segment-based Localization

Position Measurement Systems The problem of Mobile Robot Navigation: Where am I? Where am I going? How should I get there? Perhaps the most important result from surveying the vast body of literature on mobile robot positioning is that, to date, there is no truly elegant solution to the problem (Johann Borenstein, University of Michigan, Ann Arbor). The many partial solutions can be roughly categorized into two groups: relative and absolute position measurements.

Classification of Localization Methods: Relative. Odometry: uses encoders to measure wheel rotation. It is self-contained and always ready to provide the vehicle with an estimate of its position, but the position error grows without bound. Inertial Navigation: uses gyroscopes and accelerometers to measure rates of rotation and acceleration. It is also self-contained, but it is unsuitable for accurate positioning over extended periods of time and carries high manufacturing and equipment costs.

Classification of Localization Methods: Absolute. Active Beacons: computes the absolute position of the robot by measuring the directions of incidence of three or more actively transmitted beacons. Artificial Landmark Recognition: distinctive landmarks are placed at known locations. Errors are bounded, but the approach is computationally intensive and raises questions about persistent real-time position updates.

Today’s Lecture. Classification of Data Points: how do you assign a newly obtained data point to one of the segments already present in the map? Weighted Correction Vector: having classified the data points to segments, how do you obtain the corrected position of the robot? Quality Measures: evaluating the performance of the obtained corrected position, i.e., how correct or probable the corrected position is. Orientation Correction: having obtained the corrected position, is it possible to obtain the correct orientation of the robot?

Classification of Data Points. Under the assumption of small position error, data points will usually not lie too far from the objects they represent. The target line segment of each point is the segment to which the point is closest in the Euclidean sense. The closest distance is the perpendicular distance to the segment if the foot of the perpendicular falls between the segment's two endpoints; otherwise, it is the minimum of the distances from the point to the two endpoints.
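
The classification rule above can be sketched as follows; the function and variable names are illustrative, not from the lecture:

```python
import math

def point_segment_distance(p, a, b):
    """Closest distance from point p to segment ab, as described above:
    the perpendicular distance if the foot of the perpendicular falls
    between the endpoints, otherwise the distance to the nearer endpoint."""
    px, py = p; ax, ay = a; bx, by = b
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:                      # degenerate segment
        return math.hypot(px - ax, py - ay)
    # Parameter of the perpendicular projection onto the infinite line
    t = ((px - ax) * dx + (py - ay) * dy) / seg_len_sq
    if 0.0 <= t <= 1.0:                        # foot lies between endpoints
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))
    # Otherwise take the nearer endpoint
    return min(math.hypot(px - ax, py - ay), math.hypot(px - bx, py - by))

def classify_point(p, segments):
    """Target segment of p: the segment at minimum Euclidean distance."""
    return min(range(len(segments)),
               key=lambda i: point_segment_distance(p, *segments[i]))
```

For example, a point lying just above a horizontal wall segment is classified to that segment rather than to a distant perpendicular one.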

Weighted Correction of the Image Points to the Target. Let Δx_i, Δy_i be the displacement between the image point and the point resulting from its perpendicular projection onto the infinite line passing through the line segment. Then d_i is the distance between the i-th range data point and its target segment, computed in the manner specified on the previous slide. The sigmoid function introduces a soft non-linearity, ensuring that points close to their target segments have a greater voting strength. The neighborhood size is scheduled as c(t) = c(0)(1 - t/T); in other words, c decreases as the iterations proceed, so fewer and fewer points are brought into the correction-vector estimate.
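
A minimal sketch of this weighting scheme follows. The exact sigmoid expression is not reproduced in the transcript, so the form 1/(1 + (d/c)^(2m)) used here, and the weighted-mean combination of the displacements, are assumptions:

```python
def sigmoid_weight(d, c, m=2):
    """Soft non-linearity: near-unity weight for points with d << c,
    falling toward zero for d >> c.  This particular sigmoid form
    (neighborhood size c, steepness m) is an assumption."""
    return 1.0 / (1.0 + (d / c) ** (2 * m))

def correction_vector(displacements, distances, c):
    """Weighted correction (dX, dY) from per-point displacements
    (dx_i, dy_i) and point-to-segment distances d_i.  A sketch of the
    idea, not the paper's exact estimator."""
    weights = [sigmoid_weight(d, c) for d in distances]
    total = sum(weights)
    if total == 0.0:
        return 0.0, 0.0
    dX = sum(w * dx for w, (dx, _) in zip(weights, displacements)) / total
    dY = sum(w * dy for w, (_, dy) in zip(weights, displacements)) / total
    return dX, dY

def neighborhood(t, T, c0):
    """Linearly shrinking neighborhood c(t) = c(0) * (1 - t/T)."""
    return c0 * (1.0 - t / T)
```

As t approaches T the neighborhood shrinks to zero, so only points essentially on their target segments keep any voting strength.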

Weighted Correction of the Image Points to the Target. Then x_c = x_uc + ΔX and y_c = y_uc + ΔY, where x_c and x_uc are the corrected and uncorrected x components of the robot's position. If the target segments are all parallel to one of the two axes of the coordinate frame, then position correction can only be done along the other, orthogonal direction. This is called the hallway effect: if the target segments are parallel to the x axis, position correction can occur only along y, and vice versa.
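
The hallway effect can be seen directly from the perpendicular projection: projecting onto a line parallel to the x axis yields a displacement with zero x component. A small illustrative check (names are ours, not the lecture's):

```python
def perpendicular_displacement(p, a, b):
    """Displacement from p to its perpendicular projection onto the
    infinite line through segment ab."""
    px, py = p; ax, ay = a; bx, by = b
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    return (ax + t * dx) - px, (ay + t * dy) - py

# A segment parallel to the x axis can only contribute a y correction:
dX, dY = perpendicular_displacement((1.3, 0.2), (0.0, 0.0), (5.0, 0.0))
```

Here dX comes out exactly zero, so however the per-point displacements are weighted, a hallway of x-parallel walls cannot correct the robot's x coordinate.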

Quality Measures. How correct are our corrections? The mean-squared error measure: E_mse = Σ_i dist(p_i, l_i)^2 / n, where p_i is the i-th range data point, l_i is its corresponding target segment, and dist is the closest distance between the two. The global minimum of this function occurs at the true position of the robot; hence, the higher E_mse is, the lower the probability that the corrected position is the true position. E_mse is, however, susceptible to outliers.
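
The E_mse formula above is a direct average of squared point-to-segment distances; a self-contained sketch (illustrative names):

```python
import math

def point_segment_distance(p, a, b):
    """Closest distance from p to segment ab: perpendicular distance if
    the foot falls within the segment, else distance to nearer endpoint."""
    px, py = p; ax, ay = a; bx, by = b
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    if 0.0 <= t <= 1.0:
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))
    return min(math.hypot(px - ax, py - ay), math.hypot(px - bx, py - by))

def mean_squared_error(points, targets):
    """E_mse = (1/n) * sum_i dist(p_i, l_i)^2 over the classified points,
    where targets[i] is the segment assigned to points[i]."""
    return sum(point_segment_distance(p, a, b) ** 2
               for p, (a, b) in zip(points, targets)) / len(points)
```

Because each residual is squared, a single badly classified outlier point can dominate the sum, which is the susceptibility noted above.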

Quality Measures. Classification Factor: here c is the neighborhood size, m = 2 is the steepness of the sigmoid, and d = dist(p_i, l_i). The higher the classification factor, the higher the probability that the corrected position represents the true position of the robot; the classification factor peaks at the true position.
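
The slide's E_cf expression does not survive in this transcript; one plausible form consistent with the stated ingredients (sigmoid of d with neighborhood size c and steepness m, averaged over the points) is sketched below. This form is an assumption, not the paper's formula:

```python
def classification_factor(distances, c, m=2):
    """Assumed form of E_cf: the mean sigmoid vote over the n classified
    points, so well-classified points (d << c) push E_cf toward 1 and
    poorly classified ones (d >> c) push it toward 0."""
    return sum(1.0 / (1.0 + (d / c) ** (2 * m)) for d in distances) / len(distances)
```

Under this reading, E_cf is bounded in (0, 1] and, unlike E_mse, saturates for far-away points, which makes it robust to outliers.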

Quality Measures. E_cf is not a useful measure for comparing the accuracy of two robot positions that are close to one another; E_mse does not suffer from this. Hence a combination of both, called the comparative quantity, is used. Reference: "Precise positioning using model-based maps," IEEE ICRA, 1994.