
1
An Inexpensive Method for Evaluating the Localization Performance of a Mobile Robot Navigation System Harsha Kikkeri, Gershon Parent, Mihai Jalobeanu, and Stan Birchfield Microsoft Robotics

2
Motivation
Goal: Automatically measure the performance of a mobile robot navigation system
Purpose:
- Internal comparison – how is my system improving over time?
- External comparison – how does my system compare to others?
Requirements:
- Repeatable – not just playback of a recorded file, but running the system again (with environment dynamics)
- Reproducible – others should be able to measure the performance of their system in their environment
- Comparable – need to compare solutions with different hardware and sensors, in different environments
- Inexpensive – cost should not be a barrier to use
We focus only on localization performance here.

3
Scalability
The system should scale:
- in space (large environments)
- in time (long runs)
- in variety (different types of environments)
Simplicity is key to scalability:
- Low setup time
- Easy calibration
- Inexpensive components
- Non-intrusive

4
Previous work
- Datasets: Radish, New College, SLAM datasets – do not always have ground truth
- SLAM with ground truth: Rawseeds, Freiburg, TUM – use prerecorded data, do not scale easily
- Qualitative evaluation: RoboCupRescue, RoboCup@Home – focus is on achieving a particular task
- Benchmarking initiatives: EURON, RoSta, PerMIS, RTP – have not yet created a definitive set of metrics/benchmarks for navigation
- Comparison on a small scale: Teleworkbench – small scale
- Retroreflective markers and laser: Tong and Barfoot, ICRA 2011 – requires a laser, subject to occlusion

5
Our approach
- Checkerboard pattern as landmark
- Yields the 3D pose of the camera relative to the target
- Converted to the 2D pose of the robot on the floor
(figure: landmark with x, y axes)
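The last step of this pipeline (3D camera pose → 2D floor pose) can be sketched in a few lines. This is a minimal illustration with a hypothetical helper name; it assumes pose estimation has already produced a rotation matrix R and translation t of the camera w.r.t. the landmark, with the landmark's x-y plane parallel to the floor:

```python
import math

def camera_pose_2d(R, t):
    """Collapse a 3D camera pose w.r.t. the landmark (3x3 rotation matrix R,
    translation t from checkerboard pose estimation) into a 2D floor pose
    (x, y, theta).  Hypothetical convention: the landmark's x-y plane is
    parallel to the floor, and the heading is the camera's x axis projected
    onto that plane."""
    x, y = t[0], t[1]
    theta = math.atan2(R[1][0], R[0][0])  # yaw of the camera's x axis
    return x, y, theta
```

With an upright camera this discards only the height t[2] and the out-of-plane rotation; the later calibration slides handle the cases where the camera and floor are tilted.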

6
A useful instrument
Laser level:
- The upward-facing laser provides a plumb-up line
- The downward-facing laser provides a plumb-down line
- Horizontal laser (not used)
- Self-leveling, so the plumb lines are parallel to gravity
Used to determine the point on the ground directly below the origin of the target.

7
Procedure
Calibration:
- Internal camera parameters
- External camera parameters w.r.t. the robot (position, tilt)
- Floor parameters under each landmark (tilt)
Map-building:
- Build the map
- When under a landmark, the user presses a button
- Pose estimation + calibration → robot pose w.r.t. landmark
- Store the robot pose w.r.t. the map*
Runtime:
- Generate a sequence of waypoints
- When the robot thinks it is under a landmark:*
- Pose estimation + calibration → robot pose w.r.t. landmark
- Error is the difference between the pose at runtime and the pose at map-building
*Note: Any type of map can be used.
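The error in the last step (runtime pose vs. map-building pose under the same landmark) is simple enough to state in code. A minimal sketch with a hypothetical helper name, using (x, y, theta) poses:

```python
import math

def pose_error(map_pose, run_pose):
    """Localization error between the pose stored at map-building time and
    the pose measured at runtime, both (x, y, theta) w.r.t. the same
    landmark.  Returns (position error, absolute orientation error)."""
    dx = run_pose[0] - map_pose[0]
    dy = run_pose[1] - map_pose[1]
    # Wrap the heading difference into [-pi, pi) before taking its magnitude.
    dth = (run_pose[2] - map_pose[2] + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(dx, dy), abs(dth)
```

The wrap matters: without it, a heading near +π at map time and −π at runtime would report a spurious error near 2π.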

8
Coordinate systems
Chain of frames: image → camera → robot → landmark → world
- image → camera: internal camera parameters (2D/3D Euclidean) [calibration]
- camera → robot: external camera parameters, 3D Euclidean [calibration]
- camera → landmark: 2D/3D Euclidean, relative metric [pose estimation]
- landmark → world: 2D Euclidean, absolute metric (optional)
- robot → world: 2D Euclidean [localization – what we want]
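Once everything is on the floor plane, chaining these frames is just composition of 2D rigid transforms. A minimal sketch (hypothetical helper name): composing the landmark-in-world pose with the robot-in-landmark pose gives the robot-in-world pose, i.e. the localization we want:

```python
import math

def compose_2d(a, b):
    """Compose 2D rigid poses: a is the pose of frame B expressed in frame A,
    b is a pose expressed in frame B; the result is that pose in frame A."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

# robot_in_world = compose_2d(landmark_in_world, robot_in_landmark)
```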

9
Camera-to-robot calibration
Need to determine:
- rotation between camera and robot: 3 parameters
- translation between camera and robot: 3 parameters
→ 6 parameters
If the floor were completely flat, and the camera were mounted perfectly upright, then
  x_r = x − d_rc cos θ_rc
  y_r = y − d_rc sin θ_rc
  θ_r = θ − α
where (x, y, θ) is the camera pose, (x_r, y_r, θ_r) is the robot pose, d_rc and θ_rc describe the camera offset from the robot center, and α is the camera roll.
But the floor is often not flat, and the camera is never upright.
(figure: robot, camera, driving direction, wheel base)

10
Camera-to-robot calibration
When the floor is not flat, and the camera is not upright, then
  x_r = x − d_rc cos θ_rc − z sin θ_c cos(θ + φ_c) − z sin θ_f cos φ_f
  y_r = y − d_rc sin θ_rc − z sin θ_c sin(θ + φ_c) − z sin θ_f sin φ_f
  θ_r = θ − α
Estimate:
- the tilt of the camera w.r.t. the floor normal (θ_c)
- the azimuth of the camera tilt plane w.r.t. the forward direction of the robot (φ_c)
- the tilt of the floor w.r.t. gravity (θ_f)
- the azimuth of the floor tilt plane w.r.t. the positive x axis of the landmark (φ_f)
To do so, rotate the robot incrementally through 360 degrees:
- the rotation axis is perpendicular to the floor
- the optical axis traces a cone
(figure: floor, gravity, floor normal, optical axis, angles θ_c and θ_f, radii r_c and r_f)
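Read literally, the corrected equations translate into a few lines of code. This is a sketch under the assumption that the slide's notation is reconstructed correctly (θ_rc as the azimuth of the camera offset, and the two z·sin terms as the camera-tilt and floor-tilt corrections); the function name is hypothetical:

```python
import math

def robot_pose(x, y, theta, z, d_rc, th_rc, alpha, th_c, phi_c, th_f, phi_f):
    """Convert a camera pose (x, y, theta) w.r.t. the landmark into the
    robot pose, applying the camera-offset, camera-tilt, and floor-tilt
    corrections from the slide.  z is the camera-to-landmark distance;
    angle names follow the slide's symbols."""
    xr = (x - d_rc * math.cos(th_rc)
            - z * math.sin(th_c) * math.cos(theta + phi_c)
            - z * math.sin(th_f) * math.cos(phi_f))
    yr = (y - d_rc * math.sin(th_rc)
            - z * math.sin(th_c) * math.sin(theta + phi_c)
            - z * math.sin(th_f) * math.sin(phi_f))
    return xr, yr, theta - alpha
```

With a flat floor and an upright camera (th_c = th_f = 0) this degenerates to the simpler flat-floor equations of the previous slide.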

11–22
Calibration geometry (figure sequence): the geometry is built up step by step. Starting from the floor, the landmark, and the gravity direction, the figures add in turn the robot and camera center; the floor tilt θ_f and its azimuth φ_f; the axis of rotation (the floor normal); the first optical axis with camera tilt θ_c and the measured values (x_1, z_1); then, after rotating the robot, the second optical axis with measured values (x_2, z_2). The two optical axes are 180° apart.

23
Calibration geometry (figure)
(x_1, z_1) and (x_2, z_2) are from pose estimation at the two opposite headings.
  sin θ_c = (x_2 − x_1) / 2z
  sin θ_f = (x_2 + x_1) / 2z
where z = (z_1 + z_2)/2.
Note: x_1 + (x_2 − x_1)/2 = (x_2 + x_1)/2.
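The two relations on this slide are directly usable as a calibration step. A sketch (hypothetical helper name) recovering the camera tilt θ_c and floor tilt θ_f from the two pose estimates taken 180° apart:

```python
import math

def tilts_from_opposite_poses(x1, z1, x2, z2):
    """Recover camera tilt th_c and floor tilt th_f from pose estimates
    (x1, z1) and (x2, z2) measured 180 degrees apart, via
    sin th_c = (x2 - x1)/(2z) and sin th_f = (x2 + x1)/(2z),
    where z = (z1 + z2)/2."""
    z = (z1 + z2) / 2.0
    th_c = math.asin((x2 - x1) / (2.0 * z))
    th_f = math.asin((x2 + x1) / (2.0 * z))
    return th_c, th_f
```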

24–25
Calibration geometry (figure): the two measurements define the circle traced by the optical axis on the landmark plane.
- radius of the circle: r_c, with sin θ_c = r_c / z = (x_2 − x_1) / 2z
- distance from the landmark center to the circle center: r_f, with sin θ_f = r_f / z = (x_2 + x_1) / 2z
where z = (z_1 + z_2)/2 and (x_1, z_1), (x_2, z_2) are from pose estimation.

26
Calibration geometry (from real data)
(figure: top-down view of the traced circle, with the recovered azimuth and tilt angles)

27
Evaluating accuracy
- Mounted the camera to the carriage of a CNC machine
- Moved it to different known (x, y, θ) positions and measured the pose
- Covered area: 1.3 × 0.6 m
- Position error: μ = 5 mm, σ = 2 mm, max = 11 mm
- Angular error: μ = 0.3°, σ = 0.2°, max = 1°

28
Evaluating accuracy
- Placed the robot at 20 random positions under one landmark
- Position error usually < 20 mm
- Orientation error usually < 1°

29
Evaluating accuracy
- 15 landmarks across 2 buildings
- Placed the robot at 5 canonical positions
- Position error usually < 20 mm
- Orientation error usually < 1°

30
Evaluating accuracy
Our accuracy is comparable to other systems, and our system is scalable to large environments:
- GTvision/GTlaser from Ceriani et al., AR 2009 (Rawseeds)
- mocap from Kümmerle et al., AR 2009
- retroreflective markers from Tong and Barfoot, ICRA 2011 – scales to very large single-floor environments (with an additional step)
- ours – scales to arbitrarily large environments

31
Evaluating accuracy Two different buildings on the Microsoft campus

32
Evaluating accuracy
- Automated runs in 2 different environments
- Accuracy comparable
- Easy to set up
- Easy to maintain

33
Computing global coordinates
Theodolite:
- A horizontal laser emanates from a pan-tilt head and reflects off a mirror
- Measures (w.r.t. gravity):
  - horizontal distance to the mirror
  - pan angle to the mirror
  - tilt angle to the mirror (not used)

34
Computing global coordinates
For target positions:
- Repeatedly measure the distance and angle for each triplet of targets with line-of-sight
- This yields the 2D Euclidean coordinates of all targets in a common global coordinate system
- The high accuracy of the theodolite removes nearly all drift
- (optional) Drift can be checked by summing all the angles around a loop and comparing with 360 degrees
(figure: theodolite and reflector, with measured lengths l_12, l_23, l_34, l_45, l_15, l_67, l_78)
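The optional drift check in the last bullet is a one-liner. A sketch (hypothetical helper, angles in degrees), assuming the inputs are the measured turning angles around one closed loop of targets:

```python
def loop_drift_deg(loop_angles_deg):
    """Loop-closure residual: the measured angles around a closed loop of
    targets should sum to 360 degrees; the leftover is accumulated drift."""
    return sum(loop_angles_deg) - 360.0
```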

35
Computing global coordinates
For target orientation: place the reflector under several positions within the target (only 2 are needed).
Given l_1, l_2, θ (from the theodolite) and t_length (known), find φ.
Naïve solution – sensitive to noise:
  sin φ = (l_1 − l_2 cos θ) / t_length
Better solution – the key is to use only measured values:
  tan φ = (l_1 − l_2 cos θ) / (l_2 sin θ)
(equivalently, sin φ = (l_1 − l_2 cos θ) / t′, where t′² = (l_1 − l_2 cos θ)² + (l_2 sin θ)²)
(figure: theodolite, reflector, target, with measured lengths l_1 and l_2)
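Under this reading of the slide's formula (the denominator l_2·sin θ is an assumption recovered from the triangle geometry, since only the numerator survives legibly), the orientation computation is, as a sketch:

```python
import math

def target_orientation(l1, l2, theta):
    """Target orientation phi from two theodolite measurements: distances
    l1 and l2 to two reflector positions on the target, separated by pan
    angle theta.  Uses only measured values:
    tan(phi) = (l1 - l2*cos(theta)) / (l2*sin(theta))."""
    return math.atan2(l1 - l2 * math.cos(theta), l2 * math.sin(theta))
```

Using atan2 instead of asin avoids dividing by a noisy estimate of the baseline length, which is the slide's point about the naïve solution.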

36
Navigation contest
Microsoft and Adept are organizing the Kinect Autonomous Mobile Robot Contest at IROS 2014 in Chicago.
http://www.iros2014.org/program/kinect-robot-navigation-contest

37
Conclusion
A system for evaluating the localization accuracy of navigation:
- Inexpensive
- Easy to set up
- Easy to maintain
- Highly accurate
- Scalable to arbitrarily large environments
- Scalable to arbitrarily long runs (in time or space)
- With a theodolite, global coordinates are possible
We have begun long-term, large-scale comparisons (results forthcoming).
Mobile robot navigation contest at IROS 2014.

38
Thanks!
