1 An appearance-based visual compass for mobile robots
Jürgen Sturm
University of Amsterdam, Informatics Institute

2 Overview
- Introduction to Mobile Robotics
- Background (RoboCup, Dutch Aibo Team)
- Approach
- Results
- Conclusions

3 Mobile robots
- SICO at Kosair Children's Hospital (Dometic, Louisville, Kentucky)
- Sony Aibos playing soccer (Cinekids, De Balie, Amsterdam)
- Robot cranes and trucks unloading ships (Port of Rotterdam)
- RC3000, the robocleaner (Kärcher)

4 Challenge
Application: Robot Soccer

5 Robot localization
Robot localization is the problem of estimating the robot's pose relative to a map of the environment.
Probabilistic approaches deal with:
- Noise
- Ambiguity
- Uncertainty

6 Design
Sensors:
- Wheel sensors, GPS, laser scanner, camera, ...
- Feature space
Map and belief representation:
- Grid-based maps, topological graphs
- Single/multi-hypothesis trackers
Filters:
- Kalman filter, Monte Carlo methods
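To make the filter families listed above concrete, here is a generic one-dimensional Kalman measurement update. This is a textbook sketch for illustration only, not code from the presented system; the function name and parameters are assumptions:

```python
def kalman_update(mean, var, z, meas_var):
    """One-dimensional Kalman filter measurement update:
    fuse a Gaussian belief N(mean, var) with a measurement z
    that has variance meas_var."""
    k = var / (var + meas_var)        # Kalman gain: trust in the measurement
    new_mean = mean + k * (z - mean)  # shift belief toward the measurement
    new_var = (1.0 - k) * var         # fusing information shrinks variance
    return new_mean, new_var
```

With equal prior and measurement variance, the posterior mean lands halfway between belief and measurement, and the variance halves.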

7 Design of Classical Approaches
- Artificial environments: (electro-magnetic) guiding lines, (visual) landmarks
- Special sensors: GPS, laser range scanners, omni-directional cameras
- Computationally heavy: offline computation

8 Design of New Approach
- Natural environments: human environments, unstructured and unknown to the robot
- Normal sensors: camera
- Reasonable requirements: real-time, on-board

9 Platform: Sony Aibo
- Internal camera: 30 fps, 208x160 pixels
- Computer: 64-bit RISC processor, 567 MHz, 64 MB RAM, 16 MB Memory Stick, WLAN
- Actuators: legs (4 x 3 joints), head (3 joints)

10 Approach

11 Demo Video: Visual Compass

12 Approach - Synopsis

13 Localization Filter
[Block diagram: raw image → color class image → sector-based feature extraction → correlation with the previously learned map → likelihoods (sensor model); motion data → motion model → estimated motion; belief update: prior → odometry-corrected → posterior]

14 Sector-based feature extraction (1)
- Camera field of view: 50°
- Head field of view: 230°

15 Sector-based feature extraction (2)
For each sector:
- Count color class transitions in the vertical direction
- Compute relative transition frequencies
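The two steps above can be sketched as follows. The function name, the list-of-rows image representation, and the transition-matrix output format are assumptions for illustration, not the deck's actual implementation:

```python
def sector_features(color_classes, n_colors):
    """color_classes: list of pixel rows (top to bottom), each a list of
    integer color-class labels for one image sector.
    Returns an n_colors x n_colors matrix whose entry [i][j] is the
    relative frequency of a transition from class i (above) to class j
    (below) when scanning each column vertically."""
    counts = [[0.0] * n_colors for _ in range(n_colors)]
    total = 0
    n_rows, n_cols = len(color_classes), len(color_classes[0])
    for col in range(n_cols):             # scan every column of the sector ...
        for row in range(n_rows - 1):     # ... from top to bottom
            i = color_classes[row][col]
            j = color_classes[row + 1][col]
            if i != j:                    # count only actual class changes
                counts[i][j] += 1
                total += 1
    if total:                             # normalise to relative frequencies
        counts = [[c / total for c in row] for row in counts]
    return counts
```

For a 3x2 sector whose columns both change from class 0 to class 1, and one of which then changes from 1 to 2, the matrix holds 2/3 at [0][1] and 1/3 at [1][2].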

16 Sensor model (1)
- Relative frequency of transitions from color class i to color class j in direction φ
- Frequency measurements originate from a probabilistic source (a distribution)
- How can these distributions be approximated?

17 Sensor model (2)
Approximate the source by a histogram distribution (the histogram parameters constitute the map)

18 Sensor model (2)
- Likelihood that a single frequency measurement originated from direction φ
- Likelihood that a full feature vector (one sector) originated from direction φ
- Likelihood that a camera image (set of features) originated from direction φ

19 Sensor model (2)
- Likelihood that a single frequency measurement originated from direction φ
- Likelihood that all frequency measurements originated from direction φ
- Likelihood that the whole camera image originated from direction φ
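A minimal sketch of the histogram sensor model described on these slides. It assumes each map direction stores, per transition pair, a histogram given as (bin edges, bin probabilities), and that measurements are treated as independent so their likelihoods multiply (logs add); all names and the data layout are hypothetical:

```python
import math

def measurement_loglik(freq, hist):
    """Log-likelihood of one observed relative frequency under the
    histogram learned for a candidate direction.
    hist: (bin_edges, bin_probs), bin_probs summing to 1."""
    edges, probs = hist
    for k in range(len(probs)):
        if edges[k] <= freq < edges[k + 1]:
            return math.log(max(probs[k], 1e-9))  # floor avoids log(0)
    return math.log(1e-9)                         # out-of-range measurement

def image_loglik(feature_vectors, map_hists):
    """Log-likelihood that a whole camera image (a list of per-sector
    feature lists) originated from the direction whose histograms are
    map_hists (parallel structure). Under the independence assumption
    the per-measurement likelihoods multiply, so their logs add."""
    return sum(
        measurement_loglik(f, h)
        for features, hists in zip(feature_vectors, map_hists)
        for f, h in zip(features, hists)
    )
```

In the real system one such log-likelihood is computed per candidate direction φ, and the resulting likelihood profile feeds the localization filter.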

20 Localization filter: Orientational component
- Use a Bayesian filter to update the robot's beliefs (circular grid buffer)
- From this buffer, extract per time step: a heading estimate and a variance estimate
Belief update: prior → odometry-corrected → posterior
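The Bayesian filter on a circular grid buffer might look like the following sketch. The four-bin resolution in the example, the function names, and the circular-mean heading extraction are assumptions, not details taken from the deck:

```python
import math

def predict(belief, delta_bins):
    """Odometry update: rotate the circular belief grid by the
    measured heading change (in grid bins)."""
    n = len(belief)
    return [belief[(k - delta_bins) % n] for k in range(n)]

def correct(belief, likelihoods):
    """Sensor update: multiply the prior by the per-direction image
    likelihoods and renormalise (Bayes rule on the grid)."""
    post = [b * l for b, l in zip(belief, likelihoods)]
    total = sum(post)
    return [p / total for p in post]

def heading_estimate(belief):
    """Mean direction of the circular distribution: sum unit vectors
    weighted by belief mass, then take the angle of the resultant."""
    n = len(belief)
    x = sum(b * math.cos(2 * math.pi * k / n) for k, b in enumerate(belief))
    y = sum(b * math.sin(2 * math.pi * k / n) for k, b in enumerate(belief))
    return math.atan2(y, x) % (2 * math.pi)
```

Each time step runs predict (prior → odometry-corrected) and then correct (→ posterior), matching the belief-update chain shown on the slide; the spread of the resultant vector likewise yields a variance estimate.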

21 Results

22 Results: Brightly illuminated living room
- Applicable in a natural indoor environment
- Good accuracy (error < 5°)

23 Results: Daylight office environment
- Applicable in a natural office environment
- Very robust against displacement (error < 20° over 15 m)

24 Results: Outdoor soccer field
- Applicable in a natural outdoor environment

25 Results: RoboLab, 4-Legged soccer field
- Applicable in the RoboCup soccer environment

26 Results: RoboLab, 4-Legged soccer field
- True average error < 10° on a 3 x 3 m grid

27 Results: Variable and parameter studies
- Distance to the training spot
- Changes in illumination
- Angular resolution
- Scanning grid coverage
- Number of color classes

28 Localization filter: Translational component
- Use multiple training spots
- Each (projectively distorted) patch yields slightly different likelihoods
- Interpolate the translation from these likelihoods (visual homing)
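One simple way to realise the interpolation step, offered as an assumption since the slide does not give the formula: weight each training spot's known position by the likelihood its map assigns to the current image, so a well-matching spot pulls the estimate toward itself:

```python
def interpolate_position(spots):
    """Estimate the robot's (x, y) position as the likelihood-weighted
    mean of the training-spot positions.
    spots: list of ((x, y), likelihood) pairs, one per training spot.
    A hypothetical simplification of the visual-homing step."""
    total = sum(lik for _, lik in spots)
    x = sum(px * lik for (px, _), lik in spots) / total
    y = sum(py * lik for (_, py), lik in spots) / total
    return x, y
```

With two spots at (0, 0) and (2, 0) and likelihoods 1 and 3, the estimate lands at (1.5, 0), three quarters of the way toward the better-matching spot.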

29 Demo Video: Visual Homing

30 Results: Visual Homing
[Plot: positioning accuracy, x/y in cm; the robot walks back to the center after being kidnapped]
- Proof of concept

31 Conclusions
Novel approach to localization:
- Works in unstructured environments
- Accurate, robust, efficient, scalable
- An interesting approach for mobile robots

32 Future Research
- Use Monte Carlo localization
- Extend to dynamic environments
- Triangulation from two training spots
Announced succeeding projects:
- Port to RoboCup Rescue Simulation (MSc project)
- RoboCup 2007 Open Challenge (DOAS project)

33 3rd Prize, Technical Challenges of the 4-Legged League, RoboCup 2006 in Bremen

34 Thank You!

