Daniela Doroftei, Royal Military School, Av. de la Renaissance 30, B-1000 Brussels, Belgium. Contribution to ViewFinder – WP2.



The goal of this research project is to prepare the RobuDem robot for an outdoor crisis-management task.

To achieve this goal, the robot must be able to:
- Be tele-operated by a remote user
- Ensure its own safety by avoiding obstacles detected by its sensors (sonar, stereo vision, ...)
- Navigate autonomously in an unknown environment by mapping its surroundings
- Detect chemical contamination
- Navigate to pre-defined goal positions
- Execute complex tasks such as searching for human victims

Here we focus on the design of the control architecture for such a robot.

The RobuDem platform:
- Outdoor all-terrain robot
- Four driven and steered wheels, allowing different drive modes
- Supports heavy payloads (up to 300 kg)

Sensors:
- Ultrasound sensors for obstacle detection
- Joystick for remote control
- Chemical sensor with an integrated temperature sensor

- Differential GPS (combined with wheel encoders) for positioning

- A quad-camera vision system consisting of a Bumblebee stereo vision head and two digital cameras

Legend of the architecture diagrams:
- Sensors: green rounded rectangles
- Processors (behavior processors, visual processors, positioning & mapping): violet, gold and blue rectangles
- Actuators (SteerRobudem): black ovals

Tele-operation:
- Transmission of joystick commands to the robot: Joystick → FilterCommands
- Transmission of the filtered commands to the robot: FilterCommands → SteerRobudem
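As an illustration, the tele-operation chain can be sketched as follows. This is a hypothetical sketch, not the actual RobuDem code: the speed limit, the rate limit, and the function names stand in for whatever filtering the real FilterCommands module performs.

```python
# Illustrative sketch of the tele-operation pipeline: raw joystick commands
# are filtered (clamped and rate-limited) before being sent to the robot.
# All limits are assumptions, not the real RobuDem parameters.

MAX_SPEED = 1.0   # m/s, assumed safety limit
MAX_DELTA = 0.2   # max change per control cycle, assumed rate limit

def filter_command(requested: float, previous: float) -> float:
    """FilterCommands: clamp the requested speed and limit its rate of change."""
    clamped = max(-MAX_SPEED, min(MAX_SPEED, requested))
    delta = max(-MAX_DELTA, min(MAX_DELTA, clamped - previous))
    return previous + delta

def steer_robudem(speed: float) -> str:
    """SteerRobudem: stand-in for the actual robot command interface."""
    return f"drive at {speed:.2f} m/s"
```

Even if the user yanks the joystick to full deflection, the filtered command only ramps up by one rate-limit step per cycle, which is one plausible reason for inserting a filter between joystick and actuators.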

In semi-autonomous control, there is no longer a direct link between the joystick and the robot actuators:
- Instead, a behavior-based controller manages the commands sent to the robot
- The "Obey Joystick" behavior is only one of the multiple behaviors contributing to the global robot control strategy

[Diagram: Joystick → Obey Joystick behavior → Behavior Controller → FilterCommands → SteerRobudem]

- The joystick returns the user commands, telling the robot where to go. The "ObeyJoystick" behavior calculates the best action to perform in order to follow the user's commands.
- The on-board sonars deliver information about obstacles in the robot's path. The "AvoidObstaclesUsingSonar" behavior calculates the best action to perform in order not to bump into the detected obstacles.
- These commands are fused by the behavior-based controller ("FuseBehaviors"), which takes both behaviors into account to arrive at a globally optimal and consistent command.
- The fused command is filtered ("FilterCommands") and sent to the robot ("SteerRobudem").
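A minimal sketch of such behavior fusion is given below, assuming each behavior votes for a (speed, turn) command with a weight and FuseBehaviors computes the weighted average. The command format, the weights, and the 2 m sonar threshold are illustrative assumptions, not the presented system's actual fusion rule:

```python
# Sketch of weighted behavior fusion: each behavior returns a (speed, turn)
# vote plus a weight; FuseBehaviors averages the votes by weight.

def obey_joystick(user_cmd):
    # Follow the user's command with a constant, moderate priority.
    return user_cmd, 1.0

def avoid_obstacles_using_sonar(obstacle_dist):
    # Vote to stop and turn away; the weight grows as obstacles get closer
    # and vanishes beyond an assumed 2 m safety range.
    weight = 0.0 if obstacle_dist > 2.0 else (2.0 - obstacle_dist)
    return (0.0, 0.5), weight

def fuse_behaviors(votes):
    """Weighted average of (speed, turn) votes from all active behaviors."""
    total = sum(w for _, w in votes)
    if total == 0:
        return (0.0, 0.0)
    speed = sum(v[0] * w for v, w in votes) / total
    turn = sum(v[1] * w for v, w in votes) / total
    return (speed, turn)
```

With an obstacle 1 m away, both behaviors carry equal weight, so the robot slows to half the commanded speed while turning away; with no obstacle nearby, the joystick command passes through unchanged.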

General requirements:
- The robot needs accurate positioning
- The robot needs to build up a model (map) of its environment, so it can reason with this data

Robot safety:
- The robot needs to avoid areas with excessive heat
- The robot needs to detect chemicals and avoid areas that are too contaminated
- The robot needs to detect obstacles using its sonar sensors and its stereo vision system, and avoid these obstacles
- The robot needs to avoid all previously detected obstacles stored in the environmental model (map)

Robot goals:
- The robot must be tele-operable
- The robot needs to detect chemicals and find the source of contamination
- The robot should maximize its knowledge about the environment it is placed in
- The robot needs to search for human victims on the disaster site
- The robot needs to be able to execute a user-defined trajectory, given as a set of waypoints
- In the event of a loss of network connection, the robot should be able to return to the base station

Position estimation and mapping:
- Data from the rotation of the wheels, measured by wheel encoders (Odometry), is used to estimate the position.
- A real-time differential GPS system provides absolute positioning data by acquiring signals from at least four satellites.
- These two positioning estimates are fused a first time (PositionEstimation) to obtain a more accurate and robust position estimate.
- The environment is observed by a camera (CameraFramegrabber), which detects and tracks features in the environment to estimate the robot motion and to update the map.
- If available, GIS (Geographic Information System) data is used to initialize the maps (GISMap).
- A Visual Simultaneous Localization and Mapping (VisualSLAM) module analyses all this input data: it builds up an environmental model (map) and places the robot accurately on this map, producing a 6-dimensional position estimate.
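The first fusion step above combines two noisy position estimates. A standard way to do this, shown here purely as an illustration (the actual PositionEstimation module is not described in detail), is inverse-variance weighting; the variances below are assumed values, not RobuDem sensor specifications:

```python
# Illustrative fusion of two scalar position estimates (e.g. DGPS and
# wheel odometry) by inverse-variance weighting: the less uncertain
# estimate gets the larger weight, and the fused variance shrinks.

def fuse_estimates(x_gps, var_gps, x_odo, var_odo):
    """Return the minimum-variance combination of two scalar estimates."""
    w_gps = 1.0 / var_gps
    w_odo = 1.0 / var_odo
    x = (w_gps * x_gps + w_odo * x_odo) / (w_gps + w_odo)
    var = 1.0 / (w_gps + w_odo)
    return x, var
```

With equal variances the result is the plain average; when odometry drifts (larger variance), the fused estimate leans toward the GPS reading, which matches the "more accurate and robust" behaviour the slide describes.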

Danger-zone avoidance:
- The VisualSLAM module delivers the robot position and a map
- A chemical sensor instantaneously measures contaminant concentrations; a temperature sensor measures the temperature
- These data are used to build up two maps containing the chemical and heat distribution (LocalChemicalMap, LocalHeatMap)
- Using these maps, two behaviors, "AvoidChemicals" and "AvoidHotZones", steer the robot away from danger zones
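One simple way such an avoidance behavior could use a local map, sketched here under the assumption of a grid map of scalar readings (the real map representation is not specified in the slides), is to step toward the neighbouring cell with the lowest measured value:

```python
# Sketch of an avoidance behavior over a local grid map: from the robot's
# cell, vote to move toward the 4-neighbour cell with the lowest measured
# value (chemical concentration or temperature). Purely illustrative.

def avoid_danger(local_map, row, col):
    """Return the (drow, dcol) step toward the safest neighbouring cell,
    or (0, 0) if the current cell is already the safest."""
    best_step, best_val = (0, 0), local_map[row][col]
    for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        r, c = row + dr, col + dc
        if 0 <= r < len(local_map) and 0 <= c < len(local_map[0]):
            if local_map[r][c] < best_val:
                best_step, best_val = (dr, dc), local_map[r][c]
    return best_step
```

The same routine serves both AvoidChemicals and AvoidHotZones, one instance per map, which is in the spirit of the modular design: one behavior template, two data sources.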

The robot is controlled using a behavior-based controller which sends steering commands. To generate these commands, the controller aims to fuse a number of objectives, tasks and requirements.

General requirements:
- The robot needs accurate positioning
- The robot needs to build up a model (map) of its environment to reason with this data

[Diagram: the positioning and mapping modules (GPS, Odometry, GISMap, CameraFramegrabber, PositionEstimation, VisualSLAM) feed into FuseBehaviors → FilterCommands → SteerRobudem.]

Robot safety:
- The robot needs to avoid areas with excessive heat
- The robot needs to detect chemicals and avoid areas that are too contaminated

[Diagram: the ChemicalSensor and Temperature inputs build the LocalChemicalMap and LocalHeatMap, which drive the AvoidChemicals and AvoidHotZones behaviors.]

Robot safety:
- The robot needs to avoid all previously detected obstacles stored in the environmental model (map)

[Diagram: an AvoidObstaclesUsingSLAM behavior is added, driven by the VisualSLAM map.]

Robot safety:
- The robot needs to detect obstacles using its sonar sensors and avoid these obstacles

[Diagram: an AvoidObstaclesUsingSonar behavior is added, driven by the Sonar input.]

Robot safety:
- The robot needs to detect obstacles using its stereo vision system and avoid these obstacles

[Diagram: an AvoidObstaclesUsingStereo behavior is added, driven by the StereoFramegrabber input.]

Robot goals:
- The robot must be tele-operable

[Diagram: an ObeyJoystick behavior is added, driven by the Joystick input.]

Robot goals:
- The robot needs to detect chemicals and find the source of contamination

[Diagram: a GoToChemicals behavior is added, driven by the LocalChemicalMap.]

Robot goals:
- The robot should maximize its knowledge about the environment it is placed in

[Diagram: a MaximizeTerrainKnowledge behavior is added.]

Robot goals:
- The robot needs to search for human victims
- Humans are searched for in each of the four camera images
- A "SearchHumans" behavior directs the robot in the detected person's direction

[Diagram: four PersonDetector modules, fed by the camera framegrabbers, drive the SearchHumans behavior.]

Robot goals:
- The robot needs to be able to execute a user-defined trajectory, given as a set of waypoints

[Diagram: a GoalAssigner and a GlobalPathPlanner drive a GoToGoals behavior.]
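The waypoint-following behavior can be sketched as follows; this is a hypothetical illustration of a GoToGoals-style behavior, and the 0.5 m arrival threshold is an assumption, not a documented RobuDem parameter:

```python
# Sketch of a GoToGoals-style behavior: head toward the next user-defined
# waypoint, and advance to the following one once it has been reached.

import math

def go_to_goals(position, waypoints, threshold=0.5):
    """Return (heading_to_next_waypoint, remaining_waypoints).

    heading is None once all waypoints have been reached."""
    while waypoints:
        wx, wy = waypoints[0]
        dx, dy = wx - position[0], wy - position[1]
        if math.hypot(dx, dy) > threshold:
            return math.atan2(dy, dx), waypoints
        waypoints = waypoints[1:]   # waypoint reached: drop it, try the next
    return None, []
```

In the fused architecture this heading would become one more weighted vote into FuseBehaviors, alongside the obstacle-avoidance behaviors, rather than a command sent directly to the robot.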

Robot goals:
- In the event of a loss of network connection, the robot should be able to return to the base station

[Diagram: a ConnectionChecker drives a ReturnToBase behavior.]
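A minimal sketch of this failsafe, assuming the base station sends periodic heartbeats (the actual link-monitoring mechanism is not described in the slides, and the 5 s timeout is an assumption):

```python
# Sketch of the ConnectionChecker / ReturnToBase logic: if no heartbeat
# from the base station arrives within a timeout, the ReturnToBase
# behavior takes over the controller.

def connection_lost(last_heartbeat, now, timeout=5.0):
    """ConnectionChecker: True when the link has been silent too long."""
    return (now - last_heartbeat) > timeout

def select_mode(last_heartbeat, now):
    """Switch the controller to ReturnToBase when the network drops."""
    return "ReturnToBase" if connection_lost(last_heartbeat, now) else "Normal"
```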

[Diagram: a TaskAssigner module is added on top of the goal-oriented behaviors.]

[Complete architecture diagram: the sensor inputs (GPS, Odometry, GISMap, CameraFramegrabber, StereoFramegrabber, Sonar, ChemicalSensor, Temperature, Joystick) feed the processing modules (PositionEstimation, VisualSLAM, LocalChemicalMap, LocalHeatMap, four PersonDetectors, GoalAssigner, GlobalPathPlanner, TaskAssigner, ConnectionChecker). These drive the behaviors (AvoidObstaclesUsingSLAM, AvoidObstaclesUsingSonar, AvoidObstaclesUsingStereo, AvoidChemicals, AvoidHotZones, ObeyJoystick, GoToChemicals, GoToGoals, ReturnToBase, SearchHumans, MaximizeTerrainKnowledge), which are combined by FuseBehaviors and sent through FilterCommands to SteerRobudem.]

Using this modular behavior-based control framework, the robot can:
- Be tele-operated by a remote user
- Ensure its own safety by avoiding obstacles detected by its sensors (sonar, stereo vision, ...)
- Navigate autonomously in an unknown environment by mapping its surroundings
- Detect chemical contamination
- Navigate to pre-defined goal positions
- Execute complex tasks such as searching for human victims

