Robotics applications of vision-based action selection Master Project Matteo de Giacomi.


Contents  Introduction  Controller Architecture  Webots implementation  Visual System  Amphibot II implementation  Conclusion

Introduction - Project Objectives - Related works - Used robots

Project Objectives Use Stereo Vision to make a real robot reactively:  Avoid Obstacles  Flee from Predators  Follow Prey

Related Works  Schema-based architecture [Arkin]  Potential Fields [Andrews] [Khatib]  Steering [Reynolds]  Subsumption architecture [Brooks]

Used Robots  Amphibot II: 8 body elements  Salamandra: body elements and leg elements Speed is controlled through a Drive signal, direction through a Turn signal

Controller Architecture - Overview - Behavioral Constants - Obstacle Avoidance

[Controller architecture diagram: the visual input produces a disparity map and estimated predator/prey positions and distances; a memory block stores Turn, direction, pred_pos, prey_pos; behavior selection (obstacles / predator / prey, with fear and persistence) outputs the DRIVE and TURN signals, with motor position fed back to correct the behavior.]

Behavioral Constants  Reactivity (min time between two different behaviors)  Panic (when stuck, time after which the robot starts moving randomly)  Confidence (min distance to an object before collision danger is triggered)  Daring (min distance to which the robot may approach the predator)  Fear (time spent in the fleeing state after losing eye contact with the predator)  Persistence (while a prey is lost, time spent in the search state before giving up)
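The six constants above can be gathered in one configuration object. This is a minimal sketch, assuming seconds and metres as units; the field names follow the slide, but the default values are purely illustrative and not from the project.

```python
from dataclasses import dataclass

@dataclass
class BehavioralConstants:
    # Default values are illustrative placeholders, not the project's tuning.
    reactivity: float = 1.0    # min time (s) between two different behaviors
    panic: float = 5.0         # time (s) stuck before random motion starts
    confidence: float = 0.3    # min distance (m) before collision danger triggers
    daring: float = 0.5        # min distance (m) the robot may approach the predator
    fear: float = 3.0          # time (s) fleeing after losing sight of the predator
    persistence: float = 4.0   # time (s) searching a lost prey before giving up

constants = BehavioralConstants(confidence=0.4)
```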

Obstacle Avoidance (1)  Avoid static obstacles  Avoid sudden obstacles (e.g., a foot)  Detect dead-ends (requiring the implementation of backward locomotion) State machine: FORWARD (Turn = max(X), Drive = x_center), switching to BACKWARD when Drive <= 0; BACKWARD (Turn = const., Drive = min(x_center, max(X\{x_center}))), switching back to FORWARD when Drive > 0
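One possible reading of this state machine, sketched below under stated assumptions: X is the list of polar sector distances, x_center the frontal sector, and Drive is made signed by subtracting the Confidence threshold (that subtraction, the function name `avoidance_step`, and the constant backward turn value are assumptions of this sketch, not the project's code).

```python
def avoidance_step(state, sectors, center, confidence=0.3):
    """One control step of the FORWARD/BACKWARD obstacle-avoidance machine.

    sectors: polar distance map (one clearance value per sector)
    center:  index of the frontal sector (x_center)
    Returns (new_state, turn, drive).
    """
    others = sectors[:center] + sectors[center + 1:]   # X \ {x_center}
    x_center = sectors[center]
    if state == "FORWARD":
        turn = max(sectors)                  # steer toward the freest sector
        drive = x_center - confidence        # signed: <= 0 means front blocked
        if drive <= 0:
            state = "BACKWARD"               # dead-end or sudden obstacle
    else:  # BACKWARD
        turn = 0.1                           # constant turn while reversing
        drive = min(x_center, max(others)) - confidence
        if drive > 0:
            state = "FORWARD"                # front is clear again
    return state, turn, drive
```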

Obstacle Avoidance (2)  Avoidance is triggered if an obstacle is too close (see Confidence)  In a cluttered environment, one tends to approach obstacles more closely than in an open space  Confidence therefore varies according to an estimate of obstacle density

Webots Implementation - action selection - influence of behavioral constants

Interaction between behaviors Video: obstacle avoidance, prey and predator action selection

Influence of behavioral constants  When both a prey and a predator are detected Fear and Daring affect robot behavior

Visual System - Distance Measures Analysis - Prey and Predator Tracking

Input Mapping (1)  Input: an m × n distance grid  Output: a polar distance map  Each sector's distance estimate is the minimum over the cells of its column (pessimistic approach): min(col_1), …, min(col_m)
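The column-minimum reduction above is a one-liner; a minimal sketch (function name assumed, grid stored as a list of rows):

```python
def grid_to_polar_map(grid):
    """Collapse an m x n distance grid into a polar distance map.

    Each of the m columns becomes one angular sector; the sector's
    distance is the minimum over its n cells (pessimistic approach).
    grid: list of n rows, each a list of m distance values.
    """
    m = len(grid[0])
    return [min(row[col] for row in grid) for col in range(m)]

# 3 columns x 2 rows: each sector keeps the closest reading in its column.
polar = grid_to_polar_map([[5, 2, 9],
                           [4, 7, 1]])   # -> [4, 2, 1]
```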

Input Mapping (2)  Issue: the filmed area depends on the robot's head position  Solution: knowing the camera angle and angular speed (which depend on Turn and Drive), map the camera field onto the visual field
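The mapping step can be illustrated as an azimuth shift: a direction measured in the camera frame is offset by the current head angle and clamped to the visual field covered by the polar map. Function and parameter names are hypothetical, and the real system would also account for angular speed during the exposure.

```python
def camera_to_visual_field(az_cam_deg, head_angle_deg, fov_deg=120.0):
    """Map an azimuth (degrees) from the camera frame onto the
    body-centred visual field, given the current head angle.
    Illustrative sketch; names and the clamp policy are assumptions."""
    az_body = az_cam_deg + head_angle_deg      # shift by head orientation
    half = fov_deg / 2.0
    return max(-half, min(half, az_body))      # clamp into the mapped field
```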

Input Mapping (3) Video: example of depth map generation

Prey and Predator Tracking (1)  Shape recognition  Prey: small circle. Turn so that the circle centre lies in front of the robot; stop when sufficiently close  Predator: big circle. Turn away as fast as possible

Prey and Predator Tracking (2)  Circular Hough Transform  Left-Right Size check
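A circular Hough transform votes, for each edge pixel, for all circle centres that could have produced it. A minimal single-radius sketch follows; it is an illustration of the technique, not the project's implementation (the coarse 10° theta step and dictionary accumulator are simplifications).

```python
import math

def hough_circle(edge_points, radius, width, height):
    """Minimal circular Hough transform for one known radius.

    edge_points: iterable of (x, y) edge pixels.
    Each edge pixel votes for candidate centres lying `radius` away;
    the most-voted accumulator cell is returned as the circle centre.
    """
    acc = {}
    for (x, y) in edge_points:
        for theta_deg in range(0, 360, 10):          # coarse angular sampling
            t = math.radians(theta_deg)
            a = round(x - radius * math.cos(t))      # candidate centre x
            b = round(y - radius * math.sin(t))      # candidate centre y
            if 0 <= a < width and 0 <= b < height:
                acc[(a, b)] = acc.get((a, b), 0) + 1
    return max(acc, key=acc.get)
```

In practice one iterates over a range of radii (small circles for prey, large for predators) and keeps the best-scoring (centre, radius) pair.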

Prey and Predator Tracking (3)  Evaluate the target's expected size according to its distance and compare it with the measured size
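Under a pinhole camera model the apparent radius of a target of fixed physical size scales inversely with distance, so the check reduces to comparing the measured radius against k / distance. The constant k, the tolerance, and the function name are illustrative assumptions.

```python
def is_size_consistent(measured_radius_px, distance_m, k=50.0, tol=0.25):
    """Reject detections whose apparent size does not match the size
    expected at their stereo-estimated distance (pinhole model:
    apparent radius ~ k / distance). k and tol are illustrative."""
    expected = k / distance_m
    return abs(measured_radius_px - expected) <= tol * expected
```

This filters out false circles from the Hough stage: a "predator-sized" blob detected far away, or a tiny one detected very close, is discarded.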

Amphibot II implementation - Introduction - Battery charge influence - Obstacle avoidance: results

Introduction  Differences from Webots: Camera range: 60° instead of 120° Input: noisier Frame rate: lower Drive signal: its relation to amplitude and frequency depends critically on the environment and the hardware used

Battery charge influence  Since battery charge can neither be estimated nor measured, the world-rotation phase of the mapping must be skipped

Results Video: setup presentation, obstacle avoidance

Conclusion - Results - Further Works

Results  Stereo-vision system: effective for both obstacle avoidance and target recognition  Behavior: scalable (a joystick was added as a new behavior with minimal changes), quick, memory-inexpensive, with "natural" parameters  One architecture, many behaviors  Several parameters to trim, using "aesthetic" criteria

Further works  Can the camera-to-world mapping be improved?  How should parameter values be defined?  Could a planner be added?  How can the visual system cope with a water environment?  Could the robot's gait adapt to the type of surface?

THE END Thank you! Any questions?

Amphibot's Input Mapping  Polar map containing 19 sectors  Robot kept in place while oscillating parallel to a wall

Obstacle Avoidance Video: Dead-end detection

Prey Cornering Behavior Video: an obstacle is ignored when a prey is present (behavior feedback)

Turning vs. Reactivity  Tracking in a Webots simulation  Low Reactivity produces unnatural behavior  High Reactivity makes the robot react too slowly

Turning Radius vs. Battery charge Video: turning performance along time with constant drive and turn

Drive Signal vs. Amplitude and Frequency

Drive vs. Obstacle distance

Bonus: Hough Transform Video: circle tracking