Biologically Inspired Turn Control for Autonomous Mobile Robots
Xavier Perez-Sala, Cecilio Angulo, Sergio Escalera

Motivation

Robot navigation comprises path planning and path execution. Path execution can show unexpected behaviour, so path execution control is needed!

Overview
General path execution control: turn control
Only using: camera images + "neck" sensor
Without: artificial landmarks, egomotion
Consecutive frames → motion information
Successfully tested on Sony Aibo

Biological inspiration
Goal-oriented human motion:
– 3-4 years: head stabilization
– 4-6 years: egocentric representation of the environment
– 7-8 years: landmarks → intermediate goals
– 9-10 years: exocentric representations, anticipatory movements
Mobile robotics:
– Artificial vestibular systems: odometry, egomotion... (this work!)
– Navigation using landmarks: SLAM


Turn Control (I)
1. Initial stage
2. Head rotation to the desired angle (set point)
3. Body-head alignment (body control + head control)
4. Final orientation (body and head aligned)
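The four stages above can be sketched as a small state machine. This is our reading of the slide, not the authors' implementation; the stage names and transition conditions (`head_at_setpoint`, `body_aligned`) are illustrative.

```python
from enum import Enum, auto

class TurnStage(Enum):
    INITIAL = auto()
    HEAD_ROTATION = auto()        # head rotates to the desired angle (set point)
    BODY_HEAD_ALIGNMENT = auto()  # body control + head control run together
    FINAL = auto()                # body and head aligned

def next_stage(stage, head_at_setpoint, body_aligned):
    """Advance the turn controller one step; stay in a stage until its
    exit condition holds (conditions are our assumption)."""
    if stage is TurnStage.INITIAL:
        return TurnStage.HEAD_ROTATION
    if stage is TurnStage.HEAD_ROTATION and head_at_setpoint:
        return TurnStage.BODY_HEAD_ALIGNMENT
    if stage is TurnStage.BODY_HEAD_ALIGNMENT and body_aligned:
        return TurnStage.FINAL
    return stage
```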

Turn Control (II)
a) Body-head alignment
b) Rotation angle
c) SURF flow

Body-Head Alignment
a) Body-head alignment
b) Rotation angle
c) SURF flow

Body-Head alignment
Head control:
– Set point: maintain head orientation
– Action: "pan" angle θ
– Error: instantaneous head rotation ϕ
Body control:
– Set point: align the neck
– Action: walk velocity
– Error: "pan" angle θ
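The two coupled controllers can be sketched as simple proportional laws. The slide only names set points, actions and errors; the proportional form and the gains `k_head`, `k_body` are our assumptions.

```python
def head_control(phi, k_head=1.0):
    """Head controller: counter-rotate the pan joint to cancel the measured
    instantaneous head rotation phi, keeping the head orientation fixed."""
    return -k_head * phi            # pan-angle correction

def body_control(theta, k_body=0.5):
    """Body controller: command a walk (turning) velocity that drives the
    pan angle theta back to zero, re-aligning body and head."""
    return k_body * theta           # turn toward the head
```

In the turn, the head first rotates to the target orientation; the body controller then turns the body under the stabilized head until the pan angle vanishes.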

Rotation Angle
a) Body-head alignment
b) Rotation angle
c) SURF flow

Rotation Angle (I)
Motion field: projection of the 3-D relative velocity vectors of the scene points onto the 2-D image plane.
Pure rotations: the vectors show the image distortion due to camera rotation.

Rotation Angle (II)
Optical flow: 2-D displacements of brightness patterns in the image.
Optical flow = motion field?
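The comparison above is usually made under the brightness-constancy assumption, which yields the classical optical-flow constraint (a textbook equation, not taken from the slides):

```latex
\frac{\partial I}{\partial x}\,u + \frac{\partial I}{\partial y}\,v + \frac{\partial I}{\partial t} = 0
```

where $(u, v)$ is the image velocity at a pixel. Optical flow equals the motion field only when brightness changes are caused solely by motion.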

Rotation Angle (III)
Restriction: the rotation axis must lie in the image plane.
Robot (Sony Aibo): the rotation involves a small translation.
Assumption: pure rotation + noise in the measurement.
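Under the pure-rotation assumption, the pan angle can be recovered from point correspondences alone: each image x-coordinate maps to a bearing through the calibrated focal length, and the rotation is the mean bearing change. This is a minimal sketch under that assumption; the function name and averaging strategy are ours, not the authors' code.

```python
import math

def rotation_angle(matches, focal_px, cx):
    """Estimate the camera pan angle (radians) for a pure rotation about a
    vertical axis. `matches` holds (x1, x2) horizontal pixel coordinates of
    the same scene point before/after the motion; `focal_px` and `cx` are the
    focal length in pixels and the principal-point x (assumed calibrated).
    Positive result = scene bearings increased (camera panned the other way)."""
    angles = [math.atan2(x2 - cx, focal_px) - math.atan2(x1 - cx, focal_px)
              for x1, x2 in matches]
    return sum(angles) / len(angles)   # average out per-match noise
```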

Rotation Angle (IV)

SURF Flow
a) Body-head alignment
b) Rotation angle
c) SURF flow

SURF Flow (I)
Optical flow restrictions:
– Brightness constancy
– Temporal persistence
– Spatial coherence
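Under these three restrictions, the standard Lucas-Kanade least-squares solution recovers a flow vector from image gradients. The sketch below is the textbook single-patch version, not the authors' code, using only NumPy:

```python
import numpy as np

def lucas_kanade(I1, I2):
    """Estimate one translational flow vector (u, v) for a patch by solving
    the brightness-constancy constraint Ix*u + Iy*v + It = 0 in least squares
    over all pixels (spatial coherence: a single motion for the patch)."""
    Ix = np.gradient(I1, axis=1)            # horizontal image gradient
    Iy = np.gradient(I1, axis=0)            # vertical image gradient
    It = I2 - I1                            # temporal difference
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v
```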

SURF Flow (II)
1. Corner detection
2. SURF description & selection
3. Correspondences
4. Parallel vectors
5. Mean length & mean angle
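Steps 3-5 of the pipeline can be sketched with NumPy: nearest-neighbour matching in descriptor space, keeping only vectors roughly parallel to the dominant direction, then summarising the flow by its mean length and angle. Descriptors and keypoints are assumed to come from a detector such as SURF; the function name and the median-based parallelism test are our assumptions.

```python
import numpy as np

def surf_flow(d1, p1, d2, p2, angle_tol=0.3):
    """d1, d2: (N, k) and (M, k) descriptor arrays; p1, p2: matching (N, 2)
    and (M, 2) keypoint positions. Returns (mean length, mean angle) of the
    retained flow vectors."""
    # 3. correspondences: nearest neighbour in descriptor space
    dists = np.linalg.norm(d1[:, None, :] - d2[None, :, :], axis=2)
    nn = dists.argmin(axis=1)
    vecs = p2[nn] - p1                       # one flow vector per match
    # 4. refinement: keep vectors parallel to the dominant direction
    angles = np.arctan2(vecs[:, 1], vecs[:, 0])
    dominant = np.median(angles)
    keep = np.abs(angles - dominant) < angle_tol
    vecs, angles = vecs[keep], angles[keep]
    # 5. summarise the flow field
    return np.linalg.norm(vecs, axis=1).mean(), angles.mean()
```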

SURF Flow (III)
Correspondences:
Refinement:

Experiments
1. Rotation angle
2. Robot turning
Robot: Sony Aibo ERS-7
PC-robot processing (wireless)
Sampling time: 100 ms
Rotation measured using a zenithal camera

Rotation Angle

Maximum rotation per sampling interval = 3°

Robot turning
Specific software:
General approach:

Results

Straight Forward (I)

Straight Forward (II)

Conclusions
Only camera images and the neck sensor are used
Biologically inspired navigation
No artificial landmarks or egomotion
Results similar to the specific system for rotations > 15°
Exportable system
Problems with the wireless connection

Future work
Test on other platforms
Correct the robot trajectory using motor information
Decrease the sampling time

Thanks for your attention. Questions?