Visual Tracking on an Autonomous Self-contained Humanoid Robot
Mauro Rodrigues, Filipe Silva, Vítor Santos, University of Aveiro
CLAWAR 2008: Eleventh International Conference on Climbing and Walking Robots and the Support Technologies for Mobile Machines, 08-10 September 2008, Coimbra, Portugal

11th International Conference on Climbing and Walking Robots and the Support Technologies for Mobile Machines, University of Aveiro, September 2008

Outline
- Overview
- Objectives
- Self-Contained Platform
- Vision System
- Experimental Results
- Conclusions

Overview: Humanoid Platform
- Humanoid robot developed at the University of Aveiro
- Ambition is to participate in RoboCup
- Platform comprises 22 DOFs
- Head mounted on a pan-tilt unit (PTU)
- Up to 70 cm in height, with a mass of 6.5 kg

Overview: Distributed Control Architecture
- Master/multi-slave configuration on a CAN bus
- Central Processing Unit: image processing and visual tracking; interaction with an external computer for monitoring, debugging, or tele-operation
- Master: communication interface between the CPU and the slaves
- Slaves: interface with actuators and sensors

Objectives
- Central Processing Unit integration: computational autonomy; development environment
- Vision system development
- Visual tracking approach: detection and tracking of a moving target (ball)

Self-Contained Platform
- CPU: standard PCI-104, AMD Geode 500 MHz, 512 MB RAM, 1 GB SSD
- Video signal capture: PCMCIA FireWire board on a dual-PCMCIA PC104 module; UniBrain camera, 30 fps (640x480)
- Development environment: Linux-based, with OpenCV

Vision System

Vision System: processing pipeline
- Acquisition
- Pre-processing
- Mask
- Segmentation (H, S and V components)
- Object location
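The HSV segmentation and centroid-location steps above can be sketched in a few lines. This is a minimal illustration in Python with NumPy, not the paper's implementation; the threshold ranges and the synthetic test frame are hypothetical values chosen only to exercise the code.

```python
import numpy as np

def segment_and_locate(hsv, h_range, s_range, v_range):
    """Threshold each HSV component into a binary mask, then return the
    mask and the centroid (u, v) of the masked pixels, or None if empty."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    mask = ((h >= h_range[0]) & (h <= h_range[1]) &
            (s >= s_range[0]) & (s <= s_range[1]) &
            (v >= v_range[0]) & (v <= v_range[1]))
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return mask, None            # target not visible in this frame
    return mask, (xs.mean(), ys.mean())

# Synthetic 240x320 HSV frame with an "orange ball" patch around (200, 120)
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[100:140, 180:220] = (15, 200, 200)     # hue ~15 stands in for orange
mask, centroid = segment_and_locate(frame, (5, 25), (100, 255), (100, 255))
```

In a real pipeline the thresholds would come from a calibration step, and the centroid feeds the error vector used by the tracking controller.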

Vision System: Dynamic Region of Interest (ROI)
- Reduced noise impact
- Faster computation
(figures: tracking with ROI vs. without ROI)

Vision System: Dynamic Region of Interest (ROI)
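A dynamic ROI of this kind can be implemented by clipping a window around the last known ball centroid and restricting subsequent processing to that window. The function below is a hedged sketch; the square-window shape, half-width, and (x0, y0, x1, y1) convention are assumptions, not details given in the slides.

```python
def dynamic_roi(centroid, half, shape):
    """Return a square ROI of half-width `half` (pixels) centred on the last
    centroid (u, v), clipped to the image bounds given by shape=(rows, cols)."""
    u, v = centroid
    x0 = max(int(u) - half, 0)
    x1 = min(int(u) + half, shape[1])
    y0 = max(int(v) - half, 0)
    y1 = min(int(v) + half, shape[0])
    return x0, y0, x1, y1

# ROI fully inside a 240x320 frame, and one clipped at the image border
roi_inside = dynamic_roi((200, 120), 40, (240, 320))
roi_border = dynamic_roi((10, 230), 40, (240, 320))
```

Segmentation then runs only on `frame[y0:y1, x0:x1]`, which is what yields the reduced noise impact and faster computation claimed above.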

Vision System: Visual Tracking Approach
- Keep the target close to the image centre
- Image-based algorithm
- Fixed-gains proportional law: Δq = K e, where Δq is the joint increment vector, K a constant gain matrix, and e the error vector defined by the ball's offset from the image centre
- Variable-gains nonlinear law, in which the gains vary with the error
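As an illustration of the fixed-gains proportional law Δq = K e, the joint increment is just a matrix-vector product. The gain values and the example error below are hypothetical; the slides do not give the actual gain matrix.

```python
import numpy as np

# Hypothetical diagonal gain matrix (rad per pixel) for the pan and tilt joints
K = np.diag([0.002, 0.002])

# Error vector: the ball's offset from the image centre, in pixels
e = np.array([32.0, -10.0])

# Fixed-gains proportional law: joint increment for the next control step
dq = K @ e
```

The variable-gains law replaces the constant K with gains that depend on the error, which is what reduces the error in the robot's frontal area in the results that follow.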

Experimental Results: Self-Contained Platform
- Acquisition: libdc1394-based library; 320x240 with down-sampling: ~24 ms
- Processing: 15 ms without dynamic ROI, 11 ms with dynamic ROI
- Total: ~40 ms, i.e. 25 Hz

Times (ms):
Stage               Max.      Min.     Avg.     St. Dev.
acquisition          32.4020  11.8780  13.6820  2.0275
pyr down             25.9050   9.4730   9.8432  1.6330
segmentation         41.6030   9.3320   9.8456  2.4185
centroid location     3.2550   0.3970   1.3079  0.4478
control               0.1590   0.0140   0.0154  0.0093
actuation            37.2460   2.1670   4.4850  2.6913
total               118.6600  35.9060  39.1520  7.1468
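Per-stage statistics like those in the table (max, min, average, standard deviation) can be gathered with a small timing harness. This sketch uses Python's `time.perf_counter`; the stage function and sample count are placeholders, not the instrumentation used on the robot.

```python
import time
import statistics

def profile_stage(stage_fn, n=50):
    """Time n calls of a pipeline stage and return (max, min, avg, stdev)
    in milliseconds, mirroring the columns of the timing table."""
    samples = []
    for _ in range(n):
        t0 = time.perf_counter()
        stage_fn()
        samples.append((time.perf_counter() - t0) * 1e3)
    return (max(samples), min(samples),
            statistics.mean(samples), statistics.stdev(samples))

# Placeholder stage standing in for e.g. segmentation or centroid location
stats = profile_stage(lambda: sum(range(1000)))
```

Summing the per-stage averages gives the total cycle time, from which the achievable frame rate (here ~25 Hz) follows directly.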

Experimental Results: Visual Tracking
- Ball alignment in ~1 s
- Stationary error of ~7 pixels

Experimental Results: Visual Tracking
- Pan tracking with fixed gains
- Error increases in the frontal area of the robot
(figure: pan tracking)

Experimental Results: Visual Tracking
- Pan tracking with variable gains
- Frontal-area error reduced
(figures: fixed gains vs. variable gains)

Experimental Results: Visual Tracking
- Tilt tracking with variable gains
- Error similar to that of pan tracking
- The trunk increases the error
(figure: tilt tracking)

Experimental Results: Visual Tracking

Conclusions
- The implemented architecture separates high-level vision processing from low-level actuator control
- The dynamic Region of Interest provides greater noise immunity and faster computation
- Low location and alignment error with a stationary target, with fast convergence
- The tracking error reveals the need for a more sophisticated control scheme
- Autonomous, self-contained humanoid platform
- 25 Hz average processing rate, sufficient to deal with fast stimuli and other quickly changing visual input

Future Work
- Validate ball detection through shape detection
- Recognise other elements, such as those present in the RoboCup competition
- Explore alternative visual servoing techniques
- Study the influence of the robot's movement on the visual information and on the tracking system's performance

Thank you for your attention