Tracking a moving object with real-time obstacle avoidance. Chung-Hao Chen, Chang Cheng, David Page, Andreas Koschan and Mongi Abidi, Imaging, Robotics and Intelligent Systems Lab, The University of Tennessee.

Presentation transcript:

Tracking a moving object with real-time obstacle avoidance. Chung-Hao Chen, Chang Cheng, David Page, Andreas Koschan and Mongi Abidi, Imaging, Robotics and Intelligent Systems Lab, Department of Electrical and Computer Engineering, The University of Tennessee, Knoxville, Tennessee, USA. International Journal of Industrial Robot, Special Issue on Robot Control and Programming, Vol. 33, No. 6. Presented by: 曹憲中

Outline: Introduction, System architecture, Image Input Phase, Object Tracking Phase, Robot Control Phase, Obstacle Detection Phase, Obstacle Avoidance Phase, Experimental Results, Conclusion.

Introduction The contribution of this paper is a mobile robotic system that can simultaneously track a moving object and avoid obstacles in real time.

System architecture

Image Input Phase The Logitech web camera has a fixed view and is attached to the robotic platform. It is used to acquire 320x240 color images.
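As a rough illustration of this phase, the sketch below grabs 320x240 frames with OpenCV. The use of OpenCV and the device index are assumptions for illustration; the slides do not describe the capture software.

```python
# Minimal sketch of the image input phase, assuming OpenCV and a USB webcam on
# device index 0 (assumption); only the 320x240 color frame size is from the slides.
import cv2

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 320)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 240)

def grab_frame():
    """Return one 320x240 BGR frame, or None if the capture fails."""
    ok, frame = cap.read()
    return frame if ok else None
```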

Object Tracking Phase Lucas and Kanade's algorithm is used for tracking. The mass motion vector of the tracked object is taken as the average of the individual motion vectors, M = (1/N) * sum_{i=1}^{N} X_i, where M is the mass motion vector of the tracked object (a 1x2 matrix), X_i is the motion vector of each tracked feature (a 1x2 matrix), and N is the total number of motion vectors.
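The slide does not show the paper's own code, so the sketch below is a minimal illustration using OpenCV's pyramidal Lucas-Kanade implementation. The function name and its inputs (two grayscale frames and an array of feature points) are assumptions, not material from the paper.

```python
# Hedged sketch: Lucas-Kanade optical flow between two grayscale frames and the
# mean ("mass") motion vector M of the tracked points. Names are illustrative.
import cv2
import numpy as np

def mass_motion_vector(prev_gray, curr_gray, points):
    """points: Nx1x2 float32 array of feature locations in prev_gray."""
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   points, None)
    good = status.ravel() == 1
    X = (next_pts[good] - points[good]).reshape(-1, 2)  # per-point motion X_i
    if len(X) == 0:
        return np.zeros(2)
    return X.mean(axis=0)  # M = (1/N) * sum_i X_i
```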

Object Tracking Phase Conversion from image to 2D world coordinate system.
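The transcript omits the actual conversion equations. One common way to map pixels to 2D world coordinates when the camera is rigidly mounted, shown below purely as an assumed illustration, is a ground-plane homography calibrated offline; this is not necessarily the method used in the paper.

```python
# Hedged sketch of an image-to-2D-world conversion using a ground-plane
# homography H (3x3), calibrated offline from known reference points.
# The paper's own conversion equations are not reproduced in this transcript.
import cv2
import numpy as np

def image_to_world(u, v, H):
    """Map pixel (u, v) to 2D world coordinates on the ground plane."""
    pt = np.array([[[float(u), float(v)]]], dtype=np.float32)  # shape 1x1x2
    world = cv2.perspectiveTransform(pt, H)
    return world[0, 0]  # (x, y) in world units
```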

Robot Control Phase Conversion from image to 2D world coordinate system.

Obstacle Detection Phase The robot uses a range scanner to sense whether there is any obstacle in its projected path. If no obstacle is detected, the robot mobility phase is activated and control returns to the image input phase. Otherwise, the obstacle avoidance phase generates an alternative robot control command to avoid the obstacle.
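A minimal sketch of such a check is given below, assuming the range scanner returns (angle, distance) pairs relative to the robot; the safety distance and the corridor half-angle are illustrative values, not settings from the paper.

```python
# Hedged sketch of the obstacle detection check: flag an obstacle if any range
# reading inside a corridor around the projected path is closer than a safety
# distance. Scan format and thresholds are assumptions for illustration.
import math

SAFETY_DIST = 0.5                        # meters (assumed)
CORRIDOR_HALF_ANGLE = math.radians(20)   # half-width of the path corridor (assumed)

def obstacle_in_path(scan, heading=0.0):
    """scan: iterable of (angle_rad, distance_m) pairs relative to the robot."""
    for angle, dist in scan:
        if abs(angle - heading) <= CORRIDOR_HALF_ANGLE and dist < SAFETY_DIST:
            return True
    return False
```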

Obstacle Avoidance Phase The authors propose a new algorithm, called Dynamic Goal Potential Fields (DGPF), which is based on the traditional potential fields method, to solve this type of problem.

The DGPF algorithm is based on the following steps (a minimal potential fields sketch follows the list):
1. Using the current configuration, the goal configuration and the sensor data, it runs a basic potential fields algorithm to predict a path.
2. If the goal configuration does not change significantly, the robot follows this path to avoid any obstacle.
3. If the goal configuration moves to a new position that differs significantly from the old one, the algorithm randomly chooses some points on the predicted path and runs the basic potential fields method from these points, using the current sensor data, to compute several candidate paths.
4. The path with the lowest cost (based on Euclidean distance) is selected, and the robot follows this new path toward the new goal configuration.
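The sketch below illustrates the basic potential fields step that DGPF repeatedly calls in steps 1 and 3: an attractive force toward the goal plus repulsive forces from nearby obstacle points. The gains, influence radius, and function signature are assumptions for illustration, not values from the paper.

```python
# Hedged sketch of one basic potential fields step: attractive pull toward the
# goal plus repulsive push from obstacle points within an influence radius.
import numpy as np

K_ATT, K_REP, INFLUENCE = 1.0, 0.3, 1.0  # assumed gains / influence radius (m)

def potential_field_step(pos, goal, obstacles, step_size):
    """pos, goal: 2D numpy arrays; obstacles: list of 2D obstacle points."""
    force = K_ATT * (goal - pos)                    # attractive component
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 1e-6 < d < INFLUENCE:                    # only nearby obstacles repel
            force += K_REP * (1.0 / d - 1.0 / INFLUENCE) * diff / d**3
    norm = np.linalg.norm(force)
    return pos if norm < 1e-6 else pos + step_size * force / norm
```

Calling this function repeatedly from the current position yields the predicted path used in step 1; restarting it from randomly chosen points on that path gives the candidate paths of step 3.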

Obstacle Avoidance Phase The speed of the object is 2 m/s, and the robot step size is 12 cm.

Obstacle Avoidance Phase The speed of the object is 2 m/s, and the robot step size is 50 cm.

Obstacle Avoidance Phase A better solution is to use a dynamic step size. When the object is moving slowly, a large step size is chosen so that the robot can get around the obstacle quickly. Conversely, when the object is moving quickly, a relatively small step size is used so that the robot can adjust its path more frequently as it moves toward the new goal position.
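A minimal sketch of this idea is shown below. The speed thresholds are assumptions; the 12 cm and 50 cm endpoints simply echo the two fixed-step cases shown above.

```python
# Hedged sketch of the dynamic step-size idea: the step shrinks as the measured
# object speed grows, so a fast-moving goal is re-planned more often.
# Speed thresholds are illustrative assumptions, not the paper's settings.
def dynamic_step_size(object_speed_mps):
    if object_speed_mps < 0.5:      # slow target: take large steps
        return 0.50                 # meters
    elif object_speed_mps < 1.5:    # moderate target speed
        return 0.25
    else:                           # fast target: small, frequently adjusted steps
        return 0.12
```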

Experimental Results

The system uses two sensors: a visual camera to sense the movement of the tracked object, and a range sensor to help the robot detect and avoid obstacles in real time while it continues to track the object. The paper also presents a modified potential fields method, the DGPF method, to handle real-time obstacle avoidance during object tracking.

Thank you for your attention.