Vision-Guided Robot Position Control
SKYNET
Tony Baumgartner, Brock Shepard, Jeff Clements, Norm Pond, Nicholas Vidovich
Advisors: Dr. Juliet Hurtig & Dr. J.D. Yoder
November 12, 2003

Problem Identification
Goal: develop a vision-guided robot positioning control system.
– Current systems are limited to the Teach-Repeat style.
– On completion, the robot will automatically perform tasks specified by the software.

Budget/Purchasing
A budget of approximately $20,000 has been provided by Ohio Northern University:
– Robot
– Gripper
– Desktop computer
– Two CCD (charge-coupled device) cameras

Background Information
Robot arm types considered:
– Articulated
– SCARA (Selectively Compliant Assembly Robot Arm)

Vision Systems
– Eye-on-hand configuration
– External stereo camera configuration

Prototype
The final prototype will include:
– CCD camera(s)
– Desktop computer
– Robot control unit
– Robot
– Gripper

Specifications
Robots are measured according to two specifications:
– Repeatability
– Absolute accuracy

Object Recognition Algorithms
Normalized cross-correlation
– Slower
– Less sensitive to light
Shape-based matching
– Faster
– Similarity measure: the sum of absolute differences
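The trade-off between the two measures above can be sketched in a few lines. This is an illustrative Python sketch, not the team's C++/C# implementation; the function names and tiny patches are made up for the demonstration:

```python
import numpy as np

def sad(template, window):
    # Sum of absolute differences: 0 for a perfect match, grows with mismatch.
    return int(np.abs(template.astype(int) - window.astype(int)).sum())

def ncc(template, window):
    # Normalized cross-correlation: near 1.0 for a match. Subtracting the
    # mean and dividing by the norms makes it insensitive to brightness.
    t = template - template.mean()
    w = window - window.mean()
    return float((t * w).sum() / (np.linalg.norm(t) * np.linalg.norm(w)))

patch = np.array([[10.0, 20.0], [30.0, 40.0]])
brighter = patch + 50.0  # same scene under stronger lighting

print(sad(patch, brighter))             # 200: SAD is fooled by the lighting change
print(round(ncc(patch, brighter), 3))   # 1.0: NCC is not
```

This is why the slide pairs the cheap SAD measure with shape-based matching, while NCC buys lighting robustness at extra cost.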

Difference Algorithm Visuals

Design Deliverables (Target Dates)
Purchasing
– Robot: 09/12/ /21/03
– Gripper: 09/12/ /21/03
Image Processing
– Camera pan/tilt/zoom: 09/30/ /20/04
– CAD: 11/03/ /09/04
– Difference algorithm: 11/04/ /19/03
– Object recognition: 01/09/ /02/04
– User interface: 02/02/ /20/04
General
– Controller integration: 12/01/ /02/04
– Gripper implementation: 12/01/ /02/04
– Testing: 03/08/ /01/04

Overall Block Diagram
– Controller / CPU / Robot / Gripper
– Cameras: pan, tilt, zoom
– Image processing: C++/C#, UI
– 3D model: laser, virtual CAD
– Algorithm

Software Block Diagram
1. Capture initial image
2. Capture current image
3. Compare using the difference algorithm
4. Ready? If not, return to step 2
5. Identify object / find coordinates
6. Send instructions
7. Controller commands robot
8. Check actions: done? If not, return to step 2
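Read as a loop, the diagram above might be sketched like this. A hypothetical Python skeleton only: every callable is a placeholder for the camera, vision, or controller piece, none of it is actual project code:

```python
def run_cycle(capture, difference, is_ready, identify, send_instructions, is_done):
    # Mirrors the block diagram: take one reference frame, then run a
    # capture/compare loop that only acts once the scene is ready.
    initial = capture()
    while True:
        current = capture()
        changed = difference(initial, current)
        if not is_ready(changed):
            continue  # back to "Capture current image"
        obj, coords = identify(current)
        send_instructions(obj, coords)  # controller commands the robot
        if is_done():
            return obj, coords
```

With stub callables standing in for the hardware, the loop idles while nothing changes and terminates once the controller reports the action complete.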

Robot Decision Matrix
Rating scale:
0 – Unacceptable
1 – Acceptable
2 – Average
3 – Good
4 – Excellent
5 – Best case

QUESTIONS ?

References
– ABB Product Specification Sheet (2003).
– Lin, C.T., Tsai, D.M. (2002). Fast normalized cross correlation for defect detection. Machine Vision, Yuan-Ze University, 1-5.
– Phil Baratti, "Robot Precision" (personal communication, November ).
– Steger, C. Similarity measures for occlusion, clutter, and illumination invariant object recognition. In: B. Radig and S. Florczyk (eds.), Mustererkennung 2001, Springer, München.
– Steger, C., Ulrich, M. Performance Evaluation of 2D Object Recognition Techniques. Technical Report, Technische Universität München.
– Robots and Manufacturing Automation.

Software Flow Description
Take Initial Picture:
– Initialize cameras
– Capture image
– Store image to hard disk or in RAM (will depend on speed)
Take Current Picture:
– Start the current image capture loop
– Capture image and store in memory

Software Flow Description
Compare Current Picture With Initial:
– Use the difference algorithm to compare the initial and current images; this yields a bitmap of the pixels that have changed, based on a specified error factor
– Store the difference image in memory
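The change bitmap described above amounts to a thresholded absolute difference. An illustrative Python/NumPy sketch; `error_factor` is a made-up name for the slide's "specified error factor":

```python
import numpy as np

def difference_bitmap(initial, current, error_factor=10):
    # True wherever a pixel's intensity changed by more than the error
    # factor, which absorbs sensor noise between frames.
    return np.abs(current.astype(int) - initial.astype(int)) > error_factor

initial = np.zeros((4, 4), dtype=np.uint8)
current = initial.copy()
current[1, 2] = 80  # an object edge enters the scene

mask = difference_bitmap(initial, current)
print(int(mask.sum()))  # 1 changed pixel
```

Casting to a signed integer type before subtracting avoids the wrap-around that unsigned 8-bit pixel arithmetic would otherwise produce.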

Software Flow Description
Ready to Manipulate Object?:
– Check the difference image to see whether an object has entered the area, using a Boolean to keep track
– If an object is already present, check whether it has stopped moving
– If it is done moving, act on the object; otherwise return to the 'Take Current Picture' block
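The Boolean bookkeeping above is a tiny state machine: remember that something entered the area, then wait until consecutive frames stop differing. A hypothetical sketch with illustrative names, not the project's code:

```python
def object_ready(state, changed_pixels):
    # state["present"] is the Boolean tracking whether an object has
    # entered the area. Ready only once an object is present AND the
    # latest difference image shows no further motion.
    ready = state["present"] and changed_pixels == 0
    if changed_pixels > 0:
        state["present"] = True
    return ready

state = {"present": False}
print(object_ready(state, 0))    # False: nothing has entered yet
print(object_ready(state, 640))  # False: object entering, still moving
print(object_ready(state, 0))    # True: present and settled
```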

Software Flow Description
Identify and Find Object Coordinates:
– Take pictures of the scene using both cameras focused on the object
– Compare the images received from the cameras using the stereo vision algorithm to determine the object type and relevant points relative to the position of the robot arm
Send Coordinates & Instructions to Robot Controller:
– Communicate with the robot controller, sending incremental instructions based on the object type and coordinates
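One way a stereo step like this recovers 3D position: for a point matched in both camera images, depth follows from the disparity between its horizontal coordinates. A minimal sketch assuming an idealized rectified parallel-camera rig; the focal length and baseline values are invented for illustration:

```python
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    # Z = f * B / d: depth is inversely proportional to disparity d,
    # the horizontal offset of the same point between the two images.
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    return focal_px * baseline_m / disparity

# A feature at column 420 in the left image and 400 in the right,
# with an 800 px focal length and a 10 cm camera baseline:
print(stereo_depth(420, 400, 800, 0.10))  # 4.0 metres away
```

Note the sensitivity this implies: halving the disparity doubles the estimated depth, so accurate matching matters most for distant objects.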

Software Flow Description
Use Stereo Vision Algorithm to Verify Desired Actions:
– Take pictures of the scene using both cameras focused on the object
– Compare the images received from the cameras using the stereo vision algorithm to determine whether the robot arm has moved correctly
– After adjusting for mistakes by the robot, send more coordinates and instructions to the robot controller until it has performed the correct task