
1 Vision-Guided Robot Position Control
SKYNET: Tony Baumgartner, Brock Shepard, Jeff Clements, Norm Pond, Nicholas Vidovich
Advisors: Dr. Juliet Hurtig & Dr. J.D. Yoder
November 12, 2003

2 Problem Identification
Goal: Develop a vision-guided robot position control system.
– Current systems are limited to the teach-and-repeat style.
– On completion, the robot will automatically perform tasks specified by the software.

3 Budget/Purchasing
A budget of approximately $20,000 has been provided by Ohio Northern University:
– Robot
– Gripper
– Desktop computer
– Two CCD (charge-coupled device) cameras

4 Background Information
– Articulated
– SCARA (Selectively Compliant Assembly Robot Arm)

5 Vision Systems
– Eye-on-Hand Configuration
– External Stereo Camera Configuration

6 Prototype
The final prototype will include:
– CCD camera(s)
– Desktop computer
– Robot Control Unit
– Robot
– Gripper

7 Specifications
Robots are measured according to two specifications:
– Repeatability
– Absolute Accuracy
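The difference between the two can be made concrete with a small numerical sketch; the C++ below uses made-up measurements, not project data. Repeatability describes how tightly the robot returns to the same taught point, while absolute accuracy describes how close that cluster of points lies to the commanded coordinate.

#include <cmath>
#include <cstdio>
#include <vector>

struct Point { double x, y, z; };

int main() {
    const Point commanded{100.0, 50.0, 25.0};           // commanded point in mm (made-up)
    const std::vector<Point> reached = {                 // measured returns to that point (made-up)
        {100.4, 49.8, 25.1}, {100.5, 49.7, 25.2}, {100.3, 49.9, 25.0}};

    Point mean{0.0, 0.0, 0.0};
    for (const Point& p : reached) { mean.x += p.x; mean.y += p.y; mean.z += p.z; }
    mean.x /= reached.size(); mean.y /= reached.size(); mean.z /= reached.size();

    // Absolute accuracy: how far the average reached position is from the commanded point.
    double accuracy = std::sqrt((mean.x - commanded.x) * (mean.x - commanded.x) +
                                (mean.y - commanded.y) * (mean.y - commanded.y) +
                                (mean.z - commanded.z) * (mean.z - commanded.z));

    // Repeatability: average spread of the reached positions around their own mean.
    double repeatability = 0.0;
    for (const Point& p : reached)
        repeatability += std::sqrt((p.x - mean.x) * (p.x - mean.x) +
                                   (p.y - mean.y) * (p.y - mean.y) +
                                   (p.z - mean.z) * (p.z - mean.z));
    repeatability /= reached.size();

    std::printf("absolute accuracy: %.3f mm, repeatability: %.3f mm\n",
                accuracy, repeatability);
    return 0;
}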

8 Object Recognition Algorithms
Normalized Cross-Correlation:
– Slower
– Less sensitive to light
Shape-Based Matching:
– Faster
– Similarity measure: the sum of absolute differences
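As an illustration of the sum-of-absolute-differences measure, the C++ sketch below slides a small template over a grayscale search image and keeps the offset with the lowest total pixel difference. The row-major 8-bit image layout and the function names are assumptions for the example, not the project's actual implementation.

#include <climits>
#include <cstdlib>
#include <vector>

// 8-bit grayscale image in row-major order (assumed layout for this sketch).
struct GrayImage { int width, height; std::vector<unsigned char> pixels; };

// Return the top-left offset where the template matches the image best,
// i.e. where the sum of absolute pixel differences (SAD) is smallest.
void bestMatch(const GrayImage& image, const GrayImage& tmpl, int& bestX, int& bestY) {
    long bestSad = LONG_MAX;
    bestX = bestY = 0;
    for (int y = 0; y + tmpl.height <= image.height; ++y) {
        for (int x = 0; x + tmpl.width <= image.width; ++x) {
            long sad = 0;
            for (int ty = 0; ty < tmpl.height; ++ty)
                for (int tx = 0; tx < tmpl.width; ++tx) {
                    int a = image.pixels[(y + ty) * image.width + (x + tx)];
                    int b = tmpl.pixels[ty * tmpl.width + tx];
                    sad += std::abs(a - b);
                }
            if (sad < bestSad) { bestSad = sad; bestX = x; bestY = y; }
        }
    }
}

Normalized cross-correlation replaces the inner sum with a correlation score normalized by the local image and template intensities, which is what makes it less sensitive to lighting changes at the cost of extra computation.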

9 Difference Algorithm Visuals

10

11 Design Deliverables (target dates)
Purchasing:
– Robot: 09/12/03 - 11/21/03
– Gripper: 09/12/03 - 11/21/03
Image Processing:
– Camera pan/tilt/zoom: 09/30/03 - 02/20/04
– CAD: 11/03/03 - 01/09/04
– Difference Algorithm: 11/04/03 - 12/19/03
– Object Recognition: 01/09/04 - 02/02/04
– User Interface: 02/02/04 - 02/20/04
General:
– Controller Integration: 12/01/03 - 02/02/04
– Gripper Implementation: 12/01/03 - 02/02/04
– Testing: 03/08/04 - 04/01/04

12 Overall Block Diagram
Components shown in the diagram:
– Controller
– CPU
– Robot
– Gripper
– Cameras (pan, tilt, zoom)
– Image Processing (C++/C#)
– UI
– 3D Model (laser / virtual CAD)
– Algorithm

13 Software Block Diagram
Capture Initial → Capture Current → Compare Using Difference → Ready? → Identify Object / Find Coordinates → Send Instructions → Controller Commands Robot → Check Actions → Done?
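A minimal C++ sketch of that loop is shown below; every type and function here is a placeholder standing in for a project component (camera capture, image processing, controller link), not an existing API.

#include <cstdio>

// Placeholder types and stubs so the loop structure compiles on its own;
// the real system would supply the camera, image-processing, and
// robot-controller implementations.
struct Image {};
struct Pose { double x = 0, y = 0, z = 0; };

Image captureImage() { return Image{}; }
Image difference(const Image&, const Image&) { return Image{}; }
bool readyToManipulate(const Image&) { return true; }
Pose identifyAndLocate(const Image&) { return Pose{}; }
void sendInstructions(const Pose& p) { std::printf("move to (%.1f, %.1f, %.1f)\n", p.x, p.y, p.z); }
bool actionsComplete() { return true; }

int main() {
    Image initial = captureImage();                     // Capture Initial
    for (;;) {
        Image current = captureImage();                 // Capture Current
        Image diff = difference(initial, current);      // Compare Using Difference
        if (!readyToManipulate(diff)) continue;         // Ready?
        Pose target = identifyAndLocate(current);       // Identify Object / Find Coordinates
        sendInstructions(target);                       // Send Instructions; Controller Commands Robot
        if (actionsComplete()) break;                   // Check Actions; Done?
    }
    return 0;
}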

14 Robot Decision Matrix
Scoring scale: 0 - Unacceptable, 1 - Acceptable, 2 - Average, 3 - Good, 4 - Excellent, 5 - Best Case

15 QUESTIONS?

16 References
– ABB Product Specification Sheet (2003).
– Lin, C.T., Tsai, D.M. (2002). Fast normalized cross correlation for defect detection. Machine Vision, Yuan-Ze University, 1-5.
– Phil Baratti, "Robot Precision" (personal communication, November 4, 2003).
– Steger, C. (2001). Similarity measures for occlusion, clutter, and illumination invariant object recognition. In: B. Radig and S. Florczyk (eds), Mustererkennung 2001, Springer, München, pp. 145-154.
– Steger, C., Ulrich, M. (2002). Performance Evaluation of 2D Object Recognition Techniques. Technical Report, Technische Universität München, 1-15.
– Robots and Manufacturing Automation, pp. 220-222.
– http://www.prip.tuwien.ac.at/Research/RobotVision/vs.html

17 Software Flow Description
Take Initial Picture:
– Initialize cameras
– Capture image
– Store image to hard disk or in RAM (will depend on speed)
Take Current Picture:
– Start the current-image capture loop
– Capture image and store in memory

18 Software Flow Description
Compare Current Picture With Initial …:
– Use the difference algorithm to compare the initial and current images; this yields a bitmap of the pixels that have changed, based on a specified error factor
– Store the difference image in memory
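A minimal sketch of that step, assuming 8-bit grayscale images stored as flat arrays (the layout and names are illustrative, not the project's code):

#include <cstdlib>
#include <vector>

// Mark every pixel whose intensity differs from the initial scene by more
// than the specified error factor; the result is the changed-pixel bitmap.
std::vector<bool> differenceMask(const std::vector<unsigned char>& initial,
                                 const std::vector<unsigned char>& current,
                                 int errorFactor) {
    std::vector<bool> changed(initial.size(), false);
    for (std::size_t i = 0; i < initial.size() && i < current.size(); ++i)
        changed[i] = std::abs(initial[i] - current[i]) > errorFactor;
    return changed;
}

Keeping only the bitmap of changed pixels, rather than a full second image, makes the later "has anything entered the area" check a simple count.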

19 Software Flow Description
Ready to Manipulate Object?:
– Check the difference image to see whether an object has entered the area (a Boolean tracks object presence)
– If an object is already present, check whether it has stopped moving
– If it is done moving, act on the object; otherwise return to the 'Take Current Picture' block
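One way to implement that check, sketched in C++ with illustrative thresholds (the real values would come from testing): an object counts as present when enough pixels differ from the initial scene, and as stopped when the count of changed pixels stays roughly constant between consecutive frames.

#include <cstddef>
#include <vector>

class ReadinessCheck {
public:
    // Feed in the changed-pixel bitmap for each new frame; returns true once
    // an object is present and has stopped moving.
    bool update(const std::vector<bool>& changedMask) {
        std::size_t count = 0;
        for (bool c : changedMask)
            if (c) ++count;

        bool present = count > kPresencePixels;
        std::size_t delta = count > lastCount_ ? count - lastCount_ : lastCount_ - count;
        bool stopped = objectPresent_ && delta < kMotionPixels;  // stable between frames

        objectPresent_ = present;   // the Boolean tracking object presence
        lastCount_ = count;
        return present && stopped;  // true: act on object; false: take another picture
    }

private:
    static const std::size_t kPresencePixels = 500;  // illustrative threshold
    static const std::size_t kMotionPixels = 50;     // illustrative threshold
    bool objectPresent_ = false;
    std::size_t lastCount_ = 0;
};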

20 Software Flow Description
Identify and Find Object Coordinates …:
– Take pictures of the scene with both cameras focused on the object
– Compare the images from the two cameras using the Stereo Vision Algorithm to determine the object type and the relevant points relative to the position of the robot arm
Send Coordinates & Instructions to Robot Controller:
– Communicate with the robot controller, sending incremental instructions based on the object type and coordinates
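For the coordinate-finding step, a textbook stereo sketch in C++ is shown below, assuming two parallel, calibrated cameras; the focal length, baseline, and pixel coordinates are made-up numbers, and pixel coordinates are measured from the image center.

#include <cstdio>

struct Point3 { double x, y, z; };

// Parallel-camera stereo: depth follows from the disparity between the
// object's horizontal pixel position in the left and right images
// (Z = f * B / d), and X, Y follow from the pinhole projection model.
// Pixel coordinates are relative to the image center (principal point).
Point3 triangulate(double xLeft, double xRight, double yLeft,
                   double focalLengthPx, double baselineMm) {
    double disparity = xLeft - xRight;                  // pixels
    double z = focalLengthPx * baselineMm / disparity;  // mm
    double x = xLeft * z / focalLengthPx;               // mm
    double y = yLeft * z / focalLengthPx;               // mm
    return Point3{x, y, z};
}

int main() {
    // Made-up example: 800 px focal length, 120 mm baseline, 20 px disparity.
    Point3 p = triangulate(32.0, 12.0, -5.0, 800.0, 120.0);
    std::printf("object at (%.1f, %.1f, %.1f) mm in the left-camera frame\n",
                p.x, p.y, p.z);
    return 0;
}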

21 Software Flow Description
Use Stereo Vision Algorithm to Make Sure Desired Actions …:
– Take pictures of the scene with both cameras focused on the object
– Compare the images from the two cameras using the Stereo Vision Algorithm to determine whether the robot arm has moved correctly
– After adjusting for any mistakes by the robot, send further coordinates and instructions to the robot controller until it has performed the correct task
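That verify-and-correct behavior can be sketched as a bounded correction loop; measurePosition and sendIncrementalMove below are stand-ins for the stereo measurement and the controller communication, not real project functions, and the tolerance is illustrative.

#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

// Stand-in: in the real system this would come from the stereo vision algorithm.
Vec3 measurePosition() { return Vec3{99.2, 50.4, 25.1}; }

// Stand-in: in the real system this would be an incremental command to the controller.
void sendIncrementalMove(const Vec3& d) {
    std::printf("correct by (%.2f, %.2f, %.2f) mm\n", d.x, d.y, d.z);
}

int main() {
    const Vec3 desired{100.0, 50.0, 25.0};
    const double toleranceMm = 0.5;                       // illustrative tolerance

    for (int attempt = 0; attempt < 10; ++attempt) {      // bounded number of corrections
        Vec3 actual = measurePosition();                  // check the arm with the cameras
        Vec3 error{desired.x - actual.x, desired.y - actual.y, desired.z - actual.z};
        double distance = std::sqrt(error.x * error.x + error.y * error.y + error.z * error.z);
        if (distance <= toleranceMm) break;               // task performed correctly
        sendIncrementalMove(error);                       // adjust for the robot's mistake
    }
    return 0;
}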

