Video Imaged Spatial Positioning Project (VISPS): Presentation Transcript

VISPS Overview
- Capture visual data from two cameras
- Find the laser marker
- Triangulate to find the X, Y, Z coordinates of the laser
- Output the coordinates over RS-232 serial

Talk Overview
- VISPS goals and requirements – Cephas
- Design (theory and architecture) – Steve
- Design (details and communication) – Cephas
- Design analysis – Kevin
- Issues and testing – Kevin
- Schedule – Cephas
- Conclusion – Cephas

VISPS Purpose
Goal: produce the X, Y, Z coordinates of a laser marker relative to a known reference frame.
Required components:
- Red laser pointer
- Two RC-2 black-and-white cameras
- XSV300 development board
- PIC microcontroller

High-Level Design Goals
- Produce X, Y, Z from visual data
- Simple interface, easy to use
- General and accurate
- Design headroom

Requirements (Table 3: VISPS operating parameters)
Parameter | Value
Coordinate accuracy | At 63 inches, the coordinates will be within 0.1 inches of the true location
Data output | RS-232 protocol (19.2 kbps, 8 data bits, 1 stop bit, no parity)
Voltage | Requires 9 V and 12 V supplies
Maximum operating distance | 60 feet
Laser flare wavelength | nm
Laser flare max output | < 1 mW
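
The data-output row fully specifies the serial framing, so any standard serial terminal can capture the coordinate stream; the team viewed the output in Microsoft HyperTerminal and planned a Win32 demo application. Purely as an illustration of those settings, here is a minimal host-side reader sketch for a POSIX system; the /dev/ttyS0 device path is an assumption and this code is not part of VISPS itself.

    /* Minimal host-side reader for the VISPS coordinate stream, using the
     * serial settings from the table above: 19.2 kbps, 8 data bits, 1 stop
     * bit, no parity.  The /dev/ttyS0 path is an assumption; this is an
     * illustration only, not part of the VISPS deliverables. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <termios.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("/dev/ttyS0", O_RDONLY | O_NOCTTY);   /* assumed port */
        if (fd < 0) { perror("open"); return 1; }

        struct termios tio = {0};
        cfmakeraw(&tio);               /* raw 8-bit stream, no line editing */
        cfsetispeed(&tio, B19200);     /* 19.2 kbps                         */
        cfsetospeed(&tio, B19200);
        tio.c_cflag |= CLOCAL | CREAD; /* cfmakeraw already leaves CS8, no  */
                                       /* parity, and one stop bit (8N1)    */
        tcsetattr(fd, TCSANOW, &tio);

        char buf[128];
        ssize_t n;
        while ((n = read(fd, buf, sizeof buf)) > 0)
            fwrite(buf, 1, (size_t)n, stdout);  /* echo the coordinate data */

        close(fd);
        return 0;
    }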

Constraints From Requirements
- Code size on the PIC µC must be under 20 kB
- The XSV300 must have enough pins for two camera inputs plus communication with the PIC µC

How Are the Design Goals Achieved?
- Mathematics and theory
- Logical implementation and architecture
- Implementation details and subsystem communication

Triangulation Problem Summary (figure: side view and top view)

Visual Data (figure: left camera and right camera views). The pixel coordinates of the marker are available in each image.

Camera Calibration

Pixel Distances: h is a constant 340 pixels at any distance

Angle Derivation From Pixels

Triangulation By Pixels

Math and Theory Summary
- Y and Z are found similarly to X
- X, Y, and Z are expressed directly in terms of pixel coordinates
- No trigonometry or complex math
- No multiplication of error
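
The angle-derivation and triangulation slides are figures, so the exact formulas do not survive in the transcript. As a point of reference only, the sketch below uses the standard parallel-camera disparity model, which is consistent with the stated properties (parallel cameras separated by a baseline, the fixed 340-pixel constant h, no trigonometry, no multiplication of error). H_PIXELS comes from the calibration slide, while BASELINE and the sample pixel coordinates are made-up values, and the actual VISPS equations may differ in detail.

    /* Sketch of pixel-based triangulation under the standard parallel-camera
     * disparity model (an assumption; the actual VISPS formulas are on the
     * figure slides and may differ in detail).
     *   h      - the 340-pixel constant from the calibration slide
     *   xl, xr - horizontal pixel coordinates of the marker, measured from
     *            each image's center
     *   y      - vertical pixel coordinate of the marker                   */
    #include <stdio.h>

    #define H_PIXELS 340.0   /* calibration constant from the slides */
    #define BASELINE 12.0    /* assumed camera spacing in inches     */

    static void triangulate(double xl, double xr, double y,
                            double *X, double *Y, double *Z)
    {
        double disparity = xl - xr;           /* shrinks as distance grows    */
        *Z = H_PIXELS * BASELINE / disparity; /* depth from similar triangles */
        *X = (*Z) * xl / H_PIXELS;            /* lateral offset               */
        *Y = (*Z) * y  / H_PIXELS;            /* height offset                */
    }

    int main(void)
    {
        double X, Y, Z;
        triangulate(40.0, -25.0, 10.0, &X, &Y, &Z);   /* made-up pixel data */
        printf("X=%.1f Y=%.1f Z=%.1f (inches)\n", X, Y, Z);
        return 0;
    }

Because each coordinate is a single ratio of pixel quantities, a pixel-level error enters each result only once, which is presumably what "no multiplication of error" refers to.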

Design Implementation
- Goals: output points at 30 fps, preserve design headroom
- Black-box design outline
- Component design specifics and I/O details

VISPS Components
- RoboCam RC-2 cameras: deliver pixel data
- Generic red filters: remove non-laser light
- Generic laser pointer: projects the marker
- XSV300: locates the marker in the pixel data
- PIC µC: computes angles from the pixel data, computes X, Y, Z from the angles, and outputs the coordinates using the RS-232 protocol
- RS-232: coordinate output

System Pin-outs

Finding the Center of the Marker
- Record the first saturated point and the last saturated point
- The marker center is the average of their coordinates
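
The slide describes the marker center as the average of the first and last saturated points. In VISPS this search runs inside the XSV300 FPGA; the C sketch below only restates the rule for clarity, and the frame size and SAT_THRESHOLD values are assumptions.

    /* Restatement of the marker-centering rule in C.  The frame size and
     * SAT_THRESHOLD below are assumptions used only for illustration. */
    #include <stdint.h>

    #define WIDTH  320              /* assumed image width            */
    #define HEIGHT 240              /* assumed image height           */
    #define SAT_THRESHOLD 250       /* assumed 8-bit saturation level */

    /* Returns 1 and writes the marker center if any saturated pixel exists. */
    static int find_marker(const uint8_t frame[HEIGHT][WIDTH], int *cx, int *cy)
    {
        int first_x = -1, first_y = -1, last_x = -1, last_y = -1;

        for (int y = 0; y < HEIGHT; y++) {
            for (int x = 0; x < WIDTH; x++) {
                if (frame[y][x] >= SAT_THRESHOLD) {
                    if (first_x < 0) { first_x = x; first_y = y; }
                    last_x = x;
                    last_y = y;
                }
            }
        }
        if (first_x < 0)
            return 0;                      /* no saturated pixels found */

        *cx = (first_x + last_x) / 2;      /* average of coordinates    */
        *cy = (first_y + last_y) / 2;
        return 1;
    }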

XSV300-to-PIC Protocol
- Handshake signals: NEW, READY, BUSY
- Data bytes: C1hi, C1lo, C2hi, C2lo, Rhi, Rlo
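
The slide names the handshake signals and the byte order but not the exact sequencing. The sketch below shows one plausible PIC-side read loop: NEW is taken to flag a fresh result, BUSY marks the PIC as reading, and READY paces each byte. The helper functions and the interpretation of the bytes as two camera columns and a row are assumptions, not the actual VISPS firmware.

    /* Hypothetical PIC-side read of the XSV300 protocol.  The sequencing and
     * the read_signal()/write_signal()/read_data_bus() helpers are
     * illustrative assumptions. */
    #include <stdint.h>

    extern int     read_signal(int line);          /* sample a handshake line  */
    extern void    write_signal(int line, int v);  /* drive a handshake line   */
    extern uint8_t read_data_bus(void);            /* latch the 8-bit data bus */

    enum { SIG_NEW, SIG_READY, SIG_BUSY };

    void read_marker_coordinates(uint16_t *col1, uint16_t *col2, uint16_t *row)
    {
        uint8_t b[6];

        while (!read_signal(SIG_NEW))        /* wait for a fresh frame result */
            ;
        write_signal(SIG_BUSY, 1);           /* tell the FPGA we are reading  */

        for (int i = 0; i < 6; i++) {
            while (!read_signal(SIG_READY))  /* FPGA paces each byte          */
                ;
            b[i] = read_data_bus();
            while (read_signal(SIG_READY))   /* wait for READY to drop        */
                ;
        }
        write_signal(SIG_BUSY, 0);

        *col1 = ((uint16_t)b[0] << 8) | b[1];   /* C1hi, C1lo */
        *col2 = ((uint16_t)b[2] << 8) | b[3];   /* C2hi, C2lo */
        *row  = ((uint16_t)b[4] << 8) | b[5];   /* Rhi,  Rlo  */
    }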

Design Analysis
- Goals met
- Requirements satisfied
- Error analysis
- Lab results

Goals Met
- The design will correctly produce X, Y, and Z coordinates.
- We managed to avoid solutions that would have made using VISPS overly cumbersome or complicated.
- VISPS can output points at a full 30 points per second, the same rate as the cameras.
- Significant upgrades to VISPS accuracy from higher-resolution cameras would not require a major design change.

Requirements Satisfied
- Reworking our math enabled us to fit all the needed code on the PIC µC.
- With the current cameras, VISPS is accurate to within 0.1 inches at 60 inches.
- Moving from the XS40 to the XSV300 gave us the extra I/O pins we needed.

Error Sources and Analysis
- Miscalibration of cameras
- Camera alignment
- Camera leveling
- Camera resolution
- Output resolution

Miscalibration of Cameras
- All our mathematics is based on initial measurements at Z = 160 cm.
- Systematic error is introduced in Z if h does not really equal 340 pixels.
- Spend extra time taking careful measurements.
- The integer value obtained for h is encouraging.
- Do sanity checks against manual measurements.
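
One way to bound this, assuming the disparity model sketched earlier: Z scales linearly with h, so the relative depth error equals the relative error in h. A miscalibration of one pixel out of 340 is about 0.3%, or roughly 0.2 inches at the 63-inch accuracy point, which is why careful calibration and sanity checks against manual measurements are cheap safeguards.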

Camera Alignment
- All our mathematics is based on the assumption that the two cameras are perfectly parallel.
- Systematic error is introduced if the cameras are not parallel.
- Check alignment by measuring D (the camera separation) at a distance.

Camera Leveling
- To know the reference frame for the X, Y, Z values produced, the camera assembly must be perfectly leveled.
- Use bubble levels mounted on the camera assembly.
- Do sanity checks against manual measurements.

Camera Resolution
- The area covered by one pixel increases with distance.
- The error equals ½ pixel width, which grows as a function of Z.
- This error is inherent in VISPS and cannot be reduced except by increasing camera resolution.
- Not bad: even at 340 ft, the error is only 5.5 inches.
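
For scale, under the same assumed pinhole model, half a pixel of image error corresponds to a lateral error of roughly Z / (2 x 340), growing linearly with distance; at 340 ft (4,080 inches) that works out to about 6 inches, the same order as the 5.5-inch figure quoted on the slide.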

Output Resolution
- Output is in tenths of an inch.
- Not a factor after 63 inches.
- This error is inherent in VISPS.
- The error can be reduced by reducing the maximum output distance.
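
A plausible reading of the 63-inch figure, again using the assumed half-pixel model: at Z = 63 inches, half a pixel already corresponds to about 63 / 680, roughly 0.09 inches, so beyond that distance the camera's geometric error exceeds the 0.1-inch output quantization and the output resolution stops being the limiting error source.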

Lab Results
- Cameras fitted with the red filter produce good, noise-free data.
- Able to find the laser marker quickly, accurately, and consistently.
- PIC µC output shows that the math is correct.
- Unable to test system accuracy until the parts are integrated.

Problem Areas and Testing
- Resolved issues
- Remaining issues
- Reviewers' comments
- Testing

Resolved Issues
- Noise filtering
- Interfering light sources
- Complex math and PIC µC memory constraints

Remaining Issues
- Porting the FPGA design to the XSV300
- System accuracy

Reviewers' Comments
Problem | Their suggestion | What we did
Single-pixel noise in camera data | 9- or 3-pixel median filter | 2-pixel neighbor intensity check; noise is not saturated
Interfering light sources | Strobe the laser using a spinning wheel and take the difference | Red filter to block non-laser light
Jittery or unstable coordinates | Average the laser location over 4 frames for anti-jitter | Nothing; we want the jitter
Limited memory of the XS40 | Move the FPGA design to the XSV300 | Moved the FPGA design to the XSV300
Floating-point math bloat | Use a LUT | Reworked the math to avoid trigonometry
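
The single-pixel-noise row is the only place the transcript describes the filter, so the reading below is an assumption: a pixel counts as marker data only when it and its immediate neighbor are both at the saturation level, which discards isolated noise (which, per the table, does not reach saturation). In VISPS this check lives in the FPGA pipeline; the C and the SAT_THRESHOLD value are purely illustrative.

    /* One reading of the "2 pixel neighbor intensity check": accept a pixel
     * only when it and the previous pixel on the scanline are both at the
     * saturation level, rejecting isolated single-pixel noise. */
    #include <stdint.h>

    #define SAT_THRESHOLD 250   /* assumed 8-bit saturation level */

    static int is_marker_pixel(const uint8_t *line, int x)
    {
        if (x == 0)
            return 0;
        return line[x] >= SAT_THRESHOLD && line[x - 1] >= SAT_THRESHOLD;
    }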

Testing
- Do lots of sanity checks against manual measurements.
- Test the validity of the math theory.
- Visually inspect the VGA output of the laser-finding module and the camera perspective.
- Use a logic analyzer to check the FPGA output.
- Use an extra PIC µC test environment for the PIC µC.
- View serial output in Microsoft HyperTerminal.
- Output using printf() to test X, Y, Z production independently of the coordinate output protocol.

Current Status and Roadmap
Completed:
- Finding the laser in the visual data
- Module to talk to the PIC µC
- PIC µC math
- PIC µC, FPGA, and RS-232 I/O
Remaining:
- Port the FPGA design to the XSV300
- Integrate components and test
- Win32 app for the demo
Schedule:
- Week 8: port the FPGA design to the XSV300
- Week 9: integrate components, test, and debug
- Week 10: more debugging, testing, and the demo

Conclusion and Summary
- Our design will work.
- The design goals are met.