Juan Guzman ASU Mentor: Shea Ferring

Presentation transcript:

Juan Guzman ASU Mentor: Shea Ferring

About Me Name: Juan Guzman Computer Science major ASU senior (graduating December 2012) Prior internship with Intel testing validation software Favorite quote: “Computers make very fast, very accurate mistakes” – Anonymous

Outline AUVSI Competition Overview Tasks to complete autonomously What is Autonomy? Methods OpenCV Line Detection and Buoy Detection Demo ROS Conclusion

AUVSI RoboSub Competition July 17-22 in San Diego, CA. Tasks to complete:
- Follow multiple paths marked by lines
- Touch two buoys in order by color
- Pass over a PVC gate
- Pick up multiple objects and drop them into a bin
- Shoot a projectile at a target
- Pick up small cylinders from cutouts
The robot will need to do all of this autonomously.

What do we mean by Autonomous? “Acting independently or having the freedom to do so.” In previous competitions, robots were tethered to a power source and controlled by human drivers. In this case, the robot cannot have any human input while completing tasks; every action is taken of its own accord. The robot will use sensor data and video input to deduce its next logical steps.

Methods - How do we build one of these? What do we need to build an autonomous robot?
- Hardware - in addition to the usual sensors and rotors of previous tethered robots
- Software and programming

Hardware This differs from previous robotics projects in that we will need a capable computer on board. Things to take into account:
- Processor power - needs to be powerful enough to process an incoming video stream in real time
- Power consumption
- Size (we have about 4 inches of clearance)

VIA EPIA-P Pico-ITX 1.2 GHz VIA Nano ULV processor Up to 4 GB of RAM 3.9 x 2.8 inches

Software OpenCV – Open Source Computer Vision Popular library of computer vision programming functions Used to manipulate and analyze images to extract usable information. ROS – Robot Operating System Software framework for robot applications “Meta-Operating System” “Provides hardware abstraction, device drivers, libraries, visualizers, message-passing, package management, and more.” – From the ROS.org home page.

Programming with OpenCV Line Detection and Buoy Detection Raw Image → HSV → Threshold Functions → Noise Reduction → Canny Edge Detection → Contour Plot → Centroid

Demo

Buoy Detection by Color

ROS Uses a package system to modularize functions Can use ROS to run multiple programs with a publisher and subscriber relationship For example, one ROS program, or “node,” can continuously pull frames from a camera, while another checks whether and where a line exists in the frame. That node then returns the information to yet another node, which handles higher-level movement and passes serial commands to a node controlling the Arduino board in charge of the rotors.
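One link of that node chain might look like the minimal rospy sketch below. The node name, topic names, and 640-pixel frame width are illustrative assumptions, not the team's actual layout:

```python
def steer_from_centroid(cx, frame_width, deadband=0.1):
    """Decide a motor command from the line centroid's x-position."""
    offset = (cx - frame_width / 2.0) / (frame_width / 2.0)
    if abs(offset) < deadband:
        return "FORWARD"
    return "LEFT" if offset < 0 else "RIGHT"

def main():
    # Requires a ROS installation and a running roscore.
    import rospy
    from std_msgs.msg import Int32, String

    rospy.init_node("line_follower")          # hypothetical node name
    cmd_pub = rospy.Publisher("/motor_cmd", String, queue_size=1)

    def on_centroid(msg):
        # msg.data is the centroid x published by the vision node
        cmd_pub.publish(steer_from_centroid(msg.data, 640))

    rospy.Subscriber("/line_centroid", Int32, on_centroid)
    rospy.spin()

# Under a running roscore, the node's entry point would call main().
```

The vision node publishes centroids, this node subscribes and publishes motor commands, and a serial node subscribed to /motor_cmd would forward them to the Arduino.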

Conclusion We are building this robot to compete autonomously in the AUVSI RoboSub competition.