Presentation transcript:

[5HC99] Embedded Visual Control
Group #5: Self-Balancing Robot Navigation
Paul Padila, Vivian Zhang, Amritam Das, Michail Papamichail

Overview
1. Introduction
2. Objectives
3. Design
4. Control
5. Vision
6. Conclusions and Recommendations

1. Introduction
► Self-balancing robots are not yet very popular and do not have many applications.
► They are mostly used for educational purposes.
► A possible reason: they are hard to stabilize under certain conditions.
Application of a self-balancing robot: two-wheeled self-balancing electric scooter.

1. Introduction
► The color tracking method is also not very popular yet.
► A possible reason is that colors are hard to track in intense sunlight or at night.
Application of the color tracking method: color-based object tracking in surveillance.

1. Introduction
Possible future application: combining the previous two applications gives an improved surveillance system with clear advantages.
► The camera is mounted on a robot rather than anchored to a wall.
► The robot navigates autonomously and can scout areas.
► This limits the number of cameras that are needed.
► Cameras cannot be tricked by changing clothing color in blind spots.
► A network of such cameras can track the target cooperatively.

1. Introduction
Other visual methods
► Gesture detection
► Shape detection
PlayStation Eye gesture detection

1. Introduction
Future applications in general: combining a self-balancing robot with any of the visual detection methods.
► A robot that performs certain tasks in a hospital.
– Empties the trash.
– Refills supplies.
► A robot that identifies flawed parts in constructions.
– It recognizes skewed shapes.
– It can work even when the construction site is closed.
– It increases the safety of the construction site.
– It protects the investment.

3. Design
Mechanical Design
► Multi-layer.
► Rigid supports.
– Electronics
– Motors

3. Design
Support Structure
► The selected material thickness can support the weight of the battery pack used.
► The material is lightweight, which minimizes the total weight and thus improves the robot's energy consumption.
► MDF is an easy-to-work, inexpensive material that can be used with laser cutting machines.

3. Design
Plates

3. Design
Motor Base
► The motors should be perfectly aligned.
► Misalignment causes vibrations and deviations during the displacement of the robot.

3. Design
Motor
► Functions:
– Stabilization
– Displacement of the robot
► Fast reactions
► Large torque

3. Design
Batteries
► Maximum energy consumption: 12 V at 5.2 A.
► Batteries: 3 x 3.7 V at 5.3 A.

3. Design
Arduino Shield
► Compact and easy to install.
► The interfaces between the sensors and the controller are ready to use.
► MPU-6050: 3-axis gyroscope and 3-axis accelerometer in a single chip with I2C communication.
► L298P: dual-channel full-bridge motor driver for high voltage (50 V) and high current (4 A).

3. Design
Arduino UNO
► Control unit.
– Sensors
– Actuators

4. Control
Control Problem
► Stabilization problem
► Position problem

4. Control
Stabilization Problem
► P
► PD
► PI

4. Control
Position Problem
► P
► PD
► PI

4. Control
Control Design
► Different control objectives.
► Same actuator.
► Different time constants are fundamental to guarantee stability (see the sketch below).
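To illustrate the point about time constants, here is a minimal sketch (not the group's actual controller) of a fast inner loop that stabilizes the tilt angle and a slower outer loop that regulates the position, both acting through the same motor command. The gains, update rates and variable names are assumptions made only for illustration.

class BalanceController:
    """Fast PD loop for tilt, slower P loop for position, one motor output."""

    def __init__(self):
        self.kp_tilt, self.kd_tilt = 25.0, 1.2   # inner (fast) stabilization gains (assumed)
        self.kp_pos = 0.05                       # outer (slow) position gain (assumed)
        self.tilt_ref = 0.0
        self.step = 0

    def update(self, tilt, tilt_rate, position, position_ref):
        # The outer position loop runs ten times slower than the inner loop,
        # giving it a longer time constant so the two objectives do not fight.
        if self.step % 10 == 0:
            self.tilt_ref = self.kp_pos * (position_ref - position)
        self.step += 1
        # Inner stabilization loop: PD control of the tilt angle around tilt_ref.
        return self.kp_tilt * (self.tilt_ref - tilt) - self.kd_tilt * tilt_rate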

4. Control
Performance
Parameter            Value
Settling time        3 sec
Position tolerance   +/- 4 cm
Tracking tolerance   +/- 4 cm

Vision
► Image Processing
► Colour Tracking
► OpenCV
► Integration with Control Hardware
► Object Following

Choice of Hardware: Raspberry Pi 2 + Pi Camera

Capturing a Consistent Image
► To fix exposure time, set the shutter_speed attribute to a reasonable value.
► To fix exposure gains, let analog_gain and digital_gain settle on reasonable values, then set exposure_mode to 'off'.
► To fix white balance, set awb_mode to 'off', then set awb_gains to a (red, blue) tuple of gains. Optionally, set iso to a fixed value.

Capturing a Consistent Image: Sample Implementation
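The sample implementation itself appears only as a slide image, so what follows is a small sketch of the steps listed on the previous slide, assuming the Python picamera library on the Raspberry Pi. The resolution, ISO value and settling delay are illustrative choices, not values taken from the slides.

import time
from picamera import PiCamera

camera = PiCamera(resolution=(640, 480), framerate=30)

# Optionally fix the sensor sensitivity first.
camera.iso = 100

# Give the auto-exposure and auto-white-balance algorithms time to settle.
time.sleep(2)

# Fix exposure time: copy the current exposure speed into shutter_speed,
# then turn automatic exposure gain adjustment off.
camera.shutter_speed = camera.exposure_speed
camera.exposure_mode = 'off'

# Fix white balance: freeze the current (red, blue) gains.
gains = camera.awb_gains
camera.awb_mode = 'off'
camera.awb_gains = gains

# From here on, every capture uses the same exposure and white balance.
camera.capture('frame.jpg')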

Color Tracking
► Image Conversion
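A minimal sketch of the conversion step, assuming OpenCV's Python bindings (cv2) and an HSV colour space for thresholding; the colour bounds below are placeholders, since the slides do not state which colour the robot tracks.

import cv2

# Read a frame (from a file here, for illustration) and convert BGR to HSV,
# which makes colour thresholding less sensitive to lighting changes.
frame = cv2.imread('frame.jpg')
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Placeholder bounds for the tracked colour (roughly red hues).
lower = (0, 120, 70)
upper = (10, 255, 255)

# Binary mask: white where the pixel falls inside the colour range.
mask = cv2.inRange(hsv, lower, upper)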

Noise Elimination
► Morphological Operations
– Erosion
– Dilation
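A short sketch of the two morphological operations applied to the binary colour mask; the kernel size and number of iterations are assumptions.

import cv2
import numpy as np

# 'mask' is the binary colour mask from the thresholding step (loaded from
# disk here only so the snippet runs on its own).
mask = cv2.imread('mask.png', cv2.IMREAD_GRAYSCALE)

# 5x5 structuring element (size is an assumption).
kernel = np.ones((5, 5), np.uint8)

# Erosion removes small white speckles (noise) from the mask...
eroded = cv2.erode(mask, kernel, iterations=2)
# ...and dilation grows the surviving blobs back to roughly their original size.
cleaned = cv2.dilate(eroded, kernel, iterations=2)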

Thresholded Image with Morphological Operation

Edge Detection + Contour Analysis
► Canny Edge Detection + Gaussian Blur Filter
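A sketch of this step, assuming OpenCV 4.x: blur the cleaned mask, run the Canny edge detector, and take the largest contour's bounding box as the object position. The kernel size and Canny thresholds are assumptions.

import cv2

# Cleaned binary mask from the noise-elimination step (loaded here for illustration).
mask = cv2.imread('cleaned_mask.png', cv2.IMREAD_GRAYSCALE)

# Gaussian blur smooths the mask before edge detection.
blurred = cv2.GaussianBlur(mask, (5, 5), 0)
edges = cv2.Canny(blurred, 50, 150)

# Contour analysis: keep the largest contour and use its bounding box centre
# as the position of the tracked object.
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    largest = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(largest)
    cx, cy = x + w // 2, y + h // 2
    print('object centre:', cx, cy)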

Integrating Raspberry Pi with Arduino
► Serial Communication
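A minimal sketch of the serial link, assuming the pyserial package on the Raspberry Pi; the device path, baud rate and command characters are assumptions, since the slides do not specify the protocol.

import serial

# The Arduino UNO typically enumerates as /dev/ttyACM0 on the Pi (assumed here).
arduino = serial.Serial('/dev/ttyACM0', 9600, timeout=1)

# Send a single-character command and read back any reply (command set is illustrative).
arduino.write(b'F')
reply = arduino.readline()
print(reply)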

Object Tracking Algorithm
► Boolean Logic
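A sketch of a boolean (left / centre / right) tracking rule, using the object's horizontal centre from the contour analysis and single-character commands to the Arduino; the frame width, dead band and command letters are illustrative assumptions.

# Assumed camera frame width in pixels and dead band around the image centre.
FRAME_WIDTH = 640
DEAD_BAND = 60

def tracking_command(cx, object_visible):
    """Return a one-character command for the Arduino based on simple booleans."""
    if not object_visible:
        return b'S'                                   # stop / search (assumed command)
    too_far_left = cx < FRAME_WIDTH // 2 - DEAD_BAND
    too_far_right = cx > FRAME_WIDTH // 2 + DEAD_BAND
    if too_far_left:
        return b'L'                                   # turn left toward the object
    if too_far_right:
        return b'R'                                   # turn right toward the object
    return b'F'                                       # object centred: move forward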

Object Tracking Algorithm
► Proportionate Controller
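A sketch of the proportional variant: instead of fixed left/right commands, the turn command is scaled by how far the object sits from the image centre. The gain and output limits are assumptions.

FRAME_WIDTH = 640        # assumed camera frame width in pixels
KP = 0.3                 # proportional gain (assumed)
MAX_TURN = 100           # saturation limit for the command sent to the Arduino

def turn_command(cx):
    """Map the object's horizontal centre (pixels) to a signed, clamped turn command."""
    error = cx - FRAME_WIDTH / 2                     # how far the object is off-centre
    turn = KP * error                                # proportional action
    return max(-MAX_TURN, min(MAX_TURN, turn))       # clamp to the actuator range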

Performance of Object Tracking: Camera Reaction Time
(plot legend: camera, object)

Performance of Object Tracking: Change in the Object Distance

Performance of Object Tracking: Movement of the Camera