Self-Navigation Robot Using 360˚ Sensor Array

Dylan Easterla, Jason Grubish, Alexander Kukay, Nicolas Tremain, and Marissa Zaleski
Mentors: Dr. Kim Pierson and Turner Howard | Department of Physics and Astronomy

Introduction
This self-navigating robot uses a variety of sensors, mechanical parts, and LabVIEW code to move through an environment with precision and accuracy. The project is the culmination of three generations of work involving many hours of building, troubleshooting, and learning.

Purpose
The goal of this project was to develop a robot that can navigate through an environment with many obstacles using a sensor array that gives a 360-degree view of its surroundings. The array consists of infrared and ultrasonic distance sensors; four sensor arrays are swept in an angular pattern to provide a complete view of the robot's surroundings. The data from the two sensor types are combined by a mathematical calibration algorithm that takes advantage of each sensor's specific characteristics. The unique aspects of this project are the navigation program and the sensor calibration algorithm.

Robot Specifications
CompactRIO 9073 with NI 9263 analog output, NI 9205 analog input, and NI 9401 digital I/O modules
CompactDAQ 9174
Sabertooth 2x12 motor controller
Sharp GP2Y0A02YK and GP2Y0A21YK0F infrared distance sensors
MaxBotix MB1040 LV-MaxSonar-EZ4 ultrasonic distance sensor
Three 12 V batteries
Two power supply units
12 V to 5 V converter
12 V to 24 V converter
Axis M1011 IP camera
D-Link DIR-601 wireless router
Parallax Standard Servo motor
(Figure: Physical Design of the robot)

Design
Sharp infrared and MaxBotix ultrasonic sensors are used together to gather depth data. The IR sensor is rated to detect objects accurately between 0.2 m and 1.5 m, with an inverse voltage-to-distance relationship; however, it gave varying readings for objects of different colors and textures at the same distance. The ultrasonic sensors are rated for a range of 0.15 m to 6.45 m with a linear voltage-to-distance relationship and gave accurate readings for all surfaces farther than 0.3 m. Used in conjunction, the two sensor types detect objects at all distances, surfaces, and angles.
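As a rough illustration of how the two sensor readings can be combined, the following Python sketch converts an IR voltage (inverse relationship) and an ultrasonic voltage (linear relationship) to distances and blends them according to each sensor's trusted range. The constants, function names, and blending rule are illustrative assumptions, not the team's actual LabVIEW calibration.

    # Illustrative sensor-fusion sketch (Python stand-in for the LabVIEW code).
    # All constants below are placeholders, not the robot's calibrated values.

    IR_K = 0.5        # assumed IR constant: distance ~ IR_K / voltage (inverse relationship)
    US_SCALE = 2.5    # assumed ultrasonic slope: distance ~ US_SCALE * voltage (linear relationship)

    IR_RANGE = (0.2, 1.5)   # metres over which the Sharp IR sensor is rated accurate
    US_MIN = 0.3            # metres beyond which the ultrasonic sensor is reliable

    def ir_distance(volts):
        """Convert an IR voltage to metres using the assumed inverse model."""
        return IR_K / max(volts, 1e-6)

    def us_distance(volts):
        """Convert an ultrasonic voltage to metres using the assumed linear model."""
        return US_SCALE * volts

    def fused_distance(ir_volts, us_volts):
        """Blend the two readings, preferring whichever sensor is inside its trusted range."""
        d_ir = ir_distance(ir_volts)
        d_us = us_distance(us_volts)
        ir_ok = IR_RANGE[0] <= d_ir <= IR_RANGE[1]
        us_ok = d_us >= US_MIN
        if ir_ok and us_ok:
            return 0.5 * (d_ir + d_us)   # both trusted: simple average
        if ir_ok:
            return d_ir                  # very close objects: trust the IR sensor
        return d_us                      # otherwise: trust the ultrasonic sensor

On the robot, each angle of the servo sweep would yield one fused distance of this kind, which then feeds the vector-field-histogram algorithm described below.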
Third Generation Robot Redesign
Improvements in the third-generation robot include three additional sets of sensors on the sides and back that sweep back and forth to provide a 360-degree view of the environment, two sensors at the front and back angled toward the floor to detect drop-offs, and updated wiring for more stable connections.
(Figure: Circuit Diagram)

The Algorithm: Vector-Field Histogram (VFH)
To navigate through a space, the robot builds a depth map of obstacles in its surroundings by linking the infrared and ultrasonic distance data with the objects' relative locations. This is done with a vector-field-histogram algorithm. The sensor servo sweeps back and forth through its field of view, taking depth data at every angle, and sends the readings to the VFH. The program maintains a set of four VFHs with different distance thresholds, which lets it assign more significance to closer objects than to those farther away.
(Figure: Sample Vector Field Histogram)

Navigation Algorithm
Using the data from the VFHs, the control algorithm steers the robot away from areas where the sensors detect objects as it moves through its environment. Its speed is scaled by the distance to the nearest detected object, so it traverses tight spaces carefully and open spaces quickly. If an object is detected within 20 cm, the robot backs up and turns away from it before proceeding. When direct control is desired, the user can take over manual control of the robot with a joystick controller. A rough sketch of the histogram and control logic appears after the Future Goals list below.

Conclusion
The control algorithm performed very well, and the robot is able to navigate through a wide variety of environments. It could be improved further by adding more sensors to collect data in more directions and at more heights, but the current setup already yields promising results. Three of these robots have been constructed by our team, with two more in the works for the future.

Future Goals
Two-way audio/video communication
Second video camera for a wider field of view
Wheel encoders to improve the navigation algorithm
Addition of a gyroscope/accelerometer
Long-range network control
Path-planning algorithm
Communication between multiple robots for more complex tasks
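The following Python sketch shows one way the swept distance readings could be binned into polar histograms and turned into a heading and speed command, as described in the Algorithm and Navigation Algorithm sections above. The sector width, the four threshold values, the weighting scheme, and all function names are illustrative assumptions; only the four-histogram idea, the speed scaling by the nearest object, and the 20 cm back-up rule come from the poster text.

    # Minimal vector-field-histogram sketch (Python stand-in for the LabVIEW program).
    # Sector size, thresholds, and speed scaling are assumed values for illustration.

    SECTOR_DEG = 10                       # angular width of each histogram sector (assumed)
    THRESHOLDS = (0.5, 1.0, 2.0, 4.0)     # four distance thresholds in metres (assumed)
    BACKUP_DIST = 0.20                    # back up when an object is within 20 cm
    MAX_SPEED = 1.0                       # normalized full speed

    def build_histograms(readings):
        """readings: list of (angle_deg, distance_m) pairs from one servo sweep.
        Returns one occupancy histogram per threshold; the near-threshold
        histograms flag only close obstacles and so carry more weight later."""
        n_sectors = 360 // SECTOR_DEG
        histograms = [[0] * n_sectors for _ in THRESHOLDS]
        for angle, dist in readings:
            sector = int(angle % 360) // SECTOR_DEG
            for i, threshold in enumerate(THRESHOLDS):
                if dist < threshold:
                    histograms[i][sector] += 1
        return histograms

    def choose_heading(histograms, goal_deg=0.0):
        """Pick the least-obstructed sector closest to the goal direction,
        weighting the near-distance histograms more heavily than the far ones."""
        n_sectors = len(histograms[0])
        weights = (8, 4, 2, 1)            # closer objects count more (assumed weights)
        cost = [sum(w * h[s] for w, h in zip(weights, histograms))
                for s in range(n_sectors)]
        best_cost = min(cost)
        free = [s for s in range(n_sectors) if cost[s] == best_cost]
        goal_sector = int(goal_deg % 360) // SECTOR_DEG
        best = min(free, key=lambda s: min((s - goal_sector) % n_sectors,
                                           (goal_sector - s) % n_sectors))
        return best * SECTOR_DEG + SECTOR_DEG / 2    # centre of the chosen sector

    def choose_speed(readings):
        """Scale speed by the nearest detected object; reverse if within 20 cm."""
        nearest = min(dist for _, dist in readings)
        if nearest < BACKUP_DIST:
            return -0.3 * MAX_SPEED       # back up (and turn away) before proceeding
        return MAX_SPEED * min(nearest / THRESHOLDS[-1], 1.0)

On the robot, the equivalent logic would live in the LabVIEW program, with the chosen heading and speed ultimately handed to the Sabertooth motor controller. For example, a sweep such as [(0, 2.1), (10, 0.4), (20, 1.8), ...] would steer the robot away from the obstructed 10-degree sector and slow it down because of the 0.4 m obstacle.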