Jetson-Enabled Autonomous Vehicle ROS (Robot Operating System)

Alexis Gilmore, Jose L. Guzman, Luis Jibaja
Advisors: Dr. Hao Jiang; Student Mentor: Tommy Kwok
School of Engineering, San Francisco State University, San Francisco, CA 94132, USA

MOTIVATION
Learn about advancements in image processing applications and autonomous behavior.
Explore the NVIDIA Jetson TX1 and its potential for image processing programs.

OBJECTIVE
Build a vehicle that can track and follow a line.
Develop a program that uses HSV segmentation to track the line.
Tune a feedback system to stabilize vehicle movement.

JETSON TX1
NVIDIA's Jetson is well suited to compute-intensive embedded applications such as image processing.
The Jetson offers high-performance parallel processing from its onboard GPU while consuming less than 10 watts of power.
It supports the vehicle's mobility components: an Arduino, a USB camera, and a LiPo battery supply.
Fig 1. The Jetson TX1 (power, processing, onboard GPU).

HSV SEGMENTATION
Transforms image pixel data from the camera to highlight segments of interest (the blue line vs. the rest of the image).
The deviation of the central tracking area (red dot) from the segment edges provides a continuous error signal for the movement feedback system (see the segmentation sketch at the end of this document).
Fig 2. Transformed image pixel data with the central tracking area (left).
Fig 3. Demonstration of the Jetson vehicle's camera routine.

PID CONTROL SYSTEM
The continuous error from image processing is the input to the PID algorithm.
The PID controller uses the error to correct the vehicle's position so that the central tracking area stays aligned with the segmented colored line (see the PID sketch at the end of this document).
Fig 4. PID feedback control loop involving the proportional, integral, and derivative terms of the error.
Fig 5. Parameterizing Kd (the derivative coefficient). Negative displacement is left of the segment, positive is right; stable movement converges at zero.

ROS (Robot Operating System)
Robotics software framework that connects low-level device control with high-level programmed subroutines (see the node sketch at the end of this document).
Used to control vehicle components via instructions sent to the Arduino.
Served as the programming workspace for implementing the image processing and movement feedback algorithms.

RESULTS
Successfully implemented the HSV segmentation algorithm to separate the blue line from the rest of the image and clarify the segment edges.
Tuned the PID Kp (proportional), Ki (integral), and Kd (derivative) parameters to optimize error correction based on the HSV segmentation data.
The vehicle performs stabilized movement along the blue line, correctly aligning the central tracking area with the line segment.
Incorporated ROS as a framework to connect the image algorithm and movement feedback subroutines with low-level device control.
Completed successful runs around the blue-line track.

CONCLUSION
Built an autonomous vehicle that tracks and follows a line using the Jetson TX1.
Implemented the HSV segmentation image processing algorithm and the PID movement feedback algorithm.
Used ROS as a framework to connect high-level instruction subroutines with low-level device control.

FUTURE WORK
Implement deep learning and image recognition programs for autonomous behavior.
Expand the movement and feedback routines to accomplish more complex tasks.

ACKNOWLEDGEMENTS
This project is supported by the US Department of Education through the Minority Science and Engineering Improvement Program (MSEIP, Award No. P120A150014) and through the Hispanic-Serving Institution Science, Technology, Engineering, and Mathematics (HSI STEM) Program, Award No. P031C110159.

Fig 5. Diagram of the autonomous vehicle's behavior.
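
SEGMENTATION SKETCH
The HSV segmentation step can be illustrated with a minimal OpenCV sketch. This is not the team's actual code: the blue color bounds, the camera index, and the morphological cleanup are assumptions, and only the horizontal displacement of the line's centroid from the frame center is computed.

import cv2
import numpy as np

# Assumed HSV bounds for the blue track line; real values depend on lighting.
LOWER_BLUE = np.array([100, 120, 70])
UPPER_BLUE = np.array([130, 255, 255])

def line_error(frame):
    """Return the horizontal displacement (pixels) of the blue line's centroid
    from the frame center, or None if no blue segment is visible."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_BLUE, UPPER_BLUE)               # blue pixels -> 255
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    m = cv2.moments(mask)
    if m["m00"] == 0:                                             # no line detected
        return None
    cx = m["m10"] / m["m00"]                                      # centroid x of the segment
    return cx - frame.shape[1] / 2.0                              # >0 means line is right of center

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                                     # USB camera (index assumed)
    ok, frame = cap.read()
    if ok:
        print("displacement (px):", line_error(frame))
    cap.release()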
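
PID SKETCH
A minimal discrete PID controller of the kind described in the PID CONTROL SYSTEM section, sketched under assumptions: the error is the pixel displacement returned by the segmentation step, the output is a steering correction, and the gains and loop rate shown are placeholders rather than the values tuned on the vehicle.

class PID:
    """Discrete PID controller: output = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return a correction for the current error over time step dt (seconds)."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Placeholder gains and a single update at an assumed 20 Hz loop rate.
controller = PID(kp=0.02, ki=0.001, kd=0.01)
steering = controller.update(error=35.0, dt=0.05)   # line is 35 px right of center
print("steering correction:", steering)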
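
ROS NODE SKETCH
The ROS section describes connecting high-level subroutines with low-level device control. One such node is sketched below, assuming ROS 1's rospy client library; the topic names (/line_error, /steering_cmd) and the proportional gain are hypothetical, and a separate bridge node would relay /steering_cmd to the Arduino.

import rospy
from std_msgs.msg import Float32

KP = 0.02  # placeholder gain; the vehicle's tuned gains are not reproduced here

def on_error(msg, pub):
    # Turn the pixel displacement into a steering command and forward it.
    pub.publish(Float32(-KP * msg.data))

if __name__ == "__main__":
    rospy.init_node("line_follower")
    cmd_pub = rospy.Publisher("/steering_cmd", Float32, queue_size=1)
    rospy.Subscriber("/line_error", Float32, on_error, callback_args=cmd_pub)
    rospy.spin()   # hand control to ROS until shutdown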