Robotic Navigation through Gesture Based Control (RNGBC)
Nickolas McCarley, University of Alabama

Abstract
Robotic Navigation through Gesture Based Control (RNGBC) assists people who may not be able to operate a smart device using their fingers. These individuals might face limitations due to physical disabilities, or they may work in a job where their fingers are not free to operate the device.

Sphero
The Sphero is a polycarbonate ball that is controlled over Bluetooth and can roll and change color. It is typically operated from a smartphone app that serves as a kind of remote control, and it can be modified and programmed. The goal of the RNGBC project is to provide a way to interact with the Sphero other than the smartphone app.

Introduction
With an estimated 7.1 percent of Americans having an ambulatory disability, there is a strong need to be able to operate technology with gestures [1]. RNGBC was built with this goal in mind, using the Sphero and the Leap Motion as its basic tools. RNGBC works by breaking the operation of the Sphero into simple gestures for the user. Another problem with standard input devices is the need to wear gloves while performing a job task: protective gloves used in jobs like welding and firefighting can keep the user from operating a touch screen, while hand gestures may still be possible. RNGBC therefore lets the user operate the Sphero with hand gestures alone, without touching the keypad of a smart device such as a tablet or phone.

Sphero Code and Bluetooth Connection
Code running on a computer, connected to the ball over Bluetooth, is used to drive the Sphero instead of a phone (an illustrative sketch of this approach appears after the Conclusions).

Leap Motion
The Leap Motion device connects to a computer and reads hand motions made in 3D space. It tracks the movement of the user's hands and the gestures they make, and it is easily programmable, so it can be customized to recognize specific gestures such as raising a hand or moving it side to side. The Leap uses camera sensors to create a workspace of approximately three cubic feet. The user can perform a wide array of gestures, and the Leap recognizes them and sends the corresponding command to the laptop.

Leap Integration
The Leap software maps the Leap Motion input to the computer: it reads the user's gestures and binds each one to a command (see the second sketch after the Conclusions for one possible binding).

Sources
1. Cornell University. Disability Statistics. September.
2. Leap Motion. Leap Motion. 20 September.
3. Pierce, David. "A look inside Leap Motion, the 3D gesture control that's like Kinect on steroids." 26 June. September 2015. <leap-motion-gesture-controls>.
4. Sphero Company. About Sphero. 18 September.
5. Sphero Company. Sphero SDK. 19 September.
6. Default Sphero code supplied by Orbotix Inc. and Michael DePhillips.

Conclusions
The current state of the project demonstrates the possibility of using gesture-based control to operate transportation vehicles (e.g., a wheelchair) and machines in workplaces where employees wear gloves. Gesture technologies like the one in this project can assist people with disabilities and other limitations in performing common tasks. The project also has the potential to generate interest in computer science among younger students, who may find the technology fun to use and become curious about how motion sensors and robots work.
The Leap can also be adapted to use gestures to control a variety of other robots.
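The Sphero control code shown on the original poster is not reproduced in this transcript. The sketch below illustrates, under stated assumptions, how a laptop could send roll and color commands to the Sphero over a Bluetooth serial (RFCOMM) socket using the publicly documented Sphero binary command format. The PyBluez dependency, the placeholder device address, and the helper names are assumptions for illustration, not the project's actual code.

```python
# Hedged sketch: driving a Sphero from a laptop over Bluetooth RFCOMM.
# Assumes the PyBluez package and the documented Sphero binary API
# (DID 0x02, CID 0x30 "roll", CID 0x20 "set RGB LED"). The address below
# is a placeholder for the ball's actual Bluetooth MAC address.
import bluetooth

SPHERO_ADDR = "68:86:E7:00:00:00"  # placeholder address

def build_packet(did, cid, seq, data):
    """Frame a client command: SOP1 SOP2 DID CID SEQ DLEN <data> CHK."""
    body = [did, cid, seq, len(data) + 1] + list(data)
    chk = (~(sum(body) % 256)) & 0xFF          # modulo-256 sum, bit-inverted
    return bytes([0xFF, 0xFE] + body + [chk])  # SOP2 0xFE = no answer requested

def roll(sock, speed, heading, seq=0):
    """Roll at `speed` (0-255) toward `heading` (0-359 degrees)."""
    data = [speed, (heading >> 8) & 0xFF, heading & 0xFF, 0x01]
    sock.send(build_packet(0x02, 0x30, seq, data))

def set_color(sock, r, g, b, seq=0):
    """Set the internal RGB LED (non-persistent)."""
    sock.send(build_packet(0x02, 0x20, seq, [r, g, b, 0x00]))

if __name__ == "__main__":
    sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
    sock.connect((SPHERO_ADDR, 1))   # Sphero exposes an RFCOMM serial channel
    set_color(sock, 0, 0, 255)       # turn the ball blue
    roll(sock, 100, 0)               # roll forward at moderate speed
    sock.close()
```

Framing each command by hand keeps the example dependency-light; in practice a maintained Sphero library or the vendor SDK mentioned in the Sources could replace the packet helpers.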
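The Leap Integration layer described above reads hand data and binds it to Sphero commands. The sketch below shows one plausible mapping, palm position over the sensor to roll speed and heading, assuming the Leap Motion v2 Python bindings (the SWIG-generated `Leap` module, which targets Python 2.7) and reusing the hypothetical `roll` helper and Bluetooth socket from the previous sketch; the `palm_to_command` helper and its thresholds are illustrative, not the project's actual binding code.

```python
# Hedged sketch: binding Leap Motion hand position to Sphero roll commands.
# Assumes the Leap Motion v2 Python bindings and the `roll` helper defined
# in the previous sketch; the dead-zone and speed scaling are illustrative.
import math
import time

import Leap  # Leap Motion SDK Python bindings

def palm_to_command(palm):
    """Map palm position over the sensor to a (speed, heading) pair.

    Moving the hand forward/back (z axis) or left/right (x axis) steers
    the ball; a hand held near the centre of the workspace means "stop".
    """
    x, z = palm.x, palm.z                 # millimetres relative to the sensor
    distance = math.hypot(x, z)
    if distance < 40:                     # dead zone: hand roughly centred
        return 0, 0
    heading = int(math.degrees(math.atan2(x, -z))) % 360
    speed = min(255, int(distance))       # farther from centre = faster
    return speed, heading

def run(sock):
    """Poll the Leap and forward the mapped command to the Sphero."""
    controller = Leap.Controller()
    while True:
        frame = controller.frame()
        if not frame.hands.is_empty:
            speed, heading = palm_to_command(frame.hands[0].palm_position)
            roll(sock, speed, heading)    # helper from the Sphero sketch
        time.sleep(0.1)                   # roughly ten commands per second
```

Discrete gestures, such as the side-to-side swipe mentioned above, could similarly be bound to other commands (for example, changing the ball's color) using the Leap's built-in gesture recognition.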