Development of Vision-Based Navigation and Manipulation for a Robotic Wheelchair
Katherine Tsui
University of Massachusetts, Lowell

Goal: How do I get to…?

Wheeley: Hardware
– Wheelesley v2
– Vector Mobility prototype chassis
– Differential drive
– RoboteQ AX2850 motor controller
– Custom PC
– Sensor platform
– Vision system

Wheeley: Robot Arm
Exact Dynamics' Manus Assistive Robotic Manipulator (ARM)
– 6+2 DoF
– Joint encoders, slip couplings
– 14.3 kg
– 80 cm reach
– 20 N clamping force
– 1.5 kg payload capacity
– Keypad, joystick, single switch input devices
– Programmable

Wheeley: Vision System – Manipulation
– Shoulder camera: Canon VC-C50i pan-tilt-zoom
– Gripper camera: PC229XP Snake Camera, 0.25 in x 0.25 in x 0.75 in

Wheeley: Vision System – Navigation
– Videre Design's STH-V1 stereo head
– 19 cm x 3.2 cm
– 69 mm baseline
– 6.5 mm focal length
– 60 degree FoV
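The baseline and focal length above are what turn pixel disparity into metric depth via the standard stereo relation Z = f·B/d. The sketch below plugs in the slide's 69 mm baseline and 6.5 mm lens; the pixel pitch is an assumed value (the slide does not give one), so the focal length in pixels is illustrative only.

```python
# Standard stereo triangulation: depth Z = f * B / d, where f is the focal
# length in pixels, B the baseline, and d the disparity in pixels.
# Baseline (69 mm) and focal length (6.5 mm) come from the STH-V1 slide;
# the 7.5 um pixel pitch is an assumed value for illustration only.

BASELINE_M = 0.069          # 69 mm baseline
FOCAL_LENGTH_M = 0.0065     # 6.5 mm lens
PIXEL_PITCH_M = 7.5e-6      # assumed sensor pixel size (not from the slide)

FOCAL_LENGTH_PX = FOCAL_LENGTH_M / PIXEL_PITCH_M   # ~867 pixels

def depth_from_disparity(disparity_px: float) -> float:
    """Return metric depth (meters) for a given disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

if __name__ == "__main__":
    for d in (5, 10, 20, 40):
        print(f"disparity {d:3d} px -> depth {depth_from_disparity(d):.2f} m")
```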

SLAM using Stereo Vision
Why use vision instead of traditional ranging devices?
– Accuracy
– Cost
– Detail

Vision and Mapping Libraries
– Phission
– Videre Design's Small Vision System (SVS)
– Simple Mapping Utility (pmap)
  – Laser-stabilized odometry
  – Particle-based mapping
  – Relaxation over local constraints
  – Occupancy grid mapping
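pmap's final stage above is occupancy grid mapping. As a generic illustration of that idea only (not pmap's implementation, and with made-up parameter values), the sketch below accumulates log-odds evidence per cell from range readings: cells along a beam collect "free" evidence and the endpoint cell collects "occupied" evidence.

```python
import numpy as np

# Minimal log-odds occupancy grid update (generic illustration, not pmap's code).
# Assumes the robot stays inside the 20 m x 20 m mapped area.

L_OCC, L_FREE = 0.85, -0.4          # assumed log-odds increments
GRID_SIZE, RESOLUTION = 400, 0.05   # 400 x 400 cells at 5 cm per cell

grid = np.zeros((GRID_SIZE, GRID_SIZE))  # 0 = unknown (p = 0.5)

def world_to_cell(x, y):
    """Map world coordinates (meters, origin at grid center) to cell indices."""
    return (int(GRID_SIZE / 2 + x / RESOLUTION),
            int(GRID_SIZE / 2 + y / RESOLUTION))

def update(pose, bearing, rng, max_range=5.0):
    """Integrate one range measurement taken from `pose` (x, y, heading)."""
    x, y, theta = pose
    hit = rng < max_range
    end = min(rng, max_range)
    # Walk along the beam in resolution-sized steps, marking free space.
    for k in range(int(end / RESOLUTION)):
        r = k * RESOLUTION
        i, j = world_to_cell(x + r * np.cos(theta + bearing),
                             y + r * np.sin(theta + bearing))
        grid[i, j] += L_FREE
    if hit:  # endpoint cell gets occupied evidence
        i, j = world_to_cell(x + end * np.cos(theta + bearing),
                             y + end * np.sin(theta + bearing))
        grid[i, j] += L_OCC

def probability():
    """Convert accumulated log-odds back to occupancy probabilities."""
    return 1.0 - 1.0 / (1.0 + np.exp(grid))
```

Converting the accumulated log-odds back to probabilities at the end is what produces the familiar grey-scale occupancy map.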

SLAM Data Flow
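The data-flow figure itself is not reproduced in the transcript. As a rough, hypothetical outline of how the components named on the previous slide fit together (Phission frame capture, SVS stereo correlation, pmap mapping), the sketch below strings them into one loop; every object and method name is a placeholder, not the real API of any of those libraries.

```python
# Hypothetical outline of the stereo-SLAM data flow implied by the previous
# slide: grab stereo frames, correlate them into disparities, convert the
# disparities into a laser-like range scan, and feed odometry plus the scan
# to the mapper. All names below are placeholders, not Phission/SVS/pmap APIs.

def slam_step(camera, stereo, scan_builder, mapper, odometry):
    """Run one capture -> stereo -> scan -> map iteration."""
    left, right = camera.grab_pair()              # placeholder frame capture
    disparity = stereo.correlate(left, right)     # placeholder stereo matching
    scan = scan_builder.to_range_scan(disparity)  # disparity -> range scan
    mapper.update(odometry.pose(), scan)          # placeholder SLAM update
    return mapper.occupancy_grid()
```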

Results

Human Cue Detection
– Swarthmore Vision Module (SVM)
– Basic text detector and optical character recognition
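As a stand-in for the idea behind SVM's text detector plus OCR (not the Swarthmore Vision Module itself), the sketch below uses OpenCV to pull out high-contrast, sign-shaped regions from a camera frame and pytesseract to read them; both libraries are assumed substitutes chosen purely for illustration.

```python
import cv2
import pytesseract

# Generic sign-reading sketch: find high-contrast text-like regions and run OCR.
# This illustrates the text-detection-plus-OCR idea only; it is NOT the
# Swarthmore Vision Module code.

def read_door_signs(frame):
    """Return OCR strings for candidate text regions in a BGR camera frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Binarize and dilate so characters on a sign merge into one blob.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (15, 3))
    blobs = cv2.dilate(binary, kernel, iterations=1)
    contours, _ = cv2.findContours(blobs, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    texts = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w < 40 or h < 15:          # skip regions too small to be a sign
            continue
        roi = gray[y:y + h, x:x + w]
        text = pytesseract.image_to_string(roi).strip()
        if text:
            texts.append(text)
    return texts

if __name__ == "__main__":
    image = cv2.imread("hallway.jpg")   # hypothetical test image
    if image is not None:
        print(read_door_signs(image))
```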

Manipulation: Motivation
– Direct inputs from 4x4 keypad, joystick, or single switch
– May not correlate well with user's physical capabilities
– Layered menus
– Micromanage task and progress

Manipulation: Visual Control
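No interface detail for this slide survives in the transcript, so the following is only a hypothetical skeleton of a click-to-target visual control loop of the general kind the motivation slide argues for: the user picks a target in the camera image once, and the system rather than the user micromanages the arm motion. Every function and the `arm` object are placeholders, not the Manus ARM API or the authors' interface.

```python
# Hypothetical click-to-target loop: the user selects a point in the shoulder
# camera image, and the system drives the gripper toward it until the target
# is centered in the gripper-camera view. All names here are placeholders.

CENTER_TOLERANCE_PX = 10   # assumed "close enough" threshold

def visual_control(shoulder_cam, gripper_cam, arm, select_target, locate_target):
    target_patch = select_target(shoulder_cam.grab())   # one user selection
    while True:
        frame = gripper_cam.grab()
        u, v = locate_target(frame, target_patch)        # track target in image
        du = u - frame.shape[1] // 2                     # horizontal image error
        dv = v - frame.shape[0] // 2                     # vertical image error
        if abs(du) < CENTER_TOLERANCE_PX and abs(dv) < CENTER_TOLERANCE_PX:
            break                                        # target centered: stop
        arm.move_toward(du, dv)                          # placeholder servo step
```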

Manipulation: Experiments
Able-bodied participants, August 2006
– Confirmed: With greater levels of autonomy, less user input is necessary for control.
– Confirmed: Faster to move to the target under computer control.
– Unconfirmed: Users will prefer a visual interface.
Target audience, Summer 2007
– Access methods
– Cognitive ability
– Recreation of previous experiment

Future Work
Additional Wheeley modifications:
– PC for mapping
– Mount touch screen LCD
– New Videre stereo head
– Mount robotic arm
Integrate Wheelesley navigation

References and Acknowledgements
– Bailey, M., A. Chanler, B. Maxwell, M. Micire, K. Tsui, and H. Yanco. "Development of Stereo Vision-Based Navigation for a Robotic Wheelchair." In Proceedings of the International Conference on Rehabilitation Robotics (ICORR), June 2007.
– K. M. Tsui and H. A. Yanco. "Simplifying Wheelchair Mounted Robotic Arm Control with a Visual Interface." In Proceedings of the AAAI Spring Symposium on Multidisciplinary Collaboration for Socially Assistive Robotics, March 2007.
Research supported by NSF grants IIS , IIS , and IIS . In collaboration with Crotched Mountain Rehabilitation Center, Exact Dynamics, Swarthmore College, and the University of Central Florida.

Questions?