J. Alan Atherton
Michael Goodrich
Brigham Young University, Department of Computer Science
April 9, 2009
Funded in part by Idaho National Laboratory and Army Research Laboratory
1

• Background
• Related Work
• Ecological Interface
• User Study
• Interface Changes from Study
• Conclusions and Future Work
2

• What is a remote manipulator?
• Applications
  - Urban search and rescue (USAR)
  - Explosive ordnance disposal (EOD)
  - Planetary exploration
3

• Remotely operating a robot is difficult
  - "Soda straw" view: the camera shows only a narrow slice of the scene
  - Maintaining situation awareness
  - Time delay
  - Mental workload
• Why is this a problem?
  - Collisions
  - Slow task completion
  - Operator stress
(Image: Foster-Miller Talon)
4

All images adapted from Yanco, H. A.; Drury, J. L. & Scholtz, J. "Beyond usability evaluation: analysis of human-robot interaction at a major robotics competition." Human-Computer Interaction, L. Erlbaum Associates Inc., 2004, 19.
5

• Background
• Related Work
• Ecological Interface
• User Study
• Interface Changes from Study
• Conclusions and Future Work
6

Idaho National Laboratory | UMass Lowell
Bruemmer, D. J. et al. "Shared understanding for collaborative control." IEEE Transactions on Systems, Man and Cybernetics, Part A, 2005, 35.
Yanco, H. A. et al. "Analysis of Human-Robot Interaction for Urban Search and Rescue." Proceedings of the IEEE International Workshop on Safety, Security and Rescue Robotics.
7

INL / BYU AV Interface | Ferland et al. (Sherbrooke)
C. W. Nielsen, M. A. Goodrich, and B. Ricks. "Ecological Interfaces for Improving Mobile Robot Teleoperation." IEEE Transactions on Robotics and Automation, Vol. 23, No. 5, October.
Ferland, F.; Pomerleau, F.; Dinh, C. T. L. & Michaud, F. "Egocentric and exocentric teleoperation interface using real-time, 3D video projection." Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, ACM, 2009.
8

NASA Viz | CMU Robotics Institute
Nguyen, L. A.; Bualat, M.; Edwards, L. J.; Flueckiger, L.; Neveu, C.; Schwehr, K.; Wagner, M. D. & Zbinden, E. "Virtual Reality Interfaces for Visualization and Control of Remote Vehicles." Autonomous Robots, 2001, 11.
Kelly, A.; Anderson, D.; Capstick, E.; Herman, H. & Rander, P. "Photogeometric Sensing for Mobile Robot Control and Visualisation Tasks." Proceedings of the AISB Symposium on New Frontiers in Human-Robot Interaction, 2009.
9

• Background
• Related Work
• Ecological Interface
• User Study
• Interface Changes from Study
• Conclusions and Future Work
10

• Requirements
  - Ecological
  - Increase situation awareness
  - Manage workload
• Existing interfaces
  - Lack depth information
  - No manipulation support
  - Not designed for real-time operation
11

• Real-time remote manipulation
12

13

• Robot
  - Build from kit, modify
  - Player driver
  - Motion planning for arm
  - Swiss Ranger (time-of-flight depth camera) driver
• Communication
  - Integrate with INL's system
  - Network data transfer
• User Interface
  - OpenGL display
  - Experiment automation
(System diagram: Robot, Controller, User Interface)
14
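To make the "OpenGL display" item concrete, here is a minimal sketch of how a depth-camera scan might be drawn as a point cloud, assuming PyOpenGL and that the Swiss Ranger frame has already been converted to XYZ points in the robot frame. The function name and per-point intensity shading are illustrative assumptions, not taken from the actual interface.

```python
# Minimal sketch (assumes PyOpenGL and an active GL context inside a draw
# callback); an illustration of rendering a range scan as a point cloud,
# not the authors' implementation.
from OpenGL.GL import (GL_POINTS, glBegin, glColor3f, glEnd,
                       glPointSize, glVertex3f)

def draw_range_scan(points, intensities=None):
    """Draw one depth-camera scan as a cloud of shaded points.

    points      : iterable of (x, y, z) coordinates in metres
    intensities : optional per-point brightness values in [0, 1]
    """
    glPointSize(2.0)
    glBegin(GL_POINTS)
    for i, (x, y, z) in enumerate(points):
        shade = intensities[i] if intensities is not None else 1.0
        glColor3f(shade, shade, shade)  # grey-scale by return intensity
        glVertex3f(x, y, z)
    glEnd()
```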

• Background
• Related Work
• Ecological Interface
• User Study
• Interface Changes from Study
• Conclusions and Future Work
15

Six interface variants (Variants 1-6): Visualization (3D + Video, 3D, Video) crossed with Robot Control (End Effector, Joint)
• Task: collect yellow blocks
• 30 participants
• Between-subject comparison
16

• Reduce memorization effects
• Minimize damage to arm
• Quick change
17

18

19

• Joint control
• View-dependent end effector control
20

• Robot reaches for point
• User moves point with joystick
• Point movement depends on view orientation
21
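A minimal sketch of the view-dependent mapping described above, assuming the camera orientation is available as a 3x3 rotation matrix and using NumPy; the function name, gain parameter, and frame conventions are illustrative assumptions rather than the system's actual code.

```python
import numpy as np

def move_reach_target(target, joystick_xyz, view_rotation, gain=0.01):
    """Shift the arm's reach target by a joystick deflection expressed in
    the current view frame, so "right" on the stick means "right" on screen.

    target        : (3,) current target point in the world frame
    joystick_xyz  : (3,) stick deflection (right, up, forward), each in [-1, 1]
    view_rotation : (3, 3) rotation taking view-frame axes to world-frame axes
    gain          : metres of target motion per unit deflection per update
    """
    world_delta = np.asarray(view_rotation) @ (gain * np.asarray(joystick_xyz, dtype=float))
    return np.asarray(target, dtype=float) + world_delta

# Each control cycle: read the joystick, update the point with this function,
# then hand the new point to the arm's motion planner to reach for it.
```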

(Results chart comparing the six conditions: 3D + Vid. End eff., 3D + Vid. Joint, 3D End eff., 3D Joint, Video End eff., Video Joint)
22

(Two results charts, each comparing the six conditions (3D + Vid., 3D, Video; End eff., Joint): collisions with posts, box, and table; and collisions with the block during final adjustments)
23

24

• Background
• Related Work
• Ecological Interface
• User Study
• Interface Changes from Study
• Conclusions and Future Work
25

• Problems
  - Alignment
  - Time lag
  - Cluttered 3D scan model
• Changes
  - Stereo camera exterior orientation
  - Interactive robot arm calibration
  - Simple quickening
  - Scan pruning
26
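One plausible reading of "simple quickening", aimed at the time-lag problem, is to draw the arm where the outstanding commands should have taken it rather than where delayed telemetry last reported it. The sketch below only illustrates that idea under an assumed per-joint rate limit; it is not the authors' implementation.

```python
import numpy as np

def quickened_joint_angles(measured, commanded, delay_s, max_rate):
    """Predict joint angles for display, compensating for feedback delay.

    measured  : (N,) last joint angles reported by the robot (radians)
    commanded : (N,) most recently commanded joint angles (radians)
    delay_s   : estimated round-trip delay in seconds
    max_rate  : (N,) or scalar maximum joint speed in rad/s
    """
    measured = np.asarray(measured, dtype=float)
    commanded = np.asarray(commanded, dtype=float)
    limit = np.asarray(max_rate, dtype=float) * delay_s
    # Each joint is assumed to move toward its command at up to max_rate
    # during the delay window, never overshooting the command itself.
    step = np.clip(commanded - measured, -limit, limit)
    return measured + step
```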

• Interactive stereo camera calibration
• Live robot arm calibration
27
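The slides do not detail the interactive calibration procedure; for illustration only, one standard way to recover a camera's exterior orientation from a few known 3D-2D correspondences (for example, clicked points on the arm or on a target) is a perspective-n-point solve. The sketch below assumes OpenCV and previously calibrated intrinsics; the function and argument names are hypothetical.

```python
import cv2
import numpy as np

def estimate_exterior_orientation(world_points, image_points, K, dist=None):
    """Estimate camera pose (rotation, translation) relative to the world frame.

    world_points : (N, 3) reference points in the robot/world frame, N >= 4
    image_points : (N, 2) corresponding pixel locations in the camera image
    K            : (3, 3) camera intrinsic matrix from a prior calibration
    dist         : lens distortion coefficients, or None if images are undistorted
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(world_points, dtype=np.float32),
        np.asarray(image_points, dtype=np.float32),
        K, dist)
    if not ok:
        raise RuntimeError("exterior orientation estimate failed")
    R, _ = cv2.Rodrigues(rvec)  # convert axis-angle to a 3x3 rotation matrix
    return R, tvec
```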

28

29

30

• Background
• Related Work
• Ecological Interface
• User Study
• Interface Changes from Study
• Conclusions and Future Work
31

• 3D visualization supports situation awareness
• Video is faster
• 3D + video is a good tradeoff
• 3D + video might reduce workload
32

• Head tracking
• Ecological camera video
• Haptics
33