Engineering Design Centre Project Updates – August 2014


Engineering Design Centre Aim of the study
To compare:
– performance and preferences of users for different input modalities
– a standard computer mouse and HOTAS Joystick with different eye-gaze, head and hand movement tracking based pointing systems

Engineering Design Centre Vision parameters

Engineering Design Centre Cognitive parameters: Trail Making Test, Digit Symbol Test

Engineering Design Centre Motor parameters
Range of Motion of wrist (palm facing down)
– Measuring Radial Deviation
– Measuring Ulnar Deviation

Engineering Design Centre Participants
#    Age   Sex   Nationality      GS   ROMW   DST   TMT   VA   CB
1    26    F     Indian                                         N
2    23    F     British                                        N
3    53    M     British                                        N
4    34    M     Indian                                         N
5    30    F     Polish                                         N
6    46    M     British                                        N
7    28    M     South African                                  N
8    23    M     British                                        N
9    19    M     British                                        N
10   30    F     Italian                                        N
11   22    M     American                                       N
12   30    M     British                                        N
13   26    M     Spanish                                        N

Engineering Design Centre Task

Engineering Design Centre Design
Modalities
1. Eye Tracking with Z-Axis selection (ZET)
2. Eye Tracking with Voice-based selection (VoiceET)
3. Adapted Eye Tracking (AdaptedET)
4. Multimodal Eye Tracking (MmET)
5. Head Tracking with Z-Axis selection (ZHT)
6. Head Tracking with Voice-based selection (VoiceHT)
7. Adaptive Head Tracking (AdaptedHT)
8. Hand Tracking with Z-Axis selection (ZGS)
9. Hand Tracking with Voice-based selection (VoiceGS)
10. Adaptive Hand Tracking (AdaptedGS)
11. HOTAS Joystick
12. Mouse
Targets
– Sizes: 45, 55, 65, 75 pixels
– Distances: 80, 160, 240, 325 pixels [1 pixel ≈ 0.25 mm]
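To get a feel for the range of task difficulties this 4 × 4 target grid spans, Fitts' index of difficulty can be computed for each width/distance pair. The snippet below is an illustrative sketch only; the Shannon formulation and the variable names are assumptions, not taken from the slides:

    import math

    # Target widths (W) and movement distances (D) from the study design, in pixels.
    widths = [45, 55, 65, 75]
    distances = [80, 160, 240, 325]

    # Shannon formulation of Fitts' index of difficulty: ID = log2(D/W + 1), in bits.
    for d in distances:
        for w in widths:
            id_bits = math.log2(d / w + 1)
            print(f"D={d:3d}px  W={w:2d}px  ID={id_bits:.2f} bits")

With these targets the index of difficulty runs from roughly 1.0 bits (D = 80, W = 75) to about 3.0 bits (D = 325, W = 45).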

Engineering Design Centre Results
All participants could undertake trials in all conditions.
We measured:
– Selection time
– Cursor trace: # wrong selections, main movement + homing time, extra distance travelled over target axis length
– Pupil diameter
– TLX scores
– BRS scores

Engineering Design Centre Selection Times

Engineering Design Centre Fitts’ Law
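The chart for this slide is not reproduced in the transcript; for reference, the regression usually behind such a plot is the Shannon formulation of Fitts' law:

    MT = a + b · log2(D / W + 1)

where MT is the movement (selection) time, D the distance to the target, W its width, and a and b are empirically fitted constants.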

Engineering Design Centre Selection Times - ANOVA (Device × Width × Distance)
Source                       df       F    Sig.    Eta Squared
DEVICE
Error(DEVICE)                96.36
WIDTH
Error(WIDTH)                 52.70
DIST
Error(DIST)                  72.00
DEVICE * WIDTH
Error(DEVICE*WIDTH)
DEVICE * DIST
Error(DEVICE*DIST)
WIDTH * DIST
Error(WIDTH*DIST)
DEVICE * WIDTH * DIST
Error(DEVICE*WIDTH*DIST)
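For readers wanting to reproduce this kind of analysis, the table corresponds to a within-subjects (repeated-measures) ANOVA with Device, Width and Distance as factors. Below is a minimal sketch using statsmodels; the file and column names (selection_times.csv, participant, device, width, distance, selection_time) are hypothetical placeholders, not the study's actual data:

    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # Long-format data: one row per participant x device x width x distance trial.
    df = pd.read_csv("selection_times.csv")

    res = AnovaRM(
        data=df,
        depvar="selection_time",
        subject="participant",
        within=["device", "width", "distance"],
        aggregate_func="mean",   # average repeated trials within each cell
    ).fit()
    print(res)   # F, degrees of freedom and p-value per factor and interaction

AnovaRM does not report eta squared directly; effect sizes like those in the table would have to be derived separately from the sums of squares.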

Engineering Design Centre Selection Times - ANOVA (Point × Select × Width × Distance)
Source                  df       F    Sig.    Eta Squared
POINT
Error(POINT)            31.58
SELECT
Error(SELECT)           48.00
WIDTH
Error(WIDTH)            72.00
DIST
Error(DIST)             49.56
POINT * WIDTH
Error(POINT*WIDTH)      82.77
SELECT * WIDTH
Error(SELECT*WIDTH)     88.56

Engineering Design Centre Interaction Diagrams - Pointing

Engineering Design Centre Interaction Diagrams

Engineering Design Centre Analysing Trajectory (cursor trace annotated with Source, Reached Target, and Click: End of Task)

Engineering Design Centre Main Movement Time

Engineering Design Centre Time Spent near Target
Homing Time = Selection Time – time at which the cursor first reached the target

Engineering Design Centre Selection Times – Device Comparison

Engineering Design Centre Extra Distance Travelled
Extra Distance = Total Distance – Target Axis Length
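Both cursor-trace measures (homing time from the earlier slide and extra distance here) can be computed directly from a timestamped cursor trace. The function below is a minimal sketch under the definitions given on these slides; the array layout and names are assumptions, not the study's analysis code:

    import numpy as np

    def trajectory_metrics(t, xy, source, target_centre, target_radius):
        """Homing time and extra distance for one completed pointing trial.

        t  : (n,) array of timestamps, ending at the click (end of task)
        xy : (n, 2) array of cursor positions sampled over the trial
        """
        # Homing time = selection time - time at which the cursor first reached the target.
        inside = np.linalg.norm(xy - np.asarray(target_centre), axis=1) <= target_radius
        first_inside = np.argmax(inside)   # index of the first sample inside the target
        homing_time = t[-1] - t[first_inside]

        # Extra distance = total path length - target axis length (straight-line distance).
        path_length = np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1))
        axis_length = np.linalg.norm(np.asarray(target_centre) - np.asarray(source))
        extra_distance = path_length - axis_length
        return homing_time, extra_distance

Under the same definitions, the main movement time is simply t[first_inside] - t[0].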

Engineering Design Centre Wrong Selections

Engineering Design Centre Cognitive Load

Engineering Design Centre Spare Mental Capacity

Engineering Design Centre Synopsis
Pointing
– Mouse << Head Movement << Hand Movement << Eye Gaze << HOTAS Joystick
Selection
– Adaptive << Z-Axis << MmET << Voice
Cognitive Load
– Mouse << AdaptedHT << … << Joystick << VoiceET
– Not enough spare mental capacity for VoiceGS, VoiceET and HOTAS Joystick

Engineering Design Centre Pupil Data Analysis

Engineering Design Centre Maximum Pupil Diameter

Engineering Design Centre Uncertainty Principle

Engineering Design Centre Gabor Limit
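For reference, the Gabor limit is the signal-processing form of the uncertainty principle: a signal's spread in time and its spread in frequency cannot both be made arbitrarily small,

    Δt · Δf ≥ 1 / (4π)

so a pupil-diameter change that is sharply localised in time is necessarily spread out in frequency, and vice versa.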

Engineering Design Centre Pupil as a wave signal
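Treating the pupil-diameter trace as a wave signal amounts to looking at its frequency-domain power spectrum. The snippet below is a self-contained sketch on synthetic data (the 60 Hz sampling rate and the signal itself are assumptions for illustration, not the study's pipeline):

    import numpy as np

    fs = 60.0                                           # assumed sampling rate of a low-cost tracker (Hz)
    t = np.arange(0, 30, 1 / fs)                        # 30 s of samples
    pupil = 3.5 + 0.2 * np.sin(2 * np.pi * 0.3 * t)     # synthetic pupil diameter in mm
    pupil = pupil + 0.05 * np.random.randn(t.size)      # measurement noise

    # Remove the mean, then take the one-sided FFT power spectrum.
    centred = pupil - pupil.mean()
    power = np.abs(np.fft.rfft(centred)) ** 2
    freqs = np.fft.rfftfreq(centred.size, d=1 / fs)

Unlike a wavelet transform, this requires no choice of basis wavelet, which is the "FFT over FWT" point made in the closing slide.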

Engineering Design Centre Existing work & Patent
Sudden change in pupil diameter
Methods
– Wavelet Transform
– Linear Discriminant Analysis
– Neural Network
Applications
– Driving simulation
– Aviation
– Map reading
– Eye tracking used as a passive rather than an active input

Engineering Design Centre Power Spectrum
Participant   ρ
P1            0.94
P2            0.93
P3            0.67
P4            0.24
P5            0.73
P6            0.64
P7            0.19
P8            0.51
P9            0.72
P10
P11
P12
P13
Average       0.83

Engineering Design Centre Comparing Correlations with TLX Scores

Engineering Design Centre Power Spectrum vs TLX Scores

Engineering Design Centre Power Spectrum vs Selection Times

Engineering Design Centre Power Spectrum for different IDs

Engineering Design Centre Power Spectrum vs Selection Times

Engineering Design Centre Comparing Effect Sizes (η²)

Engineering Design Centre Advantages over Previous work
FFT over FWT
– No need to choose basis wavelets
– Application-agnostic evaluation
– Works with a low-frequency (and cheaper) eye-gaze tracker
– Validated while eye tracking is used in both active and passive modes
However, distance to target can also affect pupil diameter, which may not indicate a change in cognitive load.