EyeChess: the tutoring game with visual attentive interface
Oleg Špakov
Department of Computer Sciences, University of Tampere, Finland
March 2005, AAFG 2005

Introduction

The number of gaze-aware interfaces is increasing rapidly. Some of them, namely gaze-controlled interfaces, allow the gaze to be used as an input. Most interaction procedures involve the selection of interface objects. The best-known gaze-based selection methods are the following:
- Dwell time
- Gaze gesture
- Blink
However, novice users of such interfaces may not know how to use their gaze to reach the best performance.
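As a concrete illustration of the most common of these methods, the sketch below shows how dwell-time selection is typically implemented: an object is selected once the gaze has rested on it continuously for a fixed threshold. This is a minimal, hypothetical Python sketch, not the actual EyeChess code; the gaze-sample format and the 1.8 s threshold (the value used later in the pilot study) are assumptions.

```python
# Minimal sketch of dwell-time selection (hypothetical, not the EyeChess code).
# A gaze sample is assumed to be (timestamp_s, object_id or None).

DWELL_THRESHOLD_S = 1.8  # threshold used in the pilot study described below

class DwellSelector:
    def __init__(self, threshold_s=DWELL_THRESHOLD_S):
        self.threshold_s = threshold_s
        self.current_object = None   # object the gaze currently rests on
        self.dwell_start = None      # time the gaze entered that object

    def feed(self, timestamp_s, object_id):
        """Feed one gaze sample; return the selected object id or None."""
        if object_id != self.current_object:
            # Gaze moved to another object (or off all objects): restart the timer.
            self.current_object = object_id
            self.dwell_start = timestamp_s if object_id is not None else None
            return None
        if object_id is not None and timestamp_s - self.dwell_start >= self.threshold_s:
            self.dwell_start = timestamp_s  # require a fresh dwell before re-selection
            return object_id
        return None
```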

Introduction (continued)

Some researchers have used simple games to train participants to control their gaze before involving them in tests of a gaze-controlled interface [Gips, 1996] [Nielsen, 1993]. Games can also be used as a tool to evaluate the efficiency of an interaction method. There are no real gaze-based games available on the market yet. Because of the accuracy limitations of eye-tracking devices, gaze-aware interfaces must use objects large enough to compensate for tracking error. Cell-based games seem ideal for this purpose.

Goal

The goal of this project was to develop a PC-based tutorial that assists novices in playing chess endgames, and to carry out a pilot evaluation of the efficiency of the proposed technique. Chess was chosen because it satisfies all the conditions mentioned above. Moreover, the game looks promising for other research purposes (for example, studying mental activity). The game is intended to be used as an endgame tutorial by players with little or no experience in playing chess.

Software design - Feedback

Static visual feedback:
1. What the user is looking at (3D border)
2. Motion target (dark yellow)
3. Motion validity (red/green 3D border)
4. Motion target (light yellow)

Software design - Feedback (continued)

Dynamic visual feedback:
- Animation of the piece movement
Sound feedback:
- Alert sound on an attempt to move to a forbidden square
- Congratulation sound as a reward when the task is completed
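To make the feedback scheme concrete, the sketch below maps board-square states to the visual cues listed on these two slides. It is a hypothetical illustration in Python, not the actual EyeChess code; the state names and the idea of resolving feedback per square are assumptions.

```python
# Hypothetical sketch of how per-square visual feedback could be resolved.
# The states mirror the feedback listed above; all names are assumptions.
from enum import Enum, auto

class SquareState(Enum):
    IDLE = auto()
    GAZED = auto()          # the user is looking at this square
    SELECTED = auto()       # piece chosen to move
    TARGET = auto()         # destination square
    VALID_MOVE = auto()     # move allowed
    INVALID_MOVE = auto()   # move forbidden

FEEDBACK = {
    SquareState.IDLE: "plain square",
    SquareState.GAZED: "3D border",
    SquareState.SELECTED: "dark-yellow fill",
    SquareState.TARGET: "light-yellow fill",
    SquareState.VALID_MOVE: "green 3D border",
    SquareState.INVALID_MOVE: "red 3D border",
}

def render_square(state: SquareState) -> str:
    """Return the visual cue to draw for a square in the given state."""
    return FEEDBACK[state]
```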

Software design - Tutoring

Tutoring is provided by flashing the 3D border of the square that should be selected next, after the player has made two wrong-move attempts. The tutorial does not allow wrong moves to be executed. The flashing border marks either the piece to move or the destination square, depending on whether the player has already found the correct piece. If the player found the correct piece only thanks to the flashing 3D border, the destination square starts flashing only after one more wrong attempt.

Software design - Tutoring (continued)

The flashing appears only when the correct square is in focus (i.e., is being looked at by the player). The algorithm that controls the flashing of the square to be selected is presented in the figure. The game runs in full-screen mode and was implemented as a plug-in for the iComponent application [O. Špakov's home page].
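Since the figure with the hint algorithm is not reproduced in this transcript, the sketch below gives a hypothetical reconstruction of the logic described on these two slides: after two wrong attempts the correct square is hinted, the hinted square is either the piece or the destination depending on whether the correct piece has already been selected, and the flashing is shown only while the player is looking at that square. This is an illustrative Python sketch under those assumptions, not the actual EyeChess algorithm, and all helper names are invented.

```python
# Hypothetical reconstruction of the hint (flashing) logic described above.
# Squares are identified by strings such as "e2"; names are assumptions.

class HintTutor:
    WRONG_ATTEMPTS_BEFORE_HINT = 2

    def __init__(self, correct_piece_square, correct_target_square):
        self.correct_piece = correct_piece_square
        self.correct_target = correct_target_square
        self.wrong_attempts = 0
        self.piece_selected = False   # has the player selected the correct piece?

    def on_wrong_attempt(self):
        """Called whenever the player tries to make a forbidden move."""
        self.wrong_attempts += 1

    def on_piece_selected(self, square):
        """Called when the player selects a piece to move."""
        if square == self.correct_piece:
            self.piece_selected = True
            # A destination hint requires wrong attempts to accumulate again.
            self.wrong_attempts = 0

    def hinted_square(self):
        """Square whose 3D border should flash, or None if no hint is due."""
        if self.wrong_attempts < self.WRONG_ATTEMPTS_BEFORE_HINT:
            return None
        return self.correct_target if self.piece_selected else self.correct_piece

    def should_flash(self, gazed_square):
        """Flash only while the player is actually looking at the hinted square."""
        hint = self.hinted_square()
        return hint is not None and gazed_square == hint
```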

Pilot study

The aim of the pilot study was to determine which gaze-based selection method would be the most suitable for this game. The following selection methods and parameters were tested:
- Dwell time of 1.8 s
- Gesture: the gaze moves from the object to be selected to any off-screen button (one placed in each corner of the monitor) and back to the same object
- Blink with a duration of 350 to 1000 ms
Players and procedure: four subjects were asked to move any 10 pieces of their choice to any position using each selection method. Selection time was measured and the players' opinions were collected during the pilot test.
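For comparison with the dwell-time sketch above, the following is a hypothetical Python sketch of the gesture method as described on this slide: a selection fires when the gaze leaves an object, visits any off-screen corner button, and returns to the same object. The region classification and all names are assumptions, not the EyeChess implementation.

```python
# Hypothetical sketch of gaze-gesture selection as described above.
# Each gaze sample is assumed to be pre-classified as a board object id,
# as "corner" (any of the four off-screen corner buttons), or as None.

class GestureSelector:
    def __init__(self):
        self.pending_object = None   # object the gaze left towards a corner
        self.visited_corner = False

    def feed(self, region):
        """Feed the region under the gaze; return a selected object id or None."""
        if region == "corner":
            # The gaze reached an off-screen corner button.
            self.visited_corner = self.pending_object is not None
            return None
        if region is None:
            return None  # gaze between regions; keep the current state
        # The gaze is on a board object.
        if self.visited_corner and region == self.pending_object:
            # Object -> corner -> same object: selection completed.
            self.pending_object = None
            self.visited_corner = False
            return region
        # Remember this object as the potential start of a gesture.
        self.pending_object = region
        self.visited_corner = False
        return None
```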

Pilot study (continued)

Hardware: iView X RED III eye tracker from SensoMotoric Instruments.
Results (average selection time and player rating):
- Dwell time: 3.3 s, quite acceptable for this task
- Gesture: 2.8 s, less convenient, especially for frequent selection
- Blink: 2.5 s, the worst, very annoying
The first method (dwell time) was chosen. However, the second one (gesture) could still be used, since in a real chess game selections do not occur very often.

Method

Subjects: three females and one male, none of whom had participated in the pilot test, all with little or no experience in playing chess. Two of them had some experience with eye trackers; the other two had none.
Hardware: Tobii 1750 eye tracker from Tobii Technology.
Procedure: after reading the instructions on how to use the tutorial, the subjects were asked to play White and solve 20 endgames, each requiring 3 moves. Before the test started, they were trained in the environment by solving 2 other endgames. Completion time and the number of wrong attempts were measured during the test.

Results

The average task completion time was 71.4 s (σ = 19.3). By move, the share of total time was 78.4% (56 s) for the first move, 13.4% for the second, and 8.1% for the third.
Error rate: 18% of all attempts (σ = 6%); by move: 34% for the first move, 6% for the second, and 1% for the third.

Results (continued)

An incorrect piece was used in 8% of all attempts (σ = 4%); by move: 21% for the first move, 1% for the second, and 1% for the third.
The subjects often selected a wrong piece (23% overall) but then changed their mind and did not move it; by move: 58% for the first move, 5% for the second, and 6% for the third.
It seems that after the first move had been made, the subjects made mistakes or selected wrong pieces only because of calibration drift.

Results (continued)

In almost every tenth task (9%) the subjects attempted a wrong move at least twice, and all of them noticed the hint (the flashing square). Nobody made two or more wrong attempts after the first correct move had been found.
The subjects were tense during the first 7 tasks and analyzed each task carefully, which resulted in long thinking times. Then they relaxed and the number of errors increased (trials 9-13). By the end, all subjects solved the tasks faster than at the beginning. Wrong attempts after the first successful move occurred rarely.

Conclusion

EyeChess is a PC-based gaze-controlled tutorial that assists novices in playing endgames. To make a move, the player first selects a piece and then its destination square. The visual and sound effects provide a good level of interaction. The tutorial provides a hint (a flashing highlight) when the player does not know the correct move. The pilot study showed that dwell time was the preferred gaze-based selection method.

Conclusion (continued)

No significant improvement in gaze interaction was found by the end of the test: both groups of subjects showed similar trial completion times at the beginning and at the end of the test. The main study showed that the tutorial was helpful in guiding the decision-making process. However, the tasks were too easy and the EyeChess tutoring support was rarely used. Overall, the game is eye-tracker-friendly and thus easy to play, as the pieces and squares on the chessboard are large enough to accommodate calibration drift.

Future work

The project will be continued to support both novices and experts. The author also plans to extend the game into a full game against an opponent: either the computer or another player over the network.

References

Gips, J., DiMattia, P., Curran, F.X. and Olivieri, P., Using EagleEyes – an electrodes-based device for controlling the computer with your eyes – to help people with special needs, in J. Klaus, E. Auff, W. Kremser & W. Zagler (eds.), Interdisciplinary Aspects on Computers Helping People with Special Needs, Oldenbourg, 1996.
Nielsen, J., Noncommand User Interfaces, Communications of the ACM, 36(4), 1993.
O. Špakov's home page: iComponent application, at