Next-Generation Portable Eye Tracking

Susan M. Munn* and Jeff B. Pelz
Multidisciplinary Vision Research Laboratory
Chester F. Carlson Center for Imaging Science
Rochester Institute of Technology
*susan.m.munn@gmail.com

Abstract

Portable eye tracking allows us to monitor and analyze a person's eye movements as he or she moves around and performs natural tasks. Video-based portable eye trackers include two cameras: an eye camera that records a video of the person's eye, and a scene camera that records a video of the central portion of the person's field of view. These videos are typically processed to determine where the person is looking, i.e., his or her point-of-regard (POR), in each frame of the scene video. Because portable eye tracking is fairly new to the eye tracking field, some current algorithms and techniques for processing eye tracking data may not be suitable for use with portable eye trackers. For instance, the typical method of determining how the eye has moved, based on the movements of its features through the eye video, is very sensitive to movements of the eye camera with respect to the eye; this problem is most severe when the infrared-emitting diode (IRED), which illuminates the eye and creates a corneal reflection (CR), is close to the eye, as is the case with portable eye trackers. In addition, algorithms that process POR data to identify fixations (periods of time when the subject is looking at the same location) expect these data to be defined in static two-dimensional scene images. We developed a new technique to determine movements of the eye more robustly in the presence of movements of the eye camera with respect to the eye; this technique also reduces noise in the final eye movement data. Additionally, a fixation-identification algorithm has been implemented and tested by comparing its results to those produced by human coders. These algorithms will greatly impact the field of portable eye tracking and allow us to do much more with our eye tracking data. We present one example application that uses these algorithms to obtain three-dimensional locations of fixations.

1.0 Equipment: The RIT Wearable Eye Tracker [1]

The headgear carries a scene camera (SCENE CAM), which records the central portion of the wearer's field of view; an eye camera (EYE CAM), which records the eye; and an IRED, which illuminates the eye and creates the corneal reflection. Point-of-regard (POR): the point in the scene where the subject is looking.

[Figure: the RIT Wearable Eye Tracker, with the scene camera, eye camera, and IRED labeled.]

2.0 Compensating for headgear motion [2,3]

PROBLEM: Headgear movements, which result in movements of the eye camera with respect to the eye, can be mistaken for eye movements.

[Figure: effect of headgear movement on the tracked signals vs. the actual movement of the eye. Key: headgear moved; eye moved.]

SOLUTION: We track the centers of the PUPIL and CR to determine how the eye is moving. Because the IRED is mounted on the headgear, the CR track provides a signal for headgear (eye-camera) motion that can be separated from the pupil track to recover the true eye movement.

NOISE REDUCTION: The headgear moves slowly, which means that the Headgear array can be smoothed, resulting in a final Eye array that contains about as much noise as the robust array of Pupil positions (Pupil).
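As a rough illustration of the compensation and smoothing steps, the sketch below treats the CR track as the headgear-motion signal, smooths it, and subtracts it from the pupil track. The poster's implementation was in MATLAB; this Python version, the array names, the moving-average smoother, and the window length are all assumptions, not the authors' code.

```python
import numpy as np

def compensate_headgear(pupil, cr, window=15):
    """Sketch: remove headgear (eye-camera) motion from the pupil track.

    pupil, cr : (N, 2) arrays of per-frame pupil and corneal-reflection
                centers from the eye video.
    window    : smoothing window in frames (assumed value); smoothing is
                valid because the headgear drifts slowly compared to
                eye movements.
    """
    # The CR is created by the IRED on the headgear, so its track serves
    # as the Headgear array.
    kernel = np.ones(window) / window
    headgear = np.column_stack(
        [np.convolve(cr[:, i], kernel, mode="same") for i in range(2)]
    )
    # Subtracting the smoothed Headgear array leaves an Eye array with
    # roughly the noise level of the robust Pupil array alone.
    return pupil - headgear
```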
3.0 Identification of fixations in dynamic scenes [4]

GOAL: Determine whether a velocity-based fixation-identification algorithm can be applied to eye tracking data in dynamic scenes.

Comparison of algorithm results to three human coders: two subjects performed two different tasks, and the start and end of each fixation was coded by the algorithm and by three experienced human coders.

Task 1: Watch a computer animation.
Task 2: Walk through a hallway (find a specific room number).

[Figure: fixation signal produced by each human and by the algorithm, from Fixation 1 (F1) through Fixation 7 (F7). Time runs along the horizontal axis, and each strip continues from the end of the strip above. The top four rows of each strip show HUMAN 1, HUMAN 2, HUMAN 3, and ALGORITHM, marking each frame as in fixation or between fixations; agreement among all three humans distinguishes frames within one single fixation from breaks between two fixations. The bottom row ("ERRORS") marks frames where the algorithm disagrees with all three humans; elsewhere all agree.]

TIME TO CODE VIDEOS

Coded by     Task 1         Task 2
Human 1      50 minutes*    60 minutes
Human 2      70 minutes     90 minutes
Human 3      105 minutes    90 minutes
Algorithm†   11.8 seconds   5.9 seconds

*Error made (later corrected)
†Implemented in MATLAB on an Apple 2.16 GHz MacBook
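The poster does not spell out the classifier's internals, but a velocity-based fixation identifier generally works as in the sketch below: per-frame POR speed is compared against a threshold, and runs of slow samples that last long enough become fixations. The threshold, minimum duration, and function names here are illustrative assumptions, not the algorithm of [4].

```python
import numpy as np

def identify_fixations(por, fps=30.0, vel_thresh=100.0, min_dur=0.1):
    """Sketch: velocity-threshold fixation identification.

    por        : (N, 2) array of per-frame POR coordinates.
    fps        : frame rate of the scene video.
    vel_thresh : speed threshold in POR units per second (assumed value).
    min_dur    : minimum fixation duration in seconds (assumed value).

    Returns a list of (start_frame, end_frame) pairs.
    """
    # Per-frame POR speed.
    speed = np.linalg.norm(np.diff(por, axis=0), axis=1) * fps
    slow = speed < vel_thresh  # True where the POR is nearly stationary
    fixations, start = [], None
    for i, is_slow in enumerate(slow):
        if is_slow and start is None:
            start = i                        # a fixation begins
        elif not is_slow and start is not None:
            if (i - start) / fps >= min_dur:
                fixations.append((start, i))
            start = None                     # a saccade begins
    if start is not None and (len(slow) - start) / fps >= min_dur:
        fixations.append((start, len(slow)))
    return fixations
```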


4.0 Application: 3D POR, position and head orientation for each fixation [5]

Once fixations are identified, each one can be assigned a three-dimensional POR together with the wearer's position and head orientation. Seven vertices of a cubic structure were fixated, and the corresponding PORs were reconstructed and are shown (connected) in the figure, from Fixation 1 (F1) to Fixation 7 (F7).

[Figure: reconstructed 3D PORs at the seven fixated cube vertices, connected in fixation order.]
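One way to picture the 3D reconstruction: a point fixated from two different head positions defines two gaze rays, and the 3D POR can be estimated as the point closest to both. The midpoint-of-closest-approach routine below is a generic triangulation sketch under assumed inputs, not the calibrated monocular pipeline of [5].

```python
import numpy as np

def triangulate_por(o1, d1, o2, d2, eps=1e-9):
    """Sketch: midpoint triangulation of two gaze rays.

    o1, o2 : 3-vectors, ray origins (e.g. eye/scene-camera positions at
             two moments while the same point is fixated).
    d1, d2 : 3-vectors, unit gaze directions toward the fixated point.

    Returns the 3D point midway between the rays' closest points,
    or None if the rays are nearly parallel.
    """
    # Minimize |(o1 + t1*d1) - (o2 + t2*d2)| over the ray parameters.
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b
    if abs(denom) < eps:
        return None
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return (o1 + t1 * d1 + o2 + t2 * d2) / 2.0
```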
References

[1] Babcock, J. S. and J. B. Pelz. Building a lightweight eyetracking headgear. In ETRA 2004: Proceedings of the Eye Tracking Research & Applications Symposium, pp. 109-114, ACM Press.
[2] Kolakowski, S. M. and J. B. Pelz. Compensating for eye tracker camera movement. In ETRA 2006: Proceedings of the Eye Tracking Research & Applications Symposium, pp. 79-85, ACM Press.
[3] Li, F., S. M. Munn and J. B. Pelz. A model-based approach to video-based eye tracking. Journal of Modern Optics, Vol. 55, Nos. 4-5, pp. 503-531, 2008.
[4] Munn, S. M., L. Stefano and J. B. Pelz. Fixation identification in dynamic scenes: Comparison of an algorithm to human coders. APGV 2008: Symposium on Applied Perception in Graphics and Visualization, submitted.
[5] Munn, S. M. and J. B. Pelz. 3D point-of-regard, position and head orientation from a portable monocular video-based eye tracker. In ETRA 2008: Proceedings of the Eye Tracking Research and Applications Symposium, pp. 181-188, ACM Press.