Eye Trackers and Interactive Interface Design
Wen-Hung Liao (廖文宏), 6/26/2009


Outline
- Introduction
- Mobile eye tracker construction
- Head-mounted eye tracker
- Remote eye tracker
- Head movement compensation
- Human-computer interface (HCI) applications
- Conclusions

Introduction
An eye tracker is a device for measuring eye positions and eye movements. The most popular variant uses video images from which the eye position is extracted. Input source: visible spectrum vs. infrared.

Eye Tracking at NCCU CS: Past
- Scan path analysis using high-speed iView X data.
- Head-mounted eye tracker for eye scrolling, eye gaming, and eye typing.
(Figure: eye camera and scene camera.)

Eye Tracking at NCCU CS: Present
- Improve the pupil detection algorithm to alleviate the corneal reflection problem.
- Enhance accuracy by compensating for head movement.
- Construct and test a remote eye tracker.
- More HCI applications using the remote eye tracker.
- Use the eye tracking device to assist mobile user interface design.

System Architecture
Eye image → preprocessing → pupil detection → calibration (using 9 pairs of points) → gaze point projected onto the scene image.
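
The calibration stage maps pupil coordinates to scene (screen) coordinates from the 9 calibration pairs. A minimal sketch of one common way to do this, a second-order polynomial mapping solved by least squares, follows; the function names and the polynomial form are illustrative assumptions, not taken from the original system.

```python
import numpy as np

def fit_calibration(pupil_pts, scene_pts):
    """Fit a second-order polynomial map from pupil coordinates to scene
    coordinates using the 9 calibration pairs (least squares)."""
    P = np.asarray(pupil_pts, float)   # (9, 2) detected pupil centers
    S = np.asarray(scene_pts, float)   # (9, 2) known on-screen targets
    x, y = P[:, 0], P[:, 1]
    # Design matrix: [1, x, y, x*y, x^2, y^2]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coef, *_ = np.linalg.lstsq(A, S, rcond=None)   # (6, 2) coefficients
    return coef

def project_gaze(coef, pupil_pt):
    """Map a new pupil position to a gaze point in the scene image."""
    x, y = pupil_pt
    return np.array([1.0, x, y, x * y, x**2, y**2]) @ coef
```

With 9 well-spread points, the 6-term polynomial per axis is overdetermined, which makes the fit somewhat robust to small pupil-detection noise.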

Pupil Detection
- Step 1: Feature detection. Edge detection with constraints on neighboring dark points; edges created by bright spots are excluded.
- Step 2: Noise removal. Apply erosion.
- Step 3: Fit an ellipse using singular value decomposition.
(Illustrated on the slide for both the head-mounted and the remote eye tracker.)
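
Steps 2 and 3 can be sketched in NumPy. This is an illustrative reconstruction, not the thesis code: erosion here uses a 3x3 cross structuring element, and the ellipse is fit algebraically by taking the conic coefficients from the right singular vector for the smallest singular value of the design matrix, matching the slide's "fitting the ellipse using singular value decomposition".

```python
import numpy as np

def erode(mask, iterations=1):
    """Step 2 (noise removal): binary erosion with a 3x3 cross element."""
    for _ in range(iterations):
        m = np.pad(mask, 1)
        mask = (m[1:-1, 1:-1] & m[:-2, 1:-1] & m[2:, 1:-1]
                & m[1:-1, :-2] & m[1:-1, 2:])
    return mask

def fit_ellipse(xs, ys):
    """Step 3: fit the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0.
    The coefficient vector is the right singular vector associated with
    the smallest singular value of the design matrix."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    D = np.column_stack([xs**2, xs * ys, ys**2, xs, ys, np.ones_like(xs)])
    return np.linalg.svd(D)[2][-1]

def ellipse_center(coef):
    """Recover the ellipse (pupil) center from the conic coefficients."""
    a, b, c, d, e, _ = coef
    return np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])
```

The center formula comes from setting the conic's gradient to zero, and it is invariant to the arbitrary scale and sign of the singular vector.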

Allowing Head Movement: Head-Mounted Eye Tracker
- Use red markers on the LCD screen as references for the head movement calculation.
- Required information:
  - Ratio between the marker distances in the projected image and in reality: (dyWidthScale, dyHeightScale)
  - Head movement: (difX, difY)
- Correction made: (equation rendered as an image in the original slide)
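
The correction formula itself is not reproduced in the transcript, so the following is a hypothetical reconstruction using the slide's variable names: shift the projected gaze point opposite to the measured head movement (difX, difY), scaled to screen pixels via the marker-distance ratios (dyWidthScale, dyHeightScale). The sign convention is an assumption.

```python
def compensate_gaze(gaze_x, gaze_y, difX, difY, dyWidthScale, dyHeightScale):
    """Hypothetical head-movement correction for the head-mounted tracker:
    subtract the head displacement, converted to screen pixels using the
    marker-distance ratios measured on the LCD screen."""
    return (gaze_x - difX * dyWidthScale,
            gaze_y - difY * dyHeightScale)
```

For example, a head shift of 2 marker units rightward with a scale of 10 pixels per unit moves the corrected gaze point 20 pixels left.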

Experimental Results (errors in pixels; 1 cm = 38 pixels)
Calibration point | Original error | With compensation: error (std. dev.)
1 | 98.17 | 32.91 (12.7)
2 | 137.54 | 55.89 (10.94)
3 | 108.20 | 29.69 (9.57)
4 | 76.30 | 28.79 (21.19)
5 | 113.30 | 22.49 (7.79)
6 | 117.86 | 34.83 (10.96)
7 | 116.53 | 33.31 (8.01)
8 | 112.56 | 16.03 (6.78)
9 | 146.73 | 26.49 (9.66)

Allowing Head Movement: Remote Eye Tracker
- Use markers placed on the glasses as references.
- Requires an additional calibration step: fixate on the same spot, then turn the head up, down, left, and right.
- Record the shift of the markers, (markXi, markYi), and the movement of the pupil, (pupilXi, pupilYi).
- Correction made: (equation rendered as an image in the original slide)
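
The extra calibration step can be reduced to a per-axis gain: while the user fixates one spot, each recorded marker shift (markXi, markYi) is paired with the induced pupil shift (pupilXi, pupilYi), and the ratio between them is used to cancel head motion at run time. This is a hedged sketch; the function names and the linear per-axis model are assumptions, not the thesis formula.

```python
import numpy as np

def estimate_gains(mark_shifts, pupil_shifts):
    """Per-axis least-squares gain g so that pupil_shift ~= g * mark_shift,
    estimated from the extra calibration step (fixate, turn the head)."""
    m = np.asarray(mark_shifts, float)
    p = np.asarray(pupil_shifts, float)
    return (m * p).sum(axis=0) / (m * m).sum(axis=0)   # (g_x, g_y)

def correct_pupil(pupil, mark_shift, gains):
    """Remove the head-motion component from the raw pupil position
    before it is mapped to a gaze point."""
    return np.asarray(pupil, float) - gains * np.asarray(mark_shift, float)
```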

Experimental Results (remote eye tracker)
Calibration point | Original error | With compensation: error (std. dev.)
1 | 238.73 | 39.69 (24.6)
2 | 227.51 | 63.56 (22.82)
3 | 132.97 | 44.30 (21.49)
4 | 222.97 | 51.66 (30.94)
5 | 306.58 | 31.15 (16.71)
6 | 280.79 | 51.66 (21.83)
7 | 311.89 | 69.03 (41.57)
8 | 344.24 | 69.88 (31.71)
9 | 347.03 | 64.66 (24.41)

Demo: Head Mounted Eye Tracker

Demo: Web Browsing

Demo: Photo Viewing

Demo: Dynamic Scene

Demo: Remote Eye Tracker

Demo: Tic-Tac-Toe

Summary
- Enhance the reliability of pupil detection.
- Improve the accuracy by compensating for head movements.
- Promising results for both head-mounted and remote eye trackers.
- Interactive HCI applications.

Eye Tracking at NCCU CS: Future
- Faster, more accurate eye tracking.
- More intuitive calibration process.
- Developing gaze-based digital interactive media.

http://www.cs.nccu.edu.tw/~whliao/dct/