1 Eye Tracker and Interactive Interface Design (眼動儀與互動介面設計) 廖文宏 6/26/2009

2 Outline
Introduction
Mobile eye tracker construction
Head-mounted eye tracker
Remote eye tracker
Head movement compensation
Human computer interface (HCI) applications
Conclusions

3 Introduction
An eye tracker is a device for measuring eye positions and eye movements.
The most popular variant extracts the eye position from video images.
Input source: visible spectrum vs. infrared.

4 Eye Tracking at NCCU CS: Past
Scan path analysis using high-speed iView X data.
Head-mounted eye tracker for eye scrolling, eye gaming, and eye typing.
[Photo: head-mounted rig with a scene camera and an eye camera]

5 Eye Tracking at NCCU CS: Present
Improve the pupil detection algorithm to alleviate the corneal reflection problem.
Enhance the accuracy by compensating for head movement.
Construct and test a remote eye tracker.
More HCI applications using the remote eye tracker.
Use the eye tracking device to assist mobile user interface design.

6 System Architecture
[Block diagram] Eye image → Preprocessing → Pupil detection → Calibration → Gaze point projection → Scene image; the calibration process uses 9 pairs of points.
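The slides do not spell out the mapping used in the gaze point projection step. As a minimal sketch, assuming the 9 calibration pairs (pupil position, scene-image position) are used to fit a homography with OpenCV, the projection could look like this; the function names calibrate and project_gaze are illustrative, not from the original system:

    import cv2
    import numpy as np

    def calibrate(pupil_pts, scene_pts):
        # Fit a pupil-to-scene mapping from the 9 calibration pairs.
        H, _ = cv2.findHomography(np.float32(pupil_pts), np.float32(scene_pts))
        return H

    def project_gaze(H, pupil_xy):
        # Map the current pupil centre into scene-image coordinates.
        p = np.float32([[pupil_xy]])  # shape (1, 1, 2) expected by perspectiveTransform
        return cv2.perspectiveTransform(p, H)[0, 0]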

7 Pupil Detection
Step 1: Feature detection. Edge detection with constraints on neighboring dark points; this excludes edges created by bright spots.
Step 2: Noise removal. Apply erosion.
Step 3: Fit an ellipse to the remaining points using singular value decomposition.
[Example images: head-mounted eye tracker and remote eye tracker]
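A minimal sketch of the three steps with OpenCV and NumPy is shown below. The darkness, brightness, and edge thresholds are assumptions, and the SVD-based fit is the standard direct least-squares conic fit rather than the exact implementation used in the system:

    import cv2
    import numpy as np

    def detect_pupil(eye_gray, dark_thresh=60, bright_thresh=200):
        # Step 1: feature detection. Keep Canny edge pixels that neighbor dark
        # pixels (candidate pupil boundary) but not very bright pixels, which
        # excludes edges created by corneal reflections and other bright spots.
        edges = cv2.Canny(eye_gray, 40, 80)
        dark = cv2.inRange(eye_gray, 0, dark_thresh)
        bright = cv2.inRange(eye_gray, bright_thresh, 255)
        pts = cv2.bitwise_and(edges, cv2.dilate(dark, np.ones((5, 5), np.uint8)))
        pts = cv2.bitwise_and(pts, cv2.bitwise_not(
            cv2.dilate(bright, np.ones((5, 5), np.uint8))))

        # Step 2: noise removal. Erosion of the dark mask drops isolated dark
        # specks; edge points not supported by the eroded mask are discarded.
        support = cv2.dilate(cv2.erode(dark, np.ones((5, 5), np.uint8)),
                             np.ones((9, 9), np.uint8))
        pts = cv2.bitwise_and(pts, support)

        # Step 3: fit an ellipse (general conic) to the surviving edge points
        # by least squares, solved with singular value decomposition.
        ys, xs = np.nonzero(pts)
        if xs.size < 6:
            return None
        x, y = xs.astype(float), ys.astype(float)
        D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
        _, _, vt = np.linalg.svd(D, full_matrices=False)
        return vt[-1]  # conic coefficients (a, b, c, d, e, f), up to scale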

8 Allowing Head Movement: Head-Mounted Eye Tracker
Use red markers on the LCD screen as references for the head movement calculation.
Required information:
Ratio between the distance of the markers in the projected image and on the real screen: (dyWidthScale, dyHeightScale)
Head movements: (difX, difY)
Correction made:
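The correction formula itself appears only as an image on the original slide and is not reproduced in this transcript. The sketch below shows one plausible reading using the quantities named above; the sign convention and the exact scaling are assumptions:

    def compensate_head_mounted(gaze_x, gaze_y, difX, difY,
                                dyWidthScale, dyHeightScale):
        # Hypothetical correction: shift the projected gaze point by the
        # measured head movement, scaled from marker units to screen pixels.
        return (gaze_x + difX * dyWidthScale,
                gaze_y + difY * dyHeightScale)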

9 Experimental Results
Calibration point   Original error   Compensated for head movement: error (std. dev.)
1                    98.17           (12.7)
2                   137.54           55.89 (10.94)
3                   108.20           29.69 (9.57)
4                    76.30           28.79 (21.19)
5                   113.30           22.49 (7.79)
6                   117.86           34.83 (10.96)
7                   116.53           33.31 (8.01)
8                   112.56           16.03 (6.78)
9                   146.73           26.49 (9.66)
Errors are in pixels; 1 cm = 38 pixels.
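At 38 pixels per cm, the compensated errors of roughly 16 to 56 pixels correspond to about 0.4 to 1.5 cm on screen (for example, 55.89 px / 38 px per cm ≈ 1.47 cm), compared with roughly 2 to 4 cm before compensation.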

10 Allowing Head Movement: Remote Eye Tracker
Use markers placed on the glasses as references.
Requires an additional calibration step:
Fixate on the same spot, and turn the head up, down, left, and right.
Record the shift amounts of the markers, (markXi, markYi), and the movements of the pupil, (pupilXi, pupilYi).
Correction made:
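As with the head-mounted tracker, the correction formula is not reproduced in this transcript. One plausible reading is that the extra calibration data are used to estimate how far the pupil moves per unit of marker shift, and that this predicted displacement is removed before the gaze projection; the sketch below makes that assumption explicit, with illustrative function names and a simple linear model:

    import numpy as np

    def fit_head_gains(markX, markY, pupilX, pupilY):
        # Least-squares slope of pupil movement versus marker shift in each
        # axis, estimated from the extra calibration recordings.
        gx = np.polyfit(markX, pupilX, 1)[0]
        gy = np.polyfit(markY, pupilY, 1)[0]
        return gx, gy

    def compensate_remote(pupil_x, pupil_y, mark_dx, mark_dy, gx, gy):
        # Remove the pupil displacement explained by the current marker shift
        # before the calibrated gaze projection is applied.
        return pupil_x - gx * mark_dx, pupil_y - gy * mark_dy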

11 Experimental Results
Calibration point   Original error   Compensated for head movement: error (std. dev.)
1                   238.73           39.69 (24.6)
2                   227.51           63.56 (22.82)
3                   132.97           44.30 (21.49)
4                   222.97           51.66 (30.94)
5                   306.58           31.15 (16.71)
6                   280.79           51.66 (21.83)
7                   311.89           69.03 (41.57)
8                   344.24           69.88 (31.71)
9                   347.03           64.66 (24.41)

12 Demo: Head Mounted Eye Tracker

13 Demo: Web Browsing

14 Demo: Photo Viewing

15 Demo: Dynamic Scene

16 Demo: Remote Eye Tracker

17 Demo: Tic-Tac-Toe

18 Summary
Enhanced the reliability of pupil detection.
Improved accuracy by compensating for head movements.
Promising results for both the head-mounted and remote eye trackers.
Interactive HCI applications.

19 Eye Tracking at NCCU CS: Future
Faster, more accurate eye tracking.
More intuitive calibration process.
Developing gaze-based digital interactive media.

20

