Measuring Gaze Depth with an Eye Tracker During Stereoscopic Display


1 Measuring Gaze Depth with an Eye Tracker During Stereoscopic Display
A.T.Duchowski, B.Pelfrey, D.H.House, R.Wang School of Computing, Clemson University

2 Introduction Stereo images have appeared in a variety of forms since their introduction by Wheatstone (1838). How can gaze depth be captured by an eye tracker over a stereo display such as Wheatstone's? Figure 1. Wheatstone's mirror stereoscope. The head is brought up to mirrors A' and A, and pictures E' and E are viewed stereoscopically (Wheatstone, 1838).

3 Motivation Figure 2. Layered illustration of a horse's hoof showing the outer layer and coffin bone. The idea is to measure the amount of switching as gaze depth changes across layered stereo displays, then develop effective means of visualization to facilitate disambiguation of layers (e.g., via grids).

4 Background Figure 3. Horizontal coordinates of left and right gaze points. Early informal measurements showed promise in simply using horizontal depth disparity. Daugherty et al. (2010) used a similar idea to measure the vergence response to anaglyphic stereo.

5 Background Other work:
Holliman et al. (2007) studied depth perception on desktop 3D displays, but did not measure eye movements
Essig et al. (2004) measured gaze atop random-dot stereograms
Kwon and Shul (2006) measured interocular distance while rendering a stereo image at five different depths
Here we report observations on the vergence response when viewing stereo at different depths vs. monoscopic rendering

6 Methodology Apparatus
Figure 4. Custom-built, high-resolution Wheatstone-style stereoscope.
two IBM T221 "Big Bertha" LCD displays
LC Technologies' Eyegaze system (60 Hz)

7 Methodology Figure 5. 5×5 grid of cubes rendered with the closest row of cubes 30 cm in front of the screen and each of the four remaining rows 12 cm farther from the viewer. Stimulus: the grid rotates about its vertical axis, inducing motion parallax; the screen plane is aligned so that three rows lie in front of the screen and two rows behind it.

8 Methodology Procedure & participants
Figure 6. Calibration result image; random-dot stereogram pair used for pre-screening.
2D calibration performed until 1.3 cm accuracy was achieved
random-dot stereogram used to pre-screen depth perception
20 participants (18 M, 2 F; ages 16–34)
task was to fixate an individually rotating cube

9 Methodology Experimental design
within-subjects, 2 (display) × 2 (motion parallax) × 5 (depth)
familiarity and fatigue effects were mitigated by counterbalancing the 2 × 2 stereo and motion combinations, and by alternating depth order
depth order varied so that all even-numbered participants started with the cube target behind the screen, progressing to the front; the reverse for odd-numbered participants
Pilot study used to fine-tune data processing tools (filters, fitting)

10 Pilot study Same design as the main experiment, but:
fewer subjects (essentially the paper's authors)
test the effect of stereo (if negative, cancel the study!)
double-check file naming scheme, Python scripts, etc.
Figure 7. From left to right: Dixon Cleveland (LC Tech president), coauthors AD, BP, DH.

11 Pilot study
Figure 8. Raw gaze depth with outliers beyond 2 SD removed.
Outcomes:
definite observable gaze-depth response to stereo
need for filtering
need for 3D calibration

12 Pilot study: filtering
Figure 9. Gaze depth filtered with a 6th-order Butterworth filter.
Cascading three 2nd-order filters gave good results with the cutoff frequency set to 0.15 Hz
Problems: lag (1.15 s) and initial conditions
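The cascade of three 2nd-order sections can be sketched with SciPy's second-order-sections filter form; a minimal sketch, assuming a 60 Hz sampling rate (the tracker's rate from the Methodology slide) and the 0.15 Hz cutoff stated above. The function name `filter_gaze_depth` is illustrative, not the authors' code.

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 60.0   # eye tracker sampling rate (Hz)
FC = 0.15   # low-pass cutoff frequency (Hz), as on the slide

# 6th-order Butterworth low-pass expressed as three cascaded
# 2nd-order sections (SOS form is numerically well-behaved)
sos = butter(6, FC, btype='low', fs=FS, output='sos')

def filter_gaze_depth(depth):
    """Low-pass filter a raw gaze-depth signal (1-D array of samples).

    Note: causal filtering introduces lag and a start-up transient,
    matching the problems noted on the slide.
    """
    return sosfilt(sos, np.asarray(depth, dtype=float))
```

The start-up transient comes from the filter's zero initial conditions; seeding the filter state (e.g., with `sosfilt_zi`) is one common mitigation.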

13 Pilot study: 3D calibration
Figure 10. Filtered and either shifted (left) or fit (right) gaze depth.
Monoscopic data filtered and shifted (mean set to 0)
Stereo data scaled and shifted via least-squares minimization (over all depth targets)
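The shift and scale-and-shift steps can be sketched as follows; a minimal sketch, assuming per-sample arrays of measured gaze depth and known target depth. The function name and signature are illustrative, not the authors' implementation.

```python
import numpy as np

def calibrate_depth(measured, target=None, stereo=True):
    """Depth calibration as described on the slide.

    Monoscopic: shift only, so the mean is set to 0.
    Stereo: find scale a and shift b minimizing
            ||a * measured + b - target||^2 over all depth targets.
    """
    measured = np.asarray(measured, dtype=float)
    if not stereo:
        return measured - measured.mean()
    A = np.column_stack([measured, np.ones_like(measured)])
    (a, b), *_ = np.linalg.lstsq(A, np.asarray(target, dtype=float), rcond=None)
    return a * measured + b
```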

14 Results RMS error of gaze depth computed for each of the 15 targets
Shifted: monoscopic display elicits no response
Fit: stereo elicits closer depth agreement
Figure 11. Root mean square error of gaze depth under the four viewing conditions.
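The per-target error metric above is a plain root mean square; a one-function sketch (names are illustrative):

```python
import numpy as np

def rms_error(gaze_depth, target_depth):
    """RMS error of calibrated gaze-depth samples against one target's depth."""
    gaze_depth = np.asarray(gaze_depth, dtype=float)
    return float(np.sqrt(np.mean((gaze_depth - target_depth) ** 2)))
```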

15 Results Shifted data, within-subjects three-way ANOVA:
significant effect of depth (F(4,76)=59.50, p<0.01)
no significant effect of display (F(1,19)=3.29, p=0.09)
no significant effect of motion parallax (F(1,19)=1.21, p=0.29)
Fit data, within-subjects three-way ANOVA:
significant effect of depth (F(4,76)=323.29, p<0.01)
significant effect of display (F(1,19)=126.00, p<0.01)
no significant effect of motion parallax (F(1,19)=3.12, p=0.09)

16 Results The interaction between depth and display is significant (F(4,76)=83.77, p<0.01). Gaze depth error increases with target depth. Figure 12. Root mean square error of gaze depth over the five viewing depth intervals.

17 Results Subjective data, pairwise t-tests:
significant difference between perception of static monoscopic and rocking stereo (p<0.01)
Figure 13. Mean responses to each of the four viewing conditions (7-point Likert scale of agreement).

18 Discussion Eye vergence movements clearly respond to and match the depth component of a 3D stereo target. Noise may be due to the eye tracking equipment. The Wheatstone setup requires splitting the binocular eye tracking optics; most modern eye trackers are binocular (we are currently observing similar effects).

19 Conclusion Our work documents the gaze depth response to stereoscopic manipulation of target depths. Currently gaze depth is measured by the eye tracker vendor's proprietary algorithm (we now have our own computation based on horizontal gaze disparity). The combined use of the Butterworth filter with least-squares fitting is an effective means of depth calibration (we now have this running in real time).
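A depth-from-disparity computation of the kind mentioned above can be sketched by triangulating the two eyes' gaze rays through their on-screen horizontal gaze points; a minimal sketch, not the authors' algorithm. The interocular distance and screen distance values are illustrative assumptions.

```python
def gaze_depth_from_disparity(x_left, x_right, ipd=6.3, screen_dist=60.0):
    """Triangulated gaze depth (cm from the eyes) from horizontal
    on-screen gaze coordinates of the left and right eye (cm).

    With eyes at (-ipd/2, 0) and (+ipd/2, 0) and the screen at
    z = screen_dist, the gaze rays intersect at
        z = ipd * screen_dist / (ipd - disparity),
    where disparity = x_right - x_left. Zero disparity puts the
    intersection on the screen; crossed disparity (negative) puts
    it in front of the screen.
    """
    disparity = x_right - x_left
    return ipd * screen_dist / (ipd - disparity)
```

In practice the raw disparity signal is noisy, which is why the slides pair this kind of computation with filtering and calibration.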

20 Q & A Thank you! Questions?




