EyeLink 1000/2k and EyeLink Remote Introduction and Training


EyeLink 1000/2k and EyeLink Remote Introduction and Training
SR Research Ltd., Toronto / Ottawa, Canada

Agenda
-Video-Based Eye Tracking
-Basic Terminology: pupil image, corneal reflection
-Basic Terminology: calibration and validation, spatial and temporal resolution, accuracy
-The EyeLink CL Platform: overview, camera mount types, EyeLink Remote mode
-EyeLink Components
-Practical Issues: the Set Options screen, camera setup and recording

Video-Based Eye Tracking
infrared light and camera
-minimal interference with the visual stimulus
high-speed video image analysis
-dark pupil based
-corneal reflection based
-determine pupil and CR centers
calibration yields predictive model
-sample the camera image while the subject fixates different known locations
-induce a predictive model
validation evaluates the model

Calibration
Mapping raw eye data / camera image data to predict gaze position
-calibration samples from raw camera image space with pupil position "D" are overlaid on top of a view of what the subject sees
-the target location gives an idea of where in the raw camera space the subject should be looking
-calibration targets, as they are acquired, give insight into the adequacy of the sampled positions and the progress of the calibration
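The mapping idea can be illustrated with a toy example. The sketch below is not the EyeLink Host's actual fitting routine, and every number in it is hypothetical; it simply fits a quadratic least-squares mapping from raw camera-space pupil coordinates to the known screen positions of nine calibration targets, then predicts gaze for a new raw sample.

```python
import numpy as np

# Raw pupil coordinates recorded while the subject fixated each of the
# nine calibration targets (hypothetical values, in camera units).
raw = np.array([[-120, -80], [0, -82], [118, -79],
                [-121,   2], [1,   0], [119,   3],
                [-119,  81], [0,  83], [120,  80]], dtype=float)

# Known screen positions of the nine calibration targets (pixels).
targets = np.array([[160, 120], [512, 120], [864, 120],
                    [160, 384], [512, 384], [864, 384],
                    [160, 648], [512, 648], [864, 648]], dtype=float)

def design(p):
    """Design matrix for a simple quadratic mapping: [1, x, y, xy, x^2, y^2]."""
    x, y = p[:, 0], p[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

# Least-squares fit of the predictive model from raw space to screen space.
coeffs, *_ = np.linalg.lstsq(design(raw), targets, rcond=None)

# Predict gaze position for a new raw sample.
new_raw = np.array([[60.0, -40.0]])
gaze_xy = design(new_raw) @ coeffs
print(gaze_xy)  # predicted (x, y) on the display, in pixels
```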

Calibration Grid gives an idea of the quality of the underlying predictive model
(figure: Good Calibration vs. Poor Calibration grids)

Video-Based Eye Tracking: Validation
-evaluates the predictive model's test-retest accuracy
-the subject re-fixates known locations
-the difference between the predicted gaze position and the validation sample is 'accuracy'
-summarizes calibration adequacy
-reveals the spatial positions fit least accurately, as error in degrees of visual angle
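Because accuracy is reported in degrees of visual angle, it can help to relate on-screen error to angular error. A minimal sketch follows; the helper name and the screen-geometry numbers are assumptions you would replace with measurements from your own setup.

```python
import math

def pixel_error_to_degrees(error_px, px_per_cm, viewing_distance_cm):
    """Convert an on-screen error in pixels to degrees of visual angle,
    assuming the error is centred near the line of sight."""
    error_cm = error_px / px_per_cm
    return math.degrees(2.0 * math.atan(error_cm / (2.0 * viewing_distance_cm)))

# e.g., a 20 px error on a display with 38 px/cm, viewed from 60 cm:
print(round(pixel_error_to_degrees(20, 38.0, 60.0), 2))  # ~0.5 degrees
```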

Validation
Checking the accuracy level of the calibration

Some Basic Terms
Accuracy
-test-retest discrepancy
-measure of absolute spatial location
-will be best with the dominant eye
Spatial Resolution
-smallest measurable movement
-measure of relative spatial location
-worsens with faster sampling rates and obstacles such as glasses
Temporal Resolution
-how many images are processed per second (Hz)
-also referred to as 'samples' per second
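Spatial resolution (precision) is commonly estimated from data recorded while the subject holds a steady fixation, as the RMS of successive sample-to-sample distances. The sketch below shows that common estimate, not SR Research's exact procedure, and assumes gaze has already been converted to degrees; the fixation data are simulated.

```python
import numpy as np

def rms_sample_to_sample(gaze_deg):
    """Precision estimate: RMS of successive sample-to-sample distances,
    with gaze expressed in degrees of visual angle (shape: N x 2)."""
    diffs = np.diff(gaze_deg, axis=0)     # inter-sample steps
    dist_sq = np.sum(diffs**2, axis=1)    # squared angular distances
    return np.sqrt(np.mean(dist_sq))

# Example: 1000 samples recorded during a steady fixation
# (hypothetical data with ~0.01 deg of noise per axis).
rng = np.random.default_rng(0)
fixation = np.array([10.0, 8.0]) + rng.normal(0, 0.01, size=(1000, 2))
print(rms_sample_to_sample(fixation))
```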

More Basic Terms
Eye Event Resolution
-smallest psychological event that is spatially measurable with the system
Blink Recovery Time
-recovery of gaze position after missing data
Head Movement Compensation
-tolerable level of head movement

EyeLink 1000 Family
Focal Imaging Technology (FIT): point, focus and track camera
Real-time Host computer for data collection
Display computer for stimulus delivery
One Camera, Many Mounts…
-Desktop, Tower, Primate, Arm Mount
-Long Range variants for MEG/MRI
… and Multiple Modes of Operation
-High Precision Monocular/Binocular
-Remote Mode with Head Free to Move
Unified Software
-Host Eye Tracker Application
-Application Programming Interface

EyeLink 1000 High Precision
Accurate: drift free, 0.25-0.5° typical; approx. 2 cm lateral head movement compensation
Highest Spatial Resolution available, Low Noise:
-0.01° RMS in pupil-CR 1000 Hz tracking mode
-0.02° RMS in pupil-CR 2000 Hz tracking mode
-glasses increase error by about 0.01°
Fastest Sampling Rate Available: 2000 / 1000 / 500 / 250 Hz recording
Eye Event Resolution: 0.05° microsaccades
Real-Time streaming data via Ethernet or analog:
-1.4 ms delay (SD=0.4 ms) @ 2000 Hz
-1.8 ms delay (SD=0.8 ms) @ 1000 Hz
No missing data after blinks

EyeLink Desktop Mount

EyeLink 1000 Remote Mode

EyeLink 1000 Remote Mode
Accurate: 0.25°-0.5° average accuracy
No Head Stabilization required:
-22 x 18 x 20 cm (horizontal x vertical x depth) allowable head movement at a 60 cm camera distance
Spatial Resolution: 0.1° RMS error
Fastest Sampling Rate of any Remote System: 500 Hz monocular eye tracking
No missing data around blinks
Real-time: access eye position data with 3.0 ms delay (SD=1.11 ms)

EyeLink System Components

The Display PC
Performs full experiment control
-integrates calibration and data collection into one step
-sets any tracker preference and sends commands to control the tracker
Allows focus on stimulus presentation and data processing
-ordinary experiment delivery, with calls to the underlying EyeLink libraries to interface with the eye tracker
Time-stamps experiment events with messages
Near real-time access to eye sample and eye event data structures for gaze-contingent paradigms
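As an illustration of the Display PC's role, here is a minimal sketch using the pylink Python bindings distributed with the EyeLink Developers Kit. The host address shown is the usual default, but the data-file name, preference value and eye choice are placeholders, and a real experiment would add proper trial structure and error handling.

```python
import pylink

# Connect to the Host PC over the EyeLink's dedicated Ethernet link
# (100.1.1.1 is the customary default Host address).
el = pylink.EyeLink("100.1.1.1")
el.openDataFile("test.edf")

# Set a tracker preference from the Display PC, e.g. the sampling rate.
el.sendCommand("sample_rate = 1000")

# Run camera setup / calibration / validation on the Host.
pylink.openGraphics()          # simple calibration graphics on the Display PC
el.doTrackerSetup()

# Start a trial: record to file and link, and time-stamp an event with a message.
el.startRecording(1, 1, 1, 1)  # file samples, file events, link samples, link events
el.sendMessage("TRIALID 1")

# ... present stimuli; for gaze-contingent paradigms poll the link:
sample = el.getNewestSample()
if sample is not None and sample.isRightSample():   # assumes the right eye is tracked
    gaze_x, gaze_y = sample.getRightEye().getGaze()

el.stopRecording()
el.closeDataFile()
el.receiveDataFile("test.edf", "test.edf")  # copy the EDF to the Display PC
el.close()
pylink.closeGraphics()
```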

Display PC API
Compatible with many stimulus delivery methods:
Experiment Generator Packages:
-Experiment Builder
-E-Prime
-Presentation
-Psychtoolbox (MATLAB)
Programming Languages:
-C / C++
-Python
-Delphi
-any Windows COM language
Multiple Operating Systems:
-Mac OS X
-Windows
-Linux

The Host PC
The Host PC Application controls the eye tracker:
-performs image analysis
-performs data recording
-performs eye event parsing
-configures preferences
-provides real-time feedback
Gaze view: gaze cursor on background image
Plot view: eye traces over time

Recording Gaze View

Recording Plot View

The Host PC Application Set Options Screen

The Host PC Application Set Options Screen

Participant Setup Camera Setup Screen

Participant Setup: Tower Mount
-set the height of the chair, chinrest and Tower Mount so your subject is comfortably seated and looking at the top of the display
-never adjust the Tower Mount with the subject's head in place
-set the eye-selection knob to track the dominant eye
-adjust the mirror angle to get a good view of the eye, or to avoid reflections from glasses; adjusting the chin position may help too
-click the pupil in the global view to autothreshold (Host or Display PC)
-focus the camera: minimize the size of the yellow CR circle
-autothreshold and, if necessary, adjust the pupil and CR thresholds
-if cautious or troubleshooting, check the setup by asking the subject to look at the four corners while monitoring threshold quality at all positions
-calibration, validation and recording

Participant Setup: Desktop Mount
-set the height of the chair and chinrest so your subject is comfortably seated and looking at the top of the display
-center the Desktop Mount at the bottom of the display, as high in the field of view as possible without occluding the display
-ensure you have the correct lens and camera angle for the type of tracking you wish to perform (monocular, binocular or remote)
-adjust camera angle and position to get a good view of the eye(s)
-click the pupil in the global view to autothreshold (Host or Display PC)
-focus the camera: minimize the size of the yellow CR circle
-autothreshold and, if necessary, adjust the pupil and CR thresholds
-if cautious or troubleshooting, check the setup by asking the subject to look at the four corners
-if the CR is smeared, move the Desktop Mount toward the problematic corner until the CR is tracked
-calibration, validation and recording

Participant Setup: Optimal Eye Position
-adjust the height of the chair so that the subject is comfortable and has a line of sight to the upper part of the monitor
-the forehead rest should be just above the eyebrow
The figures on the right show a subject seated too high (top-right) or too low (bottom-right).

Participant Setup
Search Limits: when to use and how to enable
-Search Limits (red box) can be used to reduce the area of the image that is searched to find the eye
-normally Search Limits are not needed; however, with some participants' glasses they can be used to exclude regions of the camera image that might otherwise be detected as a pupil or CR

Participant Setup
Adjusting Search Limits
-press "Use Search Limits" to toggle search limits on/off
-use ALT and the cursor keys to adjust the shape of the search limits
-use SHIFT and the cursor keys to adjust the position of the search limits

Participant Setup
Pupil Tracking Modes: Centroid vs. Ellipse
Centroid model fitting:
-tracks the centre of a circle fit to the thresholded pupil
-advantages: highly stable, very low noise
-disadvantage: position drift if the pupil is occluded (e.g., by the eyelids)
Ellipse model fitting:
-tracks the centre of an ellipse fit to the thresholded pupil
-advantages: decreased drift, overcomes pupil occlusion
-disadvantage: higher noise level
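To make the difference concrete, the sketch below uses OpenCV on a hypothetical infrared eye image ("eye.png" and the threshold value are placeholders). It approximates the centroid model with a centre-of-mass of the thresholded pupil blob and compares it with an ellipse fit; this is purely illustrative and is not the EyeLink Host's internal algorithm.

```python
import cv2

# Threshold a (hypothetical) infrared eye image so the dark pupil
# becomes a white blob, then compare the two fitting models.
img = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)
_, mask = cv2.threshold(img, 60, 255, cv2.THRESH_BINARY_INV)

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
pupil = max(contours, key=cv2.contourArea)

# Centroid-style estimate: centre of mass of the thresholded blob.
# Very stable, but the centre shifts if the eyelid occludes part of the pupil.
m = cv2.moments(pupil)
centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])

# Ellipse model: fit an ellipse to the blob outline. The fitted centre is
# less affected by partial occlusion, at the cost of somewhat more noise.
(ex, ey), (major, minor), angle = cv2.fitEllipse(pupil)

print("centroid:", centroid, "ellipse centre:", (ex, ey))
```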

Participant Setup
Focusing the Eye Camera
(figure: focusing arm; focused vs. not focused camera images)

Participant Setup
Setting the Pupil Threshold
The pupil threshold can be adjusted automatically, through the Auto-threshold command, or manually, with the up/down arrow keys. A threshold that is too low will result in shadows (Figure 1), while a threshold that is too high will result in a noisy signal (Figure 2).

Participant Setup
Symptoms of a Poor Pupil Threshold
(figure: pupil clipped and lost; good threshold; corner shadow captures the pupil)

Participant Setup
Setting the Corneal Reflection (CR)
(figure: good corneal reflection vs. poor corneal reflection)

Participant Setup Binocular vs. Monocular Mount

Participant Setup
Subjects wearing glasses / Binocular Setup
(figure: good mirror angle vs. poor mirror angle)

Status Panel
Monitor the status of the camera image of the tracked eye throughout the setup, calibration, validation and recording phases.
Pupil
-OK (green): pupil present and can be tracked at the selected sample rate
-SIZE (yellow): occurs when the pupil size is larger than the maximum allowed pupil size
-MISSING (red): pupil not present
Corneal (only operational in Pupil-CR mode)
-OK (green): corneal reflection is present and can be tracked
-MISSING (red): corneal reflection is not present

Participant Setup: EyeLink Remote
-set the height of the chair so your subject is comfortably seated
-center the Desktop Mount at the bottom of the display, as high in the field of view as possible without occluding the display
-place the target sticker on the subject's forehead
-adjust camera angle and position to get a good view of the eye and the sticker; capture as wide a range of subject movement as possible
-click the pupil in the global view to autothreshold (Host or Display PC)
-focus the camera: minimize the size of the yellow CR circle
-adjust the thresholding bias for pupil and CR
-if cautious or troubleshooting, check the setup by asking the subject to look at the four corners
-if the CR is smeared, move the Desktop Mount toward the problematic corner until the CR is tracked
-calibration, validation and recording

Participant Setup: EyeLink Remote
-place the target sticker on the subject's forehead so that the eye and the sticker stay within the camera image when the subject's head moves
-the ideal target-camera distance is about 550 mm to 600 mm
-for the highest accuracy, use a 13-point calibration

Participant Setup: EyeLink Remote
(figure: target too close to the eye vertically; target at too large an angle)

Participant Setup: EyeLink Remote
-pupil threshold bias: adjusted with the UP or DOWN cursor keys (1.08 typical)
-CR threshold bias: adjusted with the - or + keys (1.00 typical)

Participant Setup: EyeLink Remote
-monitor the thumbnail camera images at the lower left corner of the tracker screen
-the two dots in the middle panel reflect the target and eye positions in the global camera image
-for reliable tracking, both dots should stay within the red box

Participant Setup: EyeLink Remote Status Panel
Pupil
-OK: pupil present and can be tracked at the selected sample rate
-SIZE: occurs when the pupil size is larger than the maximum allowed pupil size or smaller than the required size (e.g., too bright or too small a pupil)
-MISSING: pupil not present
Corneal
-OK: corneal reflection is present and can be tracked
-MISSING: corneal reflection is not present
Target
-OK: target is present and can be tracked
-MISSING: target is not present
-NEAR EYE: target is placed too close to the eye on the vertical dimension
-ANGLE: target is at too large an angle to be recognized properly

Calibration

Always check the calibration grid at the end of calibration.
(figure: good calibration vs. poor calibration grids)

Calibration
Improving calibration accuracy:
• check the pupil and CR as the subject looks at all four corners of the display; they should always be visible, well-thresholded and tracked
• encourage the subject to sit still (no head turning!)
• use backspace (to redo a target) and manual accept mode (spacebar twice) to sample the eye position only when the subject is stably fixating the calibration target
• match the background color of the screen during calibration/validation to your test displays; changes in pupil size caused by large brightness differences will reduce recording accuracy

Validation
Checking the gaze accuracy of the calibration

Validation
Validation Results:
-GOOD (green background): errors are acceptable
-FAIR (grey background): errors are moderate; the calibration should be improved
-POOR (red background): errors are too high for useful eye tracking
In general, ensure that the average gaze error is within 0.5° and the maximum error within 1.0°.

EyeLink Data
EDF File (binary file)
-use the EDF2ASC converter to get ASC files
-use EyeLink Data Viewer for direct analysis
What's recorded?
-Samples
-Events: Saccades, Fixations, Blinks
-Messages
-Buttons
(Please read Chapter 4 of the EyeLink 1000 User Manual)
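For example, a recorded EDF can be converted to a plain-text ASC file with the EDF2ASC utility installed as part of the EyeLink Developers Kit; the sketch below simply calls the converter from Python, and the file name is a placeholder.

```python
import subprocess

# Convert the binary EDF to a plain-text ASC file; "session1.edf" is a
# placeholder name. The converter writes session1.asc next to the EDF.
subprocess.run(["edf2asc", "session1.edf"], check=True)
```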

EyeLink Recording Data: Samples
System time, x, y and pupil size, with optional velocity, resolution, and CR status:
6079861 503.7 680.3 972.0 .....
6079862 503.7 680.1 972.0 .....
6079863 503.8 680.1 972.0 .....
6079864 503.8 680.2 972.0 .....
6079865 503.9 680.2 971.0 .....
6079866 503.7 680.1 971.0 .....
6079867 503.7 680.1 971.0 .....
6079868 503.6 680.0 970.0 .....
6079869 503.6 680.1 970.0 .....
6079870 503.7 680.2 970.0 .....
(Please read Chapter 4 of the EyeLink 1000 User Manual)
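A few lines of Python are enough to read monocular sample lines like the ones above from an ASC file. This is a sketch that follows the field layout shown on this slide; binocular recordings add a second set of x/y/pupil columns.

```python
def parse_sample(line):
    """Parse one monocular sample line: <time> <x> <y> <pupil> [extras]."""
    fields = line.split()
    time_ms = int(fields[0])
    x, y, pupil = (float(v) for v in fields[1:4])
    return time_ms, x, y, pupil

print(parse_sample("6079861 503.7 680.3 972.0 ....."))
# -> (6079861, 503.7, 680.3, 972.0)
```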

EyeLink Recording Data: Saccades
SSACC: eye, start time
ESACC: eye, start time, end time, duration, start x/y, end x/y, amplitude, peak velocity
SSACC L 6079955
ESACC L 6079955 6079962 8 507.5 682.5 511.9 682.0 0.20 38
…
SSACC L 6080723
ESACC L 6080723 6080763 41 513.8 679.7 633.5 550.5 7.94 285
(Please read Chapter 4 of the EyeLink 1000 User Manual)

EyeLink Recording Data: Fixations
SFIX: eye, start time
EFIX: eye, start time, end time, duration, average x/y, pupil size, and optionally resolution
SFIX L 1454748
EFIX L 1454748 1454907 160 510.3 4.1 1187 28.45 27.50
...
SFIX L 1454919
EFIX L 1454919 1455873 955 514.0 0.3 1361 28.40 27.50
(Please read Chapter 4 of the EyeLink 1000 User Manual)
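The saccade and fixation end-event lines shown on this slide and the previous one can be parsed the same way. The sketch below follows the field orderings listed above; see Chapter 4 of the EyeLink 1000 User Manual for the authoritative definitions.

```python
def parse_event(line):
    """Parse an ESACC or EFIX line from an ASC file (sketch only)."""
    fields = line.split()
    if fields[0] == "ESACC":
        eye, start, end, dur = fields[1], int(fields[2]), int(fields[3]), int(fields[4])
        sx, sy, ex, ey, amp, pvel = (float(v) for v in fields[5:11])
        return {"type": "saccade", "eye": eye, "start": start, "end": end,
                "duration": dur, "amplitude": amp, "peak_velocity": pvel}
    if fields[0] == "EFIX":
        eye, start, end, dur = fields[1], int(fields[2]), int(fields[3]), int(fields[4])
        avg_x, avg_y, pupil = (float(v) for v in fields[5:8])
        return {"type": "fixation", "eye": eye, "start": start, "end": end,
                "duration": dur, "x": avg_x, "y": avg_y, "pupil": pupil}
    return None

print(parse_event("ESACC L 6080723 6080763 41 513.8 679.7 633.5 550.5 7.94 285"))
print(parse_event("EFIX L 1454919 1455873 955 514.0 0.3 1361"))
```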

EyeLink Data Viewer

EyeLink Data Viewer
Excellent Data Visualization
-Spatial Overlay, Temporal Graph, and Playback views
Flexible Event Filtering
-Fixations, Saccades, Blinks, Interest Areas, etc.
-Reaction Time definitions, Interest Period definitions
Detailed Output Reports
-Fixation, Saccade, Interest Area, Trial, Message, or Sample reports
Full Experiment Integration by Messages
-Images, Interest Areas, and Trial Condition Variables

EyeLink Data Viewer Generates detailed Fixation, Saccade, Interest Area and Trial reports

EyeLink Support
Documents
-EyeLink 1000 User Manual
-EyeLink 1000 Installation Guide
-SR Research Experiment Builder
-Windows Programmers Guide
-EyeLink Data Viewer
Contact Information
-E-mail: support@sr-research.com
-Phone: 1-613-826-2958 / 1-866-821-0731
-Web: http://www.sr-support.com

EyeLink Support