Directed Component Analysis


Directed Component Analysis (DCA): An EEG Artifact Extraction Algorithm

Introduction
- Segments EEG data into a sequence of overlapping windows
- Models artifactual and non-artifactual (cortical) activity separately in each window
- Allows for non-stationary artifact and cortical topographies
- DCA artifact extraction is applied to each window in turn

Types of Artifacts
- EKG, eye blinks, eye movements
- Artifactual activity is modeled as the outer product of a topography and its corresponding intensity time course
- An artifact may be described by a single topography or by multiple topographies, each with its own intensity time course
  - Multiple topographies: EKG, eye movements
  - Single topography: eye blinks

Eye Anatomy

Eye Blink Artifact
- Each eye contains a dipolar source across the retina, maintained by metabolism
- This source generates an omnipresent electric potential field (0.4-1.0 mV)
- During an eye blink, the eyelids slide over the cornea, establishing a current flow through the highly conductive cornea into the extra-ocular region via the eyelids
- This current flow couples the dipolar source to the scalp potential field

Eye Blink Artifact
- Electrical activity from eye blinks is a major source of contamination in scalp-recorded EEG data
- Eye blinks can exceed 500 µV, whereas typical EEG amplitudes are approximately ±25 µV

Big Picture
- DCA extracts eye blink activity by estimating the intensity of the blink topography at each time point
- The intensity at each time point is a weighted sum of the scalp electrode voltages
- The weights are the elements of the DCA spatial filter
- The spatial filter must differentiate between eye blink and cortical intensity
- The filter requires a model of eye blink activity: the blink template
- The filter also requires a model of non-artifactual activity to which it can assign, or share, the cortical intensity
- The filter will not extract the intensity of non-artifactual activity captured by the "cortical" model, even if that activity's topography is correlated with the blink template (e.g., frontal activity)
- The filter will extract the intensity of any non-artifactual activity not described by the cortical model

The Models
DCA requires two models: a model of artifactual (eye blink) activity and a model of non-artifactual (cortical) activity.

Eye blink activity:
- Blink activity is modeled by the blink template
- At present, the blink template is the average of one or more EEG time slices taken around the peaks of one or more blink intervals

Cortical activity:
- At present, cortical activity is modeled by a subset of the eigenvectors of the "blink-free" EEG data covariance matrix
- The data covariance matrix is symmetric and positive semi-definite, so its eigenvectors are orthogonal and its eigenvalues non-negative
- The eigenvectors capture the directions of maximal data variance in channel space, subject to the orthogonality constraint
- This yields an efficient, parsimonious representation of the "blink-free" EEG data
- The cortical model must not capture the blink template; if it did, the spatial filter could not recognize the eye blinks
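The cortical model above can be sketched in a few lines of NumPy (Python is used here as a stand-in for the deck's MATLAB demo; the random data, channel count, and the 95% retention fraction are illustrative assumptions, not values from the algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical blink-free EEG window: 32 channels x 1000 samples.
eeg = rng.standard_normal((32, 1000))

# Channel-space covariance matrix (symmetric, positive semi-definite).
cov = np.cov(eeg)

# eigh handles symmetric matrices and returns ascending eigenvalues;
# reverse to descending so the leading columns capture the most variance.
evals, evecs = np.linalg.eigh(cov)
evals, evecs = evals[::-1], evecs[:, ::-1]

# Retain the leading eigenvectors that capture ~95% of the data variance.
captured = np.cumsum(evals) / evals.sum()
n_cortical = int(np.searchsorted(captured, 0.95)) + 1
cortical_model = evecs[:, :n_cortical]   # one orthonormal topography per column
```

The retained columns are the "cortical eigenvectors"; the discarded ones span the left nullspace used later by the spatial filter.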

Eye Blink Topography
Two ways to obtain the blink topography:
- From a source model (spherical, FDM, or FEM); accuracy depends on the geometry and tissue conductivity values
- Extracted from the recorded EEG:
  - Specific to each individual subject
  - Computed as the average of the topographies at blink peaks over a chosen time interval
  - The blink topography changes slowly over time

Cortical Topographies
Deriving the cortical topographies (eigenvectors) from blink-free EEG ensures that we:
- Minimize misallocation of eye blink energy to the cortical topographies during filtering
- Accurately represent cortical EEG from frontal sources (not eye blinks) by retaining cortical topographies with significant correlation to the blink topography (r > 0.75)
- Eliminate the effect of "bad" channel activity on subsequent spatial filtering by zeroing out the corresponding elements of the cortical topographies

Blink and Cortical Topography Extraction
- Identify intervals of eye blink activity in real time by accounting for:
  - EEG amplitude, slope, and blink template correlation thresholds
  - AC-coupled amplifier recovery time
- Compute the average topography of each eye blink interval to provide the forward model with a dynamic, localized representation of blink activity
- Extract cortical EEG, free of eye blink activity, from consecutive EEG data windows to provide for the computation of a dynamic, localized representation of brain activity
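A minimal sketch of threshold-based blink-interval detection on synthetic data. The channel count, thresholds, and injected blink are all illustrative assumptions; the actual detector also uses slope criteria and accounts for amplifier recovery time, which are omitted here:

```python
import numpy as np

rng = np.random.default_rng(1)
n_ch, n_samp = 32, 2000

# Hypothetical unit-norm blink template (topography) and background EEG.
template = rng.standard_normal(n_ch)
template /= np.linalg.norm(template)
eeg = 5.0 * rng.standard_normal((n_ch, n_samp))

# Inject a synthetic blink: the template scaled by a smooth intensity burst
# (~1 s at an assumed 250 Hz sampling rate).
burst = 400.0 * np.hanning(250)
eeg[:, 900:1150] += np.outer(template, burst)

# Flag samples whose topography both exceeds an amplitude threshold and
# correlates strongly with the blink template (thresholds are illustrative).
norms = np.linalg.norm(eeg, axis=0)          # per-sample topography magnitude
corr = (template @ eeg) / norms              # cosine similarity per sample
blink_mask = (norms > 100.0) & (np.abs(corr) > 0.8)
```

Averaging `eeg[:, blink_mask]` topographies around detected peaks would then give the dynamic, per-interval blink topography described above.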

Eye Blink Removal Sequence
1. Data model: from the EEG (and optionally a source model), obtain the eye blink topography and the cortical topographies; assemble the matrix of the eye blink topography (KEB) and the cortical eigenvectors (K1 ... KCT)
2. Spatial filter generation: matrix pseudo-inverse
3. Extract eye blinks: extract the temporal evolution of eye blink intensity
4. Refine the extracted eye blink intensity: nullspace filtering; frequency filtering

Spatial Filter Derivation
The spatial filter extracts the intensity of activity that is correlated with the blink topography and not captured by the principal eigenvectors of the "blink-free" EEG data covariance matrix.
- KEB: blink topography
- KC1, KC2, ..., KCT: cortical eigenvectors
- K*EB, K*C1, ..., K*CT: rows of the pseudo-inverse of the matrix [KEB, KC1, ..., KCT]
- The spatial filter is K*EB, the pseudo-inverse row corresponding to the blink topography
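The pseudo-inverse construction can be sketched directly in NumPy (the topographies here are random stand-ins for real blink and cortical topographies):

```python
import numpy as np

rng = np.random.default_rng(2)
n_ch, n_cortical = 32, 10

# Hypothetical cortical eigenvectors (orthonormal columns, via QR) and a
# blink topography that is not fully captured by them.
Kc = np.linalg.qr(rng.standard_normal((n_ch, n_cortical)))[0]
k_eb = rng.standard_normal(n_ch)

# Stack the blink topography with the cortical eigenvectors and take the
# matrix pseudo-inverse; the first row is the DCA spatial filter K*EB.
K = np.column_stack([k_eb, Kc])
sf = np.linalg.pinv(K)[0]

# By construction, the filter has unit gain on the blink topography and
# zero gain on every cortical eigenvector.
gain_blink = sf @ k_eb
gain_cortical = sf @ Kc
```

The unit/zero gain pattern is exactly what lets the filter report blink intensity while ignoring activity explained by the cortical model.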

MATLAB Demo
Demonstration of:
- Covariance matrix eigenvector estimation
- Blink template simulation
- Matrix pseudo-inverse
- Spatial filter extraction
- Spatial filter properties
- Left nullspace

Spatial Filter Details
The eigenvectors of the cortical topography covariance matrix (current data window) are split into the retained cortical eigenvectors (K1 ... KCT) and the left nullspace (the remaining eigenvectors, through KLNS). The spatial filter is a weighted sum of the left nullspace eigenvectors; the weights are the corresponding correlation coefficients shown in Plot 5.
- Plot 1: cumulative fraction of EEG variance captured by all eigenvectors of the cortical topography covariance matrix
- Plot 2: correlation of the blink template with the eigenvectors retained for the cortical model (cortical eigenvectors)
- Plot 3: correlation of the blink template with the eigenvectors not retained for the cortical model (left nullspace)
- Plot 4: correlation of the spatial filter with the cortical eigenvectors; essentially zero, to minimize extraction of cortical activity
- Plot 5: correlation of the spatial filter with the left nullspace eigenvectors; nonzero, to allow extraction of blink activity


Time Course of Eye Blink Intensity
The inner product of the spatial filter and the EEG electrode voltages at each time point extracts the temporal stream of eye blink intensity. Applied across an EEG data window (viewed either as a sequence of topographies, i.e. time slices, or as a set of channel time series), the filter K*EB yields the blink intensity time course.
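On a purely synthetic window the extraction is one matrix-vector product. Here the filter is a simplified stand-in (the topography's scaled self, which satisfies the unit-gain condition) rather than the full pseudo-inverse row; the window contains only blink activity so the recovery is exact:

```python
import numpy as np

rng = np.random.default_rng(3)
n_ch, n_samp = 32, 500

# Hypothetical blink topography; the simplest filter with sf @ bt == 1.
bt = rng.standard_normal(n_ch)
sf = bt / (bt @ bt)

# Synthetic window: pure blink activity, the outer product of the
# topography and a known intensity time course.
true_intensity = np.sin(np.linspace(0.0, 3.0 * np.pi, n_samp))
window = np.outer(bt, true_intensity)

# Applying the filter to every time slice recovers the intensity stream.
intensity = sf @ window   # shape (n_samp,)
```

With real data the window also contains cortical activity, which is why the filter must additionally be orthogonal to the cortical eigenvectors.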


Extracted Blink Intensity Refinement
Nullspace filtering:
- The cortical model comprises the eigenvectors K1 - KCT, capturing ~95% of the cortical EEG topography variance per window; the pseudo-inverse generates the spatial filter as a weighted sum of the remaining eigenvectors (the left nullspace, through KLNS)
- Nullspace filtering deletes from the left nullspace all eigenvectors with minimal correlation to the blink topography (r below a specified threshold)
- The spatial filter is regenerated as a weighted sum of the remaining left nullspace eigenvectors, and the blink intensity stream is re-extracted from the EEG data

Frequency filtering:
- Subsequent to nullspace filtering, low-pass filtering of the blink intensity stream reduces the power of its high-frequency (roughly > 12 Hz) components
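The nullspace-filtering step can be sketched as follows. The basis, split point, and correlation threshold are illustrative assumptions; the key property is that the regenerated filter keeps unit gain on the blink topography while staying orthogonal to the cortical eigenvectors:

```python
import numpy as np

rng = np.random.default_rng(4)
n_ch = 32

# Full orthonormal eigenbasis of a hypothetical cortical covariance matrix,
# split into cortical eigenvectors and the left nullspace.
basis = np.linalg.qr(rng.standard_normal((n_ch, n_ch)))[0]
Kc, Kn = basis[:, :20], basis[:, 20:]

bt = rng.standard_normal(n_ch)
bt /= np.linalg.norm(bt)

# Nullspace filtering: keep only nullspace eigenvectors whose correlation
# with the blink topography exceeds a threshold (value is illustrative).
corr = bt @ Kn
keep = np.abs(corr) > 0.05
Kn_kept = Kn[:, keep]

# Regenerate the filter as the weighted sum of the surviving eigenvectors,
# rescaled so it retains unit gain on the blink topography.
w = corr[keep]
sf = Kn_kept @ (w / (w @ w))
gain = sf @ bt
```

Dropping weakly correlated nullspace directions shrinks the filter's exposure to noise that carries no blink signal.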

Extraction of Blink Artifacts From EEG
- The blink removal sequence is re-applied to each consecutive EEG data window
- Blink-free EEG = "blinky" EEG - BT × blink intensity
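The subtraction is a rank-one update: the outer product of the blink topography (BT) and the extracted intensity stream is removed from the window. A minimal sketch on synthetic data (all quantities are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n_ch, n_samp = 32, 400

# Hypothetical cortical EEG plus a blink contribution BT * intensity.
cortical = rng.standard_normal((n_ch, n_samp))
bt = rng.standard_normal(n_ch)
intensity = 50.0 * np.hanning(n_samp)
blinky_eeg = cortical + np.outer(bt, intensity)

# Blink-free EEG = "blinky" EEG minus the rank-one blink term.
blink_free = blinky_eeg - np.outer(bt, intensity)
```

In practice `intensity` comes from the refined spatial-filter output, so the subtraction is only as good as the extracted intensity stream.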


Blink Topography and Spatial Filter
- The cortical and left nullspace eigenvectors together span channel space
- The blink topography can therefore be expressed as a weighted sum of the eigenvectors derived from the cortical topography covariance matrix:
  BT = α1 KC1 + … + αm KCm + β1 KL1 + … + βn KLn
- Because the eigenvectors are orthonormal, the blink topography variance is the sum of the squared eigenvector weights:
  |BT|² = α1² + … + αm² + β1² + … + βn²
- Consequence: as the number of cortical eigenvectors increases, the fraction of topography variance captured by the cortical eigenvectors increases, the fraction captured by the left nullspace decreases (fewer left nullspace eigenvectors), and Σi βi² decreases

Blink Topography and Spatial Filter
- The corresponding spatial filter is a weighted sum of the left nullspace eigenvectors:
  SF = γ1 KL1 + … + γn KLn
- Orthonormality gives a concise expression for the inner product of the blink topography and the spatial filter:
  ⟨BT, SF⟩ = β1 γ1 + … + βn γn = 1, by definition of the pseudo-inverse
- Consequence: as Σi βi² decreases, Σj γj² must increase to maintain the constant unit inner product, so the largest spatial filter weight, max{|γ1|, …, |γn|}, increases
- Consequence: the spatial filter becomes more sensitive to all activity not captured by the cortical eigenvectors
- Consequence: there is an optimal value for the user-specified fraction of cortical EEG variance captured by the cortical eigenvectors (~0.95)
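This sensitivity tradeoff can be demonstrated numerically: as more eigenvectors are moved into the cortical model, the norm of the pseudo-inverse spatial filter grows. The basis and topography below are synthetic, constructed so the blink topography has nonzero weight on every eigenvector:

```python
import numpy as np

rng = np.random.default_rng(6)
n_ch = 32

# Orthonormal eigenbasis and a blink topography spread across all of it.
basis = np.linalg.qr(rng.standard_normal((n_ch, n_ch)))[0]
bt = basis @ rng.uniform(0.5, 1.0, n_ch)

def filter_norm(n_cortical):
    """Spatial filter norm when n_cortical eigenvectors are retained."""
    K = np.column_stack([bt, basis[:, :n_cortical]])
    return np.linalg.norm(np.linalg.pinv(K)[0])

# Filter norm grows with the size of the cortical model.
norms = [filter_norm(m) for m in (8, 16, 24, 30)]
```

The growing norm amplifies anything the cortical model fails to explain, which is exactly why the ~0.95 retention fraction is a compromise rather than "more is better."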

Bad Channel Detection
The artifact removal method rests on several assumptions:
- Eye blink intervals are detected and excluded during cortical topography extraction
- The spatial distribution of eye blinks is described by a single, relatively stable topography in each data window
- The variance of EEG activity not captured by the cortical eigenvectors is minimal

It is therefore imperative to detect and exclude bad channels:
- Bad channel variance is not accurately captured by the cortical eigenvectors; it distorts eye blink topographies and can interfere with eye blink detection
- The artifact removal framework provides for the detection and exclusion of bad channels

Bad channels are detected based upon:
- Frequency band power
- Voltage and variance thresholds
- Correlation with the channel's own spherical spline interpolant (low if the channel is bad)
- Correlation with nearby channels (normally high due to volume conduction)

Spatial filter components corresponding to bad channels are zeroed out.
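The neighbor-correlation criterion can be sketched as follows. This simplified stand-in uses the mean absolute correlation with all other channels rather than true scalp neighbors, and the mixing model, noise levels, and 0.3 threshold are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n_ch, n_samp = 8, 1000

# Volume conduction makes nearby channels correlate: simulate by mixing a
# few shared sources into every channel, then break channel 3.
sources = rng.standard_normal((3, n_samp))
mixing = rng.uniform(0.5, 1.5, (n_ch, 3))
eeg = mixing @ sources + 0.1 * rng.standard_normal((n_ch, n_samp))
eeg[3] = 100.0 * rng.standard_normal(n_samp)   # "bad" channel: pure noise

# Flag channels whose mean absolute correlation with the other channels
# falls below a threshold.
corr = np.corrcoef(eeg)
np.fill_diagonal(corr, np.nan)                 # ignore self-correlation
mean_corr = np.nanmean(np.abs(corr), axis=1)
bad = np.flatnonzero(mean_corr < 0.3)
```

Once flagged, the corresponding spatial filter components would be zeroed out, as the slide describes.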

When It Works…

DCA Demo
- Blinksmrkd.1ms.40.seg.oar.raw
- GNG2_002.1v.cmb.raw: 128 channels x 39,375 samples; process time ~13 seconds
- GNG2_002.1v.cmb.raw: 128 channels x 931,640 samples; process time ~346 seconds; EEG acquisition time ~3,726 seconds @ 250 Hz

Next Steps
Short term:
- Variable window overlap, down to one-sample increments
- More frequent model updating
- Zero-lag IIR low-pass filter
- Reduce filter transients
- Monitor blink template stability
- Local / global blink template
- Increase "blink-to-noise" ratio

Even More Next Steps
Long(er) term:
- Improve the "cortical" model
- Iterative refinement
- Varimax / Promax rotation of the EEG covariance matrix eigenvectors
  - The covariance matrix would be based on ALL data within the time window
  - May allow for modeling of cortical activity currently excluded