Brain Electrophysiological Signal Processing: Preprocessing


1 Brain Electrophysiological Signal Processing: Preprocessing
ME (Signal Processing), IISc: Neural Signal Processing, Spring 2014
Kaushik Majumdar, Indian Statistical Institute, Bangalore Center

2 Heart-Rate and Muscle Artifacts in EEG
Benbadis and Rielo, 2008.

3 Preprocessing
Visual inspection
Filtering
Principal Component Analysis (PCA)
Independent Component Analysis (ICA)

4 Matrix Representation of Multi-Channel EEG
M is an m x n matrix whose m rows represent m EEG channels and whose n columns represent n time points. Often during EEG processing we have to find a matrix W such that WM is the processed signal.
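
A minimal numpy sketch of this representation (the shapes and the random data are invented for illustration; the common-average reference is just one standard choice of W):

```python
import numpy as np

m, n = 32, 1000            # e.g. 32 EEG channels, 1000 time points (invented)
M = np.random.randn(m, n)  # stand-in for a real multi-channel EEG record

# A linear preprocessing step is a matrix W applied on the left: WM.
# Illustration: common average re-referencing, a standard EEG operation.
W = np.eye(m) - np.ones((m, m)) / m
processed = W @ M          # still m x n, one row per channel
```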

5 EOG Identification by Principal Component Analysis (PCA)
Majumdar, under preparation, 2013.

6 PCA Algorithm (cont.)

7 PCA Algorithm (cont.)
Geometrically, PCA is a rotation (into the eigenvector basis) followed by stretching or contracting along each axis (scaling by the eigenvalues).
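
A compact numpy sketch of the standard PCA steps (centering, covariance, eigendecomposition, rotation into the principal basis), assuming the m x n data matrix M of slide 4; function and variable names are mine:

```python
import numpy as np

def pca(M):
    """PCA of an (m channels, n samples) data matrix M."""
    Mc = M - M.mean(axis=1, keepdims=True)   # 1. center each channel
    C = (Mc @ Mc.T) / (Mc.shape[1] - 1)      # 2. m x m covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)     # 3. eigendecomposition (ascending)
    order = np.argsort(eigvals)[::-1]        #    reorder, largest variance first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    components = eigvecs.T @ Mc              # 4. rotate data into the PC basis
    return eigvals, eigvecs, components
```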

8 Performance of PCA in EOG Removal
Wallstrom et al., Int. J. Psychophysiol., vol. 53, 2004. [Figure: performance of PCA in EOG removal; the EOG channel is labeled.]

9 Independent Component Analysis (ICA)
In PCA the data components are assumed to be mutually orthogonal, which is too restrictive. [Figure: original data sets and the corresponding PCA components.]

10 ICA (cont.)
PCA will give poor results if the covariance matrix has eigenvalues close to each other, since the principal directions are then poorly determined and the decomposition becomes unstable.

11 ICA as Blind Source Separation (BSS)
Four musicians are playing in a room. From outside, only the music can be heard, through four microphones; no one can be seen. How can the music heard outside be decomposed into the four sources? [Diagram: four sources S1-S4 picked up by four microphones 1-4.]

12 Mathematical Formulation
$x = As + n$, where $A$ is the mixing matrix, $x$ is the sensor vector, $s$ is the source vector, and $n$ is noise, which is to be eliminated by filtering.

13 Mathematical Formulation (cont.)
Given $x = As + n$, find $W$ such that $\hat{s} = Wx$. Any technique for estimating $W$ is called an ICA technique, or a BSS technique in general.
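
A toy numpy illustration of this formulation; the sources and the mixing matrix are invented, and the noise term is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)

# Two invented, independent, non-Gaussian sources s.
s = np.vstack([np.sign(np.sin(2 * np.pi * 5 * t)),  # square wave
               rng.laplace(size=t.size)])           # impulsive source

A = np.array([[1.0, 0.5],   # invented mixing matrix
              [0.7, 1.2]])
x = A @ s                   # sensor signals: x = As (noise omitted)

# BSS seeks W with s_hat = Wx; here A is known, so W = A^-1 recovers s exactly.
W = np.linalg.inv(A)
s_hat = W @ x
```

In practice A is unknown, which is why W must be estimated from statistical properties of x alone.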

14 ICA Algorithm: FastICA
Hyvarinen and Oja, Neural Networks, 2000.
Whitening: first center the signals (make the mean zero), then transform them so that $E\{xx^T\} = I$, where $E$ denotes expectation, $x$ is the vector of signals, and $I$ is the identity matrix.

15 FastICA (cont.)
Write the eigendecomposition of the covariance matrix as $E\{xx^T\} = BDB^T$, where $B$ is an orthogonal matrix (of eigenvectors) and $D$ is the diagonal matrix of eigenvalues of $E\{xx^T\}$. The transformed vector $\tilde{x} = BD^{-1/2}B^T x$ will satisfy $E\{\tilde{x}\tilde{x}^T\} = I$. Whitening complete.

16 Non-Gaussianity
ICA is appropriate only when the probability distribution of the data set is non-Gaussian. The Gaussian distribution is of the form $p(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$.

17 Entropy of Gaussian Variable
A Gaussian random variable has the largest entropy among all random variables with the same variance (for a proof see Cover & Thomas, Elements of Information Theory). Here we give an intuitive argument.

18 Entropy of a Random Variable X
$H(X) = -\sum_i p(x_i) \log p(x_i)$. Improbable outcomes carry more information; an outcome with probability one carries less (zero) information.
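
A one-function sketch of this definition for the discrete case (natural logarithm; the example distributions are invented):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i log p_i of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                  # convention: 0 * log 0 = 0
    return -np.sum(p * np.log(p))

print(entropy([0.5, 0.5]))  # 0.693...: maximal for two outcomes
print(entropy([1.0, 0.0]))  # 0.0: a certain outcome carries zero information
```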

19 Gaussian Random Variable Has Highest Entropy: Intuitive Proof
By the Central Limit Theorem (CLT), the mean of a class of random variables (the class being signified by uniform variance) follows the normal distribution as the number of members of the class tends to infinity (i.e., becomes very large). Infinitely many observations hold an infinite, that is maximal, amount of information.

20 Intuitive Proof (cont.)
Therefore a random variable with the normal distribution has the highest information content, and so the highest entropy. If each variable in a class of random variables admits only a finite number of nonzero values, the one with the uniform distribution will have the highest entropy.

21 Non-Gaussianity as Negentropy
$J(y) = H(y_{\mathrm{gauss}}) - H(y)$, where $H$ is entropy, $J$ is negentropy, and $y_{\mathrm{gauss}}$ is a Gaussian variable with the same covariance as $y$. $J$ is to be maximized; when $J$ is maximal, $y$ is reduced to a single independent component. This can be shown by calculating the kurtosis of a component versus a sum of components including the said component (see Hyvarinen & Oja, 2000, p. 7).
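
An illustrative check of that kurtosis argument (sources invented; kurtosis magnitude stands in here for negentropy as the non-Gaussianity measure):

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
s1 = rng.laplace(size=100_000)   # independent non-Gaussian components
s2 = rng.laplace(size=100_000)

mix = (s1 + s2) / np.sqrt(2)     # unit-variance sum of two components

print(abs(kurtosis(s1)))   # ~3.0: a single component is strongly non-Gaussian
print(abs(kurtosis(mix)))  # ~1.5: the sum is closer to Gaussian, as the CLT predicts
```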

22 Steps of FastICA after Whitening
For one weight vector $w$ on the whitened data $x$, iterate $w^{+} = E\{x\, g(w^T x)\} - E\{g'(w^T x)\}\, w$ and renormalize $w = w^{+}/\|w^{+}\|$ until convergence. Here $g$ is in the form of either of the two nonlinearities $g_1(u) = \tanh(a_1 u)$, $1 \le a_1 \le 2$, or $g_2(u) = u \exp(-u^2/2)$.
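
Putting the iteration together, a one-unit FastICA sketch (assumes already-whitened input, e.g. from the whiten function of slide 15; uses the tanh nonlinearity with $a_1 = 1$; names are mine):

```python
import numpy as np

def fastica_one_unit(xw, n_iter=200, tol=1e-8, seed=0):
    """Estimate one independent component from whitened data xw of shape (m, n)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=xw.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        u = w @ xw                           # projections w^T x
        g = np.tanh(u)                       # g1 with a1 = 1
        g_prime = 1.0 - g ** 2               # derivative of tanh
        w_new = (xw * g).mean(axis=1) - g_prime.mean() * w
        w_new /= np.linalg.norm(w_new)       # renormalize
        if abs(abs(w_new @ w) - 1.0) < tol:  # converged (up to sign)
            return w_new
        w = w_new
    return w
```

To extract several components, each new $w$ is decorrelated against those already found (deflation) before renormalization, as in Hyvarinen & Oja (2000).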

23 Exercise
Remove artifacts from sample EEG data using the ICA implementation in EEGLAB. (Note: EEGLAB's built-in runica function implements Infomax ICA rather than FastICA; FastICA can be called from EEGLAB when the separate FastICA toolbox is installed.)

24 Concept of Independence in PCA and ICA
In PCA, independence means orthogonality, i.e. the pairwise dot product is zero. In ICA, independence is statistical independence. Let x, y be random variables, p(x) the probability distribution function of x, and p(x,y) the joint probability distribution function of (x,y). If p(x,y) = p(x)p(y) holds, x and y are said to be statistically independent.
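
A rough empirical check of this factorization from samples, using binned histograms (the data, the dependence structure, and the bin count are invented for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200_000)
y = rng.normal(size=200_000)             # generated independently of x
z = x + 0.5 * rng.normal(size=200_000)   # strongly dependent on x

def factorization_gap(a, b, bins=20):
    """Max |p(a,b) - p(a)p(b)| over a binned estimate of the distributions."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    joint /= joint.sum()                       # normalize counts to probabilities
    pa, pb = joint.sum(axis=1), joint.sum(axis=0)  # marginals
    return np.max(np.abs(joint - np.outer(pa, pb)))

print(factorization_gap(x, y))  # small: p(x,y) is close to p(x)p(y)
print(factorization_gap(x, z))  # clearly larger: the factorization fails
```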

25 Independence (cont.)
If vectors $v_1$ and $v_2$ are orthogonal (and nonzero) they are linearly independent. Suppose not; then $a_1 v_1 + a_2 v_2 = 0$ with some $a_i \neq 0$. Taking the dot product with $v_1$ gives $a_1 (v_1 \cdot v_1) = 0$, i.e. $a_1 = 0$; similarly $a_2 = 0$, a contradiction. If $v_1 = c v_2$ then both must have the same probability distribution, i.e. $p(v_1, v_2) = p(v_1) = p(v_2)$. If $v_1$ and $v_2$ are linearly independent, $p(v_1, v_2) = p(v_1) p(v_2)$ may or may not hold; but if $p(v_1, v_2) = p(v_1) p(v_2)$ holds, then $v_1$ and $v_2$ are linearly independent.

26 Conditions for ICA Applicability
Sources are statistically independent.
Propagation delays in the mixing medium are negligible; delays in the mixing medium may affect sources at different locations differently and thereby corrupt their temporal structures.
Sources are time varying.
Number of sources = number of sensors.

27 References
Benbadis and Rielo, EEG artifacts, eMedicine, 2008 (available online).
Hyvarinen and Oja, Independent component analysis: algorithms and applications, Neural Networks, vol. 13, pp. 411-430, 2000.
Majumdar, A Brief Survey of Quantitative EEG Analysis (under preparation), Chapter 2.

