Slide 1: On the combination of ICA and CPA
Maarten De Vos, Dimitri Nion, Sabine Van Huffel, Lieven De Lathauwer
SISTA – SCD – BIOMED, K.U.Leuven

Slide 2: Roadmap
- What is ICA?
- What is CPA?
- Why combine ICA and CPA?
- Our algorithm
- Results
- Conclusion

Slide 4: Independent Component Analysis (ICA)
ICA decomposes a measurement (e.g. EEG) into contributing sources:
EEG1 = m11 s1 + m12 s2 + m13 s3
EEG2 = m21 s1 + m22 s2 + m23 s3
EEG3 = m31 s1 + m32 s2 + m33 s3
ICA estimates the statistically independent sources s1, s2 and s3, together with the mixing coefficients m_ij.
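The three equations above are a single matrix product, EEG = M S. A minimal numpy sketch of this forward model (the source waveforms and mixing matrix are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Three hypothetical source signals (rows) over 1000 samples.
t = np.linspace(0, 1, 1000)
S = np.vstack([
    np.sign(np.sin(2 * np.pi * 5 * t)),   # s1: square-wave-like source
    np.sin(2 * np.pi * 13 * t),           # s2: sinusoid
    rng.laplace(size=t.size),             # s3: noise-like source
])

# Mixing matrix M with entries m_ij, as on the slide.
M = rng.normal(size=(3, 3))

# Each EEG channel is a linear combination of the sources:
# EEG_i = m_i1 s1 + m_i2 s2 + m_i3 s3, i.e. EEG = M @ S.
EEG = M @ S

# ICA would recover S and M (up to permutation and scaling) from EEG
# alone; here we only verify the forward model for channel 1.
assert np.allclose(EEG[0], M[0, 0] * S[0] + M[0, 1] * S[1] + M[0, 2] * S[2])
```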

Slide 5: Decomposition of a measured signal
Y = M1 S1 + ... + MR SR + E
Matrix decompositions (e.g. PCA) are often not unique: for any orthogonal Q, Y = M S = (M Q)(Q* S) fits equally well.
PCA resolves the ambiguity by estimating orthogonal sources (a basis); ICA instead imposes statistical independence on the sources.
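The rotation ambiguity of a plain matrix decomposition can be checked numerically; a small sketch (the random M, S and Q are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(4, 3))   # hypothetical mixing matrix
S = rng.normal(size=(3, 500)) # hypothetical sources
Y = M @ S

# Any orthogonal Q yields an equally valid factorization
# Y = (M Q)(Q^T S), so the decomposition Y = M S alone is not unique.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
assert np.allclose((M @ Q) @ (Q.T @ S), Y)
```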

Slide 6: Computation of ICA: different implementations of 'independence'
JADE (Joint Approximate Diagonalization of Eigenmatrices):
- All higher-order cross-cumulants of independent sources are zero, so their fourth-order cumulant tensor is diagonal.
- The mixing matrix is the matrix that approximately diagonalizes the eigenmatrices of the cumulant tensor.
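The diagonality of the fourth-order cumulant tensor for independent sources, which JADE exploits, can be illustrated directly. A sketch in numpy, assuming zero-mean, unit-variance Laplace sources (the O(d^4 N) einsum is only feasible for small d):

```python
import numpy as np

rng = np.random.default_rng(2)

# Independent, zero-mean, unit-variance, non-Gaussian sources.
S = rng.laplace(size=(3, 200_000)) / np.sqrt(2)  # Laplace(scale=1) has variance 2
N = S.shape[1]

# Fourth-order cumulant of zero-mean data:
# cum(x_i,x_j,x_k,x_l) = E[x_i x_j x_k x_l]
#   - E[x_i x_j]E[x_k x_l] - E[x_i x_k]E[x_j x_l] - E[x_i x_l]E[x_j x_k]
C = S @ S.T / N
E4 = np.einsum('it,jt,kt,lt->ijkl', S, S, S, S) / N
cum = (E4
       - np.einsum('ij,kl->ijkl', C, C)
       - np.einsum('ik,jl->ijkl', C, C)
       - np.einsum('il,jk->ijkl', C, C))

# For independent sources the cumulant tensor is (approximately) diagonal;
# JADE then finds the mixing matrix by jointly diagonalizing its eigenmatrices.
diag = np.array([cum[i, i, i, i] for i in range(3)])
off = cum.copy()
for i in range(3):
    off[i, i, i, i] = 0.0
assert np.abs(off).max() < 0.1 * np.abs(diag).min()
```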

Slide 7: Computation of ICA: SOBI
SOBI (Second-Order Blind Identification):
- Assumes that the sources are autocorrelated.
- The mixing matrix also (approximately) jointly diagonalizes a set of matrices: the correlation matrices at different time lags.
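The property SOBI relies on, that the lagged correlation matrices of the true sources are diagonal, can be sketched as follows (the AR(1) sources and all sizes are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Autocorrelated sources, as SOBI assumes: independent AR(1) processes.
N, d = 50_000, 3
S = np.zeros((d, N))
e = rng.normal(size=(d, N))
phi = np.array([0.9, 0.5, -0.7])          # distinct AR coefficients
for t in range(1, N):
    S[:, t] = phi * S[:, t - 1] + e[:, t]

M = rng.normal(size=(d, d))
X = M @ S                                  # observed mixtures

def lagged_corr(Z, tau):
    """Symmetrized correlation matrix at time lag tau."""
    n = Z.shape[1]
    C = Z[:, :n - tau] @ Z[:, tau:].T / (n - tau)
    return 0.5 * (C + C.T)

# SOBI jointly diagonalizes the lagged correlation matrices of X; for the
# true sources these matrices are (approximately) diagonal at every lag.
for tau in (1, 2, 5):
    R = lagged_corr(S, tau)
    assert np.abs(R - np.diag(np.diag(R))).max() < 0.1 * np.abs(np.diag(R)).max()
```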

Slide 8: Decomposition of a measured signal (tensors)
If a signal is multi-dimensional (a higher-order tensor), multilinear algebra tools can be used that better exploit the multi-dimensional nature of the data.
Tucker / HOSVD estimates a subspace: Y = S x1 A x2 B x3 C. Like PCA, it is only unique up to rotations of the factors (A P*, B Q*, C O*), compensated in the core tensor.

Slide 9: CPA: Canonical / Parallel Factor Analysis
If a signal is multi-dimensional (a higher-order tensor), multilinear algebra tools can be used that better exploit the multi-dimensional nature of the data.
CPA writes the tensor as a sum of rank-1 terms: Y = A1 ∘ B1 ∘ S1 + ... + AR ∘ BR ∘ SR + E.
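The rank-R CPA model and its matricized (column-wise Kronecker, i.e. Khatri-Rao) form can be sketched in a few lines of numpy (all dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
I, J, K, R = 4, 5, 6, 3

# Factor matrices: column r holds A_r, B_r, S_r.
A = rng.normal(size=(I, R))
B = rng.normal(size=(J, R))
S = rng.normal(size=(K, R))

# CPA model: Y = sum_r A_r outer B_r outer S_r (a sum of rank-1 terms).
Y = np.einsum('ir,jr,kr->ijk', A, B, S)

# Equivalent matricized form: unfolding Y along its first mode gives
# A times the transposed column-wise Kronecker product of B and S.
kr = np.vstack([np.kron(B[:, r], S[:, r]) for r in range(R)]).T  # (J*K, R)
assert np.allclose(Y.reshape(I, -1), A @ kr.T)
```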

Slide 10: Something about CPA
Y = A1 ∘ B1 ∘ S1 + ... + AR ∘ BR ∘ SR + E
- CPA components are not orthogonal.
- The best rank-R approximation may not exist.
- The R components are not ordered.
- But the decomposition is unique: no rotation is possible without changing the model part.

Slide 11: Computation of CPA
CPA is often computed by Alternating Least Squares (ALS), which minimizes the (Frobenius) norm of the residuals:
1) Initialize A, S, B
2) Update A, given S and B
3) Update S, given A and B
4) Update B, given A and S
5) Iterate (2-3-4) until convergence
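The ALS loop above can be sketched in numpy. This is a plain, unoptimized version under our own conventions (the function names and the pinv-based least-squares updates are ours, not the authors' implementation):

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of U (m x R) and V (n x R) -> (m*n, R)."""
    return np.vstack([np.kron(U[:, r], V[:, r]) for r in range(U.shape[1])]).T

def cpa_als(Y, R, n_iter=500, seed=0):
    """Plain ALS for a 3rd-order CPA of Y (I x J x K); a sketch only."""
    I, J, K = Y.shape
    rng = np.random.default_rng(seed)
    A, B, S = (rng.normal(size=(n, R)) for n in (I, J, K))  # 1) initialize
    Y1 = Y.reshape(I, J * K)                      # mode-1 unfolding
    Y2 = np.moveaxis(Y, 1, 0).reshape(J, I * K)   # mode-2 unfolding
    Y3 = np.moveaxis(Y, 2, 0).reshape(K, I * J)   # mode-3 unfolding
    for _ in range(n_iter):
        A = Y1 @ np.linalg.pinv(khatri_rao(B, S).T)  # 2) update A, given S and B
        S = Y3 @ np.linalg.pinv(khatri_rao(A, B).T)  # 3) update S, given A and B
        B = Y2 @ np.linalg.pinv(khatri_rao(A, S).T)  # 4) update B, given A and S
    return A, B, S

# Usage: fit an exactly rank-2 random tensor and check the reconstruction.
rng = np.random.default_rng(1)
At, Bt, St = (rng.normal(size=(n, 2)) for n in (6, 7, 8))
Y = np.einsum('ir,jr,kr->ijk', At, Bt, St)
A, B, S = cpa_als(Y, R=2)
err = np.linalg.norm(Y - np.einsum('ir,jr,kr->ijk', A, B, S)) / np.linalg.norm(Y)
assert err < 0.05
```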

Slide 12: Computation of CPA (2)
ALS sometimes suffers long "swamps": stretches of iterations in which the cost function converges very slowly.

Slide 13: Improvement of ALS: line search
To reduce swamps, interpolate A, B and S from the estimates of the 2 previous iterations and use the interpolated matrices in the current iteration:
1) Line search along the directions between the two previous estimates
2) Then the ALS update
The choice of the step size is crucial; a step size of 1 annihilates the line-search step (i.e. we get standard ALS).
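The interpolation step can be sketched as follows (rho denotes the step size; the function name is ours):

```python
import numpy as np

def line_search_step(A_prev, A_curr, rho):
    """Extrapolate a factor matrix from the two previous ALS estimates:
    A_new = A_prev + rho * (A_curr - A_prev). The same step is applied
    to B and S, after which a normal ALS update follows."""
    return A_prev + rho * (A_curr - A_prev)

A_prev = np.ones((4, 2))
A_curr = 2 * np.ones((4, 2))

# rho = 1 annihilates the line-search step: we recover the plain ALS iterate.
assert np.allclose(line_search_step(A_prev, A_curr, 1.0), A_curr)
```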

Slide 14: Improvement of ALS: line search (variants)
- [Harshman, 1970]: "LSH"
- [Bro, 1997]: "LSB"
- [Rajih, Comon, 2005]: "Enhanced Line Search (ELS)"
- [Nion, De Lathauwer, 2006]: "Enhanced Line Search with Complex Step (ELSCS)"

Slide 17: Motivation from fMRI
These activations have different ratios in different subjects, which gives rise to a trilinear CPA structure [Beckmann et al., 2005].

Slide 18: Tensor pICA
Beckmann et al. (2005) combined ICA and CPA into "tensor pICA". Tensor pICA outperforms CPA due to the low signal-to-noise ratio.
Algorithm described in the paper:
- One iteration step to optimize the ICA cost function
- One iteration step to optimize the trilinear structure
- Iterate "until convergence"
Algorithm actually implemented in the paper:
- Compute ICA on the matricized tensor
- Afterwards, decompose the mixing vectors to obtain the trilinear decomposition

Slide 19: Does it make sense to add constraints?
- Con: uniqueness (the CPA decomposition is already unique without them)
- Pro: robustness
- Pro: more identifiable, if the constraints make sense
- Pro: see the results

Slide 21: Our algorithm
We developed a new algorithm that simultaneously imposes the independence and the trilinear constraint:
Y = A1 ∘ B1 ∘ S1 + ... + AR ∘ BR ∘ SR

Slide 22: ICA-CPA
- Compute the fourth-order cumulant tensor.
- Compute the 'eigenmatrices' of this tensor -> a 3rd-order tensor.
- Add a slice with the covariance matrix to this tensor.
- The resulting tensor has a 3rd-order CPA structure.

Slide 23: ICA-CPA (2)
- Compute the fourth-order cumulant tensor.
- Compute the 'eigenmatrices' of this tensor -> a 3rd-order tensor with a 3rd-order CPA structure.
- When the mixing matrix has a bilinear structure (each mixing vector has a Khatri-Rao structure), this tensor can be rewritten as a 5th-order tensor with a CPA structure.
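The Khatri-Rao structure of a single mixing vector, i.e. the "bilinear structure" referred to above, can be illustrated for one column (the sizes and vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
J, K = 4, 6

# A mixing vector with Kronecker (rank-1 Khatri-Rao) structure is the
# vectorization of a rank-1 matrix: each column of the mixing matrix is
# kron(b, a) for some vectors a and b.
a = rng.normal(size=J)
b = rng.normal(size=K)
m = np.kron(b, a)                 # mixing vector of length J*K

# Reshaping m recovers the rank-1 (bilinear) matrix b a^T.
Mmat = m.reshape(K, J)
assert np.linalg.matrix_rank(Mmat) == 1
assert np.allclose(Mmat, np.outer(b, a))
```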

Slide 24: How to compute the 5th-order CPA?
- ALS breaks the symmetry; simulations showed bad performance.
- The partial symmetry is naturally preserved in a line-search scheme:
  - Search directions: between the current estimate and the ALS update
  - Step size: found by rooting a real polynomial of degree 10

Slide 26: Application in fMRI?
- [Stegeman, 2007]: CPA on fMRI is comparable to tensor pICA if the correct number of components is chosen.
- [Daubechies, 2009]: Infomax ICA on fMRI works because of sparsity rather than because of independence.

Slide 27: Application in telecommunications
We consider narrow-band sources received by a uniform circular array (UCA) of I identical sensors of radius P, and assume free-space propagation. The entries of A represent the gain between a transmitter and an antenna. We generated BPSK user signals: all source distributions are binary (+1 or -1), with equal probability for both values. B contains the chips of the spreading codes for the different users.
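The BPSK sources and spreading codes described above can be generated as follows (a minimal sketch; the array-gain matrix A and the noise are omitted, and all sizes are made up):

```python
import numpy as np

rng = np.random.default_rng(6)
n_users, n_symbols, code_len = 3, 500, 8

# BPSK user signals: i.i.d. symbols in {+1, -1} with equal probability.
S = rng.choice([-1.0, 1.0], size=(n_users, n_symbols))

# B: spreading-code chips (also +/-1 here) for the different users.
B = rng.choice([-1.0, 1.0], size=(code_len, n_users))

# Each user's chip-rate contribution is the outer product of its code and
# its symbol stream; the received (noise-free, unit-gain) signal sums them.
X = sum(np.outer(B[:, r], S[r]) for r in range(n_users))
assert X.shape == (code_len, n_symbols)
```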

Slide 28: Simulation results
[Figures: well-conditioned mixture; mixture of dimensions (5, 2, 1000); rank overestimated; colored noise]

Slide 29: Simulation results (2)
[Figure: the proposed method outperforms the orthogonality constraint.]

Slide 30: Conclusion
- We developed a new algorithm, ICA-CPA, that imposes both the independence and the trilinear constraints simultaneously.
- We showed that the method outperforms both standard ICA and CPA in certain situations.
- It should only be used when its assumptions are validated.
