
1 Canonical Correlation Analysis: An overview with application to learning methods By David R. Hardoon, Sandor Szedmak, John Shawe-Taylor School of Electronics and Computer Science, University of Southampton Published in Neural Computation, 2004 Presented by: Shankar Bhargav

2 Canonical Correlation Analysis Measures the linear relationship between two multidimensional variables by finding two sets of basis vectors such that the correlation between the projections of the variables onto these basis vectors is maximized, and by determining the corresponding correlation coefficients.

3 Canonical Correlation Analysis More than one canonical correlation will be found, each corresponding to a different pair of basis vectors (canonical variates). Correlations between successively extracted canonical variates are successively smaller. Correlation coefficients: the proportion of correlation between the canonical variates accounted for by a particular variable.

4 Differences from simple correlation Not dependent on the coordinate system of the variables; finds the directions that yield maximum correlations.

5 Find basis vectors w_x, w_y for two sets of variables x, y such that the correlation between the projections S_x = w_x^T x and S_y = w_y^T y is maximized:

ρ = E[S_x S_y] / √( E[S_x²] E[S_y²] )
  = E[(w_x^T x)(w_y^T y)] / √( E[(w_x^T x)(x^T w_x)] E[(w_y^T y)(y^T w_y)] )

6 ρ = max_{w_x, w_y} E[w_x^T x y^T w_y] / √( E[w_x^T x x^T w_x] E[w_y^T y y^T w_y] )
    = max_{w_x, w_y} (w_x^T C_xy w_y) / √( (w_x^T C_xx w_x)(w_y^T C_yy w_y) )

Solve this maximization subject to the constraints w_x^T C_xx w_x = 1 and w_y^T C_yy w_y = 1.

7 The solution is given by the eigenvalue equations

C_xx^{-1} C_xy C_yy^{-1} C_yx w_x = ρ² w_x
C_yy^{-1} C_yx C_xx^{-1} C_xy w_y = ρ² w_y

or equivalently by the coupled equations

C_xy w_y = ρ λ_x C_xx w_x
C_yx w_x = ρ λ_y C_yy w_y,   where λ_x = λ_y^{-1} = √( (w_y^T C_yy w_y) / (w_x^T C_xx w_x) )
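The eigenvalue equations above can be checked numerically. The following is a minimal sketch in Python/NumPy (the toy data and all variable names are mine, not the paper's): the top eigenvalue of C_xx^{-1} C_xy C_yy^{-1} C_yx is ρ², and w_y is recovered from w_x via C_yy^{-1} C_yx w_x.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Toy data: two 2-D views sharing one latent signal z
z = rng.normal(size=n)
x = np.c_[z + 0.5 * rng.normal(size=n), rng.normal(size=n)]
y = np.c_[rng.normal(size=n), -z + 0.5 * rng.normal(size=n)]
x -= x.mean(0)
y -= y.mean(0)

# Sample covariance blocks
Cxx = x.T @ x / n
Cyy = y.T @ y / n
Cxy = x.T @ y / n
Cyx = Cxy.T

# Solve  Cxx^-1 Cxy Cyy^-1 Cyx w_x = rho^2 w_x
M = np.linalg.solve(Cxx, Cxy) @ np.linalg.solve(Cyy, Cyx)
rho2, Wx = np.linalg.eig(M)
order = np.argsort(-rho2.real)
rho = np.sqrt(rho2.real[order].clip(0.0, 1.0))  # canonical correlations, decreasing
Wx = Wx.real[:, order]

# Recover w_y (up to scale) from the coupled equations
Wy = np.linalg.solve(Cyy, Cyx @ Wx)

# The first canonical correlation equals |corr| of the projections
sx, sy = x @ Wx[:, 0], y @ Wy[:, 0]
r = abs(np.corrcoef(sx, sy)[0, 1])
```

Since the same sample covariances are used throughout, `r` matches `rho[0]` to machine precision.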

8 CCA in Matlab [A, B, r, U, V] = canoncorr(x, y) x, y: sets of variables in matrix form (each row an observation, each column an attribute/feature) A, B: matrices of canonical coefficients (the basis vectors) r: vector of the canonical correlations (successively decreasing) U, V: the canonical variates, i.e. the projections of x and y onto A and B respectively

9 Interpretation of CCA The canonical coefficients represent each variable's unique contribution to the relation; multicollinearity may obscure relationships. Factor loading: the correlation between a canonical variate (basis vector) and the variables in its set. The proportion of variance explained by the canonical variates can be inferred from the factor loadings.

10 Redundancy Calculation Redundancy_left = [∑(loadings_left²)/p] × R_c² Redundancy_right = [∑(loadings_right²)/q] × R_c² p – number of variables in the first (left) set; q – number of variables in the second (right) set; R_c² – the respective squared canonical correlation. Since successively extracted roots are uncorrelated, we can sum the redundancies across all canonical correlations to get a single index of redundancy.
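As a small numeric sketch of the redundancy formula (the loadings and R_c² below are made-up illustrative values, not results from the paper):

```python
import numpy as np

# Hypothetical factor loadings for a 3-variable left set and a 2-variable right set
loadings_left = np.array([0.8, 0.6, 0.3])
loadings_right = np.array([0.7, 0.5])
Rc2 = 0.64  # assumed squared canonical correlation for this root

# Redundancy = (mean squared loading) * squared canonical correlation
red_left = (loadings_left**2).sum() / loadings_left.size * Rc2
red_right = (loadings_right**2).sum() / loadings_right.size * Rc2
```

Here red_left = (1.09/3) × 0.64 ≈ 0.2325 and red_right = (0.74/2) × 0.64 = 0.2368; summing such terms over all roots gives the single redundancy index mentioned above.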

11 Application Kernel CCA (KCCA) can be used to find nonlinear relationships between multivariates. Given two views of the same semantic object, it can extract a representation of the semantics: Speaker recognition – audio and lip movement; Image retrieval – image features (HSV, texture) and associated text.
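A minimal sketch of regularized kernel CCA with an RBF kernel, to show how the linear eigenproblem carries over to kernel matrices. The data, kernel width, and regularization value are illustrative assumptions, not the paper's setup (Hardoon et al. additionally use partial Gram–Schmidt and other refinements):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
z = rng.uniform(-1, 1, n)
# Two views related nonlinearly through the same latent variable z
X = np.c_[np.sin(3 * z), rng.normal(scale=0.1, size=n)]
Y = np.c_[z**3, rng.normal(scale=0.1, size=n)]

def rbf(A, gamma=1.0):
    # Gram matrix of the RBF kernel k(a, b) = exp(-gamma ||a - b||^2)
    d = ((A[:, None, :] - A[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def center(K):
    # Double-center the Gram matrix (feature-space mean removal)
    H = np.eye(len(K)) - 1.0 / len(K)
    return H @ K @ H

Kx, Ky = center(rbf(X)), center(rbf(Y))
reg = 0.1  # regularization is essential: unregularized KCCA trivially reaches rho = 1

# Regularized kernel eigenproblem: (Kx + reg I)^-1 Ky (Ky + reg I)^-1 Kx a = rho^2 a
A = np.linalg.solve(Kx + reg * np.eye(n), Ky)
B = np.linalg.solve(Ky + reg * np.eye(n), Kx)
rho2 = np.linalg.eigvals(A @ B).real.max()
rho = np.sqrt(min(max(rho2, 0.0), 1.0))  # first kernel canonical correlation
```

The regularization value trades off overfitting against correlation strength; the paper selects it rather than fixing it at an arbitrary constant as done here.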

12 Use of KCCA in cross-modal retrieval 400 records of JPEG images per class, with associated text, over a total of 3 classes. Data was split randomly into two parts for training and test. Features: Image – HSV color, Gabor texture; Text – term frequencies. Results are averaged over 10 runs.

13 [Figure slide]

14 Cross-modal retrieval Content-based retrieval: retrieve images in the same class as the query. Tested with retrieved sets of 10 and 30 images, where count_jk = 1 if image k in the set has the same label as text query j in the set, and count_jk = 0 otherwise.

15 Comparison of KCCA (with 5 and 30 eigenvectors) with GVSM on content-based retrieval

16 [Figure slide]

17 Mate-based retrieval Match the exact image among the retrieved images. Tested with retrieved sets of 10 and 30 images, where count_j = 1 if the exact matching image is present in the set, and count_j = 0 otherwise.
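The accuracy measure implied here is the average of the count_j indicators over the test queries. A toy sketch (the similarity scores and all names are mine, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(2)
n_queries, n_images, set_size = 50, 50, 10

# Toy cross-modal similarity scores between each text query and each image;
# image j is the true mate of query j, so the diagonal gets a boost.
scores = rng.normal(size=(n_queries, n_images))
scores[np.arange(n_queries), np.arange(n_queries)] += 2.0

# count_j = 1 if the exact matching image is among the top `set_size` retrieved
top = np.argsort(-scores, axis=1)[:, :set_size]
counts = (top == np.arange(n_queries)[:, None]).any(axis=1)
accuracy = counts.mean()
```

Note this is exactly the quantity the comments slide criticizes: the score depends on the retrieved-set size, since a larger set makes count_j = 1 easier to achieve.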

18 [Figure slide]

19 Comparison of KCCA (with 30 and 150 eigenvectors) with GVSM on mate-based retrieval

20 Comments The good: good explanation of CCA and KCCA; innovative use of KCCA in an image-retrieval application. The bad: the data set and the number of classes were small; the retrieved-set size is not taken into account when calculating accuracy in mate-based retrieval; cross-validation tests could have been done.

21 Limitations and Assumptions of CCA At least 40 to 60 times as many cases as variables is recommended to get reliable estimates for two roots – Barcikowski & Stevens (1986). Outliers can greatly affect the canonical correlation. Variables in the two sets should not be completely redundant.

22 Thank you

