Canonical Correlation
What is Canonical Correlation?
– Canonical correlation seeks the weighted linear composite for each variate (the set of D.V.s or I.V.s) that maximizes the overlap in their distributions.
– The labeling of D.V. and I.V. is arbitrary; the procedure looks for relationships, not causation.
– The goal is to maximize the correlation (not the variance extracted, as in most other techniques).
– Canonical correlation is the "mother" multivariate model.
– It lacks specificity in interpreting results, which may limit its usefulness in many situations.
X1, X2, X3, X4, ..., Xq    Y1, Y2, Y3, Y4, ..., Yp
What is the best way to understand how the variables in these two sets are related?
– Bivariate correlations across sets
– Multiple correlations across sets
– Principal components within sets; correlations between principal components across sets
X1, X2, X3, X4, ..., Xq    Y1, Y2, Y3, Y4, ..., Yp
What linear combinations of the X variables (u) and the Y variables (t) will maximize their correlation?
u = b1X1 + b2X2 + b3X3 + b4X4 + ... + bqXq
t = a1Y1 + a2Y2 + a3Y3 + a4Y4 + ... + apYp
What linear combinations of the X variables (u) and the Y variables (t) will maximize their correlation?
u = b1X1 + b2X2 + b3X3 + b4X4 + ... + bqXq
t = a1Y1 + a2Y2 + a3Y3 + a4Y4 + ... + apYp
Max(Rc), where Rc is the canonical correlation between the two variates, which are linear composites of each set of variables; Rc² is the overlapping variance between them.
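The maximization described above can be sketched in a few lines of numpy. This is an illustrative sketch on simulated data (not any example from these slides): after centering and whitening each set, the singular values of the whitened cross-covariance matrix are the canonical correlations, largest first.

```python
import numpy as np

# Simulated data: q = 3 X variables, p = 2 Y variables, n = 200 cases.
rng = np.random.default_rng(0)
n = 200
X = rng.standard_normal((n, 3))
Y = X[:, :2] + 0.5 * rng.standard_normal((n, 2))  # Y built to correlate with X

Xc = X - X.mean(axis=0)   # center both sets
Yc = Y - Y.mean(axis=0)

# Whiten each set via the Cholesky factor of its cross-product matrix,
# then take the SVD of the whitened cross-covariance: the singular
# values are the canonical correlations (Rc).
Kx = np.linalg.inv(np.linalg.cholesky(Xc.T @ Xc))
Ky = np.linalg.inv(np.linalg.cholesky(Yc.T @ Yc))
U, s, Vt = np.linalg.svd(Kx @ (Xc.T @ Yc) @ Ky.T)

b = Kx.T @ U[:, 0]        # canonical weights for u = Xb
a = Ky.T @ Vt[0]          # canonical weights for t = Ya
u, t = Xc @ b, Yc @ a     # the first pair of canonical variates

print(s[:2])                        # min(q, p) = 2 canonical correlations
print(np.corrcoef(u, t)[0, 1])      # equals s[0], the maximum Rc
```

No linear combinations of the two sets can correlate more strongly than the first pair: the SVD delivers the maximizing weights directly.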
Assumptions
– Multiple continuous variables for D.V.s and I.V.s, or categorical variables with dummy coding
– Assumes a linear relationship between any two variables and between the variates
– Multivariate normality is necessary to perform statistical tests
– Sensitive to heteroscedasticity, which decreases the correlation between variables
– Multicollinearity in either variate confounds interpretation of canonical results
When to use Canonical Correlation?
– A descriptive technique that can define structure in both the D.V. and I.V. variates simultaneously
– A series of measures is used for both the D.V.s and the I.V.s
– Canonical correlation can also define structure in each variate; the variates are derived to maximize their correlation
Objectives of Canonical Correlation
– Determine the magnitude of the relationships that may exist between two sets of variables
– Derive a variate(s) for each set of criterion and predictor variables such that the variates of each set are maximally correlated
– Explain the nature of whatever relationships exist between the sets of criterion and predictor variables
– Seek the maximum correlation of shared variance between the two sides of the equation
Information: Canonical Functions
– Canonical correlation: the correlation between the two sets; the largest possible correlation that can be found between linear combinations
– Canonical variate: a linear combination created from the I.V. set or the D.V. set
– Extraction of canonical variates can continue up to a maximum defined by the number of measures in the smaller of the two sets
Information: Canonical Variates
– Canonical weights: the weights used to create the linear combinations; interpreted like regression coefficients
– Canonical loadings: correlations between each variable and its own variate; interpreted like loadings in PCA
– Canonical cross-loadings: correlation of each observed independent or dependent variable with the opposite canonical variate
Interpreting Canonical Variates
Canonical Weights
– A larger weight contributes more to the function
– A negative weight indicates an inverse relationship with the other variables
– Be careful of multicollinearity
– Assess stability across samples
Interpreting Canonical Variates
Canonical Loadings – a direct assessment of each variable's contribution to its respective canonical variate
– Larger loadings = more important to deriving the canonical variate
– The correlation between the original variable and its canonical variate
– Assess stability of loadings across samples
Interpreting Canonical Variates
Canonical Cross-Loadings
– Measure of the correlation of each original D.V. with the independent canonical variate
– A direct assessment of the relationship between each D.V. and the independent variate
– Provides a more pure measure of the dependent and independent variable relationship
– The preferred approach to interpretation
X1, X2, X3, X4, ..., Xq    Y1, Y2, Y3, Y4, ..., Yp
Canonical Cross-Loadings: represents the correlation between Y1 and the X variate.
X1, X2, X3, X4, ..., Xq    Y1, Y2, Y3, Y4, ..., Yp
Canonical Loadings and Weights
Loading X1: the correlation between X1 and the X variate (its own variate)
Weight X1: the unique partial contribution of X1 to the X variate (its own variate)
Deriving Canonical Functions & Assessing Overall Fit
– Maximum # of variate functions = # of variables in the smaller set (I.V. or D.V.)
– Variates are extracted in steps; at each step the pair that accounts for the maximum residual variance is selected
– The first pair of canonical variates has the highest intercorrelation possible
– Successive pairs of variates are orthogonal to and independent of all previous variates
– The squared canonical correlation represents the amount of variance in one canonical variate that is accounted for by the other canonical variate
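The extraction properties above can be checked numerically. A hedged numpy sketch on simulated data (variable counts chosen only for illustration): with 4 I.V.s and 3 D.V.s there are at most min(4, 3) = 3 canonical functions, the canonical correlations come out in decreasing order, and the variates within each set are mutually uncorrelated.

```python
import numpy as np

# Simulated data: 4 I.V.s and 3 D.V.s, so at most min(4, 3) = 3 functions.
rng = np.random.default_rng(1)
n = 500
X = rng.standard_normal((n, 4))
Y = 0.8 * X[:, :3] + rng.standard_normal((n, 3))

Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
Kx = np.linalg.inv(np.linalg.cholesky(Xc.T @ Xc))
Ky = np.linalg.inv(np.linalg.cholesky(Yc.T @ Yc))
U, s, Vt = np.linalg.svd(Kx @ (Xc.T @ Yc) @ Ky.T)

u = Xc @ (Kx.T @ U[:, :3])   # the three X-side canonical variates
t = Yc @ (Ky.T @ Vt.T)       # the three Y-side canonical variates

# Canonical correlations arrive in decreasing order, and successive
# variates within a set are mutually uncorrelated (orthogonal).
print(np.round(s, 3))
print(np.round(np.corrcoef(u, rowvar=False), 3))  # ≈ identity matrix
```

The first function captures the strongest relationship; each later function captures the strongest relationship remaining after the earlier variates are partialed out.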
Interpretation: Selection of Functions
– Level of statistical significance of the function: usually an F statistic based on Rao's approximation, p < .05
– Magnitude of the canonical relationship: the size of the canonical correlations; practical significance
– Rc² is the variance shared by the variates, not the variance extracted from the predictor and criterion variables
– Redundancy index: a summary of the ability of a set of predictor variables to account for variation in the criterion variables
Redundancy Index
Redundancy = [mean of (loadings)²] × Rc²
– Provides the shared variance that can be explained by the canonical function
– Redundancy is provided for both the I.V. and D.V. variates, but the D.V. variate is of more interest
– Both the loadings and Rc² must be high to get high redundancy
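A minimal sketch of the redundancy formula above, using hypothetical loadings and a hypothetical Rc² (not values from any example in these slides):

```python
import numpy as np

# Hypothetical loadings of four D.V.s on their own canonical variate,
# and a hypothetical squared canonical correlation (Rc^2).
loadings = np.array([0.85, 0.70, -0.40, 0.30])
rc2 = 0.50

# Redundancy = [mean of (loadings)^2] x Rc^2
redundancy = np.mean(loadings ** 2) * rc2
print(round(redundancy, 4))   # 0.1828: 18.28% of DV variance explained
```

Note that even with a sizable Rc² of .50, modest loadings pull the redundancy well below the squared canonical correlation, which is why both terms must be high.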
Considerations: Canonical R
– Small sample sizes may have an adverse effect
– Suggested minimum sample size = 10 × the number of variables
– Selection of variables to be included should have a conceptual or theoretical basis
– Inclusion of irrelevant variables or deletion of relevant variables may adversely affect the entire canonical solution
– All I.V.s must be interrelated and all D.V.s must be interrelated
– The composition of the D.V. and I.V. variates is critical to producing practical results
Limitations
– Rc reflects only the variance shared by the linear composites, not the variances extracted from the variables
– Canonical weights are subject to a great deal of instability
– Interpretation is difficult because rotation is not possible
– Precise statistics have not been developed to interpret canonical analysis
Crosby, Evans, and Cowles (1990) examined the impact of relationship quality on the outcome of insurance sales. They examined relationship characteristics and outcomes for 151 transactions.
Relationship Characteristics:
– Appearance similarity
– Lifestyle similarity
– Status similarity
– Interaction intensity
– Mutual disclosure
– Cooperative intentions
Crosby, Evans, and Cowles (1990) examined the impact of relationship quality on the outcome of insurance sales. They examined relationship characteristics and outcomes for 151 transactions.
Outcomes:
– Trust in the salesperson
– Satisfaction with the salesperson
– Cross-sell
– Total insurance sales
SPSS matrix data input (the numeric N, mean, standard deviation, and correlation rows are missing from the extracted slide):
matrix data variables = rowtype_ trust satis cross total appear life status interact mutual coop.
begin data
N ...
MEAN ...
STDDEV ...
CORR 1.00
CORR ...
(remaining correlation rows missing)
end data.
variable labels
  trust 'Trust in the salesperson'
  satis 'Satisfaction with the salesperson'
  cross 'Cross-sell'
  total 'Total insurance sales'
  appear 'Appearance similarity'
  life 'Lifestyle similarity'
  status 'Status similarity'
  interact 'Interaction intensity'
  mutual 'Mutual disclosure'
  coop 'Cooperative intentions'.
MANOVA trust satis cross total with appear life status interact mutual coop
  /matrix=IN(*)
  /print signif(multiv dimenr eigen stepdown univ hypoth) error(cor)
  /discrim raw stan cor alpha(1).
Multivariate Tests of Significance (S = 4, M = 1/2, N = 69 1/2)
[SPSS table: Test Name (Pillais, Hotellings, Wilks, Roys), Value, Approx. F, Hypoth. DF, Error DF, Sig. of F – values missing from the extracted slide]
There is at least one significant relationship between the two sets of measures. With 6 and 4 measures in the two sets, there is a maximum of 4 possible sets of linear combinations that can be formed.
Eigenvalues and Canonical Correlations
[SPSS table: Root No., Eigenvalue, Pct., Cum. Pct., Canon Cor. (Rc), Sq. Cor (Rc²) – values missing from the extracted slide]
Dimension Reduction Analysis
[SPSS table: Roots, Wilks L., F, Hypoth. DF, Error DF, Sig. of F – values missing from the extracted slide]
Two of the four possible sets of linear combinations are significant.
Standardized canonical coefficients for DEPENDENT variables
[SPSS table: TRUST, SATIS, CROSS, TOTAL by function number – values missing from the extracted slide]
Outcomes: trust in the salesperson, satisfaction with the salesperson, cross-sell, total insurance sales
Correlations between DEPENDENT and canonical variables
[SPSS table: TRUST, SATIS, CROSS, TOTAL by function number – values missing from the extracted slide]
Outcomes: trust in the salesperson, satisfaction with the salesperson, cross-sell, total insurance sales
Standardized canonical coefficients for COVARIATES
[SPSS table: APPEAR, LIFE, STATUS, INTERACT, MUTUAL, COOP by canonical variable – values missing from the extracted slide]
Relationship characteristics: appearance similarity, lifestyle similarity, status similarity, interaction intensity, mutual disclosure, cooperative intentions
Correlations between COVARIATES and canonical variables
[SPSS table: APPEAR, LIFE, STATUS, INTERACT, MUTUAL, COOP by canonical variable – values missing from the extracted slide]
Relationship characteristics: appearance similarity, lifestyle similarity, status similarity, interaction intensity, mutual disclosure, cooperative intentions
Remaining issues: How much variance is really accounted for? How easily does the procedure capitalize on chance?
How much variance is really accounted for? Reliance on the canonical correlations for evidence of variance accounted for across sets of variables can be misleading. Each linear combination only captures a portion of the variance in its own set. That needs to be taken into account when judging the variance accounted for across sets.
The squared canonical correlation indicates the shared variance between linear combinations from the two sets.
Each linear combination accounts for only a portion of the variance in the variables in its set.
Redundancy Index
Redundancy = [mean of (loadings)²] × Rc²
– Provides the shared variance that can be explained by the canonical function
– Redundancy is provided for both the I.V. and D.V. variates, but the D.V. variate is of more interest
– Both the loadings and Rc² must be high to get high redundancy
– The proportion of variance in the variables of the opposite set that is accounted for by the linear combination
Fader and Lodish (1990) collected data for 331 different grocery products. They sought relations between what they called structural variables and promotional variables. The structural variables were characteristics not likely to be changed by short-term promotional activities. The promotional variables represented promotional activities. The major goal was to determine if different promotional activities were associated with different types of grocery products.
Structural variables (X):
PENET – Percentage of households making at least one category purchase
PCYCLE – Average interpurchase time
PRICE – Average dollars spent in the category per purchase occasion
PVTSH – Combined market share for all private-label and generic products
PURHH – Average number of purchase occasions per household during the year
Promotional variables (Y):
FEAT – Percent of volume sold on feature (advertised in local newspaper)
DISP – Percent of volume sold on display (e.g., end of aisle)
PCUT – Percent of volume sold at a temporary reduced price
SCOUP – Percent of volume purchased using a retailer's store coupon
MCOUP – Percent of volume purchased using a manufacturer's coupon
SPSS syntax
Canonical correlation analysis must be obtained using syntax statements in SPSS:
MANOVA penet purhh pcycle price pvtsh with feat disp pcut scoup mcoup
  /print signif(multiv dimenr eigen stepdown univ hypoth) error(cor)
  /discrim raw stan cor alpha(1).
[Example data rows for BEER, WINE, FRESH BREAD, and CUPCAKES on PENET, PURHH, PCYCLE, PRICE, PVTSH, FEAT, DISP, PCUT, SCOUP, MCOUP – values missing from the extracted slide]
Structural variables (X):
PENET – Percentage of households making at least one category purchase
PCYCLE – Average interpurchase time
PRICE – Average dollars spent in the category per purchase occasion
PVTSH – Combined market share for all private-label and generic products
PURHH – Average number of purchase occasions per household during the year
Promotional variables (Y):
FEAT – Percent of volume sold on feature (advertised in local newspaper)
DISP – Percent of volume sold on display (e.g., end of aisle)
PCUT – Percent of volume sold at a temporary reduced price
SCOUP – Percent of volume purchased using a retailer's store coupon
MCOUP – Percent of volume purchased using a manufacturer's coupon
Raw canonical coefficients for COVARIATES
[SPSS table: FEAT, DISP, PCUT, SCOUP, MCOUP by function number – values missing from the extracted slide]
The same coefficients exist for the other set of variables.
[SPSS table: Test Name (Pillais, Hotellings, Wilks, Roys), Value, Approx. F, Hypoth. DF, Error DF, Sig. of F – values missing from the extracted slide]
These tests indicate whether there is any significant relationship between the two sets of variables. They do not indicate how many of those sets of linear combinations are significant. With 5 variables in each set, there are up to 5 sets of linear combinations that could be derived. This test tells us that at least the first one is significant.
Eigenvalues and Canonical Correlations
[SPSS table: Root No., Eigenvalue, Pct., Cum. Pct., Canon Cor., Sq. Cor – values missing from the extracted slide]
The canonical correlations are extracted in decreasing size. At each step they represent the largest correlation possible between linear combinations in the two sets, provided the linear combinations are independent of any previously derived linear combinations.
Dimension Reduction Analysis
[SPSS table: Roots, Wilks L., F, Hypoth. DF, Error DF, Sig. of F – values missing from the extracted slide]
Procedures for testing the significance of the canonical correlations can be applied sequentially. At each step, the test indicates whether there are any remaining significant relationships between the two sets. In this case, three sets of linear combinations can be formed.
As in principal components, identifying the number of significant sets of linear combinations is just the beginning. The nature of those linear combinations must also be determined. This requires interpreting the canonical weights and loadings.
Raw canonical coefficients for DEPENDENT variables
[SPSS table: PENET, PURHH, PCYCLE, PRICE, PVTSH by function number – values missing from the extracted slide]
The linear combinations can be formed using the variables in their original metrics. Sometimes this makes it easier to understand the role a particular variable plays because the metric is well understood.
Standardized canonical coefficients for DEPENDENT variables
[SPSS table: PENET, PURHH, PCYCLE, PRICE, PVTSH by function number – values missing from the extracted slide]
The standardized canonical coefficients are the weights applied to standardized variables to create the new linear combinations.
Structural variables (X):
PENET – Percentage of households making at least one category purchase
PCYCLE – Average interpurchase time
PRICE – Average dollars spent in the category per purchase occasion
PVTSH – Combined market share for all private-label and generic products
PURHH – Average number of purchase occasions per household during the year
Correlations between DEPENDENT and canonical variables
[SPSS table: PENET, PURHH, PCYCLE, PRICE, PVTSH by function number – values missing from the extracted slide]
The loadings provide information about the bivariate relationship between each variable and each linear combination.
Structural variables (X):
PENET – Percentage of households making at least one category purchase
PCYCLE – Average interpurchase time
PRICE – Average dollars spent in the category per purchase occasion
PVTSH – Combined market share for all private-label and generic products
PURHH – Average number of purchase occasions per household during the year
Standardized canonical coefficients for COVARIATES
[SPSS table: FEAT, DISP, PCUT, SCOUP, MCOUP by canonical variable – values missing from the extracted slide]
Promotional variables (Y):
FEAT – Percent of volume sold on feature (advertised in local newspaper)
DISP – Percent of volume sold on display (e.g., end of aisle)
PCUT – Percent of volume sold at a temporary reduced price
SCOUP – Percent of volume purchased using a retailer's store coupon
MCOUP – Percent of volume purchased using a manufacturer's coupon
Correlations between COVARIATES and canonical variables
[SPSS table: FEAT, DISP, PCUT, SCOUP, MCOUP by canonical variable – values missing from the extracted slide]
Promotional variables (Y):
FEAT – Percent of volume sold on feature (advertised in local newspaper)
DISP – Percent of volume sold on display (e.g., end of aisle)
PCUT – Percent of volume sold at a temporary reduced price
SCOUP – Percent of volume purchased using a retailer's store coupon
MCOUP – Percent of volume purchased using a manufacturer's coupon
Variance in dependent variables explained by canonical variables
[SPSS table: CAN. VAR., Pct Var DE, Cum Pct DE, Pct Var CO, Cum Pct CO – values missing from the extracted slide]
Variance in covariates explained by canonical variables
[SPSS table: CAN. VAR., Pct Var DE, Cum Pct DE, Pct Var CO, Cum Pct CO – values missing from the extracted slide]
Correlations between DEPENDENT and canonical variables, Function 1: PENET = .956, PURHH = .555, PCYCLE and PRICE (values missing), PVTSH = .336
Average squared loading = (Σ L²_i,1) / i
The average squared loading (33.462%) times the squared canonical correlation (.413) gives the redundancy.
Interpretation: Average Squared Loading
The canonical variate extracts XX% of the variance in variables a, b, and c.
– Example: the canonical variate extracts 33.46% of the variance in the percent of households making at least one purchase, average interpurchase time, average dollars spent in the category per purchase occasion, and average number of purchase occasions per household per year.
Interpretation: Redundancy
– Redundancy is 13.81%
– Indicates that the promotional variate extracts 13.81% of the variance in the structural variables (purchase decisions)
Variance in dependent variables explained by canonical variables
[SPSS table: CAN. VAR., Pct Var DE, Cum Pct DE, Pct Var CO, Cum Pct CO – values missing from the extracted slide]
Variance in covariates explained by canonical variables
[SPSS table: CAN. VAR., Pct Var DE, Cum Pct DE, Pct Var CO, Cum Pct CO – values missing from the extracted slide]
(The "Pct Var" columns are the average squared loadings; multiplied by Rc², they give the redundancy values.)
Interpretation: Average Squared Loading
The canonical variate extracts XX% of the variance in variables a, b, and c.
– Example: the canonical variate extracts 52.47% of the variance in percent of volume sold on feature, percent of volume sold on display, percent of volume sold at a temporary reduced price, and percent of volume purchased using a retailer's store coupon.
Interpretation: Redundancy
– Redundancy is 21.54%
– Indicates that the structural (purchase decision) variate extracts 21.54% of the variance in the promotional variables
– Any given loading can be squared to indicate the proportion of the variance in that variable that is accounted for by that canonical variate.
– The sum of the squared loadings for a given variable indicates the total proportion of its variance accounted for by the collection of canonical variates.
– The average of the squared loadings for a canonical variate is the adequacy coefficient: the proportion of variance in the collection of variables that is accounted for by the canonical variate.
– The redundancy coefficient is the proportion of variance in a set of variables that is accounted for by a linear combination from the other set.
– The sum of the redundancy coefficients gives the total proportion of variance in one set that is accounted for by the other set; these values will usually differ between the two sets.
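The adequacy and redundancy coefficients described above can be computed directly from a loading matrix. The loading matrix and squared canonical correlations below are hypothetical, chosen only to illustrate the arithmetic:

```python
import numpy as np

# Hypothetical loading matrix: 4 variables (rows) x 2 canonical variates
# (columns), with hypothetical squared canonical correlations (Rc^2).
L = np.array([[ 0.90,  0.10],
              [ 0.70, -0.40],
              [-0.30,  0.80],
              [ 0.20,  0.60]])
rc2 = np.array([0.45, 0.20])

per_variable = (L ** 2).sum(axis=1)   # each variable's variance captured by all variates
adequacy = (L ** 2).mean(axis=0)      # adequacy coefficient of each variate
redundancy = adequacy * rc2           # variance explained by the OTHER set's variates

print(np.round(adequacy, 4))          # [0.3575 0.2925]
print(np.round(redundancy, 4))        # [0.1609 0.0585]
print(round(redundancy.sum(), 4))     # 0.2194 total redundancy for this set
```

Running the same computation on the other set's loadings (with the same Rc² values) generally gives different adequacy and redundancy figures, which is why the two sets must be reported separately.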