**Multivariate Statistics**

Discriminant Function Analysis and MANOVA

**Discriminant Function Analysis**

You wish to predict group membership from a set of two or more continuous variables. Example: the IRS wants to classify tax returns as OK or fraudulent. They have data on many predictor variables from audits conducted in past years.

**The Discriminant Function**

The discriminant function is a weighted linear combination of the predictors. The weights are selected so that the two groups differ as much as possible on the discriminant function.

**Eigenvalue and Canonical r**

Compute a discriminant score, Di, for each case, then use ANOVA to compare the groups on Di. The eigenvalue is the ratio SSBetween Groups / SSWithin Groups.
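As a toy illustration (made-up discriminant scores, not from any real analysis), the eigenvalue and canonical r can be computed by hand:

```python
# Toy example: discriminant scores D_i for two groups (made-up numbers).
group_a = [1.0, 2.0, 3.0]
group_b = [5.0, 6.0, 7.0]

scores = group_a + group_b
grand_mean = sum(scores) / len(scores)

def ss(values, mean):
    """Sum of squared deviations from a mean."""
    return sum((v - mean) ** 2 for v in values)

mean_a = sum(group_a) / len(group_a)
mean_b = sum(group_b) / len(group_b)

ss_within = ss(group_a, mean_a) + ss(group_b, mean_b)
ss_between = (len(group_a) * (mean_a - grand_mean) ** 2
              + len(group_b) * (mean_b - grand_mean) ** 2)

eigenvalue = ss_between / ss_within                      # 24 / 4 = 6.0
canonical_r = (ss_between / (ss_between + ss_within)) ** 0.5

print(eigenvalue)               # 6.0
print(round(canonical_r, 3))    # 0.926
```

For two groups the canonical correlation equals the square root of SSBetween / SSTotal, which is why it is computed from the same ANOVA sums of squares.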

**Classification**

The analysis includes a classification function. This allows one to predict group membership for any case on which you have data on the predictors. Those who are predicted to have submitted fraudulent returns are audited.
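A minimal sketch of the idea, using made-up group centroids and a nearest-centroid rule on the discriminant score (a simplification of the actual classification functions, which also take prior probabilities into account):

```python
# Nearest-centroid classification on the discriminant score D.
# The centroid values are made up; a real analysis derives them
# from the group means on D in the training data.
centroids = {"OK": -0.8, "fraudulent": 1.5}

def classify(d_score):
    """Assign the case to the group whose centroid on D is closest."""
    return min(centroids, key=lambda group: abs(d_score - centroids[group]))

print(classify(1.2))   # fraudulent (closer to the 1.5 centroid)
print(classify(-0.5))  # OK (closer to the -0.8 centroid)
```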

**Two or More Discriminant Functions**

You may be able to obtain more than one discriminant function. The maximum number you can obtain is the smaller of:

- The number of predictor variables
- One less than the number of groups

Each function is orthogonal to the others. The first will have the greatest eigenvalue, the second the next greatest, and so on.
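The counting rule is simple enough to write directly. For example, the juror study described later has eight predictors and three verdict groups, so it can yield at most two discriminant functions:

```python
def max_discriminant_functions(n_predictors, n_groups):
    """Smaller of the number of predictors and (number of groups - 1)."""
    return min(n_predictors, n_groups - 1)

print(max_discriminant_functions(8, 3))  # 2
print(max_discriminant_functions(1, 5))  # 1 (limited by the predictors)
```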

**Labeling Discriminant Functions**

You may wish to name these functions you have created or discovered. As when naming factors from a factor analysis, look at the loadings (the correlations between Di and the predictor variables) and at the standardized discriminant function coefficients (the weights).

**Predicting Jurors’ Verdict Selections**

Poulson, Braithwaite, Brondino, and Wuensch (1997). Subjects watched a simulated trial in which the defendant was accused of murder. There is no doubt that he committed the crime; he is pleading insanity. What verdict does the juror recommend?

**The Verdict Choices**

- Guilty
- GBMI (Guilty But Mentally Ill)
- NGRI (Not Guilty by Reason of Insanity)

The jurors are not allowed to know the consequences of these different verdicts.

**Eight Predictor Variables**

- Attitude about crime control
- Attitude about the insanity defense
- Attitude about the death penalty
- Attitude about the prosecuting attorneys
- Attitude about the defense attorneys
- Assessment of the expert testimony
- Assessment of the mental status of the defendant
- Can the defendant be rehabilitated?

**Multicollinearity**

This is a problem that arises when one predictor can be nearly perfectly predicted by a weighted combination of the others. It creates problems with the analysis. One solution is to drop one or more of the predictors. If two predictors are that highly correlated, what is lost by dropping one of them?
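A quick way to spot the simplest case, two nearly redundant predictors, is their correlation (full multicollinearity diagnosis looks at each predictor regressed on all the others, e.g. via variance inflation factors). The values below are made up for illustration:

```python
# Pearson r between two predictors.  The made-up x2 values are almost
# an exact linear function of x1, so r will be very close to 1.
x1 = [2.0, 4.0, 6.0, 8.0, 10.0]
x2 = [4.1, 7.9, 12.0, 16.1, 19.9]

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson_r(x1, x2)
print(round(r, 4))  # very close to 1: dropping one predictor loses little
```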

**But I Do Not Want To Drop Any**

The lead researcher did not want to drop any of the predictors. He considered them all theoretically important. So we did a little magic to evade the multicollinearity problem.

**Principal Components**

We used principal components analysis to repackage the variance in the predictors into eight orthogonal components. We used those components as predictors in a discriminant function analysis, and then transformed the results back into the metric of the original predictor variables.
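The "repackaging" can be shown by hand in the two-predictor case: for two standardized predictors with correlation r, the correlation matrix [[1, r], [r, 1]] has eigenvalues 1 + r and 1 - r, and the corresponding components are uncorrelated. A sketch with a made-up r (the actual study, of course, used eight predictors):

```python
# For two standardized predictors with correlation r, PCA repackages
# the total variance (2.0) into two orthogonal components whose
# variances are the eigenvalues 1 + r and 1 - r of [[1, r], [r, 1]].
r = 0.9  # made-up correlation between two collinear predictors

component_variances = [1 + r, 1 - r]
print(round(component_variances[0], 3))       # most variance in component 1
print(round(component_variances[1], 3))       # little left for component 2
print(round(sum(component_variances), 3))     # total variance preserved: 2.0
```

The components here are (z1 + z2)/sqrt(2) and (z1 - z2)/sqrt(2): nothing is thrown away, the variance is just redistributed into orthogonal pieces, which is why the components can safely replace the collinear predictors.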

**The First Discriminant Function**

Separated those selecting NGRI from those selecting Guilty. Those selecting NGRI:

- Believed the defendant mentally ill
- Believed the defense expert testimony more than the prosecution expert testimony
- Were receptive to the insanity defense
- Opposed the death penalty
- Thought the defendant could be rehabilitated
- Favored lenient treatment over strict crime control

**The Second Discriminant Function**

Separated those selecting GBMI from those selecting NGRI or Guilty. Those selecting GBMI:

- Distrust attorneys (especially the prosecution)
- Think rehabilitation likely
- Oppose lenient treatment
- Are not receptive to the insanity defense
- Do not oppose the death penalty

**MANOVA**

This is just a DFA in reverse: you predict a set of continuous variables from one or more grouping variables. It is often used in an attempt to control familywise error when there are multiple outcome variables. This approach is questionable, but popular.

**MANOVA First, ANOVA Second**

Suppose you have an A x B factorial design with five dependent variables. You worry that the Type I boogeyman will get you if you just do five A x B ANOVAs, so you do an A x B factorial MANOVA first. For any effect (A, B, or A x B) that is significant in the MANOVA, you then do the five ANOVAs.
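The gating logic can be sketched without any statistics library; the p-values below are made up stand-ins for real MANOVA and ANOVA output:

```python
# Sketch of the "MANOVA first, ANOVA second" gate.
# All p-values here are invented for illustration.

ALPHA = 0.05

# Hypothetical MANOVA p-values for each effect in an A x B design.
manova_p = {"A": 0.003, "B": 0.21, "AxB": 0.04}

# Hypothetical univariate ANOVA p-values: effect -> one p per DV (5 DVs).
anova_p = {
    "A":   [0.01, 0.30, 0.002, 0.04, 0.60],
    "B":   [0.02, 0.04, 0.01, 0.03, 0.05],
    "AxB": [0.11, 0.03, 0.45, 0.20, 0.01],
}

# Follow up with the five ANOVAs only where the MANOVA effect is significant.
followups = {
    effect: [p for p in anova_p[effect] if p < ALPHA]
    for effect in manova_p
    if manova_p[effect] < ALPHA
}
print(followups)  # "B" is gated out: its MANOVA test was not significant
```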

**The Beautiful Criminal**

Wuensch, Chia, Castellow, Chuang, & Cheng (1993). Data were collected in Taiwan. Grouping variables:

- Defendant physically attractive or not
- Sex of defendant
- Type of crime: swindle or burglary
- Defendant American or Chinese
- Sex of juror

**Dependent Variables**

One set of two variables:

- Length of recommended sentence
- Rated seriousness of the crime

A second set of 12 variables, ratings of the defendant on attributes such as:

- Physical attractiveness
- Intelligence
- Sociability

**Type I Boogeyman**

If we did a five-way ANOVA on one DV, we would do 27 F tests, and that is just for the omnibus analysis. If we did that for each of the 14 DVs, that would be 378 F tests, and the Boogeyman would be licking his chops.
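The threat can be quantified. Under the simplifying (and here made-up) assumption that all 378 tests are independent and every null hypothesis is true, the chance of at least one Type I error at alpha = .05 is 1 - (1 - alpha)^378:

```python
# Familywise Type I error rate for m independent tests at level alpha.
# Real tests on the same data are correlated, so this is an idealization,
# but it shows the scale of the problem.
alpha = 0.05
m = 378

fwer = 1 - (1 - alpha) ** m
print(round(fwer, 6))  # essentially 1: spurious "significant" results guaranteed
```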

**Results, Sentencing**

- Female jurors gave longer sentences, but only with American defendants
- Attractiveness lowered the sentence for American burglars but increased the sentence for American swindlers
- Female jurors gave shorter sentences to female defendants

**Results, Ratings**

The following were rated more favorably:

- Physically attractive defendants
- American defendants
- Swindlers

**Canonical Variates**

For each effect (actually each treatment df) there is a different set of weights applied to the outcome variables. The weights are those that make the effect as large as possible. The resulting linear combination is called a canonical variate. Again, there is one canonical variate per treatment degree of freedom.

**Labeling the Canonical Variates**

- Look at the loadings
- Look at the standardized weights (standardized discriminant function coefficients)

**Sexual Harassment Trial: Manipulation Check**

Moore, Wuensch, Hedges, and Castellow (1994).

- Physical attractiveness (PA) of the defendant, manipulated
- Social desirability (SD) of the defendant, manipulated
- Sex/gender of the mock juror
- Ratings of the litigants on 19 attributes
- Experiment 2: manipulated the PA and SD of the plaintiff

**Experiment 1: Ratings of Defendant**

The Social Desirability and Physical Attractiveness manipulations were significant. The Social Desirability canonical variate loaded most heavily on sociability, intelligence, warmth, sensitivity, and kindness. The Physical Attractiveness canonical variate loaded well on only the physical attractiveness ratings.

**Experiment 2: Ratings of Plaintiff**

The Social Desirability and Physical Attractiveness manipulations were significant. The Social Desirability canonical variate loaded most heavily on intelligence, poise, sensitivity, kindness, genuineness, warmth, and sociability. The Physical Attractiveness canonical variate loaded well on only the physical attractiveness ratings.
