

1
Geometric Representation of Regression

2
‘Multipurpose’ dataset from the class website: 12 cases (employees a–l) with three variables
ATTRATE — attitude toward job (higher scores indicate a more unfavorable attitude toward the company)
YEARS — number of years worked
DAYSABS — days absent

3
Typical representation with a response surface
Correlations among DAYSABS, ATTRATE, and YEARS: .89 and up*
Model R² = .903
Coefficients (Estimate, Std. Error, t value, Pr(>|t|)) for the intercept, ATTRATE, and YEARS

4
Typical representation with a response surface
Where the response surface crosses the y axis (DAYSABS) gives the intercept in our formula
Holding a variable ‘constant’ is like adding a plane perpendicular to that variable’s axis
The process as a whole minimizes the sum of the squared distances between the original data points and their projections onto the plane
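The least-squares idea in the last point can be sketched numerically. The data below are simulated for illustration only (they are not the class dataset); the variable names ATTRATE, YEARS, and DAYSABS just follow the slides:

```python
import numpy as np

# Simulated stand-in data: days absent predicted from attitude rating
# and years worked, 12 cases as in the slides.
rng = np.random.default_rng(0)
attrate = rng.uniform(1, 10, size=12)
years = rng.uniform(1, 20, size=12)
daysabs = 2.0 + 1.5 * attrate - 0.4 * years + rng.normal(0, 1, size=12)

# Design matrix with an intercept column. lstsq finds the coefficient
# vector b minimizing ||daysabs - X @ b||^2, i.e. the sum of squared
# distances between the observed points and the fitted response surface.
X = np.column_stack([np.ones(12), attrate, years])
b, *_ = np.linalg.lstsq(X, daysabs, rcond=None)
intercept, b_attrate, b_years = b
print(intercept, b_attrate, b_years)
```

A useful check on the fit: the residuals are orthogonal to every column of the design matrix, which is exactly the geometric condition the next slides develop.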

5
Alternative
Given a variable, we can instead view it as a vector projecting from the origin into some n-dimensional space
Here the space has one dimension per individual (12 dimensions for these data), and the vector representing a predictor’s values occupies only a single dimension within that space

6
Assume now two standardized variables of equal N
We have two vectors (each with N components), X1 and X2, emanating from the origin*
The cosine of the angle between them is the simple correlation of the two variables
If they were perfectly correlated they would occupy the same dimension (i.e., lie right on top of one another)
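The cosine-equals-correlation claim can be verified directly with made-up data (any two variables will do once they are standardized):

```python
import numpy as np

# Two variables observed on N = 12 individuals, viewed as vectors in R^12.
rng = np.random.default_rng(1)
x1 = rng.normal(size=12)
x2 = 0.6 * x1 + rng.normal(size=12)

def standardize(v):
    # Center to mean 0 and scale to sd 1
    return (v - v.mean()) / v.std()

z1, z2 = standardize(x1), standardize(x2)

# Cosine of the angle between the standardized vectors...
cos_angle = z1 @ z2 / (np.linalg.norm(z1) * np.linalg.norm(z2))
# ...equals the Pearson correlation of the original variables.
r = np.corrcoef(x1, x2)[0, 1]
print(cos_angle, r)
```

The centering step matters: the cosine matches the correlation only after the variables have been centered, which is why the slide specifies standardized variables.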

7
Adding a third variable, Y, we can again understand the simple correlations as the cosines of the respective angles the vectors create
Given the plane created by X1 and X2, might we find a way to project Y onto it?

8
That is in fact what multiple regression does, and this projection is the linear combination* yielding our predicted values, Y-hat
The cosine of the angle between Y and Y-hat is the multiple R, which when squared gives the proportion of variance in Y accounted for by the model containing X1 and X2
Regression minimizes that angle (equivalently, maximizes its cosine)
Partial correlations may be represented too, by creating a plane perpendicular** to one variable and projecting the others onto that plane; the cosine of the angle the projections create is their partial correlation
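The projection view can be checked numerically: fit a regression, then confirm that the cosine of the angle between (centered) Y and Y-hat squares to the model R². The data here are simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 12
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.8 * x1 - 0.5 * x2 + rng.normal(scale=0.5, size=n)

# Y-hat is the least-squares projection of y onto the span of
# the intercept, x1, and x2.
X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b

# Multiple R = cosine of the angle between centered y and centered y-hat.
yc, yhc = y - y.mean(), yhat - yhat.mean()
R = yc @ yhc / (np.linalg.norm(yc) * np.linalg.norm(yhc))

# R^2 computed the usual way: 1 - SSE/SST.
R2 = 1 - np.sum((y - yhat) ** 2) / np.sum(yc ** 2)
print(R, R2)
```

Because the projection makes the residual orthogonal to the model space, the two quantities agree exactly: squaring the cosine reproduces the variance-accounted-for.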

9
One dichotomous predictor

10
2 dichotomous predictors (2x2 ANOVA)
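The 2x2 ANOVA case can be sketched as a regression on two 0/1 dummy predictors plus their product; with the interaction included, the fitted values reproduce the four cell means. The data below are invented for illustration:

```python
import numpy as np

# Factor A and factor B coded as dichotomous (0/1) predictors,
# three observations per cell of the 2x2 design.
a = np.array([0, 0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1])
bfac = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1])
y = np.array([3.0, 4, 5, 6, 7, 8, 2, 3, 4, 9, 10, 11])

# Intercept, two main-effect dummies, and the interaction term.
X = np.column_stack([np.ones(12), a, bfac, a * bfac])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# The saturated model's prediction for each cell equals that cell's mean.
for av in (0, 1):
    for bv in (0, 1):
        cell = (a == av) & (bfac == bv)
        pred = coef @ np.array([1.0, av, bv, av * bv])
        print(av, bv, pred, y[cell].mean())
```

This is the usual sense in which ANOVA is a special case of the regression model shown on the earlier slides.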

11
Dichotomous outcome
