Presentation on theme: "Correlation."— Presentation transcript:

1 Correlation

2 Correlation Coefficient Array

3 Correlation; Multiple regression; Polynomial regression; Multivariate transformations

4 Multiple Regression

5 Regression: Until now we have been concerned with the relationship between two variables. Where more complex relationships are involved, it is often better to consider multiple regression or, more correctly, multiple linear regression.

6 Theory: Consider the simplest form of multiple regression, where y is the dependent variable and x1 and x2 are independent variables:
y = b0 + b1 x1 + b2 x2 + e
where e is a random error term.

7 Theory: for the first observation, y1 = b0 + b1 x11 + b2 x21 + e1

8 Theory: writing the model for every observation,
y1 = b0 + b1 x11 + b2 x21 + e1
y2 = b0 + b1 x12 + b2 x22 + e2
 :
yi = b0 + b1 x1i + b2 x2i + ei
 :
yn = b0 + b1 x1n + b2 x2n + en
Summing over all n observations: Σyi = n b0 + b1 Σx1i + b2 Σx2i

9 Theory yi = nb0 + b1x1i + b2x2i Least Square “Best” Fit
Minimize the sum of squares of error F = ei2 = [yi - b0 - b1x1i - b2x2i]2

10 F = ei2 = [yi - b0 - b1x1i - b2x2i]2
Theory F = ei2 = [yi - b0 - b1x1i - b2x2i]2 dF/db1 = x1y - b2x1x2 - b1x12 and dF/db2 = x2y - b1x1x2 - b2x22

11 b1 = [(x22)(x1y)-(x1x2)(x2y)]
Theory dF/db1 = x1y - b2x1x2 - b1x12 dF/db2 = x2y - b1x1x2 - b2x22 b1 = [(x22)(x1y)-(x1x2)(x2y)] [(x12)x22)-x1x2)2]

12 Theory:
b1 = [(Σx2²)(Σx1y) - (Σx1x2)(Σx2y)] / [(Σx1²)(Σx2²) - (Σx1x2)²]
b2 = [(Σx1²)(Σx2y) - (Σx1x2)(Σx1y)] / [(Σx1²)(Σx2²) - (Σx1x2)²]

13 Theory: b0 = mean(y) - b1 mean(x1) - b2 mean(x2)
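A minimal sketch of these estimators in code, assuming Python with NumPy and that the Σ quantities above are corrected sums (sums of deviations from the means); the function name is only illustrative:

```python
import numpy as np

def fit_two_variable_regression(x1, x2, y):
    """Estimate b0, b1, b2 in y = b0 + b1*x1 + b2*x2 + e
    from the corrected sums of squares and cross-products."""
    x1, x2, y = (np.asarray(v, dtype=float) for v in (x1, x2, y))
    # Corrected (deviation) sums of squares and cross-products
    Sx1x1 = np.sum((x1 - x1.mean()) ** 2)
    Sx2x2 = np.sum((x2 - x2.mean()) ** 2)
    Sx1x2 = np.sum((x1 - x1.mean()) * (x2 - x2.mean()))
    Sx1y = np.sum((x1 - x1.mean()) * (y - y.mean()))
    Sx2y = np.sum((x2 - x2.mean()) * (y - y.mean()))
    denom = Sx1x1 * Sx2x2 - Sx1x2 ** 2
    b1 = (Sx2x2 * Sx1y - Sx1x2 * Sx2y) / denom
    b2 = (Sx1x1 * Sx2y - Sx1x2 * Sx1y) / denom
    b0 = y.mean() - b1 * x1.mean() - b2 * x2.mean()
    return b0, b1, b2
```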

14 Analysis of Variance
Total SS = Σy² - (Σy)²/n
Reg SS = b1 Σx1y + b2 Σx2y   (more generally, Reg SS = Σ[bi Σxiy])
Residual SS = Total SS - Reg SS
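A companion sketch (same assumptions as above) for partitioning the sums of squares; Total SS follows the slide's Σy² - (Σy)²/n, and the cross-products with y are corrected sums:

```python
import numpy as np

def regression_anova(x1, x2, y, b1, b2):
    """Partition Total SS into Regression SS and Residual SS."""
    x1, x2, y = (np.asarray(v, dtype=float) for v in (x1, x2, y))
    n = len(y)
    total_ss = np.sum(y ** 2) - np.sum(y) ** 2 / n
    # Corrected cross-products of each x with y
    Sx1y = np.sum((x1 - x1.mean()) * (y - y.mean()))
    Sx2y = np.sum((x2 - x2.mean()) * (y - y.mean()))
    reg_ss = b1 * Sx1y + b2 * Sx2y
    resid_ss = total_ss - reg_ss
    return total_ss, reg_ss, resid_ss
```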

15 Analysis of Variance Table

16 Example

17 Example x12=1,753.7; x22=23.2; x1y=-65.2
 x2y=7,210.0; x1x2=-156.7; y2=3,211,504

18 b1 = [(x22)(x1y)-(x1x2)(x2y)]
Example x12=1,753.7; x22=23.2; x1y=-65.2  x2y=7,210.0; x1x2=-156.7; y2=3,211,504 b1 = [(x22)(x1y)-(x1x2)(x2y)] [(x12)(x22)-(x1x2)2]

19 b1 = [(x22)(x1y)-(x1x2)(x2y)]
Example x12=1,753.7; x22=23.2; x1y=-65.2  x2y=7,210.0; x1x2=-156.7; y2=3,211,504 b1 = [(x22)(x1y)-(x1x2)(x2y)] [(x12)x22)-x1x2)2] b1 = [(23.2)(-65.2)-(-156.7)(7210)] [(1753.7)(23.2)-(-156.6)2] =

20 b2 = [(x12)(x2y)-(x1x2)(x1y)]
Example x12=1,753.7; x22=23.2; x1y=-65.2  x2y=7,210.0; x1x2=-156.7; y2=3,211,504 b2 = [(x12)(x2y)-(x1x2)(x1y)] [(x12)(x22)-(x1x2)2] b2 = [(1753.7)(-7210)-(-156.7)(-65,194)] [(1753.7)(23.2)-(-156.6)2] =

21 Example: b0 = 6561 - (-23.75)(96.2) - (150.27)(16.7) = 6336
y = 6336 - 23.75 x1 + 150.27 x2
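As a quick arithmetic check, the quoted sums can be plugged straight into the formulas above (plain Python; because the sums on the slides are rounded, the results only approximate the quoted -23.75, 150.27 and 6336):

```python
# Corrected sums quoted in the example (rounded on the slides)
Sx1x1, Sx2x2, Sx1x2 = 1753.7, 23.2, -156.7
Sx1y, Sx2y = -65194.0, 7210.0
mean_y, mean_x1, mean_x2 = 6561.0, 96.2, 16.7  # means used on slide 21

denom = Sx1x1 * Sx2x2 - Sx1x2 ** 2
b1 = (Sx2x2 * Sx1y - Sx1x2 * Sx2y) / denom   # about -23.7
b2 = (Sx1x1 * Sx2y - Sx1x2 * Sx1y) / denom   # about 150.5
b0 = mean_y - b1 * mean_x1 - b2 * mean_x2    # about 6330
print(b0, b1, b2)
```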

22 Analysis of Variance Table

23 Analysis of Variance Table

24 Analysis of Variance Table

25 Analysis of Variance Table

26 Analysis of Variance Table

27 Analysis of Variance Table

28 Matrix Formulation of Multiple Regression
y = b0 + b1 x1 + b2 x2 + … + bn xn + e
In matrix form: Y = X × b + E

29 Introduction to Matrices
Simultaneous equations:
6 b1 + 3 b2 = 24
4 b1 + 4 b2 = 20
Matrix form: [6 3; 4 4] × [b1; b2] = [24; 20]
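To make the matrix form concrete, a small sketch (assuming NumPy) that solves this pair of simultaneous equations:

```python
import numpy as np

A = np.array([[6.0, 3.0],
              [4.0, 4.0]])       # coefficient matrix
rhs = np.array([24.0, 20.0])     # right-hand side
b = np.linalg.solve(A, rhs)
print(b)                         # [3. 2.], i.e. b1 = 3, b2 = 2
```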

30 Matrix Formulation: y = b0 + b1 x1 + b2 x2 + … + bn xn + e, or in matrix form Y = X × b + e

31 Matrix Formulation: Y = X × b

32 F = ee’ = YY’ - 2YX’b + bb’ XX’
Matrix Formation F = ee’ = YY’ - 2YX’b + bb’ XX’ dF/db = 2XX’ b - 2YX’ = 0 XX’b = YX’

33 Two Variable Example: the matrix of observations [x11 x21 x31 x41 x51; x12 x22 x32 x42 x52] multiplied by its transpose accumulates the sums of squares and cross-products (Σx1², Σx1x2, Σx2²).

34 Matrix Formulation: XX' = [Σx1² Σx1x2; Σx2x1 Σx2²]

35 Two Variable Example: multiplying [x11 x12 …; x21 x22 …] by [y1; y2; …] accumulates Σx1y and Σx2y.

36 Matrix Formulation: [Σx1y; Σx2y] = YX'

37 Two Variable Example
[Σx1² Σx1x2; Σx2x1 Σx2²] × [b1; b2] = [Σx1y; Σx2y], i.e. XX' × b = YX'
(XX')⁻¹ XX' × b = (XX')⁻¹ YX'
b = (XX')⁻¹ YX'

38 Matrix Formulation: b = (XX')⁻¹ YX'
Find the inverse of XX', denoted by (XX')⁻¹; then b = (XX')⁻¹ YX'
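A sketch of this matrix route in code (assuming NumPy; the design matrix here has one row per observation, so the slide's XX' and YX' correspond to X.T @ X and X.T @ y):

```python
import numpy as np

def fit_by_matrix(x1, x2, y):
    """Solve the normal equations for y = b0 + b1*x1 + b2*x2."""
    x1, x2, y = (np.asarray(v, dtype=float) for v in (x1, x2, y))
    X = np.column_stack([np.ones_like(x1), x1, x2])  # column of 1s carries b0
    sscp = X.T @ X                  # sums of squares and cross-products
    xty = X.T @ y                   # cross-products with y
    b = np.linalg.solve(sscp, xty)  # same result as inv(X'X) @ X'y
    return b                        # [b0, b1, b2]
```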

39 Matrix Inverse with Two Variables
A × A⁻¹ = [U] (the unit matrix)

40 Matrix Inverse with Two Variables
A × A⁻¹ = [U]
[a b; c d] × 1/(ad - bc) [d -b; -c a] = [1 0; 0 1]
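A small plain-Python sketch of this 2 × 2 inverse formula:

```python
def inverse_2x2(a, b, c, d):
    """Invert [[a, b], [c, d]] via (1 / (a*d - b*c)) * [[d, -b], [-c, a]]."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular, no inverse exists")
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

# Example: the coefficient matrix from the simultaneous-equations slide
inv = inverse_2x2(6, 3, 4, 4)   # det = 6*4 - 3*4 = 12
# inv == [[0.333..., -0.25], [-0.333..., 0.5]]
```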

41 Matrix Inverse with Two Variables
[Σx1² Σx1x2; Σx2x1 Σx2²] × [b1; b2] = [Σx1y; Σx2y]

42 Matrix Inverse with Two Variables
[Σx1² Σx1x2; Σx2x1 Σx2²] × [b1; b2] = [Σx1y; Σx2y], i.e. XX' × b = X'Y

43 Matrix Inverse with Two Variables
[Σx1² Σx1x2; Σx2x1 Σx2²] × 1/(ad - bc) [Σx2² -Σx1x2; -Σx2x1 Σx1²] = [U]
XX' × (XX')⁻¹ = unit matrix

44 Matrix Inverse with Two Variables
1/(ad - bc) [Σx2² -Σx1x2; -Σx2x1 Σx1²] × [Σx1y; Σx2y] = [b1; b2]
(XX')⁻¹ × X'Y = b, where ad - bc = (Σx1²)(Σx2²) - (Σx2x1)²

45 Compare the Matrix Solution with the Non-Matrix Solution
b1 = [(Σx2²)(Σx1y) - (Σx1x2)(Σx2y)] / [(Σx1²)(Σx2²) - (Σx1x2)²]
b2 = [(Σx1²)(Σx2y) - (Σx1x2)(Σx1y)] / [(Σx1²)(Σx2²) - (Σx1x2)²]
Both routes give the same estimates.

46 Forward Step-Wise Regression

47 Two Variable Multiple Regression

48 Analysis of Variance Table
y = 6336 - 23.75 x1 + 150.27 x2

49 Two Variable Multiple Regression
There are significant regression effects from regressing both independent variables onto the dependent variable. There is a significant linear relationship between height (x1) and yield but no relationship between yield and tiller. There is a significant linear relationship between tiller (x2) and yield and no relationship between yield and height.

50 Two Variable Multiple Regression
We may have made the relationship too complex by including both variables. Forward Step-wise Regression. Backward Step-wise Regression.

51 Two Variable Multiple Regression

52 Analysis of Variance Table
y = 10,… + … Height (x1)

53 Analysis of Variance Table
y = … + … Height + … Tiller

54 Analysis of Variance Table
y = … + … Height + … Tiller

55 Analysis of Variance Table
y = … + … Height + … Tiller

56 Analysis of Variance Table
y = 10,… + … Height (x1)

57 Forward Step-Wise Regression Example 2
20 spring canola cultivars, averaged over 10 environments. Traits measured: seed yield, plant establishment, days to first flowering, days to end of flowering, plant height, and oil content.

58 Example #2: correlations among the six characters
Character    Est.   F.St.  F.Fi.   Ht.   %Oil  Yield
Establish    1.00
F.Start     -0.30   1.00
F.Finish    -0.15   0.93   1.00
Height      -0.45   0.72   0.70   1.00
%Oil         0.04  -0.51  -0.52  -0.27   1.00
Yield        0.31  -0.82  -0.80  -0.53  -0.21   1.00

59 Example #2: the same correlations, upper-triangle layout
Character    Est.   F.St.  F.Fi.   Ht.   %Oil  Yield
Establish    1.00  -0.30  -0.15  -0.45   0.04   0.31
F.Start             1.00   0.93   0.72  -0.51  -0.82
F.Finish                   1.00   0.70  -0.52  -0.80
Height                            1.00  -0.27  -0.53
%Oil                                     1.00  -0.21
Yield                                           1.00

60 Analysis of Variance Table
y = 3,… + … F.Start

61 Example #2: F.Start (the variable most correlated with yield) is entered first; each remaining element of the table is then adjusted as A[i,j] = A[i,j] - (A[i,x] × A[x,j]) / A[x,x], where x is the entered variable.
Character    Est.   F.St.  F.Fi.   Ht.   %Oil  Yield
Establish    1.00  -0.30  -0.15  -0.45   0.04   0.31
F.Start             1.00   0.93   0.72  -0.51  -0.82
F.Finish                   1.00   0.70  -0.52  -0.80
Height                            1.00  -0.27  -0.53
%Oil                                     1.00  -0.21
Yield                                           1.00

62–65 Example #2: the adjustment A[i,j] = A[i,j] - (A[i,x] × A[x,j]) / A[x,x] is applied, element by element, to every remaining row and column of the table.

66 Example #2: adjusted values after entering F.Start, A[i,j] = A[i,j] - (A[i,x] × A[x,j]) / A[x,x]
Character    Est.   F.St.  F.Fi.   Ht.   %Oil  Yield
Establish    0.91   0.38  -0.23   0.11   0.06
F.Start
F.Finish     0.36   0.03  -0.05   0.04
Height       0.48   0.10   0.74   0.21  -0.63   0.33
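A sketch (assuming NumPy) of this adjustment step, sweeping the entered variable x out of a correlation or sums-of-squares matrix; the function name is only illustrative:

```python
import numpy as np

def sweep_out(A, x):
    """Adjust every element for the entered variable x:
    A[i, j] <- A[i, j] - A[i, x] * A[x, j] / A[x, x].
    The entered variable's own row and column become zero
    and are dropped from later steps."""
    A = np.asarray(A, dtype=float)
    return A - np.outer(A[:, x], A[x, :]) / A[x, x]

# With the correlation table above and x = 1 (F.Start), the Establish
# diagonal becomes 1.00 - (-0.30)**2 / 1.00 = 0.91, matching slide 66.
```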

67 Analysis of Variance Table
y = … + … F.Start + … %Oil

68 Example #2: adjusted values after entering F.Start and %Oil, A[i,j] = A[i,j] - (A[i,x] × A[x,j]) / A[x,x]
Character    Est.   F.St.  F.Fi.   Ht.   %Oil  Yield
Establish    0.90   0.39  -0.23   0.04
F.Start
F.Finish     0.36
Height       0.48   0.11   0.05   0.29

69 Analysis of Variance Table
y = … + … F.Start + … %Oil + … Height

70 Analysis of Variance Table
y = … + … F.Start + … %Oil

71 Forward Step-Wise Regression
Enter the variable most associated with the dependent variable. Check to see whether the relationship is significant. Adjust the relationships between the dependent variable and the remaining variables, accounting for the relationship between the dependent variable and the entered variable(s).

72 Forward Step-Wise Regression

73 Forward Step-Wise Regression
Enter most correlated variable

74 Forward Step-Wise Regression
Enter most correlated variable Check that entry is significant

75 Forward Step-Wise Regression
Enter most correlated variable Check that entry is significant Adjust correlation with other variables

76 Forward Step-Wise Regression
Enter most correlated variable Check that entry is significant Adjust correlation with other variables
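Putting the three steps on these slides together, a minimal sketch of forward step-wise selection (assuming NumPy; the entry test is simplified to a correlation threshold rather than the formal significance test the slides describe, and all names are illustrative):

```python
import numpy as np

def residualize(v, Z):
    """Residuals of v after regressing it on the columns of Z (plus an intercept)."""
    Z1 = np.column_stack([np.ones(len(v)), Z])
    coef, *_ = np.linalg.lstsq(Z1, v, rcond=None)
    return v - Z1 @ coef

def forward_stepwise(X, y, threshold=0.3):
    """X: (n_obs, n_vars) candidate predictors; y: dependent variable.
    At each step, enter the variable most correlated with y after
    adjusting for those already entered; stop when no remaining
    variable exceeds the threshold (a stand-in for a significance test)."""
    n_obs, n_vars = X.shape
    remaining, entered = list(range(n_vars)), []
    while remaining:
        y_adj = residualize(y, X[:, entered]) if entered else y
        best, best_r = None, 0.0
        for j in remaining:
            xj = residualize(X[:, j], X[:, entered]) if entered else X[:, j]
            r = np.corrcoef(xj, y_adj)[0, 1]
            if abs(r) > abs(best_r):
                best, best_r = j, r
        if best is None or abs(best_r) < threshold:
            break                     # entry would not be "significant"
        entered.append(best)
        remaining.remove(best)
    return entered
```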

77 Polynomial Regression

78 Polynomial Regression

79 Polynomial Regression

80 Polynomial Regression

81 Analysis of Variance Table
y = … + … N + … N²

82 Polynomial Regression
y = … + … N + … N²; the derivative dy/dN gives the slope of the fitted response at any N.

83 Polynomial Regression
y = … + … N + … N²
dy/dN = … - 0.436 N; setting the slope to zero gives the optimum, N = 36.08.
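A sketch of the polynomial fit and the optimum-N calculation (assuming NumPy; the nitrogen rates and yields below are made-up illustration data, not the slide's):

```python
import numpy as np

# Hypothetical nitrogen rates and yields, for illustration only
n_rates = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
yields = np.array([2.1, 3.4, 3.9, 3.8, 3.2])

# Fit y = b0 + b1*N + b2*N^2 (np.polyfit returns the highest power first)
b2, b1, b0 = np.polyfit(n_rates, yields, 2)

# dy/dN = b1 + 2*b2*N; the slope is zero at the optimum (a maximum when b2 < 0)
n_opt = -b1 / (2.0 * b2)
print(f"optimum N = {n_opt:.2f}")
```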

84 Multivariate Transformation

