POSC 202A Lecture: Substantive Significance, Relationship between Variables


1 POSC 202A Lecture: Substantive Significance, Relationship between Variables

2 Substantive Significance

3 Statistical Significance vs. Substantive Importance
Statistical significance speaks to how confident or certain we are that the results are not the product of chance. Substantive importance speaks to how large the results are and the extent to which they matter.

4 Statistical Significance vs. Substantive Importance
Recall that it is easier to attain statistical significance as the sample we take gets larger. The denominator in significance tests is the standard error, which depends on the SD but shrinks as the sample size grows. Dividing by a smaller denominator leads to a larger test statistic (Z score).
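To see this concretely, the Z score is the difference between the sample and population means divided by the standard error, SD/√n. A minimal sketch (the means and SD below are hypothetical), showing the same raw difference becoming "significant" purely because n grew:

```python
import math

def z_score(sample_mean, pop_mean, sd, n):
    # The denominator is the standard error, SD / sqrt(n), which
    # shrinks as the sample grows, inflating Z for the same difference.
    se = sd / math.sqrt(n)
    return (sample_mean - pop_mean) / se

# The same 2-point difference with SD = 10; only the sample size changes.
print(z_score(52, 50, 10, 25))    # 1.0  -- not significant
print(z_score(52, 50, 10, 2500))  # 10.0 -- overwhelmingly significant
```

The effect (2 points) never changed; only our certainty about it did.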

5 Statistical Significance vs. Substantive Importance
But none of this speaks to whether the size of the effect we observe is large or small, or whether it matters for some social process. All we are saying with a statistically significant result is that the results are not likely the product of chance. Statistical significance is about confidence, NOT importance.

6 Statistical Significance vs. Substantive Importance
In the context of regression, the question of substantive significance is always: is the slope big? What counts as big is always to some degree subjective. But the question is only partly subjective; it is also partly objective. The challenge of assessing substantive significance is to make a correct assessment with respect to the objective part and a reasonable assessment with respect to the subjective part.

7 Relationships among Variables
We began this course by talking about how to describe data, where we were examining one variable: measures of central tendency and measures of dispersion. Now we can apply these same concepts to relationships among and between variables.

8 Relationships among Variables
Association: variables are associated if larger (or smaller) values of one variable occur more frequently with larger (or smaller) values of other variables.

9 Relationships among Variables
How do we describe relationships? Tables, graphs, and summary statistics.

10 Relationships among Variables
Tables describe simple relationships. Usually if you don't see a relationship in a simple table you won't find it using more complex methods. Look to the diagonals.

Null Hypothesis (expected):

              Education
Income        Low        High
Low           25% (25)   25% (25)
High          25% (25)   25% (25)

Alternative Hypothesis (observed):

              Education
Income        Low        High
Low           50% (50)   0% (0)
High          0% (0)     50% (50)

11 Relationships among Variables
Tables: relationships are often summarized using the chi-squared statistic:

χ² = Σ (observed − expected)² / expected

where the observed and expected counts are calculated for each cell and the result is summed across all cells. We treat this statistic as we would a Z score, but use the chi-squared distribution to determine significance (page T-20).

12 Relationships among Variables
So for the table of income and education we would get the following result: To find the area in the tail, compute the degrees of freedom as (rows − 1) × (columns − 1), i.e. (r − 1)(c − 1), and use that line in Table F on page T-20.
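The calculation can be sketched in code. The counts below assume the hypothetical education/income table from the earlier slide (diagonal cells of 50 observed, 25 expected in every cell under the null):

```python
# Observed and expected counts for the four cells of the 2x2 table.
observed = [50, 0, 0, 50]
expected = [25, 25, 25, 25]

# Chi-squared: sum of (observed - expected)^2 / expected over all cells.
chi_sq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Degrees of freedom: (rows - 1) * (columns - 1).
df = (2 - 1) * (2 - 1)

print(chi_sq, df)  # 100.0 1 -- far beyond the critical values in Table F
```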

13 Relationships among Variables
Potential problems: the unit of analysis can conceal important factors. Simpson's Paradox: the direction of a relationship can change when the analysis moves from the individual to the group level.

14 Relationships among Variables
University Admissions Decisions

         Male    Female
Admit    3500    2000
Deny     4500    4000
Total    8000    6000

Acceptance Rates
Female: 2000/6000 = 33%
Male: 3500/8000 = 44%

Is there bias?

15 Relationships among Variables
University Admissions Decisions by College

Sciences
         Male    Female
Admit    3000    1000
Deny     3000    1000
Total    6000    2000
Acceptance rates: Male 50%, Female 50%

Humanities
         Male    Female
Admit    500     1000
Deny     1500    3000
Total    2000    4000
Acceptance rates: Male 25%, Female 25%

Is there bias? What is going on?
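The paradox is easy to verify by arithmetic. A sketch using the admissions counts above:

```python
# (admitted, applied) by college and sex, from the tables above.
sciences   = {"male": (3000, 6000), "female": (1000, 2000)}
humanities = {"male": (500, 2000),  "female": (1000, 4000)}

def rate(admitted, applied):
    return admitted / applied

# Within each college the acceptance rates are identical by sex.
assert rate(*sciences["male"]) == rate(*sciences["female"]) == 0.50
assert rate(*humanities["male"]) == rate(*humanities["female"]) == 0.25

# Pooled across colleges, a gap appears: most men applied to the
# easier-to-enter college, most women to the harder one.
male_rate = rate(3000 + 500, 6000 + 2000)     # 0.4375, about 44%
female_rate = rate(1000 + 1000, 2000 + 4000)  # about 33%
print(round(male_rate, 2), round(female_rate, 2))
```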

16 Relationships among Variables
Lurking Variables: a variable not included in the analysis but that affects the result. In this case it was that men and women preferred different majors, and men preferred the major that was easier to get into.

17 Relationships among Variables
Lurking Variable: A variable not included in the analysis but that affects the result. [Diagram: X → Y]

18 Relationships among Variables
Lurking Variable: A variable not included in the analysis but that affects the result. [Diagram: X → Y, with lurking variable Z influencing both]

19 Relationships among Variables
We can examine the strength of relationships graphically. Scatter plots show the relationship between two variables when the data are measured on the ordinal, interval, or ratio scales.

20 Relationships among Variables
Scatter plot: [example]. Graphs from fivethirtyeight.com.

21 Relationships among Variables
They can also be used to compare across events. [Scatter plots for 2008 and 2010]

22 Relationships among Variables
We can examine the strength of relationships statistically: measures of association.

23 Relationships among Variables
We can examine the strength of relationships statistically. Measures of association: we mentioned the chi-squared statistic. The workhorse is the correlation coefficient.

24 Correlation Coefficient

25 Relationships among Variables
Correlation Coefficient: tells us the linear association between two variables. Ranges from -1.0 (perfect negative association) to +1.0 (perfect positive association). Abbreviated as 'r'. Answers the question: how far, on average, are the points from the line?
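A minimal sketch of computing r directly from its definition (the data points are hypothetical):

```python
import math

def pearson_r(xs, ys):
    # r = sum of (x - mean_x)(y - mean_y), divided by the product of the
    # square roots of the summed squared deviations of x and y.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # perfectly positive: r = 1
print(pearson_r([1, 2, 3, 4], [8, 6, 4, 2]))  # perfectly negative: r = -1
```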

26 Relationships among Variables

27 Relationships among Variables
As a general rule:
r = .7 is a very strong relationship
r = .5 is a strong relationship
r = .3 is a weak relationship
r = 0 is no relationship
But it varies depending on how noisy the data are.

28 Relationships among Variables
Weaknesses: it does not tell us the magnitude. Example: the correlation between education and income = .8. Should you get an MBA? How can we account for intervening variables?

29 Regression
Regression tells us not only the direction but the precise strength of the relationship: how much increasing one variable changes another variable.

30 Regression To clarify this concept we need to be more precise in how we define our variables. Dependent Variable (Y): the thing being explained. Usually the phenomenon we are interested in studying. Think of this as "the effect."

31 Regression Independent Variable (X)-
The thing that affects what we seek to explain. In overly simplistic terms, think of this as "the cause" or "an influence."

32 Regression Regression tells us in precise terms the strength of the relationship between the dependent and independent variable. How? By fitting the line through the data that minimizes the sum of squared errors.

33 Regression OLS picks the line with the smallest sum of squared errors.

34 Regression Regression is a process that evaluates the relationship between two variables and selects the line that minimizes the sum of squared errors around the line:

Y = a + bX + e

Where: Y = dependent variable, a = intercept, b = slope, X = independent variable, e = residual
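Minimizing the sum of squared errors has a closed-form solution: b = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)², with intercept a = ȳ − b·x̄. A sketch with hypothetical data:

```python
def ols(xs, ys):
    # Least-squares slope: b = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx  # the fitted line passes through (mean_x, mean_y)
    return a, b

# Points lying exactly on Y = 2 + 3X, so OLS recovers those values.
a, b = ols([0, 1, 2, 3], [2, 5, 8, 11])
print(a, b)  # 2.0 3.0
```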

35 Regression The relationship between the independent and dependent variable is summarized by the regression coefficient, which tells us the angle or slope of the line.

36 Regression Regression Coefficient (b)-
Tells us how much Y changes for a one-unit change in X. This is called the slope, which reflects the angle of the line.

37 Regression Intercept-
Tells us what the value of Y is when X is zero. Also called the constant.

38 Regression R squared (R²)
Tells us how much of the variation in the dependent variable (Y) our model explains. It's how well the line fits or describes the data. Ranges from 0 to 100%.

39 Regression
What is the relationship between the vote for Ross Perot and Bob Dole in Florida in 1996? ALWAYS begin by graphing your data.

40 Regression OLS picks the line with the smallest sum of squared errors.

41 Regression [Annotated regression output highlighting the intercept, slope (b), and R²]

42 Regression: Interpretation
Three main results: The slope: for every additional vote Bob Dole received, Perot got .18 more votes; or, for every 100 votes for Dole, Perot got 18. The intercept: where Dole got no votes, we expect Perot to get 1055 votes. The fit: the model explains about 84% of the variation in Ross Perot's vote.
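The fitted line can be used to generate predictions. A sketch plugging in the slide's estimates (intercept 1055, slope .184):

```python
intercept, slope = 1055, 0.184  # estimates from the Florida 1996 regression

def predicted_perot(dole_votes):
    # Predicted Perot vote = intercept + slope * Dole vote.
    return intercept + slope * dole_votes

print(predicted_perot(0))      # 1055.0: the intercept
print(predicted_perot(10000))  # about 2895: every 100 Dole votes add roughly 18
```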

43 Regression Slope (b) = .184, R² = .843, Intercept = 1055

44 High vs. Low R² [Two scatter plots: R² = .06 and R² = .45] R² is used to compare how well different models explain the data. A higher R² indicates a better fit.

45 Regression Standard Error
To this point, we have assumed that we know the standard deviation of the population. In practice, we seldom know this, so we estimate it (just as we estimate the population mean). This estimate is called the standard error.

46 Regression We estimate a standard error for the slope and the intercept and use it like we did the standard deviation: to perform significance tests.

47 Regression Here we are conducting a significance test of whether the observed slope and intercept differ from the null hypothesis. This statistic follows a T distribution, which accounts for the additional uncertainty that comes from estimating the standard error.

48 Regression [Regression output highlighting the intercept, slope (b), R², and the standard errors of the slope and intercept]

49 Regression
We can simply plug the figures in from the Stata output. This statistic follows a T distribution, which accounts for the additional uncertainty that comes from estimating the standard error (see inside back cover of M&M). To determine the degrees of freedom, subtract the number of variables in the model (2) from the number of observations (67). This is abbreviated as n - k.

50 Regression
We can simply plug the figures in from the Stata output. Or, we can use our rule of thumb: if the T statistic is greater than 2, it is significant!

51 Regression
If we look to the table we see that the p value is less than .0001: less than 1 time in 10,000 would we see a result as big as .184 if the true value were zero. Stata also reports these p values for us.

52 Regression: Interpreting the Rest
T stats, p values, and the F statistic (a joint test that all coefficients = 0).

53 Regression: Residuals
OK, let's interpret this.

54 Regression: Residuals
Next up: residuals, regression assumptions, multiple regression.

