Political Science 30: Political Inquiry

Presentation transcript:

Political Science 30: Political Inquiry Linear Regression II

Linear Regression II: Making Sense of Regression Results Interpreting SPSS regression output Coefficients for independent variables Fit of the regression: R Square Statistical significance How to reject the null hypothesis Multivariate regressions College graduation rates Ethnicity and voting

Linear Regression: Review Want to draw a line that best represents the relationship between the IV (X) and DV (Y). Y = a + b*X Allows us to predict the DV given a value of the IV. Regression finds the values for a and b that minimize the distance between the points and the line. Technically, a and b are population parameters. We only get to calculate sample statistics, a-hat and b-hat.
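As a rough illustration of what regression is doing, here is a minimal Python sketch (numpy only; the SAT and graduation-rate numbers are made up for illustration) that computes the least-squares estimates a-hat and b-hat:

import numpy as np

# Hypothetical data: average SAT score (X) and graduation rate in percent (Y)
# for a handful of colleges -- illustrative values only.
sat = np.array([1000.0, 1100.0, 1200.0, 1300.0, 1400.0])
grad_rate = np.array([62.0, 68.0, 75.0, 81.0, 86.0])

# Least-squares estimates: b-hat = cov(X, Y) / var(X), a-hat = mean(Y) - b-hat * mean(X)
b_hat = np.cov(sat, grad_rate, ddof=1)[0, 1] / np.var(sat, ddof=1)
a_hat = grad_rate.mean() - b_hat * sat.mean()
print(f"Estimated line: grad_rate = {a_hat:.1f} + {b_hat:.3f} * SAT")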

Interpreting SPSS regression output Slope or “coefficient” How tight is the fit? Y-intercept or “constant”

Interpreting SPSS regression output An SPSS regression output includes two key tables for interpreting your results: A “Coefficients” table that contains the y-intercept (or “constant”) of the regression, a coefficient for every independent variable, and the standard error of that coefficient. A “Model Summary” table that gives you information on the fit of your regression.

Interpreting SPSS regression output: Coefficients In this class, we will ONLY LOOK AT UNSTANDARDIZED COEFFICIENTS! • The y-intercept is 4.2% with a standard error of 7.0% • The coefficient for SAT Scores is 0.059%, with a standard error of 0.007%.

Interpreting SPSS regression output: Coefficients Est. Graduation Rate = 4.2 + 0.059 * Average SAT Score

Interpreting SPSS regression output: Coefficients The y-intercept or constant is the predicted value of the dependent variable when the independent variable takes on the value of zero. This basic model predicts that when a college admits a class of students who averaged zero on their SAT, 4.2% of them will graduate. The constant is not the most helpful statistic.

Interpreting SPSS regression output: Coefficients The coefficient of an independent variable is the predicted change in the dependent variable that results from a one unit increase in the independent variable. A college with students whose SAT scores are one point higher on average will have a graduation rate that is 0.059% higher. Increasing SAT scores by 200 points leads to a (200)(0.059%) = 11.8% rise in graduation rates.
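To make the prediction step concrete, here is a small Python sketch that plugs hypothetical SAT averages into the constant and coefficient reported on the slide:

# Prediction using the slide's estimates: constant = 4.2, SAT coefficient = 0.059.
def predicted_grad_rate(avg_sat, intercept=4.2, slope=0.059):
    """Predicted graduation rate (in percent) for a given average SAT score."""
    return intercept + slope * avg_sat

print(predicted_grad_rate(1000))  # 4.2 + 0.059 * 1000 = 63.2
print(predicted_grad_rate(1200))  # 4.2 + 0.059 * 1200 = 75.0
# A 200-point difference in SAT scores shifts the prediction by 200 * 0.059 = 11.8 points.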

Interpreting SPSS regression output: Fit of the Regression The R Square measures how closely a regression line fits the data in a scatterplot. • It can range from zero (no explanatory power) to one (perfect prediction). • An R Square of 0.345 means that differences in SAT scores can explain about 35% of the variation in college graduation rates. Key sentence for your homework!
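For anyone curious where R Square comes from, this short Python sketch (numpy, reusing the made-up values from the earlier sketch) computes it as one minus the ratio of leftover (residual) variation to total variation:

import numpy as np

sat = np.array([1000.0, 1100.0, 1200.0, 1300.0, 1400.0])   # hypothetical data
grad_rate = np.array([62.0, 68.0, 75.0, 81.0, 86.0])

b_hat, a_hat = np.polyfit(sat, grad_rate, 1)   # slope, then intercept
predicted = a_hat + b_hat * sat
ss_residual = np.sum((grad_rate - predicted) ** 2)        # variation the line misses
ss_total = np.sum((grad_rate - grad_rate.mean()) ** 2)    # total variation in Y
r_square = 1 - ss_residual / ss_total
print(f"R Square = {r_square:.3f}")   # share of variation in Y explained by X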

R Square Examples

Statistical Significance What would the null hypothesis look like in a scatterplot? If the independent variable has no effect on the dependent variable, the scatterplot should look random, the regression line should be flat, and its slope should be zero. Null hypothesis: The regression coefficient (b) for an independent variable equals zero. Can we reject the null hypothesis that b = 0 based on our estimate, b-hat?

Statistical Significance Our formal test of statistical significance asks whether we can be sure that a regression coefficient for the population differs from zero. Just like in a difference in means/proportions test, the "standard error" is the standard deviation of the sampling distribution. If a coefficient is more than two standard errors away from zero, we can reject the null hypothesis (that it equals zero).

Statistical Significance So, if a coefficient is more than twice the size of its standard error, we reject the null hypothesis with 95% confidence. This works whether the coefficient is negative or positive. The coefficient/standard error ratio is called the "test statistic" or "t-stat." A t-stat bigger than 2 or less than -2 indicates a statistically significant relationship.
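A minimal Python sketch of this rule of thumb; the first pair of numbers is the SAT coefficient and standard error from the earlier slide, while the second standard error is invented for illustration:

# t-stat = coefficient / standard error; reject the null (b = 0) at roughly
# 95% confidence when the t-stat is greater than 2 or less than -2.
def is_significant(coefficient, std_error, threshold=2.0):
    t_stat = coefficient / std_error
    return t_stat, abs(t_stat) > threshold

print(is_significant(0.059, 0.007))   # t is about 8.4 -> statistically significant
print(is_significant(-0.021, 0.030))  # hypothetical SE: t = -0.7 -> not significant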

Interpreting SPSS regression output: T-Stats

Multivariate Regressions A “multivariate regression” uses more than one independent variable (or confound) to explain variation in a dependent variable. The coefficient for each independent variable reports its effect on the DV, holding constant all of the other IVs in the regression. Thought experiment: Comparing two colleges founded in the same year with the same student faculty ratio, what is the effect of SATs?

Multivariate Regressions [Slide diagram: Year of Founding, SAT Scores, Tuition, and Student/Faculty Ratio as predictors of Graduation Rates]

Multivariate Regressions Again, want to estimate coefficients: Est. Grad. Rate = a + b1*SAT Score + b2*Year Founded + b3*Tuition + b4*Faculty Ratio
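The lecture estimates this model in SPSS; as a rough Python analogue, here is a sketch using pandas and statsmodels with an invented data frame (the column names and values are hypothetical, not the lecture's data):

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical college-level data, for illustration only.
colleges = pd.DataFrame({
    "grad_rate":     [62, 68, 75, 81, 86, 55, 90, 72],      # percent graduating
    "sat_score":     [1000, 1100, 1200, 1300, 1400, 950, 1450, 1150],
    "year_founded":  [1890, 1920, 1850, 1900, 1870, 1960, 1840, 1910],
    "tuition":       [20, 25, 35, 40, 50, 15, 55, 30],       # in $1,000s
    "faculty_ratio": [18, 16, 14, 12, 10, 20, 9, 15],        # students per faculty
})

# Each coefficient is the predicted effect of that IV on the DV,
# holding the other IVs in the model constant.
model = smf.ols("grad_rate ~ sat_score + year_founded + tuition + faculty_ratio",
                data=colleges).fit()
print(model.params)    # estimated a, b1, b2, b3, b4
print(model.bse)       # standard errors of the coefficients
print(model.tvalues)   # t-stats for the significance test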

Multivariate Regressions

Multivariate Regressions Holding all other factors constant, a 200 point increase in SAT scores leads to a predicted (200)(0.042) = 8.4% increase in the graduation rate, and this effect is statistically significant. Controlling for other factors, a college that is 100 years younger should have a graduation rate that is (100)(-0.021) = 2.1% lower, but this effect is not significantly different from zero.