Lecture 3 HSPM J716

New spreadsheet layout
Coefficient
Standard error
T-statistic
– Coefficient ÷ its standard error
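A minimal sketch of where those three columns come from, using numpy with made-up data rather than the course's LS spreadsheet program:

```python
# Sketch: reproduce the three columns of the layout (coefficient,
# standard error, t-statistic) for a simple regression with numpy.
# The X and Y values below are hypothetical, for illustration only.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3])

X = np.column_stack([np.ones_like(x), x])        # intercept column plus X
coef, *_ = np.linalg.lstsq(X, y, rcond=None)     # least-squares coefficients

resid = y - X @ coef
dof = len(y) - X.shape[1]                        # n minus number of coefficients
s2 = resid @ resid / dof                         # residual variance
se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))   # standard errors
t_stat = coef / se                               # t-statistic = coefficient / its SE

for name, c, e, t in zip(["intercept", "slope"], coef, se, t_stat):
    print(f"{name:9s}  coef={c:8.4f}  std.err={e:8.4f}  t={t:8.2f}")
```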

Standard error of coefficient
Shows how close the estimated coefficient is likely to be to the true coefficient.

Confidence interval for a coefficient
Coefficient ± its standard error × t from table
95% probability that the true coefficient is in the 95% confidence interval?
– If you do a lot of studies, you can expect that, for 95% of them, the true coefficient will be in the 95% confidence interval.
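A sketch of that arithmetic with hypothetical numbers, letting scipy supply the "t from table" value:

```python
# Confidence interval = coefficient ± standard error × t from table.
# coef, se, and dof are placeholders, not the assignment's data.
from scipy.stats import t

coef, se, dof = 2.05, 0.12, 4
t_table = t.ppf(0.975, dof)            # two-sided 95% critical value
low, high = coef - t_table * se, coef + t_table * se
print(f"95% CI: {low:.3f} to {high:.3f}")
```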

Standard error of the regression
Should be called standard residual
– But it isn't
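For reference, the usual textbook definition in the simple-regression case, with n observations and residuals e_i, is:

```latex
s = \sqrt{\frac{\sum_{i=1}^{n} e_i^2}{n - 2}}
```

so it is roughly the typical size of a residual, which is the point of the name the slide suggests.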

Assumptions
Required for using the linear least squares model
Illustrated in assignment 2

Durbin-Watson statistic
Serial correlation
– For clinic 2
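The statistic itself is computed from the regression residuals; a sketch with made-up residuals rather than the clinic 2 data:

```python
# Standard Durbin-Watson statistic from a series of regression residuals.
# `resid` here is hypothetical; in practice it comes from the fitted model.
import numpy as np

resid = np.array([0.4, 0.6, 0.3, -0.2, -0.5, -0.6, -0.1, 0.3])
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
print(f"Durbin-Watson = {dw:.2f}")   # near 2 suggests little serial correlation;
                                     # well below 2 suggests positive serial correlation
```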

Confidence interval for prediction
The hyperbolic outline
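The hyperbolic shape follows from the standard simple-regression prediction-interval formula (textbook form, with ŷ₀ the prediction at x₀ and s the standard error of the regression):

```latex
\hat{y}_0 \;\pm\; t_{\alpha/2,\,n-2}\; s \sqrt{1 + \frac{1}{n} + \frac{(x_0 - \bar{x})^2}{\sum_i (x_i - \bar{x})^2}}
```

The (x₀ − x̄)² term widens the interval as x₀ moves away from the mean of X, which traces out the hyperbolic band.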

Formal outlier test?
Using confidence interval of prediction

Multiple regression
3 or more dimensions
2 or more X variables
Y = α + βX + γZ + error
Y = α + β₁X₁ + β₂X₂ + … + βₚXₚ + error
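A sketch of a two-X-variable fit on simulated data, using statsmodels instead of the course's LS program; all numbers are illustrative:

```python
# Fit Y = alpha + beta1*X1 + beta2*X2 + error on simulated data and
# read off the same columns as before: coefficients, standard errors, t-statistics.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 30
X1 = rng.uniform(0, 10, n)
X2 = rng.uniform(0, 5, n)
y = 1.0 + 2.0 * X1 - 0.5 * X2 + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([X1, X2]))   # intercept, X1, X2
results = sm.OLS(y, X).fit()
print(results.params)    # estimated alpha, beta1, beta2
print(results.bse)       # their standard errors
print(results.tvalues)   # t-statistics
```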

Fitting a plane in 3D space
Linear assumption
– Now a flat plane
– The effect of a change in X₁ on Y is the same at all levels of X₁ and X₂ and any other X variables.
Residuals are vertical distances from the plane to the data points floating in space.

β interpretation in Y = α + βX + γZ + error
β is the effect on Y of changing X by 1, holding Z constant.
Often, there is a linear relationship between X and Z.
When X is one unit bigger than you would predict from Z, expect Y to be β bigger than you would predict from Z.
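A numerical illustration of that reading (simulated data, not the course example): regress X on Z and Y on Z, keep both sets of residuals, and the slope of the Y residuals on the X residuals matches the β from the full regression.

```python
# Partialling-out illustration: beta on X from the full regression equals the
# slope from regressing "Y not predicted by Z" on "X not predicted by Z".
import numpy as np

rng = np.random.default_rng(1)
n = 200
Z = rng.normal(size=n)
X = 0.8 * Z + rng.normal(size=n)                  # X and Z are linearly related
Y = 1.0 + 2.0 * X - 1.5 * Z + rng.normal(size=n)

def ols(design, y):
    return np.linalg.lstsq(design, y, rcond=None)[0]

ones = np.ones(n)
full = ols(np.column_stack([ones, X, Z]), Y)      # [alpha-hat, beta-hat, gamma-hat]

ZD = np.column_stack([ones, Z])
x_resid = X - ZD @ ols(ZD, X)                     # part of X not predicted by Z
y_resid = Y - ZD @ ols(ZD, Y)                     # part of Y not predicted by Z
beta_partial = ols(x_resid.reshape(-1, 1), y_resid)[0]

print(full[1], beta_partial)                      # the two betas agree (up to rounding)
```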

β-hat formula in Y = α + βX + γZ + error
– See pdf file
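The pdf itself is not part of this transcript; for reference, the standard deviation-from-means formula for the two-regressor case is:

```latex
\hat{\beta} \;=\; \frac{\left(\sum x_i y_i\right)\left(\sum z_i^2\right) - \left(\sum z_i y_i\right)\left(\sum x_i z_i\right)}
{\left(\sum x_i^2\right)\left(\sum z_i^2\right) - \left(\sum x_i z_i\right)^2}
```

where x_i = X_i − X̄, y_i = Y_i − Ȳ, and z_i = Z_i − Z̄.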

LS
Spreadsheet as front end
Word processor as back end
Interpretation of results
– Coefficients
– Standard errors
– T-statistics
– P-values
Prediction