Statistical Assumptions for SLR

Statistical Assumptions for SLR
The assumptions for the simple linear regression model are:
1) The simple linear regression model Yi = β0 + β1Xi + εi, where i = 1, …, n, is appropriate.
2) E(εi) = 0.
3) Var(εi) = σ².
4) The εi's are uncorrelated.
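
Below is a minimal sketch (in Python, using NumPy) of data generated under these assumptions. The parameter values β0 = 2, β1 = 0.5 and σ = 1, the sample size, and the use of Normal errors are all assumptions made purely for illustration; the assumptions above only require mean-zero, constant-variance, uncorrelated errors.

```python
# Minimal sketch: simulate data consistent with the SLR assumptions
#   Y_i = beta0 + beta1 * X_i + eps_i,  E(eps_i) = 0,  Var(eps_i) = sigma^2,
# with uncorrelated (here i.i.d.) errors. All parameter values are
# hypothetical, chosen only for illustration.
import numpy as np

rng = np.random.default_rng(302)

n = 50
beta0, beta1, sigma = 2.0, 0.5, 1.0    # assumed "true" values
x = rng.uniform(0, 10, size=n)         # predictor values, treated as fixed
eps = rng.normal(0, sigma, size=n)     # mean-zero, constant-variance errors
y = beta0 + beta1 * x + eps            # responses generated by the model
```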

Properties of Least Squares Estimates
An estimate of β0 and β1 is a function of the data that can be calculated numerically for a given data set. An estimator of β0 and β1 is a function of the underlying random variables.
Recall: the least squares estimators are b1 = Σ(Xi − X̄)(Yi − Ȳ) / Σ(Xi − X̄)² and b0 = Ȳ − b1X̄.
Claim: The least squares estimators are unbiased estimators of β0 and β1.
Proof sketch: write b1 = Σ kiYi with ki = (Xi − X̄) / Σ(Xj − X̄)². Since Σ ki = 0 and Σ kiXi = 1, taking expectations under the model gives E(b1) = β1, and then E(b0) = E(Ȳ − b1X̄) = β0.
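
The following sketch (hypothetical simulated data, same assumed parameter values as above) computes the least squares estimates from the closed-form formulas and, by averaging the estimates over many simulated samples, illustrates the unbiasedness claim: the averages land near the assumed true β0 and β1.

```python
# Minimal sketch: closed-form least squares estimators and a small simulation
# illustrating their unbiasedness. Parameter values are hypothetical.
import numpy as np

def least_squares(x, y):
    """Return (b0, b1) minimizing the sum of squared residuals."""
    xbar, ybar = x.mean(), y.mean()
    b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
    b0 = ybar - b1 * xbar
    return b0, b1

rng = np.random.default_rng(1)
n, beta0, beta1, sigma = 50, 2.0, 0.5, 1.0
x = rng.uniform(0, 10, size=n)          # X held fixed across replications

estimates = []
for _ in range(5000):
    y = beta0 + beta1 * x + rng.normal(0, sigma, size=n)
    estimates.append(least_squares(x, y))

b0_mean, b1_mean = np.mean(estimates, axis=0)
print(b0_mean, b1_mean)   # should be close to beta0 = 2.0 and beta1 = 0.5
```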

Estimation of Error Term Variance σ²
The variance σ² of the error terms εi needs to be estimated to obtain an indication of the variability of the probability distribution of Y. Further, a variety of inferences concerning the regression function and the prediction of Y require an estimate of σ².
Recall: for a random variable Z, the estimates of the mean and variance of Z based on n realizations of Z are the sample mean Z̄ = ΣZi / n and the sample variance Σ(Zi − Z̄)² / (n − 1).
Similarly, the estimate of σ² is S² = SSE / (n − 2) = Σ(Yi − Ŷi)² / (n − 2), where the Ŷi are the fitted values. S² is called the MSE (Mean Square Error); it is an unbiased estimator of σ² (proof later on).
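
A short sketch of the same computation on hypothetical simulated data: fit by least squares (np.polyfit is used here just for the fit), form the residuals, and estimate σ² by SSE/(n − 2); the result should be close to the assumed σ² = 1.

```python
# Minimal sketch: estimate the error variance by the MSE, S^2 = SSE / (n - 2).
# Data and parameter values are hypothetical, for illustration only.
import numpy as np

rng = np.random.default_rng(2)
n, beta0, beta1, sigma = 50, 2.0, 0.5, 1.0
x = rng.uniform(0, 10, size=n)
y = beta0 + beta1 * x + rng.normal(0, sigma, size=n)

b1_hat, b0_hat = np.polyfit(x, y, 1)    # least squares fit (slope, intercept)
resid = y - (b0_hat + b1_hat * x)       # e_i = Y_i - Yhat_i
s2 = np.sum(resid ** 2) / (n - 2)       # MSE: two df lost estimating b0, b1
print(s2)                               # should be roughly sigma^2 = 1
```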

Normal Error Regression Model
In order to make inferences we need one more assumption about the εi's: we assume that the εi's have a Normal distribution, that is, εi ~ N(0, σ²). Combined with the assumption that the errors are uncorrelated, Normality implies that they are independent (uncorrelated Normal errors are independent).
Under the Normality assumption on the errors, the least squares estimates of β0 and β1 are equivalent to their maximum likelihood estimators. This gives them the additional nice properties of MLEs: they are consistent, sufficient and MVUE.
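
The equivalence of least squares and maximum likelihood under Normal errors can be checked numerically, as in the sketch below: maximizing the Normal log-likelihood over (β0, β1) is the same as minimizing the SSE, so the two sets of estimates agree. The data, parameter values, and use of scipy.optimize are assumptions made for illustration.

```python
# Minimal sketch: under the normal-error model, the MLEs of beta0 and beta1
# coincide with the least squares estimates. Hypothetical simulated data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 50
x = rng.uniform(0, 10, size=n)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=n)   # assumed true model

def neg_log_lik(theta):
    """Negative Normal log-likelihood (additive constant dropped)."""
    b0, b1, log_sigma = theta
    sigma = np.exp(log_sigma)                    # parameterize so sigma > 0
    resid = y - (b0 + b1 * x)
    return n * np.log(sigma) + np.sum(resid ** 2) / (2 * sigma ** 2)

b0_mle, b1_mle, _ = minimize(neg_log_lik, x0=np.zeros(3)).x
b1_ls, b0_ls = np.polyfit(x, y, 1)               # closed-form least squares
print((b0_mle, b1_mle), (b0_ls, b1_ls))          # the two pairs should agree
```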