
Regression Models - Introduction




1 Regression Models - Introduction
In regression models we fit a statistical model, in particular a linear model, to data. We use the errors that remain after fitting the model to understand how they affect the estimated model. We generally use regression to predict the value of one variable given the values of others. We also aim to understand the underlying mechanism that produced the data, and for this we use inferential techniques. In regression models there are two types of variables:
Y – the dependent variable, also called the response variable. It is modeled as random.
X – the independent variable, also called the predictor or explanatory variable. It is sometimes modeled as random and sometimes has a fixed value for each observation.
week1

2 Simple Linear Regression - Introduction
Simple linear regression studies the relationship between a quantitative response variable Y and a single explanatory variable X. The idea of a statistical model: Actual observed value of Y = … In general, a simple model for how the data were generated is imposed by the statistician. Box (a well-known statistician) claimed: "All models are wrong, but some are useful." 'Useful' means that the model describes the data well and can be used for prediction and inference. Recall: parameters are constants in a statistical model which we usually do not know, but which we use data to estimate.
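The right-hand side of the elided identity above did not survive extraction. It is presumably the usual decomposition of an observed response into a systematic part plus random error (a sketch in standard notation, not necessarily the slide's exact symbols):

```latex
\underbrace{Y}_{\text{observed value}}
\;=\;
\underbrace{E(Y \mid X = x)}_{\text{systematic (model) part}}
\;+\;
\underbrace{\varepsilon}_{\text{random error}}
```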

3 Simple Linear Regression Models
The statistical model for simple linear regression is a straight-line model of the form … where … For particular points, … We expect that different values of X will produce different mean responses. In particular, for each value of X the possible values of Y follow a distribution whose mean is … Formally, this means that …
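The model formulas on this slide were lost in extraction. Assuming the standard simple linear regression setup (consistent with the parameters β0 and β1 named on the next slide), the straight-line model is:

```latex
Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i, \qquad i = 1, \dots, n,
\qquad\text{so that}\qquad
E(Y \mid X = x) = \beta_0 + \beta_1 x,
```

where the errors ε_i are typically assumed independent with mean 0 and constant variance σ².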

4 Estimation – Least Square Method
Estimates of the unknown parameters β0 and β1 based on our observed data are usually denoted by b0 and b1. For each observed value xi of X, the fitted value of Y is ŷi = b0 + b1xi. This is the equation of a straight line. The deviations from the line in the vertical direction are the errors in the prediction of Y and are called "residuals". They are defined as ei = yi − ŷi. The estimates b0 and b1 are found by the Method of Least Squares, which is based on minimizing the sum of squares of the residuals. Note that the least-squares estimates are found without making any statistical assumptions about the data.
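The least-squares estimates described above can be computed directly from their closed-form solution. A minimal sketch (function and variable names are illustrative, not from the slides):

```python
# Closed-form least-squares estimates for simple linear regression.
def least_squares(x, y):
    """Return (b0, b1) minimizing the sum of squared residuals."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # Sxy and Sxx: centered cross-product and sum of squares of x.
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    b1 = sxy / sxx          # estimated slope
    b0 = y_bar - b1 * x_bar  # estimated intercept
    return b0, b1

# Example: data lying exactly on the line y = 1 + 2x.
x = [1, 2, 3, 4]
y = [3, 5, 7, 9]
b0, b1 = least_squares(x, y)
print(b0, b1)  # 1.0 2.0
```

Note that, as the slide says, nothing statistical is assumed here: the computation is pure arithmetic on the observed points.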

5 Derivation of Least-Squares Estimates
Let S(b0, b1) = Σi (yi − b0 − b1xi)², the sum of squared residuals. We want to find the b0 and b1 that minimize S. Use calculus….
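The calculus step elided here is standard: set the two partial derivatives of S to zero and solve the resulting normal equations. A sketch:

```latex
\begin{align*}
S(b_0, b_1) &= \sum_{i=1}^{n} (y_i - b_0 - b_1 x_i)^2 \\
\frac{\partial S}{\partial b_0} &= -2 \sum_{i=1}^{n} (y_i - b_0 - b_1 x_i) = 0 \\
\frac{\partial S}{\partial b_1} &= -2 \sum_{i=1}^{n} x_i (y_i - b_0 - b_1 x_i) = 0 \\
\Rightarrow\quad
b_1 &= \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2},
\qquad
b_0 = \bar{y} - b_1 \bar{x}.
\end{align*}
```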

6 Properties of Fitted Line
Note: you need to know how to prove the above properties.
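The property list itself did not survive extraction. The properties usually stated for the least-squares fitted line (an assumption on my part, as they are the ones typically proved in a first regression course) are: the residuals sum to zero, Σ xi·ei = 0, and the fitted line passes through the point (x̄, ȳ). A quick numerical check:

```python
# Numerically verify the (assumed) standard properties of the fitted line.
def fit(x, y):
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    b1 = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
          / sum((xi - x_bar) ** 2 for xi in x))
    b0 = y_bar - b1 * x_bar
    return b0, b1

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
b0, b1 = fit(x, y)
yhat = [b0 + b1 * xi for xi in x]          # fitted values
e = [yi - yhi for yi, yhi in zip(y, yhat)]  # residuals

x_bar, y_bar = sum(x) / len(x), sum(y) / len(y)
assert abs(sum(e)) < 1e-9                                  # residuals sum to 0
assert abs(sum(xi * ei for xi, ei in zip(x, e))) < 1e-9    # sum of x_i * e_i is 0
assert abs((b0 + b1 * x_bar) - y_bar) < 1e-9               # line passes through (x̄, ȳ)
print("all properties hold")
```

These hold algebraically for any data set, which is exactly why the slide asks you to be able to prove them.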

