
1 Chapter 14 Introduction to Regression Analysis

2 Objectives: Regression Analysis; Uses of Regression Analysis; Method of Least Squares; Difference between Regression & Correlation

3 Regression Regression is a measure of the average relationship between two or more variables in terms of the original units of the data. Independent variable: the variable used to predict the variable of interest. Dependent variable: the variable we want to predict.

4 Uses of Regression Analysis Regression analysis provides estimates of the values of the dependent variable from the values of the independent variable by means of a device called the regression line. It helps in obtaining a measure of the error involved in using the regression line as a basis of estimation. With the help of the regression coefficients, we can calculate the correlation coefficient.
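As a note on the last point (this identity is standard but was not spelled out on the slide as captured), the correlation coefficient is the geometric mean of the two regression coefficients:

```latex
% r takes the common sign of b_yx and b_xy
r = \pm\sqrt{b_{yx}\, b_{xy}}, \qquad
b_{yx} = r\,\frac{\sigma_y}{\sigma_x}, \qquad
b_{xy} = r\,\frac{\sigma_x}{\sigma_y}
```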

5 Regression Regression is the attempt to explain the variation in a dependent variable using the variation in independent variables. Regression is thus an explanation of causation. If the independent variable(s) sufficiently explain the variation in the dependent variable, the model can be used for prediction. (Figure: scatter plot with the independent variable (x) on the horizontal axis and the dependent variable (y) on the vertical axis.)

6 Simple Linear Regression The output of a regression is a function that predicts the dependent variable based upon values of the independent variables. Simple regression fits a straight line to the data: y' = b0 + b1X ± ε, where b0 is the y-intercept, b1 = Δy/Δx is the slope, and ε is the error term. (Figure: fitted straight line through a scatter plot of the dependent variable (y) against the independent variable (x).)
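A minimal sketch of fitting such a line by least squares, using NumPy; the variable names and the sample data below are illustrative, not taken from the slides:

```python
import numpy as np

# Illustrative data (not from the slides): x is the independent
# variable, y the dependent variable.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Least-squares estimates:
# b1 = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2),  b0 = y_bar - b1 * x_bar
x_bar, y_bar = x.mean(), y.mean()
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b0 = y_bar - b1 * x_bar

y_hat = b0 + b1 * x  # fitted (predicted) values of y
print(b0, b1)
```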

7 Simple Linear Regression The function will make a prediction for each observed data point. The observation is denoted by y and the prediction is denoted by ŷ. (Figure: scatter plot marking an observation y and the corresponding prediction ŷ on the fitted line.)

8 Simple Linear Regression For each observation, the variation can be described as: y = ŷ + ε, that is, Actual = Explained + Error. (Figure: the same plot, marking the observation y, the prediction ŷ, and the prediction error ε between them.)
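The decomposition can be checked directly. The sketch below again uses illustrative data and NumPy's polyfit for the straight-line fit; nothing here comes from the slides themselves:

```python
import numpy as np

# Illustrative data (not from the slides).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit y' = b0 + b1*x by least squares, then split each observation
# into its explained part (y_hat) and its error part (epsilon).
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x
epsilon = y - y_hat

# Actual = Explained + Error holds exactly, observation by observation.
assert np.allclose(y, y_hat + epsilon)
print(epsilon)
```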

9 Regression Lines & Least Square Method Regression line of Y on X: this line gives the most probable values of Y for given values of X, Y = a + bX, where a and b are constants found by the method of least squares (see the sketch below). The value of b is the regression coefficient of Y on X, written b_yx.
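The formulas for a and b attached to this slide were images and did not survive in the transcript; as a sketch, the standard least-squares forms for the line of Y on X are:

```latex
% Normal equations for Y = a + bX
\sum Y = Na + b\sum X, \qquad \sum XY = a\sum X + b\sum X^{2}
% Solving them gives
b_{yx} = \frac{N\sum XY - \sum X \sum Y}{N\sum X^{2} - (\sum X)^{2}}
       = r\,\frac{\sigma_y}{\sigma_x},
\qquad a = \bar{Y} - b_{yx}\,\bar{X}
```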

10 Regression line of X on Y: this line gives the most probable values of X for given values of Y, X = a + bY, where a and b are constants found by the method of least squares (see the sketch below). The value of b is the regression coefficient of X on Y, written b_xy.
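As with the previous slide, the original formulas were not captured; the corresponding standard forms for the line of X on Y are:

```latex
% Normal equations for X = a + bY
\sum X = Na + b\sum Y, \qquad \sum XY = a\sum Y + b\sum Y^{2}
% Solving them gives
b_{xy} = \frac{N\sum XY - \sum X \sum Y}{N\sum Y^{2} - (\sum Y)^{2}}
       = r\,\frac{\sigma_x}{\sigma_y},
\qquad a = \bar{X} - b_{xy}\,\bar{Y}
```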

11 Another Approach Regression equation of Y on X (in deviation form): (Y − Ȳ) = b_yx (X − X̄), where b_yx is the regression coefficient of Y on X. Regression equation of X on Y: (X − X̄) = b_xy (Y − Ȳ), where b_xy is the regression coefficient of X on Y.
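A small numerical sketch of this approach, again with illustrative data (none of the numbers or names come from the slides): compute both coefficients in deviation form and confirm that their product equals r squared.

```python
import numpy as np

# Illustrative data (not from the slides).
X = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
Y = np.array([3.0, 7.0, 5.0, 10.0, 12.0])

X_bar, Y_bar = X.mean(), Y.mean()
dx, dy = X - X_bar, Y - Y_bar

# Regression coefficients in deviation form.
b_yx = np.sum(dx * dy) / np.sum(dx ** 2)   # Y on X
b_xy = np.sum(dx * dy) / np.sum(dy ** 2)   # X on Y

# Their product reproduces the square of the correlation coefficient,
# and both lines pass through the point of means (X_bar, Y_bar).
r = np.corrcoef(X, Y)[0, 1]
assert np.isclose(r ** 2, b_yx * b_xy)
print(b_yx, b_xy, r)
```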

12 Actual Mean Method, Assumed Mean Method, and Direct Method: each method gives the regression coefficient of Y on X and the regression coefficient of X on Y (the standard formulas are sketched below).
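The formulas on this slide were images and are missing from the transcript; the usual textbook forms of the three methods, with deviations x = X − X̄, y = Y − Ȳ from the actual means and d_x = X − A, d_y = Y − B from assumed means A and B, are:

```latex
% Actual mean method
b_{yx} = \frac{\sum xy}{\sum x^{2}}, \qquad b_{xy} = \frac{\sum xy}{\sum y^{2}}

% Assumed mean method
b_{yx} = \frac{N\sum d_x d_y - \sum d_x \sum d_y}{N\sum d_x^{2} - (\sum d_x)^{2}}, \qquad
b_{xy} = \frac{N\sum d_x d_y - \sum d_x \sum d_y}{N\sum d_y^{2} - (\sum d_y)^{2}}

% Direct method (raw observations)
b_{yx} = \frac{N\sum XY - \sum X \sum Y}{N\sum X^{2} - (\sum X)^{2}}, \qquad
b_{xy} = \frac{N\sum XY - \sum X \sum Y}{N\sum Y^{2} - (\sum Y)^{2}}
```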

13 Regression in Case of Grouped Data: regression coefficient of Y on X and regression coefficient of X on Y (the formulas are sketched below).
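The grouped-data formulas on this slide were likewise not captured; the usual step-deviation forms, with cell frequencies f, step deviations d_x, d_y and class widths i_x, i_y, are:

```latex
b_{yx} = \frac{N\sum f\,d_x d_y - \sum f d_x \sum f d_y}{N\sum f d_x^{2} - (\sum f d_x)^{2}}
         \cdot \frac{i_y}{i_x}, \qquad
b_{xy} = \frac{N\sum f\,d_x d_y - \sum f d_x \sum f d_y}{N\sum f d_y^{2} - (\sum f d_y)^{2}}
         \cdot \frac{i_x}{i_y}
```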

14 Thank You

