REGRESSION

Definition and Meaning:

Regression analysis measures the nature and extent of the relationship between two or more variables and thus enables us to make predictions. The dictionary meaning of regression is "the act of returning or going back". The term was first used in 1877 by Francis Galton.

Regression is the statistical tool with whose help we can estimate (predict) the unknown values of one variable from the known values of another variable. It helps us find the average probable change in one variable for a given amount of change in another.

Regression lines: For two variables X and Y, we have two regression lines. The regression line of Y on X gives the values of Y for given values of X; the regression line of X on Y gives the values of X for given values of Y.

Regression Equation:

Regression equations are algebraic expressions of the regression lines.

Y on X: the regression equation is expressed as Y = a + bX, where Y is the dependent variable, X is the independent variable, and 'a' and 'b' are the constants (parameters) of the line. 'a' determines the level of the fitted line (i.e. the distance of the line above or below the origin) and 'b' determines the slope of the line (i.e. the change in Y per unit change in X).

X on Y: the regression equation is expressed as X = a + bY, where X is the dependent variable, Y is the independent variable, and 'a' and 'b' are the constants (parameters) of the line. 'a' determines the level of the fitted line and 'b' determines the slope of the line (i.e. the change in X per unit change in Y).
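For example (hypothetical figures, purely for illustration): if the fitted line of Y on X is Y = 4 + 0.8X, then for X = 10 the estimated value is Y = 4 + 0.8(10) = 12; if the fitted line of X on Y is X = 1 + 0.5Y, then for Y = 12 the estimated value is X = 1 + 0.5(12) = 7. The two regression lines are, in general, different lines.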

Method of Least Squares:

The constants 'a' and 'b' can be calculated by the method of least squares. The line should be drawn through the plotted points in such a manner that the sum of the squares of the vertical deviations of the actual Y values from the estimated Y values is least, i.e. ∑(Y − Ye)² should be a minimum. Such a line is known as the line of best fit. Using algebra and calculus, the normal equations are:

For Y on X:
∑Y = Na + b∑X
∑XY = a∑X + b∑X²

For X on Y:
∑X = Na + b∑Y
∑XY = a∑Y + b∑Y²
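A minimal sketch of this calculation in Python (the data and the function name below are hypothetical, added only for illustration; the slides give only the equations):

```python
# A minimal sketch, with hypothetical data, of fitting the line of best
# fit of Y on X by solving the two normal equations above.

def least_squares_y_on_x(x, y):
    """Solve  SUM(Y) = N*a + b*SUM(X)  and  SUM(XY) = a*SUM(X) + b*SUM(X^2)."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi ** 2 for xi in x)
    b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)  # slope
    a = (sum_y - b * sum_x) / n                                   # level (intercept)
    return a, b

x = [1, 2, 3, 4, 5]            # independent variable (hypothetical)
y = [2.1, 3.9, 6.2, 8.1, 9.8]  # dependent variable (hypothetical)

a, b = least_squares_y_on_x(x, y)
print(f"Y = {a:.2f} + {b:.2f}X")          # fitted regression equation of Y on X
print("estimated Y at X = 6:", a + b * 6)
```

Calling the same function with the two arguments swapped gives the regression line of X on Y.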

Difference between Regression and Correlation:

Correlation: The correlation coefficient (r) between X and Y is a measure of the direction and degree of the linear relationship between X and Y. It does not imply a cause-and-effect relationship between the variables; it indicates the degree of association.

Regression: The regression coefficients bxy and byx are mathematical measures expressing the average relationship between the two variables. Regression indicates the cause-and-effect relationship between the variables. It is used to forecast the value of the dependent variable when the value of the independent variable is known.
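A short sketch of this distinction in Python (hypothetical data, standard textbook formulas; not taken from the slides): r only measures the degree of association, while byx is used to forecast Y from X.

```python
import math

# Hypothetical data, purely for illustration.
x = [1, 2, 3, 4, 5]
y = [2.0, 4.1, 5.9, 8.2, 9.9]

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n
sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
sxx = sum((xi - mean_x) ** 2 for xi in x)
syy = sum((yi - mean_y) ** 2 for yi in y)

r = sxy / math.sqrt(sxx * syy)  # correlation: direction & degree of association only
byx = sxy / sxx                 # regression coefficient of Y on X: used for forecasting
bxy = sxy / syy                 # regression coefficient of X on Y

print("r =", round(r, 3))
print("forecast of Y at X = 6:", round(mean_y + byx * (6 - mean_x), 2))
```

Note that byx × bxy = r², which links the two regression coefficients to the correlation coefficient.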

Multiple Regression:

When we use more than one independent variable to estimate the dependent variable, in order to increase the accuracy of the estimate, the process is called multiple regression analysis. It is based on the same assumptions and procedures as simple regression. The principal advantage of multiple regression is that it allows us to use more of the available information to estimate the dependent variable.

Estimating equation describing the relationship among three variables:

Y = a + b1X1 + b2X2

where:
Y = estimated value corresponding to the dependent variable
a = Y intercept
b1 and b2 = slopes associated with X1 and X2, respectively
X1 and X2 = values of the two independent variables

Normal Equations: To determine the values of the constants a, b1 and b2, we use three equations (which statisticians call the "normal equations"):

∑Y = Na + b1∑X1 + b2∑X2
∑X1Y = a∑X1 + b1∑X1² + b2∑X1X2
∑X2Y = a∑X2 + b1∑X1X2 + b2∑X2²
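A minimal sketch of solving these three normal equations in Python with NumPy (the data below are hypothetical and added only for illustration; the slides give only the equations):

```python
import numpy as np

# Hypothetical data, purely for illustration.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0])
y  = np.array([5.1, 6.9, 11.2, 12.8, 18.1])

n = len(y)
# Coefficient matrix and right-hand side of the three normal equations.
A = np.array([
    [n,         x1.sum(),        x2.sum()],
    [x1.sum(),  (x1 ** 2).sum(), (x1 * x2).sum()],
    [x2.sum(),  (x1 * x2).sum(), (x2 ** 2).sum()],
])
rhs = np.array([y.sum(), (x1 * y).sum(), (x2 * y).sum()])

a, b1, b2 = np.linalg.solve(A, rhs)
print(f"Y = {a:.2f} + {b1:.2f}*X1 + {b2:.2f}*X2")
```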

THANK YOU