Managerial Economics in a Global Economy

Managerial Economics in a Global Economy Chapter 4 DEMAND ESTIMATION

Regression Analysis Scatter Diagram

Regression Line: Line of Best Fit. The regression line minimizes the sum of the squared vertical deviations (et) of each point from the line; this is the Ordinary Least Squares (OLS) method.

REGRESSION ANALYSIS Given the following demand function: Y = A + B1 X + B2 P + B3 I + B4 Pr, where X = selling expenses (advertising), P = price, I = income, and Pr = the price of substitutes. What we want are estimates of the values of A, B1, B2, B3, and B4. Regression analysis describes the way in which one variable is related to another: it derives an equation that can be used to estimate the unknown value of one variable on the basis of the known value of the other variable(s).

The simple regression model takes the following form: Yi = A + B Xi + ei. Regression analysis assumes that the mean value of Y, given the value of X, is a linear function of X; in other words, the mean value of the dependent variable is assumed to be a linear function of the independent variable. Yi is the ith observed value of the dependent variable and Xi is the ith observed value of the independent variable. The term ei is an error term: a random amount that is added to A + B Xi (or subtracted, if ei is negative).

Because of the presence of the error term, the observed values of Yi fall around the population regression line (A + B Xi), not on it. Regression analysis assumes that the values of ei are independent and that their mean value equals zero.
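A minimal simulation, not part of the original slides, can illustrate this point; the intercept, slope, and error spread below are hypothetical.

```python
# Hypothetical simulation: observed Y values scatter around the population
# regression line A + B*X because an error term with mean zero is added.
import numpy as np

rng = np.random.default_rng(seed=0)

A, B = 2.0, 1.5                                  # hypothetical population intercept and slope
X = np.arange(1, 11, dtype=float)                # ten observations of the independent variable
e = rng.normal(loc=0.0, scale=0.5, size=X.size)  # errors: independent, mean zero

Y = A + B * X + e                                # observed values fall around the line, not on it

print("population line:", A + B * X)
print("observed Y:     ", np.round(Y, 3))
print("average error:  ", round(e.mean(), 3))    # close to zero
```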

Sample Regression Line (based on a sample). The sample regression line (estimated regression line) describes the average relationship between the dependent variable and the independent variable. Its general expression is Ŷi = a + b Xi, where Ŷi is the value of the dependent variable predicted by the regression line, and a and b are estimators of A and B: a = the intercept of the regression line; b = the slope of the line, which measures the change in the predicted value of Y associated with a one-unit increase in X.

Method of Least Squares. Used to determine the values of a and b. Since the deviation of the ith observed value of Y from the regression line equals Yi − (a + b Xi), the sum of the squared deviations equals Σi [Yi − (a + b Xi)]², where the sum runs over the n observations in the sample. The values of a and b that minimize this expression are found by differentiating it with respect to a and b and setting the two partial derivatives equal to zero.

This yields the two normal equations:
(1) Σ Yi = n a + b Σ Xi
(2) Σ Xi Yi = a Σ Xi + b Σ Xi²
Solving equations (1) and (2) simultaneously, and letting X̄ equal the mean value of X in the sample and Ȳ the mean of Y, we find that:
b = Σ (Xi − X̄)(Yi − Ȳ) / Σ (Xi − X̄)²  and  a = Ȳ − b X̄
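A short sketch of these formulas in Python; the company X data table is not reproduced in the transcript, so the numbers below are hypothetical.

```python
# Least-squares estimates of a and b computed directly from the formulas above.
import numpy as np

X = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)      # hypothetical selling expenses
Y = np.array([4, 6, 8, 8, 12, 9, 14, 16, 16, 18], dtype=float)  # hypothetical sales

x_bar, y_bar = X.mean(), Y.mean()
b = np.sum((X - x_bar) * (Y - y_bar)) / np.sum((X - x_bar) ** 2)  # slope
a = y_bar - b * x_bar                                             # intercept

print(f"a = {a:.3f}, b = {b:.3f}")
```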

Given the following data for company X, the estimated regression line is Ŷ = 2.533 + 1.504 X, where Y = the observed value of sales and Ŷ = the computed (estimated) value of sales based on the regression line. From the table, Y = 4 when X = 1; using the regression line, Ŷ = 2.533 + 1.504(1) = 4.037. (Note the difference between the observed sales (4) and the estimated sales (4.037).)

If X = 0, then Ŷ = 2.533 + 1.504(0) = 2.533 (the intercept: the value at which the regression line intersects the vertical axis). Interpretation: if the firm's selling expenses were zero, sales would be 2.533 million units, and estimated sales rise by 1.504 million units each time selling expenses increase by 1 million.

Ordinary Least Squares Estimation Using Excel. The model: Ŷi = a + b Xi. Objective: determine the slope b and intercept a that minimize the sum of the squared errors Σ (Yi − Ŷi)².

Estimation Procedure
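The slides carry out the estimation in Excel; a rough Python equivalent, assuming the statsmodels package and hypothetical data, is sketched below.

```python
# OLS estimation via a library, mirroring the output of Excel's regression tool.
import numpy as np
import statsmodels.api as sm

X = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)      # hypothetical selling expenses
Y = np.array([4, 6, 8, 8, 12, 9, 14, 16, 16, 18], dtype=float)  # hypothetical sales

model = sm.OLS(Y, sm.add_constant(X)).fit()   # add_constant supplies the intercept term

print(model.params)      # intercept a and slope b
print(model.summary())   # standard errors, t statistics, R-squared, F statistic
```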

Tests of Significance. Standard error of the estimate (se): a measure of the amount of scatter of individual observations about the regression line. It is useful in constructing prediction intervals, that is, intervals within which there is a specified probability that the dependent variable will lie.

Example calculation: if the probability is set at 0.95, a very approximate prediction interval is Ŷ ± 2 se. Since se = 0.3702, if the predicted value of Y is 11 there is a 0.95 probability that the firm's sales will be between 10.26 (11 − 2 × 0.37) and 11.74 (11 + 2 × 0.37).
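As a quick check of the arithmetic, using the slide's numbers (se = 0.3702, predicted Y = 11):

```python
# Approximate 95% prediction interval: predicted Y plus or minus 2 * se.
se = 0.3702      # standard error of the estimate (from the slide)
y_hat = 11.0     # predicted value of Y (from the slide)

lower, upper = y_hat - 2 * se, y_hat + 2 * se
print(f"approximate 95% prediction interval: {lower:.2f} to {upper:.2f}")  # about 10.26 to 11.74
```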

The t-statistic (significance of individual variables). Managers need to know whether a particular independent variable influences the dependent variable. The least-squares estimates of the B's may by chance be positive even if their true values are zero; e.g., an estimate b1 = 1.76 appears to show that selling expenses have an effect on sales (p = 0.0001). To test whether the true value of B1 is zero we must look at the t-statistic of b1, which follows the t-distribution. Other things equal, the bigger the t-statistic (in absolute terms), the smaller the probability that the true value of the regression coefficient in question is zero. In our case there is only a 1 in 10,000 chance that chance alone would have produced so large a t-statistic.

Calculation of the t statistic: t = b / (standard error of b). Degrees of freedom = (n − k) = (10 − 2) = 8; the critical value at the 5% level is 2.306. Since the computed t exceeds this critical value, b is statistically significant.
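A sketch of this calculation; the slope estimate and its standard error below are hypothetical, while n = 10, k = 2, and the 5% critical value come from the slide.

```python
# t test for a slope coefficient: t = b / s_b, compared with the two-tailed
# 5% critical value for n - k degrees of freedom.
from scipy import stats

b = 1.504                         # hypothetical estimated slope
s_b = 0.158                       # hypothetical standard error of the slope

t_stat = b / s_b
df = 10 - 2                       # n - k = 8
t_crit = stats.t.ppf(0.975, df)   # approximately 2.306

print(f"t = {t_stat:.2f}, critical value = {t_crit:.3f}")
print("significant at the 5% level" if abs(t_stat) > t_crit else "not significant")
```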

Decomposition of Sum of Squares. Total variation = explained variation + unexplained variation: Σ (Yi − Ȳ)² = Σ (Ŷi − Ȳ)² + Σ (Yi − Ŷi)². Coefficient of determination: R² = explained variation / total variation. Coefficient of correlation: r = √R², with the sign of the estimated slope.

Decomposition of Sum of Squares
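The decomposition can be sketched numerically as follows (hypothetical data):

```python
# Total variation = explained variation + unexplained variation; R-squared and r.
import numpy as np

X = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
Y = np.array([4, 6, 8, 8, 12, 9, 14, 16, 16, 18], dtype=float)

b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a = Y.mean() - b * X.mean()
Y_hat = a + b * X

total = np.sum((Y - Y.mean()) ** 2)          # total variation
explained = np.sum((Y_hat - Y.mean()) ** 2)  # explained variation
unexplained = np.sum((Y - Y_hat) ** 2)       # unexplained variation

r_squared = explained / total                # coefficient of determination
r = np.sign(b) * np.sqrt(r_squared)          # coefficient of correlation

print(f"{total:.2f} = {explained:.2f} + {unexplained:.2f}")
print(f"R-squared = {r_squared:.3f}, r = {r:.3f}")
```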

Multiple Regression Analysis. Model: Yi = A + B1 X1i + B2 X2i + … + ei. Adjusted coefficient of determination: R̄² = 1 − (1 − R²)(n − 1)/(n − k), where n = number of observations and k = number of estimated coefficients. Analysis of variance and F statistic: F = [explained variation/(k − 1)] / [unexplained variation/(n − k)].
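A rough multiple-regression sketch, again assuming statsmodels and using hypothetical variables and data, showing where the adjusted R² and F statistic are reported:

```python
# Multiple regression with two independent variables; adjusted R-squared and F statistic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(seed=1)
n = 30
advertising = rng.uniform(1, 10, n)                                # hypothetical X1
price = rng.uniform(5, 15, n)                                      # hypothetical X2
sales = 20 + 1.5 * advertising - 0.8 * price + rng.normal(0, 2, n)

X = sm.add_constant(np.column_stack([advertising, price]))
model = sm.OLS(sales, X).fit()

print(model.rsquared, model.rsquared_adj)   # R-squared and adjusted R-squared
print(model.fvalue, model.f_pvalue)         # F statistic for overall significance
```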

Problems in Regression Analysis. Multicollinearity: a situation in which two or more independent variables are very highly correlated. Under perfect linear correlation it is impossible to estimate the regression coefficients. E.g. (perfect linear correlation): Y = A + B1 X1i + B2 X2i, where X1i = 3 X2i − 1, or X1i = 6 + X2i, or X1i = 2 + 4 X2i. Imperfect linear correlation: Y = A + B1 X1i + B2 X2i, where X1 = price and X2 = nominal income (P·Q).

If two independent variables move together in a rigid fashion, there is no way to tell how much effect each has separately; all we can observe is the effect of both combined. Consequences of multicollinearity: - high R² with no significant t-scores - high simple correlation coefficients (cross-correlation matrix). How to deal with multicollinearity: - drop one or more of the multicollinear variables - transform the multicollinear variables (e.g., take first differences) - increase the sample size.
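A sketch of these diagnostics with hypothetical data: the simple (cross) correlation between the independent variables, plus variance inflation factors as an additional common check not named on the slide.

```python
# Detecting multicollinearity: cross-correlation of the regressors and VIFs.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(seed=2)
n = 30
x1 = rng.uniform(1, 10, n)
x2 = 2 + 4 * x1 + rng.normal(0, 0.1, n)   # x2 moves almost rigidly with x1

print(np.corrcoef(x1, x2)[0, 1])          # simple correlation close to 1

X = sm.add_constant(np.column_stack([x1, x2]))
for i in (1, 2):                          # skip the constant column
    print(f"VIF for X{i}: {variance_inflation_factor(X, i):.1f}")  # very large values signal multicollinearity
```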

Serial Correlation (or Autocorrelation). The error terms are not independent: if this year's error term is positive, next year's tends also to be positive (positive serial correlation), and if this year's error term is negative, next year's tends to be negative. This is a violation of the assumptions underlying regression analysis [which require E(ei ej) = 0 for i ≠ j; if not, the simple correlation between two observations of the error term is not equal to zero]. Consequences of serial correlation: - increases the variances of the distributions of the coefficient estimates - leads to underestimated standard errors of the coefficients.

Detecting Serial Correlation: the Durbin-Watson test. Compare the computed DW statistic (d) with the tabulated bounds dL and dU to decide whether d is so low or so high that the hypothesis of no serial correlation should be rejected. For positive serial correlation: if d < dL, reject the hypothesis of no serial correlation; if d > dU, accept the hypothesis of no serial correlation; if dL ≤ d ≤ dU, the test is inconclusive. If the hypothesis is that there is negative serial correlation: reject the hypothesis of no serial correlation if d > 4 − dL; accept it if d < 4 − dU; if 4 − dU ≤ d ≤ 4 − dL, the test is inconclusive.
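A sketch of computing d, assuming statsmodels and hypothetical data whose errors are positively autocorrelated; the computed d would then be compared with the tabulated dL and dU bounds as described above.

```python
# Durbin-Watson statistic for the residuals of an OLS regression.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(seed=3)
n = 40
x = np.arange(n, dtype=float)
e = np.zeros(n)
for t in range(1, n):                     # errors follow e_t = 0.8 * e_{t-1} + noise
    e[t] = 0.8 * e[t - 1] + rng.normal(0, 1)
y = 5 + 0.5 * x + e

model = sm.OLS(y, sm.add_constant(x)).fit()
print(f"Durbin-Watson d = {durbin_watson(model.resid):.2f}")  # well below 2 suggests positive serial correlation
```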

How to deal with serial correlation - take the difference of the variables - use generalized least squares

Steps in Demand Estimation. 1. Model specification: identify variables - identify the independent variables (in reality an empirical issue). 2. Specify the functional form - specify the mathematical form of the equation relating the mean value of the dependent variable to those of the independent variables, e.g., Y = f(X, P). This can take the following forms: Yi = A + B1 Xi + B2 Pi + ei, with B1 > 0 and B2 < 0, or: log Yi = log A + B1 log Xi + B2 log Pi + log ei.
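A sketch comparing the two functional forms, with hypothetical variable names and data; in the log-log form the coefficients are elasticities.

```python
# Linear versus log-log (constant-elasticity) specification of a demand function.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(seed=4)
n = 50
advertising = rng.uniform(1, 10, n)                                # X
price = rng.uniform(5, 15, n)                                      # P
sales = 40 * advertising**0.3 * price**-0.8 * np.exp(rng.normal(0, 0.1, n))

# Linear form: Y = A + B1*X + B2*P + e
linear = sm.OLS(sales, sm.add_constant(np.column_stack([advertising, price]))).fit()

# Log-log form: log Y = log A + B1*log X + B2*log P + log e
loglog = sm.OLS(np.log(sales),
                sm.add_constant(np.column_stack([np.log(advertising), np.log(price)]))).fit()

print(linear.params)   # slopes are marginal effects (units of Y per unit change)
print(loglog.params)   # slopes are elasticities (close to 0.3 and -0.8 here)
```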

3. Collect your data. Data can be: - time series - cross section - cross section/time series (panel) 4. Estimate The Function 5. Test the Results