Chapter 5 The Research Process – Hypothesis Development – (Stage 4 in Research Process) © 2009 John Wiley & Sons Ltd. www.wileyeurope.com/college/sekaran.

Presentation transcript:


Recall the research process: 1) Broad problem area, 2) Problem statement, 3) Theoretical framework, 4) Generation of hypotheses, 5) Data collection, 6) Data analysis, 7) Report writing (interpretation of results).

Stage 4: Hypothesis. A proposition that is empirically testable; an empirical statement concerned with the relationship among variables. A good hypothesis: – must be adequate for its purpose, – must be testable, – must be better than its rivals.

The Simple Regression Model: Regression analysis is a statistical tool for investigating relationships between variables. Usually, the investigator seeks to determine the causal effect of one variable upon another. To explore such issues, the investigator assembles data on the underlying variables of interest and employs regression to estimate the quantitative effect of the causal variables upon the variable that they influence.

The Simple Regression Model: The investigator also typically assesses the "statistical significance" of the estimated relationships, that is, the degree of confidence that the true relationship is close to the estimated relationship. Regression techniques have long been central to the field of economic statistics ("econometrics").

The Simple Regression Model: Definition of the simple linear regression model: $y = \beta_0 + \beta_1 x + u$, where y is the dependent variable (explained variable, response variable), x is the independent variable (explanatory variable, regressor), u is the error term (disturbance, unobservables), $\beta_0$ is the intercept, and $\beta_1$ is the slope parameter. The model "explains variable y in terms of variable x".

The Simple Regression Model: Example – soybean yield and fertilizer: $yield = \beta_0 + \beta_1\, fertilizer + u$, where $\beta_1$ measures the effect of fertilizer on yield, holding all other factors (rainfall, land quality, presence of parasites, ...) fixed. Example – a simple wage equation: $wage = \beta_0 + \beta_1\, education + u$, where $\beta_1$ measures the change in the hourly wage given another year of education, holding all other factors (labor force experience, tenure with current employer, work ethic, intelligence, ...) fixed.

The Simple Regression Model: In order to estimate the regression model one needs data: a random sample of n observations $\{(x_i, y_i): i = 1, \ldots, n\}$, where $x_i$ is the value of the explanatory variable and $y_i$ the value of the dependent variable for the i-th observation.
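To make the notation concrete, here is a minimal Python sketch of such a random sample, drawn from a simulated version of the model; the sample size, the parameter values, and the distributions of x and u are all assumptions made purely for illustration.

```python
import numpy as np

# Simulate a random sample {(x_i, y_i): i = 1, ..., n} from the simple regression model.
# All numbers below are illustrative assumptions, not values from the slides.
rng = np.random.default_rng(0)
n = 100                              # sample size
beta0, beta1 = 1.0, 0.5              # assumed "true" intercept and slope

x = rng.uniform(0, 10, size=n)       # explanatory variable x_i
u = rng.normal(0, 1, size=n)         # error term u_i (unobserved in practice)
y = beta0 + beta1 * x + u            # dependent variable y_i

sample = list(zip(x, y))             # the n observations (x_i, y_i)
print(sample[:3])                    # first three observations
```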

The Simple Regression Model: Fit a regression line through the data points as well as possible. The fitted regression line is $\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x$; the i-th data point is $(x_i, y_i)$.

The Simple Regression Model: What does "as well as possible" mean? Define the regression residuals $\hat{u}_i = y_i - \hat{y}_i = y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i$ and minimize the sum of squared residuals, $\min \sum_{i=1}^{n} \hat{u}_i^2$. The resulting Ordinary Least Squares (OLS) estimates are $\hat{\beta}_1 = \sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y}) \big/ \sum_{i=1}^{n}(x_i - \bar{x})^2$ and $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$.
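As an illustrative sketch, the OLS formulas above can be applied directly to a simulated sample; the data-generating values below are assumptions, and the final comment notes a cross-check against numpy's own least-squares fit.

```python
import numpy as np

# Simulated data (assumed values, for illustration only)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 1.0 + 0.5 * x + rng.normal(0, 1, size=100)

# OLS estimates from the formulas: slope = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2)
beta1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0_hat = y.mean() - beta1_hat * x.mean()

# Residuals and the sum of squared residuals that OLS minimizes
y_hat = beta0_hat + beta1_hat * x
ssr = np.sum((y - y_hat) ** 2)

print(beta0_hat, beta1_hat, ssr)
# np.polyfit(x, y, 1) returns the same estimates (in the order slope, intercept).
```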

The Simple Regression Model: CEO salary and return on equity. In the fitted regression, salary is measured in thousands of dollars and the explanatory variable is the return on equity of the CEO's firm (in percent). The estimated slope is 18.501: if the return on equity increases by one percentage point, salary is predicted to increase by $18,501. Causal interpretation?

The Simple Regression Model: Wage and education. In the fitted regression, the hourly wage is measured in dollars and education in years. In the sample, one more year of education was associated with an increase in the hourly wage of $0.54.

The Simple Regression Model: CEO salary and return on equity (cont.). The R-squared of a regression measures the fraction of the total sample variation in the dependent variable that is explained by the regression; here, the regression explains only 1.3% of the total variation in salaries.
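A small sketch of how R-squared is computed from a fitted simple regression; the data are simulated with a deliberately weak relationship (all values assumed), so the resulting R-squared is small, as in the CEO-salary example.

```python
import numpy as np

# Simulated data with a weak x-y relationship (assumed values, for illustration only)
rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 5.0 + 0.1 * x + rng.normal(size=200)

# Fit by OLS
beta1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0_hat = y.mean() - beta1_hat * x.mean()
y_hat = beta0_hat + beta1_hat * x

ssr = np.sum((y - y_hat) ** 2)      # sum of squared residuals
sst = np.sum((y - y.mean()) ** 2)   # total sum of squares
r_squared = 1 - ssr / sst           # share of the variation in y explained by x
print(round(r_squared, 3))
```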

Multiple Regression Analysis: Estimation. Definition of the multiple linear regression model: $y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_k x_k + u$, where y is the dependent variable (explained variable, response variable), $x_1, \ldots, x_k$ are the independent variables (explanatory variables, regressors), u is the error term (disturbance, unobservables), $\beta_0$ is the intercept, and $\beta_1, \ldots, \beta_k$ are the slope parameters. The model "explains variable y in terms of variables $x_1, \ldots, x_k$".

Multiple Regression Analysis: Estimation. Motivation for multiple regression: – incorporate more explanatory factors into the model, – explicitly hold fixed other factors that would otherwise end up in the error term, – allow for more flexible functional forms. Example – wage equation: $wage = \beta_0 + \beta_1\, education + \beta_2\, experience + u$, where wage is the hourly wage, education is years of education, experience is labor market experience, and u captures all other factors. Now $\beta_1$ measures the effect of education explicitly holding experience fixed.
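A minimal sketch of estimating such a multiple regression by OLS in Python; the variable names and data-generating values are assumptions, and numpy's least-squares solver stands in for a dedicated econometrics package.

```python
import numpy as np

# Simulated stand-in for a wage data set (all names and numbers are assumptions)
rng = np.random.default_rng(2)
n = 500
educ = rng.integers(8, 20, size=n).astype(float)    # years of education
exper = rng.integers(0, 40, size=n).astype(float)   # years of labor market experience
u = rng.normal(0, 2, size=n)                        # all other factors
wage = 1.0 + 0.6 * educ + 0.1 * exper + u           # assumed "true" model

# OLS for wage = b0 + b1*educ + b2*exper + u via the design matrix [1, educ, exper]
X = np.column_stack([np.ones(n), educ, exper])
coeffs, *_ = np.linalg.lstsq(X, wage, rcond=None)
b0_hat, b1_hat, b2_hat = coeffs
print(b0_hat, b1_hat, b2_hat)   # b1_hat: effect of education holding experience fixed
```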

Multiple Regression Analysis: Inference. Example – wage equation: test whether, after controlling for education and tenure, higher work experience leads to higher hourly wages. The standard errors of the estimated coefficients are reported in parentheses below the estimates. Test $H_0: \beta_{experience} = 0$ against $H_1: \beta_{experience} > 0$; a one-sided alternative is used because one would either expect a positive effect of experience on the hourly wage or no effect at all.

Multiple Regression Analysis: Inference. Example – wage equation (cont.). The t-statistic is the estimated coefficient divided by its standard error. Given the degrees of freedom available here, the standard normal approximation applies, so the critical values for the 5% and 1% significance levels (the conventional significance levels) of this one-sided test are 1.645 and 2.326. The null hypothesis is rejected because the t-statistic exceeds the critical value: "the effect of experience on hourly wage is statistically greater than zero at the 5% (and even at the 1%) significance level."
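A sketch of the mechanics of such a one-sided t-test; the estimate, standard error, and degrees of freedom below are placeholder values, not the numbers from the slide's regression output.

```python
from scipy import stats

# Placeholder regression output (assumed values, for illustration only)
beta_hat = 0.0050    # estimated coefficient on experience
se = 0.0020          # its standard error
df = 500             # degrees of freedom (n - k - 1); large, so close to standard normal

t_stat = beta_hat / se                  # t-statistic for H0: beta = 0
crit_5 = stats.t.ppf(0.95, df)          # one-sided 5% critical value (about 1.65)
crit_1 = stats.t.ppf(0.99, df)          # one-sided 1% critical value (about 2.33)

print(t_stat, crit_5, crit_1)
print("reject at 5%:", t_stat > crit_5, "reject at 1%:", t_stat > crit_1)
```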

Multiple Regression Analysis: Inference. Testing against two-sided alternatives: test $H_0: \beta_j = 0$ against $H_1: \beta_j \neq 0$. Reject the null hypothesis in favour of the alternative hypothesis if the absolute value of the estimated coefficient, relative to its standard error, is too large. Construct the critical value so that, if the null hypothesis is true, it is rejected in, for example, 5% of the cases: these are the points of the t-distribution such that 5% of the probability lies in the two tails (2.5% in each). In the given example, reject $H_0$ if the t-statistic is less than -2.06 or greater than 2.06.
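The two-sided critical value can be read off the t-distribution; the sketch below assumes 25 degrees of freedom, which reproduces the critical value of about 2.06 mentioned above, and uses a made-up t-statistic.

```python
from scipy import stats

df = 25                                 # assumed degrees of freedom; t at 0.975 is about 2.06
alpha = 0.05
crit = stats.t.ppf(1 - alpha / 2, df)   # two-sided critical value: 2.5% in each tail

t_stat = 1.80                           # hypothetical t-statistic
reject = abs(t_stat) > crit             # reject H0 if t < -crit or t > crit
print(round(crit, 2), reject)
```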

Multiple Regression Analysis: Inference. Guidelines for discussing economic and statistical significance: – If a variable is statistically significant, discuss the magnitude of the coefficient to get an idea of its economic or practical importance. – The fact that a coefficient is statistically significant does not necessarily mean it is economically or practically significant! – If a variable is statistically and economically important but has the "wrong" sign, the regression model might be misspecified. – If a variable is statistically insignificant at the usual levels (10%, 5%, 1%), one may think of dropping it from the regression. – If the sample size is small, effects might be imprecisely estimated, so the case for dropping insignificant variables is less strong.

Multiple Regression Analysis: Inference. Computing p-values for t-tests: – If the significance level is made smaller and smaller, there will be a point where the null hypothesis can no longer be rejected. – The reason is that, by lowering the significance level, one increasingly wants to avoid the error of rejecting a correct H0. – The smallest significance level at which the null hypothesis is still rejected is called the p-value of the hypothesis test. – A small p-value is evidence against the null hypothesis, because the null would be rejected even at small significance levels. – A large p-value is evidence in favour of the null hypothesis. – p-values are more informative than tests at fixed significance levels.
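A short sketch of computing the p-value for a t-test; the t-statistic and degrees of freedom are assumed values used only for illustration.

```python
from scipy import stats

# Assumed test output, purely for illustration
t_stat = 2.5
df = 500

p_one_sided = stats.t.sf(t_stat, df)             # p-value against H1: beta > 0
p_two_sided = 2 * stats.t.sf(abs(t_stat), df)    # p-value against H1: beta != 0

# The p-value is the smallest significance level at which H0 would still be rejected.
print(round(p_one_sided, 4), round(p_two_sided, 4))
```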