Lecturer: Ing. Martina Hanová, PhD.

Presentation transcript:

Econometrics
Lecturer: Ing. Martina Hanová, PhD.

Assumptions of Ordinary Least Squares
Linearity: the true relationship between the mean of the response variable E(Y) and the explanatory variables X1, ..., Xn is a straight line.
If the mean of the response is not a linear function of the predictors: transform the response and/or predictor variables.
If the error variances are unequal: transform the response and/or predictor variables, or use weighted least squares regression.
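As a rough illustration of the weighted least squares remedy mentioned above, the sketch below compares OLS and WLS fits on simulated data whose error variance grows with X. It uses Python's statsmodels; the data, the 1/x^2 weights, and the variable names are illustrative assumptions, not part of the course material.

```python
# Minimal sketch: OLS vs. weighted least squares when the error variance grows with x.
# The simulated data and the 1/x^2 weights are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = np.linspace(1, 10, 100)
# Error standard deviation increases with x -> unequal error variances
y = 2.0 + 0.5 * x + rng.normal(scale=0.3 * x, size=x.size)

X = sm.add_constant(x)                            # adds the intercept column
ols_fit = sm.OLS(y, X).fit()
wls_fit = sm.WLS(y, X, weights=1.0 / x**2).fit()  # weights ~ 1 / error variance

print("OLS coefficients:", ols_fit.params)
print("WLS coefficients:", wls_fit.params)
```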

DATA TRANSFORMATION
To introduce the basic ideas behind data transformations, we first consider a simple linear regression model in which:
we transform the predictor (x) values only,
we transform the response (y) values only, or
we transform both the predictor (x) values and the response (y) values.

Examples of logarithmic transformations
Simple regression equation: Y = a + b*X
Meaning: a unit increase in X is associated with an average increase of b units in Y.
Log-lin equation: Y = a*e^(b*X), i.e. log(Y) = c + b*X
Meaning: a unit increase in X is associated with an average increase of approximately 100*b % in Y.
Log-log equation: Y = a*X^b, i.e. log(Y) = c + b*log(X)
Meaning: a 1% increase in X is associated with a b% increase in Y.
Lin-log equation: Y = a + b*log(X)
Meaning: a 1% increase in X is associated with an average increase of b/100 units in Y.
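The interpretations above can be checked numerically. The sketch below fits each functional form by running OLS on the appropriately transformed variables; the simulated data and all variable names are illustrative assumptions.

```python
# Sketch: fitting the four functional forms via OLS on transformed variables.
# Simulated data; names and parameter values are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = np.linspace(1, 50, 200)
y = 3.0 * x**0.7 * np.exp(rng.normal(scale=0.1, size=x.size))  # roughly a log-log relationship

def slope(y_, x_):
    """OLS slope of y_ on a constant and x_."""
    return sm.OLS(y_, sm.add_constant(x_)).fit().params[1]

b_linear = slope(y, x)                   # Y = a + b*X          : units of Y per unit of X
b_loglin = slope(np.log(y), x)           # log(Y) = c + b*X     : ~100*b % change in Y per unit of X
b_loglog = slope(np.log(y), np.log(x))   # log(Y) = c + b*log(X): elasticity (% per %)
b_linlog = slope(y, np.log(x))           # Y = a + b*log(X)     : b/100 units of Y per 1% change in X

print("log-log slope (elasticity), expected near 0.7:", b_loglog)
```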

The log-log transformation
Cobb–Douglas production function: Y = A * L^α * K^β
Taking the log of both sides of the equation: ln Y = ln A + α*ln L + β*ln K
Y = total production (the monetary value of all goods produced in a year)
L = labor input
K = capital input
A = total factor productivity
α and β = output elasticities of labor and capital

Log-log model
The coefficients in a log-log model represent the elasticities of the Y variable with respect to the Xi variables: each coefficient is the estimated percent change in Y for a one-percent change in the corresponding Xi. For example, if the coefficient on log(Xi) is 0.8, a 1% increase in Xi is associated with roughly a 0.8% increase in Y.
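As an illustration of estimating elasticities from a log-log model, the sketch below simulates an artificial Cobb–Douglas economy and recovers α and β by OLS on the logged variables. The "true" parameter values and all variable names are assumptions made only for this example.

```python
# Sketch: estimating Cobb-Douglas output elasticities with a log-log regression.
# Simulated data; A, alpha, beta and the sample size are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 300
L = rng.uniform(10, 100, n)                  # labor input
K = rng.uniform(10, 100, n)                  # capital input
A, alpha, beta = 1.5, 0.6, 0.3               # "true" values used to simulate output
Y = A * L**alpha * K**beta * np.exp(rng.normal(scale=0.05, size=n))

X = sm.add_constant(np.column_stack([np.log(L), np.log(K)]))
fit = sm.OLS(np.log(Y), X).fit()

ln_A_hat, alpha_hat, beta_hat = fit.params
print("alpha (labor elasticity):  ", alpha_hat)   # ~% change in Y for a 1% change in L
print("beta  (capital elasticity):", beta_hat)    # ~% change in Y for a 1% change in K
```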

Log-transforming Only the Response
Log-lin model: ln Y = β0 + β1*X + ε
Log-linear trend model: Y = e^(β0 + β1*t + ε), equivalently ln Y = β0 + β1*t + ε
β1 gives the instantaneous (continuously compounded) rate of growth of the dependent variable associated with a unit change in the independent variable.
The compounded growth rate, e^(β1) - 1, is considered a more accurate estimate of the impact of X.
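For example (with illustrative numbers only), the sketch below fits a log-linear trend to a simulated series growing at about 5% per period and converts the estimated slope into a compounded growth rate with e^(β1) - 1.

```python
# Sketch: log-linear trend and compounded growth rate e^b1 - 1.
# Simulated series; the 5% underlying growth rate is an illustrative assumption.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
t = np.arange(40)
y = 100 * 1.05**t * np.exp(rng.normal(scale=0.02, size=t.size))  # ~5% growth per period

fit = sm.OLS(np.log(y), sm.add_constant(t)).fit()
b1 = fit.params[1]
print("instantaneous growth rate b1:   ", b1)                # continuously compounded rate
print("compounded growth rate e^b1 - 1:", np.exp(b1) - 1)    # should be close to 0.05
```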

Log-transforming Only the Predictor
Lin-log model: Y = β0 + β1*ln X + ε
Lin-log trend model: Y = β0 + β1*ln(time) + ε
The β1 coefficient represents the estimated change in the dependent variable, in its own units, associated with a percentage change in the independent variable: a 1% increase in X changes Y by approximately β1/100 units.
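A minimal sketch of the lin-log interpretation, using simulated data (all names and values are illustrative assumptions): the fitted slope divided by 100 approximates the change in Y for a 1% increase in X.

```python
# Sketch: lin-log model Y = b0 + b1*ln(X); b1/100 = change in Y per 1% increase in X.
# Simulated data; variable names and parameter values are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
x = np.linspace(1, 100, 200)
y = 5.0 + 8.0 * np.log(x) + rng.normal(scale=0.5, size=x.size)

fit = sm.OLS(y, sm.add_constant(np.log(x))).fit()
b1 = fit.params[1]
print("b1:", b1)                                        # expected near 8
print("change in Y for a 1% increase in X:", b1 / 100)  # ~0.08 units
```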