
1 Linear Regression Basics II. Fin250f: Lecture 7.1, Spring 2010. Brooks, chapters 3.1-3.3, 3.7, 3.8

2 Outline
- Matrix introduction
- Multivariate linear model
  - Standard errors
  - Matlab OLS function
- What's a big sample?
- Data mining
- Goodness of fit

3 Matrices (Appendix A5)

4 Matrices

5 Even More Matrices: Transpose
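The equations on the matrix slides (3-5) were not captured in the transcript. As a reference, the standard notation and transpose rules the later slides rely on (the same material covered in Brooks, Appendix A5) are:

    \[
    (A')_{ij} = a_{ji}, \qquad (A + B)' = A' + B', \qquad (AB)' = B'A',
    \]
    \[
    \text{and, for an invertible square matrix } A:\quad (A')^{-1} = (A^{-1})'.
    \]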

6 Multivariate Linear Model
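The slide's formulas are missing from the transcript; the standard multivariate linear model in matrix form (the notation Brooks chapter 3 uses, up to symbol choice) is:

    \[
    y = X\beta + u, \qquad
    y:\; T\times 1, \quad X:\; T\times k, \quad \beta:\; k\times 1, \quad u:\; T\times 1,
    \]

where each row of X holds the regressors for one observation, with a column of ones as the first column when an intercept is included.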

7 Least Square Solution
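Again the formulas themselves did not survive transcription; the usual OLS solution and coefficient standard errors, consistent with the model above, are:

    \[
    \hat{\beta} = (X'X)^{-1}X'y, \qquad
    \hat{u} = y - X\hat{\beta}, \qquad
    s^2 = \frac{\hat{u}'\hat{u}}{T-k}, \qquad
    \widehat{\mathrm{var}}(\hat{\beta}) = s^2 (X'X)^{-1},
    \]

with standard errors given by the square roots of the diagonal elements of \(\widehat{\mathrm{var}}(\hat{\beta})\).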

8 Matlab Code
- OLS function (ols.m); a minimal sketch follows this list
- Setting up CRSP data (ccrspmat.m)
- Example: estimating CAPM beta (rollingcapm.m)
- Example: simple return forecast (simpleretfcast.m)
- Example: Monday returns? (monday.m)
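The course files listed above are not reproduced in the transcript. The following is a minimal sketch of what an OLS routine along the lines of ols.m might look like; the function name, argument order, and outputs are assumptions, not the actual course code.

    % Minimal OLS sketch (assumed name and interface; not the course's ols.m).
    % y: T-by-1 dependent variable, X: T-by-k regressor matrix
    % (include a column of ones in X for an intercept).
    function [beta, se, tstat, r2] = ols_sketch(y, X)
        [T, k] = size(X);
        beta  = X \ y;                        % least-squares coefficient estimates
        uhat  = y - X*beta;                   % residuals
        s2    = (uhat' * uhat) / (T - k);     % residual variance estimate
        covb  = s2 * inv(X' * X);             % coefficient covariance matrix
        se    = sqrt(diag(covb));             % standard errors
        tstat = beta ./ se;                   % t-ratios (H0: coefficient = 0)
        r2    = 1 - (uhat' * uhat) / sum((y - mean(y)).^2);   % R-squared
    end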

9 What's a Large Sample?
- Asymptotic results hold as T goes to infinity
- When are these results valid? It depends on:
  - Complexity/stationarity of the data
  - Complexity of the model

10 Data Snooping (see section 3.7)
- With finite data you can always find something positive:
  - Significant beta
  - Accurate forecasts
  - Profitable trades
- "In-sample bias"
- mclinearsnoop.m (a sketch of the idea follows this list)
- Snooping versus mining
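mclinearsnoop.m itself is not reproduced here; the sketch below is a hypothetical Monte Carlo illustration of the same point: search enough unrelated predictors in a finite sample and one of them will look significant in-sample.

    % Hypothetical sketch of in-sample data snooping (not the course's
    % mclinearsnoop.m): try many pure-noise predictors and keep the
    % largest in-sample t-statistic.
    T = 250;                 % sample size
    K = 50;                  % number of candidate predictors, all unrelated to y
    y = randn(T, 1);         % "returns" with no true predictability
    bestT = 0;
    for j = 1:K
        x  = [ones(T,1), randn(T,1)];        % constant plus one noise predictor
        b  = x \ y;
        u  = y - x*b;
        s2 = (u' * u) / (T - 2);
        se = sqrt(diag(s2 * inv(x' * x)));
        bestT = max(bestT, abs(b(2) / se(2)));
    end
    bestT   % with 50 tries, |t| > 2 appears far more often than 5% of the time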

11 Goodness of Fit and Other Objectives
- How good is a "fit" or a forecast?
- Basic objective function (see below)
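The slide's formula is not in the transcript; the basic least-squares objective referred to here is presumably the residual sum of squares:

    \[
    \min_{\beta}\; \sum_{t=1}^{T} \left( y_t - x_t'\beta \right)^2
    \;=\; \min_{\beta}\; (y - X\beta)'(y - X\beta).
    \]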

12 Various Measures
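The transcript does not record which measures this slide listed; common fit and forecast-accuracy measures in this setting include the residual sum of squares, mean squared error, and mean absolute error:

    \[
    \text{RSS} = \sum_{t=1}^{T}\hat{u}_t^2, \qquad
    \text{MSE} = \frac{1}{T}\sum_{t=1}^{T}\hat{u}_t^2, \qquad
    \text{MAE} = \frac{1}{T}\sum_{t=1}^{T}\lvert \hat{u}_t \rvert .
    \]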

13 Traditional: R-squared (R^2)
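The defining formula (the standard one, as in Brooks) is:

    \[
    R^2 = \frac{\text{ESS}}{\text{TSS}} = 1 - \frac{\text{RSS}}{\text{TSS}},
    \qquad \text{TSS} = \sum_{t=1}^{T}(y_t - \bar{y})^2,
    \quad \text{RSS} = \sum_{t=1}^{T}\hat{u}_t^2 .
    \]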

14 R^2
- Good:
  - Easy and intuitive
- Bad:
  - Not well defined statistically
  - Always improves with more parameters (see adjusted R^2, below)
  - Can often be high when there are time trends
  - Not a well-defined objective
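The adjusted R^2 mentioned in the list (standard definition, as in Brooks) penalizes extra parameters:

    \[
    \bar{R}^2 = 1 - \frac{T-1}{T-k}\left(1 - R^2\right),
    \]

so adding a regressor raises \(\bar{R}^2\) only if it improves the fit by more than the degree of freedom it uses up.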

