Maximum Likelihood

Presentation transcript:
Maximum Likelihood
We have studied the OLS estimator. It only applies under certain assumptions; in particular, the errors must satisfy ε ~ N(0, σ²). But what if the sampling distribution is not Normal? We can use an alternative estimator: MLE. See "Generalized Linear Models" in S-Plus.

OLS vs. MLE
If the assumptions of OLS hold, OLS and MLE give exactly the same estimates, so using MLE instead of OLS is safe. MLE is implemented as "Generalized Linear Models" in S-Plus. It is more general than "Linear Regression": it allows you to specify the distribution of the error.
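
A minimal S-Plus sketch of this equivalence, assuming a hypothetical data frame dat with response y and predictor x (these names are illustrative, not from the slides): fitting the same model with lm() and with glm() under a Gaussian family returns identical coefficient estimates.

    # Hypothetical data frame 'dat' with response y and predictor x
    ols <- lm(y ~ x, data = dat)
    mle <- glm(y ~ x, family = gaussian, data = dat)
    coef(ols)   # OLS coefficient estimates
    coef(mle)   # the same estimates, obtained as MLEs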

Example: Ozone Attainment
A county is "out of attainment" if ozone exceeds the standard on a given day. We model the distribution of the number of days out of attainment in a given county over 20 years, using a Poisson distribution, and estimate its parameter by maximum likelihood.
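
For reference (this formula is standard, not from the slides), the Poisson model assumed here gives the probability of observing y out-of-attainment days as

    P(Y = y) = \frac{e^{-\lambda}\,\lambda^{y}}{y!}, \qquad y = 0, 1, 2, \ldots

where λ is the expected number of such days.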

MLE
Principle: choose the parameter(s) that make observing the given data most probable (or "likely"). How do we measure "likelihood"? If we know the sampling distribution, we know how "probable" or "likely" any given data are, so we can measure likelihood. To do this, we must know the distribution.
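
As a sketch of the principle applied to the ozone example (treating the counts y_1, …, y_n as independent Poisson(λ) observations, which is an assumption about how the data are arranged), the likelihood of the whole sample is the product of the individual Poisson probabilities:

    L(\lambda) = \prod_{i=1}^{n} \frac{e^{-\lambda}\,\lambda^{y_i}}{y_i!}

The MLE is the value of λ that maximizes this function for the observed counts.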

Graph of Likelihood

Log-Likelihood
Because the logarithm is an increasing function, maximizing the log-likelihood is equivalent to maximizing the likelihood; it is usually also more convenient, since products of probabilities become sums.
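
Continuing the Poisson sketch above, taking logs turns the product into a sum, and setting the derivative to zero gives a closed-form maximizer, the sample mean:

    \ell(\lambda) = \log L(\lambda) = \sum_{i=1}^{n}\bigl(y_i \log\lambda - \lambda - \log y_i!\bigr),
    \qquad
    \frac{d\ell}{d\lambda} = \frac{\sum_i y_i}{\lambda} - n = 0
    \;\Longrightarrow\;
    \hat\lambda = \bar{y}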

Solution
We can model the number of exceedances as a Poisson distribution, which has a single parameter. Estimating it by maximum likelihood gives an estimated parameter (λ) of 2.45.
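
A hedged S-Plus sketch of how such an estimate could be computed: the counts below are invented for illustration, chosen only so that their mean matches the 2.45 reported on the slide; they are not the actual data. An intercept-only Poisson GLM returns the MLE of λ, which for this model is simply the sample mean.

    # Invented yearly exceedance counts for one county (20 years); NOT the real data
    days <- c(1, 3, 2, 4, 0, 2, 3, 1, 2, 5, 3, 2, 1, 4, 2, 3, 2, 1, 4, 4)

    # Intercept-only Poisson regression: glm() uses a log link by default
    fit <- glm(days ~ 1, family = poisson)

    exp(coef(fit))   # MLE of lambda on the original scale: 2.45
    mean(days)       # the same value, computed directly as the sample mean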