Efficiency Measurement William Greene Stern School of Business New York University.


Session 6 Model Extensions

- Simulation-based estimators
  - Normal-gamma frontier model
  - Bayesian estimation of stochastic frontiers
- A discrete outcomes frontier
- Similar model structures
- Similar estimation methodologies
- Similar results

Functional Forms Normal-half normal and normal-exponential: Restrictive functional forms for the inefficiency distribution

Normal-Truncated Normal More flexible, but with an inconvenient and sometimes ill-behaved log-likelihood function. [Figure: truncated-normal inefficiency densities for μ = −0.5, 0, +0.5]

Normal-Gamma A very flexible model, with a very difficult log-likelihood function. Bayesians favor it: it supplies conjugate functional forms for the other model parts.
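The contrast among the inefficiency distributions can be seen by simulation. The sketch below (illustrative parameter values, not from the slides) shows that the half-normal and exponential are one-parameter families with their mode pinned at zero, while the gamma's extra shape parameter P can move the mode into the interior:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
sigma_u = 0.5  # common scale for all three families (illustrative choice)

# One-parameter families: density is maximized at u = 0.
u_halfnormal = np.abs(rng.normal(0.0, sigma_u, n))
u_exponential = rng.exponential(sigma_u, n)

# Gamma adds a shape parameter P; with P > 1 the mode moves off zero,
# a shape the half-normal and exponential cannot produce.
P = 2.0
u_gamma = rng.gamma(P, sigma_u, n)

print(u_halfnormal.mean())   # ≈ sigma_u * sqrt(2/pi) ≈ 0.399
print(u_exponential.mean())  # ≈ sigma_u = 0.5
print(u_gamma.mean())        # ≈ P * sigma_u = 1.0
```

With P = 2, far fewer gamma draws fall near zero than exponential draws, which is exactly the added flexibility the slide refers to.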

Normal-Gamma Model z ~ N[−ε_i − σ_v²/σ_u, σ_v²]. The function q(r, ε_i) = E[z^r | z > 0] is extremely difficult to compute.

Normal-Gamma Frontier Model

Simulating the Log Likelihood ε_i = y_i − β′x_i, μ_i = −ε_i − σ_v²/σ_u, σ = σ_v, and P_L = Φ(−μ_i/σ). F_q is a draw from the continuous uniform(0,1) distribution.
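The simulation draws implied by these quantities come from inverting the CDF of a normal truncated to z > 0: P_L = Φ(−μ_i/σ) is the mass below the truncation point, and each uniform draw F_q is mapped into the remaining (1 − P_L) of probability. A minimal sketch (using the stdlib NormalDist for Φ and Φ⁻¹):

```python
import numpy as np
from statistics import NormalDist

nd = NormalDist()  # standard normal: cdf = Phi, inv_cdf = Phi^{-1}

def truncated_normal_draws(mu_i, sigma, n_draws, rng):
    """Draws z_q from N(mu_i, sigma^2) truncated to z > 0, by inversion.

    P_L = Phi(-mu_i / sigma) is the mass below zero; each uniform F_q is
    mapped into the upper (1 - P_L) tail and pushed through Phi^{-1}.
    """
    P_L = nd.cdf(-mu_i / sigma)
    F_q = rng.uniform(size=n_draws)
    return np.array([mu_i + sigma * nd.inv_cdf(P_L + f * (1.0 - P_L))
                     for f in F_q])

rng = np.random.default_rng(7)
z = truncated_normal_draws(mu_i=0.5, sigma=1.0, n_draws=20_000, rng=rng)
print(z.min() > 0.0)  # True: every draw lies above the truncation point
```

Averaging a function of such draws over Q replications is the device that replaces the intractable q(r, ε_i) in the simulated log-likelihood.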

Application to C&G Data The Christensen and Greene (C&G) electricity data are the standard data set for developing and testing exponential, gamma, and Bayesian estimators.

Application to C&G Data

Descriptive Statistics for JLMS Estimates of E[u|ε] Based on Maximum Likelihood Estimates of Stochastic Frontier Models

Model          Mean   Std.Dev.   Minimum   Maximum
Normal          …        …          …         …
Exponential     …        …          …         …
Gamma           …        …          …         …

Inefficiency Estimates
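For the normal-exponential model, the JLMS (Jondrow, Lovell, Materov, Schmidt) point estimator behind such inefficiency estimates has a closed form: u_i | ε_i is N(m_i, σ_v²) truncated to u_i > 0, with m_i = −ε_i − σ_v²/σ_u. A minimal sketch (the parameter values are illustrative, not estimates from the C&G data):

```python
from statistics import NormalDist

nd = NormalDist()

def jlms_exponential(eps_i, sigma_v, sigma_u):
    """JLMS estimate E[u_i | eps_i] for the normal-exponential model.

    u_i | eps_i ~ N(m_i, sigma_v^2) truncated to u_i > 0, with
    m_i = -eps_i - sigma_v**2 / sigma_u, so the conditional mean is
    m_i + sigma_v * phi(m_i / sigma_v) / Phi(m_i / sigma_v).
    """
    m_i = -eps_i - sigma_v**2 / sigma_u
    a = m_i / sigma_v
    return m_i + sigma_v * nd.pdf(a) / nd.cdf(a)

# Larger residuals eps_i = v_i - u_i signal less inefficiency:
for eps in (-0.5, 0.0, 0.5):
    print(round(jlms_exponential(eps, sigma_v=0.2, sigma_u=0.5), 4))
```

The estimator is always positive and decreasing in ε_i, as a production-frontier residual should imply.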

Tsionas Fourier Approach to Gamma

Discrete Outcome Stochastic Frontier

Chanchala Ganjay Gadge, "Contributions to the Inference on Stochastic Frontier Models," Department of Statistics and Center for Advanced Studies, University of Pune, Pune, India.

Bayesian Estimation
- Short history: first developed post-1995
- Range of applications
  - Largely replicated existing classical methods
  - Recent applications have extended received approaches
- Common features of the applications

Bayesian Formulation of the SF Model: The Normal-Exponential Model

Bayesian Approach v_i − u_i = y_i − α − β′x_i. Estimation proceeds (in principle) by specifying priors over θ = (α, β, σ_v, σ_u), then deriving inferences from the joint posterior p(θ|data). In general, the joint posterior for this model cannot be derived in closed form, so direct analysis is not feasible. Using Gibbs sampling and the known conditional posteriors, it is possible to use Markov Chain Monte Carlo (MCMC) methods to sample from the marginal posteriors and use that device to learn about the parameters and inefficiencies. In particular, for the model parameters, we are interested in estimating E[θ|data], Var[θ|data] and, perhaps, more fully characterizing the density f(θ|data).

On Estimating Inefficiency One might, ex post, estimate E[u_i |data]; however, it is more natural in this setting to include (u_1,...,u_N) with θ and estimate their conditional means along with those of the other parameters. The method is known as data augmentation.

Priors over Parameters

Priors for Inefficiencies

Posterior

Gibbs Sampling: Conditional Posteriors
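The derivations behind the conditional posteriors are not reproduced in this transcript. The sampler below is a minimal sketch of the data-augmentation Gibbs scheme for the normal-exponential model under assumed convenience priors (flat over (α, β), a Jeffreys-type prior on the noise precision, and a Gamma(1,1) prior on the rate 1/σ_u); the priors and all numeric settings are illustrative assumptions, not the specification on the slides:

```python
import numpy as np
from statistics import NormalDist

nd = NormalDist()
rng = np.random.default_rng(3)

def rtnorm_pos(mean, sd):
    """One draw from N(mean, sd^2) truncated to (0, inf), by inversion."""
    p0 = nd.cdf(-mean / sd)            # mass below zero
    u = rng.uniform(p0, 1.0)
    u = min(u, 1.0 - 1e-12)            # guard the numerically fragile tail
    return mean + sd * nd.inv_cdf(u)

# --- simulate data: y = a + b*x + v - u, u ~ Exponential(sigma_u) ---
n = 300
a_true, b_true, sv_true, su_true = 1.0, 0.8, 0.2, 0.5
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
y = a_true + b_true * x + rng.normal(0, sv_true, n) - rng.exponential(su_true, n)

# --- Gibbs sampler with data augmentation over u_1..u_n ---
beta = np.zeros(2); sv = 1.0; lam = 1.0    # lam = 1/sigma_u
u = np.ones(n)
XtX_inv = np.linalg.inv(X.T @ X)
keep = []
for it in range(2000):
    # beta | u, sv: normal linear-model posterior from regressing (y + u) on X
    bhat = XtX_inv @ X.T @ (y + u)
    beta = rng.multivariate_normal(bhat, sv**2 * XtX_inv)
    # 1/sv^2 | beta, u: gamma, shape n/2, rate = half the residual sum of squares
    resid = y + u - X @ beta
    sv = 1.0 / np.sqrt(rng.gamma(0.5 * n, 1.0 / (0.5 * resid @ resid)))
    # lam | u: Gamma(1,1) prior on the exponential rate gives Gamma(1+n, 1+sum u)
    lam = rng.gamma(1.0 + n, 1.0 / (1.0 + u.sum()))
    # u_i | rest: truncated normal with mean -eps_i - sv^2/sigma_u, sd sv
    eps = y - X @ beta
    u = np.array([rtnorm_pos(-e - sv**2 * lam, sv) for e in eps])
    if it >= 500:
        keep.append([beta[0], beta[1], sv, 1.0 / lam])

post = np.array(keep).mean(axis=0)
print(post)  # posterior means of (alpha, beta, sigma_v, sigma_u)
```

Each pass cycles through β, σ_v, 1/σ_u, and the augmented u_i, whose conditional is the truncated normal with mean −ε_i − σ_v²/σ_u seen earlier in the session.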

Bayesian Normal-Gamma Model
- Tsionas (2002)
  - Erlang form: integer P
  - "Random parameters"
  - Applied to C&G (cross section)
  - Average efficiency
- River Huang (2004)
  - Fully general
  - Applied (as usual) to C&G

Bayesian and Classical Results

A 3 Parameter Gamma Model

Methodological Comparison
- Bayesian vs. classical interpretation
  - Practical results: Bernstein-von Mises theorem in the presence of diffuse priors
- Kim and Schmidt comparison (JPA, 2000)
- Important difference: tight priors over u_i in this context
- Conclusions
  - Not much change in existing results
  - Extensions to new models (e.g., 3 parameter gamma)