Empirical Maximum Likelihood and Stochastic Process Lecture VIII.

Empirical Maximum Likelihood  To demonstrate estimation by maximum likelihood, we formulate the estimation problem for the gamma distribution applied to the same dataset, including a trend line in the mean.

 The basic gamma distribution function is

f(y; \alpha, \beta) = \frac{\beta^{\alpha} y^{\alpha - 1} e^{-\beta y}}{\Gamma(\alpha)}

 Next, we add the possibility of a trend line in the mean, for example by letting the scale vary over time so that E[y_t] = \alpha / \beta_t = b_0 + b_1 t.
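The estimation problem can be sketched numerically. The code below is a minimal illustration, not the lecture's actual dataset: it simulates gamma data with a linear trend in the mean (the parameterization E[y_t] = b0 + b1*t with fixed shape is an assumption for illustration) and maximizes the log-likelihood with a derivative-free optimizer.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(42)
T = 200
t = np.arange(T, dtype=float)

# Hypothetical "true" values used only to simulate data
true_shape, true_b0, true_b1 = 3.0, 2.0, 0.01
# Gamma draws whose mean follows the trend b0 + b1*t (shape fixed, scale varies)
y = rng.gamma(true_shape, (true_b0 + true_b1 * t) / true_shape)

def neg_log_lik(theta):
    shape, b0, b1 = theta
    mean = b0 + b1 * t
    if shape <= 0 or np.any(mean <= 0):
        return np.inf                      # stay in the admissible region
    scale = mean / shape                   # so E[y_t] = b0 + b1*t
    return -np.sum(stats.gamma.logpdf(y, a=shape, scale=scale))

res = optimize.minimize(neg_log_lik, x0=[1.0, 1.0, 0.0], method="Nelder-Mead")
print(res.x)  # estimated (shape, b0, b1)
```

Nelder-Mead is used here only because it needs no derivatives; the Newton-type methods developed below exploit the gradient and Hessian instead.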

Numerical Optimization  Given the implicit nonlinearity involved, we will solve for the optimum using nonlinear optimization techniques.  Most students have been introduced to the first-order conditions for optimality. For our purposes, we will redevelop these conditions within the framework of a second-order Taylor series expansion of a function.

Taking the second-order Taylor series expansion of a function f(x) of a vector of variables x,

f(x) \approx f(x_0) + \nabla f(x_0)'(x - x_0) + \frac{1}{2}(x - x_0)' \nabla^2 f(x_0)(x - x_0)

where x_0 is the point of approximation, the gradient vector is defined as

\nabla f(x) = \left[ \frac{\partial f}{\partial x_1}, \dots, \frac{\partial f}{\partial x_n} \right]'

and the Hessian matrix is defined as

\nabla^2 f(x) = \left[ \frac{\partial^2 f}{\partial x_i \partial x_j} \right]_{i,j=1}^{n}
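These definitions can be checked numerically with central differences. The quadratic test function below is my own example (not from the lecture); for it the analytic gradient at (1, 1) is [5, 7] and the Hessian is [[2, 3], [3, 4]].

```python
import numpy as np

def grad(f, x, h=1e-5):
    """Central-difference approximation of the gradient vector."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def hessian(f, x, h=1e-4):
    """Central-difference approximation of the Hessian matrix."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

# f(x) = x1^2 + 3*x1*x2 + 2*x2^2  =>  grad = [2x1 + 3x2, 3x1 + 4x2]
f = lambda x: x[0]**2 + 3 * x[0] * x[1] + 2 * x[1]**2
print(grad(f, [1.0, 1.0]))     # ≈ [5, 7]
print(hessian(f, [1.0, 1.0]))  # ≈ [[2, 3], [3, 4]]
```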

Given the Taylor series expansion, x_0 defines a maximum if

f(x) - f(x_0) \approx \nabla f(x_0)'(x - x_0) + \frac{1}{2}(x - x_0)' \nabla^2 f(x_0)(x - x_0) \leq 0

for all x in a neighborhood of x_0.

If we restrict our attention to functions whose second derivatives are continuous close to x_0, this inequality implies two sufficient conditions.  First, the vector of first derivatives must vanish:

\nabla f(x_0) = 0

 Second, the Hessian matrix must be negative definite, or

(x - x_0)' \nabla^2 f(x_0)(x - x_0) < 0 \quad \text{for all } x \neq x_0
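In practice, negative definiteness of a symmetric Hessian is verified by checking that all of its eigenvalues are negative. The matrix below is a hypothetical candidate Hessian chosen for illustration.

```python
import numpy as np

# A candidate Hessian evaluated at x0 (hypothetical values for illustration)
H = np.array([[-4.0,  1.0],
              [ 1.0, -2.0]])

# x0 satisfies the second-order condition if every eigenvalue of H is negative
eigenvalues = np.linalg.eigvalsh(H)  # eigvalsh: eigenvalues of a symmetric matrix
is_negative_definite = bool(np.all(eigenvalues < 0))
print(eigenvalues, is_negative_definite)
```

For this H the eigenvalues are roughly -4.41 and -1.59, so the candidate point would satisfy the second-order condition.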

Newton-Raphson is simply a procedure that efficiently finds the zeros of the gradient vector (a vector-valued function). Linearizing the gradient around x_0 and setting it to zero,

\nabla f(x) \approx \nabla f(x_0) + \nabla^2 f(x_0)(x - x_0) = 0

Solving for x based on x_0, we have

x = x_0 - \left[ \nabla^2 f(x_0) \right]^{-1} \nabla f(x_0)
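The iteration above can be sketched directly. The concave test function f(x) = ln(x1) + ln(x2) - x1 - x2 is my own example (optimum at x = (1, 1)); each step solves the Newton system rather than inverting the Hessian explicitly.

```python
import numpy as np

# Maximize f(x) = ln(x1) + ln(x2) - x1 - x2, which peaks at x = (1, 1)
def gradient(x):
    return 1.0 / x - 1.0            # elementwise: [1/x_i - 1]

def hessian(x):
    return np.diag(-1.0 / x**2)     # negative definite for x > 0

x = np.array([0.5, 0.25])           # starting point x_0
for _ in range(25):
    # x_{k+1} = x_k - [H(x_k)]^{-1} g(x_k), via a linear solve
    x = x - np.linalg.solve(hessian(x), gradient(x))
    if np.linalg.norm(gradient(x)) < 1e-10:
        break
print(x)  # → close to [1, 1]
```

Note the quadratic convergence: the error roughly squares at each step, so only a handful of iterations are needed from a reasonable starting point.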

 While the gamma function with a time trend is amenable to solution using these numerical techniques, for demonstration purposes we return to the normal distribution function with a time trend, specified as

y_t = b_0 + b_1 t + \epsilon_t, \quad \epsilon_t \sim N(0, \sigma^2)

so that the log-likelihood is

\ln L = -\frac{T}{2} \ln(2\pi\sigma^2) - \frac{1}{2\sigma^2} \sum_{t=1}^{T} (y_t - b_0 - b_1 t)^2

The gradient of the log-likelihood function, with parameters ordered (b_0, b_1, \sigma^2), becomes

\nabla \ln L = \begin{bmatrix} \frac{1}{\sigma^2} \sum_t (y_t - b_0 - b_1 t) \\ \frac{1}{\sigma^2} \sum_t t (y_t - b_0 - b_1 t) \\ -\frac{T}{2\sigma^2} + \frac{1}{2\sigma^4} \sum_t (y_t - b_0 - b_1 t)^2 \end{bmatrix}

and the Hessian matrix, writing \epsilon_t = y_t - b_0 - b_1 t, becomes

\nabla^2 \ln L = \begin{bmatrix} -\frac{T}{\sigma^2} & -\frac{\sum_t t}{\sigma^2} & -\frac{\sum_t \epsilon_t}{\sigma^4} \\ -\frac{\sum_t t}{\sigma^2} & -\frac{\sum_t t^2}{\sigma^2} & -\frac{\sum_t t\,\epsilon_t}{\sigma^4} \\ -\frac{\sum_t \epsilon_t}{\sigma^4} & -\frac{\sum_t t\,\epsilon_t}{\sigma^4} & \frac{T}{2\sigma^4} - \frac{\sum_t \epsilon_t^2}{\sigma^6} \end{bmatrix}
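A sketch of the full demonstration, on simulated rather than the lecture's data: maximize the normal-with-trend log-likelihood numerically (parameterizing \sigma^2 = exp(s) to keep it positive, an assumption of this sketch) and confirm that the ML estimates of (b_0, b_1) coincide with OLS on an intercept and trend.

```python
import numpy as np
from scipy import optimize

rng = np.random.default_rng(0)
T = 100
t = np.arange(T, dtype=float)
y = 2.0 + 0.05 * t + rng.normal(0.0, 1.5, T)   # hypothetical data with a trend

def neg_log_lik(theta):
    b0, b1, log_s2 = theta                     # sigma^2 = exp(log_s2) > 0
    s2 = np.exp(log_s2)
    e = y - b0 - b1 * t
    return 0.5 * T * np.log(2 * np.pi * s2) + np.sum(e**2) / (2 * s2)

res = optimize.minimize(neg_log_lik, x0=[0.0, 0.0, 0.0], method="BFGS")

# For the normal linear model, ML estimates of (b0, b1) equal OLS estimates
X = np.column_stack([np.ones(T), t])
b_ols = np.linalg.solve(X.T @ X, X.T @ y)
print(res.x[:2], b_ols)
```

The ML estimate of \sigma^2 is the (unadjusted) mean squared residual, \sum_t \epsilon_t^2 / T, which the optimizer recovers through exp of the third parameter.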