Optimization using Calculus: Optimization of Functions of Multiple Variables (Unconstrained Optimization). D Nagesh Kumar, IISc. Optimization Methods: M2L3.
Slide 1: Optimization using Calculus. Optimization of Functions of Multiple Variables: Unconstrained Optimization

Slide 2: Objectives
- To study functions of multiple variables, which are more difficult to analyze: graphical representation is hard, and the mathematical analysis for unconstrained optimization involves tedious calculations.
- To carry out this study with the aid of the gradient vector and the Hessian matrix.
- To discuss the implementation of the technique through examples.

Slide 3: Unconstrained optimization
- If a convex function is to be minimized, the stationary point is the global minimum, and the analysis is relatively straightforward, as discussed earlier.
- A similar situation exists for maximizing a concave function.
- The necessary and sufficient conditions for the optimization of an unconstrained function of several variables are discussed below.

Slide 4: Necessary condition
For a multivariable function f(X), a necessary condition for X* to be a stationary point is that each partial derivative of f vanishes there. In other words, every element of the gradient vector of f(X), evaluated at X = X*, must equal zero:

∇f(X*) = [∂f/∂x_1, ∂f/∂x_2, ..., ∂f/∂x_n]^T = 0
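The stationary-point condition above can be sketched symbolically. This is a minimal illustration on an assumed quadratic function (not the function from the slides): the gradient is formed from the partial derivatives, and setting it to zero is solved for the stationary point.

```python
import sympy as sp

# Hypothetical two-variable function, chosen only to illustrate the condition.
x, y = sp.symbols('x y')
f = x**2 + y**2 - 4*x + 6*y

# Necessary condition: every partial derivative vanishes at a stationary point.
grad = [sp.diff(f, v) for v in (x, y)]          # [2x - 4, 2y + 6]
stationary = sp.solve(grad, [x, y], dict=True)  # solve grad = 0
print(stationary)  # [{x: 2, y: -3}]
```

Solving the gradient equations simultaneously gives the candidate point; whether it is a maximum, minimum, or saddle point is settled by the sufficient condition on the next slide.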

Slide 5: Sufficient condition
- For a stationary point X* to be an extreme point, the Hessian matrix (the matrix of second partial derivatives) of f(X) evaluated at X* must be:
  - positive definite when X* is a point of relative minimum, and
  - negative definite when X* is a point of relative maximum.
- When all eigenvalues of the Hessian are negative for all possible values of X, X* is a global maximum; when all eigenvalues are positive for all possible values of X, X* is a global minimum.
- If some eigenvalues of the Hessian at X* are positive and some are negative, or if some are zero, then X* is neither a local maximum nor a local minimum.
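The eigenvalue test above translates directly into code. The sketch below assumes a numeric Hessian has already been evaluated at the stationary point; the example matrices are illustrative, not taken from the slides.

```python
import numpy as np

def classify(hessian):
    """Classify a stationary point from the eigenvalues of its Hessian."""
    eig = np.linalg.eigvalsh(hessian)      # symmetric matrix -> real eigenvalues
    if np.all(eig > 0):
        return "relative minimum"          # positive definite
    if np.all(eig < 0):
        return "relative maximum"          # negative definite
    if np.any(eig > 0) and np.any(eig < 0):
        return "saddle point"              # indefinite
    return "inconclusive"                  # some zero eigenvalues

print(classify(np.array([[2.0, 0.0], [0.0, 2.0]])))    # relative minimum
print(classify(np.array([[2.0, 0.0], [0.0, -2.0]])))   # saddle point
```

Note that `eigvalsh` is used rather than `eigvals` because the Hessian of a twice continuously differentiable function is symmetric, which guarantees real eigenvalues.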

Slide 6: Example
Analyze the function and classify its stationary points as maxima, minima, or saddle points.
Solution:
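The function and its worked solution appear only as images on the original slides, so as a stand-in, here is the full procedure applied to an assumed function f(x, y) = x^3 - 3x + y^2: find the stationary points from the gradient, then classify each from the eigenvalues of the Hessian evaluated there.

```python
import sympy as sp

# Assumed illustrative function (not the one from the slides).
x, y = sp.symbols('x y')
f = x**3 - 3*x + y**2

grad = [sp.diff(f, v) for v in (x, y)]   # [3x^2 - 3, 2y]
hessian = sp.hessian(f, (x, y))          # [[6x, 0], [0, 2]]

classified = {}
for point in sp.solve(grad, [x, y], dict=True):
    eig = list(hessian.subs(point).eigenvals())
    if all(e > 0 for e in eig):
        kind = "relative minimum"
    elif all(e < 0 for e in eig):
        kind = "relative maximum"
    else:
        kind = "saddle point or inconclusive"
    classified[(point[x], point[y])] = kind
    print(point, kind)
```

At (1, 0) the Hessian eigenvalues are 6 and 2 (positive definite, a relative minimum); at (-1, 0) they are -6 and 2 (indefinite, a saddle point), matching the classification rules of the previous slide.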

Slides 7 to 9: Example (contd.)

Slide 10: Thank you