Introduction to Optimization (ii): Constrained and Unconstrained Optimization
D Nagesh Kumar, IISc
Water Resources Systems Planning and Management: M2L2

Objectives

- To study the optimization of functions of multiple variables without constraints
- To study the above with the aid of the gradient vector and the Hessian matrix
- To discuss the implementation of the technique through examples
- To study the optimization of functions of multiple variables subject to equality constraints using the method of constrained variation

Unconstrained optimization

- If a convex function is to be minimized, the stationary point is the global minimum and the analysis is relatively straightforward (see the one-variable sketch after this list)
- A similar situation exists when maximizing a concave function
- Necessary and sufficient conditions for the optimum of an unconstrained function of several variables are given in the following slides
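To make the convexity claim concrete, here is a minimal one-variable sketch, assuming NumPy is available; the function is an illustrative stand-in, not one from the lecture.

```python
# A one-variable sketch of the convexity claim: for a convex function the
# stationary point is the global minimum. The function is illustrative only.
import numpy as np

f = lambda x: (x - 3.0)**2 + 1.0   # convex: f''(x) = 2 > 0 everywhere
x_star = 3.0                       # stationary point: f'(3) = 0

xs = np.linspace(-100.0, 100.0, 200001)   # coarse sample of the domain
assert f(x_star) <= f(xs).min() + 1e-12   # no sampled point does better
print('global minimum at x* =', x_star, 'with f(x*) =', f(x_star))
```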

Necessary condition

In the case of a function of multiple variables, a necessary condition for X* to be a stationary point of f(X) is that each partial derivative vanishes there, i.e. the gradient vector of f(X), evaluated at X = X*, must equal zero:

\[
\nabla f \big|_{X = X^*} = \left[ \frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \ldots, \frac{\partial f}{\partial x_n} \right]^T_{X = X^*} = 0
\]
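As a sketch of this condition in practice, the snippet below forms the gradient symbolically and solves for the points where it vanishes, assuming SymPy is installed; the objective is an illustrative stand-in, not the lecture's example.

```python
# A minimal sketch of the necessary condition, assuming SymPy is installed.
# The objective below is an illustrative stand-in, not the lecture's example.
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = x1**2 + x1*x2 + 2*x2**2 - 4*x1 - 6*x2   # illustrative objective f(X)

grad = [sp.diff(f, v) for v in (x1, x2)]    # elements of the gradient vector
stationary = sp.solve(grad, (x1, x2), dict=True)
print(stationary)   # [{x1: 10/7, x2: 8/7}] -- the only stationary point
```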

Sufficient condition

- For a stationary point X* to be an extreme point, the matrix of second partial derivatives (the Hessian matrix) of f(X) evaluated at X* must be:
- positive definite when X* is a point of relative minimum, and
- negative definite when X* is a point of relative maximum.
- When all eigenvalues of the Hessian are negative for all possible values of X, X* is a global maximum; when all eigenvalues are positive for all possible values of X, X* is a global minimum.
- If some eigenvalues of the Hessian at X* are positive and some are negative, the stationary point X* is a saddle point: neither a local maximum nor a local minimum. If some eigenvalues are zero, this second-derivative test is inconclusive. (A classification sketch in code follows this slide.)
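Continuing the stand-in objective from the previous sketch, the snippet below evaluates the Hessian at the stationary point and classifies it by the signs of its eigenvalues, assuming SymPy and NumPy are installed.

```python
# A sketch of the sufficient condition for the stand-in objective above,
# assuming SymPy and NumPy are installed.
import numpy as np
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = x1**2 + x1*x2 + 2*x2**2 - 4*x1 - 6*x2

H = sp.hessian(f, (x1, x2))                        # matrix of second partials
H_star = np.array(H.subs({x1: sp.Rational(10, 7),  # evaluate at X* = [10/7, 8/7]
                          x2: sp.Rational(8, 7)}), dtype=float)
eigs = np.linalg.eigvalsh(H_star)                  # symmetric, so real eigenvalues

if np.all(eigs > 0):
    print('positive definite -> relative minimum:', eigs)
elif np.all(eigs < 0):
    print('negative definite -> relative maximum:', eigs)
elif np.any(eigs > 0) and np.any(eigs < 0):
    print('indefinite -> saddle point:', eigs)
else:
    print('semi-definite -> test inconclusive:', eigs)
```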

Example

Analyze the given function and classify its stationary points as maxima, minima, or points of inflection.

Solution:

Example (contd.)

Setting each partial derivative of f to zero and solving the resulting simultaneous equations gives X* = [1/2, 1/2, -2].

Example (contd.)

The Hessian of f(X) is:

Example (contd.)

Since all eigenvalues of the Hessian evaluated at X* are negative, the function attains a maximum at the point X* = [1/2, 1/2, -2].

Constrained optimization

A function of multiple variables, f(X), is to be optimized subject to one or more equality constraints, each a function of the same variables. The equality constraints, g_j(X), may or may not be linear. The problem statement is as follows:

Maximize (or minimize) f(X)
subject to g_j(X) = 0, j = 1, 2, ..., m

where

\[
X = \left[ x_1, x_2, \ldots, x_n \right]^T
\]
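As a concrete illustration of this problem statement, the sketch below solves a small equality-constrained problem numerically, assuming SciPy is available; the objective f and constraint g are illustrative stand-ins, not functions from the lecture.

```python
# A minimal numerical sketch of the constrained problem statement above,
# assuming SciPy is installed. f and g are illustrative stand-ins.
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 1)**2 + (x[1] - 2)**2   # objective to minimize
g = lambda x: x[0] + x[1] - 2                 # equality constraint g(X) = 0

res = minimize(f, x0=np.zeros(2), method='SLSQP',
               constraints=[{'type': 'eq', 'fun': g}])
print(res.x)   # constrained minimizer; analytically [0.5, 1.5]
```

A maximization problem is handled the same way by minimizing -f(X).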

Constrained optimization (contd.)

- The condition m ≤ n must hold; otherwise, if m > n, the problem becomes overdetermined and in general there is no solution. Of the many available methods, the method of constrained variation is discussed here.
- Another method, using Lagrange multipliers, will be discussed in the next lecture.

Solution by the Method of Constrained Variation

For the optimization problem defined above, consider the specific case with n = 2 and m = 1. The problem statement is:

Minimize f(x1, x2)
subject to g(x1, x2) = 0

For f(x1, x2) to have a minimum at a point X* = [x1*, x2*], a necessary condition is that the total derivative of f(x1, x2) must be zero at [x1*, x2*]:

\[
df = \frac{\partial f}{\partial x_1} dx_1 + \frac{\partial f}{\partial x_2} dx_2 = 0 \qquad (1)
\]

Method of Constrained Variation (contd.)

Since g(x1*, x2*) = 0 at the minimum point, the variations dx1 and dx2 about the point [x1*, x2*] must be admissible variations, i.e. the varied point must still lie on the constraint:

\[
g(x_1^* + dx_1, \; x_2^* + dx_2) = 0 \qquad (2)
\]

Assuming dx1 and dx2 are small, a first-order Taylor series expansion of (2) about [x1*, x2*] gives

\[
g(x_1^* + dx_1, \; x_2^* + dx_2) \approx g(x_1^*, x_2^*) + \frac{\partial g}{\partial x_1} dx_1 + \frac{\partial g}{\partial x_2} dx_2 = 0 \qquad (3)
\]

where the partial derivatives are evaluated at [x1*, x2*].

Method of Constrained Variation (contd.)

Since g(x1*, x2*) = 0, (3) reduces to

\[
dg = \frac{\partial g}{\partial x_1} dx_1 + \frac{\partial g}{\partial x_2} dx_2 = 0 \quad \text{at } [x_1^*, x_2^*] \qquad (4)
\]

which is the condition that must be satisfied for all admissible variations. Assuming \( \partial g / \partial x_2 \neq 0 \), (4) can be rewritten as

\[
dx_2 = - \left. \frac{\partial g / \partial x_1}{\partial g / \partial x_2} \right|_{(x_1^*, \, x_2^*)} dx_1 \qquad (5)
\]

Method of Constrained Variation (contd.)

Equation (5) indicates that once the variation along x1 (dx1) is chosen arbitrarily, the variation along x2 (dx2) is fixed automatically so that the variation remains admissible. Substituting equation (5) into (1) gives

\[
df = \left( \frac{\partial f}{\partial x_1} - \frac{\partial f}{\partial x_2} \, \frac{\partial g / \partial x_1}{\partial g / \partial x_2} \right) \bigg|_{(x_1^*, \, x_2^*)} dx_1 = 0 \qquad (6)
\]

The expression in parentheses is called the constrained variation of f. Since equation (6) has to be satisfied for arbitrary dx1, we must have

\[
\left( \frac{\partial f}{\partial x_1} - \frac{\partial f}{\partial x_2} \, \frac{\partial g / \partial x_1}{\partial g / \partial x_2} \right) \bigg|_{(x_1^*, \, x_2^*)} = 0 \qquad (7)
\]

This gives the necessary condition for [x1*, x2*] to be an extreme point (maximum or minimum).
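Condition (7) can be checked directly with a computer algebra system. Below is a minimal sketch, assuming SymPy is installed; the objective and constraint are illustrative stand-ins, not the lecture's functions. Multiplying (7) through by the nonzero denominator gives the equivalent form f_x1 g_x2 - f_x2 g_x1 = 0, which avoids the division.

```python
# A minimal sketch of condition (7), assuming SymPy is installed.
# Illustrative problem (not the lecture's): minimize f = x1^2 + x2^2
# subject to g = x1 + x2 - 2 = 0.
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = x1**2 + x2**2
g = x1 + x2 - 2

# Condition (7) with the denominator cleared: f_x1*g_x2 - f_x2*g_x1 = 0
condition = sp.diff(f, x1)*sp.diff(g, x2) - sp.diff(f, x2)*sp.diff(g, x1)

# Solve the condition together with the constraint g = 0
solution = sp.solve([condition, g], (x1, x2), dict=True)
print(solution)   # [{x1: 1, x2: 1}] -- the constrained minimum of this example
```

Solving (7) together with g = 0 recovers the constrained extremum, here x1* = x2* = 1.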

Thank You
D Nagesh Kumar, IISc