Unconstrained and Constrained Optimization

Agenda: General Ideas of Optimization; Interpreting the First Derivative; Interpreting the Second Derivative; Unconstrained Optimization; Constrained Optimization.

Optimization: There are two ways of examining optimization. Minimization: here you are looking for the lowest point on the function. Maximization: here you are looking for the highest point on the function.

Needed Terminology: Critical Point. A point x* on a function is said to be a critical point if the derivative of the function evaluated at x* is zero, i.e., f’(x*) = 0.

What observations can you make about the attributes of a minimum?

Questions Regarding the Minimum What is the sign of the slope when you are to the left of the minimum point? Another way of saying this is what is f’(x) when x < x*? Note: x* denotes the point where the function is at a minimum.

Questions Regarding the Minimum Cont. What is the sign of the slope when you are to the right of the minimum point? Another way of saying this is what is f’(x) when x > x*? What is the sign of the slope when you are at the minimum point? Another way of saying this is what is f’(x) when x = x*?

Graphical Representation of a Minimum: the graph of y = f(x) = x² – 8x + 20, which has a y-intercept of 20 and attains its minimum value of 4 at x = 4.
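The minimum in the graph above can be checked numerically. The following is a minimal Python sketch (not part of the original slides): it evaluates the derivative f’(x) = 2x – 8 on either side of x* = 4 to show the slope changing sign from negative to zero to positive.

```python
# Numerical check of the minimum of f(x) = x^2 - 8x + 20 at x* = 4.

def f(x):
    return x**2 - 8*x + 20

def f_prime(x):
    return 2*x - 8  # derivative of f

x_star = 4
print(f_prime(x_star - 0.1))  # negative: the slope is downward just left of the minimum
print(f_prime(x_star))        # zero: x* = 4 is a critical point
print(f_prime(x_star + 0.1))  # positive: the slope is upward just right of the minimum
print(f(x_star))              # 4, the minimum value of the function
```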

What observations can you make about the attributes of a maximum?

Questions Regarding the Maximum What is the sign of the slope when you are to the left of the maximum point? Another way of saying this is what is f’(x) when x < x*? Note: x* denotes the point where the function is at a maximum.

Questions Regarding the Maximum Cont. What is the sign of the slope when you are to the right of the maximum point? Another way of saying this is what is f’(x) when x > x*? What is the sign of the slope when you are at the maximum point? Another way of saying this is what is f’(x) when x = x*?

Graphical Representation of a Maximum: the graph of y = f(x) = –x² + 8x, which crosses the x-axis at x = 0 and x = 8 and attains its maximum value of 16 at x = 4.

Interpreting the First Derivative The first derivative of a function, as was shown previously, is the slope of the curve evaluated at a particular point. In essence, it tells you the instantaneous rate of change of the function at that point. Knowing the slope of the function can tell you where a maximum or a minimum exists on a curve. Why?

Question Can the derivative tell you whether you are at a maximum or a minimum? The answer is yes if you examine the slope of the function around the critical point, i.e., the point where the derivative is zero. An easier way of determining whether you have a maximum or a minimum is to examine the second derivative of the function.

The Second Derivative The second derivative of a function f(x) is the derivative of the function f’(x), where f’(x) is the derivative of f(x). The second derivative can tell you whether the function is concave or convex at the critical point. The second derivative can be denoted by f’’(x).

Concavity and the Second Derivative The maximum of a function f(x) occurs when a critical point x* is at a concave portion of the function. This is equivalent to saying that f’’(x*) < 0. If f’’(x) < 0 for all x, then the function is said to be concave.

Convexity and the Second Derivative The minimum of a function f(x) occurs when a critical point x* is at a convex portion of the function. This is equivalent to saying that f’’(x*) > 0. If f’’(x) > 0 for all x, then the function is said to be convex.
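As an illustration of these two tests, here is a minimal Python sketch (assuming the sympy library is available; it is not part of the original slides) that computes the second derivatives of the two parabolas used earlier:

```python
import sympy as sp

x = sp.symbols('x')

f_min = x**2 - 8*x + 20   # the earlier minimum example
f_max = -x**2 + 8*x       # the earlier maximum example

print(sp.diff(f_min, x, 2))  # 2: positive for all x, so the function is convex (minimum)
print(sp.diff(f_max, x, 2))  # -2: negative for all x, so the function is concave (maximum)
```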

Special Case of the Second Derivative Suppose you have a function f(x) with a critical point at x*. What does it mean when the second derivative is equal to zero, i.e., f’’(x*) = 0? At such a point the second derivative may not be able to tell you whether you have a maximum or a minimum. Usually in this case you will get a saddle point/point of inflection, where the point is neither a maximum nor a minimum.

Example of Special Case of the Second Derivative Suppose y = f(x) = x³; then f’(x) = 3x² and f’’(x) = 6x. This implies that x* = 0 and f’’(x* = 0) = 0. The graph of y = x³ is flat at x = 0, but that point is neither a maximum nor a minimum.
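The inconclusive test can be reproduced symbolically. This is a small sketch (again assuming sympy, which is not part of the original slides) showing that the second derivative of x³ vanishes at the critical point but changes sign around it:

```python
import sympy as sp

x = sp.symbols('x')
f = x**3

f1 = sp.diff(f, x)       # 3*x**2
f2 = sp.diff(f, x, 2)    # 6*x

print(sp.solve(f1, x))                # [0]: the only critical point
print(f2.subs(x, 0))                  # 0: the second derivative test is inconclusive
print(f2.subs(x, -1), f2.subs(x, 1))  # -6 6: f'' changes sign, so x = 0 is an inflection point
```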

Unconstrained Optimization An unconstrained optimization problem is one where you only have to be concerned with the objective function you are trying to optimize. An objective function is a function that you are trying to optimize. None of the variables in the objective function are constrained.

First and Second Order Condition For a Maximum The first order condition for a maximum at a point x* on the function f(x) is that f’(x*) = 0. The second order condition for a maximum at a point x* on the function f(x) is that f’’(x*) < 0.

First and Second Order Condition For a Minimum The first order condition for a minimum at a point x* on the function f(x) is that f’(x*) = 0. The second order condition for a minimum at a point x* on the function f(x) is that f’’(x*) > 0.

Example of Using First and Second Order Conditions Suppose you have the following function: f(x) = x³ – 6x² + 9x. Then the first order condition to find the critical points is: f’(x) = 3x² – 12x + 9 = 0. This implies that the critical points are at x = 1 and x = 3.

Example of Using First and Second Order Conditions Cont. The next step is to determine whether the critical points are maximums or minimums. These can be found by using the second order condition. f’’(x) = 6x – 12 = 6(x-2)

Example of Using First and Second Order Conditions Cont. Testing x = 1 implies: f’’(1) = 6(1-2) = -6 < 0. Hence at x = 1, we have a maximum. Testing x = 3 implies: f’’(3) = 6(3-2) = 6 > 0. Hence at x = 3, we have a minimum. Are these the absolute maximum and minimum of the function f(x)?
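The whole classification can be reproduced in a few lines of Python. This is a minimal sketch assuming the sympy library (not part of the original slides): it finds the critical points of f(x) = x³ – 6x² + 9x and checks the sign of f’’ at each one.

```python
import sympy as sp

x = sp.symbols('x')
f = x**3 - 6*x**2 + 9*x

f1 = sp.diff(f, x)        # 3*x**2 - 12*x + 9
f2 = sp.diff(f, x, 2)     # 6*x - 12

critical_points = sp.solve(sp.Eq(f1, 0), x)   # [1, 3]
for c in critical_points:
    curvature = f2.subs(x, c)
    kind = "maximum" if curvature < 0 else "minimum"
    print(c, curvature, kind)   # 1 -6 maximum, then 3 6 minimum
```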

Relative Vs. Absolute Extremum A relative extremum is a point that is locally greater than or less than all the points around it. A relative extremum can be found by using the first order condition. An absolute extremum is a point that is greater than or less than all other points, i.e., f(x*) > f(x) for all x not equal to x* for a maximum and f(x*) < f(x) for all x not equal to x* for a minimum.

Finding the Absolute Extremum To find the absolute extremum, you need to compare all the critical points on the function as well as any potential endpoints of the function, such as ∞ and -∞. When evaluating a polynomial function at ±∞, the behavior of the function is determined by its highest-order term.

Finding the Absolute Extremum Cont. Some properties of ∞: ∞ + ∞ = ∞; ∞ - ∞ is undefined; c·∞ = ∞, where c is any value greater than zero; ∞·∞ = ∞; ∞·(-∞) = -∞. From the previous example, the candidate extremum points occur at x = -∞, 1, 3, and ∞. The absolute maximum occurs at x = ∞ and the absolute minimum occurs at x = -∞.
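The endpoint behavior can be confirmed with symbolic limits. A minimal sketch assuming sympy (not part of the original slides):

```python
import sympy as sp

x = sp.symbols('x')
f = x**3 - 6*x**2 + 9*x

# Behavior at the endpoints is governed by the highest-order term, x**3.
print(sp.limit(f, x, sp.oo))    # oo: the function is unbounded above
print(sp.limit(f, x, -sp.oo))   # -oo: the function is unbounded below

# The critical points therefore give only relative extrema.
print(f.subs(x, 1), f.subs(x, 3))   # 4 0: the relative maximum and minimum values
```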

Unconstrained Optimization: Two Variables Suppose you have a function y = f(x1,x2). To find the critical points, you can use the following first order condition: ∂f/∂x1 = 0 and ∂f/∂x2 = 0.

Unconstrained Optimization: Two Variables Cont. The second order conditions are more complex: you have to examine the second derivative with respect to each of the variables, as well as the cross partial derivative. For a maximum you need ∂²f/∂x1² < 0 and (∂²f/∂x1²)(∂²f/∂x2²) - (∂²f/∂x1∂x2)² > 0; for a minimum, ∂²f/∂x1² > 0 together with the same positive-determinant condition.

Cross Partial Derivative Suppose you have a function y = f(x1,x2); then the cross partial derivative can be represented as ∂²f/∂x1∂x2, i.e., the derivative of ∂f/∂x1 taken with respect to x2.

Cross Partial Derivative Example Suppose you have a function y = f(x1,x2) = x1² + 2x1x2 + 3x2², then: ∂f/∂x1 = 2x1 + 2x2, ∂f/∂x2 = 2x1 + 6x2, and ∂²f/∂x1∂x2 = 2.
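These derivatives, and the second order check that goes with them, can be verified symbolically. A minimal sketch assuming sympy (not part of the original slides):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = x1**2 + 2*x1*x2 + 3*x2**2

# First-order partial derivatives (the gradient).
f_1 = sp.diff(f, x1)        # 2*x1 + 2*x2
f_2 = sp.diff(f, x2)        # 2*x1 + 6*x2

# Second-order and cross partial derivatives.
f_11 = sp.diff(f, x1, 2)    # 2
f_22 = sp.diff(f, x2, 2)    # 6
f_12 = sp.diff(f, x1, x2)   # 2, the cross partial derivative

print(sp.solve([f_1, f_2], [x1, x2]))   # {x1: 0, x2: 0}: the only critical point
print(f_11 * f_22 - f_12**2)            # 8 > 0, and f_11 > 0, so (0, 0) is a minimum
```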

Constrained Optimization Constrained optimization is said to occur when one or more of the variables in the objective function is constrained by some function. Hence a constrained optimization problem will have an objective function and a set of constraints.

Constrained Optimization Cont. The constrained optimization problem where you are trying to maximize can be set up as the following: maximize an objective function f(x) with respect to x given a set of constraints g(x) = c.

Example of Constrained Optimization Suppose that you want to maximize f(x) = 5x -x2, subject to the constraint that x = 2. Since x = 2 is the constraint, the answer to this is trivial where x* = 2.

Constrained Optimization: Two Variables The constrained optimization problem where you are trying to maximize can be set up as the following: maximize an objective function f(x1,x2) with respect to x1 and x2 given a set of constraints g(x1,x2) = c.

Example of Constrained Optimization: Two Variables Suppose you want to maximize y = f(x1,x2) = x1x2 with respect to x1 and x2 given that 600 = x1 + 2x2. To solve this problem, we can turn this constrained problem into an unconstrained problem.

Example of Constrained Optimization: Two Variables Cont. If we solve the constraint for x1 as a function of x2, we get x1 = 600 – 2x2. Plugging x1 into the objective function gives the following new unconstrained maximization problem: maximize (600 – 2x2)x2 with respect to x2. The first order condition is 600 – 4x2 = 0, which implies x2* = 150 and hence x1* = 300.
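The substitution approach can be written out in a few lines. This is a minimal sketch assuming sympy (not part of the original slides):

```python
import sympy as sp

x2 = sp.symbols('x2')

# Substitute the constraint x1 = 600 - 2*x2 into the objective x1*x2.
objective = (600 - 2*x2) * x2

foc = sp.diff(objective, x2)        # 600 - 4*x2
x2_star = sp.solve(foc, x2)[0]      # 150
x1_star = 600 - 2*x2_star           # 300

print(x1_star, x2_star)              # 300 150
print(objective.subs(x2, x2_star))   # 45000, the maximized value of the objective
```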

Motivating the Lagrange Method In the previous problem, we made a substitution to turn the constrained optimization problem into an unconstrained problem. While this made solving the problem easier, there may be times when you have multiple constraints, or potentially inequality constraints, that make converting the constrained problem into an unconstrained one difficult or impossible.

Motivating the Lagrange Method Cont. Another way of solving the above problem is Lagrange’s method. The Lagrange method uses what is called a Lagrange multiplier, λ, to transform the problem. The Lagrange multiplier tells us how much the optimized value of the objective function would change if the constraint were relaxed by one unit.

Setting Up the Lagrange The constrained optimization problem where you are trying to maximize can be set up as the following: maximize an objective function f(x1,x2) with respect to x1 and x2 given a constraint g(x1,x2) = c. The corresponding Lagrangian is L(x1,x2,λ) = f(x1,x2) + λ(c – g(x1,x2)).

Solving the Lagrange Problem To solve the Lagrange problem, you need to optimize L(x1,x2,λ) with respect to x1, x2, and λ. This is equivalent to applying the first order conditions ∂L/∂x1 = 0, ∂L/∂x2 = 0, and ∂L/∂λ = 0 (the last condition simply recovers the constraint), along with the appropriate second order conditions.

Example of Lagrange Suppose you want to maximize y = f(x1,x2) = x1x2 with respect to x1 and x2 given that 600 = x1 + 2x2. This implies: L(x1,x2,λ) = x1x2 + λ(600 – x1 – 2x2). The first order conditions are ∂L/∂x1 = x2 – λ = 0, ∂L/∂x2 = x1 – 2λ = 0, and ∂L/∂λ = 600 – x1 – 2x2 = 0. Solving these gives λ = 150, x2* = 150, and x1* = 300, the same answer as before.
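The same system can be solved mechanically. A minimal sketch assuming sympy (not part of the original slides); it also prints λ, which illustrates the relaxation interpretation given earlier:

```python
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lambda')

# Lagrangian for: maximize x1*x2 subject to x1 + 2*x2 = 600.
L = x1*x2 + lam*(600 - x1 - 2*x2)

# First order conditions: all partial derivatives of L equal zero.
foc = [sp.diff(L, v) for v in (x1, x2, lam)]
solution = sp.solve(foc, [x1, x2, lam], dict=True)[0]

print(solution)   # x1: 300, x2: 150, lambda: 150
# lambda = 150: relaxing the constraint from 600 to 601 would raise the
# maximized objective by roughly 150.
```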