
1 Math for CS, Lecture 7: Constrained Optimization, Lagrange Multipliers, and Ordinary Differential Equations

2 Constrained Optimization
A constrained optimization problem can be defined as follows: minimize the function $f(x)$ while searching among the points $x$ that satisfy the constraints
$$g_i(x) = 0, \quad i = 1, \dots, k.$$

3 Example 1
Consider the problem of minimizing $f(x) = x_1 + x_2$ under the constraint $g(x) = x_1^2 + x_2^2 - 1 = 0$. The figure shows the circle defined by $g(x) = 0$ and the lines of constant value of $f(x)$. One can see that at the solution $x^*$ the gradient of $f(x)$ is orthogonal to the circle. Otherwise, there would be a non-zero projection of the gradient onto the circle, and sliding against this projection would decrease $f(x)$ without violating the constraint $g(x) = 0$.
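The following is a minimal numerical sketch of Example 1, assuming SciPy is available; it is not part of the original lecture. The analytic solution is $x^* = (-1/\sqrt{2}, -1/\sqrt{2})$.

```python
import numpy as np
from scipy.optimize import minimize

f = lambda x: x[0] + x[1]              # objective f(x) = x1 + x2
g = lambda x: x[0]**2 + x[1]**2 - 1.0  # constraint g(x) = x1^2 + x2^2 - 1 = 0

# SLSQP handles equality constraints directly.
result = minimize(f, x0=np.array([1.0, 0.0]),
                  constraints=[{"type": "eq", "fun": g}],
                  method="SLSQP")
print(result.x)  # approximately [-0.7071, -0.7071]
```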

4 Example 2
Minimize the length of the path between M and C, subject to the requirement that the path touches the constraint curve $g(x, y) = 0$. Each ellipse describes the points lying on paths of the same length. Again, at the solution the gradient of $f(x)$ is orthogonal to the curve of the constraint.

5 Dimensionality Reduction
The straightforward method to solve a constrained optimization problem is to reduce the number of free variables: if $x = (x_1, \dots, x_n)$ and there are $k$ constraints $g_1(x) = 0, \dots, g_k(x) = 0$, then the $k$ constraint equations can (sometimes) be solved to reduce the dimensionality of $x$ from $n$ to $n - k$.
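As an illustration (an assumption, not from the slides), the constraint of Example 1 can be eliminated by the parametrization $x = (\cos t, \sin t)$, which turns the problem into an unconstrained one-dimensional minimization:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# f(x(t)) = cos t + sin t after substituting the circle parametrization.
f_reduced = lambda t: np.cos(t) + np.sin(t)

result = minimize_scalar(f_reduced, bounds=(0.0, 2.0 * np.pi), method="bounded")
t_star = result.x
print(np.cos(t_star), np.sin(t_star))  # approximately -0.7071, -0.7071
```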

6 Surfaces Defined by the Constraints
Now we consider the hard case, when dimensionality reduction is impossible. If there are no constraints ($k = 0$), the gradient of $f(x)$ vanishes at the solution $x^*$: $\nabla f(x^*) = 0$. In the constrained case, the gradient must be orthogonal to the subspace defined by the constraints (otherwise sliding along this subspace would decrease the value of $f(x)$ without violating the constraints).

7 Explanation
The constraints limit the subspace of the solution. Here the solution lies on the intersection of the planes defined by $g_1(x) = 0$ and $g_2(x) = 0$. The gradient $\nabla f(x)$ must be orthogonal to this subspace (otherwise there is a non-zero projection of $\nabla f(x)$ along the constraint and the function value can be decreased further). The subspace orthogonal to the intersection is spanned by $\nabla g_1(x)$ and $\nabla g_2(x)$; thus $\nabla f(x^*) = \lambda_1 \nabla g_1(x^*) + \lambda_2 \nabla g_2(x^*)$.
[Figure: the surfaces $g_1(x) = 0$ and $g_2(x) = 0$ with the vectors $\lambda_1 \nabla g_1(x)$ and $\lambda_2 \nabla g_2(x)$.]
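A quick numerical check of this argument on Example 1 (an illustrative assumption, not from the slides): at $x^*$ the gradient of $f$ is a multiple of the gradient of $g$, so it has no component along the constraint curve.

```python
import numpy as np

x_star = np.array([-1.0, -1.0]) / np.sqrt(2.0)  # solution of Example 1
grad_f = np.array([1.0, 1.0])                   # gradient of f(x) = x1 + x2
grad_g = 2.0 * x_star                           # gradient of g(x) = x1^2 + x2^2 - 1
tangent = np.array([-x_star[1], x_star[0]])     # direction along the circle at x*

print(np.dot(grad_f, tangent))  # approximately 0: no projection along the constraint
print(grad_f / grad_g)          # constant ratio, i.e. grad f = lambda * grad g
```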

8 Constraint for the Coordinate, Relaxation for the Gradient
We observe that the more constraints are applied, the more restricted is the coordinate of the optimum (anywhere in $\mathbb{R}^3$ for $k = 0$, on a surface for $k = 1$, on a line for $k = 2$), but the less restricted is the gradient of the function $f(x)$ (zero; along the normal to the surface; anywhere in the plane orthogonal to the line). This requirement, that the gradient lie in the subspace orthogonal to the intersection of the hypersurfaces defined by the constraints, can be summarized as
$$\nabla f(x^*) = \sum_{i=1}^{k} \lambda_i \nabla g_i(x^*). \qquad (1)$$

9 Lagrange Multipliers
The second requirement is to satisfy the constraints:
$$g_i(x^*) = 0, \quad i = 1, \dots, k. \qquad (2)$$
The requirements (1)-(2) can be written together in an elegant form. Define the function
$$\Lambda(x, \lambda) = f(x) - \sum_{i=1}^{k} \lambda_i g_i(x).$$
Then we can write $\nabla_x \Lambda(x, \lambda) = 0$ to satisfy (1) and $\nabla_\lambda \Lambda(x, \lambda) = 0$ to satisfy (2).

10 Lagrange Multipliers
In other words, we have constructed a function $\Lambda(x, \lambda)$ that depends on the variables $x$ and $\lambda$, and we require $\nabla \Lambda(x, \lambda) = 0$ for that function.

11 Summary
The constants $\lambda_i$ are called Lagrange multipliers. We have obtained that the solution $(x^*, \lambda^*)$ of the constrained optimization problem satisfies the equations $\nabla \Lambda(x^*, \lambda^*) = 0$, where
$$\Lambda(x, \lambda) = f(x) - \sum_{i=1}^{k} \lambda_i g_i(x).$$
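This system of equations can also be solved symbolically. The sketch below does so for Example 1, assuming SymPy is available; it is not part of the original lecture.

```python
import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lam", real=True)
f = x1 + x2
g = x1**2 + x2**2 - 1
Lagrangian = f - lam * g  # Lambda(x, lam) = f(x) - lam * g(x)

# grad Lambda = 0: derivatives with respect to x1, x2 and lam must all vanish.
equations = [sp.diff(Lagrangian, v) for v in (x1, x2, lam)]
solutions = sp.solve(equations, [x1, x2, lam], dict=True)
print(solutions)  # two stationary points (+-1/sqrt(2), +-1/sqrt(2)); the minimum is the negative one
```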

12 Ordinary Differential Equations: Linear Equations, Separable Equations

13 Differential Equations
1. A differential equation is an equation involving an unknown function and its derivatives.
2. The order of a differential equation is the order of the highest derivative of the unknown function involved in the equation.
3. A linear differential equation of order n is a differential equation that can be written in the form
$$a_n(x)\, y^{(n)} + a_{n-1}(x)\, y^{(n-1)} + \dots + a_1(x)\, y' + a_0(x)\, y = F(x),$$
where $a_n(x)$ is not the zero function.

14 Differential Equations
4. Existence: does a differential equation have a solution?
5. Uniqueness: does a differential equation have more than one solution? If yes, how can we find the solution that satisfies particular conditions?
6. A problem in which the values of the unknown function $y(x)$ and its derivatives at some point are specified is called an initial value problem (IVP).
7. If no initial conditions are given, we call the description of all solutions to the differential equation the general solution.

15 First Order Linear Differential Equations
A first order linear differential equation has the following form:
$$y' + p(x)\, y = q(x).$$
To solve this equation, we multiply both sides by the integrating factor $\mu(x) = e^{\int p(x)\, dx}$; the left-hand side then becomes the derivative $\big(\mu(x)\, y\big)'$, and integrating both sides yields the general solution.
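A minimal sketch of solving such an equation with SymPy; the particular equation $y' + 2y = x$ is an illustrative assumption, not from the slides.

```python
import sympy as sp

x = sp.symbols("x")
y = sp.Function("y")

# y' + p(x) y = q(x) with p(x) = 2 and q(x) = x.
ode = sp.Eq(y(x).diff(x) + 2 * y(x), x)
print(sp.dsolve(ode, y(x)))  # y(x) = C1*exp(-2*x) + x/2 - 1/4
```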

16 Separable Equations
A differential equation of the form
$$\frac{dy}{dx} = f(x, y) \qquad (a)$$
is called separable if $f(x, y) = h(x)\, g(y)$; that is,
$$\frac{dy}{dx} = h(x)\, g(y).$$
In order to solve it, perform the following steps:
(1) Solve the equation $g(y) = 0$, which gives the constant solutions of (a);
(2) Rewrite equation (a) as
$$\frac{dy}{g(y)} = h(x)\, dx.$$

17 Separable Equations
(3) Now we can integrate to obtain
$$\int \frac{dy}{g(y)} = \int h(x)\, dx.$$
(4) Now we can write down all the solutions obtained in (1) and (3). If this is an IVP, we must use the initial condition to find the particular solution.
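The recipe can be checked with SymPy on the illustrative separable equation $dy/dx = x\, y^2$ (so $h(x) = x$ and $g(y) = y^2$); this particular example is an assumption, not from the slides.

```python
import sympy as sp

x = sp.symbols("x")
y = sp.Function("y")

# Step (1): the constant solutions come from g(y) = 0, i.e. y = 0 here.
# Steps (2)-(3): separate and integrate; dsolve does this automatically.
ode = sp.Eq(y(x).diff(x), x * y(x)**2)
print(sp.dsolve(ode, y(x)))  # the non-constant family, e.g. y(x) = -1/(C1 + x**2/2)
```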

