
1 Easy Optimization Problems, Relaxation, Local Processing for a small subset of variables

2 Different types of relaxation
• Variable-by-variable relaxation – strict minimization
• Changing a small subset of variables simultaneously – window strict minimization relaxation
• Stochastic relaxation – may increase the energy; should be followed by strict minimization
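As a rough illustration of the first bullet, the sketch below performs variable-by-variable strict minimization (Gauss–Seidel sweeps) on an assumed quadratic energy E(x) = ½xᵀAx − bᵀx; the function name and test matrix are made up for illustration, not taken from the slides.

```python
import numpy as np

def variable_by_variable_relaxation(A, b, x, sweeps=10):
    """One-variable-at-a-time strict minimization of E(x) = 0.5 x^T A x - b^T x.

    Setting dE/dx_i = 0 while all other variables are held fixed gives
        x_i = (b_i - sum_{j != i} A_ij x_j) / A_ii,
    i.e. a Gauss-Seidel sweep over the variables.
    """
    n = len(b)
    for _ in range(sweeps):
        for i in range(n):
            r = b[i] - A[i, :] @ x + A[i, i] * x[i]   # residual excluding x_i
            x[i] = r / A[i, i]
    return x

# Illustrative example: a small symmetric positive-definite energy.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 0.0])
x = variable_by_variable_relaxation(A, b, np.zeros(3))
print(x, np.linalg.solve(A, b))   # the sweeps approach the exact minimizer
```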

3 Easy to solve problems
• Quadratic functional and linear constraints
  – Solve a linear system of equations
  – Quadratization of the functional: P=1, P>2
  – Linearization of the constraints: P=2
  – Inequality constraints: active set method
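A minimal sketch of the first case, under the assumed form E(x) = ½xᵀAx − bᵀx with equality constraints Cx = d (the setup and names are illustrative): the constrained minimum comes from one linear (KKT) system in x and the Lagrange multipliers.

```python
import numpy as np

def solve_equality_constrained_qp(A, b, C, d):
    """Minimize 0.5 x^T A x - b^T x subject to C x = d.

    Stationarity of L(x, lam) = E(x) + lam^T (C x - d) gives the KKT system
        [A  C^T] [x  ]   [b]
        [C   0 ] [lam] = [d]
    """
    n, m = A.shape[0], C.shape[0]
    K = np.block([[A, C.T],
                  [C, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([b, d]))
    return sol[:n], sol[n:]           # x and the Lagrange multipliers

# Illustrative example: minimize x^2 + y^2 subject to x + y = 1.
A = 2 * np.eye(2)
b = np.zeros(2)
C = np.array([[1.0, 1.0]])
d = np.array([1.0])
x, lam = solve_equality_constrained_qp(A, b, C, d)
print(x, lam)                         # x = [0.5, 0.5], lam = [-1.0]
```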


5 Data structure
For each node in the graph keep:
1. A list of all the graph's neighbors; for each neighbor keep a pair of index and weight
2. …
3. …
4. Its current placement
5. The unique square in the grid the node belongs to
For each square in the grid keep:
1. A list of all the nodes which are mostly within it – defines the current physical neighborhood
2. The total amount of material within the square
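A possible in-code rendering of this data structure (field names are illustrative guesses; the elided items 2–3 are left out because the transcript does not show them):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    # Graph neighbors stored as (neighbor index, edge weight) pairs.
    neighbors: list[tuple[int, float]] = field(default_factory=list)
    # Current placement, e.g. 2D coordinates in the drawing.
    placement: tuple[float, float] = (0.0, 0.0)
    # Index of the unique grid square the node currently belongs to.
    square: int = -1

@dataclass
class Square:
    # Nodes that are mostly inside this square; together they define
    # the current physical neighborhood used for local processing.
    nodes: list[int] = field(default_factory=list)
    # Total amount of material within the square, e.g. sum of node areas.
    material: float = 0.0
```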

6 Graph drawing: most graph drawing algorithms

7 Graph drawing: most graph drawing algorithms, with space utilization

8 The error of the compressed 32x32 grid graph

9 Start the equi-density with a 2x2 grid

10 Continue the equi-density with a 4x4 grid

11 8x8 grid

12 The final result was obtained after equi-density in a 32x32 grid

13 5-level binary tree with non-uniform vertices

14 Inequality constraints
Given E(x) subject to m inequality constraints g_k(x) ≤ 0, k = 1,…,m, construct the Lagrangian while taking into account only the unsatisfied constraints. This is the active set A, i.e., for k in A
L(x, λ) = E(x) + Σ_{k∈A} λ_k g_k(x)
and solve the system of n+|A| equations as with equality constraints.
• At the solution, g_k(x) = 0 for k in A
• Should probably take only a fraction of the solution step
• Should consider constraints only when λ_k > 0; these are the binding constraints
• Iterate
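The sketch below is one possible reading of this active-set loop for a quadratic E with linear inequality constraints; the update rule, tolerances, and test problem are assumptions, and there is no step-length control.

```python
import numpy as np

def kkt_solve(A, b, C, d):
    """Minimize 0.5 x^T A x - b^T x subject to C x = d (C may have zero rows)."""
    n, m = A.shape[0], C.shape[0]
    K = np.block([[A, C.T], [C, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([b, d]))
    return sol[:n], sol[n:]

def active_set_qp(A, b, G, h, max_iter=50):
    """Simplified active-set loop for: min 0.5 x^T A x - b^T x  s.t.  G x <= h.

    Active constraints are treated as equalities; each iteration either adds
    the most violated constraint or drops one whose multiplier is negative.
    A sketch, not a production solver.
    """
    active = []                                 # indices of the active constraints
    x, lam = np.zeros_like(b), np.array([])
    for _ in range(max_iter):
        x, lam = kkt_solve(A, b, G[active], h[active])
        violation = G @ x - h
        worst = int(np.argmax(violation))
        if violation[worst] > 1e-9 and worst not in active:
            active.append(worst)                # a violated constraint becomes active
        elif len(active) and lam.min() < -1e-9:
            active.pop(int(np.argmin(lam)))     # non-binding constraint: drop it
        else:
            break                               # feasible, and all multipliers >= 0
    return x, lam, active

# Illustrative test: minimize (x-2)^2 + (y-2)^2 subject to x + y <= 2.
A, b = 2 * np.eye(2), np.array([4.0, 4.0])
G, h = np.array([[1.0, 1.0]]), np.array([2.0])
print(active_set_qp(A, b, G, h))   # x = (1, 1), multiplier 2 > 0: binding constraint
```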

15 Exc#5: Lagrange multipliers, inequality constraints
Minimize x² + y² subject to x + 2y < 1 and 1/2 − y < 0, starting at (1, 1/4).
1) Find the minimum
2) Calculate the Lagrange multipliers
3) Which constraint is binding? Explain.
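For an optional numerical cross-check of the exercise (the solver choice is an assumption, not part of the slides), SciPy's SLSQP handles the two inequality constraints directly:

```python
from scipy.optimize import minimize

# Exc#5: minimize x^2 + y^2 subject to x + 2y <= 1 and 1/2 - y <= 0.
# SciPy expects inequality constraints in the form c(x) >= 0.
constraints = [
    {"type": "ineq", "fun": lambda v: 1 - v[0] - 2 * v[1]},   # x + 2y <= 1
    {"type": "ineq", "fun": lambda v: v[1] - 0.5},            # y >= 1/2
]
res = minimize(lambda v: v[0] ** 2 + v[1] ** 2, x0=[1.0, 0.25],
               method="SLSQP", constraints=constraints)
print(res.x, res.fun)   # compare with the hand-derived minimizer and multipliers
```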

16 Easy to solve problems
• Quadratic functional and linear constraints
  – Solve a linear system of equations
  – Quadratization of the functional: P=1, P>2
  – Linearization of the constraints: P=2
  – Inequality constraints: active set method
• Linear functional and linear constraints
  – Linearization of the quadratic functional

17 Linear programming
Minimize/maximize a linear function under equality/inequality linear constraints.
• Standard form: maximize Z = cᵀx subject to Ax ≤ b, x ≥ 0
• The region satisfying all the constraints is the feasible region, and it is convex
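A small sketch with SciPy's linprog on a made-up LP (linprog minimizes, so the maximization is negated):

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative LP: maximize Z = 3x + 2y  subject to  x + y <= 4, x <= 2, x, y >= 0.
c = np.array([-3.0, -2.0])          # negated objective, because linprog minimizes
A_ub = np.array([[1.0, 1.0],
                 [1.0, 0.0]])
b_ub = np.array([4.0, 2.0])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)              # optimum at a corner of the feasible polyhedron
```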

18 Linear programming (cont.)
• The maximum of a convex function over a convex polyhedron region is attained at one of its extreme, corner points

19 Convex set (region) and function
[Figure: examples of a convex region, a non-convex region, and a convex polyhedron]
A convex function F(x) satisfies:
F(λx₁ + (1−λ)x₂) ≤ λF(x₁) + (1−λ)F(x₂) for all x₁, x₂ and 0 ≤ λ ≤ 1

20 The basic mechanism of the simplex method: A simple example


22 Linear programming (cont.)
• The optimum of a linear function over a convex polyhedron region is at its extreme, corner points
• The maximum of a linear function cannot be in the interior, since there is always a positive gradient direction
• Follow this direction until hitting the boundary
• Keep going along the boundary until hitting a corner point
• Go from one corner point to another
• At one of the corner points the global maximum is attained
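A brute-force illustration of the corner-point property on an assumed tiny LP: enumerate the intersections of pairs of constraint boundaries, keep the feasible ones, and take the best corner.

```python
import itertools
import numpy as np

# Illustrative LP: maximize Z = 3x + 2y subject to
#   x + y <= 4,  x <= 2,  x >= 0,  y >= 0.
c = np.array([3.0, 2.0])
A = np.array([[1.0, 1.0],            # each row is one inequality  A x <= b
              [1.0, 0.0],
              [-1.0, 0.0],
              [0.0, -1.0]])
b = np.array([4.0, 2.0, 0.0, 0.0])

# Corner points are intersections of two constraint boundaries that
# also satisfy all the remaining inequalities.
corners = []
for i, j in itertools.combinations(range(len(b)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:
        continue                      # parallel boundaries: no intersection
    v = np.linalg.solve(M, b[[i, j]])
    if np.all(A @ v <= b + 1e-9):
        corners.append(v)

best = max(corners, key=lambda v: c @ v)
print(best, c @ best)                 # the maximum is attained at a corner point
```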


24 Linear programming (cont.)
• The number of corner points is finite
• The global maximum is at the corner point at which Z(x) is greater than or equal to the value of Z at all adjacent corner points
• The simplex method (Dantzig 1948) starts at a feasible corner point and visits a sequence of corner points until a maximum is obtained
• The number of iterations is almost always O(M) or O(N), whichever is larger, but it can become exponential in pathological cases

