Some useful Contraction Mappings
Network Systems Lab., Korea Advanced Institute of Science and Technology
(Presentation transcript)

Slide 1: Some useful Contraction Mappings
• Results for a particular choice of norms
• Prop. 1.12

Slide 2: Some useful Contraction Mappings
• Prop. 1.13. Assume the following: (assumptions not captured in the transcript)

Slide 3: Unconstrained Optimization
• Jacobi algorithm (a generalization of the JOR method for linear equations)
• Gauss-Seidel algorithm (a generalization of the SOR method for linear equations)
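The slide's formulas are not captured, so as a point of reference, here is a minimal Python sketch of the coordinate-minimization form of the two iterations. The function names and the use of scipy's one-dimensional minimizer are illustrative assumptions, not notation from the slides.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def jacobi_step(f, x):
    """One Jacobi iteration: every coordinate is minimized in
    parallel, each subproblem using the OLD values of the others."""
    x_new = x.copy()
    for i in range(len(x)):
        def f_i(xi, i=i):
            y = x.copy()              # all other coordinates stay at x(t)
            y[i] = xi
            return f(y)
        x_new[i] = minimize_scalar(f_i).x
    return x_new

def gauss_seidel_step(f, x):
    """One Gauss-Seidel sweep: coordinates are minimized one at a
    time, each subproblem using the freshly updated values."""
    x = x.copy()
    for i in range(len(x)):
        def f_i(xi, i=i):
            y = x.copy()              # includes updates already made
            y[i] = xi
            return f(y)
        x[i] = minimize_scalar(f_i).x
    return x

# Example on a strictly convex quadratic; both iterations drive x to 0.
f = lambda z: z[0]**2 + 2*z[1]**2 + z[0]*z[1]
x = np.array([1.0, 1.0])
for _ in range(20):
    x = gauss_seidel_step(f, x)
```

The only difference between the two functions is which iterate the one-dimensional subproblems see: the old iterate (Jacobi, parallelizable) or the partially updated one (Gauss-Seidel, sequential).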

Slide 4:
• Gradient algorithm (a generalization of Richardson's method for linear equations)
• Gauss-Seidel variant of the gradient algorithm
• The four algorithms above are called descent algorithms; the gradient algorithm in particular is known as the steepest descent algorithm.
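A hedged sketch of these two iterations (the stepsize gamma and the function names are assumptions; the slide's own formulas are not captured):

```python
import numpy as np

def gradient_step(grad_f, x, gamma):
    """Gradient (steepest descent) iteration: x := x - gamma * grad_f(x)."""
    return x - gamma * grad_f(x)

def gs_gradient_step(grad_f, x, gamma):
    """Gauss-Seidel variant: coordinates are updated one at a time,
    each using the partial derivative evaluated at the partially
    updated iterate rather than at the start of the sweep."""
    x = x.copy()
    for i in range(len(x)):
        x[i] -= gamma * grad_f(x)[i]
    return x
```

The Gauss-Seidel variant re-evaluates the gradient after every coordinate update, the same fresh-information idea as in the Gauss-Seidel method for linear equations.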

Slide 5:
• Descent direction Θ (the slide's equations are not captured in the transcript)
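The missing equations are presumably the standard definition, with Θ the angle between the search direction and the negative gradient:

```latex
% s is a descent direction of f at x iff it makes an acute angle
% \Theta with the negative gradient:
\[
  \nabla f(x)^{\top} s < 0
  \quad\Longleftrightarrow\quad
  \cos\Theta \;=\; \frac{-\,\nabla f(x)^{\top} s}{\|\nabla f(x)\|\,\|s\|} \;>\; 0 ,
\]
% in which case f(x + \gamma s) < f(x) for all sufficiently small \gamma > 0.
```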

Slide 6:
• Scaled gradient algorithm
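A hedged reconstruction of the iteration (the symbol M(t) for the scaling matrix is an assumption of notation):

```latex
% Scaled gradient iteration with a symmetric positive definite
% scaling matrix M(t):
\[
  x(t+1) \;=\; x(t) \;-\; \gamma\,\big(M(t)\big)^{-1} \nabla f\big(x(t)\big),
  \qquad M(t) \succ 0 .
\]
% M(t) = I recovers the gradient algorithm; M(t) = \nabla^2 f(x(t))
% gives Newton's method.
```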

Slide 7: Newton and Approximate Newton Methods
• Even in the nonquadratic case, Newton's algorithm converges much faster (under certain assumptions) than the previously introduced algorithms, particularly in a neighborhood of the optimal solution [OrR 70].
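A minimal sketch of the Newton iteration; a line search or stepsize safeguard, omitted here, would be needed far from the solution:

```python
import numpy as np

def newton_step(grad_f, hess_f, x, gamma=1.0):
    """Newton iteration: x := x - gamma * [hess f(x)]^{-1} grad f(x).
    Solving the linear system avoids forming the inverse explicitly."""
    return x - gamma * np.linalg.solve(hess_f(x), grad_f(x))
```

For a quadratic cost with positive definite Hessian and gamma = 1, a single step lands exactly on the minimizer, which is the sense in which Newton's method dominates the earlier algorithms near the solution.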

Slide 8:
• The Jacobi algorithm can be viewed as an approximation of Newton's algorithm in which the off-diagonal entries of the Hessian ∇²f(x) are ignored.
• Approximate Newton method
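A sketch of this diagonal (Jacobi-style) approximation; keeping only the diagonal of the Hessian turns the Newton linear system into n independent scalar divisions:

```python
import numpy as np

def approx_newton_step(grad_f, hess_f, x, gamma=1.0):
    """Approximate Newton step: off-diagonal Hessian entries are
    ignored, so each coordinate is scaled by its own curvature."""
    d = np.diag(hess_f(x))            # diagonal second derivatives
    return x - gamma * grad_f(x) / d
```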

Slide 9: Convergence Analysis Using the Descent Approach
• Assumption 2.1
• Lemma 2.1 (Descent Lemma)
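The slide's formulas are not captured; the standard statements these labels usually refer to, assuming ∇f is Lipschitz continuous with constant K, are:

```latex
% Assumption 2.1 (Lipschitz continuity of the gradient):
\[
  \|\nabla f(x) - \nabla f(y)\| \;\le\; K\,\|x - y\|
  \qquad \forall\, x, y \in \mathbb{R}^{n}.
\]
% Lemma 2.1 (Descent Lemma): under the assumption above,
\[
  f(x + y) \;\le\; f(x) \;+\; y^{\top}\nabla f(x) \;+\; \frac{K}{2}\,\|y\|^{2}.
\]
```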

Slide 10:
• Prop. 2.1 (Convergence of Descent Algorithms)
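A hedged reconstruction of the standard form of this result (the constants K₁, K₂ are assumptions of notation):

```latex
% Generic descent iteration and sufficient conditions on the
% direction s(t):
\[
  x(t+1) = x(t) + \gamma\, s(t), \qquad
  K_1 \|\nabla f(x(t))\|^{2} \;\le\; -\,\nabla f(x(t))^{\top} s(t), \qquad
  \|s(t)\| \;\le\; K_2\,\|\nabla f(x(t))\| .
\]
% Conclusion: for a sufficiently small constant stepsize \gamma,
% f(x(t)) converges and \nabla f(x(t)) \to 0.
```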

Slide 11: (equations only; not captured in the transcript)

Slide 12: (equations only; not captured in the transcript)

Slide 13:
• Show that the Jacobi, gradient, scaled gradient, Newton, and approximate Newton algorithms satisfy the conditions of Prop. 2.1 (under certain assumptions), which implies that ∇f(x(t)) → 0 for these algorithms.

Slide 14:
• Gradient algorithm
• Scaled gradient algorithm
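For the plain gradient algorithm the verification is immediate: with s(t) = −∇f(x(t)), both conditions of Prop. 2.1 hold with K₁ = K₂ = 1:

```latex
\[
  s(t) = -\nabla f(x(t))
  \;\;\Longrightarrow\;\;
  -\,\nabla f(x(t))^{\top} s(t) = \|\nabla f(x(t))\|^{2},
  \qquad
  \|s(t)\| = \|\nabla f(x(t))\| .
\]
% For the scaled gradient algorithm the same bounds hold with K_1, K_2
% determined by the extreme eigenvalues of the scaling matrices M(t),
% assuming those eigenvalues are uniformly bounded above and below.
```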

Slide 15:
• Prop. 2.2 (Convergence of the Gauss-Seidel Algorithm)

Slide 16: (equations only; not captured in the transcript)

Slide 17:
• The case of a convex cost function
• Prop. 2.3 (Convergence of Descent Methods in Convex Optimization)
• Prop. 2.4 (Geometric Convergence for Strictly Convex Optimization)

Slide 18: (equations only; not captured in the transcript)

Slide 19: Convexity
• Definition A.13
• (Figures: a convex set and a non-convex set; a strictly convex function, a function that is convex but not strictly convex, and a non-convex function.)
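The standard definitions behind Definition A.13 and the figures (a hedged reconstruction, since the slide's formulas are images):

```latex
% A set C \subseteq \mathbb{R}^n is convex iff
\[
  \lambda x + (1-\lambda) y \in C
  \qquad \forall\, x, y \in C,\;\; \lambda \in [0,1].
\]
% A function f : C \to \mathbb{R} on a convex set C is convex iff
\[
  f\big(\lambda x + (1-\lambda)y\big)
  \;\le\; \lambda f(x) + (1-\lambda) f(y),
\]
% and strictly convex if the inequality is strict whenever x \ne y
% and \lambda \in (0,1).
```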

Slide 20: Convexity (Cont’d)
• Proposition A.35
  • A linear function is convex
  • The weighted sum of convex functions with positive weights is convex
  • Any vector norm is convex
• Proposition A.36 (statement not captured in the transcript)
• Proposition A.39 (statement not captured in the transcript)

Slide 21: Convexity (Cont’d)
• Proposition A.40 (statement not captured in the transcript)
• Proposition A.41 (Strong Convexity)
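Strong convexity, as in Proposition A.41, is usually stated as follows for continuously differentiable f (the modulus α is an assumption of notation):

```latex
% f is strongly convex with modulus \alpha > 0 iff, for all x, y,
\[
  \big(\nabla f(x) - \nabla f(y)\big)^{\top}(x - y) \;\ge\; \alpha\,\|x - y\|^{2},
\]
% equivalently
\[
  f(y) \;\ge\; f(x) + \nabla f(x)^{\top}(y - x) + \frac{\alpha}{2}\,\|y - x\|^{2}.
\]
```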

Slide 22: Constrained Optimization
• Proposition 3.1 (Optimality Condition)
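The standard first-order optimality condition for minimizing f over a closed convex set X, which is what Proposition 3.1 usually denotes:

```latex
% If x^* minimizes f over the convex set X, then the gradient makes a
% nonnegative inner product with every feasible direction:
\[
  \nabla f(x^{*})^{\top} (x - x^{*}) \;\ge\; 0 \qquad \forall\, x \in X,
\]
% and when f is convex this condition is also sufficient.
```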

Slide 23: Constrained Optimization (Cont’d)
• Proposition 3.2 (Projection Theorem)
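A hedged statement of the projection theorem for a nonempty closed convex set X ⊆ ℝⁿ:

```latex
% (a) For every x there is a unique projection
\[
  [x]^{+} \;=\; \arg\min_{z \in X} \|z - x\| ,
\]
% (b) characterized by the variational inequality
\[
  \big(x - [x]^{+}\big)^{\top}\big(z - [x]^{+}\big) \;\le\; 0
  \qquad \forall\, z \in X,
\]
% (c) and the projection mapping is nonexpansive:
\[
  \big\|[x]^{+} - [y]^{+}\big\| \;\le\; \|x - y\| .
\]
```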

Slide 24: Proof of Prop. 3.2 (equations not captured in the transcript)

Slide 25: Gradient Projection Algorithm (equations not captured in the transcript)
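A minimal sketch of the iteration x(t+1) = [x(t) − γ∇f(x(t))]⁺ for the simple case of box constraints, where the projection is coordinatewise clipping (the box is an illustrative choice of X, not from the slides):

```python
import numpy as np

def gradient_projection_step(grad_f, x, gamma, lo, hi):
    """Gradient projection iteration x := [x - gamma * grad_f(x)]^+ ,
    with [.]^+ the Euclidean projection onto the box {lo <= x <= hi},
    which reduces to coordinatewise clipping."""
    return np.clip(x - gamma * grad_f(x), lo, hi)

# Example: minimize ||x - c||^2 over the unit box; the iterates
# converge to the projection of c onto the box, here [1.0, 0.0].
c = np.array([2.0, -0.5])
grad_f = lambda x: 2.0 * (x - c)
x = np.zeros(2)
for _ in range(50):
    x = gradient_projection_step(grad_f, x, 0.1, 0.0, 1.0)
```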

Slide 26: Proposition 3.3
• Assumption 3.1: same as Assumption 2.1 (as in unconstrained optimization)
• Prop. 3.3 (Properties of the gradient projection mapping)

Slide 27: Proof of Proposition 3.3
• Proof of Proposition 3.3 (a)

Slide 28: Proof of Proposition 3.3
• Proof of Proposition 3.3 (b)
• Proof of Proposition 3.3 (c)

Slide 29: Proposition 3.4
• Convergence of the Gradient Projection Algorithm
• Proof: see Proposition 3.3.

Slide 30: Proposition 3.5
• Geometric convergence for strongly convex problems

Slide 31: Scaled Gradient Projection Algorithms (equations not captured in the transcript)

Slide 32:
• Proposition 3.7

Slide 33: The Case of a Product Constraint Set: Parallel Implementations (equations not captured in the transcript)
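When X is a Cartesian product, projection onto X decomposes blockwise, which is what makes the parallel implementation possible (a hedged reconstruction of the slide's setup):

```latex
% X = X_1 \times \cdots \times X_m with x = (x_1, \dots, x_m):
\[
  [x]^{+} \;=\; \Big( [x_1]^{+}_{X_1},\; \dots,\; [x_m]^{+}_{X_m} \Big),
\]
% so the gradient projection iteration splits into m independent
% block updates that can be executed in parallel:
\[
  x_i(t+1) \;=\; \Big[\, x_i(t) - \gamma\, \nabla_i f\big(x(t)\big) \,\Big]^{+}_{X_i},
  \qquad i = 1, \dots, m .
\]
```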

Slide 34:
• The assumption that X is a Cartesian product opens up the possibility of a Gauss-Seidel version of the gradient projection algorithm.
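A hedged sketch of one Gauss-Seidel sweep over the blocks (the block index list `blocks` and the per-block projector `project` are illustrative assumptions, not names from the slides):

```python
import numpy as np

def gs_gradient_projection_sweep(grad_f, x, gamma, blocks, project):
    """One Gauss-Seidel sweep of the gradient projection algorithm
    over X = X_1 x ... x X_m: block i is stepped and projected onto
    X_i using the gradient at the partially updated iterate."""
    x = x.copy()
    for i, idx in enumerate(blocks):      # idx: index array of block i
        g = grad_f(x)                     # fresh gradient for each block
        x[idx] = project(i, x[idx] - gamma * g[idx])
    return x
```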

Slide 35: Proposition 3.8
• Convergence of the Gauss-Seidel Gradient Projection Algorithm

