
1 Min Cost Flow: Polynomial Algorithms

2 Overview
Recap: Min Cost Flow, Residual Network, Potential and Reduced Cost
Polynomial Algorithms Approach
Capacity Scaling: Successive Shortest Path Algorithm Recap; Incorporating Scaling
Cost Scaling: Preflow/Push Algorithm Recap; Incorporating Scaling
Double Scaling Algorithm - Idea

3 Min Cost Flow - Recap [Figure: example network on nodes v1..v5 with supplies/demands 5, -2, -3 and arcs labeled (capacity, cost): (4,1), (3,4), (5,1), (1,1), (3,3)]

4 Min Cost Flow - Recap: Compute a feasible flow with minimum cost.

5 Residual Network - Recap

6 Reduced Cost - Recap

7

8 Min Cost Flow: Polynomial Algorithms

9 Approach We have seen several algorithms for the MCF problem, but none polynomial in log U, log C. Idea: scaling! Scale capacity/flow values, costs, or both. Next week: algorithms with running time independent of log U, log C (strongly polynomial); these will solve problems with irrational data.

10 Capacity Scaling

11 Successive Shortest Path - Recap

12

13

14 Algorithm Complexity: Assuming integrality, at most nU iterations. Each iteration computes shortest paths using Dijkstra, bounded by O(m + n log n) per iteration.
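The Dijkstra step works on arcs with negative costs because the node potentials keep every reduced cost c_ij + π(i) − π(j) nonnegative. A minimal sketch (the adjacency encoding and the tiny example are my own, not from the slides):

```python
import heapq

def dijkstra_reduced(n, adj, src, pi):
    """Shortest paths with respect to reduced costs c_ij + pi[i] - pi[j].
    n: node count; adj[u] = list of (v, cost); pi: potentials making
    every reduced cost nonnegative. Returns reduced-cost distances."""
    dist = [float('inf')] * n
    dist[src] = 0
    pq = [(0, src)]
    while pq:
        dv, v = heapq.heappop(pq)
        if dv > dist[v]:          # stale queue entry
            continue
        for w, c in adj[v]:
            nd = dv + c + pi[v] - pi[w]   # reduced cost, >= 0 by assumption
            if nd < dist[w]:
                dist[w] = nd
                heapq.heappush(pq, (nd, w))
    return dist
```

A true shortest distance is then dist[v] + pi[v] - pi[src], and updating pi[v] += dist[v] restores nonnegative reduced costs for the next iteration.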

15 Capacity Scaling - Scheme The Successive Shortest Path Algorithm may push very little in each iteration. Fix idea: use scaling. Modify the algorithm to push Δ units of flow, ignoring arcs with residual capacity < Δ, until there is no node with excess ≥ Δ or no node with deficit ≥ Δ. Then decrease Δ by a factor of 2 and repeat, until Δ < 1.
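The scheme above visits O(log U) values of Δ. A sketch of just the phase schedule, with the per-phase work left as comments (assuming Δ starts at the largest power of two not exceeding U):

```python
def delta_phases(U):
    """Return the Δ values of the capacity-scaling scheme: the largest
    power of two not exceeding U, halved until Δ < 1."""
    delta = 1
    while 2 * delta <= U:
        delta *= 2
    phases = []
    while delta >= 1:
        # Δ-phase: saturate violating arcs with Δ <= r_ij < 2Δ, then
        # augment Δ units along shortest paths in G(x, Δ) from a node
        # in S(Δ) to a node in T(Δ) until one of the sets is empty.
        phases.append(delta)
        delta //= 2
    return phases
```

For integral data the loop ends at Δ = 1, at which point G(x, 1) is the full residual network and the flow is feasible.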

16 Definitions [Figure: residual network G(x) with residual capacities on the arcs]

17 Definitions [Figure: G(x, 3) — the subnetwork of arcs with residual capacity ≥ 3]

18 Main Observation in Algorithm Observation: An augmentation of Δ units must start at a node in S(Δ), follow a path in G(x, Δ), and end at a node in T(Δ). In the Δ-phase we find shortest paths in G(x, Δ) and augment along them. Thus the arcs in G(x, Δ) satisfy the reduced-cost optimality conditions. Arcs with smaller residual capacity are considered later.
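Under the standard definitions (my reading of the Definitions slides: S(Δ) holds nodes with excess at least Δ, T(Δ) holds nodes with deficit at least Δ), the endpoint sets are easy to compute from the excess vector:

```python
def excess_sets(e, delta):
    """Split nodes by excess: S(Δ) = {v : e(v) >= Δ},
    T(Δ) = {v : e(v) <= -Δ}.  e[v] is the excess of node v."""
    S = [v for v, ev in enumerate(e) if ev >= delta]
    T = [v for v, ev in enumerate(e) if ev <= -delta]
    return S, T
```

The Δ-phase ends exactly when one of the two lists becomes empty.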

19 Initializing phases [Figure: arc (i, j)]

20 Capacity Scaling Algorithm

21 Initial values: the zero pseudoflow and zero potential (which are optimal), and an initial Δ that is large enough.

22 Capacity Scaling Algorithm At the beginning of a Δ-phase, fix the optimality condition on the new arcs with residual capacity Δ ≤ r_ij < 2Δ by saturating them.

23 Capacity Scaling Algorithm Augment Δ units along a path in G(x, Δ) from a node in S(Δ) to a node in T(Δ).

24 Capacity Scaling Algorithm - Correctness

25

26 Capacity Scaling Algorithm - Assumption We assume a path from k to l exists in G(x, Δ), and that we can compute shortest distances from k to the rest of the nodes. Quick fix: initially add a dummy node D with artificial arcs (1, D) and (D, 1) of infinite capacity and very large cost.

27 Capacity Scaling Algorithm – Complexity The algorithm has O(log U) phases. We analyze each phase separately.

28 Capacity Scaling Algorithm – Phase Complexity [Figure: excess set E and deficit set D]

29 Capacity Scaling Algorithm – Phase Complexity – Cont.

30

31 Capacity Scaling Algorithm – Complexity

32 Cost Scaling

33 Approximate Optimality

34 Approximate Optimality Properties [Figure: example arcs a, b, c, d with reduced costs 4, -5, 3, -2]

35 Algorithm Strategy

36 Preflow Push Recap

37 Distance Labels Distance labels satisfy: d(t) = 0, d(s) = n, and d(v) ≤ d(w) + 1 whenever r(v,w) > 0. Then d(v) is at most the distance from v to t in the residual network, so s must be disconnected from t.

38 Terms Nodes with positive excess are called active. An arc (v, w) in the residual graph is admissible if d(v) = d(w) + 1.

39 The preflow push algorithm
While there is an active node:
  pick an active node v and push/relabel(v)
Push/relabel(v):
  if there is an admissible arc (v, w):
    push δ = min{e(v), r(v,w)} units of flow from v to w
  else:
    d(v) := min{d(w) + 1 | r(v,w) > 0}   (relabel)

40 Running Time
The # of relabelings is (2n-1)(n-2) < 2n²
The # of saturating pushes is at most 2nm
The # of nonsaturating pushes is at most 4n²m – using the potential Φ = Σ_{v active} d(v)
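The recap above can be sketched as a runnable generic preflow-push max-flow routine (the dictionary-based residual-graph encoding and the active-node order are my own choices, not from the slides):

```python
def preflow_push_max_flow(n, arcs, s, t):
    """Generic preflow-push (Goldberg-Tarjan) max flow.
    n: node count (nodes 0..n-1); arcs: list of (u, v, capacity)."""
    cap = {}                      # residual capacities
    adj = [[] for _ in range(n)]  # adjacency of the residual graph
    for u, v, c in arcs:
        cap[(u, v)] = cap.get((u, v), 0) + c
        cap.setdefault((v, u), 0)
        if v not in adj[u]:
            adj[u].append(v)
        if u not in adj[v]:
            adj[v].append(u)
    d = [0] * n                   # distance labels: d(s) = n, d(t) = 0
    d[s] = n
    e = [0] * n                   # excesses
    for v in adj[s]:              # preflow: saturate all arcs leaving s
        delta = cap[(s, v)]
        if delta > 0:
            cap[(s, v)] -= delta
            cap[(v, s)] += delta
            e[v] += delta
    active = [v for v in range(n) if v not in (s, t) and e[v] > 0]
    while active:
        v = active[-1]
        pushed = False
        for w in adj[v]:
            # admissible arc: positive residual capacity, d(v) = d(w) + 1
            if cap[(v, w)] > 0 and d[v] == d[w] + 1:
                delta = min(e[v], cap[(v, w)])   # push
                cap[(v, w)] -= delta
                cap[(w, v)] += delta
                e[v] -= delta
                e[w] += delta
                if w not in (s, t) and w not in active:
                    active.append(w)
                pushed = True
                if e[v] == 0:
                    break
        if not pushed:            # no admissible arc: relabel
            d[v] = min(d[w] + 1 for w in adj[v] if cap[(v, w)] > 0)
        if e[v] == 0:
            active.remove(v)
    return e[t]
```

On termination all excess has reached t (or drained back to s), so e(t) is the max-flow value.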

41 Back to Min Cost Flow…

42 Applying Preflow Push's technique [Figure: arc (i, j)]

43 Initialization [Figure: arc (v, w) with reduced cost -10]

44

45 Push/relabel until no active nodes exist

46 Correctness Lemma 1: Let x be a pseudoflow and x° a feasible flow. Then for every node v with excess in x there exists a path P in G(x) ending at a node w with deficit, whose reversal is a path in G(x°). Proof: Consider the difference x° − x and the underlying graph (arcs with negative difference are reversed).

47 Lemma 1: Proof Cont. Proof: Consider the difference x° − x and the underlying graph (arcs with negative difference are reversed). [Figure: the two flows on the example network and the resulting difference graph]

48 Lemma 1: Proof Cont. Proof: Consider the difference x° − x and the underlying graph (arcs with negative difference are reversed). [Figure: set S of nodes reachable from v] Some node with deficit must be reachable; otherwise x° isn't feasible.

49 Correctness (cont) Corollary: There is an outgoing residual arc incident to every active vertex, so we can push/relabel as long as there is an active vertex.

50 Correctness – Cont.

51

52

53 Correctness (cont)

54 Complexity Lemma: a node is relabeled at most 3n times.

55 Lemma 2 – Cont.

56

57

58 Complexity Analysis (Cont.) Lemma: The # of saturating pushes is at most O(nm). Proof: same as in Preflow Push.

59 Complexity Analysis – non Saturating Pushes Def: The admissible network is the graph of admissible edges. [Figure: residual network with reduced costs on the arcs]

60 Complexity Analysis – non Saturating Pushes Def: The admissible network is the graph of admissible edges. [Figure: the admissible arcs of the example]

61 Complexity Analysis – non Saturating Pushes Def: The admissible network is the graph of admissible edges. Lemma: The admissible network is acyclic throughout the algorithm. Proof: induction.

62 Complexity Analysis – non Saturating Pushes – Cont. Lemma: The # of nonsaturating pushes is O(n²m). Proof: Let g(i) be the # of nodes reachable from i in the admissible network, and let Φ = Σ_{i active} g(i).

63 Complexity Analysis – non Saturating Pushes – Cont. Φ = Σ_{i active} g(i). By acyclicity, Φ decreases (by at least one) with every nonsaturating push. [Figure: push from i to j]

64 Complexity Analysis – non Saturating Pushes – Cont. Φ = Σ_{i active} g(i). Initially g(i) = 1. A saturating push increases Φ by at most n: total increase O(n²m). Each relabeling increases Φ by at most n (no incoming arcs become admissible): total increase O(n³). Hence O(n²m) nonsaturating pushes.

65 Cost Scaling Summary Total complexity O(n²m log(nC)). This can be improved using the ideas used to speed up preflow push.

66 Double Scaling (If Time Permits)

67 Double Scaling Idea

68 Network Transformation [Figure: arc (i, j) with cost C_ij, capacity u_ij, flow x_ij is replaced by a new node (i, j) with supply -u_ij and two uncapacitated arcs: i → (i, j) with cost C_ij carrying x_ij, and j → (i, j) with cost 0 carrying r_ij]
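A sketch of this transformation as I read the slide (it matches the standard construction for removing capacities; the b(j) adjustment is implied by flow conservation at the new node):

```python
def remove_capacities(n, b, arcs):
    """Turn a capacitated min-cost-flow instance into an uncapacitated one.
    Each arc (i, j, cost, cap) becomes a new node k with b[k] = -cap and
    uncapacitated arcs (i, k, cost) and (j, k, 0); b[j] grows by cap so
    that x_ik = x_ij and x_jk = cap - x_ij balance at k and j."""
    b = list(b)                   # copy: node supplies/demands
    new_arcs = []
    for (i, j, c, u) in arcs:
        k = len(b)                # index of the new arc-node (i, j)
        b.append(-u)
        b[j] += u
        new_arcs.append((i, k, c))
        new_arcs.append((j, k, 0))
    return len(b), b, new_arcs
```

The result is bipartite: original nodes on one side, arc-nodes on the other, which is what the double scaling analysis exploits.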

69 Improve-Approximation Initialization [Figure: bipartite network with sides N1 and N2]

70 Capacity Scaling - Scheme

71 Double Scaling - Correctness Assuming the algorithm ends, correctness is immediate, since we augment along admissible paths. (A residual path from an excess node to a deficit node always exists – see the cost scaling algorithm.) We relabel only when there are indeed no outgoing admissible arcs.

72 Double Scaling - Complexity O(log U) phases. In each phase, each augmentation clears a node from S(Δ) and doesn't introduce a new one, so there are O(m) augmentations per phase.

73 Double Scaling Complexity – Cont. In each augmentation we find a path of length at most 2n (the admissible network is acyclic and bipartite). We also need to account for retreats.

74 Double Scaling Complexity – Cont. In each retreat we relabel. By the lemma above, the potential cannot grow by more than O(l), where l is the length of a path to a deficit node. Since the graph is bipartite, l = O(n). So over all augmentations there are O(n(m+n)) = O(mn) retreats. [Figure: bipartite network with sides N1 and N2]

75 Double Scaling Complexity – Cont. To sum up: we implemented improve-approximation using capacity scaling. O(log U) phases, O(m) augmentations per phase, O(n) nodes per path, O(mn) node retreats in total. Total complexity: O(mn log U log(nC)).

76 Thank You

