Ford-Fulkerson Recap.


1 Ford-Fulkerson Recap

2 Residual Flow Network
[Figure: network G on vertices s, v, w, t, with each edge labeled c(e) (f(e)), here 5 (3), 2 (0), 3 (3); the residual graph Gf is drawn alongside]

3 Residual Flow Network
Forward edge: capacity c(e) - f(e). Backward edge: capacity f(e).
[Figure: G as before, with Gf under construction]

4 Residual Flow Network
[Figure: Gf under construction, with the first residual edges added]

5 Residual Flow Network
[Figure: Gf under construction, with further residual edges added]

6 Residual Flow Network
[Figure: Gf under construction, with the remaining residual edges added]

7 Residual Flow Network
Delete all edges with residual capacity 0!
[Figure: the completed residual graph Gf, with all zero-capacity edges removed]
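The construction on slides 2-7 can be sketched in Python. The edge-dict representation below is a hypothetical choice for illustration, not something from the slides:

```python
def residual_graph(capacity, flow):
    """Residual capacities cf of Gf, keeping only edges with cf(e) > 0.

    capacity, flow: dicts mapping a directed edge (u, v) to c(e) and f(e).
    (This edge-dict representation is an assumption for illustration.)
    """
    cf = {}
    for (u, v), c in capacity.items():
        f = flow.get((u, v), 0)
        if c - f > 0:                        # forward edge: capacity c(e) - f(e)
            cf[(u, v)] = cf.get((u, v), 0) + (c - f)
        if f > 0:                            # backward edge: capacity f(e)
            cf[(v, u)] = cf.get((v, u), 0) + f
    return cf
```

Edges whose residual capacity would be 0 (a saturated forward edge, an empty backward edge) are simply never added, which matches "delete all edges with residual capacity 0".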

8 Special cases
A partially used edge u→v labeled 6 (4) in G contributes a forward edge of capacity 2 and a backward edge of capacity 4 to Gf.
A saturated edge u→v labeled 6 (6) contributes only a backward edge v→u of capacity 6.
An empty edge u→v labeled 6 (0) contributes only a forward edge u→v of capacity 6.

9 Antiparallel edges
[Figure: G with antiparallel edges u→v labeled 6 (4) and v→u labeled 3 (2), and the resulting residual edges in Gf with capacities 2, 1, and 4]

10 Ford-Fulkerson Algorithm
Start: f(e) = 0 for all edges e. Compute Gf.
While Gf contains an s-t path:
    Let P be any simple s-t path in Gf.
    Let Δ(P) = min over e in P of cf(e).
    For each edge e = (u, v) on P:
        If e is a forward edge, increase f(e) by Δ(P).
        Otherwise, let e' = (v, u) be the edge of G corresponding to e; reduce f(e') by Δ(P).
    Recompute Gf.
Iterate: improve the flow, then re-compute Gf.

11 Ford-Fulkerson Algorithm
(Same pseudocode as slide 10.)
Flow update: push Δ(P) units of flow along P.

12 Ford-Fulkerson Algorithm
(Same pseudocode as slide 10.)
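The loop above can be sketched as a compact runnable implementation. The adjacency-dict layout and the DFS path search are assumptions (the slides allow any s-t path in Gf):

```python
from collections import defaultdict

def ford_fulkerson(capacity, s, t):
    """Max-flow value; capacity maps a directed edge (u, v) to c(e)."""
    # Residual capacities; a backward edge starts at 0 and grows as flow is pushed.
    residual = defaultdict(dict)
    for (u, v), c in capacity.items():
        residual[u][v] = residual[u].get(v, 0) + c
        residual[v].setdefault(u, 0)

    def any_path():
        # Any simple s-t path through edges of positive residual capacity (DFS).
        stack, seen = [(s, [s])], {s}
        while stack:
            u, path = stack.pop()
            if u == t:
                return path
            for v, cf in residual[u].items():
                if cf > 0 and v not in seen:
                    seen.add(v)
                    stack.append((v, path + [v]))
        return None

    value = 0
    while (path := any_path()) is not None:
        delta = min(residual[u][v] for u, v in zip(path, path[1:]))  # bottleneck Δ(P)
        for u, v in zip(path, path[1:]):
            residual[u][v] -= delta      # use up residual capacity along P
            residual[v][u] += delta      # the reverse edge gains capacity Δ(P)
        value += delta
    return value
```

Updating the residual capacities directly folds the two pseudocode cases (forward edge vs. backward edge) into one symmetric update.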

13 We Saw
The flow is always feasible and integral.
The algorithm terminates.
Each iteration takes O(m) time.
The flow value increases by at least 1 in each iteration, so the number of iterations is at most the sum of the capacities.
Still missing from the correctness proof: optimality.
Running time: we want faster.

14 Is the Flow Optimal?
[Figure: the example network on s, v, w, t with edges labeled 5 (1), 2 (2), 3 (3), 2 (2)]

15 Is the Flow Optimal?
[Figure: a network with edge capacities 1 and 100, shown empty and then carrying flow 1 (1) and 100 (2)]

16 Is the Flow Optimal?
[Figure: the same example redrawn, with edges labeled 1 (1) and 100 (2)]

17 Ford-Fulkerson Algorithm
(Same pseudocode as slide 10.)

18 Ford-Fulkerson Algorithm
(Same pseudocode as slide 10, shown again before changing the choice of path.)

19 Edmonds-Karp Algorithm
Start: f(e) = 0 for all edges e. Compute Gf.
While Gf contains an s-t path:
    Let P be the shortest simple s-t path in Gf.
    Let Δ(P) = min over e in P of cf(e).
    For each edge e = (u, v) on P:
        If e is a forward edge, increase f(e) by Δ(P).
        Otherwise, let e' = (v, u) be the edge of G corresponding to e; reduce f(e') by Δ(P).
    Recompute Gf.
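Edmonds-Karp changes only the path-selection step: breadth-first search instead of an arbitrary search. A sketch of that one piece, assuming a hypothetical residual representation as a dict mapping u to {v: cf(u, v)}:

```python
from collections import deque

def shortest_augmenting_path(residual, s, t):
    """Shortest s-t path (fewest edges) through positive-capacity residual edges.

    residual: assumed dict mapping u -> {v: cf(u, v)}; returns a vertex list or None.
    """
    parent = {s: None}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            break
        for v, cf in residual.get(u, {}).items():
            if cf > 0 and v not in parent:
                parent[v] = u            # first discovery = shortest distance (BFS)
                queue.append(v)
    if t not in parent:
        return None                      # no augmenting path: the flow is maximum
    path, v = [], t
    while v is not None:
        path.append(v)
        v = parent[v]
    return path[::-1]                    # s, ..., t
```

Because BFS discovers each vertex at its minimum distance from s, the returned path is a shortest simple s-t path in Gf, as the algorithm requires.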

20 Changes to Gf
[Figure: Gf with an augmenting path P from s to t highlighted]

21 Changes to Gf
Claim: only edges of P may disappear, and at least one of them has to disappear.
The only new edges are anti-parallel to edges of P.
[Figure: Gf' after augmenting along P]

22 Claim: the distance from s to t in the residual graph never decreases

23 (The claim, repeated; the proof is illustrated on the following slides.)
24 BFS from s in Gf
d is the length of the augmenting path.
[Figure: BFS layers of Gf from s, with t in the last layer]

25 Forward-looking edges
Forward-looking edges go from one BFS layer to the next; a shortcut edge would skip ahead more than one layer.
[Figure: the BFS layering with forward-looking edges and a would-be shortcut edge marked]

26 Backward-looking edges
Backward-looking edges go from a layer back to an earlier layer.
[Figure: the BFS layering with backward-looking edges marked]

27 Sideways-looking edges
Sideways-looking edges stay within a single layer.
[Figure: the BFS layering with sideways-looking edges marked]

28 BFS from s in Gf
[Figure: augmenting along a shortest path through the layers]

29 BFS from s in Gf
[Figure: the layering after the augmentation]

30 BFS from s in Gf
We do not create shortcut edges, so the s-t distance cannot decrease!
[Figure: the layering after augmentation, with no edge skipping a layer]
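The layer terminology of slides 25-30 can be made concrete: compute BFS levels, then classify each residual edge by comparing its endpoints' levels. A sketch, again assuming a hypothetical residual dict mapping u to {v: cf(u, v)}:

```python
from collections import deque

def bfs_levels(residual, s):
    """level[v] = BFS distance from s through positive-capacity residual edges."""
    level = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v, cf in residual.get(u, {}).items():
            if cf > 0 and v not in level:
                level[v] = level[u] + 1
                queue.append(v)
    return level

def classify(level, u, v):
    """Classify a residual edge (u, v) against the BFS layering."""
    jump = level[v] - level[u]
    if jump >= 2:
        return "shortcut"            # cannot occur in a fresh BFS layering
    if jump == 1:
        return "forward-looking"     # advances exactly one layer
    if jump == 0:
        return "sideways-looking"    # stays inside a layer
    return "backward-looking"        # goes back toward s
```

Augmentation only adds edges anti-parallel to path edges, i.e. backward-looking ones, which is why no shortcut edge ever appears and the s-t distance never decreases.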

31 Claim: a phase has no more than m iterations.
(A phase is a maximal run of iterations during which the augmenting-path length stays at the same value d.)

32 BFS from s in Gf at the Start of the Phase
[Figure: the BFS layering of Gf at the start of a phase; slides 33-38 animate successive iterations of the phase on this layering]
39 BFS from s in Gf at the Start of the Phase
We never create shortcut edges, and we do not create new forward-looking edges during a phase.
As long as the augmenting path has length d, it has to visit every layer in turn (induction).
At least one forward-looking edge is deleted in each iteration.
Conclusion: at most m iterations to a phase!

