Ford-Fulkerson Recap.



Residual Flow Network
Given G with each edge e labeled c(e) (f(e)), construct the residual network Gf:
Forward edge: for each edge e with f(e) < c(e), Gf gets a forward edge with capacity c(e) - f(e).
Backward edge: for each edge e with f(e) > 0, Gf gets a backward edge with capacity f(e).
Delete all edges with residual capacity 0! A saturated edge, e.g. 6 (6), contributes only a backward edge of capacity 6; an empty edge, 6 (0), contributes only a forward edge of capacity 6; a partially used edge, 6 (4), contributes a forward edge of capacity 2 and a backward edge of capacity 4. If G contains anti-parallel edges, e.g. u→v 6 (4) and v→u 3 (2), each contributes its own forward and backward residual edges.
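The construction above can be sketched in a few lines of Python. The example network is hypothetical (the slide figures do not fully survive in text form), but the rules are exactly the ones stated: forward edge of capacity c(e) - f(e), backward edge of capacity f(e), zero-capacity edges deleted.

```python
def residual_graph(cap, flow):
    """Build Gf from capacities and a flow (edge labels c(e) (f(e))).

    cap, flow: dicts mapping an edge (u, v) to an int.
    Edges with residual capacity 0 are deleted, as on the slide.
    """
    gf = {}
    for (u, v), c in cap.items():
        f = flow.get((u, v), 0)
        if c - f > 0:
            gf[(u, v)] = c - f   # forward edge, capacity c(e) - f(e)
        if f > 0:
            gf[(v, u)] = f       # backward edge, capacity f(e)
    return gf

# Hypothetical 4-node example in the spirit of the slides:
cap  = {("s", "v"): 5, ("s", "w"): 2, ("v", "t"): 3, ("w", "t"): 3}
flow = {("s", "v"): 3, ("s", "w"): 0, ("v", "t"): 3, ("w", "t"): 0}
print(residual_graph(cap, flow))
```

Note that the saturated edge v→t (3 (3)) yields only the backward edge (t, v), while the empty edge s→w (2 (0)) yields only a forward edge.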

Ford-Fulkerson Algorithm
Start: f(e) = 0 for all edges e. Compute Gf.
While Gf contains an s-t path:
  Let P be any simple s-t path in Gf.
  Let Δ(P) = min{cf(e) : e ∈ P}.
  For each edge e = (u, v) on P:
    If e is a forward edge, increase f(e) by Δ(P).
    Otherwise, let e' = (v, u) be the edge of G corresponding to e, and reduce f(e') by Δ(P).
  Recompute Gf.
Each iteration improves the flow by pushing Δ(P) flow units along P, then recomputes Gf.
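A runnable sketch of this loop, under two simplifying assumptions: G has no pair of anti-parallel edges (so the flow can be read back from residual capacities), and the residual network is updated incrementally instead of recomputed from scratch. The example network at the bottom is hypothetical.

```python
from collections import defaultdict

def ford_fulkerson(cap, s, t):
    """Ford-Fulkerson with an arbitrary simple augmenting path (DFS).

    cap: dict (u, v) -> capacity.  Pushing D along (u, v) lowers
    res(u, v) by D and raises res(v, u) by D; this is the
    "recompute Gf" step done incrementally.
    """
    res = defaultdict(int)
    adj = defaultdict(set)
    for (u, v), c in cap.items():
        res[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)          # backward residual edges are traversable

    def find_path():
        # DFS for any simple s-t path with positive residual capacity.
        stack, seen = [(s, [])], {s}
        while stack:
            u, path = stack.pop()
            if u == t:
                return path
            for v in adj[u]:
                if v not in seen and res[(u, v)] > 0:
                    seen.add(v)
                    stack.append((v, path + [(u, v)]))
        return None

    value = 0
    while (path := find_path()) is not None:
        delta = min(res[e] for e in path)   # the bottleneck Delta(P)
        for u, v in path:
            res[(u, v)] -= delta
            res[(v, u)] += delta
        value += delta
    flow = {e: c - res[e] for e, c in cap.items()}
    return value, flow

# Hypothetical example network:
cap = {("s", "u"): 4, ("s", "v"): 2, ("u", "v"): 3, ("u", "t"): 1, ("v", "t"): 6}
value, flow = ford_fulkerson(cap, "s", "t")
print(value)   # maximum flow value: 6
```

With integer capacities each iteration raises the value by at least 1, so the loop terminates, matching the termination argument below.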

We Saw:
The flow is always feasible and integral.
The algorithm terminates: the flow value increases by at least 1 in each iteration, so the number of iterations is at most the sum of the capacities; each iteration takes O(m) time.
Correctness proof: optimality is still missing.
Running time: we want faster.

Is the Flow Optimal?
[Figures: small example networks with edges labeled c(e) (f(e)); in each, we ask whether the current flow can still be improved.]
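The point of these examples can be made concrete with a small hypothetical network (not the one from the slides, which does not survive in text form): every s-t path in G contains a saturated edge, yet the flow is not maximum, because Gf still contains an augmenting path through a backward edge.

```python
from collections import deque

# Every s-t path in G below contains a saturated edge under this flow
# (value 1), yet the flow is not maximum.
cap  = {("s", "u"): 1, ("u", "v"): 1, ("v", "t"): 1, ("s", "v"): 1, ("u", "t"): 1}
flow = {("s", "u"): 1, ("u", "v"): 1, ("v", "t"): 1, ("s", "v"): 0, ("u", "t"): 0}

gf = {}
for (u, v), c in cap.items():
    if c - flow[(u, v)] > 0:
        gf[(u, v)] = c - flow[(u, v)]   # forward residual edge
    if flow[(u, v)] > 0:
        gf[(v, u)] = flow[(u, v)]       # backward residual edge

def bfs_path(gf, s, t):
    parent, q = {s: None}, deque([s])
    while q:
        u = q.popleft()
        for a, b in gf:
            if a == u and b not in parent:
                parent[b] = u
                q.append(b)
    if t not in parent:
        return None
    path, v = [], t
    while v != s:
        path.append((parent[v], v))
        v = parent[v]
    return path[::-1]

P = bfs_path(gf, "s", "t")
print(P)   # [('s', 'v'), ('v', 'u'), ('u', 't')] -- uses backward edge (v, u)

# Augment by Delta(P) = 1: forward edges gain flow, and the backward
# edge (v, u) cancels one unit on the original edge (u, v).
for a, b in P:
    if (a, b) in cap:
        flow[(a, b)] += 1
    else:
        flow[(b, a)] -= 1
print(flow[("s", "u")] + flow[("s", "v")])   # new flow value: 2
```

The backward edge lets the algorithm undo the earlier (greedy) routing of flow through u→v, which is exactly why the residual network is defined with backward edges.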

Edmonds-Karp Algorithm
Start: f(e) = 0 for all edges e. Compute Gf.
While Gf contains an s-t path:
  Let P be the shortest simple s-t path in Gf.   (the only change from Ford-Fulkerson)
  Let Δ(P) = min{cf(e) : e ∈ P}.
  For each edge e = (u, v) on P:
    If e is a forward edge, increase f(e) by Δ(P).
    Otherwise, let e' = (v, u) be the edge of G corresponding to e, and reduce f(e') by Δ(P).
  Recompute Gf.
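In code, the change is a one-liner: replace the DFS path search with a BFS, which always returns a shortest path. A self-contained sketch (same hypothetical example network and the same no-anti-parallel-edges assumption as before):

```python
from collections import defaultdict, deque

def edmonds_karp(cap, s, t):
    """Edmonds-Karp: Ford-Fulkerson where P is always a *shortest*
    s-t path in Gf, found by BFS.  cap: dict (u, v) -> capacity.
    """
    res = defaultdict(int)
    adj = defaultdict(set)
    for (u, v), c in cap.items():
        res[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)

    def shortest_path():
        # BFS layers the vertices by distance from s, so the parent
        # chain from t is a shortest augmenting path.
        parent, q = {s: None}, deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and res[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return None
        path, v = [], t
        while v != s:
            path.append((parent[v], v))
            v = parent[v]
        return path[::-1]

    value = 0
    while (path := shortest_path()) is not None:
        delta = min(res[e] for e in path)
        for u, v in path:
            res[(u, v)] -= delta
            res[(v, u)] += delta
        value += delta
    return value

cap = {("s", "u"): 4, ("s", "v"): 2, ("u", "v"): 3, ("u", "t"): 1, ("v", "t"): 6}
print(edmonds_karp(cap, "s", "t"))   # 6
```

The analysis that follows shows why this one change bounds the number of iterations by O(mn), giving O(m^2 n) total time.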

Changes to Gf
After augmenting along P (turning Gf into Gf'):
Only edges of P may disappear.
At least one of them has to disappear (the bottleneck edge of P becomes saturated).
The only new edges are anti-parallel to edges of P.
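These three claims can be checked mechanically on a tiny hypothetical network: build Gf, augment along P, rebuild Gf, and compare the edge sets.

```python
# Augment along P on a hypothetical network and compare Gf before/after.
cap  = {("s", "u"): 2, ("u", "t"): 1, ("u", "v"): 1, ("v", "t"): 1}
flow = {e: 0 for e in cap}

def residual(cap, flow):
    gf = {}
    for (u, v), c in cap.items():
        if c - flow[(u, v)] > 0:
            gf[(u, v)] = c - flow[(u, v)]   # forward residual edge
        if flow[(u, v)] > 0:
            gf[(v, u)] = flow[(u, v)]       # backward residual edge
    return gf

P = [("s", "u"), ("u", "t")]       # an s-t path in Gf, Delta(P) = 1
before = residual(cap, flow)
for e in P:                        # every edge of P is forward here
    flow[e] += 1
after = residual(cap, flow)

new_edges  = set(after)  - set(before)
gone_edges = set(before) - set(after)
anti_P = {(v, u) for (u, v) in P}
assert new_edges <= anti_P                    # new edges anti-parallel to P
assert gone_edges and gone_edges <= set(P)    # >= 1 edge of P disappears
```

Here the bottleneck edge (u, t) is saturated and disappears, and the only new edges are (u, s) and (t, u), both anti-parallel to P.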

Claim: the distance from s to t in the residual graph never decreases.

BFS from s in Gf
Layer the vertices of Gf by BFS distance from s; then d, the length of the shortest augmenting path, is the layer of t. Every edge of Gf is forward-looking (into the next layer), sideways-looking (within the same layer), or backward-looking (into an earlier layer); a shortcut edge jumping more than one layer ahead cannot exist, by the definition of BFS layers.
A shortest augmenting path uses only forward-looking edges. Augmenting along it creates only edges anti-parallel to the path, and those are backward-looking. So we do not create shortcut edges, and the s-t distance cannot decrease!
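The claim can also be observed empirically. The sketch below (hypothetical network) runs Edmonds-Karp and records the length of every augmenting path, i.e. the s-t distance in Gf at each iteration; the claim predicts a non-decreasing sequence.

```python
from collections import defaultdict, deque

def augmenting_path_lengths(cap, s, t):
    """Edmonds-Karp, recording the s-t distance in Gf per iteration."""
    res = defaultdict(int)
    adj = defaultdict(set)
    for (u, v), c in cap.items():
        res[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)
    lengths = []
    while True:
        parent, q = {s: None}, deque([s])
        while q:                                  # BFS in Gf
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and res[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return lengths
        path, v = [], t
        while v != s:
            path.append((parent[v], v))
            v = parent[v]
        lengths.append(len(path))                 # current s-t distance
        delta = min(res[e] for e in path)
        for u, v in path:
            res[(u, v)] -= delta
            res[(v, u)] += delta

# Hypothetical example where the distance strictly increases:
cap = {("s", "u"): 2, ("u", "t"): 1, ("u", "v"): 1, ("v", "t"): 1}
print(augmenting_path_lengths(cap, "s", "t"))   # [2, 3]
```

After the length-2 path s→u→t is saturated, the only remaining augmenting path has length 3, so the recorded distances are [2, 3]: never decreasing, exactly as claimed.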

Claim: a phase (a maximal run of iterations during which the s-t distance in Gf stays at the same value d) has no more than m iterations.

BFS from s in Gf at the Start of the Phase
We never create shortcut edges, and we do not create new forward-looking edges within a phase.
As long as the augmenting path length stays d, the path has to visit every layer in turn, so it uses only forward-looking edges that (by induction) already existed at the start of the phase.
At least one forward-looking edge is deleted (saturated) in each iteration.
Conclusion: at most m iterations to a phase!
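The phase bound can be sanity-checked on any concrete run: group the recorded augmenting-path lengths into maximal runs of equal d, and verify that no run exceeds m. The network below is hypothetical; the two assertions are exactly the two claims proved above.

```python
from collections import defaultdict, deque
from itertools import groupby

def augmenting_path_lengths(cap, s, t):
    """Edmonds-Karp, recording the s-t distance in Gf per iteration."""
    res = defaultdict(int)
    adj = defaultdict(set)
    for (u, v), c in cap.items():
        res[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)
    lengths = []
    while True:
        parent, q = {s: None}, deque([s])
        while q:                                  # BFS in Gf
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and res[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return lengths
        path, v = [], t
        while v != s:
            path.append((parent[v], v))
            v = parent[v]
        lengths.append(len(path))
        delta = min(res[e] for e in path)
        for u, v in path:
            res[(u, v)] -= delta
            res[(v, u)] += delta

cap = {("s", "a"): 3, ("s", "b"): 2, ("a", "b"): 1, ("a", "t"): 2, ("b", "t"): 3}
m = len(cap)
lengths = augmenting_path_lengths(cap, "s", "t")
# A phase = a maximal run of iterations with the same distance d.
phases = [len(list(run)) for _, run in groupby(lengths)]
assert lengths == sorted(lengths)          # distances never decrease
assert all(size <= m for size in phases)   # <= m iterations per phase
```

Since the distance d can take at most n different values and each phase has at most m iterations, Edmonds-Karp makes O(mn) iterations in total, i.e. O(m^2 n) time.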