Introduction to Algorithms 6.046J/18.401J
LECTURE 16: Greedy Algorithms (and Graphs)
Graph representation
Minimum spanning trees
Optimal substructure
Greedy choice
Prim's greedy MST algorithm
Prof. Charles E. Leiserson
November 9, 2005. Copyright © 2001-5 by Erik D. Demaine and Charles E. Leiserson.

Graphs (review)

Definition. A directed graph (digraph) G = (V, E) is an ordered pair consisting of a set V of vertices (singular: vertex) and a set E ⊆ V × V of edges. In an undirected graph G = (V, E), the edge set E consists of unordered pairs of vertices. In either case, we have |E| = O(V²). Moreover, if G is connected, then |E| ≥ |V| − 1, which implies that lg |E| = Θ(lg V). (Review CLRS, Appendix B.)


Adjacency-matrix representation

The adjacency matrix of a graph G = (V, E), where V = {1, 2, …, n}, is the matrix A[1..n, 1..n] given by

A[i, j] = 1 if (i, j) ∈ E, 0 if (i, j) ∉ E.

For the example digraph with vertices {1, 2, 3, 4} and edges (1, 2), (1, 3), (2, 3), (4, 3):

        1 2 3 4
    1 [ 0 1 1 0 ]
    2 [ 0 0 1 0 ]
    3 [ 0 0 0 0 ]
    4 [ 0 0 1 0 ]

Θ(V²) storage ⇒ a dense representation.
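As a quick illustration (not part of the original lecture), the adjacency matrix for this 4-vertex digraph, with edges (1, 2), (1, 3), (2, 3), (4, 3), can be built in a few lines of Python:

```python
# Adjacency-matrix representation of the slides' 4-vertex digraph.
# Storage is Theta(V^2) regardless of how few edges the graph has.
n = 4
edges = [(1, 2), (1, 3), (2, 3), (4, 3)]

A = [[0] * (n + 1) for _ in range(n + 1)]  # 1-indexed; row/column 0 unused
for i, j in edges:
    A[i][j] = 1  # A[i][j] = 1 iff (i, j) is an edge

# Edge queries take O(1) time:
assert A[1][2] == 1  # (1, 2) is an edge
assert A[3][1] == 0  # (3, 1) is not
```

Constant-time edge queries are the payoff for the dense Θ(V²) storage.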



Adjacency-list representation

An adjacency list of a vertex v ∈ V is the list Adj[v] of vertices adjacent to v. For the same example digraph:

Adj[1] = {2, 3}
Adj[2] = {3}
Adj[3] = {}
Adj[4] = {3}

For undirected graphs, |Adj[v]| = degree(v). For digraphs, |Adj[v]| = out-degree(v).

Handshaking Lemma: ∑_{v ∈ V} degree(v) = 2|E| for undirected graphs ⇒ adjacency lists use Θ(V + E) storage — a sparse representation (for either type of graph).
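The same digraph in adjacency-list form, together with a check of the Handshaking Lemma on its undirected version, can be sketched as follows (an illustration, not the lecture's code):

```python
from collections import defaultdict

# Adjacency lists for the slides' digraph: Theta(V + E) storage.
edges = [(1, 2), (1, 3), (2, 3), (4, 3)]
adj = defaultdict(list)
for u, v in edges:
    adj[u].append(v)

# For a digraph, len(adj[v]) is the out-degree of v.
assert len(adj[1]) == 2 and len(adj[3]) == 0

# Handshaking Lemma: in an undirected graph, the degrees sum to 2|E|.
undirected = defaultdict(set)
for u, v in edges:
    undirected[u].add(v)
    undirected[v].add(u)
assert sum(len(nbrs) for nbrs in undirected.values()) == 2 * len(edges)
```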


Minimum spanning trees

Input: A connected, undirected graph G = (V, E) with weight function w : E → ℝ. For simplicity, assume that all edge weights are distinct. (CLRS covers the general case.)

Output: A spanning tree T — a tree that connects all vertices — of minimum weight:

w(T) = ∑_{(u,v) ∈ T} w(u, v).

Example of MST

[figure, two slides: a weighted graph with edge weights 3, 5, 6, 7, 8, 9, 10, 12, 14, 15; the second slide highlights the MST edges]


Optimal substructure

MST T (other edges of G are not shown). Remove any edge (u, v) ∈ T. Then T is partitioned into two subtrees T₁ and T₂.

[figure: T with edge (u, v) removed, splitting it into subtrees T₁ and T₂]

Theorem. The subtree T₁ is an MST of G₁ = (V₁, E₁), the subgraph of G induced by the vertices of T₁:
V₁ = vertices of T₁,
E₁ = {(x, y) ∈ E : x, y ∈ V₁}.
Similarly for T₂.
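The theorem can be checked numerically. Below is a brute-force sanity check (not from the lecture) on a small hypothetical graph with illustrative weights: compute the MST exhaustively, remove one MST edge, and verify that the resulting subtree is an MST of the subgraph its vertices induce.

```python
from itertools import combinations

# Hypothetical 4-vertex graph with distinct edge weights.
E = {(0, 1): 1, (1, 2): 2, (2, 3): 3, (0, 2): 4, (1, 3): 5}
V = {0, 1, 2, 3}

def reach(start, edge_set):
    """Vertices reachable from `start` via undirected `edge_set`."""
    seen, stack = set(), [start]
    while stack:
        x = stack.pop()
        if x in seen:
            continue
        seen.add(x)
        stack += [b for a, b in edge_set if a == x]
        stack += [a for a, b in edge_set if b == x]
    return seen

def mst(vertices, weights):
    """MST by exhaustive search over edge subsets (fine for tiny graphs)."""
    trees = [t for t in combinations(weights, len(vertices) - 1)
             if reach(next(iter(vertices)), t) == set(vertices)]
    return min(trees, key=lambda t: sum(weights[e] for e in t))

T = mst(V, E)                        # the unique MST of the whole graph
u, v = removed = (1, 2)              # remove this MST edge
rest = [e for e in T if e != removed]
V1 = reach(u, rest)                  # vertex set of subtree T1
T1 = [e for e in rest if e[0] in V1 and e[1] in V1]
E1 = {e: w for e, w in E.items() if e[0] in V1 and e[1] in V1}
assert set(mst(V1, E1)) == set(T1)   # T1 is an MST of the induced subgraph
```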


Proof of optimal substructure

Proof. Cut and paste:
w(T) = w(u, v) + w(T₁) + w(T₂).
If T₁′ were a lower-weight spanning tree than T₁ for G₁, then T′ = {(u, v)} ∪ T₁′ ∪ T₂ would be a lower-weight spanning tree than T for G. ∎

Do we also have overlapping subproblems? Yes.

Great, then dynamic programming may work! Yes, but MST exhibits another powerful property which leads to an even more efficient algorithm.


Hallmark for “greedy” algorithms

Greedy-choice property: a locally optimal choice is globally optimal.

Theorem. Let T be the MST of G = (V, E), and let A ⊆ V. Suppose that (u, v) ∈ E is the least-weight edge connecting A to V − A. Then, (u, v) ∈ T.
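The cut property can be verified exhaustively on a small hypothetical graph (an illustration, not from the lecture): for every nonempty proper subset A of V, the least-weight edge crossing the cut (A, V − A) must lie in the MST.

```python
from itertools import combinations

# Hypothetical 4-vertex graph with distinct edge weights.
E = {(0, 1): 1, (0, 2): 4, (1, 2): 2, (1, 3): 5, (2, 3): 3}
V = {0, 1, 2, 3}

def spans(tree):
    """True if the edge tuple `tree` connects every vertex of V."""
    seen, stack = set(), [0]
    while stack:
        x = stack.pop()
        if x not in seen:
            seen.add(x)
            stack += [b for a, b in tree if a == x]
            stack += [a for a, b in tree if b == x]
    return seen == V

trees = [t for t in combinations(E, len(V) - 1) if spans(t)]
T = min(trees, key=lambda t: sum(E[e] for e in t))  # the unique MST

for r in range(1, len(V)):              # every nonempty proper cut (A, V - A)
    for A in map(set, combinations(V, r)):
        crossing = [e for e in E if (e[0] in A) != (e[1] in A)]
        lightest = min(crossing, key=E.get)
        assert lightest in T            # greedy-choice (cut) property holds
```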


Proof of theorem

Proof. Suppose (u, v) ∉ T. Cut and paste.

[figure: T with cut (A, V − A); (u, v) is the least-weight edge connecting A to V − A]

Consider the unique simple path from u to v in T. Swap (u, v) with the first edge on this path that connects a vertex in A to a vertex in V − A. A lighter-weight spanning tree than T results, contradicting the minimality of T. ∎

Prim's algorithm

IDEA: Maintain V − A as a priority queue Q. Key each vertex in Q with the weight of the least-weight edge connecting it to a vertex in A.

Q ← V
key[v] ← ∞ for all v ∈ V
key[s] ← 0 for some arbitrary s ∈ V
while Q ≠ ∅
    do u ← EXTRACT-MIN(Q)
       for each v ∈ Adj[u]
           do if v ∈ Q and w(u, v) < key[v]
                 then key[v] ← w(u, v)    ⊳ DECREASE-KEY
                      π[v] ← u

At the end, {(v, π[v])} forms the MST.
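The pseudocode can be realized in Python. One caveat, noted as a deviation from the lecture: Python's heapq has no DECREASE-KEY, so this sketch pushes duplicate heap entries and skips stale ones ("lazy deletion"), which preserves the O(E lg V) binary-heap bound.

```python
import heapq

def prim_mst(adj, s):
    """Prim's MST with a binary heap and lazy deletion, O(E lg V).

    adj: dict mapping vertex -> list of (neighbor, weight) pairs for a
    connected undirected graph; s: arbitrary start vertex.
    Returns the MST as a set of frozenset edges {u, v}.
    """
    in_tree = {s}                      # the growing vertex set A
    mst = set()
    heap = [(w, s, v) for v, w in adj[s]]
    heapq.heapify(heap)
    while heap:
        w, u, v = heapq.heappop(heap)  # least-weight edge leaving A (maybe stale)
        if v in in_tree:
            continue                   # stale entry: v was reached more cheaply
        in_tree.add(v)
        mst.add(frozenset((u, v)))
        for x, wx in adj[v]:
            if x not in in_tree:
                heapq.heappush(heap, (wx, v, x))
    return mst

# Small hypothetical example; weights are illustrative only.
adj = {
    1: [(2, 1), (3, 4), (4, 3)],
    2: [(1, 1), (3, 2)],
    3: [(1, 4), (2, 2), (4, 5)],
    4: [(1, 3), (3, 5)],
}
assert prim_mst(adj, 1) == {frozenset((1, 2)), frozenset((2, 3)), frozenset((1, 4))}
```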

Example of Prim's algorithm

[figure: a sequence of thirteen slides animating Prim's algorithm on the example graph (edge weights 3, 5, 6, 7, 8, 9, 10, 12, 14, 15): starting with key[s] = 0 and all other keys ∞, each step extracts the minimum-key vertex from V − A into A and decreases the keys of its neighbors still in Q]

Analysis of Prim

Q ← V                                    ⊳ Θ(V) total for
key[v] ← ∞ for all v ∈ V                 ⊳ these three lines
key[s] ← 0 for some arbitrary s ∈ V
while Q ≠ ∅                              ⊳ |V| times
    do u ← EXTRACT-MIN(Q)
       for each v ∈ Adj[u]               ⊳ degree(u) times
           do if v ∈ Q and w(u, v) < key[v]
                 then key[v] ← w(u, v)
                      π[v] ← u

Handshaking Lemma ⇒ Θ(E) implicit DECREASE-KEY's.

Time = Θ(V)·T_EXTRACT-MIN + Θ(E)·T_DECREASE-KEY

Analysis of Prim (continued)

Time = Θ(V)·T_EXTRACT-MIN + Θ(E)·T_DECREASE-KEY

Q                T_EXTRACT-MIN        T_DECREASE-KEY      Total
array            O(V)                 O(1)                O(V²)
binary heap      O(lg V)              O(lg V)             O(E lg V)
Fibonacci heap   O(lg V) amortized    O(1) amortized      O(E + V lg V) worst case

MST algorithms

Kruskal's algorithm (see CLRS):
Uses the disjoint-set data structure (Lecture 10).
Running time = O(E lg V).

Best to date: Karger, Klein, and Tarjan [1993]:
Randomized algorithm.
O(V + E) expected time.
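Kruskal's disjoint-set approach can be sketched as follows (an illustration, not the lecture's code): sort the edges, then grow a forest, using union-find with path compression and union by rank to reject cycle-forming edges.

```python
def kruskal_mst(n, edges):
    """Total MST weight by Kruskal's algorithm; sorting dominates,
    so the running time is O(E lg E) = O(E lg V).

    n: number of vertices labeled 0..n-1;
    edges: list of (weight, u, v) for a connected undirected graph.
    """
    parent = list(range(n))
    rank = [0] * n

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression (halving)
            x = parent[x]
        return x

    total, taken = 0, 0
    for w, u, v in sorted(edges):          # lightest edges first
        ru, rv = find(u), find(v)
        if ru == rv:
            continue                       # (u, v) would create a cycle
        if rank[ru] < rank[rv]:
            ru, rv = rv, ru
        parent[rv] = ru                    # union by rank
        if rank[ru] == rank[rv]:
            rank[ru] += 1
        total += w
        taken += 1
        if taken == n - 1:
            break                          # spanning tree complete
    return total

# Small hypothetical example: MST weight is 1 + 2 + 3 = 6.
edges = [(1, 0, 1), (2, 1, 2), (4, 0, 2), (3, 0, 3), (5, 2, 3)]
assert kruskal_mst(4, edges) == 6
```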
