
1 Elementary Graph Algorithms CSc 4520/6520 Fall 2013 Slides adapted from David Luebke, University of Virginia and David Plaisted, University of North Carolina

2 Graphs
• Graph G = (V, E)
» V = set of vertices
» E = set of edges ⊆ (V × V)
• Types of graphs
» Undirected: edge (u, v) = (v, u); for all v, (v, v) ∉ E (no self loops).
» Directed: (u, v) is an edge from u to v, denoted u → v. Self loops are allowed.
» Weighted: each edge has an associated weight, given by a weight function w : E → R.
» Dense: |E| ≈ |V|².
» Sparse: |E| << |V|².
• In either case, |E| = O(|V|²).

3 Graphs
• If (u, v) ∈ E, then vertex v is adjacent to vertex u.
• The adjacency relationship is:
» Symmetric if G is undirected.
» Not necessarily so if G is directed.
• If G is connected:
» There is a path between every pair of vertices.
» |E| ≥ |V| – 1.
» Furthermore, if |E| = |V| – 1, then G is a tree.
• Other definitions in Appendix B (B.4 and B.5) as needed.

4 Representation of Graphs
• Two standard ways:
» Adjacency lists.
» Adjacency matrix.
[Figure: a four-vertex example graph (a, b, c, d) shown both as adjacency lists and as a 4 × 4 adjacency matrix]

5 Adjacency Lists
• Consists of an array Adj of |V| lists, one list per vertex.
• For u ∈ V, Adj[u] consists of all vertices adjacent to u.
• If the graph is weighted, store the weights in the adjacency lists as well.
[Figure: the example graph with its adjacency lists, in directed and undirected form]
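A minimal sketch of the adjacency-list representation in Python; the dict-of-lists layout and the make_graph helper are illustrative choices (not from the slides), and vertices are assumed to be hashable labels.

    from collections import defaultdict

    def make_graph(edges, directed=False):
        """Build an adjacency-list graph from an iterable of (u, v) pairs."""
        adj = defaultdict(list)
        for u, v in edges:
            adj[u].append(v)
            if not directed:
                adj[v].append(u)      # undirected: record the edge in both lists
        return adj

    # Small undirected example
    G = make_graph([("a", "b"), ("a", "c"), ("a", "d"), ("c", "d")])
    print(G["a"])                     # ['b', 'c', 'd']

For a weighted graph, each list would hold (neighbor, weight) pairs instead of bare vertices, as the slide suggests.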

6 Storage Requirement
• For directed graphs:
» Sum of the lengths of all adjacency lists is Σ_{v ∈ V} out-degree(v) = |E|, where out-degree(v) is the number of edges leaving v.
» Total storage: Θ(V + E).
• For undirected graphs:
» Sum of the lengths of all adjacency lists is Σ_{v ∈ V} degree(v) = 2|E|, where degree(v) is the number of edges incident on v (edge (u, v) is incident on both u and v).
» Total storage: Θ(V + E).

7 Pros and Cons: adj list
• Pros
» Space-efficient when the graph is sparse.
» Can be modified to support many graph variants.
• Cons
» Determining whether an edge (u, v) is in G is not efficient: we have to search u's adjacency list, which takes Θ(degree(u)) time, Θ(V) in the worst case.

8 Adjacency Matrix
• A |V| × |V| matrix A.
• Number the vertices from 1 to |V| in some arbitrary manner.
• A is then given by A[i, j] = 1 if (i, j) ∈ E, and A[i, j] = 0 otherwise.
• A = Aᵀ for undirected graphs.
[Figure: the example directed and undirected graphs with their 4 × 4 adjacency matrices]
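A matching sketch of the adjacency-matrix representation, with vertices numbered 0 to |V|−1 (the slides number them from 1); make_matrix is a hypothetical helper, not part of the slides.

    def make_matrix(n, edges, directed=False):
        """n x n 0/1 matrix A with A[i][j] = 1 iff (i, j) is an edge."""
        A = [[0] * n for _ in range(n)]
        for i, j in edges:
            A[i][j] = 1
            if not directed:
                A[j][i] = 1           # undirected: A is symmetric (A = A^T)
        return A

    A = make_matrix(4, [(0, 1), (0, 2), (0, 3), (2, 3)])
    print(A[0][3])                    # 1 -- membership test in O(1) time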

9 Space and Time
• Space: Θ(V²).
» Not memory-efficient for large graphs.
• Time to list all vertices adjacent to u: Θ(V).
• Time to determine whether (u, v) ∈ E: Θ(1).
• Can store weights instead of bits for a weighted graph.

10 Graph-searching Algorithms
• Searching a graph: systematically follow the edges of the graph to visit its vertices.
• Used to discover the structure of a graph.
• Standard graph-searching algorithms:
» Breadth-first Search (BFS).
» Depth-first Search (DFS).

11 Breadth-first Search
• Input: Graph G = (V, E), either directed or undirected, and source vertex s ∈ V.
• Output:
» d[v] = distance (smallest number of edges, i.e. the length of a shortest path) from s to v, for all v ∈ V. d[v] = ∞ if v is not reachable from s.
» π[v] = u such that (u, v) is the last edge on a shortest path from s to v; u is v's predecessor.
» Builds a breadth-first tree with root s that contains all reachable vertices.
• Definitions:
» Path between vertices u and v: a sequence of vertices (v₁, v₂, …, v_k) such that u = v₁, v = v_k, and (v_i, v_{i+1}) ∈ E for all 1 ≤ i ≤ k – 1.
» Length of the path: number of edges in the path.
» The path is simple if no vertex is repeated.

12 Breadth-first Search
• Expands the frontier between discovered and undiscovered vertices uniformly across the breadth of the frontier.
» A vertex is "discovered" the first time it is encountered during the search.
» A vertex is "finished" once all vertices adjacent to it have been discovered.
• Colors the vertices to keep track of progress:
» White – undiscovered.
» Gray – discovered but not finished.
» Black – finished.
• Colors are required only to reason about the algorithm; BFS can be implemented without them.

13 Breadth-first searching
• A breadth-first search (BFS) explores nodes nearest the root before exploring nodes further away.
• For example, after searching A, then B, then C, the search proceeds with D, E, F, G.
• Nodes are explored in the order A B C D E F G H I J K L M N O P Q.
• J will be found before N.
[Figure: example tree rooted at A, with levels B–C, D–G, and H–Q]

14 BFS for Shortest Paths
[Figure: BFS expanding from source S in waves; vertices are labeled with distances 1, 2, 3 as the frontier moves outward, with finished, discovered, and undiscovered vertices shown in different shades]

15 BFS(G,s)

    BFS(G, s)
     1  for each vertex u in V[G] – {s}
     2      do color[u] ← WHITE
     3         d[u] ← ∞
     4         π[u] ← NIL
     5  color[s] ← GRAY
     6  d[s] ← 0
     7  π[s] ← NIL
     8  Q ← Ø
     9  enqueue(Q, s)
    10  while Q ≠ Ø
    11      do u ← dequeue(Q)
    12         for each v in Adj[u]
    13             do if color[v] = WHITE
    14                   then color[v] ← GRAY
    15                        d[v] ← d[u] + 1
    16                        π[v] ← u
    17                        enqueue(Q, v)
    18         color[u] ← BLACK

• white: undiscovered; gray: discovered; black: finished.
• Q: a queue of discovered vertices; color[v]: color of v; d[v]: distance from s to v; π[v]: predecessor of v.
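A runnable Python rendering of the pseudocode above, assuming the dict-of-lists graphs from the earlier sketch; the colors are left implicit (a vertex counts as discovered exactly when it first receives a distance), which the earlier slide notes is permissible.

    from collections import deque

    def bfs(adj, s):
        """Return (d, pi): distance and predecessor for every vertex reachable from s."""
        d = {s: 0}
        pi = {s: None}
        Q = deque([s])                # queue of discovered ("gray") vertices
        while Q:
            u = Q.popleft()
            for v in adj.get(u, []):  # vertices with no outgoing edges may be absent as keys
                if v not in d:        # "white": not yet discovered
                    d[v] = d[u] + 1
                    pi[v] = u
                    Q.append(v)
        return d, pi

Vertices that never appear in d are unreachable, which plays the role of d[v] = ∞.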

16 Example (BFS)
[Figure: BFS on an example graph with vertices r, s, t, u (top row) and v, w, x, y (bottom row); source s has d[s] = 0, all other distances are ∞] Q: s
(Courtesy of Prof. Jim Anderson)

17 Example (BFS)
[Figure: r and w discovered with distance 1] Q: w, r

18 Example (BFS)
[Figure: t and x discovered with distance 2] Q: r, t, x

19 Example (BFS)
[Figure: v discovered with distance 2] Q: t, x, v

20 Example (BFS)
[Figure: u discovered with distance 3] Q: x, v, u

21 Example (BFS)
[Figure: y discovered with distance 3] Q: v, u, y

22 Example (BFS)
[Figure: all vertices discovered] Q: u, y

23 Example (BFS)
[Figure] Q: y

24 Example (BFS)
[Figure] Q: Ø

25 Example (BFS)
[Figure: the resulting breadth-first (BF) tree]

26 Analysis of BFS
• Initialization takes O(V).
• Traversal loop:
» After initialization, each vertex is enqueued and dequeued at most once, and each queue operation takes O(1). So the total time for queuing is O(V).
» The adjacency list of each vertex is scanned at most once. The sum of the lengths of all adjacency lists is Θ(E).
• Summing over all vertices, the total running time of BFS is O(V + E), linear in the size of the adjacency-list representation of the graph.

27 Breadth-first Tree
• For a graph G = (V, E) with source s, the predecessor subgraph of G is G_π = (V_π, E_π), where
» V_π = {v ∈ V : π[v] ≠ NIL} ∪ {s}
» E_π = {(π[v], v) ∈ E : v ∈ V_π – {s}}
• The predecessor subgraph G_π is a breadth-first tree if:
» V_π consists of the vertices reachable from s, and
» for all v ∈ V_π, there is a unique simple path from s to v in G_π that is also a shortest path from s to v in G.
• The edges in E_π are called tree edges. |E_π| = |V_π| – 1.

28 Shortest Path Tree
• Theorem: the BFS algorithm
» visits all and only the nodes reachable from s,
» sets d[v] equal to the shortest-path distance from s to v, for all nodes v, and
» sets the parent variables to form a shortest-path tree.
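Because the parent (π) values form a shortest-path tree, an actual shortest path can be read off by walking predecessors back to s. A small sketch, assuming the d/π maps returned by the bfs sketch above.

    def shortest_path(pi, s, v):
        """Reconstruct the s-to-v path from the predecessor map, or None if unreachable."""
        if v not in pi:
            return None               # v was never discovered
        path = []
        while v is not None:          # follow predecessors back to s (pi[s] is None)
            path.append(v)
            v = pi[v]
        path.reverse()                # collected from v back to s, so reverse it
        return path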

29 Proof Ideas
• Use induction on the distance from s to show that the d-values are set properly.
• Basis: distance 0. d[s] is set to 0.
• Induction: assume the claim holds for all nodes at distance x – 1 and show it for every node v at distance x.
• Since v is at distance x, it has at least one neighbor at distance x – 1. Let u be the first of these neighbors that is enqueued.

30 Proof Ideas
• Key property of shortest-path distances: if v has distance x, it must have a neighbor with distance x – 1; no neighbor has distance less than x – 1, and no neighbor has distance more than x + 1.
[Figure: source s, vertex v at distance x, neighbors u and c at distance x – 1, neighbor d at distance x + 1]

31 Proof Ideas
• Fact: when u is dequeued, v is still unvisited,
» because of how the queue operates and since d never underestimates the distance.
• By induction, d[u] = x – 1.
• When v is enqueued, d[v] is set to d[u] + 1 = x.

32 BFS Running Time
• Initializing all the nodes takes O(V) time.
• Every node is enqueued once and dequeued once, taking O(V) time.
• When a node is dequeued, all its neighbors are checked to see whether they are unvisited, taking time proportional to the number of neighbors of the node and summing to O(E) over all iterations.
• Total time: O(V + E).

33 Depth-first Search (DFS)
• Explore edges out of the most recently discovered vertex v.
• When all edges of v have been explored, backtrack to explore other edges leaving the vertex from which v was discovered (its predecessor).
• "Search as deep as possible first."
• Continue until all vertices reachable from the original source are discovered.
• If any undiscovered vertices remain, one of them is chosen as a new source and the search is repeated from that source.

34 Depth-first Search
• Input: G = (V, E), directed or undirected. No source vertex given!
• Output:
» Two timestamps on each vertex, integers between 1 and 2|V|:
   d[v] = discovery time (v turns from white to gray)
   f[v] = finishing time (v turns from gray to black)
» π[v] = predecessor of v, i.e. the vertex u such that v was discovered during the scan of u's adjacency list.
• Uses the same coloring scheme for vertices as BFS.

35 Depth-first searching
• A depth-first search (DFS) explores a path all the way to a leaf before backtracking and exploring another path.
• For example, after searching A, then B, then D, the search backtracks and tries another path from B.
• Nodes are explored in the order A B D E H L M N I O P C F G J K Q.
• N will be found before J.
[Figure: the same example tree rooted at A as on the BFS slide]

36 Pseudo-code

    DFS(G)
    1  for each vertex u ∈ V[G]
    2      do color[u] ← WHITE
    3         π[u] ← NIL
    4  time ← 0
    5  for each vertex u ∈ V[G]
    6      do if color[u] = WHITE
    7            then DFS-Visit(u)

    DFS-Visit(u)
    1  color[u] ← GRAY        ▹ White vertex u has just been discovered.
    2  time ← time + 1
    3  d[u] ← time
    4  for each v ∈ Adj[u]
    5      do if color[v] = WHITE
    6            then π[v] ← u
    7                 DFS-Visit(v)
    8  color[u] ← BLACK       ▹ Blacken u; it is finished.
    9  f[u] ← time ← time + 1

• Uses a global timestamp time.
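A runnable counterpart of the DFS pseudocode, again assuming a dict-of-lists graph; recursion stands in for the explicit DFS-Visit calls and a nonlocal counter plays the role of the global time (a sketch, kept recursive for clarity).

    def dfs(adj):
        """Return discovery times d, finishing times f, and predecessors pi."""
        vertices = set(adj) | {v for nbrs in adj.values() for v in nbrs}
        WHITE, GRAY, BLACK = 0, 1, 2
        color = {u: WHITE for u in vertices}
        d, f, pi = {}, {}, {u: None for u in vertices}
        time = 0

        def visit(u):
            nonlocal time
            color[u] = GRAY           # u has just been discovered
            time += 1
            d[u] = time
            for v in adj.get(u, []):
                if color[v] == WHITE:
                    pi[v] = u
                    visit(v)
            color[u] = BLACK          # u is finished
            time += 1
            f[u] = time

        for u in vertices:            # restart from every still-undiscovered vertex
            if color[u] == WHITE:
                visit(u)
        return d, f, pi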

37 Example (DFS)
[Figure: directed graph on vertices u, v, w, x, y, z; the first vertex is discovered at time 1 (label 1/)]
(Courtesy of Prof. Jim Anderson)

38 Example (DFS)
[Figure: timestamps so far: 1/, 2/]

39 Example (DFS)
[Figure: timestamps so far: 1/, 2/, 3/]

40 Example (DFS)
[Figure: timestamps so far: 1/, 2/, 3/, 4/]

41 Example (DFS)
[Figure: a back edge B is identified]

42 Example (DFS)
[Figure: the vertex discovered at time 4 finishes at time 5 (4/5)]

43 Example (DFS)
[Figure: the vertex discovered at time 3 finishes at time 6 (3/6)]

44 Example (DFS)
[Figure: the vertex discovered at time 2 finishes at time 7 (2/7)]

45 Example (DFS)
[Figure: a forward edge F is identified]

46 Example (DFS)
[Figure: the vertex discovered at time 1 finishes at time 8 (1/8)]

47 Example (DFS)
[Figure: the search restarts; a new vertex is discovered at time 9 (9/)]

48 Example (DFS)
[Figure: a cross edge C is identified]

49 Example (DFS)
[Figure: another vertex is discovered at time 10 (10/)]

50 Example (DFS)
[Figure: another back edge B is identified]

51 Example (DFS)
[Figure: the vertex discovered at time 10 finishes at time 11 (10/11)]

52 Example (DFS)
[Figure: the vertex discovered at time 9 finishes at time 12 (9/12); final timestamps: 1/8, 2/7, 3/6, 4/5, 9/12, 10/11]

53 Analysis of DFS
• The loops on lines 1–2 and 5–7 of DFS take Θ(V) time, excluding the time to execute DFS-Visit.
• DFS-Visit is called exactly once for each vertex v ∈ V, when v is first painted gray. The for loop on lines 4–7 of DFS-Visit executes |Adj[v]| times, so the total cost of executing DFS-Visit is Σ_{v ∈ V} |Adj[v]| = Θ(E).
• Total running time of DFS: Θ(V + E).

54 Parenthesis Theorem
• Theorem 22.7: For any two vertices u and v, exactly one of the following holds:
» 1. d[u] < f[u] < d[v] < f[v] or d[v] < f[v] < d[u] < f[u], and neither u nor v is a descendant of the other.
» 2. d[u] < d[v] < f[v] < f[u], and v is a descendant of u.
» 3. d[v] < d[u] < f[u] < f[v], and u is a descendant of v.
• So d[u] < d[v] < f[u] < f[v] cannot happen.
• Like parentheses:
» OK: ( ) [ ]   ( [ ] )   [ ( ) ]
» Not OK: ( [ ) ]   [ ( ] )
• Corollary: v is a proper descendant of u if and only if d[u] < d[v] < f[v] < f[u].

55 Example (Parenthesis Theorem)
[Figure: DFS of a directed graph on vertices s, t, u, v, w, x, y, z with timestamps s = 1/10, z = 2/9, y = 3/6, x = 4/5, w = 7/8, t = 11/16, v = 12/13, u = 14/15; back (B), forward (F), and cross (C) edges are marked]
• Parenthesis structure: (s (z (y (x x) y) (w w) z) s) (t (v v) (u u) t)

56 Depth-First Trees
• The predecessor subgraph is defined slightly differently from that of BFS.
• The predecessor subgraph of DFS is G_π = (V, E_π), where E_π = {(π[v], v) : v ∈ V and π[v] ≠ NIL}.
» How does it differ from that of BFS?
» The predecessor subgraph G_π forms a depth-first forest composed of several depth-first trees. The edges in E_π are called tree edges.
• Definition: a forest is an acyclic graph G that may be disconnected.

57 White-path Theorem
• Theorem 22.9: v is a descendant of u if and only if at time d[u] there is a path from u to v consisting of only white vertices (except for u, which has just been colored gray).

58 Classification of Edges
• Tree edge: an edge in the depth-first forest, found by exploring (u, v).
• Back edge: (u, v), where u is a descendant of v (in a depth-first tree).
• Forward edge: (u, v), where v is a descendant of u, but not a tree edge.
• Cross edge: any other edge. Can go between vertices in the same depth-first tree or in different depth-first trees.
• Theorem: In a DFS of an undirected graph, we get only tree and back edges; no forward or cross edges.
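The four kinds can be told apart during a single DFS of a directed graph from the color and discovery time of v at the moment edge (u, v) is examined; a sketch under the same representation assumptions as the earlier code.

    def classify_edges(adj):
        """Label each edge of a directed graph as tree, back, forward, or cross."""
        vertices = set(adj) | {v for nbrs in adj.values() for v in nbrs}
        color = {u: "white" for u in vertices}
        d = {}
        time = 0
        kind = {}

        def visit(u):
            nonlocal time
            color[u] = "gray"
            time += 1
            d[u] = time
            for v in adj.get(u, []):
                if color[v] == "white":
                    kind[(u, v)] = "tree"
                    visit(v)
                elif color[v] == "gray":      # v is an ancestor still being explored
                    kind[(u, v)] = "back"
                elif d[u] < d[v]:             # v already finished, discovered after u
                    kind[(u, v)] = "forward"
                else:                         # v finished and discovered before u
                    kind[(u, v)] = "cross"
            color[u] = "black"

        for u in vertices:
            if color[u] == "white":
                visit(u)
        return kind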

59 Review: Kinds Of Edges
• Thm: If G is undirected, a DFS produces only tree and back edges.
• Thm: An undirected graph is acyclic iff a DFS yields no back edges.
• Thus, we can run DFS to find cycles.

60 Review: Kinds of Edges
[Figure: DFS of a directed graph from a source vertex, with discovery/finishing times d/f on each vertex (1/12, 2/7, 3/4, 5/6, 8/11, 9/10, 13/16, 14/15) and the edges marked as tree edges, back edges, forward edges, and cross edges]

61 DFS And Cycles
• Running time: O(V + E).
• We can actually determine whether a cycle exists in O(V) time:
» In an undirected acyclic forest, |E| ≤ |V| – 1.
» So count the edges: if we ever see |V| distinct edges, we must have seen a back edge along the way.
» Why not just test whether |E| < |V| and answer the question in constant time?

62 Directed Acyclic Graphs
• A directed acyclic graph, or DAG, is a directed graph with no directed cycles.
[Figure: an example DAG]

63 DFS and DAGs
• Claim: a directed graph G is acyclic iff a DFS of G yields no back edges.
» Forward direction: if G is acyclic, there will be no back edges. Trivial: a back edge implies a cycle.
» Backward direction: if there are no back edges, G is acyclic. Argue the contrapositive: G has a cycle ⇒ there is a back edge.
   – Let v be the vertex on the cycle that is discovered first, and let u be the predecessor of v on the cycle.
   – When v is discovered, the whole cycle is white.
   – DFS must visit everything reachable from v before returning from DFS-Visit(v).
   – So the path from v to u along the cycle consists of white vertices; by the white-path theorem, u becomes a descendant of v, and thus (u, v) is a back edge.
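The back-edge criterion gives a direct acyclicity test for a directed graph: a gray neighbor seen during DFS means a back edge and hence a cycle. A minimal sketch under the same assumptions as the earlier code.

    def is_dag(adj):
        """Return True iff the directed graph contains no cycle."""
        vertices = set(adj) | {v for nbrs in adj.values() for v in nbrs}
        WHITE, GRAY, BLACK = 0, 1, 2
        color = {u: WHITE for u in vertices}

        def visit(u):
            color[u] = GRAY
            for v in adj.get(u, []):
                if color[v] == GRAY:          # back edge found: there is a cycle
                    return False
                if color[v] == WHITE and not visit(v):
                    return False
            color[u] = BLACK
            return True

        return all(visit(u) for u in vertices if color[u] == WHITE)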

64 Topological Sort
• Topological sort of a DAG:
» A linear ordering of all vertices in graph G such that vertex u comes before vertex v whenever edge (u, v) ∈ G.
• Real-world example: getting dressed.

65 Getting Dressed
[Figure: dependency DAG among Underwear, Pants, Belt, Shirt, Tie, Jacket, Socks, Shoes, and Watch]

66 Getting Dressed
[Figure: the same DAG with one topological order listed below it]
• One valid order: Socks, Underwear, Pants, Shoes, Watch, Shirt, Belt, Tie, Jacket.

67 Topological Sort Algorithm

    Topological-Sort()
    {
        Run DFS
        When a vertex is finished, output it
        Vertices are output in reverse topological order
    }

• Time: O(V + E).
• Correctness: we want to prove that (u, v) ∈ G ⇒ f[u] > f[v].
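In code the algorithm reduces to a DFS that records each vertex as it finishes and then reverses that list; a sketch assuming the input is a DAG in the dict-of-lists form used earlier.

    def topological_sort(adj):
        """Return the vertices of a DAG in topological order."""
        vertices = set(adj) | {v for nbrs in adj.values() for v in nbrs}
        visited = set()
        order = []

        def visit(u):
            visited.add(u)
            for v in adj.get(u, []):
                if v not in visited:
                    visit(v)
            order.append(u)           # u is finished: output it

        for u in vertices:
            if u not in visited:
                visit(u)
        order.reverse()               # reverse finishing order = topological order
        return order

On the dressing example this produces one of the valid orders; which one depends on how ties between unrelated items are broken.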

68 Correctness of Topological Sort
• Claim: (u, v) ∈ G ⇒ f[u] > f[v].
» When (u, v) is explored, u is gray. Consider the color of v:
» v gray ⇒ (u, v) is a back edge. Contradiction (why?).
» v white ⇒ v becomes a descendant of u ⇒ f[v] < f[u] (since we must finish v before backtracking and finishing u).
» v black ⇒ v is already finished ⇒ f[v] < f[u].

69
• Why the gray case is impossible: if v were gray, then v would be discovered but not yet finished when u is discovered, so u would be a descendant of v.
• But that would make (u, v) a back edge, and a DAG cannot have a back edge (the back edge would form a cycle).
• Thus the gray case is not possible.

70 DFS Application: Strongly Connected Components
• Consider a directed graph.
• A strongly connected component (SCC) of the graph is a maximal set of nodes with a (directed) path between every pair of nodes.
• Problem: find all the SCCs of the graph.

71 SCC Example
[Figure: a directed graph on vertices a–h with four SCCs]

72 How Can DFS Help?
• Suppose we run DFS on the directed graph.
• All nodes in the same SCC end up in the same DFS tree.
• But there might be several different SCCs in the same DFS tree.
» Example: start DFS from node h in the previous graph.

73 Main Idea of SCC Algorithm
• DFS tells us which nodes are reachable from the roots of the individual trees.
• We also need information in the "other direction": is the root reachable from its descendants?
• Run DFS again on the "transpose" graph (the graph with the direction of every edge reversed).

74 SCC Algorithm
• Input: directed graph G = (V, E).
• 1. Call DFS(G) to compute finishing times.
• 2. Compute Gᵀ (the transpose graph).
• 3. Call DFS(Gᵀ), considering nodes in decreasing order of finishing times.
• 4. Each tree from Step 3 is a separate SCC of G.
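A compact sketch of the whole procedure (this two-pass scheme is usually credited to Kosaraju), assuming the dict-of-lists representation; the helper names are illustrative.

    def strongly_connected_components(adj):
        """Return the SCCs of a directed graph as a list of vertex sets."""
        vertices = set(adj) | {v for nbrs in adj.values() for v in nbrs}

        def visit(graph, u, out):
            seen.add(u)
            for v in graph.get(u, []):
                if v not in seen:
                    visit(graph, v, out)
            out.append(u)             # record u when it finishes

        # Step 1: DFS on G, collecting vertices in order of increasing finishing time.
        seen, finish = set(), []
        for u in vertices:
            if u not in seen:
                visit(adj, u, finish)

        # Step 2: build the transpose graph (reverse every edge).
        gt = {u: [] for u in vertices}
        for u in adj:
            for v in adj[u]:
                gt[v].append(u)

        # Step 3: DFS on the transpose, taking roots in decreasing finishing time.
        seen, sccs = set(), []
        for u in reversed(finish):
            if u not in seen:
                comp = []
                visit(gt, u, comp)    # Step 4: each tree found here is one SCC
                sccs.append(set(comp))
        return sccs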

75 SCC Algorithm Example
[Figure: the input directed graph on vertices a–h; run DFS on it]

76 After Step 1
[Figure: DFS forest with a timeline 1–16 showing the finishing times fin(c), fin(d), fin(b), fin(e), fin(a), fin(h), fin(g), fin(f) in increasing order]
• Order of nodes for Step 3: f, g, h, a, e, b, d, c.

77 After Step 2
[Figure: the transposed input graph; run DFS on it with the node order specified above]

78 After Step 3
[Figure: DFS forest of the transposed graph with a timeline 1–16]
• The SCCs are {f, h, g}, {a, e}, {b, c}, and {d}.

79 Running Time of SCC Algorithm
• Step 1: O(V + E) to run DFS.
• Step 2: O(V + E) to construct the transpose graph, assuming the adjacency-list representation.
• Step 3: O(V + E) to run DFS again.
• Step 4: O(V) to output the result.
• Total: O(V + E).

80 Correctness of SCC Algorithm
• The proof uses the concept of the component graph G^SCC of G.
• Its nodes are the SCCs of G; call them C₁, C₂, …, C_k.
• Put an edge from C_i to C_j iff G has an edge from a node in C_i to a node in C_j.

81 Example of Component Graph
[Figure: component graph with nodes {a, e}, {f, h, g}, {d}, {b, c}, based on the example graph from before]

82 Facts About Component Graph
• Claim: G^SCC is a directed acyclic graph.
• Why? Suppose there were a cycle in G^SCC, so that component C_i is reachable from component C_j and vice versa.
• Then C_i and C_j would not be separate SCCs.

83 Facts About Component Graph
• Consider any component C during Step 1 (running DFS on G).
• Let d(C) be the earliest discovery time of any node in C.
• Let f(C) be the latest finishing time of any node in C.
• Lemma: If there is an edge in G^SCC from component C′ to component C, then f(C′) > f(C).

