# CSE 373, Graphs 2: Incidence and Adjacency (Copyright S. Tanimoto, 2001)


Topics: representing a graph with an adjacency matrix, an incidence matrix, or adjacency lists; graph search (depth-first and breadth-first); shortest paths.

## Incidence and Adjacency

Let e = (v1, v2) be an edge of a digraph. Then:

- v1 is adjacent to v2, and v2 is adjacent from v1;
- e is incident to v2, and e is incident from v1.

For an undirected graph we can use the terms “adjacent to” and “incident to” for relationships in both directions.

(Diagram: an edge e drawn from v1 to v2.)

## Representing a Digraph with an Adjacency Matrix

G = (V, E), with V = {v1, v2, ..., vn} and E = {e1, e2, ..., em}, where e_i = (v_a(i), v_b(i)).

An adjacency matrix for G contains n rows and n columns: A[i,j] = 1 if (vi, vj) ∈ E, and 0 otherwise. For the example digraph on vertices a, b, c, d:

|   | a | b | c | d |
|---|---|---|---|---|
| a | 0 | 1 | 1 | 1 |
| b | 0 | 0 | 0 | 0 |
| c | 0 | 0 | 0 | 1 |
| d | 0 | 0 | 1 | 0 |
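As a sketch, the matrix above can be built from an edge list in a few lines of Python (the vertex ordering a, b, c, d and the variable names are assumptions carried over from the table, not from the slides):

```python
# Build an n x n adjacency matrix for the example digraph.
vertices = ["a", "b", "c", "d"]
edges = [("a", "b"), ("a", "c"), ("a", "d"), ("c", "d"), ("d", "c")]

index = {v: i for i, v in enumerate(vertices)}   # vertex name -> row/column
n = len(vertices)
A = [[0] * n for _ in range(n)]
for u, v in edges:
    A[index[u]][index[v]] = 1   # A[i][j] = 1 iff (v_i, v_j) is in E
```

Testing edge membership is then an O(1) index into A, at the cost of O(n²) space no matter how few edges the graph has.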

## Adjacency Lists

Represent the adjacency-matrix information as an array of lists, one for each row of the matrix. The i-th list contains the vertices to which v_i is adjacent. (Similar to the sparse-array idea.)

|   | a | b | c | d | List |
|---|---|---|---|---|------|
| a | 0 | 1 | 1 | 1 | a: b, c, d |
| b | 0 | 0 | 0 | 0 | b: |
| c | 0 | 0 | 0 | 1 | c: d |
| d | 0 | 0 | 1 | 0 | d: c |

Adjacency lists are efficient for getNeighbors(v).
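A minimal sketch of the same idea in Python, storing one neighbor list per vertex (the dict-of-lists shape and the get_neighbors name are illustrative choices, not from the slides):

```python
# Adjacency lists for the example digraph: one list per matrix row.
adj = {
    "a": ["b", "c", "d"],
    "b": [],
    "c": ["d"],
    "d": ["c"],
}

def get_neighbors(v):
    """Return the vertices v is adjacent to, in O(out-degree) time --
    the operation that adjacency lists make cheap."""
    return adj[v]
```

Space is O(n + m) rather than O(n²), which is the payoff when the graph is sparse.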

## Incidence Matrix

E = {(a, b), (a, d), (a, c), (c, d), (d, c)}

|   | e1 | e2 | e3 | e4 | e5 |
|---|----|----|----|----|----|
| a | -1 | -1 | -1 | 0 | 0 |
| b | 1 | 0 | 0 | 0 | 0 |
| c | 0 | 0 | 1 | -1 | 1 |
| d | 0 | 1 | 0 | 1 | -1 |

- -1: the edge starts at that vertex (it’s incident from the vertex).
- 1: the edge ends at that vertex (it’s incident to the vertex).
- 0: the edge does not involve the vertex.
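The incidence matrix can be derived mechanically from the ordered edge list; a sketch (variable names are illustrative):

```python
# Build the n x m incidence matrix: column j describes edge e_j.
vertices = ["a", "b", "c", "d"]
edges = [("a", "b"), ("a", "d"), ("a", "c"), ("c", "d"), ("d", "c")]

index = {v: i for i, v in enumerate(vertices)}
M = [[0] * len(edges) for _ in vertices]
for j, (u, v) in enumerate(edges):
    M[index[u]][j] = -1   # e_j is incident from u (starts there)
    M[index[v]][j] = 1    # e_j is incident to v (ends there)
```

Each column has exactly one -1 and one +1, since every directed edge leaves one vertex and enters one vertex.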

## Using a Hash Table

If the graph is sparse, store the adjacency information in a hash table: if (vi, vj) ∈ E, perform put((vi, vj), 1).

If the complementary graph is sparse (i.e., the graph is very dense), then store the complement of the edge set in the hash table instead: if (vi, vj) ∉ E, perform put((vi, vj), 0).

In either case, the keys are edges, and the values tell whether or not the edge is part of the graph.
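In Python, a plain dict serves as the hash table. A sketch of the sparse case (the has_edge helper is an illustrative name, not from the slides):

```python
# Sparse graph: store only the edges that exist, keyed by vertex pair.
edges = [("a", "b"), ("a", "d"), ("a", "c"), ("c", "d"), ("d", "c")]

table = {}
for e in edges:
    table[e] = 1          # put((vi, vj), 1)

def has_edge(u, v):
    """A present key means the edge is in the graph; an absent key means
    it is not. (For the dense case, one would store 0 for the missing
    edges instead and invert this test.)"""
    return table.get((u, v), 0) == 1
```

Either way, edge lookup is expected O(1), and space is proportional to whichever of E or its complement is smaller.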

## Graph Search

Given a directed graph G = (V, E) and a starting vertex s, find a path from s to some goal vertex g, where g is any vertex that satisfies a goal(v) predicate. E.g., let s = a.

(Diagram: an example digraph on vertices a, b, c, d, e, f, g.)

## Possible Solutions

There may be 0 solutions, 1 solution, or many solutions. This particular problem seems to have at least 2 distinct solutions. One is:

P = ⟨(a, c), (c, f), (f, e), (e, g)⟩.

## Some Additional Solutions

If cycles are permitted in the path, there may be an infinite number of solutions, e.g.:

P = ⟨(a, d), (d, c), (c, d), ..., (d, c), (c, f), (f, e), (e, g), (g, f), ..., (f, e), (e, g)⟩

## Avoiding Loops

Even if cycles do not occur in the solution, they may cause a nuisance for any search process that is not prepared for them. Therefore, it is important to keep track of which vertices have been visited during a search. We’ll maintain two lists, OPEN and CLOSED:

- OPEN holds the vertices that have been discovered so far but not yet examined.
- CLOSED holds the vertices that have already been examined.

## Depth-First Search

```
Set OPEN = ⟨S⟩; set CLOSED = ⟨⟩.
Initialize a hash table PRED (for “predecessor”); put (S, NULL) into PRED.
While OPEN is not empty do {
    Remove the first element of OPEN; call this V.
    If Goal(V), then output the reverse of the linked list
        V, PRED(V), PRED(PRED(V)), etc., and return.
    Find all W such that (V, W) ∈ E; call this set L.
    Remove from L any elements already on OPEN or CLOSED.
    For each of the remaining elements V’ of L, put (V’, V) into PRED
        and put V’ onto OPEN at the front of the list.
    Put V onto CLOSED.
}
Output “No solution” and return.
```
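The pseudocode above can be sketched in Python as follows. The 7-vertex graph at the end is a hypothetical reconstruction of the slides’ example (the transcript’s diagram is missing, so its edge set is inferred from the listed solutions); OPEN is a plain list used as a stack:

```python
def depth_first_search(adj, s, goal):
    """DFS per the slide: OPEN as a stack, CLOSED as a set,
    PRED as a hash table of predecessors for path recovery."""
    OPEN = [s]
    CLOSED = set()
    PRED = {s: None}
    while OPEN:
        v = OPEN.pop(0)                  # remove the first element of OPEN
        if goal(v):
            path = []                    # walk PRED links back to s ...
            while v is not None:
                path.append(v)
                v = PRED[v]
            return path[::-1]            # ... then reverse
        L = [w for w in adj.get(v, []) if w not in OPEN and w not in CLOSED]
        for w in L:
            PRED[w] = v
        OPEN[0:0] = L                    # front of the list -> stack behavior
        CLOSED.add(v)
    return None                          # "No solution"

# Hypothetical reconstruction of the slides' example graph:
adj = {"a": ["b", "c", "d"], "b": [], "c": ["d", "f"],
       "d": ["c"], "e": ["g"], "f": ["e"], "g": ["f"]}
```

Because OPEN and CLOSED filter out already-seen vertices, the cycles c↔d and f→e→g→f cannot trap the search.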

## Breadth-First Search

```
Set OPEN = ⟨S⟩; set CLOSED = ⟨⟩.
Initialize a hash table PRED (for “predecessor”); put (S, NULL) into PRED.
While OPEN is not empty do {
    Remove the first element of OPEN; call this V.
    If Goal(V), then output the reverse of the linked list
        V, PRED(V), PRED(PRED(V)), etc., and return.
    Find all W such that (V, W) ∈ E; call this set L.
    Remove from L any elements already on OPEN or CLOSED.
    For each of the remaining elements V’ of L, put (V’, V) into PRED
        and put V’ onto OPEN at the rear of the list.
    Put V onto CLOSED.
}
Output “No solution” and return.
```
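A Python sketch of the same procedure, differing from the DFS sketch only in that newly discovered vertices go onto the rear of OPEN (the example graph is again a hypothetical reconstruction, since the transcript’s diagram is missing):

```python
from collections import deque

def breadth_first_search(adj, s, goal):
    """BFS per the slide: identical to DFS except that new vertices
    are appended to the rear of OPEN, making it a queue."""
    OPEN = deque([s])
    CLOSED = set()
    PRED = {s: None}
    while OPEN:
        v = OPEN.popleft()               # remove the first element of OPEN
        if goal(v):
            path = []
            while v is not None:
                path.append(v)
                v = PRED[v]
            return path[::-1]
        L = [w for w in adj.get(v, []) if w not in OPEN and w not in CLOSED]
        for w in L:
            PRED[w] = v
            OPEN.append(w)               # rear of the list -> queue behavior
        CLOSED.add(v)
    return None                          # "No solution"

# Hypothetical reconstruction of the slides' example graph:
adj = {"a": ["b", "c", "d"], "b": [], "c": ["d", "f"],
       "d": ["c"], "e": ["g"], "f": ["e"], "g": ["f"]}
```

On this graph BFS returns a path of length 4, which is the fewest edges possible from a to g.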

## Comparison of DFS and BFS

- In DFS, OPEN serves as a stack; in BFS, OPEN serves as a queue. In both, CLOSED serves as a set.
- Both DFS and BFS find a solution if one exists.
- BFS always finds a shortest solution (if one exists).
- DFS finds some solution; which one depends on the order in which successors (members of L) are determined and processed.

## Vertex Visitation Order

In what order do vertices get processed (i.e., removed from OPEN and used as the value of V in the algorithm)?

- DFS: a, b, c, f, e, g
- BFS: a, b, c, d, f, e, g
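The orders above can be checked with a small sketch that records each vertex as it is processed. The graph is the same hypothetical reconstruction of the slides’ missing diagram used earlier, and the only difference between the two searches is which end of OPEN receives new vertices:

```python
from collections import deque

def visitation_order(adj, s, goal, breadth_first):
    """Run the slide's search and record the order in which vertices
    are removed from OPEN and processed."""
    OPEN = deque([s])
    CLOSED = set()
    order = []
    while OPEN:
        v = OPEN.popleft()
        order.append(v)
        if goal(v):
            return order
        L = [w for w in adj.get(v, []) if w not in OPEN and w not in CLOSED]
        if breadth_first:
            OPEN.extend(L)                   # rear of OPEN: queue
        else:
            OPEN.extendleft(reversed(L))     # front, preserving order: stack
        CLOSED.add(v)
    return order

# Hypothetical reconstruction of the slides' example graph:
adj = {"a": ["b", "c", "d"], "b": [], "c": ["d", "f"],
       "d": ["c"], "e": ["g"], "f": ["e"], "g": ["f"]}
```

With successors considered in the order listed in adj, this reproduces both visitation orders shown above.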