
1
Lecture 17: Path Algebra

Matrix multiplication of adjacency matrices of directed graphs gives important information about the graphs. Manipulating these matrices to study graphs is called path algebra. With path algebra we can solve the following problems: compute the total number of paths between all pairs of vertices in a directed acyclic graph; solve the all-pairs-shortest-paths problem in a weighted directed graph with no negative cycles; compute the transitive closure of a directed graph. Think: what is the meaning of M^2, where M is the adjacency matrix of a graph G?
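As a concrete answer to the "Think" question, here is a minimal sketch in plain Python; the 4-vertex graph (edges 0->1, 0->2, 1->2, 2->3) is made up for illustration. Squaring the adjacency matrix counts the length-2 paths.

```python
def mat_mult(A, B):
    """Ordinary (integer) product of two square matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Adjacency matrix of a hypothetical DAG: edges 0->1, 0->2, 1->2, 2->3.
M = [[0, 1, 1, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1],
     [0, 0, 0, 0]]

M2 = mat_mult(M, M)
# M2[i][j] is the number of length-2 paths from i to j:
# M2[0][2] == 1 (0->1->2), M2[0][3] == 1 (0->2->3), M2[1][3] == 1 (1->2->3)
```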

2
All paths of length r

Claim. M^r describes the paths of length r in G: entry (i,j) of M^r is the number of distinct r-step paths from i to j, where M is the adjacency matrix of G.

Proof. The base case r = 0 is trivial (M^0 = I). For the induction step, assume the claim holds for all r' < r; we prove it for r. We have

(M^r)_ij = (M · M^(r-1))_ij = Σ_{1 ≤ k ≤ n} M_ik (M^(r-1))_kj.

Any r-step path from i to j must start with a step to some intermediate vertex k. M_ik is 1 if the edge (i,k) exists and 0 otherwise, and by the induction hypothesis (M^(r-1))_kj counts the (r-1)-step paths from k to j. So the sum counts the r-step paths from i to j.
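The claim can be checked mechanically: below is a sketch (in plain Python, on a small made-up DAG) that compares the entries of M^3 against a brute-force enumeration of all 3-step paths.

```python
from itertools import product

def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(M, r):
    n = len(M)
    P = [[int(i == j) for j in range(n)] for i in range(n)]  # identity = M^0
    for _ in range(r):
        P = mat_mult(P, M)
    return P

def count_paths(M, i, j, r):
    """Brute force: count vertex sequences i -> ... -> j using exactly r edges."""
    if r == 0:
        return int(i == j)
    n = len(M)
    total = 0
    for mid in product(range(n), repeat=r - 1):
        seq = (i,) + mid + (j,)
        if all(M[seq[t]][seq[t + 1]] for t in range(r)):
            total += 1
    return total

# Hypothetical DAG: edges 0->1, 0->2, 1->2, 1->3, 2->3.
M = [[0, 1, 1, 0],
     [0, 0, 1, 1],
     [0, 0, 0, 1],
     [0, 0, 0, 0]]

P3 = mat_pow(M, 3)
ok = all(P3[i][j] == count_paths(M, i, j, 3)
         for i in range(4) for j in range(4))
# The only 3-step path is 0->1->2->3, so P3[0][3] == 1.
```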

3
Computing the matrix powers

Suppose G is acyclic, so no path is longer than n-1. Consider I + M + M^2 + ... + M^(n-1): its (i,j)'th entry gives the total number of distinct paths from i to j. Therefore, in O(n^(ω+1)) steps, we can compute the total number of distinct paths between all pairs of vertices. Here ω denotes the best-known exponent for matrix multiplication; currently ω = 2.376. We can do even better by first calculating M^2, M^4, ..., M^(2^(k-1)), where 2^k is the least power of 2 that is ≥ n, and then calculating

(I + M)(I + M^2)(I + M^4) ... (I + M^(2^(k-1))) = I + M + M^2 + ... + M^(2^k - 1).

Since G is acyclic, all powers beyond M^(n-1) are zero, so this product equals the sum we want. This gives an algorithm that runs in O(n^ω log n) time, where n is the number of vertices.
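The two approaches can be sketched side by side in plain Python (the example DAG is hypothetical; the doubling version relies on M^r = 0 for r ≥ n in a DAG):

```python
def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_add(A, B):
    n = len(A)
    return [[A[i][j] + B[i][j] for j in range(n)] for i in range(n)]

def identity(n):
    return [[int(i == j) for j in range(n)] for i in range(n)]

def path_counts_naive(M):
    """I + M + M^2 + ... + M^(n-1), one multiplication per power: n-1 products."""
    n = len(M)
    total, P = identity(n), identity(n)
    for _ in range(n - 1):
        P = mat_mult(P, M)
        total = mat_add(total, P)
    return total

def path_counts_doubling(M):
    """(I+M)(I+M^2)(I+M^4)...(I+M^(2^(k-1))) = I + M + ... + M^(2^k - 1).
    Valid for a DAG: M^r = 0 for r >= n, so the terms past M^(n-1) vanish.
    Only O(log n) multiplications."""
    n = len(M)
    k = max(1, (n - 1).bit_length())   # least k with 2^k >= n
    result = mat_add(identity(n), M)
    P = M
    for _ in range(k - 1):
        P = mat_mult(P, P)             # M^2, M^4, M^8, ...
        result = mat_mult(result, mat_add(identity(n), P))
    return result

# Hypothetical DAG: edges 0->1, 0->2, 1->2, 1->3, 2->3.
M = [[0, 1, 1, 0],
     [0, 0, 1, 1],
     [0, 0, 0, 1],
     [0, 0, 0, 0]]
# Paths 0 to 3: 0->1->3, 0->2->3, 0->1->2->3, so both methods give entry 3.
```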

4
Reachability graph

Given an unweighted directed graph G = (V,E), we want to form the graph G' that has an edge from u to v if and only if there exists a path (of any length) in G from u to v. Let's first see how to solve this using what we know from, say, CS 240. There, we explored depth-first and breadth-first search. These algorithms can find all vertices reachable from a given vertex in O(|V|+|E|) time. So if we run depth-first search from every vertex, the total time is O(|V|(|V|+|E|)), which could be as bad as O(n^3) if the graph is dense. Can we do better than O(n^3)?
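The search-based baseline can be sketched as follows (plain Python, iterative DFS; the adjacency-list graph is made up for illustration):

```python
def reachable_from(adj, s):
    """Iterative depth-first search: the set of vertices reachable
    from s, including s itself. O(|V| + |E|) per call."""
    seen = {s}
    stack = [s]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def reachability_graph(adj):
    """Run DFS from every vertex: O(|V| * (|V| + |E|)) overall."""
    return {u: reachable_from(adj, u) for u in adj}

# Hypothetical graph with a cycle between 2 and 3.
adj = {0: [1], 1: [2], 2: [3], 3: [2]}
R = reachability_graph(adj)
# R[0] == {0, 1, 2, 3}; R[3] == {2, 3}
```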

5
Transitive closure

Back to path algebra. Now our matrix M consists of 1's and 0's. How can we find the matrix with a 1 in row i and column j iff there is a length-2 path connecting vertices i and j? I claim the entry in row i and column j should be OR_{1 ≤ k ≤ n} (M_ik AND M_kj). That is, we just need to use Boolean multiplication and addition in our matrix computation. So the transitive closure of G is given by M' = I + M + M^2 + ... + M^(n-1), where + and * are the corresponding Boolean operations. It tells us whether there is a path between any two vertices.
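A minimal sketch of the Boolean version in plain Python (the cycle graph is hypothetical); this is the direct computation of I + M + ... + M^(n-1) with OR/AND in place of +/*:

```python
def bool_mult(A, B):
    """Boolean matrix product: entry (i,j) is OR over k of (A[i][k] AND B[k][j])."""
    n = len(A)
    return [[any(A[i][k] and B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transitive_closure_naive(M):
    """I + M + M^2 + ... + M^(n-1) under Boolean operations:
    entry (i,j) is True iff j is reachable from i (in 0 or more steps)."""
    n = len(M)
    closure = [[bool(i == j) for j in range(n)] for i in range(n)]  # I
    P = [[bool(i == j) for j in range(n)] for i in range(n)]
    for _ in range(n - 1):
        P = bool_mult(P, M)  # Boolean M^1, M^2, ...
        closure = [[closure[i][j] or P[i][j] for j in range(n)]
                   for i in range(n)]
    return closure

# Hypothetical graph: edges 0->1, 1->2, 2->3, 3->1 (a cycle 1->2->3->1).
M = [[0, 1, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1],
     [0, 1, 0, 0]]
C = transitive_closure_naive(M)
# Everything is reachable from 0, but 0 is reachable from nothing else.
```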

6
Transitive closure, continued

How fast can we compute I + M + M^2 + ... + M^(n-1)? We can multiply two Boolean matrices in O(n^ω) steps. Using the "doubling trick", we need only O(log n) Boolean matrix multiplications, which gives a total cost of O(n^ω log n) to solve the transitive closure problem. This is indeed better than simply running breadth-first or depth-first search from each vertex. Think: what if n is not a power of 2?
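The doubling trick here amounts to repeatedly squaring R = I + M under Boolean arithmetic, since (I+M)^(2^t) = I + M + ... + M^(2^t). A sketch in plain Python (same hypothetical cycle graph as before); the comment also answers the "Think" question:

```python
def bool_mult(A, B):
    n = len(A)
    return [[any(A[i][k] and B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transitive_closure_doubling(M):
    """Square R = I OR M until the included powers reach M^(n-1).
    If n is not a power of 2, we simply square ceil(log2(n-1)) times;
    the extra powers beyond M^(n-1) add nothing new, because any walk
    between two vertices can be shortened to a path of length <= n-1."""
    n = len(M)
    R = [[i == j or bool(M[i][j]) for j in range(n)] for i in range(n)]
    t = 0
    while (1 << t) < n - 1:      # stop once 2^t >= n-1
        R = bool_mult(R, R)      # R becomes (I+M)^(2^(t+1))
        t += 1
    return R

# Hypothetical graph: edges 0->1, 1->2, 2->3, 3->1.
M = [[0, 1, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1],
     [0, 1, 0, 0]]
C = transitive_closure_doubling(M)
```

For n = 4 this uses two squarings instead of three naive multiplications; the gap grows to n-1 versus ceil(log2(n-1)) for larger n.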
