1 308-203A Introduction to Computing II Lecture 16: Dijkstra’s Algorithm Fall Session 2000

2 Graphs with Weighted Edges Graphs frequently come with extra “satellite data” for the vertices or edges. A common variant is assigning each edge some positive weight. Graph G = (V, E), weight: E → ℝ (i.e. a mapping from edges to real numbers)
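
As a concrete illustration (not something from the slides), one common way to store such a weight function is an adjacency map that records, for each vertex, its neighbours and the weight of the connecting edge. The class and method names below are illustrative only.

    import java.util.HashMap;
    import java.util.Map;

    public class WeightedGraph {
        // adj.get(u).get(v) plays the role of weight(u, v); a missing entry means "no edge"
        private final Map<String, Map<String, Double>> adj = new HashMap<>();

        public void addEdge(String u, String v, double weight) {
            adj.computeIfAbsent(u, k -> new HashMap<>()).put(v, weight);
            adj.computeIfAbsent(v, k -> new HashMap<>()).put(u, weight); // undirected graph
        }

        public Map<String, Double> neighbors(String u) {
            return adj.getOrDefault(u, Map.of());
        }
    }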

3 Example The paradigmatic example is a map of highways labelled with distances: [Figure: a road map joining Montréal, Trois-Rivières, Québec, Sherbrooke, Roberval, Chicoutimi, Rivière-du-Loup and Val-d’Or, with the highway segments labelled by distances (142, 147, 158, 130, 206, 259, 101, 211, 318, 531).]

4 Weight of a Path For a path P = (v_1, v_2, …, v_k), the weight of the path is the sum of its edge weights: W_P = Σ weight(v_i, v_{i+1}), summed over i = 1, …, k-1
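
The formula translates directly into a loop over consecutive pairs of vertices. The sketch below is illustrative: the map-of-maps weight table and the method name are mine, and it assumes every consecutive pair in the path really is an edge of the graph.

    import java.util.List;
    import java.util.Map;

    public class PathWeight {
        // weights.get(u).get(v) plays the role of weight(u, v)
        public static double weightOfPath(Map<String, Map<String, Double>> weights,
                                          List<String> path) {
            double total = 0.0;
            for (int i = 0; i + 1 < path.size(); i++) {
                // add weight(v_i, v_{i+1}); assumes this edge exists in the table
                total += weights.get(path.get(i)).get(path.get(i + 1));
            }
            return total;
        }
    }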

5 Shortest-Path Problems There is a general class of problems on such graphs involving finding paths of minimal weight. Some are easy to solve and some are hard….

6 A Hard Problem The Travelling Salesman Problem: find a shortest path through a connected graph which visits all of the vertices exactly once. This problem is “NP-Complete,” which means that no computationally tractable (polynomial-time) solution is known for it; decades of experience suggest that NP-Complete problems require exponential time to solve, although this has never been proven.

7 A Not-So-Hard Problem The Single-Source Shortest Path Problem: given a vertex v, find the shortest path from v to every vertex reachable from v. A nice solution is Dijkstra’s algorithm, which is an example of a “Greedy” algorithm.

8 Greedy algorithms A “Greedy” algorithm is an algorithm based on the principle of always taking the choice which looks best in the short-run. In most cases, this does not actually yield an optimal long-term solution (or even a good one)

9 A Greedy Algorithm that Works Say you’re on a long drive. Clearly, the way to minimize the number of gas stops is to stop at a gas station only when you know you’ll run out of gas before you see another one. This is “greedy” because given the choice to stop or not to stop, you always choose the one which appears to minimize the total number of stops.
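
A minimal sketch of that greedy rule, under assumptions of my own (station positions are given in increasing order of distance from the start, and no two consecutive stops are farther apart than one tank's range):

    import java.util.ArrayList;
    import java.util.List;

    public class GasStops {
        // Greedy choice: drive past a station whenever the next stop is still within
        // range of the last fill-up; stop only when skipping it would strand you.
        public static List<Double> planStops(double[] stations, double destination, double range) {
            List<Double> stops = new ArrayList<>();
            double lastFill = 0.0;                                // start with a full tank
            for (int i = 0; i < stations.length; i++) {
                double next = (i + 1 < stations.length) ? stations[i + 1] : destination;
                if (next - lastFill > range) {                    // can't reach the next stop otherwise
                    stops.add(stations[i]);
                    lastFill = stations[i];
                }
            }
            return stops;
        }
    }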

10 A Flawed “Greedy Algorithm” A greedy algorithm does not work for the Travelling Salesman Problem: [Figure: a five-vertex graph on a, b, c, d, e with edge weights 10, 1, 5, 8, 3 and 14.] Assume we start from “a”

11 A Flawed “Greedy Algorithm” [Figure: the same five-vertex graph.] The best “greedy” choices give the path: P = (a, b, c, d, e), Weight = 30

12 A Flawed “Greedy Algorithm” [Figure: the same five-vertex graph.] But I can find a better path: P = (a, b, e, c, d), Weight = 26

13 Dijkstra’s Algorithm For the Single-Source Shortest Path Problem, the “Greedy” approach actually does work. Idea: 1) Label all vertices with a best-known distance so far 2) Initialize the distance to 0 for the starting vertex and ∞ for all others

14 Dijkstra’s Algorithm Idea (continued): 3) Maintain a list, A, of “unexplored” vertices, which is initialized to all vertices 4) Choose vertices from A to explore by a “greedy” strategy: always take the one with minimum distance 5) When you explore a vertex, update the best known distance to its neighbors and remove it from A.

15 Dijkstra (pseudocode)

for each v ∈ V, distance(v) = ∞ ;
distance( startVertex ) = 0 ;
A := V ;                            // Initialize
B := Ø ;

for j := 1 to ( |V| - 1 )           // Grow set B
{
    u := Extract-Min( A ) ;         // "Greedy" choice
    B := B ∪ { u } ;
    for each v ∈ Neighbors( u )     // Update distances
        Relax( v, u )
}

16 Helper Routine

Relax(Vertex v, Vertex u)
{
    newWeight = distance( u ) + weight( u, v ) ;
    if ( newWeight < distance( v ) )
        distance( v ) = newWeight ;
}

Updates are performed by checking whether u yields a better way to get to v
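
To make the pseudocode concrete, here is a runnable Java sketch that follows the same structure: a distance table, the unexplored set A, a linear-scan Extract-Min, and Relax. The class and method names and the adjacency-map representation are my own; it assumes the map has an entry for every vertex.

    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;

    public class Dijkstra {

        public static Map<String, Double> shortestDistances(
                Map<String, Map<String, Double>> weights, String startVertex) {

            // Steps 1)-2): every vertex starts at infinity, except the start vertex at 0
            Map<String, Double> distance = new HashMap<>();
            for (String v : weights.keySet()) distance.put(v, Double.POSITIVE_INFINITY);
            distance.put(startVertex, 0.0);

            Set<String> a = new HashSet<>(weights.keySet());      // A: the unexplored vertices

            // Steps 4)-5): |V| - 1 times, take the unexplored vertex with the smallest
            // known distance (the "greedy" choice) and relax its neighbours
            for (int j = 1; j < weights.size(); j++) {
                String u = extractMin(a, distance);
                a.remove(u);                                      // u implicitly joins B
                for (Map.Entry<String, Double> edge : weights.get(u).entrySet()) {
                    relax(distance, edge.getKey(), u, edge.getValue());
                }
            }
            return distance;
        }

        // Relax(v, u): does going through u give a better way to reach v?
        private static void relax(Map<String, Double> distance, String v, String u, double w) {
            double newWeight = distance.get(u) + w;
            if (newWeight < distance.get(v)) distance.put(v, newWeight);
        }

        // Extract-Min over a plain set: scan every unexplored vertex
        private static String extractMin(Set<String> a, Map<String, Double> distance) {
            String best = null;
            for (String v : a) {
                if (best == null || distance.get(v) < distance.get(best)) best = v;
            }
            return best;
        }
    }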

17 Example Initialize starting vertex to 0 and all others to ∞. A = {a, b, c, d, e, f}, B = Ø. [Figure: a six-vertex example graph on a, b, c, d, e and f with edge weights 5, 3, 8, 2, 7, 3 and 1; a is labelled 0 and every other vertex ∞.]

18 Example Remove minimum element from A and call Relax(). A = {b, c, d, e, f}, B = {a}. [Figure: after exploring a, its neighbours carry the tentative distances 3 and 5; the remaining vertices are still ∞.]

20 Example Now “d” is the minimum in A. A = {b, c, e, f}, B = {a, d}. [Figure: relaxing from d produces the new tentative distance 8 + 3 = 11; the other labels are unchanged.]

21 Example Now “b” is the minimum in A: we find we now have a better path to “e”. A = {c, e, f}, B = {a, b, d}. [Figure: relaxing from b improves the distance to e: 5 + 2 = 7.]

22 Example Now “e” is the minimum in A. A = {c, f}, B = {a, b, d, e}. [Figure: relaxing from e gives its remaining neighbours the tentative distances 7 + 7 = 14 and 7 + 3 = 10.]

23 Example Now “f” is the minimum in A. A = {f}, B = {a, b, c, d, e}. [Figure: one more relaxation gives 10 + 1 = 11; the distance labels now read 0, 3, 5, 7, 10 and 11.]

24 Example Done! A = {f}, B = {a, b, c, d, e}. [Figure: the finished graph, with distance labels 0, 3, 5, 7, 10 and 11.]
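
The exact edges of the six-vertex graph in these slides live in the figures, so the snippet below runs the Dijkstra sketch from earlier on a small made-up graph instead; the vertex names and weights here are purely illustrative, not the slides' values.

    import java.util.HashMap;
    import java.util.Map;

    public class DijkstraDemo {
        // convenience: add an undirected edge to the adjacency map
        static void edge(Map<String, Map<String, Double>> g, String u, String v, double w) {
            g.computeIfAbsent(u, k -> new HashMap<>()).put(v, w);
            g.computeIfAbsent(v, k -> new HashMap<>()).put(u, w);
        }

        public static void main(String[] args) {
            Map<String, Map<String, Double>> g = new HashMap<>();
            edge(g, "a", "b", 2.0);
            edge(g, "a", "c", 5.0);
            edge(g, "b", "c", 1.0);
            edge(g, "c", "d", 2.0);

            // expected distances: a=0.0, b=2.0, c=3.0, d=5.0 (printed order may vary)
            System.out.println(Dijkstra.shortestDistances(g, "a"));
        }
    }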

25 Running time As in Depth- and Breadth-First Search, the relaxation work looks at each vertex’s neighbours once, for a total of O( |V| + |E| ) relaxations. Unlike those searches, however, each iteration also performs an Extract-Min: with A kept as a plain list this costs O( |V| ) per extraction, giving O( |V|² + |E| ) overall, and with a binary heap the total drops to O( ( |V| + |E| ) log |V| ).
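
As an aside (not from the slides), the same algorithm is often written with a binary heap replacing the linear Extract-Min; the sketch below uses Java's PriorityQueue and skips stale queue entries ("lazy deletion"), and again assumes the adjacency map has an entry for every vertex.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.PriorityQueue;

    public class HeapDijkstra {
        public static Map<String, Double> shortestDistances(
                Map<String, Map<String, Double>> weights, String start) {
            Map<String, Double> distance = new HashMap<>();
            for (String v : weights.keySet()) distance.put(v, Double.POSITIVE_INFINITY);
            distance.put(start, 0.0);

            // queue entries are (vertex, tentative distance); smallest distance comes out first
            PriorityQueue<Map.Entry<String, Double>> queue =
                    new PriorityQueue<>(Map.Entry.comparingByValue());
            queue.add(Map.entry(start, 0.0));

            while (!queue.isEmpty()) {
                Map.Entry<String, Double> e = queue.poll();
                String u = e.getKey();
                if (e.getValue() > distance.get(u)) continue;     // stale entry: u already improved
                for (Map.Entry<String, Double> edge : weights.get(u).entrySet()) {
                    double alt = distance.get(u) + edge.getValue();
                    if (alt < distance.get(edge.getKey())) {      // Relax(v, u)
                        distance.put(edge.getKey(), alt);
                        queue.add(Map.entry(edge.getKey(), alt));
                    }
                }
            }
            return distance;
        }
    }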

26 Proof of Correctness Since “Greedy Algorithms” don’t always give optimal solutions, we must furnish a proof that, in the particular case of Dijkstra’s algorithm, the greedy strategy really does compute the shortest possible paths…

27 Any questions?

