
Slide 1: Chapter 9: Greedy Technique. Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin, Introduction to the Design & Analysis of Algorithms, 2nd ed., Ch. 9.

Slide 2: Greedy Technique
An optimization problem is defined by an objective function and a set of constraints. The greedy technique constructs a solution to such a problem piece by piece, through a sequence of choices that are:
- feasible, i.e., satisfying the constraints,
- locally optimal (with respect to some neighborhood definition),
- greedy (in terms of some measure), and irrevocable.
For some problems it yields a globally optimal solution for every instance; for most it does not, but it can be useful for fast approximations. In this class we are mostly interested in the former case.

Slide 3: Applications of the Greedy Strategy
Optimal solutions:
- change making for normal coin denominations
- minimum spanning tree (MST)
- single-source shortest paths
- simple scheduling problems
- Huffman codes
Approximations/heuristics:
- traveling salesman problem (TSP)
- knapsack problem
- other combinatorial optimization problems

Slide 4: Change-Making Problem
Given unlimited amounts of coins of denominations d1 > … > dm, give change for amount n with the least number of coins.
Example: d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c and n = 48c.
Greedy solution: repeatedly take the largest denomination that does not exceed the remaining amount: one quarter, two dimes, and three pennies (25 + 10 + 10 + 1 + 1 + 1 = 48), six coins in total.
The greedy solution is:
- optimal for any amount and a normal set of denominations,
- possibly non-optimal for arbitrary coin denominations. For example, with d1 = 25c, d2 = 10c, d3 = 1c and n = 30c, greedy uses six coins (25 + 1 + 1 + 1 + 1 + 1) while three dimes suffice.
Exercise: Prove that the greedy algorithm is optimal for the denominations above.
Q: What are the objective function and constraints?
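
As an illustration (not from the textbook), here is a minimal Python sketch of the greedy rule; the function name greedy_change is made up, and the test values are the ones from this slide:

def greedy_change(denominations, amount):
    # Greedy change-making: take as many coins of the largest denomination as fit,
    # then move on to the next one. Assumes denominations are listed in decreasing order.
    counts = []
    for d in denominations:
        counts.append(amount // d)   # number of coins of denomination d
        amount %= d
    return counts

print(greedy_change((25, 10, 5, 1), 48))   # [1, 2, 0, 3] -> 6 coins, optimal here
print(greedy_change((25, 10, 1), 30))      # [1, 0, 5] -> 6 coins, but 10+10+10 needs only 3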

Slide 5: Minimum Spanning Tree (MST)
- Spanning tree of a connected graph G: a connected acyclic subgraph of G that includes all of G's vertices.
- Minimum spanning tree of a weighted, connected graph G: a spanning tree of G of minimum total weight.
(Figure: a small weighted graph on vertices a, b, c, d together with two of its spanning trees.)

Slide 6: Prim's MST Algorithm
- Start with tree T1 consisting of one (any) vertex and grow the tree one vertex at a time to produce the MST through a series of expanding subtrees T1, T2, …, Tn.
- On each iteration, construct T(i+1) from T(i) by adding the vertex not in T(i) that is closest to those already in T(i) (this is the greedy step!).
- Stop when all vertices are included.

Slide 7: Example
(Figure: step-by-step run of Prim's algorithm on the four-vertex example graph, one added vertex per frame.)

Slide 8: Notes about Prim's Algorithm
- Proof by induction that this construction actually yields an MST (CLRS, Ch. 23.1); the main property is given on the next slide.
- Needs a priority queue for locating the closest fringe vertex. The detailed algorithm can be found in Levitin.
- Efficiency:
  - O(n^2) for the weight-matrix representation of the graph and an array implementation of the priority queue,
  - O(m log n) for the adjacency-list representation of a graph with n vertices and m edges and a min-heap implementation of the priority queue.
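
A minimal Python sketch of the O(m log n) variant just described, using the standard-library heapq module as the min-heap with lazy deletion of stale entries; the adjacency-list input format and the small test graph (its weights are made up for the demo) are assumptions, not taken from the slides:

import heapq

def prim_mst(adj, start=0):
    # Prim's algorithm with a binary min-heap, O(m log n).
    # adj[u] is a list of (v, weight) pairs of an undirected connected graph.
    n = len(adj)
    in_tree = [False] * n
    mst = []                             # chosen edges as (tree endpoint, new vertex, weight)
    total = 0
    heap = [(0, start, start)]           # (attaching-edge weight, vertex, tree endpoint)
    while heap:
        w, u, parent = heapq.heappop(heap)
        if in_tree[u]:
            continue                     # stale entry: u was attached earlier by a lighter edge
        in_tree[u] = True
        total += w
        if u != parent:
            mst.append((parent, u, w))
        for v, wv in adj[u]:             # push fringe edges leaving the newly added vertex
            if not in_tree[v]:
                heapq.heappush(heap, (wv, v, u))
    return mst, total

adj = [[(1, 3), (3, 7)],                 # vertex 0 (a)
       [(0, 3), (2, 4), (3, 2)],         # vertex 1 (b)
       [(1, 4), (3, 5)],                 # vertex 2 (c)
       [(0, 7), (1, 2), (2, 5)]]         # vertex 3 (d)
print(prim_mst(adj))                     # ([(0, 1, 3), (1, 3, 2), (1, 2, 4)], 9)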

Slide 9: The Crucial Property behind Prim's Algorithm
Claim: Let G = (V, E) be a weighted graph and (X, Y) a partition of V (called a cut). Suppose e = (x, y) is an edge of E across the cut, with x in X and y in Y, and e has the minimum weight among all such crossing edges (a light edge). Then there is an MST containing e.
Proof idea (exchange argument): take any MST T. If e is not in T, adding e to T creates a cycle, and that cycle must cross the cut at some other edge e'. Replacing e' by e gives a spanning tree of weight no larger than T's, hence another MST, and it contains e.
(Figure: the cut (X, Y) with the light edge e = (x, y) crossing it.)

Slide 10: Another Greedy Algorithm for MST: Kruskal's
- Sort the edges in nondecreasing order of length.
- Grow the tree one edge at a time to produce the MST through a series of expanding forests F1, F2, …, F(n-1).
- On each iteration, add the next edge on the sorted list unless this would create a cycle. (If it would, skip the edge.)

Slide 11: Example
(Figure: step-by-step run of Kruskal's algorithm on the same four-vertex example graph, one added edge per frame.)

Slide 12: Notes about Kruskal's Algorithm
- The algorithm looks easier than Prim's but is harder to implement (checking for cycles!).
- Cycle checking: a cycle is created iff the added edge connects vertices in the same connected component.
- Union-find algorithms handle this check efficiently (see Section 9.2); a sketch follows below.
- Runs in O(m log m) time, with m = |E|; the time is mostly spent on sorting.
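
A minimal Python sketch of Kruskal's algorithm with the cycle check done by union-find (path compression plus union by size); the edge list and its weights are illustrative, not taken from the slides:

def kruskal_mst(n, edges):
    # Kruskal's algorithm: scan edges in nondecreasing weight order,
    # adding an edge unless it would close a cycle.
    # n = number of vertices, edges = list of (weight, u, v) tuples of an undirected graph.
    parent = list(range(n))
    size = [1] * n

    def find(x):                         # root of x's component, with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(u, v):                     # merge components; False if u, v already connected
        ru, rv = find(u), find(v)
        if ru == rv:
            return False
        if size[ru] < size[rv]:
            ru, rv = rv, ru
        parent[rv] = ru
        size[ru] += size[rv]
        return True

    mst = []
    for w, u, v in sorted(edges):        # sorting dominates the O(m log m) running time
        if union(u, v):                  # different components: adding (u, v) creates no cycle
            mst.append((u, v, w))
    return mst

edges = [(3, 0, 1), (4, 1, 2), (2, 1, 3), (7, 0, 3), (5, 2, 3)]
print(kruskal_mst(4, edges))             # [(1, 3, 2), (0, 1, 3), (1, 2, 4)]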

Slide 13: Minimum Spanning Tree vs. Steiner Tree
A Steiner tree is allowed to add new points (Steiner points) when connecting the given vertices. In general, a Steiner minimal tree (SMT) can be noticeably shorter than a minimum spanning tree (MST), but SMTs are hard to compute. For the four corners of a unit square, for example, the MST has length 3, while connecting the corners through a single Steiner point s at the center gives a tree of length 2√2 ≈ 2.83 (and the optimal SMT, with two Steiner points, has length 1 + √3 ≈ 2.73).
(Figure: the four given points connected by an MST vs. by a tree through the Steiner point s.)

Slide 14: Shortest Paths: Dijkstra's Algorithm
Single-source shortest paths problem: given a weighted connected (directed) graph G, find shortest paths from a source vertex s to each of the other vertices.
Dijkstra's algorithm is similar to Prim's MST algorithm, with a different way of computing the numerical labels: among vertices not already in the tree, it finds the vertex u with the smallest sum d_v + w(v, u), where
- v is a vertex for which a shortest path has already been found on preceding iterations (such vertices form a tree rooted at s),
- d_v is the length of the shortest path from the source s to v,
- w(v, u) is the length (weight) of the edge from v to u.

Slide 15: Example
(Figure: the weighted graph on vertices a, b, c, d, e used in the trace below.)
Tree vertices | Remaining vertices
a(-, 0)       | b(a, 3)   c(-, ∞)    d(a, 7)   e(-, ∞)
b(a, 3)       | c(b, 3+4) d(b, 3+2)  e(-, ∞)
d(b, 5)       | c(b, 7)   e(d, 5+4)
c(b, 7)       | e(d, 9)
e(d, 9)       |

Slide 16: Notes on Dijkstra's Algorithm
- Correctness can be proven by induction on the number of vertices, by maintaining two invariants: (i) when a vertex is added to the tree, its correct shortest distance has been computed, and (ii) that distance is at least those of the previously added vertices.
- Doesn't work for graphs with negative weights (whereas Floyd's algorithm does, as long as there is no negative cycle). Can you find a counterexample for Dijkstra's algorithm?
- Applicable to both undirected and directed graphs.
- Efficiency:
  - O(|V|^2) for graphs represented by a weight matrix and an array implementation of the priority queue,
  - O(|E| log |V|) for graphs represented by adjacency lists and a min-heap implementation of the priority queue.
- Don't mix up Dijkstra's algorithm with Prim's algorithm! More details of the algorithm are in the text and reference books.
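
A minimal Python sketch of the O(|E| log |V|) variant with heapq; the test graph is an assumption reconstructed from the trace on the example slide (a, b, c, d, e mapped to indices 0 through 4), so individual weights may differ from the lost figure:

import heapq

def dijkstra(adj, s):
    # Dijkstra's algorithm with a min-heap and lazy deletion of stale entries.
    # adj[u] = list of (v, weight) pairs; all weights must be nonnegative.
    INF = float('inf')
    dist = [INF] * len(adj)
    parent = [None] * len(adj)
    dist[s] = 0
    heap = [(0, s)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                       # stale entry: u already finalized with a shorter path
        for v, w in adj[u]:                # relax every edge leaving u
            if d + w < dist[v]:
                dist[v] = d + w
                parent[v] = u
                heapq.heappush(heap, (dist[v], v))
    return dist, parent

adj = [[(1, 3), (3, 7)],                   # a
       [(0, 3), (2, 4), (3, 2)],           # b
       [(1, 4), (4, 6)],                   # c
       [(0, 7), (1, 2), (4, 4)],           # d
       [(2, 6), (3, 4)]]                   # e
print(dijkstra(adj, 0))                    # distances [0, 3, 7, 5, 9] match the trace above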

Slide 17: Coding Problem
Coding: assignment of bit strings to alphabet characters. Codewords: the bit strings assigned to the characters of the alphabet.
Two types of codes:
- fixed-length encoding (e.g., ASCII),
- variable-length encoding (e.g., Morse code).
E.g., we can code {a, b, c, d} as {00, 01, 10, 11}, or {0, 10, 110, 111}, or {0, 01, 10, 101} (the last is not prefix-free, since 0 is a prefix of 01).
Prefix-free codes (or prefix codes): no codeword is a prefix of another codeword. This allows for efficient (online) decoding: e.g., consider decoding the encoded string (msg) …
Problem: if the frequencies of character occurrences are known, what is the best binary prefix-free code? The one with the shortest average code length, which represents how many bits are required, on average, to transmit or store a character. E.g., if P(a) = 0.4, P(b) = 0.3, P(c) = 0.2, P(d) = 0.1, then the average length of code #2 is 1(0.4) + 2(0.3) + 3(0.2) + 3(0.1) = 1.9 bits.
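
To see why prefix-freeness allows online decoding, here is a small illustrative Python sketch (the decode function and the sample bit string are not from the slides; the code used is code #2 above):

def decode(bits, code):
    # Scan the bit string left to right; as soon as the buffer matches a codeword,
    # emit its character and reset. Prefix-freeness makes every match unambiguous.
    inverse = {w: ch for ch, w in code.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

code = {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(decode('010110111', code))   # 'abcd'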

Slide 18: Huffman Codes
- Any binary tree with edges labeled with 0s and 1s yields a prefix-free code for the characters assigned to its leaves.
- An optimal binary tree, minimizing the average codeword length, can be constructed as follows.
Huffman's algorithm: Initialize n one-node trees with the alphabet characters and set the tree weights to their frequencies. Repeat the following step n-1 times: join the two binary trees with the smallest weights into one (as left and right subtrees) and make its weight equal to the sum of the weights of the two trees. Mark edges leading to left and right subtrees with 0s and 1s, respectively.
(Figure: a small labeled tree representing the prefix code {00, 011, 1}.)
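
A minimal Python sketch of Huffman's algorithm using heapq; representing trees as nested tuples and breaking ties with an insertion counter are implementation choices, not part of the textbook algorithm, so the exact codewords may differ from the slides even though the average length is the same. With the frequencies from the previous slide it reproduces the 1.9-bit average:

import heapq
from itertools import count

def huffman_code(freqs):
    # Huffman's algorithm: start with one-node trees weighted by frequency
    # and repeatedly merge the two lightest trees into one.
    tie = count()                                            # breaks ties between equal weights
    heap = [(f, next(tie), ch) for ch, f in freqs.items()]   # a leaf is just its character
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tie), (t1, t2)))  # internal node = (left, right)
    code = {}
    def assign(tree, prefix):                                # label left edges 0, right edges 1
        if isinstance(tree, tuple):
            assign(tree[0], prefix + "0")
            assign(tree[1], prefix + "1")
        else:
            code[tree] = prefix or "0"                       # degenerate one-character alphabet
        return code
    return assign(heap[0][2], "")

freqs = {'a': 0.4, 'b': 0.3, 'c': 0.2, 'd': 0.1}             # frequencies from the previous slide
code = huffman_code(freqs)
print(code)                                                  # e.g. {'a': '0', 'b': '10', 'd': '110', 'c': '111'}
print(sum(freqs[ch] * len(w) for ch, w in code.items()))     # average length: 1.9 bits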

Slide 19: Example
character:  A     B     C     D     _
frequency:  0.35  0.1   0.2   0.2   0.15
codeword:   11    100   00    01    101
average bits per character: 2.25
for fixed-length encoding: 3
compression ratio: (3 - 2.25)/3 × 100% = 25%

