CPSC 411: Design and Analysis of Algorithms
Set 4: Greedy Algorithms
Prof. Jennifer Welch, Fall 2008
Greedy Algorithm Paradigm

Characteristics of greedy algorithms:
- make a sequence of choices
- each choice is the one that seems best so far, and depends only on what has been done so far
- each choice produces a smaller problem to be solved

In order for the greedy heuristic to solve the problem, it must be that the optimal solution to the big problem contains optimal solutions to subproblems.
Designing a Greedy Algorithm

- Cast the problem so that we make a greedy (locally optimal) choice and are left with one subproblem.
- Prove there is always a (globally) optimal solution to the original problem that makes the greedy choice.
- Show that the greedy choice together with an optimal solution to the subproblem gives an optimal solution to the original problem.
Some Greedy Algorithms

- Kruskal's MST algorithm
- Prim's MST algorithm
- Dijkstra's single-source shortest path algorithm
- fractional knapsack algorithm
- Huffman codes
- …
Minimum Spanning Tree

[Figure: example graph with weighted edges]

Given: a connected graph with weighted edges.
Goal: find a subset of the edges that spans all the nodes, creates no cycle, and minimizes the sum of the weights.
Facts About MSTs

- There can be many spanning trees of a graph.
- In fact, there can be many minimum spanning trees of a graph.

[Figure: small example graph with multiple minimum spanning trees]
Uniqueness of MST

Proposition: If every edge in a connected graph has a unique weight, then there is a unique MST.

Proof: Seeking a contradiction, suppose there are two distinct MSTs, say M1 and M2. Let e be a minimum-weight edge that is in one tree but not the other (w.l.o.g., suppose e is in M1). If e is added to M2, a cycle is formed. Let e' be an edge in that cycle that is not in M1.
Uniqueness of MST

[Figure: M2 with the cycle formed by adding e]

- e is in M1 but not M2; e' is in M2 but not M1.
- Since e was chosen as a minimum-weight edge among those in one tree but not the other, and weights are unique, wt(e) < wt(e').
- Replacing e' with e in M2 creates a new spanning tree M3 whose weight is less than that of M2, contradicting that M2 is an MST.
Kruskal's MST Algorithm

[Figure: example graph with weighted edges]

Consider the edges in increasing order of weight; add an edge iff it does not cause a cycle.
Why is Kruskal's Greedy?

The algorithm maintains a set of edges such that these edges are a subset of some MST. At each iteration:
- choose an edge so that the MST-subset property remains true
- the subproblem left is to do the same with the remaining edges

It always tries to add the cheapest available edge that will not violate the tree property: a locally optimal choice.
Correctness of Kruskal's Alg.

Let E = {e_1, e_2, …, e_{n-1}} be the sequence of edges chosen. The graph corresponding to E is acyclic and connects all vertices, so it must be a tree.

Suppose it is not of minimum weight. Let e_i be the edge where the algorithm goes wrong: {e_1, …, e_{i-1}} is part of some MST M, but {e_1, …, e_i} is not part of any MST.
Correctness of Kruskal's Alg.

[Figure: MST M (white edges) together with the edge e_i]

- The white edges form MST M, which contains e_1, …, e_{i-1} but not e_i.
- Adding e_i to M forms a cycle in M.
- Let e* be a minimum-weight edge in that cycle, other than e_i, that is not among e_1, …, e_{i-1}.
- If wt(e*) were smaller than wt(e_i), the algorithm would have added e* earlier (it creates no cycle with e_1, …, e_{i-1}), so wt(e*) ≥ wt(e_i).
- Replacing e* with e_i in M forms a spanning tree containing {e_1, …, e_i} whose weight is at most that of M, a contradiction either way.
Implementing Kruskal's Alg.

- Sort the edges by weight: efficient sorting algorithms are known.
- To test quickly whether adding the next edge would cause a cycle, use a disjoint-set data structure (more on this later).

A sketch of the whole algorithm follows.
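As a concrete illustration (not the course's own code), here is a minimal Python sketch of Kruskal's algorithm using a simple union-find structure for the cycle test; the input format, a vertex count n and a list of (weight, u, v) tuples, is an assumption made for this example.

def kruskal(n, edges):
    # edges: list of (weight, u, v) tuples; vertices are 0..n-1 (assumed format).
    parent = list(range(n))
    rank = [0] * n

    def find(x):
        # Find the root of x's set, compressing the path as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(x, y):
        # Merge the sets containing x and y; return False if already merged.
        rx, ry = find(x), find(y)
        if rx == ry:
            return False
        if rank[rx] < rank[ry]:
            rx, ry = ry, rx
        parent[ry] = rx
        if rank[rx] == rank[ry]:
            rank[rx] += 1
        return True

    mst = []
    for w, u, v in sorted(edges):      # consider edges in increasing order of weight
        if union(u, v):                # adding (u, v) does not cause a cycle
            mst.append((u, v, w))
    return mst

With this representation the sort dominates the running time, giving O(E log E) overall.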
Another Greedy MST Alg.

- Kruskal's algorithm maintains a forest that grows until it forms a spanning tree.
- An alternative idea is to keep just one tree and grow it until it spans all the nodes: this is Prim's algorithm.
- At each iteration, choose the minimum-weight outgoing edge to add (greedy!); see the sketch below.
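For comparison, a minimal Python sketch of Prim's algorithm (again illustrative, not from the slides): a heap holds the candidate outgoing edges so the minimum-weight one can be extracted at each step. The adjacency-list format, adj[u] as a list of (weight, v) pairs, and the choice of start vertex are assumptions for this example.

import heapq

def prim(n, adj, start=0):
    # adj[u]: list of (weight, v) pairs for an undirected, connected graph on 0..n-1.
    in_tree = [False] * n
    in_tree[start] = True
    heap = [(w, start, v) for (w, v) in adj[start]]   # outgoing edges of the tree
    heapq.heapify(heap)
    mst = []
    while heap and len(mst) < n - 1:
        w, u, v = heapq.heappop(heap)    # minimum-weight outgoing edge
        if in_tree[v]:
            continue                     # both endpoints already in the tree; skip
        in_tree[v] = True
        mst.append((u, v, w))
        for w2, x in adj[v]:
            if not in_tree[x]:
                heapq.heappush(heap, (w2, v, x))
    return mst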
Knapsack Problem

- There are n different items in a store.
- Item i weighs w_i pounds and is worth $v_i.
- A thief breaks in. He can carry up to W pounds in his knapsack.
- What should he take to maximize the value of his haul?
0-1 vs. Fractional Knapsack

0-1 Knapsack Problem:
- the items cannot be divided
- the thief must take an entire item or leave it behind

Fractional Knapsack Problem:
- the thief can take partial items
- for instance, the items are liquids or powders
- solvable with a greedy algorithm…
Greedy Fractional Knapsack Algorithm

- Sort the items in decreasing order of value per pound.
- While there is still room in the knapsack (limit of W pounds):
  - consider the next item in the sorted list
  - take as much as possible (all there is, or as much as will fit)
- O(n log n) running time (for the sort).

A sketch of this algorithm follows.
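A minimal Python sketch of this greedy rule (illustrative; the item format, a list of (value, weight) pairs, is an assumption for this example):

def fractional_knapsack(items, W):
    # items: list of (value, weight) pairs with positive weights; W: capacity in pounds.
    total = 0.0
    remaining = W
    # Consider items in decreasing order of value per pound.
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if remaining <= 0:
            break
        take = min(weight, remaining)        # all there is, or as much as will fit
        total += value * (take / weight)
        remaining -= take
    return total

For example, fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50) returns 240.0: all of items 1 and 2 plus two thirds of item 3.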
Greedy 0-1 Knapsack Alg?

3 items:
- item 1 weighs 10 lbs, worth $60 ($6/lb)
- item 2 weighs 20 lbs, worth $100 ($5/lb)
- item 3 weighs 30 lbs, worth $120 ($4/lb)

The knapsack can hold 50 lbs. The greedy strategy:
- take item 1
- take item 2
- no room for item 3
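A small script checking this example (illustrative; the brute-force comparison is added here and is not on the slides):

from itertools import combinations

# Items from the slide as (value, weight) pairs; the knapsack holds 50 lbs.
items = [(60, 10), (100, 20), (120, 30)]
W = 50

# Greedy by value per pound, whole items only.
greedy_value, room = 0, W
for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
    if weight <= room:
        greedy_value += value
        room -= weight

# Best achievable value, by brute force over all subsets (fine for 3 items).
best = max(sum(v for v, _ in combo)
           for r in range(len(items) + 1)
           for combo in combinations(items, r)
           if sum(w for _, w in combo) <= W)

print(greedy_value, best)   # prints: 160 220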
0-1 Knapsack Problem

Taking item 1 is a big mistake globally, although it looks good locally: the greedy strategy nets $160, while taking items 2 and 3 fills the 50-lb knapsack for $220.

Later we'll see a different algorithm design paradigm that does work for this problem.
Finding an Optimal Source Code

Input: a data file of characters and the number of occurrences of each character.
Output: a binary encoding of each character so that the data file can be represented as efficiently as possible.

The code is optimal if the file is produced by a memoryless source that has the specified character frequencies.
Huffman Code

Idea: use short codes for more frequent characters and long codes for less frequent characters.

char            a     b     c     d     e      f     total bits
#              45    13    12    16     9      5
fixed         000   001   010   011   100    101     300
variable        0   101   100   111  1101   1100     224
How to Decode?

- With a fixed-length code it is easy: break the bit string up into 3-bit blocks, for instance.
- For a variable-length code, ensure that no character's code is a prefix of another's: then there is no ambiguity.

Example: 101111110100 decodes as b d e a a.
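A tiny Python sketch of prefix-code decoding using the variable-length code from the table above (the bit-by-bit scanning approach is illustrative, not taken from the slides):

# Variable-length code from the earlier table.
code = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}
decode_table = {bits: ch for ch, bits in code.items()}

def decode(bitstring):
    out, buf = [], ''
    for bit in bitstring:
        buf += bit
        if buf in decode_table:      # prefix-freeness makes the first match unambiguous
            out.append(decode_table[buf])
            buf = ''
    return ''.join(out)

print(decode('101111110100'))   # prints: bdeaa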
Binary Tree Representation

[Figure: binary tree for the fixed-length code, with edges labeled 0/1 and leaves a, b, c, d, e, f]

The cost of the code is the sum, over all characters c, of (number of occurrences of c) times (depth of c in the tree).
Binary Tree Representation

[Figure: binary tree for the variable-length code, with edges labeled 0/1 and leaves a, b, c, d, e, f]

Again, the cost of the code is the sum, over all characters c, of (number of occurrences of c) times (depth of c in the tree).
Algorithm to Construct Tree Representing Huffman Code

Given a set C of n characters, where character c occurs f[c] times:

insert each c into priority queue Q using f[c] as key
for i := 1 to n-1 do
    x := extract-min(Q)
    y := extract-min(Q)
    make a new node z with left child x (label edge 0),
        right child y (label edge 1), and f[z] = f[x] + f[y]
    insert z into Q
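A Python sketch of this construction (illustrative: the node representation, the heapq priority queue, and the integer tie-breaker are choices made for this example, not part of the slides):

import heapq

def huffman(freq):
    # freq: dict mapping each character to its number of occurrences.
    # Heap entries are (frequency, tie_breaker, tree); a tree is either a single
    # character or a (left, right) pair.  Edge to left child = 0, to right child = 1.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        fx, _, x = heapq.heappop(heap)                    # x := extract-min(Q)
        fy, _, y = heapq.heappop(heap)                    # y := extract-min(Q)
        heapq.heappush(heap, (fx + fy, count, (x, y)))    # z with f[z] = f[x] + f[y]
        count += 1
    return heap[0][2]

def codes(tree, prefix=''):
    # Read the codewords off the tree: left edge = 0, right edge = 1.
    if isinstance(tree, str):
        return {tree: prefix or '0'}
    left, right = tree
    out = codes(left, prefix + '0')
    out.update(codes(right, prefix + '1'))
    return out

print(codes(huffman({'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5})))

On the frequencies from the earlier table this produces codeword lengths 1, 3, 3, 3, 4, 4 for a through f, matching the variable-length code's total cost of 224, though the exact 0/1 assignments can differ depending on tie-breaking.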