Chapter 5. Greedy Algorithms


1 Chapter 5. Greedy Algorithms
Computer Algorithms and Laboratory. Some of the content in these lecture slides is from the textbook: Sanjoy Dasgupta, Christos Papadimitriou, and Umesh Vazirani, Algorithms, McGraw-Hill, 2008.

2 What is a greedy algorithm?
A greedy algorithm builds up a solution piece by piece, always choosing the next piece that offers the most obvious and immediate benefit.

3 Minimum Spanning Tree
Tree: a connected graph that does not contain any cycle.
Spanning Tree: a tree that covers all vertices of a graph.
Minimum Spanning Tree: a spanning tree T = (V, E') of G = (V, E) that minimizes the total edge weight, weight(T) = Σ_{e ∈ E'} w_e.

4 Example: a minimum spanning tree (figure)

5 A Greedy Approach: Kruskal's Method
Repeatedly add the next-lightest edge that does not create a cycle. Add edges in increasing order of weight (B-C, C-D, B-D, ...).

6 Properties of Trees A tree on n nodes has n-1 edges.
Any connected, undirected graph G=(V,E) with |E|=|V|-1 is a tree. An undirected graph is a tree if and only if there is a unique path between any pair of nodes.

7 Cut Property Suppose the edges X are part of a minimum spanning tree of G = (V, E). Pick any subset of nodes S for which X does not cross between S and V − S, and let e be the lightest edge across this partition. Then X ∪ {e} is part of some minimum spanning tree.

8 Proof (figure)

9 Cut Property at Work

10 Kruskal’s Algorithm

11 Implementation makeset(x): create a singleton set containing just x.
find(x): to which set does x belong? union(x, y): merge the sets containing x and y.
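To make these operations concrete, here is a minimal Python sketch of Kruskal's algorithm built on a disjoint-set (union-find) structure with union by rank and path compression; the example graph at the bottom is made up for illustration.

```python
# Minimal sketch: Kruskal's algorithm with a union-find (disjoint-set) structure.
parent, rank = {}, {}

def makeset(x):                 # create a singleton set containing just x
    parent[x] = x
    rank[x] = 0

def find(x):                    # to which set does x belong? (path compression)
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(x, y):                # merge the sets containing x and y (by rank)
    rx, ry = find(x), find(y)
    if rx == ry:
        return
    if rank[rx] < rank[ry]:
        rx, ry = ry, rx
    parent[ry] = rx
    if rank[rx] == rank[ry]:
        rank[rx] += 1

def kruskal(vertices, edges):
    """edges: list of (weight, u, v); returns the list of MST edges."""
    for v in vertices:
        makeset(v)
    mst = []
    for w, u, v in sorted(edges):       # edges in increasing-weight order
        if find(u) != find(v):          # adding (u, v) creates no cycle
            mst.append((u, v, w))
            union(u, v)
    return mst

# Hypothetical example graph:
print(kruskal("ABCD", [(1, "B", "C"), (2, "C", "D"), (2, "B", "D"), (4, "A", "B")]))
```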

12 (figure)

13 Prim’s Algorithm X={} (edges picked so far) repeat until |X| = |V| - 1
pick a set S ⊂ V for which X has no edges between S and V - S Let e ∈ E be the minimum-weight between S and V-S X = X + {e}
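In Prim's algorithm specifically, S is always the set of vertices already connected to the growing tree, so the rule becomes: repeatedly add the lightest edge leaving the tree. A minimal Python sketch under that reading, using a priority queue of candidate edges; the adjacency-list format and the example graph are assumptions for illustration.

```python
import heapq

def prim(graph, start):
    """graph: {vertex: [(weight, neighbor), ...]}; returns list of MST edges.

    Grow the tree from `start`, always taking the minimum-weight edge
    between the tree (S) and the rest (V - S).
    """
    visited = {start}
    frontier = [(w, start, v) for w, v in graph[start]]
    heapq.heapify(frontier)
    mst = []
    while frontier and len(visited) < len(graph):
        w, u, v = heapq.heappop(frontier)   # lightest edge leaving the tree
        if v in visited:
            continue
        visited.add(v)
        mst.append((u, v, w))
        for w2, x in graph[v]:
            if x not in visited:
                heapq.heappush(frontier, (w2, v, x))
    return mst

# Hypothetical example, starting from A as in the trace on the next slide:
g = {"A": [(4, "B"), (3, "C")], "B": [(4, "A"), (1, "C"), (2, "D")],
     "C": [(3, "A"), (1, "B"), (2, "D")], "D": [(2, "B"), (2, "C")]}
print(prim(g, "A"))
```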

14 Tracing Prim’s algorithm
Starting from A

15 Huffman Encoding Digitizing an Audio Signal
1. Sample at regular intervals.
2. Quantize each real-valued sample to a nearby value from a finite set T.
3. Encode the resulting values in binary.
Huffman encoding can be used for step 3 above.
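A small sketch of step 2, mapping a real-valued sample to the nearest value in a finite set T; the `quantize` helper, the levels, and the sample values are made up for illustration.

```python
def quantize(sample, levels):
    """Map a real-valued sample to the nearest value in the finite set `levels`."""
    return min(levels, key=lambda q: abs(q - sample))

# Hypothetical quantization levels T and samples:
T = [-1.0, -0.5, 0.0, 0.5, 1.0]
samples = [0.13, -0.72, 0.98]
print([quantize(s, T) for s in samples])   # [0.0, -0.5, 1.0]
```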

16 Example (Encoding with Fixed Length)
Symbol distribution in a file:

Symbol  Frequency
A       70 M
B        3 M
C       20 M
D       37 M

Encoding in 2 bits:

Symbol  Encoding
A       00
B       01
C       10
D       11

Total bits needed to store the file: 2 (bits) × (70 + 3 + 20 + 37) M = 260 M bits.

17 Example (Encoding with Variable Length)
Symbol distribution in a file:

Symbol  Frequency
A       70 M
B        3 M
C       20 M
D       37 M

Encoding in variable length (more efficient):

Symbol  Encoding
A       0
B       100
C       101
D       11

Total bits needed to store the file: 1 × 70 M + 3 × 3 M + 3 × 20 M + 2 × 37 M = 213 M bits. Since 213/260 ≈ 82%, this is an 18% saving.

18 Encoding in Variable Length
Symbols are encoded based on their frequency of occurrence.
- A lower-frequency symbol gets a longer code.
Build a tree that holds the encoding information.
- A lower-frequency symbol has a longer path from the root.

19 Huffman Method Build a tree based on the symbols' frequencies.

20 Building a Huffman Tree (figure)
Repeatedly merge the two lowest-frequency nodes:
B[3] + C[20] → [23];  [23] + D[37] → [60];  [60] + A[70] → [130] (root).
The resulting codes:

Symbol  Encoding
A       0
B       100
C       101
D       11
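A minimal sketch of this greedy construction using a min-heap keyed on frequency, run on the slide's example. The bit assigned at each merge is arbitrary, so the exact bits may differ from the figure, but the code lengths (1, 3, 3, 2) come out the same.

```python
import heapq
from itertools import count

def huffman(freqs):
    """freqs: {symbol: frequency}; returns {symbol: code string}.

    Greedy rule: repeatedly merge the two lowest-frequency trees.
    """
    tick = count()   # tie-breaker so heapq never compares symbol tuples
    heap = [(f, next(tick), (sym,)) for sym, f in freqs.items()]
    heapq.heapify(heap)
    codes = {sym: "" for sym in freqs}
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # two smallest frequencies
        f2, _, right = heapq.heappop(heap)
        for sym in left:                     # one subtree gets bit 0 ...
            codes[sym] = "0" + codes[sym]
        for sym in right:                    # ... the other gets bit 1
            codes[sym] = "1" + codes[sym]
        heapq.heappush(heap, (f1 + f2, next(tick), left + right))
    return codes

print(huffman({"A": 70, "B": 3, "C": 20, "D": 37}))
```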

21 Encoding and Decoding Using a Huffman Tree
Encoding uses a table built from the Huffman tree: abbacada ⇒ 010010001010110.
Decoding uses tree traversal. Starting from the root, follow the code bits: if the current bit is 0, move to the left subtree; if 1, move to the right subtree. On reaching a leaf node, output the symbol in that leaf and return to the root.
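A small sketch of both directions using the prefix code from the example (a = 0, b = 100, c = 101, d = 11). Because the code is prefix-free, the decoder can emit a symbol the moment its buffer matches a codeword, which is exactly the leaf-then-back-to-root traversal described above.

```python
# Prefix code from the example slide.
codes = {"a": "0", "b": "100", "c": "101", "d": "11"}

def encode(text):
    return "".join(codes[ch] for ch in text)

def decode(bits):
    # Accumulate bits; the first codeword match is correct for a prefix code.
    inverse = {v: k for k, v in codes.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:        # reached a leaf: emit symbol, restart at root
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

bits = encode("abbacada")
print(bits)                 # 010010001010110
print(decode(bits))         # abbacada
```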

22 Entropy A measure of how much randomness a distribution contains:
the average number of bits needed to encode the data, H = Σ_i p_i log2(1/p_i). The more random the distribution, the more bits we need to encode the data.
Fair coin: (1/2) log2 2 + (1/2) log2 2 = 1 bit.
Coin with a 3/4 chance of heads: (3/4) log2(4/3) + (1/4) log2 4 ≈ 0.81 bits.
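A quick sketch computing H = Σ p_i log2(1/p_i) for the two coin examples above.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: sum of p * log2(1/p) over the distribution."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))     # fair coin: 1.0 bit per toss
print(entropy([0.75, 0.25]))   # 3/4-heads coin: ~0.81 bits per toss
```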

23 Set Cover Algorithm Input: a set of elements B; sets S1, ..., Sm ⊆ B.
Output: a selection of the Si whose union is B. Cost: the number of sets picked. Greedy algorithm: repeat until all elements of B are covered: pick the Si with the largest number of uncovered elements.
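A minimal sketch of this greedy rule; the universe B and the sets in the usage example are made up, and the code assumes the sets do cover B.

```python
def greedy_set_cover(B, sets):
    """B: set of elements; sets: {name: set of elements}; returns picked names."""
    uncovered = set(B)
    picked = []
    while uncovered:   # assumes the sets actually cover B
        # Greedy rule: pick the set covering the most still-uncovered elements.
        best = max(sets, key=lambda name: len(sets[name] & uncovered))
        picked.append(best)
        uncovered -= sets[best]
    return picked

# Hypothetical instance:
B = {1, 2, 3, 4, 5, 6}
S = {"S1": {1, 2, 3}, "S2": {3, 4}, "S3": {4, 5, 6}, "S4": {1, 4}}
print(greedy_set_cover(B, S))   # e.g. ['S1', 'S3']
```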

24 Set Cover Example (figure) The optimal cover for the example above uses 3 sets (b, i, e); the greedy approach picks 4 (a, c, g, j).

25 Optimality in the Set Cover Problem
Is greedy optimal? No, but it is not far from optimal. Let n_t be the number of elements still not covered after t iterations (n_0 = |B|), and let k be the optimal number of sets needed to cover all elements. The remaining n_t elements are covered by some k sets, so one of them covers at least n_t/k of them; the greedy choice covers at least as many, leaving n_{t+1} ≤ n_t − n_t/k = n_t(1 − 1/k). Therefore n_t ≤ n_0(1 − 1/k)^t < n_0 e^{−t/k}, which drops below 1 once t = k ln n_0, so greedy picks at most k ln n sets.

26 So the greedy approach to the set cover problem is within a factor of ln n of optimal.
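To make the bound concrete, a small numeric check of the recurrence n_{t+1} = n_t(1 − 1/k) under assumed values n_0 = 1000 and k = 10 (both hypothetical): the remaining count drops below 1 within the k ln n_0 ≈ 69 iterations the analysis promises.

```python
from math import ceil, log

n0, k = 1000, 10            # hypothetical: 1000 elements, optimal cover of size 10
n, t = float(n0), 0
while n >= 1:
    n *= 1 - 1 / k          # each greedy step covers at least n_t / k elements
    t += 1
print(t, ceil(k * log(n0)))  # iterations needed (66) vs. the k·ln(n0) bound (70)
```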

