Introduction to Algorithms Chapter 16: Greedy Algorithms.


Greedy Algorithms Similar to dynamic programming, but a simpler approach; also used for optimization problems. Idea: when we have a choice to make, take the one that looks best right now. That is, make a locally optimal choice in the hope of reaching a globally optimal solution. Greedy algorithms do not always yield an optimal solution.

Fractional Knapsack Problem Knapsack capacity: W. There are n items; the i-th item has value vi and weight wi. Goal: find xi, with 0 ≤ xi ≤ 1 for i = 1, 2, ..., n, such that Σ wi xi ≤ W and Σ xi vi is maximized.

Fractional Knapsack - Example Knapsack capacity W = 50 pounds. Item 1: $60, 10 pounds ($6/pound); Item 2: $100, 20 pounds ($5/pound); Item 3: $120, 30 pounds ($4/pound). Taking all of items 1 and 2, plus 20 pounds of item 3 ($80), gives the optimal total of $240.

Fractional Knapsack Problem Greedy strategy 1: pick the item with the maximum value. E.g.: W = 1, w1 = 100, v1 = 2, w2 = 1, v2 = 1. Taking from the item with the maximum value: total value taken = v1/w1 = 2/100. This is smaller than what the thief can take by choosing the other item: total value (choose item 2) = v2/w2 = 1.

Fractional Knapsack Problem Greedy strategy 2: pick the item with the maximum value per pound vi/wi. If the supply of that item is exhausted and the thief can carry more, take as much as possible from the item with the next greatest value per pound. It is therefore useful to sort the items by value per pound before starting.

Fractional Knapsack Problem
Alg.: Fractional-Knapsack(W, v[n], w[n])
1. w ← W  // w is the amount of space remaining in the knapsack
2. while w > 0 and there are items remaining
3.   pick the item i with maximum vi/wi
4.   xi ← min(1, w/wi)
5.   w ← w - xi wi
6.   remove item i from the list
Running time: Θ(n) if the items are already sorted by value per pound; otherwise Θ(n lg n).
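The algorithm above can be sketched in Python; the function name and the (value, weight) item representation are illustrative choices, not from the slides.

```python
# Sketch of the fractional knapsack greedy strategy described above.
# Items are (value, weight) pairs.

def fractional_knapsack(items, capacity):
    """Return the maximum total value obtainable when items may be split."""
    # Sort by value per unit weight, best first (the greedy criterion).
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    remaining = capacity                       # w in the pseudocode
    for value, weight in items:
        if remaining <= 0:
            break
        fraction = min(1.0, remaining / weight)  # x_i = min(1, w / w_i)
        total += fraction * value
        remaining -= fraction * weight
    return total

# The example: W = 50, items ($60, 10 lb), ($100, 20 lb), ($120, 30 lb)
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```

Sorting dominates the running time, matching the Θ(n lg n) bound; with pre-sorted items the loop alone is Θ(n).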

Huffman Code Problem Huffman’s algorithm achieves data compression by finding the best variable length binary encoding scheme for the symbols that occur in the file to be compressed.

Huffman Code Problem The more frequently a symbol occurs, the shorter the Huffman codeword representing it should be. The Huffman code is a prefix-free code: no codeword is a prefix of another codeword.

Overview Huffman codes compress data, with savings of 20% to 90% typical. Huffman's greedy algorithm uses a table of the frequencies of occurrence of each character in the alphabet C to build up an optimal way of representing each character as a binary string.

Example Assume we are given a data file that contains only 6 symbols, namely a, b, c, d, e, f, with the following frequency table (in thousands of occurrences): a: 45, b: 13, c: 12, d: 16, e: 9, f: 5. Find a variable-length prefix-free encoding scheme that compresses this data file as much as possible.

Huffman Code Problem The left tree represents a fixed-length encoding scheme; the right tree represents a Huffman encoding scheme.

Constructing A Huffman Code
Huffman(C)  // C is a set of n characters; Q is implemented as a binary min-heap
1. n ← |C|
2. Q ← C  // build the min-heap: O(n)
3. for i ← 1 to n - 1
4.   allocate a new node z
5.   z.left ← x ← Extract-Min(Q)  // each heap operation: O(lg n)
6.   z.right ← y ← Extract-Min(Q)
7.   f[z] ← f[x] + f[y]
8.   Insert(Q, z)
9. return Extract-Min(Q)  // the root of the tree
Total computation time = O(n lg n)

Cost of a Tree T For each character c in the alphabet C, let f(c) be the frequency of c in the file and let dT(c) be the depth of c in the tree. dT(c) is also the length of c's codeword, because the path from the root to c's leaf spells out the codeword bit by bit. Let B(T) be the number of bits required to encode the file (called the cost of T): B(T) = Σ over c in C of f(c) dT(c).

Huffman Code Problem In the Huffman pseudocode, we assume that C is a set of n characters and that each character c ∈ C is an object with a defined frequency f[c]. The algorithm builds the tree T corresponding to the optimal code. A min-priority queue Q is used to identify the two least-frequent objects to merge together. The result of merging two objects is a new object whose frequency is the sum of the frequencies of the two objects that were merged.

Running time of Huffman's algorithm The analysis of Huffman's algorithm assumes that Q is implemented as a binary min-heap. For a set C of n characters, the initialization of Q in line 2 can be performed in O(n) time using BUILD-MIN-HEAP. The for loop in lines 3-8 executes exactly n - 1 times, and since each heap operation requires O(lg n) time, the loop contributes O(n lg n) to the running time. Thus, the total running time of HUFFMAN on a set of n characters is O(n lg n).
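The procedure analyzed above can be sketched in Python, using heapq as the min-priority queue Q. The node representation (a leaf is a one-character string, an internal node a left/right pair) is an illustrative choice, not from the slides.

```python
import heapq

def huffman_codes(freq):
    """freq: dict mapping character -> frequency. Returns char -> codeword."""
    # Heap entries are (frequency, tiebreak, tree); the tiebreak keeps
    # comparisons well-defined when frequencies are equal.
    heap = [(f, i, c) for i, (c, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)                      # O(n), like BUILD-MIN-HEAP
    tiebreak = len(heap)
    for _ in range(len(heap) - 1):           # exactly n - 1 merges
        f1, _, left = heapq.heappop(heap)    # two least-frequent objects
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tiebreak, (left, right)))
        tiebreak += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, str):            # leaf: record its codeword
            codes[node] = prefix or "0"
        else:                                # internal: 0 = left, 1 = right
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
    walk(heap[0][2], "")
    return codes

freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
codes = huffman_codes(freq)
print(sum(freq[c] * len(w) for c, w in codes.items()))  # cost B(T): 224
```

On the example frequency table this produces an optimal code with cost B(T) = 224 (thousand bits), versus 300 for a fixed-length 3-bit code.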

Prefix Code Prefix(-free) code: no codeword is also a prefix of some other codeword, so decoding is unambiguous. An optimal data compression achievable by a character code can always be achieved with a prefix code, and prefix codes simplify both encoding (compression) and decoding. With the code a = 0, b = 101, c = 100, d = 111, e = 1101, f = 1100: Encoding: abc = 0 101 100 = 0101100. Decoding: 001011101 = 0 0 101 1101 = aabe. Use a binary tree to represent prefix codes for easy decoding. An optimal code is always represented by a full binary tree, in which every non-leaf node has two children: |C| leaves and |C| - 1 internal nodes. Cost: B(T) = Σ f(c) dT(c), where f(c) is the frequency of c and dT(c) is its depth (the length of its codeword).

Huffman Code Reduces the size of data by 20%-90% in general. If no characters occur more frequently than others, there is no advantage over a fixed-length code such as ASCII. Encoding: given the characters and their frequencies, run the algorithm to generate a code, then write the characters using that code. Decoding: given the Huffman tree, figure out what each character is (possible because of the prefix property).

Application of Huffman codes Both the .mp3 and .jpg file formats use Huffman coding at one stage of their compression.

Dynamic Programming vs. Greedy Algorithms Dynamic programming: we make a choice at each step, but the choice depends on solutions to subproblems; bottom-up solution, from smaller to larger subproblems. Greedy algorithm: make the greedy choice and THEN solve the subproblem arising after the choice is made; the choice we make may depend on previous choices, but not on solutions to subproblems; top-down solution, problems decrease in size.
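The distinction matters in practice: the same value-per-pound greedy rule that is optimal for the fractional knapsack can fail for the 0-1 knapsack, where each item must be taken whole or left behind. A small illustrative sketch (item values follow the earlier $60/$100/$120 example; the 10/20/30-pound weights are an assumption):

```python
from itertools import combinations

items = [(60, 10), (100, 20), (120, 30)]   # (value, weight), capacity W = 50

def greedy_01(items, capacity):
    """Greedy by value per pound, taking whole items only."""
    total = used = 0
    for v, wt in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if used + wt <= capacity:           # take the whole item or skip it
            total, used = total + v, used + wt
    return total

def best_01(items, capacity):
    """Brute-force optimum over all subsets (fine for tiny instances)."""
    return max(sum(v for v, _ in s)
               for r in range(len(items) + 1)
               for s in combinations(items, r)
               if sum(wt for _, wt in s) <= capacity)

print(greedy_01(items, 50), best_01(items, 50))  # 160 220
```

Greedy grabs items 1 and 2 for $160 and has no room for item 3, while the true optimum takes items 2 and 3 for $220; the 0-1 version needs dynamic programming (or exhaustive search) precisely because the best choice depends on subproblem solutions.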