Algorithms CSCI 235, Fall 2015, Lecture 30: More Greedy Algorithms

2 The Knapsack Problem

The problem: A thief robbing a store finds n items. The i-th item weighs w_i pounds and is worth v_i dollars. The thief can carry at most W pounds in his knapsack. Which items should he take to get the maximum value?

Version 1 (the 0-1 version): The thief can only take each item whole. He cannot take fractional items (e.g. gold bars).

Version 2 (the fractional version): The thief can take fractions of items (e.g. gold dust).

Applications: Many real-world problems are analogous to the knapsack problem. Example: the bin-packing problem, in which objects of different sizes must be packed into bins of volume V so as to minimize the number of bins used. This problem arises in loading trucks and in creating file backups on media.

3 Optimal Substructure of the Fractional Problem

The fractional version: Suppose the thief takes w pounds of item j first. There are w_j - w pounds of item j left, and the thief can fit W - w more pounds in his knapsack. We now must find the most valuable load weighing at most W - w, chosen from the remaining items together with the w_j - w remaining pounds of item j.

4 Solving the Fractional Problem

We can solve the fractional problem using a greedy algorithm:
1) Compute the value per pound of each item: v_i / w_i.
2) The thief takes as much as possible of the most valuable (per pound) item.
3) If the thief takes all of the most valuable item and still has room in the knapsack, he takes as much as possible of the next most valuable item.
4) Repeat step 3 until the knapsack is full.

Suppose the knapsack can hold 50 lbs, and suppose there are 3 items:

Item  Weight   Value  Price/lb
1     10 lbs   $60    $6
2     20 lbs   $100   $5
3     30 lbs   $120   $4

The fractional solution: (We will work through this in class)
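The greedy steps above can be sketched in Python. The item data is the 3-item example from this slide; the function name is ours, not from the lecture.

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: items is a list of (weight, value) pairs."""
    # Step 1: sort by value per pound, most valuable first.
    items = sorted(items, key=lambda wv: wv[1] / wv[0], reverse=True)
    total_value = 0.0
    for weight, value in items:
        if capacity == 0:
            break
        # Steps 2-4: take as much of this item as fits (possibly a fraction).
        take = min(weight, capacity)
        total_value += value * take / weight
        capacity -= take
    return total_value

# The 3-item example: (weight, value) pairs with a 50 lb knapsack.
items = [(10, 60), (20, 100), (30, 120)]
print(fractional_knapsack(items, 50))  # all of items 1 and 2, 20/30 of item 3: 240.0
```

The thief takes items 1 and 2 whole ($160 for 30 lbs), then fills the remaining 20 lbs with two thirds of item 3 ($80), for $240 total.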

5 Optimal Substructure of the 0-1 Problem

The 0-1 version: Suppose the thief takes item j first. Item j weighs w_j pounds, so the thief can now carry W - w_j more pounds. We must solve the subproblem of finding the most valuable combination of the remaining items weighing at most W - w_j, and we must find an optimal solution to this subproblem. Applying the same greedy strategy to the 0-1 problem does not lead to an optimal solution. (We will show this in class)

6 The Problem, Continued

To find the optimal solution to the 0-1 problem, we must compare solutions in which the first item is included with solutions in which it is not. The subproblems overlap, and we must solve them to make the choice. This means we must use dynamic programming to find the optimal solution.
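A minimal dynamic-programming sketch for the 0-1 version, using the same 3-item example. On this input, greedy-by-ratio takes items 1 and 2 for only $160, while the dynamic program finds the optimal $220 (items 2 and 3). The function name is ours.

```python
def knapsack_01(items, capacity):
    """0-1 knapsack by dynamic programming.
    items is a list of (weight, value) pairs; returns the maximum total value."""
    # best[c] = best value achievable with capacity c using the items seen so far.
    best = [0] * (capacity + 1)
    for weight, value in items:
        # Iterate capacities downward so each item is taken at most once.
        for c in range(capacity, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]

items = [(10, 60), (20, 100), (30, 120)]
print(knapsack_01(items, 50))  # optimal is items 2 and 3: 220
```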

7 Huffman Codes

Huffman codes are a way to compress data. They are widely used and effective, typically yielding savings of 20% to 90% of space, depending on the data. A greedy algorithm uses the frequency of occurrence of each character to build an optimal code that represents each character as a binary string.

8 Example

Suppose we have 10^5 characters in a data file. Normal storage: 8 bits per character (ASCII), i.e. 8 x 10^5 bits in the file. We want to compress the file and store it compactly. Suppose only 6 characters, a through f, appear in the file, each with a given total frequency. How can we represent the data in a compact way?

Fixed-length code: Each letter is represented by an equal number of bits. (We will give an example in class)

9 Variable-Length Codes

We can save space by assigning frequently occurring characters short codes, and infrequently occurring characters long codes.

Example:
a: 0
b: 101
c: 100
d: 111
e: 1101
f: 1100

Number of bits = ?

10 Prefix Codes

Prefix codes are codes in which no codeword is a prefix of another codeword. It can be shown that the optimal data compression achievable by any character code can be achieved by a prefix code. Prefix codes simplify encoding and decoding.

To encode: concatenate the code for each character. With the variable-length code above, abc = 0 . 101 . 100 = 0101100.

To decode: there is no ambiguity; repeatedly match the first codeword that fits. (We will decode an example in class)
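Encoding and decoding with a prefix code can be sketched as follows, using the variable-length code from the previous slide (the function names are ours):

```python
# The variable-length prefix code from the slides.
code = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}

def encode(text):
    # Encoding is just concatenation of the codewords.
    return ''.join(code[ch] for ch in text)

def decode(bits):
    # Because no codeword is a prefix of another, we can scan left to right
    # and emit a character as soon as the buffer matches a complete codeword.
    reverse = {cw: ch for ch, cw in code.items()}
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in reverse:
            out.append(reverse[buf])
            buf = ''
    return ''.join(out)

print(encode('abc'))          # 0101100
print(decode(encode('abc')))  # abc
```

Note that the decoder never needs to backtrack; the prefix property guarantees that each greedy match is correct.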

11 Representing Codes with a Tree

A tree is a convenient representation for prefix codes, because codewords can easily be decoded.

Code 1 (fixed length):
a: 000
b: 001
c: 010
d: 011
e: 100
f: 101

Leaf value = frequency of the character. Node value = sum of the values of its children = sum of the leaves in its subtrees. To find a character, traverse the tree from root to leaf: move left on 0, move right on 1. We will draw the tree in class.

12 Tree for the Variable-Length Code

Code 2 (variable length):
a: 0
b: 101
c: 100
d: 111
e: 1101
f: 1100

We will draw this tree in class. Only full binary trees (in which every non-leaf node has exactly 2 children) represent optimal codes. Code 2 is optimal; Code 1 is not.
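Huffman's greedy algorithm builds such a full binary tree bottom-up by repeatedly merging the two least frequent subtrees. A minimal sketch using Python's heapq; the frequencies here are the classic textbook (CLRS) example values, which we assume stand in for the table used in class. They happen to reproduce Code 2 exactly.

```python
import heapq

def huffman_codes(freq):
    """Build Huffman codewords from a {char: frequency} dict (greedy merges)."""
    # Each heap entry: (subtree frequency, tiebreaker, {char: partial codeword}).
    heap = [(f, i, {ch: ''}) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Greedy choice: merge the two subtrees with the lowest total frequency.
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        # Prepend 0 for the left branch and 1 for the right branch.
        merged = {ch: '0' + cw for ch, cw in left.items()}
        merged.update({ch: '1' + cw for ch, cw in right.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

# Assumed frequencies (per 1000 characters), borrowed from the CLRS example.
freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
print(huffman_codes(freq))
# {'a': '0', 'c': '100', 'b': '101', 'f': '1100', 'e': '1101', 'd': '111'}
```

Because every merge joins exactly two subtrees, the resulting tree is full, as the slide requires for an optimal code.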