Greedy Algorithms Alexandra Stefan.

Greedy Method for Optimization Problems

Take the action that is best now (out of the current options); this may cause you to miss the optimal solution. You build the solution as you go; there is no need to backtrack.

Examples:
- Knapsack: pick the item with the largest value/weight ratio. Other possible measures: largest value, smallest weight.
- Interval scheduling: pick the interval that finishes first.
- Huffman codes: prefix codes (no code is also a prefix of another code); use the Huffman tree to decode (the code for a character, x, takes you to the leaf with x).
- Edit distance: greedy? Not clear.

Knapsack versions

- Unbounded (electronics warehouse): unlimited amounts of each item; cannot take fractions of an item.
- Fractional (market: flour, rice, wine): limited amounts (e.g., 3 of item A); can take a fraction of an item.

What would a greedy thief do? Method: sort the items in decreasing order of value/weight ratio. Take as many of the largest-ratio item as possible, then as many of the next ratio as possible, and so on. This does NOT give an optimal solution in general; see the next page.

Would any of the above problems be solved optimally using Greedy? (Prove or give a counter-example.) Yes: the fractional version.
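For the fractional version, the greedy method is optimal. A minimal sketch (the function name and the (value, weight) pair representation are my own, not from the slides):

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: items is a list of (value, weight) pairs,
    one copy of each item available. Returns the maximum total value."""
    total = 0.0
    # Consider items in decreasing order of value/weight ratio.
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)        # whole item, or the fraction that fits
        total += value * (take / weight)
        capacity -= take
    return total
```

With the items from the next page (values 4, 5, 11, 14, 15; weights 3, 4, 7, 8, 9) and capacity 21, this reproduces the 0/1-fractional answer 14 + 15 + (4/7)*11.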

Greedy for Knapsack

Item:    A           B           C            D            E
Value:   4           5           11           14           15
Weight:  3           4           7            8            9
Ratio:   4/3 = 1.33  5/4 = 1.25  11/7 = 1.57  14/8 = 1.75  15/9 = 1.67

Reordered decreasing by ratio: D, E, C, A, B. Knapsack capacity: 21.

Unbounded: picked D, D, A. Remaining weight: 13 (=21-8), 5 (=13-8), 2 (=5-3). Value: 14+14+4 = 32.
Unbounded, fractional: picked D, D, 5/8 of D. Remaining weight: 13 (=21-8), 5 (=13-8), 0 (=5-5). Value: 14+14+(5/8)*14 = 36.75.
0/1: picked D, E, A. Remaining weight: 13 (=21-8), 4 (=13-9), 1 (=4-3). Value: 14+15+4 = 33.
0/1, fractional: picked D, E, 4/7 of C. Remaining weight: 13 (=21-8), 4 (=13-9), 0 (=4-4). Value: 14+15+(4/7)*11 ≈ 35.29.
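The unbounded (no fractions) row above can be reproduced with a short greedy sketch (function name is my own):

```python
def greedy_unbounded(items, capacity):
    """Greedy for unbounded knapsack, no fractions: repeatedly take the
    best value/weight-ratio item that still fits. Not optimal in general."""
    total = 0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        while weight <= capacity:           # unlimited copies of each item
            total += value
            capacity -= weight
    return total
```

On the table's items with capacity 21 it takes D, D, then A, for a total value of 32.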

Counter-example used to prove that Greedy fails for Unbounded Knapsack

Goal: construct an Unbounded Knapsack instance where Greedy does not give the optimal answer.

Intuition: we want Greedy to pick only one item, when in fact two copies of another item can be picked and together give a higher value:
- Item A: weight 3, value 5 => total value 5
- Item B: weight 2 (two copies fit), value 3 => total value 6
- Knapsack max weight: 4

!!! You must double-check that Greedy would pick item A => check the ratios: 5/3 > 3/2 (1.67 > 1.5). If item A had value 4, Greedy would have picked B instead.

The same example can be modified to work for 0/1 Knapsack:
- Item A: 3 kg, $5
- Item B: 2 kg, $3
- Item C: 2 kg, $3
Greedy picks A ($5); the optimal (DP) solution picks B and C ($6).
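The counter-example can be checked by comparing greedy against a DP optimum (both function names are my own):

```python
def unbounded_dp(items, capacity):
    """Optimal value for unbounded knapsack via dynamic programming."""
    best = [0] * (capacity + 1)
    for c in range(1, capacity + 1):
        for value, weight in items:
            if weight <= c:
                best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]

def greedy_value(items, capacity):
    """Greedy: repeatedly take the best value/weight-ratio item that fits."""
    total = 0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        while weight <= capacity:
            total += value
            capacity -= weight
    return total

# Item A: value 5, weight 3; item B: value 3, weight 2; capacity 4.
items = [(5, 3), (3, 2)]
# greedy_value(items, 4) takes A only (value 5);
# unbounded_dp(items, 4) takes B twice (value 6).
```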

Interval scheduling versions (worksheet)

With values (job = (start, end, value)): each interval has a value. Goal: maximize the sum of the values of the picked non-overlapping jobs. Greedy solution? Give an algorithm. Is it optimal? Give a proof or counter-example.

Without values (job = (start, end)): pick the largest number of non-overlapping intervals. This is the "activity-selection problem" in CLRS (page 415). Is it optimal? Give a proof or counter-example. (CLRS proof on page 418, proof of Theorem 16.1.)

Which of the two versions is more general? Is one a special case (or special instance) of the other? If you have a program to solve problems of one type, can you easily use it to solve problems of the other type? Which type should the program solve (with value, or without value)? Do NOT ask "which can be turned into the other"; this question is confusing. Instead use: "Assume you have a program, A, that solves one type of problem. You are given an instance of the other type, P. Can you produce a 'new' problem, Pnew, run the program A on Pnew, get an answer, and use that answer to get the optimal answer for your original instance, P?"

Interval scheduling: Greedy measures

Problem version: all jobs have the SAME value => maximize the number of jobs you can pick.

Optimal greedy measures:
- job that finishes first
- job with the latest start time

NOT optimal:
- shortest duration
- fewest overlaps
- earliest start time

Example showing that shortest duration does not give an optimal solution: jobs (1, 6), (5, 8), (7, 12). Greedy picks the shortest job, (5, 8); nothing else fits. Optimal: the two jobs (1, 6) and (7, 12).
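The "finishes first" rule can be sketched as follows (function name is my own):

```python
def max_nonoverlapping(intervals):
    """Activity selection: pick the maximum number of non-overlapping
    intervals by repeatedly taking the one that finishes first."""
    picked = []
    last_end = float("-inf")
    for start, end in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_end:              # does not overlap the last picked job
            picked.append((start, end))
            last_end = end
    return picked
```

On the example above, it picks (1, 6) and (7, 12) and correctly skips (5, 8).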

Huffman code

Online resource (shows all steps): http://www.ida.liu.se/opendsa/OpenDSA/Books/OpenDSA/html/Huffman.html

Application: file compression. Example from CLRS: a file with 100,000 characters drawn from a, b, c, d, e, f. The frequencies are given in thousands (e.g., the frequency of b is 13,000). Goal: a binary encoding that requires less memory.

Char:                       a    b    c    d    e     f
Frequency (thousands):      45   13   12   16   9     5
Fixed-length codeword:      000  001  010  011  100   101
Variable-length codeword:   0    101  100  111  1101  1100

Huffman codes

Internal nodes contain the sum of the probabilities of the leaves in their subtree. The optimal prefix-codeword tree (Huffman tree) gives the optimal encoding:

a -> 0, b -> 101, c -> 100, d -> 111, e -> 1101, f -> 1100

Compute the number of bits needed for the whole file using each of these encodings:
- Fixed-length codeword tree: 100,000 * 3 = 300,000
- Huffman tree: 45,000*1 + 13,000*3 + 12,000*3 + 16,000*3 + 9,000*4 + 5,000*4 = 224,000

(Images from CLRS.)
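The bit counts above can be checked directly from the frequency table and the codewords:

```python
# Frequencies (in characters) and codewords from the CLRS example.
freq = {"a": 45_000, "b": 13_000, "c": 12_000, "d": 16_000, "e": 9_000, "f": 5_000}
code = {"a": "0", "b": "101", "c": "100", "d": "111", "e": "1101", "f": "1100"}

# Total bits = sum over characters of (frequency * codeword length).
huffman_bits = sum(freq[ch] * len(code[ch]) for ch in freq)
fixed_bits = sum(freq.values()) * 3        # every fixed-length codeword uses 3 bits
```

This gives 224,000 bits for the Huffman encoding versus 300,000 for the fixed-length one.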

Building the Huffman Tree

- Make 'leaves' with the letters and their frequencies and arrange them so that they are always in increasing order (of frequency).
- Repeat until there is only one tree: create a new tree from the two leftmost trees (the ones with the smallest frequencies) and put it in its place.
- Label left/right branches with 0/1. The encoding of a character is the path from the root to the leaf holding that character.

See the book or lecture video for the step-by-step solution. In an exam or homework, you must use this method.

Char:                   a   b   c   d   e  f
Frequency (thousands):  45  13  12  16  9  5
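The repeat-merge-smallest-two step above can be sketched with a min-heap. This is an illustrative implementation, not the slides' by-hand method; the tie-breaking counter is my own addition, so when frequencies tie it may produce different (but equally optimal) codes than a hand-built tree:

```python
import heapq
from itertools import count

def huffman_codes(freq):
    """Build a Huffman tree with a min-heap; return {char: codeword}."""
    tiebreak = count()   # keeps heap entries comparable when frequencies tie
    heap = [(f, next(tiebreak), ch) for ch, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # the two smallest frequencies
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))
    _, _, root = heap[0]

    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: (left, right)
            walk(node[0], prefix + "0")      # label left branch 0
            walk(node[1], prefix + "1")      # label right branch 1
        else:                                # leaf: a character
            codes[node] = prefix or "0"
    walk(root, "")
    return codes
```

On the CLRS frequencies, the resulting code lengths give the optimal total of 224,000 bits, with 'a' getting a 1-bit codeword.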

Huffman codes: tree properties

- The left node is always smaller than or equal to the right node.
- Every internal node has two children.

Greedy Algorithms

Greedy algorithms do not always guarantee an optimal solution; it depends on the problem.

Difference between Greedy and Dynamic Programming: in DP you typically solve all the subproblems and then make your choice; in Greedy, you make the greedy choice first and are then left with only one subproblem.