Greedy Algorithms - Alexandra Stefan

Greedy Method for Optimization Problems

Take the action that is best now (out of the current options); this may cause you to miss the optimal solution.

Examples:
- Knapsack: pick the item with the largest value/weight ratio.
- Interval scheduling: pick the interval that finishes first.
- Huffman codes: prefix codes (no code is also a prefix of another code); use the Huffman tree to decode (the code for a character x takes you to the leaf holding x). A decoding sketch follows below.
- Airport shuttle: greedy? Not covered this semester.
- Edit distance: greedy? Not clear; compare it with the fishing-with-gain problem.
- Matrix traversal and fishing: greedy?

Difference between Greedy and DP: in Greedy you commit to your choice, you do not build a 2D matrix of computed costs/gains, and you do not need to backtrack (you are building the solution as you go).
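A minimal sketch of why the prefix property matters, using a made-up three-character prefix code (not the CLRS one): because no codeword is a prefix of another, a left-to-right scan can commit to a character as soon as a codeword matches, with no backtracking.

```python
CODE = {'a': '0', 'b': '10', 'c': '11'}           # made-up prefix-free code
DECODE = {bits: ch for ch, bits in CODE.items()}  # reverse lookup: codeword -> character

def decode(bitstring):
    decoded, current = [], ''
    for bit in bitstring:
        current += bit
        if current in DECODE:             # a full codeword matched: commit to it
            decoded.append(DECODE[current])
            current = ''
    return ''.join(decoded)

print(decode('010110'))   # abca
```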

Knapsack versions

Covered in this class:
- Unbounded (electronics warehouse): unlimited amounts of each item; you can only either take an item or not take it (no fractions allowed).
- Bounded fractional (market): limited amounts (e.g. 3 of item A); you can take the whole item or a fraction of it (e.g. flour, rice, sugar).

Not covered in this class:
- 0-1 (home): only one of each item; you can either take it or not take it.
- Bounded (store): limited amounts (e.g. 3 of each item).

What would a greedy thief do? Would either of the two covered versions be solved optimally by a greedy algorithm? (Prove it or give a counter-example; a sketch of the greedy strategy is given below.)
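As a sketch of what a greedy thief would do for the bounded fractional version: sort by value/weight ratio and keep taking as much of the best remaining item as fits. The item values, weights, and capacity below are made up for illustration; whether this strategy is optimal for each covered version is the exercise above.

```python
def greedy_fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs; returns the total value taken."""
    total = 0.0
    # consider items by decreasing value/weight ratio
    for value, weight in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
        if capacity == 0:
            break
        take = min(weight, capacity)      # the whole item, or the fraction that still fits
        total += value * (take / weight)
        capacity -= take
    return total

print(greedy_fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))   # 240.0
```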

Interval scheduling versions (worksheet)

With values (job = (start, end, value)): each interval has a value. Goal: maximize the sum of the values of the picked non-overlapping jobs/intervals. Greedy solution? Give an algorithm. Is it optimal? Give a proof or a counter-example.

Without values (job = (start, end)): pick the largest number of non-overlapping intervals. This is the "activity selection problem" in CLRS (page 415). Is greedy optimal? Give a proof or a counter-example. (CLRS proves it on page 418, Theorem 16.1.) A sketch of the greedy strategy follows below.

Which of the two versions is more general? Is one a special case of the other?
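A sketch of the greedy strategy for the version without values (activity selection): sort by finish time and repeatedly pick the next interval that is compatible with everything picked so far. The intervals in the example call are made up.

```python
def select_activities(intervals):
    """intervals: list of (start, end) pairs; returns the picked compatible subset."""
    chosen, last_end = [], float('-inf')
    for start, end in sorted(intervals, key=lambda se: se[1]):   # earliest finish first
        if start >= last_end:              # compatible with everything chosen so far
            chosen.append((start, end))
            last_end = end
    return chosen

print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (6, 10)]))   # [(1, 4), (5, 7)]
```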

Huffman code

Example from CLRS: a file with 100,000 characters drawn from a, b, c, d, e, f. Frequencies are given in thousands (e.g. the frequency of b is 13,000). Goal: a binary encoding that requires less memory.

                             a     b     c     d     e     f
Frequency (thousands)       45    13    12    16     9     5
Fixed-length codeword      000   001   010   011   100   101
Variable-length codeword     0   101   100   111  1101  1100

File size after encoding: computed on the next slide for each of the two codes.

Huffman codes

Internal nodes contain the sum of the frequencies of the leaves in their subtree. Two trees are compared: the fixed-length codeword tree and the optimal prefix codeword tree (the Huffman tree), which gives the optimal encoding. (Images from CLRS.)

The Huffman tree gives the codewords: a -> 0, b -> 101, c -> 100, d -> 111, e -> 1101, f -> 1100.

Compute the number of bits needed for the whole file using each of these encodings (a sketch of this computation follows below):
- Fixed-length: 100,000 * 3 = 300,000 bits
- Huffman (variable-length): 45,000*1 + 13,000*3 + 12,000*3 + 16,000*3 + 9,000*4 + 5,000*4 = 224,000 bits
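A short sketch that reproduces the two file-size computations above from the frequency table and the two sets of codewords:

```python
freq   = {'a': 45_000, 'b': 13_000, 'c': 12_000, 'd': 16_000, 'e': 9_000, 'f': 5_000}
fixed  = {'a': '000', 'b': '001', 'c': '010', 'd': '011', 'e': '100', 'f': '101'}
varlen = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}

def file_bits(code):
    # total bits = sum over characters of (frequency * codeword length)
    return sum(freq[ch] * len(code[ch]) for ch in freq)

print(file_bits(fixed))    # 300000
print(file_bits(varlen))   # 224000
```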

Building the Huffman Tree

1. Make 'leaves' with the letters and their frequencies, and arrange them so that they are always in increasing order of frequency.
2. Repeat until there is only one tree: create a new tree from the two leftmost trees (the ones with the smallest frequencies) and put it in its place, keeping the list in increasing order.

Frequencies (thousands): a: 45, b: 13, c: 12, d: 16, e: 9, f: 5

A sketch of this construction follows below.
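A minimal sketch of this construction, using a min-heap instead of a sorted list (the effect is the same: the two smallest-frequency trees are always merged first, with the smaller one placed on the left). Frequencies are the CLRS ones, in thousands.

```python
import heapq

def build_huffman_tree(freq):
    """freq: dict character -> frequency.  A leaf is a character;
    an internal node is a (left, right) pair."""
    # the counter breaks frequency ties so that trees are never compared directly
    heap = [(f, i, ch) for i, (ch, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)    # the two trees with the smallest frequencies...
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (t1, t2)))   # ...are merged into one
        counter += 1
    return heap[0][2]

tree = build_huffman_tree({'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5})
```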

Huffman codes

Tree properties:
- The left child is always smaller than or equal to the right child (by frequency).
- Every internal node has two children.
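As a quick check of these properties, a sketch that reads the codewords off a tree stored as nested (left, right) pairs, taking left edge = 0 and right edge = 1; every leaf is a single character and every internal node has exactly two children. The hard-coded tree below matches the codewords from the earlier slide.

```python
def codewords(tree, prefix=''):
    if isinstance(tree, str):              # leaf: a single character
        return {tree: prefix}
    left, right = tree                     # internal node: exactly two children
    codes = codewords(left, prefix + '0')
    codes.update(codewords(right, prefix + '1'))
    return codes

huffman_tree = ('a', (('c', 'b'), (('f', 'e'), 'd')))   # tree for the CLRS example
print(codewords(huffman_tree))
# {'a': '0', 'c': '100', 'b': '101', 'f': '1100', 'e': '1101', 'd': '111'}
```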

Greedy Algorithms

Greedy algorithms do not always guarantee an optimal solution; whether they do depends on the problem.

Difference between Greedy and Dynamic Programming: in DP you typically solve the subproblems first and then make your choice; in Greedy you make the greedy choice and are left with only one subproblem to solve.