Advanced Algorithm Design and Analysis (Lecture 5)
SW5 fall 2004
Simonas Šaltenis
E1-215b

Greedy Algorithms

Goals of the lecture:
- to understand the principles of the greedy algorithm design technique;
- to understand the example greedy algorithms for activity selection and Huffman coding, and to be able to prove that these algorithms find optimal solutions;
- to be able to apply the greedy algorithm design technique.

Activity-Selection Problem

Input: A set of n activities, each with a start time and a finish time, A[i].s and A[i].f. Activity i lasts during the half-open interval [A[i].s, A[i].f).
Output: A largest subset of mutually compatible activities.
Two activities are compatible if their intervals do not intersect.
[Figure: the activities drawn as intervals on a time axis]
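A minimal sketch of this input in Python, assuming each activity is a (start, finish) pair with start < finish (the names are illustrative, not from the slides):

    def compatible(a, b):
        # Half-open intervals [s, f) are compatible if they do not overlap.
        return a[1] <= b[0] or b[1] <= a[0]

    activities = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (8, 11)]
    print(compatible(activities[0], activities[3]))  # True: [1, 4) and [5, 7) do not intersect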

"Straight-forward" solution

Pick (schedule) some activity A[k]. This generates two sets of activities compatible with it: Before(k) and After(k).
E.g., Before(4) = {1, 2}; After(4) = {6, 7, 8, 9}.
Solution: try every possible choice of A[k] and recurse on the two sub-problems, i.e., MaxN(S) = max over k of (MaxN(Before(k)) + 1 + MaxN(After(k))).
[Figure: activities on a time axis, with activity A[4] chosen]
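A brute-force recursive sketch of this idea in Python (exponential time, for illustration only), assuming each activity is a (start, finish) pair with start < finish:

    def max_compatible(acts):
        # Try every activity as the chosen one, then recurse on the activities
        # that finish before it starts and on those that start after it finishes.
        best = 0
        for s, f in acts:
            before = [a for a in acts if a[1] <= s]
            after = [a for a in acts if a[0] >= f]
            best = max(best, max_compatible(before) + 1 + max_compatible(after))
        return best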

Dynamic Programming Algorithm

The recurrence results in a dynamic programming algorithm:
- Sort the activities on start or finish time (for simplicity, also assume "sentinel" activities A[0] and A[n+1]).
- Let S_ij be the set of activities that come after A[i] and before A[j] and are compatible with both A[i] and A[j].
- Keep a two-dimensional array such that c[i, j] = MaxN(S_ij).
- Then MaxN(A) = MaxN(S_0,n+1) = c[0, n+1].
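A sketch of the resulting O(n^3) dynamic program in Python, assuming activities are (start, finish) pairs; the function and variable names are illustrative:

    def max_compatible_dp(acts):
        # Sort by finish time and add sentinels A[0] and A[n+1].
        acts = sorted(acts, key=lambda a: a[1])
        INF = float('inf')
        a = [(-INF, -INF)] + acts + [(INF, INF)]
        n = len(acts)
        # c[i][j] = maximum number of compatible activities strictly between A[i] and A[j].
        c = [[0] * (n + 2) for _ in range(n + 2)]
        for length in range(2, n + 2):              # length = j - i
            for i in range(0, n + 2 - length):
                j = i + length
                for k in range(i + 1, j):
                    # A[k] must fit between A[i] and A[j].
                    if a[k][0] >= a[i][1] and a[k][1] <= a[j][0]:
                        c[i][j] = max(c[i][j], c[i][k] + c[k][j] + 1)
        return c[0][n + 1]

    print(max_compatible_dp([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (8, 11)]))  # 3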

Dynamic Programming Algorithm II

Does it really work correctly? We have to prove the optimal sub-structure property:
- If an optimal solution A to S_ij includes A[k], then the solutions to S_ik and S_kj (as parts of A) must be optimal as well.
- To prove it, use a "cut-and-paste" argument.
What is the running time of this algorithm?

Greedy choice

What if we could choose "the best" activity (as of now) and be sure that it belongs to an optimal solution? Then we wouldn't have to check all these sub-problems and consider all currently possible choices!
Idea: choose the activity that finishes first! Then solve the problem for the remaining compatible activities.

MaxN(A[1..n], i)  // returns a set of activities; assumes A is sorted by finish time
  m ← i + 1
  while m ≤ n and A[m].s < A[i].f do
    m ← m + 1
  if m ≤ n then return {A[m]} ∪ MaxN(A, m)
  else return ∅
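An equivalent iterative sketch of the greedy rule in Python, assuming activities are given as (start, finish) pairs (the function name is illustrative):

    def greedy_activity_selection(acts):
        # Sort by finish time, then repeatedly take the earliest-finishing
        # activity that starts no earlier than the last selected one finishes.
        selected = []
        last_finish = float('-inf')
        for s, f in sorted(acts, key=lambda a: a[1]):
            if s >= last_finish:
                selected.append((s, f))
                last_finish = f
        return selected

    print(greedy_activity_selection([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (8, 11)]))
    # [(1, 4), (5, 7), (8, 11)]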

Greedy-choice property

What is the running time of this algorithm?
Does it find an optimal solution? To answer that:
- We have to prove the greedy-choice property, i.e., that our locally optimal choice belongs to some globally optimal solution.
- We have to prove the optimal sub-structure property (we did that already).
The challenge is to choose the right interpretation of "the best choice":
- How about the activity that starts first? Show a counter-example.

Data Compression

Data compression problem – strings S and S': S -> S' -> S, such that |S'| < |S|.
Text compression by coding with a variable-length code:
Obvious idea – assign short codes to frequent characters, e.g., for "abracadabra".
Frequency table, with a fixed-length code and a variable-length code for comparison:

  character:  a  b  c  d  r
  frequency:  5  2  1  1  2

How much do we save in this case?
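A worked sketch of the savings for this example, assuming a 3-bit fixed-length code and one possible optimal variable-length (prefix) code; the concrete codewords below are an assumption for illustration, not taken from the slide:

    text = "abracadabra"  # 11 characters
    freq = {'a': 5, 'b': 2, 'c': 1, 'd': 1, 'r': 2}

    fixed_bits = 3 * len(text)  # 3 bits per character -> 33 bits
    # One possible optimal prefix code: a=0, r=10, b=110, c=1110, d=1111
    code_len = {'a': 1, 'r': 2, 'b': 3, 'c': 4, 'd': 4}
    var_bits = sum(freq[ch] * code_len[ch] for ch in freq)
    print(fixed_bits, var_bits)  # 33 vs. 23 bits, roughly a 30% saving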

Prefix code

Optimal code for given frequencies: achieves the minimal length of the coded text.
Prefix code: no codeword is a prefix of another codeword.
It can be shown that optimal coding can always be done with a prefix code.
We can store all codewords in a binary trie – very easy to decode:
- coded characters are in the leaves;
- each node contains the sum of the frequencies of all its descendants.
[Figure: a code trie with leaves c, d, b, r, a]
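A minimal decoding sketch that walks such a trie bit by bit, using the assumed code from the previous example (the nested-tuple trie representation is an illustrative choice):

    def decode(bits, root):
        # Walk the trie: go left on '0', right on '1'; emit a character at each leaf.
        out, node = [], root
        for b in bits:
            node = node[0] if b == '0' else node[1]   # internal nodes are (left, right) pairs
            if isinstance(node, str):                  # leaves hold single characters
                out.append(node)
                node = root
        return ''.join(out)

    # Trie for the assumed code a=0, r=10, b=110, c=1110, d=1111.
    trie = ('a', ('r', ('b', ('c', 'd'))))
    print(decode('01101001110011110110100', trie))  # -> 'abracadabra'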

Optimal Code/Trie

The cost of the coding trie T:
  B(T) = sum over all c in C of f(c) * d_T(c),
where C is the alphabet, f(c) is the frequency of character c, and d_T(c) is the depth of c in the trie (the length of its code in bits).
The optimal trie is the one that minimizes B(T).
Observation – an optimal trie is always full: every non-leaf node has two children. Why?

Huffman Algorithm – Idea

The Huffman algorithm builds the code trie bottom up. Consider a forest of trees:
- Initially there is one separate node for each character.
- In each step, join two trees into a larger tree.
- Repeat this until one tree (the trie) remains.
Which trees to join? Greedy choice – the two trees with the smallest frequencies!
[Figure: trees A and B joined under a new root A+B, with edges labeled 0 and 1]

Huffman Algorithm

What is its running time? Run the algorithm on: "oho ho, Ole".

Huffman(C)
  Q.build(C)                  // builds a min-priority queue on frequencies
  for i ← 1 to n − 1 do       // n = |C|
    allocate new node z
    x ← Q.extractMin()
    y ← Q.extractMin()
    z.setLeft(x)
    z.setRight(y)
    z.setF(x.f() + y.f())
    Q.insert(z)
  return Q.extractMin()       // return the root of the trie
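A runnable Python sketch of the same algorithm, using a binary heap as the min-priority queue; the nested-tuple node representation and the tie-breaking counter are implementation choices, not part of the slides:

    import heapq
    from collections import Counter

    def huffman_trie(text):
        # Leaf = a single character; internal node = (left, right) pair.
        freq = Counter(text)
        heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
        heapq.heapify(heap)
        counter = len(heap)               # tie-breaker so tuples never compare nodes
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, counter, (left, right)))
            counter += 1
        return heap[0][2]                 # root of the trie

    def codewords(node, prefix=''):
        # Walk the trie: left edge = '0', right edge = '1'.
        if isinstance(node, str):
            return {node: prefix or '0'}  # single-character alphabet edge case
        left, right = node
        codes = codewords(left, prefix + '0')
        codes.update(codewords(right, prefix + '1'))
        return codes

    print(codewords(huffman_trie("oho ho, Ole")))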

Correctness of Huffman

Greedy-choice property: let x and y be two characters with the lowest frequencies. Then there exists an optimal prefix code in which the codewords for x and y have the same length and differ only in the last bit.
Proof idea: transform an optimal trie T into one (T'') where x and y are max-depth siblings, and compare the costs.

Correctness of Huffman

Optimal sub-structure property:
- Let x and y be characters with minimum frequency, and let C' = (C − {x, y}) ∪ {z}, where z is a new character with f(z) = f(x) + f(y).
- Let T' be an optimal code trie for C'.
- Replace the leaf z in T' with an internal node that has the two children x and y.
- The resulting tree T is an optimal code trie for C.
The proof is a little more involved than a simple "cut-and-paste" argument.

Elements of Greedy Algorithms

Greedy algorithms are used for optimization problems, where a number of choices have to be made to arrive at an optimal solution:
- At each step, make the "locally best" choice, without considering all possible choices and the solutions to the sub-problems induced by those choices (compare to dynamic programming).
- After the choice, only one sub-problem remains (smaller than the original).
- Greedy algorithms usually sort the input or use priority queues.

Elements of Greedy Algorithms

First, one has to prove the optimal sub-structure property; the simple "cut-and-paste" argument may work.
The main challenge is to decide on the interpretation of "the best" so that it leads to a globally optimal solution, i.e., so that you can prove the greedy-choice property:
- The proof is usually constructive: take a hypothetical optimal solution without the specific greedy choice and transform it into one that has this greedy choice.
- Or you find counter-examples demonstrating that your greedy choice does not lead to a globally optimal solution.

Other Greedy Algorithms

- Finding a minimum spanning tree in a weighted graph.
- Coin changing.
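A sketch of greedy coin changing in Python: repeatedly take the largest coin that still fits. This is optimal for canonical coin systems such as 1, 2, 5, 10, 20, 50, but not for arbitrary denominations; the denominations below are an illustrative assumption:

    def greedy_change(amount, coins=(50, 20, 10, 5, 2, 1)):
        # Repeatedly take the largest denomination not exceeding the remaining amount.
        result = []
        for c in coins:              # coins assumed sorted in decreasing order
            while amount >= c:
                result.append(c)
                amount -= c
        return result

    print(greedy_change(98))  # [50, 20, 20, 5, 2, 1]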