CS6045: Advanced Algorithms Greedy Algorithms

Main Concept
– Divide the problem into multiple steps (sub-problems)
– For each step, take the best choice at the current moment (local optimum, the greedy choice)
– A greedy algorithm always makes the choice that looks best at the moment
– The hope: a locally optimal choice will lead to a globally optimal solution
– For some problems it works; for others it does not

Greedy Algorithms
A greedy algorithm always makes the choice that looks best at the moment
– The hope: a locally optimal choice will lead to a globally optimal solution
– For some problems, it works
Dynamic programming can be overkill (slow); greedy algorithms tend to be easier to code
Examples covered here:
– Activity-Selection Problem
– Huffman Codes

Activity-Selection Problem
Problem: get your money's worth out of a carnival
– Buy a wristband that lets you onto any ride
– Lots of rides, each starting and ending at different times
– Your goal: ride as many rides as possible
– Another, alternative goal that we don't solve here: maximize time spent on rides
Welcome to the activity-selection problem

Activity-Selection
Formally:
– Given a set S of n activities, S = {a_1, …, a_n}
  s_i = start time of activity i
  f_i = finish time of activity i
– Find a max-size subset A of compatible (non-overlapping) activities
– Assume the activities are sorted by finish time: f_1 ≤ f_2 ≤ … ≤ f_n

Example
Maximum-size mutually compatible set: {a_1, a_3, a_6, a_8}. Not unique: also {a_2, a_5, a_7, a_9}.

Activity Selection: Optimal Substructure

Activity Selection: Dynamic Programming

Greedy Choice Property
Dynamic programming: solve all the sub-problems
The activity-selection problem also exhibits the greedy-choice property:
– We should choose an activity that leaves the resource available for as many other activities as possible
– The first greedy choice is a_1, since f_1 ≤ f_2 ≤ … ≤ f_n

Activity Selection: A Greedy Algorithm
So the actual algorithm is simple:
– Sort the activities by finish time
– Schedule the first activity
– Then schedule the next activity in the sorted list that starts after the previous activity finishes
– Repeat until no more activities
The intuition is even simpler:
– Always pick the ride that ends earliest among those available at the time

Activity Selection: A Greedy Algorithm
Select the activity that ends first (smallest end time)
– Intuition: it leaves the largest possible empty space for more activities
Once an activity is selected:
– Delete all non-compatible activities; they cannot be selected
Repeat the algorithm for the remaining activities
– Either using iteration or recursion
Greedy choice: select the next best activity (local optimum)
Sub-problem: we created one sub-problem to solve (find the optimal schedule after the selected activity)
Hopefully, when we merge the local optimum with the sub-problem's optimal solution, we get a global optimum

Greedy Algorithm Correctness
Theorem:
– If S_k (the set of activities that start after a_k finishes) is nonempty and a_m has the earliest finish time in S_k, then a_m is included in some optimal solution.
How to prove it?
– We can convert any other optimal solution (S') to the greedy algorithm's solution (S)
Idea:
– Compare the activities in S' and S from left to right
– If they match in the selected activity, skip
– If they do not match, we can replace the activity in S' by the one in S, because the one in S finishes first

Example
S: {a_1, a_3, a_6, a_8}    S': {a_2, a_5, a_7, a_9}
a_2, a_5, a_7, a_9 in S' can be replaced by a_1, a_3, a_6, a_8 from S (each finishes earlier)
We mapped S' to S and showed that S is at least as good

Recursive Solution

  Recursive-Activity-Selection(s, f, k, n)
  // s, f: arrays of start and finish times (assumption: sorted by finish time)
  // k: the activity chosen in the last call; n: the problem size
      m = k + 1
      while m <= n and s[m] < f[k]    // find the next activity starting after a_k ends
          m = m + 1
      if m <= n
          return {a_m} ∪ Recursive-Activity-Selection(s, f, m, n)
      else return Ø

Time complexity: Θ(n), assuming the arrays are already sorted; otherwise add O(n log n) for sorting.
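A runnable Python rendering of this recursive sketch; the function name, the sentinel activity a_0 with finish time 0, and the sample data are ours, not from the slides:

  # A sketch of the recursive version. s and f are 1-based arrays sorted by
  # finish time; index 0 holds a fictitious activity that finishes at time 0.
  def recursive_activity_selection(s, f, k, n):
      m = k + 1
      while m <= n and s[m] < f[k]:   # skip activities that start before a_k ends
          m += 1
      if m <= n:
          return [m] + recursive_activity_selection(s, f, m, n)
      return []

  s = [None, 1, 3, 0, 5, 8, 12]   # start times (index 0 unused)
  f = [0,    4, 5, 6, 7, 11, 16]  # finish times (f[0] = 0 is the sentinel)
  print(recursive_activity_selection(s, f, 0, 6))  # [1, 4, 5, 6]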

Iterative Solution

  Iterative-Activity-Selection(s, f)
  // s, f: arrays of start and finish times (assumption: sorted by finish time)
      n = s.length
      A = {a_1}
      k = 1
      for m = 2 to n
          if s[m] >= f[k]
              A = A ∪ {a_m}
              k = m
      return A
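An equivalent iterative sketch in Python, this time assuming the activities arrive as (start, finish) pairs and sorting them first (names and input format are ours):

  # Greedy activity selection: sort by finish time, then take each activity
  # whose start is no earlier than the finish of the last one chosen.
  def select_activities(activities):
      if not activities:
          return []
      ordered = sorted(activities, key=lambda a: a[1])
      chosen = [ordered[0]]
      for start, finish in ordered[1:]:
          if start >= chosen[-1][1]:   # compatible with the last chosen activity
              chosen.append((start, finish))
      return chosen

  # Example run: prints [(1, 4), (5, 7), (8, 11), (12, 16)]
  print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
                           (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]))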

Elements of Greedy Algorithms
Greedy-choice property
– At each step, we make a greedy (locally optimal) choice
Top-down solution
– The greedy choice is usually made independently of the sub-problems
– Usually made "before" solving the sub-problem
Optimal substructure
– The globally optimal solution can be composed from the local optima of the sub-problems

Elements of Greedy Algorithms
Proving a greedy solution is optimal
– Remember: not all problems have an optimal greedy solution
– If one does, you need to prove it
– Usually the proof includes mapping or converting any other optimal solution to the greedy solution

Review: The Knapsack Problem
0-1 knapsack problem:
– The thief must choose among n items, where the i-th item is worth v_i dollars and weighs w_i pounds
– Carrying at most W pounds, maximize value
– Note: assume v_i, w_i, and W are all integers
– "0-1" because each item must be taken or left in its entirety
A variation, the fractional knapsack problem:
– The thief can take fractions of items
– Think of the items in the 0-1 problem as gold ingots, and in the fractional problem as buckets of gold dust

Review: The Knapsack Problem and Optimal Substructure
Both variations exhibit optimal substructure
To show this for the 0-1 problem, consider the most valuable load weighing at most W pounds
– If we remove item j from the load, what do we know about the remaining load?
– A: the remainder must be the most valuable load weighing at most W − w_j that the thief could take from the museum, excluding item j

Solving the Knapsack Problem
The optimal solution to the 0-1 problem cannot be found with the same greedy strategy
– Greedy strategy: take items in order of dollars/pound
– Example: 3 items weighing 10, 20, and 30 pounds; the knapsack can hold 50 pounds
– Suppose the 3 items are worth $60, $100, and $120. Will the greedy strategy work?

Knapsack - Greedy Strategy Does Not Work
Item 1: $60, 10 pounds ($6/pound)
Item 2: $100, 20 pounds ($5/pound)
Item 3: $120, 30 pounds ($4/pound)
Knapsack capacity: W = 50 pounds
Greedy choice:
– Compute the benefit per pound
– Sort the items based on these values
Greedy takes item 1 and item 2: $60 + $100 = $160
But taking item 2 and item 3 gives $100 + $120 = $220
Not optimal

Solving the Knapsack Problem
The optimal solution to the fractional knapsack problem can be found with a greedy algorithm
Item 1: $60, 10 pounds ($6/pound)
Item 2: $100, 20 pounds ($5/pound)
Item 3: $120, 30 pounds ($4/pound)
Knapsack capacity: W = 50 pounds
Greedy choice:
– Compute the benefit per pound
– Sort the items based on these values
– Take as much as you can from the top items in the list
Take all of item 1, all of item 2, and 2/3 of item 3: $60 + $100 + $80 = $240
Optimal
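A minimal Python sketch of this greedy strategy, assuming each item is a (value, weight) pair (the function name and input format are ours):

  # Fractional knapsack: take items in decreasing value-per-pound order,
  # splitting the last item if it does not fit entirely.
  def fractional_knapsack(items, capacity):
      total = 0.0
      for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
          if capacity <= 0:
              break
          take = min(weight, capacity)      # whole item, or the fraction that fits
          total += value * (take / weight)
          capacity -= take
      return total

  # The slides' example: prints 240.0
  print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))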

The Knapsack Problem: Greedy vs. Dynamic
The fractional problem can be solved greedily
The 0-1 problem cannot be solved with a greedy approach
– As you have seen, however, it can be solved with dynamic programming
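For contrast, here is a short dynamic-programming sketch for the 0-1 problem; this is the standard O(nW) table, not something spelled out on these slides:

  # 0-1 knapsack by dynamic programming: best[c] = max value achievable
  # with capacity c using the items considered so far.
  def knapsack_01(items, capacity):
      best = [0] * (capacity + 1)
      for value, weight in items:
          # iterate capacities downward so each item is used at most once
          for c in range(capacity, weight - 1, -1):
              best[c] = max(best[c], best[c - weight] + value)
      return best[capacity]

  # The slides' example: prints 220 (items 2 and 3)
  print(knapsack_01([(60, 10), (100, 20), (120, 30)], 50))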

Huffman Code
Computer data encoding:
– How do we represent data in binary?
Historical solution:
– Fixed-length codes
– Encode every symbol by a unique binary string of a fixed length
– Examples: ASCII (7-bit code), EBCDIC (8-bit code), …

American Standard Code for Information Interchange

ASCII Example: AABCAA
In 7-bit ASCII, A = 1000001, B = 1000010, C = 1000011, so AABCAA is encoded as:
1000001 1000001 1000010 1000011 1000001 1000001

Total space usage in bits:
Assume an ℓ-bit fixed-length code. A file of n characters needs nℓ bits.

Variable-Length Codes
Idea: to save space, use fewer bits for frequent characters and more bits for rare characters.
Example:
– Suppose an alphabet of 3 symbols: {A, B, C}
– Suppose the file has 1,000,000 characters
– A fixed-length code needs 2 bits per symbol, for a total of 2,000,000 bits

Variable-Length Codes - Example
Suppose the frequency distribution of the characters is:
  A: 999,000    B: 500    C: 500
and we encode them as:
  A: 0    B: 10    C: 11
Note that the code of A has length 1, and the codes for B and C have length 2.

Total space usage in bits:
Fixed code: 1,000,000 × 2 = 2,000,000
Variable code: 999,000 × 1 + 500 × 2 + 500 × 2 = 1,001,000
A savings of almost 50%
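A quick Python sanity check of this arithmetic, using the frequencies and codes above:

  # Compare fixed-length vs. variable-length encoded sizes
  freq = {'A': 999_000, 'B': 500, 'C': 500}
  code = {'A': '0', 'B': '10', 'C': '11'}

  fixed_bits = sum(freq.values()) * 2                        # 2 bits per symbol
  variable_bits = sum(freq[s] * len(code[s]) for s in freq)  # weighted code lengths
  print(fixed_bits, variable_bits)  # 2000000 1001000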

How do we decode?
With a fixed-length code, we know where every character starts, since they all have the same number of bits.
Example: A = 00, B = 01, C = 10
  00 00 00 01 01 10 10 10 01 10 01 00 00 10 10
  A  A  A  B  B  C  C  C  B  C  B  A  A  C  C

How do we decode?
With a variable-length code, we use an idea called a prefix code, where no code is a prefix of another.
Example: A = 0, B = 10, C = 11
None of the above codes is a prefix of another.

How do we decode?
Example: A = 0, B = 10, C = 11
So, for the string A A A B B C C C B C B A A C C, the encoding is:
  0 0 0 10 10 11 11 11 10 11 10 0 0 11 11

Prefix Code
Example: A = 0, B = 10, C = 11
Decode the string 0001010111111101110001111: scanning left to right, each codeword is recognized as soon as it completes, giving AAABBCCCBCBAACC.
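A small Python sketch of this decoding (names are ours): grow a bit buffer one bit at a time until it matches a codeword, which is unambiguous precisely because no codeword is a prefix of another.

  # Decode a prefix-code bit string: extend a buffer one bit at a time and
  # emit a symbol as soon as the buffer matches a codeword.
  def prefix_decode(bits, code):
      decode_table = {v: k for k, v in code.items()}
      out, buffer = [], ''
      for bit in bits:
          buffer += bit
          if buffer in decode_table:
              out.append(decode_table[buffer])
              buffer = ''
      return ''.join(out)

  code = {'A': '0', 'B': '10', 'C': '11'}
  print(prefix_decode('0001010111111101110001111', code))  # AAABBCCCBCBAACC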

Desiderata:
Construct a variable-length code for a given file with the following properties:
1. It is a prefix code.
2. It uses the shortest possible codes.
3. It is efficient to construct.

Idea
Consider a binary tree, with:
– 0 meaning a left turn
– 1 meaning a right turn
(Figure: a binary tree with leaves A, B, C, D)

Idea
Consider the paths from the root to each of the leaves A, B, C, D:
A: 0
B: 10
C: 110
D: 111

Observe:
1. This is a prefix code, since each of the leaves has a path ending in it, without continuation.
2. If the tree is full, then we are not "wasting" bits.
3. If we make sure that the more frequent symbols are closer to the root, then they will have shorter codes.

Greedy Algorithm:
1. Consider all pairs ⟨frequency, symbol⟩.
2. Choose the two lowest frequencies, and make them siblings, with their new parent having the combined frequency.
3. Iterate.

Greedy Algorithm Example:
Alphabet: A, B, C, D, E, F
Frequency table:
  A: 10   B: 20   C: 30   D: 40   E: 50   F: 60
Total file length: 210

Algorithm Run:
Initial forest: A:10  B:20  C:30  D:40  E:50  F:60

Algorithm Run:
Merge A:10 and B:20 into X:30
Roots: X:30  C:30  D:40  E:50  F:60

Algorithm Run:
Merge X:30 and C:30 into Y:60
Roots: Y:60  D:40  E:50  F:60

Algorithm Run:
Merge D:40 and E:50 into Z:90
Roots: Y:60  F:60  Z:90

Algorithm Run:
Merge Y:60 and F:60 into W:120
Roots: Z:90  W:120

Algorithm Run:
Merge Z:90 and W:120 into the root V:210
One tree remains; the algorithm stops.

The Huffman encoding:
A: 1000   B: 1001   C: 101   D: 00   E: 01   F: 11
File size: 10×4 + 20×4 + 30×3 + 40×2 + 50×2 + 60×2 = 40 + 80 + 90 + 80 + 100 + 120 = 510 bits

Note the savings:
The Huffman code required 510 bits for the file.
A fixed-length code needs 3 bits for 6 characters; the file has 210 characters, for a total of 630 bits.

Greedy Algorithm
– Initialize a tree of a single node for each symbol.
– Keep the roots of all subtrees in a priority queue (keyed by frequency).
– Iterate until only one tree is left:
  Merge the two smallest-frequency subtrees into a single subtree with two children, and insert it into the priority queue.

Huffman Algorithm

  Huffman(C)
      n = |C|
      Q = C                     // Q is a binary min-heap; Θ(n) Build-Heap
      for i = 1 to n - 1
          z = Allocate-Node()
          x = Extract-Min(Q)    // Θ(lg n), executed n - 1 times
          y = Extract-Min(Q)    // Θ(lg n), executed n - 1 times
          left(z) = x
          right(z) = y
          f(z) = f(x) + f(y)
          Insert(Q, z)          // Θ(lg n), executed n - 1 times
      return Extract-Min(Q)     // return the root of the tree

Total run time: Θ(n lg n)
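As a runnable counterpart, a compact Python sketch built on heapq; the tree representation, tiebreaking counter, and function name are our choices, not part of the slides:

  import heapq

  # Build a Huffman code from a {symbol: frequency} map.
  def huffman_code(freq):
      # Heap entries are (frequency, tiebreaker, tree); a tree is either a
      # symbol (leaf) or a (left, right) pair. The tiebreaker keeps equal
      # frequencies from ever comparing two trees directly.
      heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
      heapq.heapify(heap)
      count = len(heap)
      while len(heap) > 1:
          f1, _, t1 = heapq.heappop(heap)  # the two smallest frequencies...
          f2, _, t2 = heapq.heappop(heap)
          heapq.heappush(heap, (f1 + f2, count, (t1, t2)))  # ...become siblings
          count += 1
      codes = {}
      def walk(tree, prefix):              # left edge = 0, right edge = 1
          if isinstance(tree, tuple):
              walk(tree[0], prefix + '0')
              walk(tree[1], prefix + '1')
          else:
              codes[tree] = prefix or '0'  # lone-symbol edge case
      walk(heap[0][2], '')
      return codes

  freq = {'A': 10, 'B': 20, 'C': 30, 'D': 40, 'E': 50, 'F': 60}
  codes = huffman_code(freq)
  print(codes)
  print(sum(freq[s] * len(codes[s]) for s in freq))  # 510, as on the slide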

Algorithm Correctness:
Need to prove two things for greedy algorithms:
– Greedy-choice property: the locally optimal choice is indeed part of some global optimum.
– Optimal-substructure property: when we recurse on the remainder and combine its optimum with the greedy choice, we get a global optimum.

Huffman Algorithm Correctness:
Greedy-choice property: there exists a minimum-cost prefix tree where the two smallest-frequency characters are siblings at the end of a longest path from the root.
This means that the greedy choice does not hurt finding the optimum.

Algorithm Correctness:
Optimal-substructure property: an optimal solution to the reduced problem, obtained by merging the two least frequent elements into one, yields an optimal solution to the original problem once the two elements are added back.

Algorithm Correctness:
Greedy-choice property: there exists a minimum-cost tree where the minimum-frequency elements are siblings on a longest path.
Proof by contradiction: assume that is not the situation. Say a, b are the elements with the smallest frequencies, and x, y are the two sibling elements on a longest path.

Algorithm Correctness:
(Figure: tree CT with siblings x, y at depth d_y on the longest path, and a at depth d_a.)
We know about depth and frequency: d_a ≤ d_y and f_a ≤ f_y.

Algorithm Correctness:
We also know that the code tree CT makes the cost ∑_σ f_σ·d_σ as small as possible.
Now exchange a and y.

Algorithm Correctness:
Let CT' be the tree after the exchange. Since d_a ≤ d_y and f_a ≤ f_y, we have (f_y − f_a)(d_y − d_a) ≥ 0, i.e., f_a·d_a + f_y·d_y ≥ f_y·d_a + f_a·d_y. Therefore:
  cost(CT) = ∑_σ f_σ·d_σ
           = ∑_{σ≠a,y} f_σ·d_σ + f_a·d_a + f_y·d_y
           ≥ ∑_{σ≠a,y} f_σ·d_σ + f_y·d_a + f_a·d_y
           = cost(CT')

Algorithm Correctness:
Now do the same exchange for b and x (b at depth d_b, x at depth d_x).

Algorithm Correctness:
The result is an optimal code tree CT'' in which a and b are siblings on a longest path.

Algorithm Correctness:
Optimal-substructure property: let a, b be the symbols with the smallest frequencies. Let x be a new symbol whose frequency is f_x = f_a + f_b. Delete characters a and b, and find the optimal code tree CT for the reduced alphabet. Then CT' = CT ∪ {a, b}, obtained by replacing the leaf x with an internal node whose children are a and b, is an optimal tree for the original alphabet.

(Figure: tree CT with leaf x, and tree CT' obtained from it by giving x two children a and b, where f_x = f_a + f_b.)

Algorithm Correctness:
  cost(CT') = ∑_σ f_σ·d'_σ
            = ∑_{σ≠a,b} f_σ·d'_σ + f_a·d'_a + f_b·d'_b
            = ∑_{σ≠a,b} f_σ·d'_σ + f_a·(d_x + 1) + f_b·(d_x + 1)
            = ∑_{σ≠a,b} f_σ·d'_σ + (f_a + f_b)·(d_x + 1)
            = ∑_{σ≠x} f_σ·d_σ + f_x·d_x + f_x
            = cost(CT) + f_x

(Same figure as before.) Conclusion: cost(CT) + f_x = cost(CT').

Algorithm Correctness:
Assume CT' is not optimal. By the previous lemma there is a tree CT'' that is optimal and in which a and b are siblings. So cost(CT'') < cost(CT').

Algorithm Correctness:
Consider CT''', obtained from CT'' by merging the siblings a and b into a single leaf x with f_x = f_a + f_b. By a similar argument: cost(CT''') + f_x = cost(CT'').

Algorithm Correctness:
We get: cost(CT''') = cost(CT'') − f_x < cost(CT') − f_x = cost(CT), and this contradicts the minimality of cost(CT).

Greedy vs. Dynamic
Greedy algorithms
– Can assemble a globally optimal solution by making locally optimal choices
– Make each choice before solving the sub-problems
– Top-down (simpler and more efficient)
– Can solve some problems optimally
Dynamic programming
– Each choice depends on knowing the optimal solutions to sub-problems; solve all sub-problems
– Bottom-up (slower)
– Can solve more problems optimally