The Greedy Method

Outline and Reading
The Greedy Method Technique (§5.1)
Fractional Knapsack Problem (§5.1.1)
Task Scheduling (§5.1.2)
Huffman Code
Minimum Spanning Trees (§7.3) [future lecture]

The Greedy Method Technique
The greedy method is a general algorithm design paradigm, built on the following elements:
configurations: the different choices, collections, or values to find
objective function: a score assigned to configurations, which we want to either maximize or minimize
It works best when applied to problems with the greedy-choice property: a globally optimal solution can always be found by a series of local improvements from a starting configuration.

Making Change
Problem: A dollar amount to reach and a collection of coin denominations to use to get there.
Configuration: A dollar amount yet to return to the customer, plus the coins already returned.
Objective function: Minimize the number of coins returned.
Greedy solution: Always return the largest coin you can.
Example 1: Coins are valued $.32, $.08, $.01.
This system has the greedy-choice property, since no amount over $.32 can be made with a minimum number of coins by omitting a $.32 coin (and similarly for amounts over $.08 but under $.32).
Example 2: Coins are valued $.30, $.20, $.05, $.01.
This system does not have the greedy-choice property, since $.40 is best made with two $.20's, but the greedy solution picks three coins: $.30, $.05, and $.05.
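To make the failure concrete, here is a minimal sketch of the greedy change-making rule in Python (the function name greedy_change is mine, not from the slides); amounts are in cents to avoid floating-point issues.

```python
def greedy_change(amount_cents, denominations):
    """Repeatedly return the largest coin that still fits."""
    coins = []
    for coin in sorted(denominations, reverse=True):
        while amount_cents >= coin:
            amount_cents -= coin
            coins.append(coin)
    return coins

# Greedy works for the {32, 8, 1} system:
print(greedy_change(40, [32, 8, 1]))      # [32, 8] -- 2 coins, optimal

# ...but not for {30, 20, 5, 1}:
print(greedy_change(40, [30, 20, 5, 1]))  # [30, 5, 5] -- 3 coins; optimal is [20, 20]
```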

The Fractional Knapsack Problem
Given: A set S of n items, with each item i having
  bi - a positive benefit
  wi - a positive weight
Goal: Choose items with maximum total benefit but with weight at most W.
If we are allowed to take fractional amounts, then this is the fractional knapsack problem. In this case, we let xi denote the amount we take of item i.
Objective: maximize Σi∈S bi (xi / wi)
Constraint: Σi∈S xi ≤ W (with 0 ≤ xi ≤ wi for each item i)

Example
Given: A set S of n items, with each item i having
  bi - a positive benefit
  wi - a positive weight
Goal: Choose items with maximum total benefit but with weight at most W (a 10 ml "knapsack").

  Item:               1      2      3      4      5
  Weight:             4 ml   8 ml   2 ml   6 ml   1 ml
  Benefit:            $12    $32    $40    $30    $50
  Value ($ per ml):   3      4      20     5      50

Solution:
  1 ml of item 5
  2 ml of item 3
  6 ml of item 4
  1 ml of item 2

The Fractional Knapsack Algorithm
Greedy choice: Keep taking the item with the highest value (benefit-to-weight ratio), since Σi∈S bi (xi / wi) = Σi∈S vi xi, where vi = bi / wi.
Run time: O(n log n). Why? (Keep the items in a heap, or sort them once, by value.)
Correctness: Suppose there is a better solution. Then there is an item i with higher value than a chosen item j (vi > vj), but xi < wi and xj > 0. If we replace some of item j with item i, we get a better solution. How much of i: min{wi − xi, xj}. Thus, there is no better solution than the greedy one.

Algorithm fractionalKnapsack(S, W)
  Input: set S of items with benefit bi and weight wi; maximum total weight W
  Output: amount xi of each item i to maximize benefit with weight at most W
  for each item i in S
    xi ← 0
    vi ← bi / wi    {value}
  w ← 0             {total weight}
  while w < W
    remove item i with highest vi
    xi ← min{wi, W − w}
    w ← w + min{wi, W − w}
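Here is a minimal runnable sketch of this algorithm in Python (the function name fractional_knapsack is mine, not from the slides). It uses a heap keyed on value, matching the O(n log n) bound, and reproduces the 10 ml example above with total benefit $124.

```python
import heapq

def fractional_knapsack(items, W):
    """items: list of (benefit, weight) pairs; W: knapsack capacity.
    Returns (total_benefit, amounts), where amounts[i] is how much of item i is taken."""
    # Max-heap keyed on value = benefit / weight (negated for heapq's min-heap).
    heap = [(-b / w, i, b, w) for i, (b, w) in enumerate(items)]
    heapq.heapify(heap)
    amounts = [0.0] * len(items)
    total_benefit, remaining = 0.0, W
    while remaining > 0 and heap:
        neg_value, i, b, w = heapq.heappop(heap)  # item with the highest value
        take = min(w, remaining)                  # take all of it, or whatever fits
        amounts[i] = take
        total_benefit += (-neg_value) * take      # value * amount taken
        remaining -= take
    return total_benefit, amounts

# The example from the previous slide (benefit, weight in ml), capacity 10 ml:
items = [(12, 4), (32, 8), (40, 2), (30, 6), (50, 1)]
print(fractional_knapsack(items, 10))   # total benefit 124.0
```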

Task Scheduling
Given: a set T of n tasks, each having:
  A start time, si
  A finish time, fi (where si < fi)
Goal: Perform all the tasks using a minimum number of "machines."
[Figure: tasks scheduled on Machines 1-3 along a time axis from 1 to 9]

Task Scheduling Algorithm
Greedy choice: consider tasks by their start time and use as few machines as possible with this order.
Run time: O(n log n). Why? (Sort the tasks by start time.)
Correctness: Suppose there is a better schedule, i.e., one that uses k − 1 machines while the algorithm uses k. Let i be the first task scheduled on machine k. Task i must conflict with k − 1 other tasks, one on each of the other machines, all running at time si. So k tasks conflict pairwise at time si, which means there is no non-conflicting schedule using k − 1 machines.

Algorithm taskSchedule(T)
  Input: set T of tasks with start time si and finish time fi
  Output: non-conflicting schedule with minimum number of machines
  m ← 0   {number of machines}
  while T is not empty
    remove task i with smallest si
    if there is a machine j with no task conflicting with i then
      schedule i on machine j
    else
      m ← m + 1
      schedule i on machine m

Example
Given: a set T of n tasks, each having:
  A start time, si
  A finish time, fi (where si < fi)
Tasks: [1,4], [1,3], [2,5], [3,7], [4,7], [6,9], [7,8] (ordered by start time)
Goal: Perform all tasks on the minimum number of machines.
[Figure: the greedy schedule places these tasks on Machines 1-3 along a time axis from 1 to 9]
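A minimal Python sketch of the task-scheduling greedy (the function name task_schedule is mine, not from the slides): tasks are processed by start time, and a min-heap of machine finish times stands in for the "is there a machine j for i" test. On the example above it uses 3 machines.

```python
import heapq

def task_schedule(tasks):
    """tasks: list of (start, finish) pairs. Returns the number of machines used.
    A task may start exactly when another finishes on the same machine."""
    machines = []   # min-heap of finish times, one entry per machine in use
    for start, finish in sorted(tasks):          # consider tasks by start time
        if machines and machines[0] <= start:    # some machine is free by `start`
            heapq.heapreplace(machines, finish)  # reuse it
        else:
            heapq.heappush(machines, finish)     # open a new machine
    return len(machines)

# The example from this slide:
tasks = [(1, 4), (1, 3), (2, 5), (3, 7), (4, 7), (6, 9), (7, 8)]
print(task_schedule(tasks))   # 3
```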

Text Compression
Files can often be compressed: represented using fewer bytes than the standard representation.
Fixed-length encoding is somewhat wasteful, because some characters are more common than others. If a character appears frequently, it should have a shorter representation.
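A quick size check (my arithmetic, not from the slides): the string on the next slide, "beekeepers & bees", uses 8 distinct characters (b, e, k, p, r, s, space, &), so a fixed-length code needs ⌈log2 8⌉ = 3 bits per character instead of the standard 8-bit byte. A variable-length Huffman code does better still, because the most frequent character ('e') gets a shorter code than the rare ones.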

Compression
"beekeepers & bees"
Fixed-length (3-bit) encoding: 000 001 001 010 001 001 011 001 100 101 110 111 110 000 001 001 101 110
Variable-length (Huffman) encoding: 0 0 11110 0 0 11111 0 1011 100 1110 1010 1110 110 0 0 100

Compression
Huffman encodings are designed so that no code is a prefix of another code, so an encoded bit string can be decoded unambiguously by a single left-to-right scan.
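To see why the prefix property matters, here is a toy prefix-free code of my own (not the one built on the following slides) and a greedy left-to-right decoder:

```python
# Toy prefix-free code: no codeword is a prefix of another.
codes = {"e": "0", "s": "10", "b": "11"}
encoded = "".join(codes[c] for c in "bees")   # "110010"

# Decoding is unambiguous: emit a character as soon as the bits
# read so far match a codeword, then start over.
decode = {bits: ch for ch, bits in codes.items()}
out, buf = [], ""
for bit in encoded:
    buf += bit
    if buf in decode:
        out.append(decode[buf])
        buf = ""
print("".join(out))   # "bees"
```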

Compression
First construct a binary tree: on each pass through the main loop, choose the two lowest-count roots and merge them (ties don't matter). The count for the new parent is the sum of its children's counts.

Compression
[Figure: the Huffman tree being built step by step, merging the two lowest-count roots on each pass]

Compression
The code for each character is determined by the path from the root to the corresponding leaf:
  Right is 1
  Left is 0
For example, 'b' is right-right-left, so its code is 110.
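The tree-building loop and the code-assignment rule above combine into a short Python sketch (the function name huffman_codes is mine, not from the slides): a min-heap holds the current roots, the two lowest counts are merged each pass, and each character's code is read off its root-to-leaf path with left = 0 and right = 1. Because ties can be broken differently, the exact codes may differ from those shown in the slides' figures.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman tree for `text` and return {char: bit string}."""
    # Heap entries: (count, tiebreaker, tree). A tree is a leaf character
    # or a (left, right) pair of subtrees.
    heap = [(count, i, ch) for i, (ch, count) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        # Merge the two lowest-count roots; the parent's count is the sum of the children's.
        c1, _, t1 = heapq.heappop(heap)
        c2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (c1 + c2, tiebreak, (t1, t2)))
        tiebreak += 1
    codes = {}
    def walk(tree, path):
        if isinstance(tree, str):         # leaf: the path is this character's code
            codes[tree] = path or "0"
        else:
            walk(tree[0], path + "0")     # left child contributes a 0
            walk(tree[1], path + "1")     # right child contributes a 1
    walk(heap[0][2], "")
    return codes

codes = huffman_codes("beekeepers & bees")
print(codes)                                             # 'e' gets the shortest code
print("".join(codes[ch] for ch in "beekeepers & bees"))  # the compressed bit string
```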