
Algorithm Design Techniques: Greedy Algorithms

Introduction

Algorithm Design Techniques
– Design of algorithms
– Algorithms commonly used to solve problems: Greedy, Divide and Conquer, Dynamic Programming, Randomized, Backtracking
For each:
– General approach
– Examples
– Time and space complexity (where appropriate)

Greedy Algorithms

Choose the best option during each phase
– Examples: Dijkstra, Prim, Kruskal
Making change
– Choose the largest bill at each round
– Does this always work? Are there bad examples where greedy does not work?

Greedy Algorithms

Must have
– Greedy-choice property: a globally optimal solution can be arrived at by making a locally optimal choice
– Optimal substructure: an optimal solution to a problem contains optimal solutions to its subproblems

Making Change

Greedy choice property
– The highest denomination coin ≤ n will reside in some optimal solution (for U.S. denominations) – if not, it would be replaced by two or more smaller coins, which means more coins and is not optimal
– Is this also true for denominations {1, 7, 10}? No: for n = 14, greedy gives 10 + 1 + 1 + 1 + 1 (5 coins), while 7 + 7 uses only 2
Optimal substructure
– The solution for (n – highest denomination coin) is optimal
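
To see the failure concretely, here is a minimal sketch (Python; the function name and test values are ours) of the greedy change-maker:

    def greedy_change(n, denominations):
        # Repeatedly take the largest coin that still fits.
        coins = []
        for d in sorted(denominations, reverse=True):
            while n >= d:
                coins.append(d)
                n -= d
        return coins

    # Greedy is not optimal for denominations {1, 7, 10}:
    print(greedy_change(14, [1, 7, 10]))  # [10, 1, 1, 1, 1] -- 5 coins, but 7 + 7 needs only 2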

Scheduling

Given jobs j1, j2, j3, ..., jn with known running times t1, t2, t3, ..., tn – what is the best way to schedule the jobs to minimize the average completion time?

    Job    Time
    j1     15
    j2     8
    j3     3
    j4     10

Scheduling

Run j1, j2, j3, j4 in order: completion times are 15, 23, 26, 36
Average completion time = (15 + 23 + 26 + 36)/4 = 25

Run j3, j2, j4, j1 (shortest first): completion times are 3, 11, 21, 36
Average completion time = (3 + 11 + 21 + 36)/4 = 17.75
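
A short sketch (Python; the helper name is ours) that reproduces both averages:

    def average_completion_time(times):
        # The k-th job completes at the sum of the first k running times.
        total, elapsed = 0, 0
        for t in times:
            elapsed += t
            total += elapsed
        return total / len(times)

    jobs = [15, 8, 3, 10]                         # j1..j4 from the table above
    print(average_completion_time(jobs))          # 25.0
    print(average_completion_time(sorted(jobs)))  # 17.75 -- shortest job first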

Scheduling

Greedy-choice property: if the shortest job (j3) does not go first, the jobs scheduled before it each complete 3 time units sooner than if it did, but j3 itself is postponed by the total time of all jobs before it – at least as large a penalty, since each of those jobs takes at least 3 units
Optimal substructure: if the shortest job is removed from an optimal schedule, the remaining schedule for the n – 1 jobs is optimal

Optimality Proof

The total cost of a schedule is

    C = \sum_{k=1}^{N} (N - k + 1) t_{i_k}
      = t_{i_1} + (t_{i_1} + t_{i_2}) + (t_{i_1} + t_{i_2} + t_{i_3}) + \cdots + (t_{i_1} + \cdots + t_{i_N})
      = (N + 1) \sum_{k=1}^{N} t_{i_k} - \sum_{k=1}^{N} k \, t_{i_k}

The first term is independent of the ordering; as the second term increases, the total cost becomes smaller.

Scheduling

Suppose an ordering has positions x > y with t_{i_x} < t_{i_y}, i.e. a shorter job scheduled after a longer one. Swapping the two jobs (smaller first) increases the second term, decreasing the total cost.

Show: x t_{i_x} + y t_{i_y} < y t_{i_x} + x t_{i_y}

    x t_{i_x} + y t_{i_y} = x t_{i_x} + y t_{i_x} + y (t_{i_y} - t_{i_x})
                          < x t_{i_x} + y t_{i_x} + x (t_{i_y} - t_{i_x})    (since x > y and t_{i_y} - t_{i_x} > 0)
                          = y t_{i_x} + x t_{i_x} + x t_{i_y} - x t_{i_x}
                          = y t_{i_x} + x t_{i_y}

More Scheduling

Multiple processor case
– Algorithm?

More Scheduling

Multiple processor case
– Algorithm: order the jobs shortest first, then schedule them round-robin across the processors
Minimizing the final completion time instead
– When is this useful? How is this different?
– That problem is NP-complete!
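
A minimal sketch of the multiprocessor rule (Python; the job times beyond the earlier table are made up for illustration):

    def schedule_round_robin(times, processors):
        # Shortest job first, dealt out round-robin to the processors.
        queues = [[] for _ in range(processors)]
        for k, t in enumerate(sorted(times)):
            queues[k % processors].append(t)
        return queues

    print(schedule_round_robin([15, 8, 3, 10, 6, 4], 2))
    # [[3, 6, 10], [4, 8, 15]]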

Huffman Codes

Roughly 100 printable ASCII characters
Need ⌈log₂ 100⌉ = 7 bits to represent each character
Large file = lots of bits!
Would like to reduce the number of bits

Huffman Codes

Idea – encode frequently occurring characters using fewer bits
Need to make sure all characters are distinguishable
– Suppose 01 = A and 0101 = B
– Then 010101 = ? It could be AAA, AB, or BA
No character code should be a prefix of another character code

Huffman Codes

Goal: find a full binary tree of minimum cost in which the characters are stored at the leaves
Cost of a tree: the sum over all characters of the frequency of the character times its depth in the tree, cost(T) = \sum_c f(c) \cdot depth_T(c)
– Frequently occurring characters should be highest in the tree

Huffman Codes

(Figure: the Huffman tree for this code, whose leaves are the characters below)

    Character   Code     Frequency   Total Bits
    a           001      10          30
    e           01       15          30
    i           10       12          24
    s           00000    3           15
    t           0001     4           16
    space       11       13          26
    newline     00001    1           5
    total                58          146

Huffman’s Algorithm

How do we produce a code?
– Maintain a forest of trees; the weight of a tree is the sum of the frequencies of its leaves
– Start with C trees, one per character; the weight of each is that character's frequency
– Until there is only 1 tree: choose the 2 trees with the smallest weights and merge them by creating a new root and making each tree a left or right subtree
– Running time: O(C log C)
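
A compact sketch of the merging loop (Python; heapq stands in for the priority queue, and the tie-breaking counter is an implementation detail, not part of the algorithm):

    import heapq

    def huffman(frequencies):
        # Forest entries are (weight, tie-breaker, tree); a tree is a character
        # or a (left, right) pair created by a merge.
        heap = [(f, i, ch) for i, (ch, f) in enumerate(frequencies.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            w1, _, t1 = heapq.heappop(heap)   # the two smallest-weight trees...
            w2, _, t2 = heapq.heappop(heap)
            count += 1
            heapq.heappush(heap, (w1 + w2, count, (t1, t2)))  # ...merged under a new root
        return heap[0][2]

    def codes(tree, prefix=""):
        # Left edge = 0, right edge = 1; leaves hold the characters.
        if isinstance(tree, str):
            return {tree: prefix or "0"}
        left, right = tree
        return {**codes(left, prefix + "0"), **codes(right, prefix + "1")}

    # Frequencies as in the example table above:
    print(codes(huffman({"a": 10, "e": 15, "i": 12, "s": 3, "t": 4, "sp": 13, "nl": 1})))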

Optimality Proof – Idea

1. The tree must be full – if it is not, move a leaf with no sibling up to its parent
2. The least frequent characters are the deepest nodes – if not, a node can be swapped with an ancestor
3. Characters at the same depth can be swapped
4. As trees are merged, optimality holds

Optimality Proof – Idea

Greedy-choice property: given x and y – the two characters with the lowest frequencies in alphabet C – there exists an optimal prefix code for C in which the codewords for x and y have the same length and differ only in the last bit
– Take an arbitrary optimal prefix code and modify its tree into one representing another optimal prefix code in which x and y are sibling leaves of maximum depth

Optimality Proof – Idea

Optimal substructure: let C' = C – {x, y} ∪ {z}, where f[z] = f[x] + f[y]
– Let T' be an optimal tree for C'
– Replace the leaf z in T' with an internal node having x and y as children
– The result is an optimal prefix code for C

Approximate Bin Packing

N items of sizes s_1, s_2, ..., s_N, with 0 < s_i ≤ 1
Goal: pack the items into the fewest number of bins of size 1
An NP-complete problem (a relative of the knapsack problem), but we can use greedy algorithms to produce solutions not too far from optimal
Examples?
– Saving data to external media

Example – Optimal Packing

Input: .2, .5, .4, .7, .1, .3, .8
The sizes sum to exactly 3.0, so at least 3 bins are needed – and 3 suffice: (.8, .2) (.7, .3) (.5, .4, .1)

On-line vs Off-line

On-line
– Process one item at a time
– Cannot move an item once it is placed
Off-line
– Look at all items before placing the first item

On-line Algorithms

On-line algorithms cannot guarantee an optimal solution
– Problem: we cannot know when the input will end
– Consider M small items of size ½ – ε followed by M large items of size ½ + ε
– These fit into M bins with 1 large and 1 small item in each bin
– Since all the small items come first, an algorithm anticipating the large items places them in M separate bins
– But if the input is only the M small items, we have used twice as many bins as necessary
– There are inputs that force any on-line bin-packing algorithm to use at least 4/3 the optimal number of bins

On-line Bin Packing Algorithms

– Next fit
– First fit
– Best fit

On-line Bin Packing Algorithms

Next fit
– Algorithm: if the item fits in the bin holding the last item, place it there; else place it in a new bin
– (.2, .5) (.4) (.7, .1) (.3) (.8)
– Running time?
– Let M be the optimal number of bins required to pack a list I of items. Then next fit never uses more than 2M bins: at most half of the space is wasted, since adjacent bins satisfy B_j + B_{j+1} > 1
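
A sketch of next fit (Python; items are sizes in (0, 1], helper names are ours):

    def next_fit(items):
        # Keep only the most recent bin open; never look back.
        bins = [[]]
        remaining = 1.0
        for item in items:
            if item > remaining:        # does not fit with the last item
                bins.append([])
                remaining = 1.0
            bins[-1].append(item)
            remaining -= item
        return bins

    print(next_fit([.2, .5, .4, .7, .1, .3, .8]))
    # [[0.2, 0.5], [0.4], [0.7, 0.1], [0.3], [0.8]]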

On-line Bin Packing Algorithms

First fit
– Algorithm: scan all bins and place the item in the first bin large enough to hold it; if no bin is large enough, create a new bin
– (.2, .5, .1) (.4, .3) (.7) (.8)
– Running time?
– Let M be the optimal number of bins required to pack a list I of items. Then first fit never uses more than ⌈(17/10)M⌉ bins.
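
First fit in the same style (the linear scan over all bins gives the naive quadratic version):

    def first_fit(items):
        # Scan bins left to right; open a new bin only when nothing fits.
        bins, space = [], []
        for item in items:
            for i, free in enumerate(space):
                if item <= free:
                    bins[i].append(item)
                    space[i] -= item
                    break
            else:
                bins.append([item])
                space.append(1.0 - item)
        return bins

    print(first_fit([.2, .5, .4, .7, .1, .3, .8]))
    # [[0.2, 0.5, 0.1], [0.4, 0.3], [0.7], [0.8]]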

On-line Bin Packing Algorithms

Best fit
– Algorithm: scan all bins and place the item in the bin with the tightest fit (the bin that will be fullest after the item is placed there); if no bin is large enough, create a new bin
– (.2, .5, .1) (.4) (.7, .3) (.8)
– Running time?
– Same worst-case performance bound as first fit.
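
Best fit differs from first fit only in which bin it picks: the one left with the least free space (a sketch; ties go to the earliest bin):

    def best_fit(items):
        bins, space = [], []
        for item in items:
            # Among bins that can hold the item, pick the one it fills tightest.
            best = min((i for i, free in enumerate(space) if item <= free),
                       key=lambda i: space[i], default=None)
            if best is None:
                bins.append([item])
                space.append(1.0 - item)
            else:
                bins[best].append(item)
                space[best] -= item
        return bins

    print(best_fit([.2, .5, .4, .7, .1, .3, .8]))
    # [[0.2, 0.5, 0.1], [0.4], [0.7, 0.3], [0.8]]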

Off-line Bin Packing

Sort the items in decreasing order first, for easier placement of the large items
Apply the first fit or best fit algorithm
– First fit decreasing: (.8, .2) (.7, .3) (.5, .4, .1)
Let M be the optimal number of bins required to pack a list I of items. Then first fit decreasing never uses more than (11/9)M + 4 bins.
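
First fit decreasing is then just sorting composed with the first_fit sketch above:

    def first_fit_decreasing(items):
        # Sort largest first, then hand the list to first fit.
        return first_fit(sorted(items, reverse=True))

    print(first_fit_decreasing([.2, .5, .4, .7, .1, .3, .8]))
    # [[0.8, 0.2], [0.7, 0.3], [0.5, 0.4, 0.1]]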