Presentation transcript:

Greedy Algorithms
For optimization problems (an input, an output, and an objective to optimize): make decisions "greedily", one at a time; previous decisions are never reconsidered.

Greedy Algorithms: Activity selection
Input: a set of time-intervals
Output: a subset of pairwise non-overlapping intervals
Objective: maximize the number of selected intervals
Greedy algorithm: always pick the compatible interval with the earliest finish time (the standard greedy choice for this problem); the slides step through an example on a timeline.

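The greedy rule above can be sketched in Python; the function name and the sample intervals are illustrative, not from the slides.

```python
def select_activities(intervals):
    """Greedy activity selection: maximize the NUMBER of intervals.

    Repeatedly take the compatible interval with the earliest finish
    time; earlier decisions are never reconsidered.
    """
    selected = []
    last_finish = float("-inf")
    for begin, finish in sorted(intervals, key=lambda iv: iv[1]):
        if begin >= last_finish:  # compatible with everything taken so far
            selected.append((begin, finish))
            last_finish = finish
    return selected


# Illustrative input: 8 intervals, of which at most 3 are pairwise compatible.
print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9),
                         (5, 9), (6, 10), (8, 11)]))
# -> [(1, 4), (5, 7), (8, 11)]
```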
Greedy Algorithms: Activity selection 2
Input: a set of time-intervals
Output: a subset of pairwise non-overlapping intervals
Objective: maximize the total time selected
A greedy algorithm? No simple greedy rule works here. Any algorithm at all?

Backtracking
"Brute force": try all possibilities.
Running time? O(n!) if we try the intervals in every order.
Anything better for Activity selection 2? Yes: dynamic programming.

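The brute-force baseline for Activity selection 2 can be sketched as follows (function names are mine). Enumerating subsets instead of orderings already cuts the n! of the slide down to 2^n, but the running time is still exponential:

```python
from itertools import combinations


def compatible(selection):
    """True if no two intervals in the selection overlap."""
    ivs = sorted(selection, key=lambda iv: iv[1])
    return all(ivs[i][1] <= ivs[i + 1][0] for i in range(len(ivs) - 1))


def brute_force_total_time(intervals):
    """Try every subset (2**n of them); keep the best compatible one."""
    best = 0
    for r in range(1, len(intervals) + 1):
        for subset in combinations(intervals, r):
            if compatible(subset):
                best = max(best, sum(f - b for b, f in subset))
    return best


print(brute_force_total_time([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9),
                              (5, 9), (6, 10), (8, 11)]))  # -> 10
```

Only usable for tiny inputs, but handy as a correctness check for the dynamic program that follows.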
Dynamic Programming: Activity selection 2
Idea: remember the values for smaller instances; use them to compute the value for a larger instance.

First attempt. Let S[k] = the optimal total length considering only the first k intervals. Which value are we ultimately looking for? S[n]. But how to compute S[k+1], having computed S[1], ..., S[k]? Knowing only the optimal length is not enough: we also need to know where the optimal selection ends.

Refinement. Let S[k,j] = the optimal total length considering only the first k intervals, for a selection ending with interval j. Which value are we ultimately looking for? max_j S[n,j]. How to compute S[k,j], having computed S[k',j'] for every (k',j') such that k' < k? Writing b[j] and f[j] for the begin and finish times of interval j:

S[k,j] = max over j' with f[j'] <= b[j] of [ S[k-1,j'] + (f[j] - b[j]) ]

This outputs the optimal time; how to find the optimal selection? Record which j' achieves each maximum and walk those choices backwards.

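The recurrence above can be implemented directly. A sketch (function names mine): after sorting by finish time, the table S[k,j] collapses to one entry per interval, and parent pointers answer the "how to find the optimal selection?" question.

```python
def max_total_time(intervals):
    """DP for Activity selection 2: maximize the total time selected.

    best[j] = optimal total length of a compatible selection ending with
    interval j (intervals sorted by finish time), mirroring S[k, j] from
    the slides; parent[j] records the maximizing j'.
    """
    ivs = sorted(intervals, key=lambda iv: iv[1])
    n = len(ivs)
    if n == 0:
        return 0, []
    best = [0] * n
    parent = [-1] * n
    for j, (b, f) in enumerate(ivs):
        best[j] = f - b  # interval j on its own
        for jp in range(j):
            # j' is compatible with j if it finishes before j begins
            if ivs[jp][1] <= b and best[jp] + (f - b) > best[j]:
                best[j] = best[jp] + (f - b)
                parent[j] = jp
    j = max(range(n), key=best.__getitem__)
    total, selection = best[j], []
    while j != -1:  # walk the recorded choices backwards
        selection.append(ivs[j])
        j = parent[j]
    return total, selection[::-1]


print(max_total_time([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9),
                      (5, 9), (6, 10), (8, 11)]))
# -> (10, [(0, 6), (6, 10)])
```

The two nested loops give O(n^2) time, a large improvement over the exponential brute force.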
Approaches to Problem Solving: greedy algorithms, dynamic programming, backtracking, divide-and-conquer. Our focus this week: greedy algorithms.

Problem: Huffman Coding
Binary character code = an assignment of binary strings (codewords) to characters.
e.g. ASCII: a fixed-length code. How to decode? Easy: cut the bit stream into blocks of equal length.
e.g. a variable-length code: A = 0, B = 10, C = 11, ... How to decode? Read left to right, emitting a character as soon as the bits read so far form a codeword.
DEF: A code is prefix-free if no codeword is a prefix of another codeword. The code above is prefix-free, so this left-to-right decoding is unambiguous.
e.g. another code: A = 1, B = 10, C = 11, ... This code is not prefix-free (1 is a prefix of both 10 and 11), so decoding is ambiguous.

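The left-to-right decoding just described can be sketched as follows, using the prefix-free code from the slide (A = 0, B = 10, C = 11):

```python
CODE = {"A": "0", "B": "10", "C": "11"}  # prefix-free code from the slide


def decode(bits, code):
    """Greedy left-to-right decoding; prefix-freeness makes it unambiguous."""
    inverse = {w: ch for ch, w in code.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:  # the buffer is a complete codeword
            out.append(inverse[buf])
            buf = ""
    if buf:
        raise ValueError("leftover bits are not a codeword: " + buf)
    return "".join(out)


print(decode("010110", CODE))  # -> ABCA
```

With the non-prefix-free code (A = 1, B = 10, C = 11) this scan would misfire: after reading the 1 of a B, it would already emit an A.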
Problem: Huffman Coding (optimal prefix-free code)
Input: an alphabet with frequencies
Output: a prefix code
Objective: minimize the expected number of bits per character

English letter frequencies:
E        %   A  8.4966%   R  7.5809%   I  7.5448%   O  7.1635%
T  6.9509%   N  6.6544%   S  5.7351%   L  5.4893%   C  4.5388%
U  3.6308%   D  3.3844%   P  3.1671%   M  3.0129%   H  3.0034%
G  2.4705%   B  2.0720%   F  1.8121%   Y  1.7779%   W  1.2899%
K  1.1016%   V  1.0074%   X  0.2902%   Z  0.2722%   J  0.1965%
Q  0.1962%

Problem: Huffman Coding (optimal prefix-free code)
Example alphabet: A 60%, B 20%, C 10%, D 10%.
A fixed-length code needs 2 bits per character. Optimal? Can we do better?
YES: 1.6 bits per character on average, e.g. with the prefix-free code A = 0, B = 10, C = 110, D = 111 (0.6·1 + 0.2·2 + 0.1·3 + 0.1·3 = 1.6).

Problem: Huffman Coding (optimal prefix-free code)

Huffman([a_1, f_1], [a_2, f_2], ..., [a_n, f_n]):
  if n = 1 then
    code[a_1] ← ""
  else
    let f_i, f_j be the 2 smallest frequencies
    Huffman(the list with [a_i, f_i] and [a_j, f_j] replaced by the merged symbol [a_i, f_i + f_j])
    code[a_j] ← code[a_i] + "0"
    code[a_i] ← code[a_i] + "1"

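An iterative, heap-based equivalent of the recursive pseudocode above (repeatedly merge the two least frequent subtrees); on the A/B/C/D example it reaches the 1.6 bits per character quoted earlier. This is a sketch, not the slides' code.

```python
import heapq


def huffman(freqs):
    """Return {symbol: codeword} for a dict of symbol -> frequency.

    Builds the code bottom-up: pop the two least frequent subtrees,
    prefix their codewords with 1 and 0, push the merged subtree back.
    """
    code = {sym: "" for sym in freqs}
    # Heap entries: (frequency, tiebreak id, symbols in this subtree).
    heap = [(f, i, [s]) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    if len(heap) == 1:
        return code  # single symbol: empty codeword, as in the pseudocode
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, syms1 = heapq.heappop(heap)  # smallest frequency
        f2, _, syms2 = heapq.heappop(heap)  # second smallest
        for s in syms1:
            code[s] = "1" + code[s]
        for s in syms2:
            code[s] = "0" + code[s]
        heapq.heappush(heap, (f1 + f2, tiebreak, syms1 + syms2))
        tiebreak += 1
    return code


freqs = {"A": 0.6, "B": 0.2, "C": 0.1, "D": 0.1}
code = huffman(freqs)
print(sum(f * len(code[s]) for s, f in freqs.items()))  # ~1.6 bits/character
```

The codeword lengths come out as 1, 2, 3, 3 (A shortest), matching the optimal code on the earlier slide.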
Problem: Huffman Coding

Lemma 1: Let x, y be symbols with frequencies f_x < f_y. Then in an optimal prefix code, length(C_x) >= length(C_y).

Lemma 2: Let C = w0 be a longest codeword in an optimal code. Then w1 is also a codeword.

Lemma 3: Let x, y be the two symbols with the smallest frequencies. Then there exists an optimal prefix code such that the codewords for x and y differ only in the last bit.

Theorem: The prefix code output by the Huffman algorithm is optimal.