4.8 Huffman Codes
These lecture slides are supplied by Mathijs de Weerd.


2 Data Compression
Q. Given a text that uses 32 symbols (26 different letters, space, and some punctuation characters), how can we encode this text in bits?
Q. Some symbols (e, t, a, o, i, n) are used far more often than others. How can we use this to reduce our encoding?
Q. How do we know when the next symbol begins?
Ex. c(a) = 01, c(b) = 010, c(e) = 1. What is 0101?

3 Data Compression
Q. Given a text that uses 32 symbols (26 different letters, space, and some punctuation characters), how can we encode this text in bits?
A. We can encode 2^5 = 32 different symbols using a fixed length of 5 bits per symbol. This is called fixed-length encoding.
Q. Some symbols (e, t, a, o, i, n) are used far more often than others. How can we use this to reduce our encoding?
A. Encode these characters with fewer bits, and the others with more bits.
Q. How do we know when the next symbol begins?
A. Use a separation symbol (like the pause in Morse code), or remove all ambiguity by ensuring that no code is a prefix of another one.
Ex. c(a) = 01, c(b) = 010, c(e) = 1. What is 0101? It is ambiguous: it parses both as "aa" (01·01) and as "be" (010·1).
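To make the ambiguity concrete, here is a small sketch (not from the slides) that enumerates every parse of 0101 under this non-prefix code; the code table and function name are illustrative.

```python
# Illustrative sketch: enumerate all ways to parse a bit string
# under a code that is NOT prefix-free (c(a)=01 is a prefix of c(b)=010).
CODE = {"a": "01", "b": "010", "e": "1"}

def parses(bits, prefix=""):
    """Yield every decoding of `bits` as a concatenation of codewords."""
    if not bits:
        yield prefix
        return
    for symbol, word in CODE.items():
        if bits.startswith(word):
            yield from parses(bits[len(word):], prefix + symbol)

print(list(parses("0101")))  # ['aa', 'be'] -- two valid readings, so decoding is ambiguous
```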

4 Prefix Codes
Definition. A prefix code for a set S is a function c that maps each x ∈ S to a string of 1s and 0s in such a way that for x, y ∈ S with x ≠ y, c(x) is not a prefix of c(y).
Ex. c(a) = 11, c(e) = 01, c(k) = 001, c(l) = 10, c(u) = 000
Q. What is the meaning of 1001000001?
Suppose frequencies are known in a text of 1G: f_a = 0.4, f_e = 0.2, f_k = 0.2, f_l = 0.1, f_u = 0.1
Q. What is the size of the encoded text?

5 Prefix Codes
Definition. A prefix code for a set S is a function c that maps each x ∈ S to a string of 1s and 0s in such a way that for x, y ∈ S with x ≠ y, c(x) is not a prefix of c(y).
Ex. c(a) = 11, c(e) = 01, c(k) = 001, c(l) = 10, c(u) = 000
Q. What is the meaning of 1001000001?
A. "leuk" (10·01·000·001)
Suppose frequencies are known in a text of 1G: f_a = 0.4, f_e = 0.2, f_k = 0.2, f_l = 0.1, f_u = 0.1
Q. What is the size of the encoded text?
A. 2·f_a + 2·f_e + 3·f_k + 2·f_l + 3·f_u = 2.3G
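A small runnable sketch (not from the slides, assuming the code table above) showing why greedy left-to-right decoding works for a prefix code, and checking the size computation; identifiers are illustrative.

```python
# Illustrative sketch: greedy decoding works for a prefix code because
# at each position at most one codeword can match.
CODE = {"a": "11", "e": "01", "k": "001", "l": "10", "u": "000"}

def decode(bits):
    out, i = [], 0
    while i < len(bits):
        for symbol, word in CODE.items():
            if bits.startswith(word, i):
                out.append(symbol)
                i += len(word)
                break
        else:
            raise ValueError("invalid encoding")
    return "".join(out)

FREQ = {"a": 0.4, "e": 0.2, "k": 0.2, "l": 0.1, "u": 0.1}

print(decode("1001000001"))                       # leuk
print(sum(FREQ[x] * len(CODE[x]) for x in FREQ))  # 2.3 bits per letter on average
```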

6 Optimal Prefix Codes
Definition. The average bits per letter of a prefix code c is the sum over all symbols of its frequency times the number of bits of its encoding:
ABL(c) = Σ_{x ∈ S} f_x · |c(x)|
We would like to find a prefix code that has the lowest possible average bits per letter.
Suppose we model a code in a binary tree…

7 Representing Prefix Codes using Binary Trees
Ex. c(a) = 11, c(e) = 01, c(k) = 001, c(l) = 10, c(u) = 000
Q. How does the tree of a prefix code look?
[Binary tree diagram with leaves l, u, a, e, k]

8 Representing Prefix Codes using Binary Trees
Ex. c(a) = 11, c(e) = 01, c(k) = 001, c(l) = 10, c(u) = 000
Q. How does the tree of a prefix code look?
A. Only the leaves have a label.
Pf. An encoding of x is a prefix of an encoding of y if and only if the path of x is a prefix of the path of y.
[Binary tree diagram with leaves l, u, a, e, k]
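A minimal sketch (not from the slides) of the tree-to-code correspondence: label left edges 0 and right edges 1, and read each leaf's codeword off its root-to-leaf path. The nested-tuple representation of the tree is an assumption for illustration.

```python
# Illustrative sketch: a binary tree as nested tuples (left, right); leaves are symbols.
# The codeword of a symbol is its root-to-leaf path, 0 = left, 1 = right.
TREE = ((("u", "k"), "e"), ("l", "a"))  # matches c(u)=000, c(k)=001, c(e)=01, c(l)=10, c(a)=11

def code_table(node, path=""):
    if isinstance(node, str):            # a leaf: its path is its codeword
        return {node: path}
    left, right = node
    return {**code_table(left, path + "0"), **code_table(right, path + "1")}

print(code_table(TREE))  # {'u': '000', 'k': '001', 'e': '01', 'l': '10', 'a': '11'}
```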

9 Representing Prefix Codes using Binary Trees
Q. What is the meaning of [bit string in figure]?
[Binary tree diagram with leaves s, i, m, p, e, l and 0/1 edge labels]

10 Representing Prefix Codes using Binary Trees
Q. What is the meaning of [bit string in figure]?
A. "simpel"
Q. How can this prefix code be made more efficient?
[Binary tree diagram with leaves s, i, m, p, e, l and 0/1 edge labels]

11 Representing Prefix Codes using Binary Trees
Q. What is the meaning of [bit string in figure]?
A. "simpel"
Q. How can this prefix code be made more efficient?
A. Change the encodings of p and s to shorter ones. The tree is now full.
[Binary tree diagrams before and after, with leaves s, i, m, p, e, l]

12 Representing Prefix Codes using Binary Trees
Definition. A tree is full if every node that is not a leaf has two children.
Claim. The binary tree corresponding to the optimal prefix code is full.
Pf.
[Tree diagram with nodes u, v, w]

13 Representing Prefix Codes using Binary Trees
Definition. A tree is full if every node that is not a leaf has two children.
Claim. The binary tree corresponding to the optimal prefix code is full.
Pf. (by contradiction)
- Suppose T is the binary tree of an optimal prefix code and is not full.
- This means there is a node u with only one child v.
- Case 1: u is the root; delete u and use v as the root.
- Case 2: u is not the root.
  – Let w be the parent of u.
  – Delete u and make v a child of w in place of u.
- In both cases the number of bits needed to encode any leaf in the subtree of v decreases, while the rest of the tree is not affected.
- Clearly the new tree T’ has a smaller ABL than T. Contradiction.
[Tree diagram with nodes u, v, w]
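A small sketch (not from the slides) of the transformation used in this proof: splicing out a single-child node shortens every codeword beneath it by one bit. The list-based node representation is an assumption for illustration.

```python
# Illustrative sketch: remove single-child ("unary") nodes from a code tree.
# Internal nodes are lists of 1 or 2 children; leaves are symbol strings.
def splice(node):
    if isinstance(node, str):
        return node                      # leaf: nothing to do
    children = [splice(c) for c in node]
    if len(children) == 1:               # unary node: replace it by its only child,
        return children[0]               # shortening every codeword below by one bit
    return children

tree = [["s"], [["i", "m"], ["p", [["e", "l"]]]]]   # not full: two unary nodes
print(splice(tree))                                  # ['s', [['i', 'm'], ['p', ['e', 'l']]]]
```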

14 Optimal Prefix Codes: False Start
Q. Where in the tree of an optimal prefix code should letters with a high frequency be placed?

15 Optimal Prefix Codes: False Start
Q. Where in the tree of an optimal prefix code should letters with a high frequency be placed?
A. Near the top.
Greedy template [Shannon-Fano, 1949]. Create the tree top-down: split S into two sets S_1 and S_2 with (almost) equal frequencies, then recursively build trees for S_1 and S_2.
Ex. f_a = 0.32, f_e = 0.25, f_k = 0.20, f_l = 0.18, f_u = 0.05
[Two tree diagrams over leaves a, e, k, l, u]
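To see why this is a "false start", here is a sketch (not from the slides) comparing one possible top-down split with the tree that bottom-up merging produces on the frequencies above; the particular split and the tree encodings are illustrative assumptions.

```python
# Illustrative check: for these frequencies a balanced top-down split is beaten
# by the bottom-up (Huffman) tree. Trees are nested tuples; leaves are symbols.
FREQ = {"a": 0.32, "e": 0.25, "k": 0.20, "l": 0.18, "u": 0.05}

def abl(node, depth=0):
    """Average bits per letter: sum of frequency * leaf depth."""
    if isinstance(node, str):
        return FREQ[node] * depth
    return sum(abl(child, depth + 1) for child in node)

shannon_fano = (("a", "l"), ("e", ("k", "u")))   # one possible (almost) equal split
huffman      = (("a", "e"), ("k", ("u", "l")))   # tree produced by repeated merging
print(abl(shannon_fano))   # ~2.25
print(abl(huffman))        # ~2.23 -- strictly better
```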

16 Optimal Prefix Codes: Huffman Encoding
Observation. Lowest-frequency items should be at the lowest level in the tree of an optimal prefix code.
Observation. For n > 1, the lowest level always contains at least two leaves.
Observation. The order in which items appear on a level does not matter.
Claim. There is an optimal prefix code with tree T* in which the two lowest-frequency letters are assigned to leaves that are siblings in T*.
Greedy template [Huffman, 1952]. Create the tree bottom-up: make two leaves for the two lowest-frequency letters y and z, then recursively build the tree for the rest, using a meta-letter for yz.

17 Optimal Prefix Codes: Huffman Encoding
Q. What is the time complexity?

Huffman(S) {
  if |S| = 2 {
    return tree with root and 2 leaves
  } else {
    let y and z be the lowest-frequency letters in S
    S’ = S
    remove y and z from S’
    insert new letter ω in S’ with f_ω = f_y + f_z
    T’ = Huffman(S’)
    T = add two children y and z to leaf ω of T’
    return T
  }
}

18 Optimal Prefix Codes: Huffman Encoding
Q. What is the time complexity?
A. T(n) = T(n-1) + O(n), so O(n^2).
Q. How can we find the lowest-frequency letters efficiently?
A. Use a priority queue for S: T(n) = T(n-1) + O(log n), so O(n log n).

Huffman(S) {
  if |S| = 2 {
    return tree with root and 2 leaves
  } else {
    let y and z be the lowest-frequency letters in S
    S’ = S
    remove y and z from S’
    insert new letter ω in S’ with f_ω = f_y + f_z
    T’ = Huffman(S’)
    T = add two children y and z to leaf ω of T’
    return T
  }
}
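A minimal runnable sketch (not from the slides) of the priority-queue variant, using Python's heapq; the iterative merge loop is equivalent to the recursion above, a counter breaks ties so the heap never compares tree nodes, and all names are illustrative.

```python
import heapq
from itertools import count

def huffman(freq):
    """Build a Huffman code tree; freq maps symbol -> frequency.
    Returns nested tuples (left, right) with symbols at the leaves."""
    tiebreak = count()  # keeps heap comparisons on (freq, int), never on trees
    heap = [(f, next(tiebreak), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)   # the two lowest-frequency items...
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (t1, t2)))  # ...merge into a meta-letter
    return heap[0][2]

# n-1 merges, each costing O(log n): O(n log n) overall.
print(huffman({"a": 0.32, "e": 0.25, "k": 0.20, "l": 0.18, "u": 0.05}))
```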

19 Huffman Encoding: Greedy Analysis
Claim. The Huffman code for S achieves the minimum ABL of any prefix code.
Pf. By induction, based on the optimality of T’ (y and z removed, ω added). (See next page.)
Claim. ABL(T’) = ABL(T) - f_ω
Pf.

20 Huffman Encoding: Greedy Analysis
Claim. The Huffman code for S achieves the minimum ABL of any prefix code.
Pf. By induction, based on the optimality of T’ (y and z removed, ω added). (See next page.)
Claim. ABL(T’) = ABL(T) - f_ω
Pf. (see the derivation below)
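The calculation itself is a standard reconstruction (this slide's formula is not in the transcript), assuming depth_T(x) denotes the depth of leaf x in T, and using that y and z are siblings in T whose parent corresponds to the leaf ω of T’:

```latex
\begin{align*}
ABL(T) &= \sum_{x \in S} f_x \cdot \mathrm{depth}_T(x) \\
       &= f_y \cdot \mathrm{depth}_T(y) + f_z \cdot \mathrm{depth}_T(z)
          + \sum_{x \neq y,z} f_x \cdot \mathrm{depth}_T(x) \\
       &= (f_y + f_z) \cdot \bigl(1 + \mathrm{depth}_{T'}(\omega)\bigr)
          + \sum_{x \neq y,z} f_x \cdot \mathrm{depth}_{T'}(x) \\
       &= f_\omega + \sum_{x \in S'} f_x \cdot \mathrm{depth}_{T'}(x)
        = f_\omega + ABL(T')
\end{align*}
```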

21 Huffman Encoding: Greedy Analysis
Claim. The Huffman code for S achieves the minimum ABL of any prefix code.
Pf. (by induction over n = |S|)

22 Huffman Encoding: Greedy Analysis
Claim. The Huffman code for S achieves the minimum ABL of any prefix code.
Pf. (by induction over n = |S|)
Base: For n = 2 there is no shorter code than a root with two leaves.
Hypothesis: Suppose the Huffman tree T’ for S’, of size n-1, with ω instead of y and z, is optimal. (IH)
Step: (by contradiction)

23 Huffman Encoding: Greedy Analysis
Claim. The Huffman code for S achieves the minimum ABL of any prefix code.
Pf. (by induction)
Base: For n = 2 there is no shorter code than a root with two leaves.
Hypothesis: Suppose the Huffman tree T’ for S’, of size n-1, with ω instead of y and z, is optimal. (IH)
Step: (by contradiction)
- Idea of proof:
  – Suppose some other tree Z of size n is better.
  – Delete the lowest-frequency items y and z from Z, creating Z’.
  – Z’ cannot be better than T’ by the IH.

24 Huffman Encoding: Greedy Analysis
Claim. The Huffman code for S achieves the minimum ABL of any prefix code.
Pf. (by induction)
Base: For n = 2 there is no shorter code than a root with two leaves.
Hypothesis: Suppose the Huffman tree T’ for S’ with ω instead of y and z is optimal. (IH)
Step: (by contradiction)
- Suppose the Huffman tree T for S is not optimal.
- Then there is some tree Z such that ABL(Z) < ABL(T).
- Then there is also such a tree Z in which the lowest-frequency letters y and z are siblings (see the observation above).
- Let Z’ be Z with y and z deleted and their former parent labeled ω.
- Similarly, T’ is derived from S’ in our algorithm.
- We know that ABL(Z’) = ABL(Z) - f_ω, as well as ABL(T’) = ABL(T) - f_ω.
- But also ABL(Z) < ABL(T), so ABL(Z’) < ABL(T’).
- This contradicts the IH.