Topic 20: Huffman Coding


Topic 20: Huffman Coding The author should gaze at Noah, and... learn, as they did in the Ark, to crowd a great deal of matter into a very small compass. Sydney Smith, Edinburgh Review

Agenda  Encoding  Compression  Huffman Coding

Encoding  UT CS  what is a file? how do some OSes use file extensions?  open a bitmap in a text editor  open a PDF in Word

ASCII - UNICODE

Text File

Text File???

Bitmap File

Bitmap File????

JPEG File

JPEG vs. Bitmap

Encoding Schemes  "It's all 1s and 0s"  What do the 1s and 0s mean?  ASCII -> 2ym  Red Blue Green -> dark teal?

Agenda  Encoding  Compression  Huffman Coding

Compression  Compression: storing the same information, but in a form that takes less memory  lossless and lossy compression

Lossy Artifacts

Why Bother?  Is compression really necessary? 2 terabytes holds 500 HD, 2-hour movies or 500,000 songs

Little Pipes and Big Pumps
Home Internet Access
 40 Mbps, roughly $40 per month
 12 months * 3 years * $40 = $1,440
 10,000,000 bits / second = 1.25 * 10^6 bytes / sec
CPU Capability
 $1,500 for a laptop or desktop
 Intel i7 processor
 Assume it lasts 3 years
 Memory bandwidth 25.6 GB / sec = 2.6 * 10^10 bytes / sec
 on the order of 5.0 * 10^10 instructions / second

Mobile Devices?
Cellular Network
 Your mileage may vary …
 Megabits per second:
 AT&T: 17 download, 7 upload
 T-Mobile & Verizon: 12 download, 7 upload
 17,000,000 bits per second = 2.1 x 10^6 bytes per second
iPhone CPU
 Apple A6 System on a Chip
 Coy about IPS
 2 cores
 Rough estimate: 1 x 10^10 instructions per second

Little Pipes and Big Pumps  [diagram: Data In From Network -> CPU]

Compression - Why Bother?  Apostolos "Toli" Lerios  Facebook engineer  heads the image storage group  JPEG images are already compressed  look for ways to compress them even more  1% less space = millions of dollars in savings

Agenda  Encoding  Compression  Huffman Coding

Purpose of Huffman Coding  Proposed by Dr. David A. Huffman –"A Method for the Construction of Minimum-Redundancy Codes" –Written in 1952  Applicable to many forms of data transmission –Our example: text files –still used in fax machines, MP3 encoding, and others

The Basic Algorithm  Huffman coding is a form of statistical coding  Not all characters occur with the same frequency!  Yet in ASCII all characters are allocated the same amount of space –1 char = 1 byte, be it e or x

The Basic Algorithm  Any savings in tailoring codes to the frequency of characters?  Code word lengths are no longer fixed like ASCII or Unicode  Code word lengths vary and will be shorter for the more frequently used characters

The Basic Algorithm
1. Scan the text to be compressed and tally the occurrences of all characters.
2. Sort or prioritize the characters based on their number of occurrences in the text.
3. Build the Huffman code tree based on the prioritized list.
4. Perform a traversal of the tree to determine all code words.
5. Scan the text again and create the new file using the Huffman codes.
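Steps 1–5 map directly onto a short program. The sketch below is an illustration in Python using only the standard library; it is not the assignment's required structure, and the tie-breaking sequence number (`tick`) is an assumption the slides leave unspecified.

```python
import heapq
from collections import Counter
from itertools import count

def huffman_codes(text):
    """Steps 1-4: tally characters, prioritize, merge, and read off code words."""
    tick = count()  # tie-breaker so equal frequencies never compare dicts
    # Each heap entry carries a partial code table for the characters below it.
    heap = [(f, next(tick), {ch: ""}) for ch, f in Counter(text).items()]
    heapq.heapify(heap)                      # step 2: lowest count = highest priority
    while len(heap) > 1:                     # step 3: merge the two lightest nodes
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {c: "0" + w for c, w in left.items()}         # left subtree: prepend 0
        merged.update({c: "1" + w for c, w in right.items()})  # right subtree: prepend 1
        heapq.heappush(heap, (f1 + f2, next(tick), merged))
    return heap[0][2]

text = "Eerie eyes seen near lake."
codes = huffman_codes(text)
encoded = "".join(codes[c] for c in text)    # step 5: rewrite using the codes
print(len(encoded))  # 84 bits, vs. 208 for 8-bit ASCII
```

The exact code words depend on how frequency ties are broken, but every Huffman tree for this text produces the same total encoded length.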

Building a Tree: Scan the original text  Consider the following short text: "Eerie eyes seen near lake."  Count up the occurrences of all characters in the text

Building a Tree: Scan the original text  "Eerie eyes seen near lake."  What characters are present? E e r i space y s n a l k .

Building a Tree: Scan the original text  "Eerie eyes seen near lake."  What is the frequency of each character in the text?

Char  Freq.   Char   Freq.   Char  Freq.
E     1       y      1       k     1
e     8       s      2       .     1
r     2       n      2       i     1
a     2       space  4       l     1
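The tally on this slide can be reproduced with `collections.Counter` (a sketch; the course assignment may instead require counting while reading the file in chunks):

```python
from collections import Counter

text = "Eerie eyes seen near lake."
freq = Counter(text)  # maps each character to its number of occurrences

print(freq["e"], freq[" "], freq["E"], freq["."])  # 8 4 1 1
print(sum(freq.values()))                          # 26 characters total
```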

Building a Tree: Prioritize characters  Create binary tree nodes with the character and frequency of each character  Place the nodes in a priority queue –The lower the occurrence, the higher the priority in the queue

Building a Tree  The queue after inserting all nodes (null pointers are not shown): E:1  i:1  y:1  l:1  k:1  .:1  r:2  s:2  n:2  a:2  space:4  e:8

Building a Tree
While the priority queue contains two or more nodes:
–Create a new node
–Dequeue a node and make it the left subtree
–Dequeue the next node and make it the right subtree
–The frequency of the new node equals the sum of the frequencies of its left and right children
–Enqueue the new node back into the queue
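A binary min-heap is the usual priority queue here. The sketch below follows the loop on this slide using Python's `heapq`, with nested tuples for tree nodes (a leaf is `(char,)`, an internal node is `(left, right)`); both that representation and the tie-breaking sequence number are this sketch's assumptions, not something the slides specify.

```python
import heapq
from collections import Counter
from itertools import count

def build_tree(text):
    """Return (total frequency, root), where the root is a nested-tuple tree."""
    tick = count()  # tie-breaker so the heap never compares node tuples
    heap = [(f, next(tick), (ch,)) for ch, f in Counter(text).items()]
    heapq.heapify(heap)
    while len(heap) > 1:                    # two or more nodes remain
        f1, _, left = heapq.heappop(heap)   # lowest frequency -> left subtree
        f2, _, right = heapq.heappop(heap)  # next lowest -> right subtree
        # The new node's frequency is the sum of its children's frequencies.
        heapq.heappush(heap, (f1 + f2, next(tick), (left, right)))
    return heap[0][0], heap[0][2]

total, root = build_tree("Eerie eyes seen near lake.")
print(total)  # 26 -- the root's frequency equals the character count
```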

Building a Tree
[A sequence of tree diagrams follows: at each step the two lowest-frequency nodes are dequeued, joined under a new parent whose frequency is the sum of theirs, and the parent is enqueued again.]

Building a Tree
What is happening to the characters with a low number of occurrences?

Building a Tree
After enqueueing this node there is only one node left in the priority queue.

Building a Tree
Dequeue the single node left in the queue. This tree contains the new code words for each character. The frequency of the root node should equal the number of characters in the text: "Eerie eyes seen near lake." has 4 spaces, 26 characters total.

Encoding the File: Traverse Tree for Codes  Perform a traversal of the tree to obtain the new code words  going left, append a 0 to the code word  going right, append a 1 to the code word  a code word is only completed when a leaf node is reached
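The rule on this slide is a straightforward recursive walk. The function below assumes the nested-tuple tree shape (`(char,)` for a leaf, `(left, right)` for an internal node), which is this sketch's convention rather than anything mandated by the slides, and a small hand-built tree stands in for the full one.

```python
def assign_codes(node, prefix="", table=None):
    """Walk the tree; a code word is complete only when a leaf is reached."""
    if table is None:
        table = {}
    if len(node) == 1:                              # leaf: (char,)
        table[node[0]] = prefix
    else:                                           # internal: (left, right)
        assign_codes(node[0], prefix + "0", table)  # left: append 0
        assign_codes(node[1], prefix + "1", table)  # right: append 1
    return table

# Toy four-leaf tree, not the full tree from the slides.
toy = ((("a",), ("n",)), (("e",), ("s",)))
print(assign_codes(toy))  # {'a': '00', 'n': '01', 'e': '10', 's': '11'}
```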

Encoding the File: Traverse Tree for Codes

Char   Code
E      0000
i      0001
k      0010
l      0011
y      0100
.      0101
space  011
e      10
a      1100
n      1101
r      1110
s      1111

Encoding the File  Rescan the text and encode the file using the new code words: "Eerie eyes seen near lake."

Char   Code
E      0000
i      0001
k      0010
l      0011
y      0100
.      0101
space  011
e      10
a      1100
n      1101
r      1110
s      1111

Encoding the File: Results  Have we made things any better?  84 bits to encode the text  ASCII would take 8 * 26 = 208 bits  If a modified fixed-length code were used instead, 4 bits per character would suffice for the 12 distinct characters. Total bits: 4 * 26 = 104. The savings are not as great.
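The totals can be checked with a few lines of arithmetic over the code-word lengths from the traversal slide:

```python
# Character frequencies and code-word lengths for "Eerie eyes seen near lake."
freq = {"E": 1, "i": 1, "k": 1, "l": 1, "y": 1, ".": 1,
        " ": 4, "e": 8, "a": 2, "n": 2, "r": 2, "s": 2}
code_len = {"E": 4, "i": 4, "k": 4, "l": 4, "y": 4, ".": 4,
            " ": 3, "e": 2, "a": 4, "n": 4, "r": 4, "s": 4}

huffman_bits = sum(freq[c] * code_len[c] for c in freq)  # weighted code lengths
ascii_bits = 8 * sum(freq.values())                      # 8 bits per character
fixed4_bits = 4 * sum(freq.values())                     # 4-bit fixed-length code
print(huffman_bits, ascii_bits, fixed4_bits)  # 84 208 104
```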

Decoding the File  How does the receiver know what the codes are?  Option 1: a tree is constructed for each text file –considers the frequencies of that particular file –big hit on compression, especially for smaller files  Option 2: the tree is predetermined –based on statistical analysis of text files or file types

Decoding the File  Once the receiver has the tree, it scans the incoming bit stream  0 → go left  1 → go right  What does the sample bit stream decode to? A. elk nay sir B. eek a snake C. eek kin sly D. eek snarl nil E. eel a snarl
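The receiver's walk can be sketched as below; the toy nested-tuple tree is this sketch's own assumed example, not the tree or bit stream from the slide.

```python
def decode(bits, root):
    """Walk the tree bit by bit: 0 -> left, 1 -> right; emit a char at each leaf."""
    out, node = [], root
    for b in bits:
        node = node[0] if b == "0" else node[1]  # descend one level
        if len(node) == 1:                       # reached a leaf
            out.append(node[0])
            node = root                          # restart at the root
    return "".join(out)

# Toy tree giving a=00, n=01, e=10, s=11.
toy = ((("a",), ("n",)), (("e",), ("s",)))
print(decode("11000110", toy))  # "sane"
```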

Assignment Hints  reading chunks, not chars  header format  the pseudo-eof character  the GUI

Assignment Example  "Eerie eyes seen near lake." will result in different codes than those shown in the slides due to: –adding elements in order to the PriorityQueue –the required pseudo-eof character (PEOF)

Assignment Example

Char  Freq.   Char   Freq.   Char   Freq.
E     1       y      1       k      1
e     8       s      2       .      1
r     2       n      2       PEOF   1
i     1       a      2       space  4
l     1

Assignment Example
[A sequence of tree diagrams mirrors the earlier tree-building slides, now with the PEOF node included: nodes are repeatedly dequeued in pairs, merged under a new parent, and re-enqueued until a single tree remains.]

Codes

value: 32, equivalent char: space, frequency: 4, new code: 011
value: 46, equivalent char: ., frequency: 1, new code:
value: 69, equivalent char: E, frequency: 1, new code:
value: 97, equivalent char: a, frequency: 2, new code: 0101
value: 101, equivalent char: e, frequency: 8, new code: 10
value: 105, equivalent char: i, frequency: 1, new code: 0000
value: 107, equivalent char: k, frequency: 1, new code: 0001
value: 108, equivalent char: l, frequency: 1, new code: 0010
value: 110, equivalent char: n, frequency: 2, new code: 1100
value: 114, equivalent char: r, frequency: 2, new code: 1101
value: 115, equivalent char: s, frequency: 2, new code: 1110
value: 121, equivalent char: y, frequency: 1, new code: 0011
value: 256, equivalent char: ?, frequency: 1, new code: 0100