EE 4780 Huffman Coding Example

Huffman Coding Example (Bahadir K. Gunturk)

Suppose X is a source producing symbols from the alphabet A = {a1, a2, a3, a4, a5}, with symbol probabilities {0.4, 0.2, 0.2, 0.15, 0.05}. Form the Huffman tree by repeatedly merging the two least probable nodes, labeling the two branches of each merge 0 and 1: 0.05 + 0.15 = 0.2, then 0.2 + 0.2 = 0.4, then 0.4 + 0.2 = 0.6, then 0.6 + 0.4 = 1.0.

[Figure: Huffman tree with internal-node probabilities 0.2, 0.4, 0.6, 1.0 and 0/1 branch labels]

Symbol | Probability | Codeword
a1     | 0.4         | 0
a2     | 0.2         | 10
a3     | 0.2         | 110
a4     | 0.15        | 1110
a5     | 0.05        | 1111

Average codeword length = 0.4*1 + 0.2*2 + 0.2*3 + 0.15*4 + 0.05*4 = 2.2 bits per symbol

Entropy = -Σ p(ai) log2 p(ai) ≈ 2.08 bits per symbol
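The merge-the-two-least-probable procedure above can be sketched in Python with a heap. This is a minimal sketch, not the lecture's own code; note that Huffman trees are not unique, so the codewords it produces may differ from the slide's depending on how ties between equal probabilities are broken, but the average codeword length comes out the same.

```python
import heapq
import math

# Source alphabet and probabilities from the slide.
probs = {"a1": 0.4, "a2": 0.2, "a3": 0.2, "a4": 0.15, "a5": 0.05}

def huffman_codes(probs):
    """Build a Huffman code by repeatedly merging the two least
    probable nodes; returns {symbol: codeword}."""
    # Heap entries: (probability, tiebreak counter, {symbol: partial codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # least probable node -> branch 0
        p1, _, c1 = heapq.heappop(heap)  # next least probable -> branch 1
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_codes(probs)
avg_len = sum(probs[s] * len(codes[s]) for s in probs)
entropy = -sum(p * math.log2(p) for p in probs.values())
print(codes)
print(f"average length = {avg_len:.2f} bits/symbol")  # 2.20, as on the slide
print(f"entropy        = {entropy:.2f} bits/symbol")  # 2.08, as on the slide
```

As the slide's numbers show, the average length (2.2) sits just above the entropy (2.08), which is the lower bound no lossless symbol code can beat.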

Huffman Coding Example (continued)

Another valid Huffman tree exists for the same source, obtained by breaking the tie between the equal-probability nodes differently (merging a2 with a3 instead): 0.05 + 0.15 = 0.2, then a2 + a3 gives 0.4, then 0.4 + 0.2 = 0.6, then 0.6 + 0.4 = 1.0.

[Figure: alternative Huffman tree with internal-node probabilities 0.2, 0.4, 0.6, 1.0 and 0/1 branch labels]

Symbol | Probability | Codeword
a1     | 0.4         | 0
a2     | 0.2         | 100
a3     | 0.2         | 101
a4     | 0.15        | 110
a5     | 0.05        | 111

Average codeword length = 0.4*1 + 0.2*3 + 0.2*3 + 0.15*3 + 0.05*3 = 2.2 bits per symbol
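A quick check, written as a sketch rather than taken from the lecture, confirms that both codebooks from the two slides are complete prefix codes (their Kraft sums equal 1), achieve the same 2.2-bit average length, and decode unambiguously:

```python
probs = {"a1": 0.4, "a2": 0.2, "a3": 0.2, "a4": 0.15, "a5": 0.05}

# The two codebooks from the slides.
code_a = {"a1": "0", "a2": "10", "a3": "110", "a4": "1110", "a5": "1111"}
code_b = {"a1": "0", "a2": "100", "a3": "101", "a4": "110", "a5": "111"}

def avg_length(probs, code):
    return sum(probs[s] * len(code[s]) for s in probs)

def kraft_sum(code):
    # A complete prefix code satisfies sum over codewords of 2^-len = 1.
    return sum(2.0 ** -len(w) for w in code.values())

def decode(bits, code):
    """Prefix-decode a bit string by matching codewords left to right."""
    inverse = {w: s for s, w in code.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return out

for name, code in [("tree 1", code_a), ("tree 2", code_b)]:
    print(name, avg_length(probs, code), kraft_sum(code))

# Decoding the bit string for a1 a2 a4 under the second codebook:
print(decode("0100110", code_b))  # ['a1', 'a2', 'a4']
```

The equal averages are no accident: every Huffman code for a given source is optimal, so different tie-breaking choices change the codewords but never the expected length.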
